Expansion or extinction: deterministic and stochastic two-patch models with Allee effects.
Kang, Yun; Lanchier, Nicolas
2011-06-01
We investigate the impact of Allee effects and dispersal on the long-term evolution of a population in a patchy environment. Our main focus is on whether a population already established in one patch either successfully invades an adjacent empty patch or undergoes a global extinction. Our study is based on the combination of analytical and numerical results for both a deterministic two-patch model and a stochastic counterpart. The deterministic model has either two, three or four attractors. The existence of a regime with exactly three attractors only appears when patches have distinct Allee thresholds. In the presence of weak dispersal, the analysis of the deterministic model shows that a high-density and a low-density population can coexist at equilibrium in nearby patches, whereas the analysis of the stochastic model indicates that this equilibrium is metastable, thus leading after a large random time to either a global expansion or a global extinction. Up to some critical dispersal, increasing the intensity of the interactions leads to an increase of both the basin of attraction of global extinction and the basin of attraction of global expansion. Above this threshold, for both the deterministic and the stochastic models, the patches tend to synchronize as the intensity of the dispersal increases. This results in either a global expansion or a global extinction. For the deterministic model, there are only two attractors, while the stochastic model no longer exhibits a metastable behavior. In the presence of strong dispersal, the limiting behavior is entirely determined by the value of the Allee thresholds, as the global population size in the deterministic and the stochastic models evolves as dictated by their single-patch counterparts. For all values of the dispersal parameter, Allee effects promote global extinction in terms of an expansion of the basin of attraction of the extinction equilibrium for the deterministic model and an increase of the probability of extinction for the stochastic model.
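As an illustrative sketch (not the authors' exact equations), a two-patch model of this kind couples cubic Allee-type growth in each patch through symmetric dispersal; the growth rate r, carrying capacity K, the distinct Allee thresholds A1 and A2, and the dispersal rate D below are assumed values chosen only to demonstrate the bistable structure.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-patch model with Allee effects and symmetric dispersal (illustrative).
# Per-patch growth: r*u*(u/A - 1)*(1 - u/K); coupling: D*(other - self).
r, K, A1, A2, D = 1.0, 1.0, 0.2, 0.35, 0.05   # assumed parameter values

def rhs(t, u):
    u1, u2 = u
    g1 = r * u1 * (u1 / A1 - 1.0) * (1.0 - u1 / K)
    g2 = r * u2 * (u2 / A2 - 1.0) * (1.0 - u2 / K)
    return [g1 + D * (u2 - u1), g2 + D * (u1 - u2)]

# Patch 1 established near K, patch 2 empty: does dispersal drive expansion or extinction?
sol = solve_ivp(rhs, (0.0, 500.0), [0.9, 0.0], max_step=0.5)
print(sol.y[:, -1])   # final densities in the two patches
```

Sweeping D in such a sketch reproduces the qualitative regimes described in the abstract: weak dispersal allows a high-density and a low-density patch to coexist, while strong dispersal synchronizes the two patches.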
Using stochastic models to incorporate spatial and temporal variability [Exercise 14]
Carolyn Hull Sieg; Rudy M. King; Fred Van Dyke
2003-01-01
To this point, our analysis of population processes and viability in the western prairie fringed orchid has used only deterministic models. In this exercise, we conduct a similar analysis, using a stochastic model instead. This distinction is of great importance to population biology in general and to conservation biology in particular. In deterministic models,...
Inferring Fitness Effects from Time-Resolved Sequence Data with a Delay-Deterministic Model
Nené, Nuno R.; Dunham, Alistair S.; Illingworth, Christopher J. R.
2018-01-01
A common challenge arising from the observation of an evolutionary system over time is to infer the magnitude of selection acting upon a specific genetic variant, or variants, within the population. The inference of selection may be confounded by the effects of genetic drift in a system, leading to the development of inference procedures to account for these effects. However, recent work has suggested that deterministic models of evolution may be effective in capturing the effects of selection even under complex models of demography, suggesting the more general application of deterministic approaches to inference. Responding to this literature, we here note a case in which a deterministic model of evolution may give highly misleading inferences, resulting from the nondeterministic properties of mutation in a finite population. We propose an alternative approach that acts to correct for this error, and which we denote the delay-deterministic model. Applying our model to a simple evolutionary system, we demonstrate its performance in quantifying the extent of selection acting within that system. We further consider the application of our model to sequence data from an evolutionary experiment. We outline scenarios in which our model may produce improved results for the inference of selection, noting that such situations can be easily identified via the use of a regular deterministic model. PMID:29500183
Demographic noise can reverse the direction of deterministic selection
Constable, George W. A.; Rogers, Tim; McKane, Alan J.; Tarnita, Corina E.
2016-01-01
Deterministic evolutionary theory robustly predicts that populations displaying altruistic behaviors will be driven to extinction by mutant cheats that absorb common benefits but do not themselves contribute. Here we show that when demographic stochasticity is accounted for, selection can in fact act in the reverse direction to that predicted deterministically, instead favoring cooperative behaviors that appreciably increase the carrying capacity of the population. Populations that exist in larger numbers experience a selective advantage by being more stochastically robust to invasions than smaller populations, and this advantage can persist even in the presence of reproductive costs. We investigate this general effect in the specific context of public goods production and find conditions for stochastic selection reversal leading to the success of public good producers. This insight, developed here analytically, is missed by the deterministic analysis as well as by standard game theoretic models that enforce a fixed population size. The effect is found to be amplified by space; in this scenario we find that selection reversal occurs within biologically reasonable parameter regimes for microbial populations. Beyond the public good problem, we formulate a general mathematical framework for models that may exhibit stochastic selection reversal. In this context, we describe a stochastic analog to r−K theory, by which small populations can evolve to higher densities in the absence of disturbance. PMID:27450085
Optimal Vaccination in a Stochastic Epidemic Model of Two Non-Interacting Populations
2015-02-17
of diminishing returns from vaccination will generally take place at smaller vaccine allocations V compared to the deterministic model. Optimal...take place and small r0 values where it does not is illustrated in Fig. 4C. As r0 is decreased, the region between the two instances of switching...approximately distribute vaccine in proportion to population size. For large r0 (r0 ≳ 2.9), two switches take place. In the deterministic optimal solution, a
Population density equations for stochastic processes with memory kernels
NASA Astrophysics Data System (ADS)
Lai, Yi Ming; de Kamps, Marc
2017-06-01
We present a method for solving population density equations (PDEs), a mean-field technique describing homogeneous populations of uncoupled neurons, where the populations can be subject to non-Markov noise for arbitrary distributions of jump sizes. The method combines recent developments in two different disciplines that traditionally have had limited interaction: computational neuroscience and the theory of random networks. The method uses a geometric binning scheme, based on the method of characteristics, to capture the deterministic neurodynamics of the population, separating the deterministic and stochastic processes cleanly. We can independently vary the choice of the deterministic model and the model for the stochastic process, leading to a highly modular numerical solution strategy. We demonstrate this by replacing the master equation implicit in many formulations of the PDE formalism by a generalization called the generalized Montroll-Weiss equation, a recent result from random network theory describing a random walker subject to transitions realized by a non-Markovian process. We demonstrate the method for leaky and quadratic integrate-and-fire neurons subject to spike trains with Poisson and gamma-distributed interspike intervals. We are able to accurately model jump responses of both models to both excitatory and inhibitory input under the assumption that all inputs are generated by one renewal process.
Creating a stage-based deterministic PVA model - the western prairie fringed orchid [Exercise 12]
Carolyn Hull Sieg; Rudy M. King; Fred Van Dyke
2003-01-01
Contemporary efforts to conserve populations and species often employ population viability analysis (PVA), a specific application of population modeling that estimates the effects of environmental and demographic processes on population growth rates. These models can also be used to estimate probabilities that a population will fall below a certain level. This...
Theory and applications of a deterministic approximation to the coalescent model
Jewett, Ethan M.; Rosenberg, Noah A.
2014-01-01
Under the coalescent model, the random number nt of lineages ancestral to a sample is nearly deterministic as a function of time when nt is moderate to large in value, and it is well approximated by its expectation E[nt]. In turn, this expectation is well approximated by simple deterministic functions that are easy to compute. Such deterministic functions have been applied to estimate allele age, effective population size, and genetic diversity, and they have been used to study properties of models of infectious disease dynamics. Although a number of simple approximations of E[nt] have been derived and applied to problems of population-genetic inference, the theoretical accuracy of the formulas and the inferences obtained using these approximations is not known, and the range of problems to which they can be applied is not well understood. Here, we demonstrate general procedures by which the approximation nt ≈ E[nt] can be used to reduce the computational complexity of coalescent formulas, and we show that the resulting approximations converge to their true values under simple assumptions. Such approximations provide alternatives to exact formulas that are computationally intractable or numerically unstable when the number of sampled lineages is moderate or large. We also extend an existing class of approximations of E[nt] to the case of multiple populations of time-varying size with migration among them. Our results facilitate the use of the deterministic approximation nt ≈ E[nt] for deriving functionally simple, computationally efficient, and numerically stable approximations of coalescent formulas under complicated demographic scenarios. PMID:24412419
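A minimal numerical sketch of the kind of deterministic approximation discussed here (not the authors' formulas): the number of ancestral lineages is treated as a continuous quantity that decays at the pairwise coalescence rate, with a constant population size N assumed for illustration.

```python
from scipy.integrate import solve_ivp

# Deterministic approximation to the coalescent lineage count: n lineages
# coalesce at total rate n*(n-1)/(2N), so the expectation is approximated by
#   dn/dt = -n*(n-1) / (2N).
N = 10_000.0   # assumed constant effective population size
n0 = 50.0      # number of sampled lineages

sol = solve_ivp(lambda t, n: [-n[0] * (n[0] - 1.0) / (2.0 * N)],
                (0.0, 4.0 * N), [n0])
print(sol.y[0, -1])   # approximate expected number of ancestral lineages at time 4N
```

The same ODE extends to time-varying N(t), and, as the abstract notes, substituting this deterministic trajectory for the random lineage count simplifies otherwise intractable coalescent formulas.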
Lv, Qiming; Schneider, Manuel K; Pitchford, Jonathan W
2008-08-01
We study individual plant growth and size hierarchy formation in an experimental population of Arabidopsis thaliana, within an integrated analysis that explicitly accounts for size-dependent growth, size- and space-dependent competition, and environmental stochasticity. It is shown that a Gompertz-type stochastic differential equation (SDE) model, involving asymmetric competition kernels and a stochastic term which decreases with the logarithm of plant weight, efficiently describes individual plant growth, competition, and variability in the studied population. The model is evaluated within a Bayesian framework and compared to its deterministic counterpart, and to several simplified stochastic models, using distributional validation. We show that stochasticity is an important determinant of size hierarchy and that SDE models outperform the deterministic model if and only if structural components of competition (asymmetry; size- and space-dependence) are accounted for. Implications of these results are discussed in the context of plant ecology and in more general modelling situations.
Uniqueness of Nash equilibrium in vaccination games.
Bai, Fan
2016-12-01
One crucial condition for the uniqueness of the Nash equilibrium set in vaccination games is that the attack ratio monotonically decreases as the vaccine coverage level increases. We consider several deterministic vaccination models in homogeneous mixing populations and in heterogeneous mixing populations. Based on the final size relations obtained from the deterministic epidemic models, we prove that the attack ratios can be expressed in terms of the vaccine coverage levels, and also prove that the attack ratios are decreasing functions of the vaccine coverage levels. Some thresholds are presented, which depend on the vaccine efficacy. It is proved that for vaccination games in a homogeneous mixing population, there is a unique Nash equilibrium for each game.
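As an illustrative sketch of the homogeneous-mixing case (not the paper's derivation), with an all-or-nothing vaccine of efficacy e and coverage p the attack ratio among unprotected individuals, z, satisfies the standard final-size relation z = 1 - exp(-R0*(1 - e*p)*z); the parameter values below are assumptions.

```python
import numpy as np
from scipy.optimize import brentq

R0, e = 1.8, 0.9   # assumed basic reproduction number and vaccine efficacy

def attack_ratio(p):
    """Attack ratio among the unprotected at vaccine coverage p (illustrative)."""
    s0 = 1.0 - e * p                                   # initially susceptible fraction
    f = lambda z: z - (1.0 - np.exp(-R0 * s0 * z))
    # a positive root exists only above the threshold R0*s0 = 1
    return brentq(f, 1e-9, 1.0) if R0 * s0 > 1.0 else 0.0

for p in (0.0, 0.2, 0.4, 0.6):
    print(p, round(attack_ratio(p), 4))   # monotonically decreasing in p
```

The monotone decrease of the attack ratio in p seen here is exactly the property the paper establishes more generally to guarantee a unique Nash equilibrium.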
Stochastic population dynamic models as probability networks
M.E. Borsuk and D.C. Lee
2009-01-01
The dynamics of a population and its response to environmental change depend on the balance of birth, death and age-at-maturity, and there have been many attempts to mathematically model populations based on these characteristics. Historically, most of these models were deterministic, meaning that the results were strictly determined by the equations of the model and...
Population-level effects of the mysid, Americamysis bahia, exposed to varying thiobencarb concentrations were estimated using stage-structured matrix models. A deterministic density-independent matrix model estimated the decrease in population growth rate, λ, with increas...
Parameter Estimation in Epidemiology: from Simple to Complex Dynamics
NASA Astrophysics Data System (ADS)
Aguiar, Maíra; Ballesteros, Sebastién; Boto, João Pedro; Kooi, Bob W.; Mateus, Luís; Stollenwerk, Nico
2011-09-01
We revisit the parameter estimation framework for population biological dynamical systems, and apply it to calibrate various models in epidemiology with empirical time series, namely influenza and dengue fever. When it comes to more complex models like multi-strain dynamics to describe the virus-host interaction in dengue fever, even the most recently developed parameter estimation techniques, like maximum likelihood iterated filtering, reach their computational limits. However, the first results of parameter estimation with data on dengue fever from Thailand indicate a subtle interplay between stochasticity and the deterministic skeleton. The deterministic system on its own already displays complex dynamics up to deterministic chaos and coexistence of multiple attractors.
The Office of Pesticide Programs models daily aquatic pesticide exposure values for 30 years in its risk assessments. However, only a fraction of that information is typically used in these assessments. The population model employed herein is a deterministic, density-dependent pe...
A Stochastic Tick-Borne Disease Model: Exploring the Probability of Pathogen Persistence.
Maliyoni, Milliward; Chirove, Faraimunashe; Gaff, Holly D; Govinder, Keshlan S
2017-09-01
We formulate and analyse a stochastic epidemic model for the transmission dynamics of a tick-borne disease in a single population using a continuous-time Markov chain approach. The stochastic model is based on an existing deterministic metapopulation tick-borne disease model. We compare the disease dynamics of the deterministic and stochastic models in order to determine the effect of randomness in tick-borne disease dynamics. The probability of disease extinction and that of a major outbreak are computed and approximated using the multitype Galton-Watson branching process and numerical simulations, respectively. Analytical and numerical results show some significant differences in model predictions between the stochastic and deterministic models. In particular, we find that a disease outbreak is more likely if the disease is introduced by infected deer as opposed to infected ticks. These insights demonstrate the importance of host movement in the expansion of tick-borne diseases into new geographic areas.
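As a simplified single-type sketch of the branching-process calculation mentioned above (the paper uses a multitype Galton-Watson process for ticks and hosts), the probability of disease extinction is the smallest fixed point of the offspring probability generating function; the Poisson offspring distribution and its mean are assumptions for illustration.

```python
import numpy as np

m = 1.6   # assumed mean number of secondary infections per infective

def G(s):
    """Probability generating function of a Poisson offspring distribution."""
    return np.exp(m * (s - 1.0))

q = 0.0
for _ in range(200):   # fixed-point iteration converges to the smallest root of q = G(q)
    q = G(q)

print("P(extinction) ~", round(q, 4))        # < 1 because m > 1
print("P(major outbreak) ~", round(1 - q, 4))
```

In the multitype setting of the paper the same idea applies component-wise, which is why the extinction probability depends on whether the infection is introduced by deer or by ticks.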
Pest persistence and eradication conditions in a deterministic model for sterile insect release.
Gordillo, Luis F
2015-01-01
The release of sterile insects is an environmentally friendly pest control method used in integrated pest management programmes. Difference or differential equations based on Knipling's model often provide satisfactory qualitative descriptions of pest populations subject to sterile release at relatively high densities with large mating encounter rates, but fail otherwise. In this paper, I derive and explore numerically deterministic population models that include sterile release together with scarce mating encounters in the particular case of species with long lifespan and multiple matings. The differential equations account separately for the effects of mating failure due to sterile male release and the frequency of mating encounters. When the insects' spatial spread is incorporated through diffusion terms, computations reveal the possibility of steady pest persistence in finite size patches. In the presence of density-dependent regulation, it is observed that sterile release might contribute to induce sudden suppression of the pest population.
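A minimal Knipling-type difference-equation sketch (illustrative, not the differential-equation models derived in the paper): with a constant number S of sterile males released each generation, a female mates fertilely with probability N/(N + S), giving an unstable eradication threshold at S/(r - 1).

```python
# Knipling-type sterile release sketch; r and S are assumed values.
r, S = 5.0, 400.0     # per-generation growth factor and sterile release size
N = 80.0              # initial pest density, below the threshold S/(r - 1) = 100

for t in range(1, 21):
    N = r * N * N / (N + S)   # fertile matings occur with probability N/(N + S)
    print(t, round(N, 2))     # declines toward eradication
```

Starting instead above the threshold (for example N = 120), the population escapes control, which is the qualitative failure mode at high densities that the paper's models are designed to refine.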
Dual Roles for Spike Signaling in Cortical Neural Populations
Ballard, Dana H.; Jehee, Janneke F. M.
2011-01-01
A prominent feature of signaling in cortical neurons is that of randomness in the action potential. The output of a typical pyramidal cell can be well fit with a Poisson model, and variations in the Poisson rate repeatedly have been shown to be correlated with stimuli. However, while the rate provides a very useful characterization of neural spike data, it may not be the most fundamental description of the signaling code. Recent data showing γ frequency range multi-cell action potential correlations, together with spike timing dependent plasticity, are spurring a re-examination of the classical model, since precise timing codes imply that the generation of spikes is essentially deterministic. Could the observed Poisson randomness and timing determinism reflect two separate modes of communication, or do they somehow derive from a single process? We investigate in a timing-based model whether the apparent incompatibility between these probabilistic and deterministic observations may be resolved by examining how spikes could be used in the underlying neural circuits. The crucial component of this model draws on dual roles for spike signaling. In learning receptive fields from ensembles of inputs, spikes need to behave probabilistically, whereas for fast signaling of individual stimuli, the spikes need to behave deterministically. Our simulations show that this combination is possible if deterministic signals using γ latency coding are probabilistically routed through different members of a cortical cell population at different times. This model exhibits standard features characteristic of Poisson models such as orientation tuning and exponential interval histograms. In addition, it makes testable predictions that follow from the γ latency coding. PMID:21687798
The Stochastic Modelling of Endemic Diseases
NASA Astrophysics Data System (ADS)
Susvitasari, Kurnia; Siswantining, Titin
2017-01-01
Epidemics have been studied for a long time, but genuine progress was hardly forthcoming until the end of the 19th century (Bailey, 1975). Both deterministic and stochastic models have been used to describe them. Then, from 1927 to 1939, Kermack and McKendrick introduced a generalization of these models, including some variables to consider such as the rates of infection and recovery. The purpose of this project is to investigate the behaviour of the models when we set the basic reproduction number, R0. This quantity is defined as the expected number of contacts made by a typical infective to susceptibles in the population. According to the epidemic threshold theory, when R0 ≤ 1, a minor epidemic occurs with probability one in both approaches, but when R0 > 1, the deterministic and stochastic models have different interpretations. In the deterministic approach, a major epidemic occurs with probability one when R0 > 1, and the model predicts that the disease will settle down to an endemic equilibrium. Stochastic models, on the other hand, indicate that a minor epidemic can still occur; if it does, then the epidemic will die out quickly. Moreover, if we let the population size be large and a major epidemic occurs, then it will take off, reach the endemic level, and move randomly around the deterministic equilibrium.
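An illustrative Gillespie simulation of a simple stochastic SIS model (a sketch, not the project's specific model) shows the behaviour described above: for R0 = beta/gamma > 1 the infected count fluctuates around the deterministic endemic level N(1 - 1/R0), while stochastic extinction remains possible; all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

N, beta, gamma = 1000, 1.5, 1.0   # assumed values, so R0 = 1.5
I, t = 10, 0.0

while t < 100.0 and 0 < I < N:
    infection = beta * I * (N - I) / N     # rate of new infections
    recovery = gamma * I                   # rate of recoveries
    total = infection + recovery
    t += rng.exponential(1.0 / total)      # time to the next event
    I += 1 if rng.random() < infection / total else -1

print("final I:", I)
print("deterministic endemic level:", N * (1.0 - gamma / beta))
```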
NASA Astrophysics Data System (ADS)
Liu, Xiangdong; Li, Qingze; Pan, Jianxin
2018-06-01
Modern medical studies show that chemotherapy can help most cancer patients, especially those diagnosed early, to stabilize their disease conditions from months to years, which means the population of tumor cells remains nearly unchanged for quite a long time after fighting against the immune system and drugs. In order to better understand the dynamics of tumor-immune responses under chemotherapy, deterministic and stochastic differential equation models are constructed to characterize the dynamical change of tumor cells and immune cells in this paper. The basic dynamical properties, such as boundedness and the existence and stability of equilibrium points, are investigated in the deterministic model. Extended stochastic models include a stochastic differential equation (SDE) model and a continuous-time Markov chain (CTMC) model, which account for the variability in cellular reproduction, growth and death, interspecific competitions, and immune response to chemotherapy. The CTMC model is harnessed to estimate the extinction probability of tumor cells. Numerical simulations are performed, which confirm the obtained theoretical results.
Boskova, Veronika; Bonhoeffer, Sebastian; Stadler, Tanja
2014-01-01
Quantifying epidemiological dynamics is crucial for understanding and forecasting the spread of an epidemic. The coalescent and the birth-death model are used interchangeably to infer epidemiological parameters from the genealogical relationships of the pathogen population under study, which in turn are inferred from the pathogen genetic sequencing data. To compare the performance of these widely applied models, we performed a simulation study. We simulated phylogenetic trees under the constant rate birth-death model and the coalescent model with a deterministic exponentially growing infected population. For each tree, we re-estimated the epidemiological parameters using both a birth-death and a coalescent based method, implemented as an MCMC procedure in BEAST v2.0. In our analyses that estimate the growth rate of an epidemic based on simulated birth-death trees, the point estimates such as the maximum a posteriori/maximum likelihood estimates are not very different. However, the estimates of uncertainty are very different. The birth-death model had a higher coverage than the coalescent model, i.e. contained the true value in the highest posterior density (HPD) interval more often (2–13% vs. 31–75% error). The coverage of the coalescent decreases with decreasing basic reproductive ratio and increasing sampling probability of infecteds. We hypothesize that the biases in the coalescent are due to the assumption of deterministic rather than stochastic population size changes. Both methods performed reasonably well when analyzing trees simulated under the coalescent. The methods can also identify other key epidemiological parameters as long as one of the parameters is fixed to its true value. In summary, when using genetic data to estimate epidemic dynamics, our results suggest that the birth-death method will be less sensitive to population fluctuations of early outbreaks than the coalescent method that assumes a deterministic exponentially growing infected population. PMID:25375100
Sanz, Luis; Alonso, Juan Antonio
2017-12-01
In this work we develop approximate aggregation techniques in the context of slow-fast linear population models governed by stochastic differential equations and apply the results to the treatment of populations with spatial heterogeneity. Approximate aggregation techniques allow one to transform a complex system involving many coupled variables, in which there are processes with different time scales, into a simpler reduced model with a smaller number of 'global' variables, in such a way that the dynamics of the former can be approximated by that of the latter. In our model we contemplate a linear fast deterministic process together with a linear slow process in which the parameters are affected by additive noise, and give conditions for the solutions corresponding to positive initial conditions to remain positive for all times. By letting the fast process reach equilibrium we build a reduced system with a smaller number of variables, and provide results relating the asymptotic behaviour of the first- and second-order moments of the population vector for the original and the reduced system. The general technique is illustrated by analysing a multiregional stochastic system in which dispersal is deterministic and the growth rate of the populations in each patch is affected by additive noise.
The Stochastic Multi-strain Dengue Model: Analysis of the Dynamics
NASA Astrophysics Data System (ADS)
Aguiar, Maíra; Stollenwerk, Nico; Kooi, Bob W.
2011-09-01
Dengue dynamics is well known to be particularly complex, with large fluctuations of disease incidence. An epidemic multi-strain model motivated by dengue fever epidemiology shows deterministic chaos in wide parameter regions. The addition of seasonal forcing, mimicking the vectorial dynamics, and of a low import of infected individuals, which is realistic in the dynamics of infectious disease epidemics, produces complex dynamics and qualitatively good agreement between empirical DHF monitoring data and the model simulations. The addition of noise can explain the fluctuations observed in the empirical data and, for large enough population size, the stochastic system can be well described by the deterministic skeleton.
Hybrid deterministic/stochastic simulation of complex biochemical systems.
Lecca, Paola; Bagagiolo, Fabio; Scarpa, Marina
2017-11-21
In a biological cell, cellular functions and the genetic regulatory apparatus are implemented and controlled by complex networks of chemical reactions involving genes, proteins, and enzymes. Accurate computational models are indispensable means for understanding the mechanisms behind the evolution of a complex system, not always explored with wet lab experiments. To serve their purpose, computational models, however, should be able to describe and simulate the complexity of a biological system in many of its aspects. Moreover, they should be implemented by efficient algorithms requiring the shortest possible execution time, to avoid excessively enlarging the time elapsing between data analysis and any subsequent experiment. Besides the features of their topological structure, the complexity of biological networks also refers to their dynamics, which is often non-linear and stiff. The stiffness is due to the presence of molecular species whose abundance fluctuates by many orders of magnitude. A fully stochastic simulation of a stiff system is computationally time-expensive. On the other hand, continuous models are less costly, but they fail to capture the stochastic behaviour of small populations of molecular species. We introduce a new efficient hybrid stochastic-deterministic computational model and the software tool MoBioS (MOlecular Biology Simulator) implementing it. The mathematical model of MoBioS uses continuous differential equations to describe the deterministic reactions and a Gillespie-like algorithm to describe the stochastic ones. Unlike the majority of current hybrid methods, the MoBioS algorithm divides the reactions' set into fast reactions, moderate reactions, and slow reactions and implements a hysteresis switching between the stochastic model and the deterministic model. Fast reactions are approximated as continuous-deterministic processes and modelled by deterministic rate equations. Moderate reactions are those whose reaction waiting time is greater than the fast reaction waiting time but smaller than the slow reaction waiting time. A moderate reaction is approximated as a stochastic (deterministic) process if it was classified as a stochastic (deterministic) process at the time at which it crossed the threshold of low (high) waiting time. A Gillespie First Reaction Method is implemented to select and execute the slow reactions. The performance of MoBioS was tested on a typical example of hybrid dynamics, namely DNA transcription regulation. The simulated dynamic profile of the reagents' abundance and the estimate of the error introduced by the fully deterministic approach were used to evaluate the consistency of the computational model and that of the software tool.
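A toy sketch of the classification-with-hysteresis step described above (illustrative only; the waiting-time thresholds and propensities are assumptions, not MoBioS's actual values or API):

```python
# Classify a reaction by its expected waiting time (~ 1/propensity), keeping the
# previous label inside the moderate band to mimic hysteresis switching.
FAST_WAIT, SLOW_WAIT = 1e-3, 1e-1   # assumed waiting-time thresholds

def classify(propensity, previous_label):
    wait = 1.0 / propensity if propensity > 0 else float("inf")
    if wait < FAST_WAIT:
        return "deterministic"     # fast reaction: continuous rate equations
    if wait > SLOW_WAIT:
        return "stochastic"        # slow reaction: Gillespie First Reaction Method
    return previous_label          # moderate reaction: hysteresis keeps the old label

print(classify(2000.0, "stochastic"))    # fast     -> deterministic
print(classify(5.0, "deterministic"))    # slow     -> stochastic
print(classify(50.0, "deterministic"))   # moderate -> keeps "deterministic"
```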
Psychological effect on single-species population models in a polluted environment.
Wei, Fengying; Chen, Lihong
2017-08-01
We formulate and investigate the psychological effect on single-species population models in a polluted environment in this paper. For the deterministic single-species population model, the conditions that guarantee local extinction and persistence in the mean are derived first. We then show that, around the pollution-free equilibrium, the stochastic single-species population is weakly persistent in the mean, and is stochastically permanent under some conditions. Finally, some numerical simulations demonstrate the efficiency of the main results.
A white-box model of S-shaped and double S-shaped single-species population growth
Kalmykov, Lev V.
2015-01-01
Complex systems may be mechanistically modelled by white-box modeling using logical deterministic individual-based cellular automata. Mathematical models of complex systems are of three types: black-box (phenomenological), white-box (mechanistic, based on first principles) and grey-box (mixtures of phenomenological and mechanistic models). Most basic ecological models are of the black-box type, including the Malthusian, Verhulst and Lotka–Volterra models. In black-box models, the individual-based (mechanistic) mechanisms of population dynamics remain hidden. Here we mechanistically model the S-shaped and double S-shaped population growth of vegetatively propagated rhizomatous lawn grasses. Using purely logical deterministic individual-based cellular automata we create a white-box model. From a general physical standpoint, the vegetative propagation of plants is an analogue of excitation propagation in excitable media. Using the Monte Carlo method, we investigate the role of different initial positionings of an individual in the habitat. We have investigated mechanisms of single-species population growth limited by habitat size, intraspecific competition, regeneration time and fecundity of individuals under two types of boundary conditions and two types of fecundity. Besides that, we have compared the S-shaped and J-shaped population growth. We consider this white-box modeling approach as a method of artificial intelligence which works as automatic hyper-logical inference from the first principles of the studied subject. This approach is promising for direct mechanistic insights into the nature of any complex system. PMID:26038717
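A minimal individual-based cellular-automaton sketch in the same spirit (the lattice size, neighbourhood, update rule and single-founder initial condition are assumptions for illustration, not the published model): each occupied cell attempts to colonise one randomly chosen von Neumann neighbour per step, and habitat size plus intraspecific competition for free cells produce the S-shaped growth curve.

```python
import numpy as np

rng = np.random.default_rng(0)

L, steps = 30, 120
neighbours = ((-1, 0), (1, 0), (0, -1), (0, 1))   # von Neumann neighbourhood
grid = np.zeros((L, L), dtype=bool)
grid[L // 2, L // 2] = True                       # single founder in the habitat centre

for step in range(1, steps + 1):
    for i, j in np.argwhere(grid):
        di, dj = neighbours[rng.integers(4)]
        ni, nj = i + di, j + dj
        if 0 <= ni < L and 0 <= nj < L and not grid[ni, nj]:
            grid[ni, nj] = True                   # vegetative propagation into a free cell
    if step % 20 == 0:
        print(step, int(grid.sum()))              # S-shaped growth, saturating at L*L
```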
Study of selected phenotype switching strategies in time varying environment
NASA Astrophysics Data System (ADS)
Horvath, Denis; Brutovsky, Branislav
2016-03-01
Population heterogeneity plays an important role in many research problems, as well as in real-world ones. Population heterogeneity relates to the ability of a population to cope with an environmental change (or uncertainty), preventing its extinction. However, this ability is not always desirable, as can be exemplified by intratumor heterogeneity, which positively correlates with the development of resistance to therapy. The causation of population heterogeneity is therefore an intensively studied topic in biology and medicine. In this paper the evolution of a specific strategy of population diversification, phenotype switching, is studied at a conceptual level. The presented simulation model studies the evolution of a large population of asexual organisms in a time-varying environment represented by a stochastic Markov process. Each organism is equipped with a stochastic or nonlinear deterministic switching strategy realized by discrete-time models with evolvable parameters. We demonstrate that under rapidly varying exogenous conditions organisms operate in the vicinity of the bet-hedging strategy, while the deterministic patterns become relevant as the environmental variations are less frequent. Statistical characterization of the steady state regimes of the populations is done using the Hellinger and Kullback-Leibler functional distances and the Hamming distance.
Mathematical demography of spotted owls in the Pacific Northwest
B.R. Noon; C.M. Biles
1990-01-01
We examined the mathematical demography of northern spotted owls (Strix occidentalis caurina) using simple deterministic population models. Our goals were to gain insights into the life history strategy, to determine demographic attributes most affecting changes in population size, and to provide guidelines for effective management of spotted owl...
Incorporating GIS and remote sensing for census population disaggregation
NASA Astrophysics Data System (ADS)
Wu, Shuo-Sheng 'Derek'
Census data are the primary source of demographic data for a variety of research and applications. For confidentiality issues and administrative purposes, census data are usually released to the public by aggregated areal units. In the United States, the smallest census unit is the census block. Due to data aggregation, users of census data may have problems in visualizing population distribution within census blocks and estimating population counts for areas not coinciding with census block boundaries. The main purpose of this study is to develop methodology for estimating sub-block areal populations and assessing the estimation errors. The City of Austin, Texas was used as a case study area. Based on tax parcel boundaries and parcel attributes derived from ancillary GIS and remote sensing data, detailed urban land use classes were first classified using a per-field approach. After that, statistical models by land use classes were built to infer population density from other predictor variables, including four census demographic statistics (the Hispanic percentage, the married percentage, the unemployment rate, and per capita income) and three physical variables derived from remote sensing images and building footprints vector data (a landscape heterogeneity statistic, a building pattern statistic, and a building volume statistic). In addition to statistical models, deterministic models were proposed to directly infer populations from building volumes and three housing statistics, including the average space per housing unit, the housing unit occupancy rate, and the average household size. After population models were derived or proposed, how well the models predict populations for another set of sample blocks was assessed. The results show that deterministic models were more accurate than statistical models. Further, by simulating the base unit for modeling from aggregated blocks, I assessed how well the deterministic models estimate sub-unit-level populations. I also assessed the aggregation effects and the rescaling effects on sub-unit estimates. Lastly, from another set of mixed-land-use sample blocks, a mixed-land-use model was derived and compared with a residential-land-use model. The results of per-field land use classification are satisfactory, with a Kappa accuracy statistic of 0.747. Model assessments by land use show that population estimates for multi-family land use areas have higher errors than those for single-family land use areas, and population estimates for mixed land use areas have higher errors than those for residential land use areas. The assessments of sub-unit estimates using a simulation approach indicate that smaller areas show higher estimation errors, estimation errors do not relate to the base unit size, and rescaling improves all levels of sub-unit estimates.
Stochastic von Bertalanffy models, with applications to fish recruitment.
Lv, Qiming; Pitchford, Jonathan W
2007-02-21
We consider three individual-based models describing growth in stochastic environments. Stochastic differential equations (SDEs) with identical von Bertalanffy deterministic parts are formulated, with a stochastic term which decreases, remains constant, or increases with organism size, respectively. Probability density functions for hitting times are evaluated in the context of fish growth and mortality. Solving the hitting time problem analytically or numerically shows that stochasticity can have a large positive impact on fish recruitment probability. It is also demonstrated that the observed mean growth rate of surviving individuals always exceeds the mean population growth rate, which itself exceeds the growth rate of the equivalent deterministic model. The consequences of these results in more general biological situations are discussed.
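A short Euler-Maruyama sketch of one of the three variants described above (the size-independent noise term; parameter values are assumptions, not the fitted values from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)

# Von Bertalanffy SDE, dL = k*(Linf - L) dt + sigma dW, simulated by Euler-Maruyama.
k, Linf, sigma = 0.5, 100.0, 3.0      # assumed growth rate, asymptotic size, noise level
dt, T, n_paths = 0.01, 10.0, 5
n_steps = int(T / dt)

L = np.full(n_paths, 5.0)             # initial size of each individual
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), size=n_paths)
    L += k * (Linf - L) * dt + sigma * dW

print(L)   # stochastic sizes scatter around the deterministic value Linf - (Linf - 5)*exp(-k*T)
```

Adding a size-dependent mortality or recruitment threshold to such paths gives the hitting-time problems whose solutions the paper evaluates analytically and numerically.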
Agent-Based Deterministic Modeling of the Bone Marrow Homeostasis.
Kurhekar, Manish; Deshpande, Umesh
2016-01-01
Modeling of stem cells not only describes but also predicts how a stem cell's environment can control its fate. The first stem cell populations discovered were hematopoietic stem cells (HSCs). In this paper, we present a deterministic model of bone marrow (which hosts HSCs) that is consistent with several of the qualitative biological observations. This model incorporates stem cell death (apoptosis) after a certain number of cell divisions and also demonstrates that a single HSC can potentially populate the entire bone marrow. It also demonstrates that a sufficient number of differentiated cells (RBCs, WBCs, etc.) is produced. We prove that our model of bone marrow is biologically consistent and that it overcomes the biological feasibility limitations of previously reported models. The major contribution of our model is the flexibility it allows in choosing model parameters, which permits several different simulations to be carried out in silico without affecting the homeostatic properties of the model. We have also performed an agent-based simulation of the model of the bone marrow system proposed in this paper. We have also included parameter details and the results obtained from the simulation. The program of the agent-based simulation of the proposed model is made available on a publicly accessible website.
Mathematical analysis of epidemiological models with heterogeneity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Ark, J.W.
1992-01-01
For many diseases in human populations the disease shows dissimilar characteristics in separate subgroups of the population; for example, the probability of disease transmission for gonorrhea or AIDS is much higher from male to female than from female to male. There is reason to construct and analyze epidemiological models which allow this heterogeneity of population, and to use these models to run computer simulations of the disease to predict the incidence and prevalence of the disease. In the models considered here the heterogeneous population is separated into subpopulations whose internal and external interactions are homogeneous in the sense that each person in the population can be assumed to have the average behavior of the people of that subpopulation. The first model considered is an SIRS model; i.e., the Susceptible can become Infected, and if so he eventually Recovers with temporary immunity, and after a period of time becomes Susceptible again. Special cases allow for permanent immunity or other variations. This model is analyzed and threshold conditions are given which determine whether the disease dies out or persists. A deterministic model is presented; this model is constructed using difference equations, and it has been used in computer simulations of the AIDS epidemic in the homosexual population in San Francisco. The homogeneous and heterogeneous versions of the differential-equation and difference-equation forms of the deterministic model are analyzed mathematically. In the analysis, equilibria are identified and threshold conditions are set forth: below the threshold the disease dies out and the disease-free equilibrium is globally asymptotically stable; above the threshold the disease persists, the disease-free equilibrium is unstable, and there is a unique endemic equilibrium.
Synchrony and entrainment properties of robust circadian oscillators
Bagheri, Neda; Taylor, Stephanie R.; Meeker, Kirsten; Petzold, Linda R.; Doyle, Francis J.
2008-01-01
Systems theoretic tools (i.e. mathematical modelling, control, and feedback design) advance the understanding of robust performance in complex biological networks. We highlight phase entrainment as a key performance measure used to investigate dynamics of a single deterministic circadian oscillator for the purpose of generating insight into the behaviour of a population of (synchronized) oscillators. More specifically, the analysis of phase characteristics may facilitate the identification of appropriate coupling mechanisms for the ensemble of noisy (stochastic) circadian clocks. Phase also serves as a critical control objective to correct mismatch between the biological clock and its environment. Thus, we introduce methods of investigating synchrony and entrainment in both stochastic and deterministic frameworks, and as a property of a single oscillator or population of coupled oscillators. PMID:18426774
Acceptability of the Kalman filter to monitor pronghorn population size
Raymond L. Czaplewski
1986-01-01
Pronghorn antelope are important components of grassland and steppe ecosystems in Wyoming. Monitoring data on the size and population dynamics of these herds are expensive and gathered only a few times each year. Reliable data include estimates of animals harvested and proportion of bucks, does, and fawns. A deterministic simulation model has been used to improve...
A stochastic chemostat model with an inhibitor and noise independent of population sizes
NASA Astrophysics Data System (ADS)
Sun, Shulin; Zhang, Xiaolu
2018-02-01
In this paper, a stochastic chemostat model with an inhibitor is considered; here the inhibitor is input from an external source and two organisms in the chemostat compete for a nutrient. Firstly, we show that the system has a unique global positive solution. Secondly, by constructing some suitable Lyapunov functions, we show that the time average of the second moment of the solutions of the stochastic model is bounded for relatively small noise. That is, the asymptotic behaviors of the stochastic system around the equilibrium points of the deterministic system are studied. However, sufficiently large noise can make the microorganisms become extinct with probability one, although the solutions of the original deterministic model may be persistent. Finally, the obtained analytical results are illustrated by computer simulations.
Flint, Paul L.; Grand, James B.; Petersen, Margaret; Rockwell, Robert F.
2016-01-01
Spectacled eider Somateria fischeri numbers have declined and they are considered threatened in accordance with the US Endangered Species Act throughout their range. We synthesized the available information for spectacled eiders to construct deterministic, stochastic, and metapopulation models for this species that incorporated current estimates of vital rates such as nest success, adult survival, and the impact of lead poisoning on survival. Elasticities of our deterministic models suggested that the populations would respond most dramatically to changes in adult female survival and that the reductions in adult female survival related to lead poisoning were locally important. We also examined the sensitivity of the population to changes in lead exposure rates. With the knowledge that some vital rates vary with environmental conditions, we cast stochastic models that mimicked observed variation in productivity. We also used the stochastic model to examine the probability that a specific population will persist for periods of up to 50 y. Elasticity analysis of these models was consistent with that for the deterministic models, with perturbations to adult female survival having the greatest effect on population projections. When used in single population models, demographic data for some localities predicted rapid declines that were inconsistent with our observations in the field. Thus, we constructed a metapopulation model and examined the predictions for local subpopulations and the metapopulation over a wide range of dispersal rates. Using the metapopulation model, we were able to simulate the observed stability of local subpopulations as well as that of the metapopulation. Finally, we developed a global metapopulation model that simulates periodic winter habitat limitation, similar to that which might be experienced in years of heavy sea ice in the core wintering area of spectacled eiders in the central Bering Sea. Our metapopulation analyses suggested that no subpopulation is independent and that future management actions may be improved through a metapopulation framework. For example, management actions could include displacement of breeding females from "sink" areas that reduce the growth potential of the population as a whole. However, this action is contingent upon dispersal among local populations, for which there is limited information. Thus, we recommend that researchers examine dispersal behavior among areas on the Yukon-Kuskokwim Delta in western Alaska. The metapopulation framework could also be applied at the rangewide scale to address the density-dependent limitation of available polynya habitat during winter that may limit the recovery of small subpopulations, such as that on the Yukon-Kuskokwim Delta. Reductions in other subpopulations may be necessary to ensure an increase in the Yukon-Kuskokwim Delta population. Thus, we recommend that managers consider the interpopulation dynamics of spectacled eiders at different spatial scales in future management actions.
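The elasticity analysis referred to above follows the standard matrix-model recipe; the sketch below uses a generic three-stage matrix with placeholder vital rates (not the spectacled eider estimates) to show how the dominant eigenvalue lambda and the elasticities are obtained.

```python
import numpy as np

# Stage-structured matrix A with assumed fecundities (top row) and survival/
# transition rates; lambda is the dominant eigenvalue, and the elasticities
# e_ij = (a_ij / lambda) * sensitivity_ij sum to one.
A = np.array([[0.0, 0.4, 0.9],
              [0.5, 0.0, 0.0],
              [0.0, 0.7, 0.85]])

vals, vecs = np.linalg.eig(A)
i = np.argmax(vals.real)
lam = vals[i].real
w = np.abs(vecs[:, i].real)                     # stable stage distribution (right eigenvector)

valsT, vecsT = np.linalg.eig(A.T)
j = np.argmin(np.abs(valsT - lam))
v = np.abs(vecsT[:, j].real)                    # reproductive values (left eigenvector)

S = np.outer(v, w) / (v @ w)                    # sensitivities d(lambda)/d(a_ij)
E = A * S / lam                                 # elasticities
print(round(lam, 3))
print(E.round(3))                               # adult survival typically dominates
```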
Chao, Lin; Rang, Camilla Ulla; Proenca, Audrey Menegaz; Chao, Jasper Ubirajara
2016-01-01
Non-genetic phenotypic variation is common in biological organisms. The variation is potentially beneficial if the environment is changing. If the benefit is large, selection can favor the evolution of genetic assimilation, the process by which the expression of a trait is transferred from environmental to genetic control. Genetic assimilation is an important evolutionary transition, but it is poorly understood because the fitness costs and benefits of variation are often unknown. Here we show that the partitioning of damage by a mother bacterium to its two daughters can evolve through genetic assimilation. Bacterial phenotypes are also highly variable. Because gene-regulating elements can have low copy numbers, the variation is attributed to stochastic sampling. Extant Escherichia coli partition asymmetrically and deterministically more damage to the old daughter, the one receiving the mother’s old pole. By modeling in silico damage partitioning in a population, we show that deterministic asymmetry is advantageous because it increases fitness variance and hence the efficiency of natural selection. However, we find that symmetrical but stochastic partitioning can be similarly beneficial. To examine why bacteria evolved deterministic asymmetry, we modeled the effect of damage anchored to the mother’s old pole. While anchored damage strengthens selection for asymmetry by creating additional fitness variance, it has the opposite effect on symmetry. The difference results because anchored damage reinforces the polarization of partitioning in asymmetric bacteria. In symmetric bacteria, it dilutes the polarization. Thus, stochasticity alone may have protected early bacteria from damage, but deterministic asymmetry has evolved to be equally important in extant bacteria. We estimate that 47% of damage partitioning is deterministic in E. coli. We suggest that the evolution of deterministic asymmetry from stochasticity offers an example of Waddington’s genetic assimilation. Our model is able to quantify the evolution of the assimilation because it characterizes the fitness consequences of variation. PMID:26761487
Modeling a SI epidemic with stochastic transmission: hyperbolic incidence rate.
Christen, Alejandra; Maulén-Yañez, M Angélica; González-Olivares, Eduardo; Curé, Michel
2018-03-01
In this paper a stochastic susceptible-infectious (SI) epidemic model is analysed, which is based on the model proposed by Roberts and Saha (Appl Math Lett 12: 37-41, 1999), considering a hyperbolic-type nonlinear incidence rate. Assuming the proportion of infected population varies with time, our new model is described by an ordinary differential equation, which is analogous to the equation that describes the double Allee effect. The limit of the solution of this equation (deterministic model) is found when time tends to infinity. Then, the asymptotic behaviour of a stochastic fluctuation due to the environmental variation in the coefficient of disease transmission is studied. Thus a stochastic differential equation (SDE) is obtained and the existence of a unique solution is proved. Moreover, the SDE is analysed through the associated Fokker-Planck equation to obtain the invariant measure when the proportion of the infected population reaches steady state. An explicit expression for the invariant measure is found and we study some of its properties. The long time behaviour of the deterministic and stochastic models is compared by simulations. To our knowledge, this incidence rate has not previously been used for this type of epidemic model.
Stochastic oscillations in models of epidemics on a network of cities
NASA Astrophysics Data System (ADS)
Rozhnova, G.; Nunes, A.; McKane, A. J.
2011-11-01
We carry out an analytic investigation of stochastic oscillations in a susceptible-infected-recovered model of disease spread on a network of n cities. In the model a fraction fjk of individuals from city k commute to city j, where they may infect, or be infected by, others. Starting from a continuous-time Markov description of the model, the deterministic equations, which are valid in the limit when the population of each city is infinite, are recovered. The stochastic fluctuations about the fixed point of these equations are derived by use of the van Kampen system-size expansion. The fixed point structure of the deterministic equations is remarkably simple: A unique nontrivial fixed point always exists and has the feature that the fraction of susceptible, infected, and recovered individuals is the same for each city irrespective of its size. We find that the stochastic fluctuations have an analogously simple dynamics: All oscillations have a single frequency, equal to that found in the one-city case. We interpret this phenomenon in terms of the properties of the spectrum of the matrix of the linear approximation of the deterministic equations at the fixed point.
Evolution with Stochastic Fitness and Stochastic Migration
Rice, Sean H.; Papadopoulos, Anthony
2009-01-01
Background: Migration between local populations plays an important role in evolution - influencing local adaptation, speciation, extinction, and the maintenance of genetic variation. Like other evolutionary mechanisms, migration is a stochastic process, involving both random and deterministic elements. Many models of evolution have incorporated migration, but these have all been based on simplifying assumptions, such as low migration rate, weak selection, or large population size. We thus have no truly general and exact mathematical description of evolution that incorporates migration. Methodology/Principal Findings: We derive an exact equation for directional evolution, essentially a stochastic Price equation with migration, that encompasses all processes, both deterministic and stochastic, contributing to directional change in an open population. Using this result, we show that increasing the variance in migration rates reduces the impact of migration relative to selection. This means that models that treat migration as a single parameter tend to be biassed - overestimating the relative impact of immigration. We further show that selection and migration interact in complex ways, one result being that a strategy for which fitness is negatively correlated with migration rates (high fitness when migration is low) will tend to increase in frequency, even if it has lower mean fitness than do other strategies. Finally, we derive an equation for the effective migration rate, which allows some of the complex stochastic processes that we identify to be incorporated into models with a single migration parameter. Conclusions/Significance: As has previously been shown with selection, the role of migration in evolution is determined by the entire distributions of immigration and emigration rates, not just by the mean values. The interactions of stochastic migration with stochastic selection produce evolutionary processes that are invisible to deterministic evolutionary theory. PMID:19816580
The critical domain size of stochastic population models.
Reimer, Jody R; Bonsall, Michael B; Maini, Philip K
2017-02-01
Identifying the critical domain size necessary for a population to persist is an important question in ecology. Both demographic and environmental stochasticity impact a population's ability to persist. Here we explore ways of including this variability. We study populations with distinct dispersal and sedentary stages, which have traditionally been modelled using a deterministic integrodifference equation (IDE) framework. Individual-based models (IBMs) are the most intuitive stochastic analogues to IDEs but yield few analytic insights. We explore two alternative approaches: one is a scaling up to the population level using the Central Limit Theorem, and the other is a variation on both Galton-Watson branching processes and branching processes in random environments. These branching process models closely approximate the IBM and yield insight into the factors determining the critical domain size for a given population subject to stochasticity.
Manchanda, Ranjit; Legood, Rosa; Burnell, Matthew; McGuire, Alistair; Raikou, Maria; Loggenberg, Kelly; Wardle, Jane; Sanderson, Saskia; Gessler, Sue; Side, Lucy; Balogun, Nyala; Desai, Rakshit; Kumar, Ajith; Dorkins, Huw; Wallis, Yvonne; Chapman, Cyril; Taylor, Rohan; Jacobs, Chris; Tomlinson, Ian; Beller, Uziel; Menon, Usha
2015-01-01
Background: Population-based testing for BRCA1/2 mutations detects the high proportion of carriers not identified by cancer family history (FH)–based testing. We compared the cost-effectiveness of population-based BRCA testing with the standard FH-based approach in Ashkenazi Jewish (AJ) women. Methods: A decision-analytic model was developed to compare the lifetime costs and effects of BRCA founder-mutation testing amongst AJ women in the UK in two groups: 1) all women in the population aged 30 years or older and 2) just those with a strong FH (≥10% mutation risk). The model assumes that BRCA carriers are offered risk-reducing salpingo-oophorectomy and annual MRI/mammography screening or risk-reducing mastectomy. Model probabilities utilize the Genetic Cancer Prediction through Population Screening trial/published literature to estimate total costs, effects in terms of quality-adjusted life-years (QALYs), cancer incidence, incremental cost-effectiveness ratio (ICER), and population impact. Costs are reported at 2010 prices. Costs/outcomes were discounted at 3.5%. We used deterministic/probabilistic sensitivity analysis (PSA) to evaluate model uncertainty. Results: Compared with FH-based testing, population screening saved 0.090 more life-years and 0.101 more QALYs, resulting in a 33-day gain in life expectancy. Population screening was found to be cost saving with a baseline-discounted ICER of -£2079/QALY. Population-based screening lowered ovarian and breast cancer incidence by 0.34% and 0.62%. Assuming 71% testing uptake, this leads to 276 fewer ovarian and 508 fewer breast cancer cases. Overall, the reduction in treatment costs led to a discounted cost saving of £3.7 million. Deterministic sensitivity analysis and 94% of simulations on PSA (£20,000 threshold) indicated that population screening is cost-effective compared with current NHS policy. Conclusion: Population-based screening for BRCA mutations is highly cost-effective compared with an FH-based approach in AJ women aged 30 years and older. PMID:25435542
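The ICER reported above is simply the incremental cost divided by the incremental effect. A minimal sketch of that arithmetic follows; the incremental QALY figure is taken from the abstract, while the incremental cost is a back-calculated illustrative value, not an output of the decision-analytic model.

```python
# Minimal ICER sketch; delta_cost is an assumed illustrative value consistent with the
# reported -2079 GBP/QALY, not a model output.
delta_cost = -210.0      # incremental cost per woman (GBP); negative = cost saving (assumed)
delta_qaly = 0.101       # incremental QALYs reported for population vs FH-based testing

icer = delta_cost / delta_qaly
threshold = 20000.0      # willingness-to-pay threshold (GBP/QALY) used in the PSA
# A negative ICER with a positive QALY gain means the strategy dominates (cheaper and better).
print(f"ICER = {icer:.0f} GBP/QALY; below threshold: {icer < threshold}")
```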
Effects of Noise on Ecological Invasion Processes: Bacteriophage-mediated Competition in Bacteria
NASA Astrophysics Data System (ADS)
Joo, Jaewook; Eric, Harvill; Albert, Reka
2007-03-01
Pathogen-mediated competition, through which an invasive species carrying and transmitting a pathogen can be a superior competitor to a more vulnerable resident species, is one of the principal driving forces influencing biodiversity in nature. Using an experimental system of bacteriophage-mediated competition in bacterial populations and a deterministic model, we have shown in [Joo et al 2005] that the competitive advantage conferred by the phage depends only on the relative phage pathology and is independent of the initial phage concentration and other phage and host parameters such as the infection-causing contact rate, the spontaneous and infection-induced lysis rates, and the phage burst size. Here we investigate the effects of stochastic fluctuations on bacterial invasion facilitated by bacteriophage, and examine the validity of the deterministic approach. We use both numerical and analytical methods of stochastic processes to identify the source of noise and assess its magnitude. We show that the conclusions obtained from the deterministic model are robust against stochastic fluctuations, yet deviations become pronounced when the phage are more pathological to the invading bacterial strain.
Stochastic foundations in nonlinear density-regulation growth
NASA Astrophysics Data System (ADS)
Méndez, Vicenç; Assaf, Michael; Horsthemke, Werner; Campos, Daniel
2017-08-01
In this work we construct individual-based models that give rise to the generalized logistic model at the mean-field deterministic level and that allow us to interpret the parameters of these models in terms of individual interactions. We also study the effect of internal fluctuations on the long-time dynamics for the different models that have been widely used in the literature, such as the theta-logistic and Savageau models. In particular, we determine the conditions for population extinction and calculate the mean time to extinction. If the population does not become extinct, we obtain analytical expressions for the population abundance distribution. Our theoretical results are based on WKB theory and the probability generating function formalism and are verified by numerical simulations.
Climate change threatens polar bear populations: a stochastic demographic analysis.
Hunter, Christine M; Caswell, Hal; Runge, Michael C; Regehr, Eric V; Amstrup, Steve C; Stirling, Ian
2010-10-01
The polar bear (Ursus maritimus) depends on sea ice for feeding, breeding, and movement. Significant reductions in Arctic sea ice are forecast to continue because of climate warming. We evaluated the impacts of climate change on polar bears in the southern Beaufort Sea by means of a demographic analysis, combining deterministic, stochastic, environment-dependent matrix population models with forecasts of future sea ice conditions from IPCC general circulation models (GCMs). The matrix population models classified individuals by age and breeding status; mothers and dependent cubs were treated as units. Parameter estimates were obtained from a capture-recapture study conducted from 2001 to 2006. Candidate statistical models allowed vital rates to vary with time and as functions of a sea ice covariate. Model averaging was used to produce the vital rate estimates, and a parametric bootstrap procedure was used to quantify model selection and parameter estimation uncertainty. Deterministic models projected population growth in years with more extensive ice coverage (2001-2003) and population decline in years with less ice coverage (2004-2005). LTRE (life table response experiment) analysis showed that the reduction in lambda in years with low sea ice was due primarily to reduced adult female survival, and secondarily to reduced breeding. A stochastic model with two environmental states, good and poor sea ice conditions, projected a declining stochastic growth rate, log lambdas, as the frequency of poor ice years increased. The observed frequency of poor ice years since 1979 would imply log lambdas approximately - 0.01, which agrees with available (albeit crude) observations of population size. The stochastic model was linked to a set of 10 GCMs compiled by the IPCC; the models were chosen for their ability to reproduce historical observations of sea ice and were forced with "business as usual" (A1B) greenhouse gas emissions. The resulting stochastic population projections showed drastic declines in the polar bear population by the end of the 21st century. These projections were instrumental in the decision to list the polar bear as a threatened species under the U.S. Endangered Species Act.
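A hedged illustration of the stochastic-growth-rate idea used above: with two environmental states (good and poor ice years) and invented annual growth multipliers (the paper's values come from stage-structured matrix models, not from scalars like these), log lambda_s can be estimated as the long-run average of log annual growth, and it declines as poor ice years become more frequent.

```python
# Illustrative two-state stochastic growth rate; the multipliers are assumptions.
import numpy as np

lam_good, lam_poor = 1.05, 0.90     # assumed annual growth multipliers
rng = np.random.default_rng(1)

def stochastic_growth_rate(p_poor, years=100_000):
    poor = rng.random(years) < p_poor
    lam = np.where(poor, lam_poor, lam_good)
    return np.mean(np.log(lam))     # long-run log growth rate

for p in (0.2, 0.4, 0.6):
    print(f"poor-ice frequency {p:.1f}: log lambda_s = {stochastic_growth_rate(p):+.4f}")
```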
Climate change threatens polar bear populations: A stochastic demographic analysis
Hunter, C.M.; Caswell, H.; Runge, M.C.; Regehr, E.V.; Amstrup, Steven C.; Stirling, I.
2010-01-01
The polar bear (Ursus maritimus) depends on sea ice for feeding, breeding, and movement. Significant reductions in Arctic sea ice are forecast to continue because of climate warming. We evaluated the impacts of climate change on polar bears in the southern Beaufort Sea by means of a demographic analysis, combining deterministic, stochastic, environment-dependent matrix population models with forecasts of future sea ice conditions from IPCC general circulation models (GCMs). The matrix population models classified individuals by age and breeding status; mothers and dependent cubs were treated as units. Parameter estimates were obtained from a capture-recapture study conducted from 2001 to 2006. Candidate statistical models allowed vital rates to vary with time and as functions of a sea ice covariate. Model averaging was used to produce the vital rate estimates, and a parametric bootstrap procedure was used to quantify model selection and parameter estimation uncertainty. Deterministic models projected population growth in years with more extensive ice coverage (2001-2003) and population decline in years with less ice coverage (2004-2005). LTRE (life table response experiment) analysis showed that the reduction in λ in years with low sea ice was due primarily to reduced adult female survival, and secondarily to reduced breeding. A stochastic model with two environmental states, good and poor sea ice conditions, projected a declining stochastic growth rate, log λs, as the frequency of poor ice years increased. The observed frequency of poor ice years since 1979 would imply log λs ≈ -0.01, which agrees with available (albeit crude) observations of population size. The stochastic model was linked to a set of 10 GCMs compiled by the IPCC; the models were chosen for their ability to reproduce historical observations of sea ice and were forced with "business as usual" (A1B) greenhouse gas emissions. The resulting stochastic population projections showed drastic declines in the polar bear population by the end of the 21st century. These projections were instrumental in the decision to list the polar bear as a threatened species under the U.S. Endangered Species Act. © 2010 by the Ecological Society of America.
Did the ever dead outnumber the living and when? A birth-and-death approach
NASA Astrophysics Data System (ADS)
Avan, Jean; Grosjean, Nicolas; Huillet, Thierry
2015-02-01
This paper is an attempt to formalize analytically the question raised in 'World Population Explained: Do Dead People Outnumber Living, Or Vice Versa?' Huffington Post, Howard (2012). We start by developing simple deterministic Malthusian growth models of the problem (with birth and death rates either constant or time-dependent) before turning to both linear birth-and-death Markov chain models and age-structured models.
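For the constant-rate deterministic case described above, the comparison can be written down directly: with per-capita birth rate b and death rate d (b > d), the living number is N(t) = N0*exp((b-d)*t) and the ever-dead count is the integral of d*N(s). A small sketch with assumed illustrative rates (not the paper's demographic estimates):

```python
# Deterministic Malthusian sketch of "ever dead vs living"; rates are illustrative.
import numpy as np

b, d, N0 = 0.03, 0.02, 1.0e6        # assumed per-capita rates (per year) and initial size
r = b - d

def living(t):
    return N0 * np.exp(r * t)

def ever_dead(t):
    return d * N0 * (np.exp(r * t) - 1.0) / r   # integral of d*N(s) from 0 to t

for t in (0.0, 50.0, 200.0, 1000.0):
    print(f"t={t:6.0f}  dead/living = {ever_dead(t) / living(t):.3f}")
# As t -> infinity the ratio tends to d/(b-d); the ever dead outnumber the living
# in the long run only if b < 2*d.
```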
Implementation speed of deterministic population passages compared to that of Rabi pulses
NASA Astrophysics Data System (ADS)
Chen, Jingwei; Wei, L. F.
2015-02-01
The fast Rabi π-pulse technique has been widely applied to various coherent quantum manipulations, although it requires precise designs of the pulse areas. Relaxing the need for precise pulse designs, various rapid adiabatic passage (RAP) approaches have instead been utilized to implement population passages deterministically. However, the usual RAP protocol cannot be implemented desirably fast, as the relevant adiabatic condition must be robustly satisfied during the passage. Here, we propose a modified shortcut-to-adiabaticity (STA) technique to significantly accelerate the desired deterministic quantum state population passages. This transitionless technique goes beyond the usual rotating wave approximation (RWA) performed in recent STA protocols, and thus can be applied to deliver various fast quantum evolutions in which the relevant counter-rotating effects cannot be neglected. The proposal is demonstrated specifically with driven two- and three-level systems. Numerical results show that with the present STA technique beyond the RWA the usual Stark-chirped RAPs and stimulated Raman adiabatic passages can be significantly sped up; the deterministic population passages can be implemented as fast as the widely used fast Rabi π pulses, but are insensitive to the applied pulse areas.
The role of population inertia in predicting the outcome of stage-structured biological invasions.
Guiver, Chris; Dreiwi, Hanan; Filannino, Donna-Maria; Hodgson, Dave; Lloyd, Stephanie; Townley, Stuart
2015-07-01
Deterministic dynamic models for coupled resident and invader populations are considered with the purpose of finding quantities that are effective at predicting when the invasive population will become established asymptotically. A key feature of the models considered is the stage structure, meaning that the populations are described by vectors of discrete developmental stage- or age-classes. The vector structure permits exotic transient behaviour, phenomena not encountered in scalar models. Analysis using a linear Lyapunov function demonstrates that for the class of population models considered, a large so-called population inertia is indicative of successful invasion. Population inertia is an indicator of transient growth or decline. Furthermore, for the class of models considered, we find that the so-called invasion exponent, an existing index used in models for invasion, is not always a reliable comparative indicator of successful invasion. We highlight these findings through numerical examples, and a biological interpretation of why this might be the case is discussed. Copyright © 2015. Published by Elsevier Inc.
Stochastic models for regulatory networks of the genetic toggle switch.
Tian, Tianhai; Burrage, Kevin
2006-05-30
Bistability arises within a wide range of biological systems from the lambda phage switch in bacteria to cellular signal transduction pathways in mammalian cells. Changes in regulatory mechanisms may result in genetic switching in a bistable system. Recently, more and more experimental evidence in the form of bimodal population distributions indicates that noise plays a very important role in the switching of bistable systems. Although deterministic models have been used for studying the existence of bistability properties under various system conditions, these models cannot realize cell-to-cell fluctuations in genetic switching. However, there is a lag in the development of stochastic models for studying the impact of noise in bistable systems because of the lack of detailed knowledge of biochemical reactions, kinetic rates, and molecular numbers. In this work, we develop a previously undescribed general technique for developing quantitative stochastic models for large-scale genetic regulatory networks by introducing Poisson random variables into deterministic models described by ordinary differential equations. Two stochastic models have been proposed for the genetic toggle switch interfaced with either the SOS signaling pathway or a quorum-sensing signaling pathway, and we have successfully realized experimental results showing bimodal population distributions. Because the introduced stochastic models are based on widely used ordinary differential equation models, the success of this work suggests that this approach is a very promising one for studying noise in large-scale genetic regulatory networks.
Stochastic models for regulatory networks of the genetic toggle switch
Tian, Tianhai; Burrage, Kevin
2006-01-01
Bistability arises within a wide range of biological systems from the λ phage switch in bacteria to cellular signal transduction pathways in mammalian cells. Changes in regulatory mechanisms may result in genetic switching in a bistable system. Recently, more and more experimental evidence in the form of bimodal population distributions indicates that noise plays a very important role in the switching of bistable systems. Although deterministic models have been used for studying the existence of bistability properties under various system conditions, these models cannot realize cell-to-cell fluctuations in genetic switching. However, there is a lag in the development of stochastic models for studying the impact of noise in bistable systems because of the lack of detailed knowledge of biochemical reactions, kinetic rates, and molecular numbers. In this work, we develop a previously undescribed general technique for developing quantitative stochastic models for large-scale genetic regulatory networks by introducing Poisson random variables into deterministic models described by ordinary differential equations. Two stochastic models have been proposed for the genetic toggle switch interfaced with either the SOS signaling pathway or a quorum-sensing signaling pathway, and we have successfully realized experimental results showing bimodal population distributions. Because the introduced stochastic models are based on widely used ordinary differential equation models, the success of this work suggests that this approach is a very promising one for studying noise in large-scale genetic regulatory networks. PMID:16714385
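A hedged sketch of the general technique described in the two records above, namely replacing the deterministic rates of an ODE by Poisson random increments over small time steps. A single self-repressing gene with invented kinetic constants is used purely for illustration; the paper's models involve a two-gene toggle switch coupled to signalling pathways.

```python
# Poisson-randomised update of an ODE dx/dt = production(x) - degradation(x).
# Kinetic constants are illustrative assumptions.
import numpy as np

alpha, K, n, delta = 50.0, 20.0, 2.0, 1.0    # assumed production, repression, Hill, decay
dt, steps = 0.01, 20_000
rng = np.random.default_rng(2)

def production(x):
    return alpha / (1.0 + (x / K) ** n)       # Hill-type self-repression

x = 0.0
trajectory = np.empty(steps)
for t in range(steps):
    # replace deterministic increments by Poisson-distributed event counts
    x += rng.poisson(production(x) * dt) - rng.poisson(delta * x * dt)
    x = max(x, 0.0)
    trajectory[t] = x

half = steps // 2
print(f"mean copy number ~ {trajectory[half:].mean():.1f}, variance ~ {trajectory[half:].var():.1f}")
```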
Coevolution Maintains Diversity in the Stochastic "Kill the Winner" Model
NASA Astrophysics Data System (ADS)
Xue, Chi; Goldenfeld, Nigel
2017-12-01
The "kill the winner" hypothesis is an attempt to address the problem of diversity in biology. It argues that host-specific predators control the population of each prey, preventing a winner from emerging and thus maintaining the coexistence of all species in the system. We develop a stochastic model for the kill the winner paradigm and show that the stable coexistence state of the deterministic kill the winner model is destroyed by demographic stochasticity, through a cascade of extinction events. We formulate an individual-level stochastic model in which predator-prey coevolution promotes the high diversity of the ecosystem by generating a persistent population flux of species.
Front propagation and clustering in the stochastic nonlocal Fisher equation
NASA Astrophysics Data System (ADS)
Ganan, Yehuda A.; Kessler, David A.
2018-04-01
In this work, we study the problem of front propagation and pattern formation in the stochastic nonlocal Fisher equation. We find a crossover between two regimes: a steadily propagating regime for not too large interaction range and a stochastic punctuated spreading regime for larger ranges. We show that the former regime is well described by the heuristic approximation of the system by a deterministic system in which the linear growth term is cut off below some critical density. This deterministic system is seen not only to give the right front velocity, but also to predict the onset of clustering for interaction kernels which give rise to stable uniform states, such as the Gaussian kernel, for sufficiently large cutoff. Above the critical cutoff, distinct clusters emerge behind the front. These same features are present in the stochastic model for sufficiently small carrying capacity. In the latter, punctuated spreading, regime, the population is concentrated on clusters, as in the infinite range case, which divide and separate as a result of the stochastic noise. Due to the finite interaction range, if a fragment at the edge of the population separates sufficiently far, it stabilizes as a new cluster, and the process begins anew. The deterministic cutoff model does not have this spreading for large interaction ranges, attesting to its purely stochastic origins. We show that this mode of spreading has an exponentially small mean spreading velocity, decaying with the range of the interaction kernel.
Front propagation and clustering in the stochastic nonlocal Fisher equation.
Ganan, Yehuda A; Kessler, David A
2018-04-01
In this work, we study the problem of front propagation and pattern formation in the stochastic nonlocal Fisher equation. We find a crossover between two regimes: a steadily propagating regime for not too large interaction range and a stochastic punctuated spreading regime for larger ranges. We show that the former regime is well described by the heuristic approximation of the system by a deterministic system in which the linear growth term is cut off below some critical density. This deterministic system is seen not only to give the right front velocity, but also to predict the onset of clustering for interaction kernels which give rise to stable uniform states, such as the Gaussian kernel, for sufficiently large cutoff. Above the critical cutoff, distinct clusters emerge behind the front. These same features are present in the stochastic model for sufficiently small carrying capacity. In the latter, punctuated spreading, regime, the population is concentrated on clusters, as in the infinite range case, which divide and separate as a result of the stochastic noise. Due to the finite interaction range, if a fragment at the edge of the population separates sufficiently far, it stabilizes as a new cluster, and the process begins anew. The deterministic cutoff model does not have this spreading for large interaction ranges, attesting to its purely stochastic origins. We show that this mode of spreading has an exponentially small mean spreading velocity, decaying with the range of the interaction kernel.
Probabilistic population projections with migration uncertainty
Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.
2016-01-01
We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571
Vanwonterghem, Inka; Jensen, Paul D; Dennis, Paul G; Hugenholtz, Philip; Rabaey, Korneel; Tyson, Gene W
2014-01-01
A replicate long-term experiment was conducted using anaerobic digestion (AD) as a model process to determine the relative role of niche and neutral theory on microbial community assembly, and to link community dynamics to system performance. AD is performed by a complex network of microorganisms and process stability relies entirely on the synergistic interactions between populations belonging to different functional guilds. In this study, three independent replicate anaerobic digesters were seeded with the same diverse inoculum, supplied with a model substrate, α-cellulose, and operated for 362 days at a 10-day hydraulic residence time under mesophilic conditions. Selective pressure imposed by the operational conditions and model substrate caused large reproducible changes in community composition including an overall decrease in richness in the first month of operation, followed by synchronised population dynamics that correlated with changes in reactor performance. This included the synchronised emergence and decline of distinct Ruminococcus phylotypes at day 148, and emergence of a Clostridium and Methanosaeta phylotype at day 178, when performance became stable in all reactors. These data suggest that many dynamic functional niches are predictably filled by phylogenetically coherent populations over long time scales. Neutral theory would predict that a complex community with a high degree of recognised functional redundancy would lead to stochastic changes in populations and community divergence over time. We conclude that deterministic processes may play a larger role in microbial community dynamics than currently appreciated, and under controlled conditions it may be possible to reliably predict community structural and functional changes over time. PMID:24739627
The past, present and future of cyber-physical systems: a focus on models.
Lee, Edward A
2015-02-26
This paper is about better engineering of cyber-physical systems (CPSs) through better models. Deterministic models have historically proven extremely useful and arguably form the kingpin of the industrial revolution and the digital and information technology revolutions. Key deterministic models that have proven successful include differential equations, synchronous digital logic and single-threaded imperative programs. Cyber-physical systems, however, combine these models in such a way that determinism is not preserved. Two projects show that deterministic CPS models with faithful physical realizations are possible and practical. The first project is PRET, which shows that the timing precision of synchronous digital logic can be practically made available at the software level of abstraction. The second project is Ptides (programming temporally-integrated distributed embedded systems), which shows that deterministic models for distributed cyber-physical systems have practical faithful realizations. These projects are existence proofs that deterministic CPS models are possible and practical.
The Past, Present and Future of Cyber-Physical Systems: A Focus on Models
Lee, Edward A.
2015-01-01
This paper is about better engineering of cyber-physical systems (CPSs) through better models. Deterministic models have historically proven extremely useful and arguably form the kingpin of the industrial revolution and the digital and information technology revolutions. Key deterministic models that have proven successful include differential equations, synchronous digital logic and single-threaded imperative programs. Cyber-physical systems, however, combine these models in such a way that determinism is not preserved. Two projects show that deterministic CPS models with faithful physical realizations are possible and practical. The first project is PRET, which shows that the timing precision of synchronous digital logic can be practically made available at the software level of abstraction. The second project is Ptides (programming temporally-integrated distributed embedded systems), which shows that deterministic models for distributed cyber-physical systems have practical faithful realizations. These projects are existence proofs that deterministic CPS models are possible and practical. PMID:25730486
Host mating system and the spread of a disease-resistant allele in a population
DeAngelis, D.L.; Koslow, Jennifer M.; Jiang, J.; Ruan, S.
2008-01-01
The model presented here modifies a susceptible-infected (SI) host-pathogen model to determine the influence of mating system on the outcome of a host-pathogen interaction. Both deterministic and stochastic (individual-based) versions of the model were used. This model considers the potential consequences of varying mating systems on the rate of spread of both the pathogen and resistance alleles within the population. We assumed that a single allele for disease resistance was sufficient to confer complete resistance in an individual, and that both homozygote and heterozygote resistant individuals had the same mean birth and death rates. When disease invaded a population with only an initial small fraction of resistant genes, inbreeding (selfing) tended to increase the probability that the disease would soon be eliminated from a small population rather than become endemic, while outcrossing greatly increased the probability that the population would become extinct due to the disease.
Wu, S.-S.; Wang, L.; Qiu, X.
2008-01-01
This article presents a deterministic model for sub-block-level population estimation based on the total building volumes derived from geographic information system (GIS) building data and three census block-level housing statistics. To assess the model, we generated artificial blocks by aggregating census block areas and calculating the respective housing statistics. We then applied the model to estimate populations for sub-artificial-block areas and assessed the estimates with census populations of the areas. Our analyses indicate that the average percent error of population estimation for sub-artificial-block areas is comparable to those for sub-census-block areas of the same size relative to associated blocks. The smaller the sub-block-level areas, the higher the population estimation errors. For example, the average percent error for residential areas is approximately 0.11 percent for 100 percent block areas and 35 percent for 5 percent block areas.
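The core allocation step described above can be illustrated with a toy calculation: apportion a block's census population to a sub-block area in proportion to its share of residential building volume. The published model additionally uses three block-level housing statistics, which this sketch omits.

```python
# Volume-proportional dasymetric allocation; a simplified stand-in for the published model.
def estimate_subblock_population(block_population, subblock_volume, block_volume):
    """Apportion a census block's population to a sub-block area by building volume share."""
    if block_volume <= 0:
        return 0.0
    return block_population * (subblock_volume / block_volume)

# Example: a block of 480 residents, sub-block containing 35% of the residential volume.
print(estimate_subblock_population(480, 35_000.0, 100_000.0))   # -> 168.0
```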
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okladnikova, N.; Pesternikova, V.; Sumina, M.
1998-12-01
Phase 1 of Project 2.3, a short-term collaborative Feasibility Study, was funded for 12 months starting on 1 February 1996. The overall aim of the study was to determine the practical feasibility of using the dosimetric and clinical data on the MAYAK worker population to study the deterministic effects of exposure to external gamma radiation and to internal alpha radiation from inhaled plutonium. Phase 1 efforts were limited to the period of greatest worker exposure (1948-1954) and focused on collaboratively: assessing the comprehensiveness, availability, quality, and suitability of the Russian clinical and dosimetric data for the study of deterministic effects; creating an electronic data base containing complete clinical and dosimetric data on a small, representative sample of MAYAK workers; developing computer software for the testing of a currently used health risk model of hematopoietic effects; and familiarizing the US team with the Russian diagnostic criteria and techniques used in the identification of Chronic Radiation Sickness.
Modeling seasonal measles transmission in China
NASA Astrophysics Data System (ADS)
Bai, Zhenguo; Liu, Dan
2015-08-01
A discrete-time deterministic measles model with periodic transmission rate is formulated and studied. The basic reproduction number R0 is defined and used as the threshold parameter in determining the dynamics of the model. It is shown that the disease will die out if R0 < 1, and the disease will persist in the population if R0 > 1. Parameters in the model are estimated on the basis of demographic and epidemiological data. Numerical simulations are presented to describe the seasonal fluctuation of measles infection in China.
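A hedged sketch of a discrete-time transmission model with a periodic (seasonal) transmission rate, in the spirit of the record above; the sinusoidal forcing and all parameter values are illustrative assumptions rather than the values fitted to the Chinese measles data, and the iteration below is a plain SIR-type scheme rather than the paper's exact formulation.

```python
# Seasonally forced discrete-time SIR-type iteration; parameters are illustrative.
import numpy as np

N = 1.0e6
mu, gamma = 1.0 / (70 * 365), 1.0 / 14           # birth/death and recovery rates (per day)
beta0, amp, period = 0.45, 0.3, 365.0            # mean transmission, seasonal amplitude, period

def beta(t):
    return beta0 * (1.0 + amp * np.cos(2.0 * np.pi * t / period))

S, I = 0.1 * N, 1.0e-4 * N
for t in range(20 * 365):                        # iterate daily for 20 years
    new_inf = beta(t) * S * I / N
    S = S + mu * N - new_inf - mu * S
    I = I + new_inf - (gamma + mu) * I

print(f"infected fraction after 20 years: {I / N:.2e}")
```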
Deterministic and stochastic CTMC models from Zika disease transmission
NASA Astrophysics Data System (ADS)
Zevika, Mona; Soewono, Edy
2018-03-01
Zika infection is one of the most important mosquito-borne diseases in the world. Zika virus (ZIKV) is transmitted by many Aedes-type mosquitoes including Aedes aegypti. Pregnant women with the Zika virus are at risk of having a fetus or infant with a congenital defect and suffering from microcephaly. Here, we formulate a Zika disease transmission model using two approaches, a deterministic model and a continuous-time Markov chain stochastic model. The basic reproduction ratio is constructed from a deterministic model. Meanwhile, the CTMC stochastic model yields an estimate of the probability of extinction and outbreaks of Zika disease. Dynamical simulations and analysis of the disease transmission are shown for the deterministic and stochastic models.
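One concrete way to see the difference the record above describes: the deterministic model yields R0, while a continuous-time Markov chain (Gillespie) simulation yields an extinction probability, which a branching-process approximation predicts as (1/R0)^I0 when R0 > 1. The sketch below uses a simplified direct-transmission SIR chain with invented rates, not the paper's host-vector Zika model.

```python
# CTMC extinction probability vs branching approximation; a simplified stand-in model.
import numpy as np

beta, gamma, N, I0 = 0.3, 0.1, 1000, 1     # assumed rates; R0 = beta/gamma = 3
rng = np.random.default_rng(3)

def ctmc_goes_extinct(max_events=100_000):
    S, I = N - I0, I0
    for _ in range(max_events):
        if I == 0:
            return True                     # absorbed at the disease-free state
        if I > 50:
            return False                    # a major outbreak is effectively under way
        inf_rate, rec_rate = beta * S * I / N, gamma * I
        if rng.random() * (inf_rate + rec_rate) < inf_rate:
            S, I = S - 1, I + 1             # infection event
        else:
            I -= 1                          # recovery event
    return False

runs = 2000
p_ext = sum(ctmc_goes_extinct() for _ in range(runs)) / runs
print(f"CTMC extinction probability ~ {p_ext:.3f}; branching approximation = {(gamma / beta) ** I0:.3f}")
```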
Sanchez-Palencia, Evariste; Lherminier, Philippe; Françoise, Jean-Pierre
2016-12-01
The present work is a contribution to the understanding of the sempiternal problem of the "burden of factor two" implied by sexual versus asexual reproduction, as males are energy consumers who do not contribute to the production of offspring. We construct a deterministic mathematical model in population dynamics in which a species enjoys both sexual and parthenogenetic capabilities of reproduction and lives on a limited resource. We then show how polygamy implies instability of a parthenogenetic population with a small number of sexually born males. This instability implies evolution of the system towards an attractor involving both (sexual and asexual) populations (which does not imply optimality of the population). We also exhibit the analogy with a parasite/host system.
Stability analysis of multi-group deterministic and stochastic epidemic models with vaccination rate
NASA Astrophysics Data System (ADS)
Wang, Zhi-Gang; Gao, Rui-Mei; Fan, Xiao-Ming; Han, Qi-Xing
2014-09-01
We discuss in this paper a deterministic multi-group MSIR epidemic model with a vaccination rate; the basic reproduction number ℛ0, a key parameter in epidemiology, is a threshold that determines the persistence or extinction of the disease. By using Lyapunov function techniques, we show that if ℛ0 is greater than 1 and the deterministic model obeys some conditions, then the disease will prevail: the infective population persists and the endemic state is asymptotically stable in a feasible region. If ℛ0 is less than or equal to 1, then the infective population disappears and the disease dies out. In addition, stochastic noise around the endemic equilibrium is added to the deterministic MSIR model, so that the deterministic model is extended to a system of stochastic ordinary differential equations. In the stochastic version, we carry out a detailed analysis of the asymptotic behavior of the stochastic model. Regarding the value of ℛ0, when the stochastic system obeys some conditions and ℛ0 is greater than 1, we deduce that the stochastic system is stochastically asymptotically stable. Finally, the deterministic and stochastic model dynamics are illustrated through computer simulations.
Health safety nets can break cycles of poverty and disease: a stochastic ecological model.
Plucinski, Mateusz M; Ngonghala, Calistus N; Bonds, Matthew H
2011-12-07
The persistence of extreme poverty is increasingly attributed to dynamic interactions between biophysical processes and economics, though there remains a dearth of integrated theoretical frameworks that can inform policy. Here, we present a stochastic model of disease-driven poverty traps. Whereas deterministic models can result in poverty traps that can only be broken by substantial external changes to the initial conditions, in the stochastic model there is always some probability that a population will leave or enter a poverty trap. We show that a 'safety net', defined as an externally enforced minimum level of health or economic conditions, can guarantee ultimate escape from a poverty trap, even if the safety net is set within the basin of attraction of the poverty trap, and even if the safety net is only in the form of a public health measure. Whereas the deterministic model implies that small improvements in initial conditions near the poverty-trap equilibrium are futile, the stochastic model suggests that the impact of changes in the location of the safety net on the rate of development may be strongest near the poverty-trap equilibrium.
NASA Astrophysics Data System (ADS)
Scherstjanoi, M.; Kaplan, J. O.; Thürig, E.; Lischke, H.
2013-09-01
Models of vegetation dynamics that are designed for application at spatial scales larger than individual forest gaps suffer from several limitations. Typically, either a population average approximation is used that results in unrealistic tree allometry and forest stand structure, or models have a high computational demand because they need to simulate both a series of age-based cohorts and a number of replicate patches to account for stochastic gap-scale disturbances. The detail required by the latter method increases the number of calculations by two to three orders of magnitude compared to the less realistic population average approach. In an effort to increase the efficiency of dynamic vegetation models without sacrificing realism, we developed a new method for simulating stand-replacing disturbances that is both accurate and faster than approaches that use replicate patches. The GAPPARD (approximating GAP model results with a Probabilistic Approach to account for stand Replacing Disturbances) method works by postprocessing the output of deterministic, undisturbed simulations of a cohort-based vegetation model: it derives the distribution of patch ages at any point in time on the basis of a disturbance probability. With this distribution, the expected value of any output variable can be calculated from the output values of the deterministic undisturbed run at the time corresponding to the patch age. To account for temporal changes in model forcing (e.g., as a result of climate change), GAPPARD performs a series of deterministic simulations and interpolates between the results in the postprocessing step. We integrated the GAPPARD method into the vegetation model LPJ-GUESS and evaluated it in a series of simulations along an altitudinal transect of an inner-Alpine valley. We obtained results very similar to the output of the original LPJ-GUESS model that uses 100 replicate patches, but simulation time was reduced by approximately a factor of 10. Our new method is therefore highly suited for rapidly approximating LPJ-GUESS results; it opens the way to future studies over large spatial domains, easier parameterization of tree species, faster identification of areas with interesting simulation results, and comparisons with large-scale datasets and the results of other forest models.
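A hedged sketch of the postprocessing idea described above: run one deterministic, disturbance-free simulation, then average its output over the patch-age distribution implied by a constant annual disturbance probability (approximately geometric). The biomass curve below is a toy stand-in for actual LPJ-GUESS output.

```python
# GAPPARD-style expectation over a geometric patch-age distribution; toy output curve.
import numpy as np

p = 0.01                                          # assumed annual disturbance probability
ages = np.arange(0, 1000)                         # patch ages (years) covered by the run
biomass = 200.0 * (1.0 - np.exp(-ages / 80.0))    # toy deterministic output vs stand age

age_prob = p * (1.0 - p) ** ages                  # geometric patch-age distribution
age_prob /= age_prob.sum()                        # renormalise over the truncated age range

expected_biomass = np.sum(age_prob * biomass)
print(f"landscape-average biomass ~ {expected_biomass:.1f} (undisturbed asymptote: 200)")
```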
Second Cancers After Fractionated Radiotherapy: Stochastic Population Dynamics Effects
NASA Technical Reports Server (NTRS)
Sachs, Rainer K.; Shuryak, Igor; Brenner, David; Fakir, Hatim; Hahnfeldt, Philip
2007-01-01
When ionizing radiation is used in cancer therapy it can induce second cancers in nearby organs. Mainly due to longer patient survival times, these second cancers have become of increasing concern. Estimating the risk of solid second cancers involves modeling: because of long latency times, available data is usually for older, obsolescent treatment regimens. Moreover, modeling second cancers gives unique insights into human carcinogenesis, since the therapy involves administering well characterized doses of a well studied carcinogen, followed by long-term monitoring. In addition to putative radiation initiation that produces pre-malignant cells, inactivation (i.e. cell killing) and subsequent cell repopulation by proliferation can be important at the doses relevant to second cancer situations. A recent initiation/inactivation/proliferation (IIP) model characterized quantitatively the observed occurrence of second breast and lung cancers, using a deterministic cell population dynamics approach. To analyze whether radiation-initiated pre-malignant clones become extinct before full repopulation can occur, we here give a stochastic version of this IIP model. Combining Monte Carlo simulations with standard solutions for time-inhomogeneous birth-death equations, we show that repeated cycles of inactivation and repopulation, as occur during fractionated radiation therapy, can lead to distributions of pre-malignant cells per patient with variance >> mean, even when pre-malignant clones are Poisson-distributed. Thus fewer patients would be affected, but each with higher probability, than a deterministic model tracking average pre-malignant cell numbers would predict. Our results are applied to data on breast cancers after radiotherapy for Hodgkin disease. The stochastic IIP analysis, unlike the deterministic one, indicates: a) initiated, pre-malignant cells can have a growth advantage during repopulation, not just during the longer tumor latency period that follows; b) weekend treatment gaps during radiotherapy, apart from decreasing the probability of eradicating the primary cancer, substantially increase the risk of later second cancers.
Trichotomous noise controlled signal amplification in a generalized Verhulst model
NASA Astrophysics Data System (ADS)
Mankin, Romi; Soika, Erkki; Lumi, Neeme
2014-10-01
The long-time limit of the probability distribution and statistical moments for a population size are studied by means of a stochastic growth model with generalized Verhulst self-regulation. The effect of variable environment on the carrying capacity of a population is modeled by a multiplicative three-level Markovian noise and by a time periodic deterministic component. Exact expressions for the moments of the population size have been calculated. It is shown that an interplay of a small periodic forcing and colored noise can cause large oscillations of the mean population size. The conditions for the appearance of such a phenomenon are found and illustrated by graphs. Implications of the results on models of symbiotic metapopulations are also discussed. Particularly, it is demonstrated that the effect of noise-generated amplification of an input signal gets more pronounced as the intensity of symbiotic interaction increases.
Stochastic gain in finite populations
NASA Astrophysics Data System (ADS)
Röhl, Torsten; Traulsen, Arne; Claussen, Jens Christian; Schuster, Heinz Georg
2008-08-01
Flexible learning rates can lead to increased payoffs under the influence of noise. In a previous paper [Traulsen, Phys. Rev. Lett. 93, 028701 (2004)], we have demonstrated this effect based on a replicator dynamics model which is subject to external noise. Here, we utilize recent advances on finite population dynamics and their connection to the replicator equation to extend our findings and demonstrate the stochastic gain effect in finite population systems. Finite population dynamics is inherently stochastic, depending on the population size and the intensity of selection, which measures the balance between the deterministic and the stochastic parts of the dynamics. This internal noise can be exploited by a population using an appropriate microscopic update process, even if learning rates are constant.
Yunusova, Anastasia M.; Fishman, Veniamin S.; Vasiliev, Gennady V.
2017-01-01
Factor-mediated reprogramming of somatic cells towards pluripotency is a low-efficiency process during which only small subsets of cells are successfully reprogrammed. Previous analyses of the determinants of the reprogramming potential are based on average measurements across a large population of cells or on monitoring a relatively small number of single cells with live imaging. Here, we applied lentiviral genetic barcoding, a powerful tool enabling the identification of familial relationships in thousands of cells. High-throughput sequencing of barcodes from successfully reprogrammed cells revealed a significant number of barcodes from related cells. We developed a computer model, according to which the probability of synchronous reprogramming of sister cells equals 10–30%. We conclude that reprogramming success is pre-established in some particular cells and, being a heritable trait, can be maintained through cell division. Thus, reprogramming progresses in a deterministic manner, at least at the level of cell lineages. PMID:28446707
Guymon, Gary L.; Yen, Chung-Cheng
1990-01-01
The applicability of a deterministic-probabilistic model for predicting water tables in southern Owens Valley, California, is evaluated. The model is based on a two-layer deterministic model that is cascaded with a two-point probability model. To reduce the potentially large number of uncertain variables in the deterministic model, lumping of uncertain variables was evaluated by sensitivity analysis to reduce the total number of uncertain variables to three variables: hydraulic conductivity, storage coefficient or specific yield, and source-sink function. Results demonstrate that lumping of uncertain parameters reduces computational effort while providing sufficient precision for the case studied. Simulated spatial coefficients of variation for water table temporal position in most of the basin is small, which suggests that deterministic models can predict water tables in these areas with good precision. However, in several important areas where pumping occurs or the geology is complex, the simulated spatial coefficients of variation are over estimated by the two-point probability method.
NASA Astrophysics Data System (ADS)
Guymon, Gary L.; Yen, Chung-Cheng
1990-07-01
The applicability of a deterministic-probabilistic model for predicting water tables in southern Owens Valley, California, is evaluated. The model is based on a two-layer deterministic model that is cascaded with a two-point probability model. To reduce the potentially large number of uncertain variables in the deterministic model, lumping of uncertain variables was evaluated by sensitivity analysis to reduce the total number of uncertain variables to three variables: hydraulic conductivity, storage coefficient or specific yield, and source-sink function. Results demonstrate that lumping of uncertain parameters reduces computational effort while providing sufficient precision for the case studied. Simulated spatial coefficients of variation for water table temporal position in most of the basin is small, which suggests that deterministic models can predict water tables in these areas with good precision. However, in several important areas where pumping occurs or the geology is complex, the simulated spatial coefficients of variation are over estimated by the two-point probability method.
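A hedged sketch of cascading a deterministic model with a two-point (Rosenblueth-type) probability model over three uncertain inputs, as in the approach described above. The "water table" function and the means and standard deviations of hydraulic conductivity, specific yield, and the source-sink term are invented placeholders, not the Owens Valley model or its calibrated values.

```python
# Two-point probability estimate cascaded with a toy deterministic response function.
from itertools import product
import numpy as np

def water_table(K, Sy, Q):
    # toy deterministic response, NOT the two-layer Owens Valley model
    return 100.0 - 5.0 * np.log(K) + 2.0 * Q / Sy

means = {"K": 10.0, "Sy": 0.15, "Q": 0.5}   # assumed means of the three uncertain inputs
sds   = {"K": 3.0,  "Sy": 0.03, "Q": 0.2}   # assumed standard deviations

outputs = []
for signs in product((-1.0, +1.0), repeat=3):        # 2^3 evaluation points at mean +/- sd
    K, Sy, Q = (means[k] + s * sds[k] for k, s in zip(("K", "Sy", "Q"), signs))
    outputs.append(water_table(K, Sy, Q))

outputs = np.array(outputs)                           # equal weights 1/8 for symmetric inputs
mean, std = outputs.mean(), outputs.std()
print(f"water-table estimate: mean {mean:.2f}, coefficient of variation {std / mean:.3f}")
```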
Development of Cell Models as a Basis for Bioreactor Design for Genetically Modified Bacteria
1986-10-30
of future behavior based on specifying the current state vector. Generally a total population greater than 10,000 is sufficient to allow treatment of...specifying the current state vector (essentially values for all variables in the model). Deterministic models become increasingly valid as the number of...host I A) and therein PARASITISM converts the host's biomaterial or activities into its own + A and B are in physical contact. SYMBIOSIS (or perhaps Oi
Fisher-Wright model with deterministic seed bank and selection.
Koopmann, Bendix; Müller, Johannes; Tellier, Aurélien; Živković, Daniel
2017-04-01
Seed banks are common characteristics of many plant species, allowing storage of genetic diversity in the soil as dormant seeds for various periods of time. We investigate an above-ground population following a Fisher-Wright model with selection, coupled with a deterministic seed bank, assuming that the length of the seed bank is kept constant and the number of seeds is large. To assess the combined impact of seed banks and selection on genetic diversity, we derive a general diffusion model. The applied techniques outline a path for approximating a stochastic delay differential equation by an appropriately rescaled stochastic differential equation. We compute the equilibrium solution of the site-frequency spectrum and derive the times to fixation of an allele with and without selection. Finally, it is demonstrated that seed banks enhance the effect of selection on the site-frequency spectrum while slowing down the time until the mutation-selection equilibrium is reached. Copyright © 2016 Elsevier Inc. All rights reserved.
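A hedged simulation sketch of the model class described above: a Wright-Fisher population with selection in which the sampling frequency for each generation is an average over the previous m generations, mimicking a deterministic seed bank with uniform germination. Parameters are illustrative; the paper works with the corresponding diffusion limit rather than direct simulation.

```python
# Wright-Fisher sampling with selection and a fixed-length "seed bank" of past frequencies.
import numpy as np

N, m, s = 1000, 5, 0.01          # population size, seed-bank length, selection coefficient
generations = 2000
rng = np.random.default_rng(4)

history = [0.1] * m              # recent allele frequencies stored in the seed bank
for _ in range(generations):
    p_bank = float(np.mean(history))                       # uniform germination over m years
    p_sel = p_bank * (1.0 + s) / (1.0 + s * p_bank)        # selection on the focal allele
    p_next = rng.binomial(N, p_sel) / N                    # Wright-Fisher sampling
    history = history[1:] + [p_next]

print(f"allele frequency after {generations} generations: {history[-1]:.3f}")
```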
The relationship between stochastic and deterministic quasi-steady state approximations.
Kim, Jae Kyoung; Josić, Krešimir; Bennett, Matthew R
2015-11-23
The quasi steady-state approximation (QSSA) is frequently used to reduce deterministic models of biochemical networks. The resulting equations provide a simplified description of the network in terms of non-elementary reaction functions (e.g. Hill functions). Such deterministic reductions are frequently a basis for heuristic stochastic models in which non-elementary reaction functions are used to define reaction propensities. Despite their popularity, it remains unclear when such stochastic reductions are valid. It is frequently assumed that the stochastic reduction can be trusted whenever its deterministic counterpart is accurate. However, a number of recent examples show that this is not necessarily the case. Here we explain the origin of these discrepancies, and demonstrate a clear relationship between the accuracy of the deterministic and the stochastic QSSA for examples widely used in biological systems. With an analysis of a two-state promoter model, and numerical simulations for a variety of other models, we find that the stochastic QSSA is accurate whenever its deterministic counterpart provides an accurate approximation over a range of initial conditions which cover the likely fluctuations from the quasi steady-state (QSS). We conjecture that this relationship provides a simple and computationally inexpensive way to test the accuracy of reduced stochastic models using deterministic simulations. The stochastic QSSA is one of the most popular multi-scale stochastic simulation methods. While the use of the QSSA and the resulting non-elementary functions has been justified in the deterministic case, it is not clear when their stochastic counterparts are accurate. In this study, we show how the accuracy of the stochastic QSSA can be tested using their deterministic counterparts, providing a concrete method to test when non-elementary rate functions can be used in stochastic simulations.
Grodwohl, Jean-Baptiste
2017-08-01
Describing the theoretical population geneticists of the 1960s, Joseph Felsenstein reminisced: "our central obsession was finding out what function evolution would try to maximize. Population geneticists used to think, following Sewall Wright, that mean relative fitness, W, would be maximized by natural selection" (Felsenstein 2000). The present paper describes the genesis, diffusion and fall of this "obsession", by giving a biography of the mean fitness function in population genetics. This modeling method devised by Sewall Wright in the 1930s found its heyday in the late 1950s and early 1960s, in the wake of Motoo Kimura's and Richard Lewontin's works. It seemed a reliable guide in the mathematical study of deterministic effects (the study of natural selection in populations of infinite size, with no drift), leading to powerful generalizations presenting law-like properties. Progress in population genetics theory, it then seemed, would come from the application of this method to the study of systems with several genes. This ambition came to a halt in the context of the influential objections made by the Australian mathematician Patrick Moran in 1963. These objections triggered a controversy between mathematically- and biologically-inclined geneticists, which affected both the formal standards and the aims of population genetics as a science. Over the course of the 1960s, the mean fitness method withered with the ambition of developing the deterministic theory. The mathematical theory became increasingly complex. Kimura re-focused his modeling work on the theory of random processes; as a result of his computer simulations, Lewontin became the staunchest critic of maximizing principles in evolutionary biology. The mean fitness method then migrated to other research areas, being refashioned and used in evolutionary quantitative genetics and behavioral ecology.
NASA Astrophysics Data System (ADS)
Golinski, M. R.
2006-07-01
Ecologists have observed that environmental noise affects population variance in the logistic equation for one-species growth. Interactions between deterministic and stochastic dynamics in a one-dimensional system result in increased variance in species population density over time. Since natural populations do not live in isolation, the present paper simulates a discrete-time two-species competition model with environmental noise to determine the type of colored population noise generated by extreme conditions in the long-term population dynamics of competing populations. Discrete Fourier analysis is applied to the simulation results and the calculated Hurst exponent (H) is used to determine how the color of population noise for the two species corresponds to extreme conditions in population dynamics. To interpret the biological meaning of the color of noise generated by the two-species model, the paper determines the color of noise generated by three reference models: (1) a two-dimensional discrete-time white noise model (0 ⩽ H < 1/2); (2) a two-dimensional fractional Brownian motion model (H = 1/2); and (3) a two-dimensional discrete-time model with noise for unbounded growth of two uncoupled species (1/2 < H ⩽ 1).
Moving forward in circles: challenges and opportunities in modelling population cycles.
Barraquand, Frédéric; Louca, Stilianos; Abbott, Karen C; Cobbold, Christina A; Cordoleani, Flora; DeAngelis, Donald L; Elderd, Bret D; Fox, Jeremy W; Greenwood, Priscilla; Hilker, Frank M; Murray, Dennis L; Stieha, Christopher R; Taylor, Rachel A; Vitense, Kelsey; Wolkowicz, Gail S K; Tyson, Rebecca C
2017-08-01
Population cycling is a widespread phenomenon, observed across a multitude of taxa in both laboratory and natural conditions. Historically, the theory associated with population cycles was tightly linked to pairwise consumer-resource interactions and studied via deterministic models, but current empirical and theoretical research reveals a much richer basis for ecological cycles. Stochasticity and seasonality can modulate or create cyclic behaviour in non-intuitive ways, the high-dimensionality in ecological systems can profoundly influence cycling, and so can demographic structure and eco-evolutionary dynamics. An inclusive theory for population cycles, ranging from ecosystem-level to demographic modelling, grounded in observational or experimental data, is therefore necessary to better understand observed cyclical patterns. In turn, by gaining better insight into the drivers of population cycles, we can begin to understand the causes of cycle gain and loss, how biodiversity interacts with population cycling, and how to effectively manage wildly fluctuating populations, all of which are growing domains of ecological research. © 2017 John Wiley & Sons Ltd/CNRS.
Moving forward in circles: Challenges and opportunities in modeling population cycles
Barraquand, Frederic; Louca, Stilianos; Abbott, Karen C; Cobbold, Christina A; Cordoleani, Flora; DeAngelis, Donald L.; Elderd, Bret D; Fox, Jeremy W; Greenwood, Priscilla; Hilker, Frank M; Murray, Dennis; Stieha, Christopher R; Taylor, Rachel A; Vitense, Kelsey; Wolkowicz, Gail; Tyson, Rebecca C
2017-01-01
Population cycling is a widespread phenomenon, observed across a multitude of taxa in both laboratory and natural conditions. Historically, the theory associated with population cycles was tightly linked to pairwise consumer–resource interactions and studied via deterministic models, but current empirical and theoretical research reveals a much richer basis for ecological cycles. Stochasticity and seasonality can modulate or create cyclic behaviour in non-intuitive ways, the high-dimensionality in ecological systems can profoundly influence cycling, and so can demographic structure and eco-evolutionary dynamics. An inclusive theory for population cycles, ranging from ecosystem-level to demographic modelling, grounded in observational or experimental data, is therefore necessary to better understand observed cyclical patterns. In turn, by gaining better insight into the drivers of population cycles, we can begin to understand the causes of cycle gain and loss, how biodiversity interacts with population cycling, and how to effectively manage wildly fluctuating populations, all of which are growing domains of ecological research.
Phenotypic switching of populations of cells in a stochastic environment
NASA Astrophysics Data System (ADS)
Hufton, Peter G.; Lin, Yen Ting; Galla, Tobias
2018-02-01
In biology phenotypic switching is a common bet-hedging strategy in the face of uncertain environmental conditions. Existing mathematical models often focus on periodically changing environments to determine the optimal phenotypic response. We focus on the case in which the environment switches randomly between discrete states. Starting from an individual-based model we derive stochastic differential equations to describe the dynamics, and obtain analytical expressions for the mean instantaneous growth rates based on the theory of piecewise-deterministic Markov processes. We show that optimal phenotypic responses are non-trivial for slow and intermediate environmental processes, and we systematically compare the cases of periodic and random environments. The best response to random switching is more likely to be heterogeneity than in the case of deterministic periodic environments, and net growth rates tend to be higher under stochastic environmental dynamics. The combined system of environment and population of cells can be interpreted as a host-pathogen interaction, in which the host tries to choose environmental switching so as to minimise growth of the pathogen, and in which the pathogen employs phenotypic switching optimised to increase its growth rate. We discuss the existence of Nash-like mutual best-response scenarios for such host-pathogen games.
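A hedged sketch of the piecewise-deterministic ingredient named above: the environment switches at random between two states with exponentially distributed sojourn times, and the log population size grows at a state-dependent rate in between switches. All rates are invented for illustration; the paper adds phenotype distributions and the host-pathogen game on top of this kind of process.

```python
# Piecewise-deterministic growth under a two-state random environment; rates are illustrative.
import numpy as np

k12, k21 = 0.2, 0.5          # switching rates: state 1 -> 2 and state 2 -> 1
g = {1: 0.05, 2: -0.02}      # per-capita growth rate in each environmental state
rng = np.random.default_rng(5)

state, t, logN, T = 1, 0.0, 0.0, 10_000.0
while t < T:
    rate = k12 if state == 1 else k21
    dwell = min(rng.exponential(1.0 / rate), T - t)   # exponential sojourn, truncated at T
    logN += g[state] * dwell                          # deterministic growth between switches
    t += dwell
    state = 2 if state == 1 else 1

occupancy1 = k21 / (k12 + k21)                        # stationary probability of state 1
print(f"simulated mean growth rate {logN / T:+.4f}; "
      f"stationary-average prediction {occupancy1 * g[1] + (1 - occupancy1) * g[2]:+.4f}")
```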
The importance of environmental variability and management control error to optimal harvest policies
Hunter, C.M.; Runge, M.C.
2004-01-01
State-dependent strategies (SDSs) are the most general form of harvest policy because they allow the harvest rate to depend, without constraint, on the state of the system. State-dependent strategies that provide an optimal harvest rate for any system state can be calculated, and stochasticity can be appropriately accommodated in this optimization. Stochasticity poses 2 challenges to harvest policies: (1) the population will never be at the equilibrium state; and (2) stochasticity induces uncertainty about future states. We investigated the effects of 2 types of stochasticity, environmental variability and management control error, on SDS harvest policies for a white-tailed deer (Odocoileus virginianus) model, and contrasted these with a harvest policy based on maximum sustainable yield (MSY). Increasing stochasticity resulted in more conservative SDSs; that is, higher population densities were required to support the same harvest rate, but these effects were generally small. As stochastic effects increased, SDSs performed much better than MSY. Both deterministic and stochastic SDSs maintained maximum mean annual harvest yield (AHY) and optimal equilibrium population size (Neq) in a stochastic environment, whereas an MSY policy could not. We suggest 3 rules of thumb for harvest management of long-lived vertebrates in stochastic systems: (1) an SDS is advantageous over an MSY policy, (2) using an SDS rather than an MSY is more important than whether a deterministic or stochastic SDS is used, and (3) for SDSs, rankings of the variability in management outcomes (e.g., harvest yield) resulting from parameter stochasticity can be predicted by rankings of the deterministic elasticities.
Stochastic evolution in populations of ideas
Nicole, Robin; Sollich, Peter; Galla, Tobias
2017-01-01
It is known that learning of players who interact in a repeated game can be interpreted as an evolutionary process in a population of ideas. These analogies have so far mostly been established in deterministic models, and memory loss in learning has been seen to act similarly to mutation in evolution. We here propose a representation of reinforcement learning as a stochastic process in finite ‘populations of ideas’. The resulting birth-death dynamics has absorbing states and allows for the extinction or fixation of ideas, marking a key difference to mutation-selection processes in finite populations. We characterize the outcome of evolution in populations of ideas for several classes of symmetric and asymmetric games. PMID:28098244
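A minimal sketch of the kind of finite-population birth-death dynamics described above, assuming a plain symmetric 2x2 game and fitness-proportional copying: without mutation, the all-A and all-B states are absorbing, so individual 'ideas' can fix or go extinct. The payoff matrix and population size are illustrative assumptions, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions: a symmetric 2x2 coordination game and a small
# population, so that fixation of one idea happens on observable time scales.
payoff = np.array([[3.0, 1.0],
                   [2.0, 2.5]])
N = 50

def step(i):
    """One birth-death step; i = number of players holding idea A."""
    if i in (0, N):                      # absorbing states: extinction / fixation
        return i
    # expected payoffs against a randomly chosen co-player (excluding self)
    f_a = (payoff[0, 0] * (i - 1) + payoff[0, 1] * (N - i)) / (N - 1)
    f_b = (payoff[1, 0] * i + payoff[1, 1] * (N - i - 1)) / (N - 1)
    # birth: fitness-proportional choice of the idea that is copied
    p_a = i * f_a / (i * f_a + (N - i) * f_b)
    born_a = rng.random() < p_a
    # death: a uniformly chosen player forgets its idea
    dies_a = rng.random() < i / N
    return i + int(born_a) - int(dies_a)

def fixation_of_A(i0, runs=2000):
    """Estimate the probability that idea A takes over, starting from i0 copies."""
    fixed = 0
    for _ in range(runs):
        i = i0
        while 0 < i < N:
            i = step(i)
        fixed += (i == N)
    return fixed / runs

print("P(fixation of idea A | i0 = 25):", fixation_of_A(25))
```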
NASA Astrophysics Data System (ADS)
Esquível, Manuel L.; Fernandes, José Moniz; Guerreiro, Gracinda R.
2016-06-01
We introduce a schematic formalism for the time evolution of a random population entering some set of classes, such that each member of the population evolves among these classes according to a scheme based on a Markov chain model. We consider that the flow of incoming members is modeled by a time series, and we detail the time series structure of the elements in each of the classes. We present a practical application to data from a credit portfolio of a Cape Verdean bank; after modeling the entering population in two different ways - namely as an ARIMA process and as a deterministic sigmoid-type trend plus a SARMA process for the residuals - we simulate the behavior of the population and compare the results. We find that the second method describes the behavior of the populations more accurately when the simulated values from a direct simulation of the Markov chain are compared to the observed ones.
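A rough sketch of the scheme described above, with assumed stand-ins: new members enter the first class according to a sigmoid trend plus noise (in place of the fitted ARIMA/SARMA models), and each member then moves between classes according to a fixed Markov transition matrix. None of the numbers below come from the Cape Verdean portfolio.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative assumptions: three credit classes plus an absorbing exit state,
# a fixed one-period transition matrix, and a deterministic sigmoid trend plus
# white noise standing in for the fitted inflow model.
P = np.array([[0.85, 0.10, 0.03, 0.02],   # class 0 -> classes 0..2 and exit
              [0.05, 0.80, 0.10, 0.05],
              [0.00, 0.10, 0.75, 0.15],
              [0.00, 0.00, 0.00, 1.00]])  # exit is absorbing

def arrivals(t, K=200.0, r=0.3, t0=20.0, sigma=5.0):
    """Sigmoid-type trend for new members entering class 0, plus noise."""
    trend = K / (1.0 + np.exp(-r * (t - t0)))
    return max(0, int(round(trend + sigma * rng.standard_normal())))

def simulate(T=60):
    counts = np.zeros((T, 4), dtype=int)
    state = np.zeros(4, dtype=int)
    for t in range(T):
        state[0] += arrivals(t)                       # inflow into the first class
        new_state = np.zeros(4, dtype=int)
        for c in range(4):                            # each member moves independently
            if state[c]:
                new_state += rng.multinomial(state[c], P[c])
        state = new_state
        counts[t] = state
    return counts

print("population in each class at the final period:", simulate()[-1])
```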
Stochasticity and determinism in models of hematopoiesis.
Kimmel, Marek
2014-01-01
This chapter represents a novel view of modeling in hematopoiesis, synthesizing both deterministic and stochastic approaches. Whereas the stochastic models work in situations where chance dominates, for example when the number of cells is small, or under random mutations, the deterministic models are more important for large-scale, normal hematopoiesis. New types of models are on the horizon. These models attempt to account for distributed environments such as hematopoietic niches and their impact on dynamics. Mixed effects of such structures and chance events are largely unknown and constitute both a challenge and promise for modeling. Our discussion is presented under the separate headings of deterministic and stochastic modeling; however, the connections between both are frequently mentioned. Four case studies are included to elucidate important examples. We also include a primer of deterministic and stochastic dynamics for the reader's use.
Stochastic Petri Net extension of a yeast cell cycle model.
Mura, Ivan; Csikász-Nagy, Attila
2008-10-21
This paper presents the definition, solution and validation of a stochastic model of the budding yeast cell cycle, based on Stochastic Petri Nets (SPN). A specific family of SPNs is selected for building a stochastic version of a well-established deterministic model. We describe the procedure followed in defining the SPN model from the deterministic ODE model, a procedure that can be largely automated. The validation of the SPN model is conducted with respect to both the results provided by the deterministic one and the experimental results available in the literature. The SPN model captures the behavior of the wild-type budding yeast cells and a variety of mutants. We show that the stochastic model matches some characteristics of budding yeast cells that cannot be found with the deterministic model. The SPN model fine-tunes the simulation results, enriching the breadth and the quality of its outcome.
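The basic step of moving from a deterministic rate model to a stochastic counterpart, as in the SPN construction above, is to reinterpret mass-action rates as propensities of discrete reaction events. Below is a minimal sketch using Gillespie's direct method on an assumed toy two-species network, not the yeast cell-cycle model itself; in the SPN setting the same propensities become the firing rates of the corresponding transitions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy reaction network (assumed, for illustration only):
#   0 -> X        (rate k1)
#   X -> 0        (rate k2 * X)
#   X -> X + Y    (rate k3 * X)
#   Y -> 0        (rate k4 * Y)
k1, k2, k3, k4 = 10.0, 0.1, 0.5, 0.2
stoich = np.array([[+1, 0], [-1, 0], [0, +1], [0, -1]])

def propensities(x):
    X, Y = x
    return np.array([k1, k2 * X, k3 * X, k4 * Y])

def gillespie(x0=(0, 0), t_end=100.0):
    t, x = 0.0, np.array(x0, dtype=int)
    times, states = [t], [x.copy()]
    while t < t_end:
        a = propensities(x)
        a0 = a.sum()
        if a0 == 0:
            break
        t += rng.exponential(1.0 / a0)          # waiting time to the next reaction
        j = rng.choice(len(a), p=a / a0)        # which reaction fires
        x = x + stoich[j]
        times.append(t)
        states.append(x.copy())
    return np.array(times), np.array(states)

t, x = gillespie()
print("final copy numbers (X, Y):", x[-1])
```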
Antibiotic-induced population fluctuations and stochastic clearance of bacteria
Le, Dai; Şimşek, Emrah; Chaudhry, Waqas
2018-01-01
Effective antibiotic use that minimizes treatment failures remains a challenge. A better understanding of how bacterial populations respond to antibiotics is necessary. Previous studies of large bacterial populations established the deterministic framework of pharmacodynamics. Here, characterizing the dynamics of population extinction, we demonstrated the stochastic nature of eradicating bacteria with antibiotics. Antibiotics known to kill bacteria (bactericidal) induced population fluctuations. Thus, at high antibiotic concentrations, the dynamics of bacterial clearance were heterogeneous. At low concentrations, clearance still occurred with a non-zero probability. These striking outcomes of population fluctuations were well captured by our probabilistic model. Our model further suggested a strategy to facilitate eradication by increasing extinction probability. We experimentally tested this prediction for antibiotic-susceptible and clinically-isolated resistant bacteria. This new knowledge exposes fundamental limits in our ability to predict bacterial eradication. Additionally, it demonstrates the potential of using antibiotic concentrations that were previously deemed inefficacious to eradicate bacteria. PMID:29508699
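A minimal sketch of the stochastic-clearance idea described above: a birth-death process in which the antibiotic adds a concentration-dependent kill rate, so the population can hit zero by chance even when the deterministic net growth rate is positive. The rates and the Hill-type pharmacodynamic term are illustrative assumptions, not the fitted model of the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative assumptions: logistic-type birth, a basal death rate, and an
# antibiotic kill rate that saturates with concentration c.
b, d0, K = 1.0, 0.1, 200          # birth rate, basal death rate, carrying capacity
kmax, c50 = 2.0, 1.0              # maximal kill rate, half-maximal concentration

def clearance_probability(c, n0=10, t_end=20.0, runs=200):
    """Fraction of stochastic runs in which the population reaches zero."""
    kill = kmax * c / (c + c50)
    cleared = 0
    for _ in range(runs):
        n, t = n0, 0.0
        while n > 0 and t < t_end:
            birth = max(b * n * (1 - n / K), 0.0)
            death = (d0 + kill) * n
            total = birth + death
            t += rng.exponential(1.0 / total)
            n += 1 if rng.random() < birth / total else -1
        cleared += (n == 0)
    return cleared / runs

for c in (0.5, 1.0, 2.0, 4.0):
    print(f"concentration {c}: P(clearance) ~ {clearance_probability(c):.2f}")
```

Even at the lowest assumed concentration the estimated clearance probability is non-zero, which is the qualitative point of the abstract.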
Dini-Andreote, Francisco; Stegen, James C.; van Elsas, Jan D.; ...
2015-03-17
Despite growing recognition that deterministic and stochastic factors simultaneously influence bacterial communities, little is known about mechanisms shifting their relative importance. To better understand underlying mechanisms, we developed a conceptual model linking ecosystem development during primary succession to shifts in the stochastic/deterministic balance. To evaluate the conceptual model we coupled spatiotemporal data on soil bacterial communities with environmental conditions spanning 105 years of salt marsh development. At the local scale there was a progression from stochasticity to determinism due to Na accumulation with increasing ecosystem age, supporting a main element of the conceptual model. At the regional scale, soil organic matter (SOM) governed the relative influence of stochasticity and the type of deterministic ecological selection, suggesting scale-dependency in how deterministic ecological selection is imposed. Analysis of a new ecological simulation model supported these conceptual inferences. Looking forward, we propose an extended conceptual model that integrates primary and secondary succession in microbial systems.
Distribution and regulation of stochasticity and plasticity in Saccharomyces cerevisiae
Dar, R. D.; Karig, D. K.; Cooke, J. F.; ...
2010-09-01
Stochasticity is an inherent feature of complex systems with nanoscale structure. In such systems information is represented by small collections of elements (e.g. a few electrons on a quantum dot), and small variations in the populations of these elements may lead to big uncertainties in the information. Unfortunately, little is known about how to work within this inherently noisy environment to design robust functionality into complex nanoscale systems. Here, we look to the biological cell as an intriguing model system where evolution has mediated the trade-offs between fluctuations and function, and in particular we look at the relationships and trade-offs between stochastic and deterministic responses in the gene expression of budding yeast (Saccharomyces cerevisiae). We find gene regulatory arrangements that control the stochastic and deterministic components of expression, and show that genes that have evolved to respond to stimuli (stress) in the most strongly deterministic way exhibit the most noise in the absence of the stimuli. We show that this relationship is consistent with a bursty 2-state model of gene expression, and demonstrate that this regulatory motif generates the most uncertainty in gene expression when there is the greatest uncertainty in the optimal level of gene expression.
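For reference, a minimal sketch of the bursty 2-state (telegraph) model invoked above: the promoter toggles between OFF and ON, transcription occurs only in the ON state, and transcripts decay at a first-order rate. The rate constants are illustrative assumptions chosen to place the system in a bursty regime, not values fitted to the yeast data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Illustrative (assumed) rates for a bursty two-state promoter:
k_on, k_off = 0.05, 0.5     # slow activation, fast inactivation -> bursts
k_tx, k_deg = 20.0, 1.0     # transcription when ON, first-order degradation

def telegraph(t_end=2000.0, burn_in=200.0):
    """Gillespie simulation returning time-weighted mean and variance of m."""
    t, on, m = 0.0, 0, 0
    tw, m1, m2 = 0.0, 0.0, 0.0
    while t < t_end:
        rates = np.array([k_on * (1 - on), k_off * on, k_tx * on, k_deg * m])
        total = rates.sum()
        dt = rng.exponential(1.0 / total)
        if t > burn_in:                     # accumulate time-weighted moments
            tw += dt
            m1 += m * dt
            m2 += m * m * dt
        t += dt
        j = rng.choice(4, p=rates / total)  # which event fires next
        if j == 0:   on = 1
        elif j == 1: on = 0
        elif j == 2: m += 1
        else:        m -= 1
    mean = m1 / tw
    var = m2 / tw - mean ** 2
    return mean, var

mean, var = telegraph()
print("mean ~", round(mean, 1), "  Fano factor ~", round(var / mean, 2))
```

A Fano factor well above 1 is the signature of the bursty regime that the regulatory motif described above exploits.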
NASA Astrophysics Data System (ADS)
Yang, Hyun Mo
2015-12-01
Currently, discrete modelling approaches are widely accepted, owing to access to computers with huge storage capacity and high-performance processors and to the easy implementation of algorithms, which allows increasingly sophisticated models to be developed and simulated. Wang et al. [7] present a review of dynamics in complex networks, focusing on the interaction between disease dynamics and human behavioral and social dynamics. In an extensive review of how human behavior responds to disease dynamics, the authors briefly describe the complex dynamics found in the literature: well-mixed population networks, where spatial structure can be neglected, and other networks that account for heterogeneity in spatially distributed populations. As controlling mechanisms are implemented, such as social distancing due to 'social contagion', quarantine, non-pharmaceutical interventions and vaccination, adaptive behavior can occur in the human population, which can easily be taken into account in the dynamics formulated for networked populations.
Finite-size effects and switching times for Moran process with mutation.
DeVille, Lee; Galiardi, Meghan
2017-04-01
We consider the Moran process with two populations competing under an iterated Prisoner's Dilemma in the presence of mutation, and concentrate on the case where there are multiple evolutionarily stable strategies. We perform a complete bifurcation analysis of the deterministic system which arises in the infinite-population-size limit. We also study the Master equation and obtain asymptotics for the invariant distribution and metastable switching times for the stochastic process in the case of a large but finite population. We also show that the stochastic system has asymmetries in the form of a skew for parameter values where the deterministic limit is symmetric.
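A minimal sketch of the finite-N behaviour described above, using a plain 2x2 coordination game in place of the iterated Prisoner's Dilemma: the deterministic limit has two stable equilibria, and with a small mutation rate the stochastic Moran process spends long metastable periods near each of them, which shows up as a bimodal long-run occupancy. Population size, payoffs and mutation rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Illustrative assumptions: symmetric 2x2 coordination game, finite population
# size N, and per-birth mutation probability mu.
payoff = np.array([[2.0, 0.5],
                   [0.5, 2.0]])
N, mu = 60, 0.02

def moran_step(i):
    """One Moran birth-death step with mutation; i = number of A-players."""
    f_a = (payoff[0, 0] * max(i - 1, 0) + payoff[0, 1] * (N - i)) / (N - 1)
    f_b = (payoff[1, 0] * i + payoff[1, 1] * max(N - i - 1, 0)) / (N - 1)
    w = i * f_a + (N - i) * f_b
    p_a = i * f_a / w if w > 0 else 0.5
    offspring_is_a = rng.random() < p_a
    if rng.random() < mu:                     # mutation flips the offspring's type
        offspring_is_a = not offspring_is_a
    dies_a = rng.random() < i / N
    return i + int(offspring_is_a) - int(dies_a)

def occupancy(steps=400_000, burn_in=10_000):
    """Time-averaged distribution over i; bimodal when switching is rare."""
    i, counts = N // 2, np.zeros(N + 1)
    for t in range(steps):
        i = moran_step(i)
        if t >= burn_in:
            counts[i] += 1
    return counts / counts.sum()

p = occupancy()
print("mass near all-B (i <= 5):", round(p[:6].sum(), 3),
      "  mass near all-A (i >= N-5):", round(p[-6:].sum(), 3))
```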
Time-dependent earthquake probabilities
Gomberg, J.; Belardinelli, M.E.; Cocco, M.; Reasenberg, P.
2005-01-01
We have attempted to provide a careful examination of a class of approaches for estimating the conditional probability of failure of a single large earthquake, particularly approaches that account for static stress perturbations to tectonic loading, as in the approaches of Stein et al. (1997) and Hardebeck (2004). We have developed a framework based on a simple, generalized rate-change formulation and applied it to these two approaches to show how they relate to one another. We also have attempted to show the connection between models of seismicity rate changes applied to (1) populations of independent faults, as in background and aftershock seismicity, and (2) changes in estimates of the conditional probability of failure of a single fault; in the former, the notion of failure rate corresponds to successive failures of different members of a population of faults. The latter application requires specification of some probability distribution (density function or PDF) that describes a population of potential recurrence times. This PDF may reflect our imperfect knowledge of when past earthquakes have occurred on a fault (epistemic uncertainty), the true natural variability in failure times, or some combination of both. We suggest two end-member conceptual single-fault models that may explain natural variability in recurrence times and suggest how they might be distinguished observationally. When viewed deterministically, these single-fault patch models differ significantly in their physical attributes, and when faults are immature, they differ in their responses to stress perturbations. Estimates of conditional failure probabilities effectively integrate over a range of possible deterministic fault models, usually with ranges that correspond to mature faults. Thus conditional failure probability estimates usually should not differ significantly for these models. Copyright 2005 by the American Geophysical Union.
Stochastic mixed-mode oscillations in a three-species predator-prey model
NASA Astrophysics Data System (ADS)
Sadhu, Susmita; Kuehn, Christian
2018-03-01
The effect of demographic stochasticity, in the form of Gaussian white noise, in a predator-prey model with one fast and two slow variables is studied. We derive the stochastic differential equations (SDEs) from a discrete model. For suitable parameter values, the deterministic drift part of the model admits a folded node singularity and exhibits a singular Hopf bifurcation. We focus on the parameter regime near the Hopf bifurcation, where small amplitude oscillations exist as stable dynamics in the absence of noise. In this regime, the stochastic model admits noise-driven mixed-mode oscillations (MMOs), which capture the intermediate dynamics between two cycles of population outbreaks. We perform numerical simulations to calculate the distribution of the random number of small oscillations between successive spikes for varying noise intensities and distance to the Hopf bifurcation. We also study the effect of noise on a suitable Poincaré map. Finally, we prove that the stochastic model can be transformed into a normal form near the folded node, which can be linked to recent results on the interplay between deterministic and stochastic small amplitude oscillations. The normal form can also be used to study the parameter influence on the noise level near folded singularities.
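As a generic illustration of the workflow sketched above (a deterministic drift term plus demographic-type noise, integrated numerically), here is a minimal Euler-Maruyama sketch on an assumed two-species predator-prey drift. The paper's actual model has three species and a fast-slow structure; the functional form and parameters below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative assumptions: Rosenzweig-MacArthur-type drift with additive noise
# of intensity sigma, integrated with the Euler-Maruyama scheme.
r, K, a, h, e, m = 1.0, 1.0, 5.0, 1.0, 0.5, 0.4
sigma = 0.02

def drift(x):
    prey, pred = x
    fr = a * prey / (1.0 + a * h * prey)          # Holling type-II functional response
    return np.array([r * prey * (1 - prey / K) - fr * pred,
                     e * fr * pred - m * pred])

def euler_maruyama(x0=(0.5, 0.2), t_end=200.0, dt=1e-2):
    n = int(t_end / dt)
    x = np.empty((n + 1, 2))
    x[0] = x0
    for k in range(n):
        noise = sigma * np.sqrt(dt) * rng.standard_normal(2)
        x[k + 1] = np.maximum(x[k] + dt * drift(x[k]) + noise, 0.0)  # keep densities >= 0
    return x

traj = euler_maruyama()
print("final (prey, predator):", np.round(traj[-1], 3))
```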
Health safety nets can break cycles of poverty and disease: a stochastic ecological model
Pluciński, Mateusz M.; Ngonghala, Calistus N.; Bonds, Matthew H.
2011-01-01
The persistence of extreme poverty is increasingly attributed to dynamic interactions between biophysical processes and economics, though there remains a dearth of integrated theoretical frameworks that can inform policy. Here, we present a stochastic model of disease-driven poverty traps. Whereas deterministic models can result in poverty traps that can only be broken by substantial external changes to the initial conditions, in the stochastic model there is always some probability that a population will leave or enter a poverty trap. We show that a ‘safety net’, defined as an externally enforced minimum level of health or economic conditions, can guarantee ultimate escape from a poverty trap, even if the safety net is set within the basin of attraction of the poverty trap, and even if the safety net is only in the form of a public health measure. Whereas the deterministic model implies that small improvements in initial conditions near the poverty-trap equilibrium are futile, the stochastic model suggests that the impact of changes in the location of the safety net on the rate of development may be strongest near the poverty-trap equilibrium. PMID:21593026
Synchronisation of chaos and its applications
NASA Astrophysics Data System (ADS)
Eroglu, Deniz; Lamb, Jeroen S. W.; Pereira, Tiago
2017-07-01
Dynamical networks are important models for the behaviour of complex systems, modelling physical, biological and societal systems, including the brain, food webs, epidemic disease in populations, power grids and many others. Such dynamical networks can exhibit behaviour in which deterministic chaos, exhibiting unpredictability and disorder, coexists with synchronisation, a classical paradigm of order. We survey the main theory behind complete, generalised and phase synchronisation phenomena in simple as well as complex networks and discuss applications to secure communications, parameter estimation and the anticipation of chaos.
NASA Astrophysics Data System (ADS)
García, Constantino A.; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G.
2018-07-01
In the past few decades, it has been recognized that 1/f fluctuations are ubiquitous in nature. The most widely used mathematical models to capture the long-term memory properties of 1/f fluctuations have been stochastic fractal models. However, physical systems do not usually consist of just stochastic fractal dynamics, but they often also show some degree of deterministic behavior. The present paper proposes a model based on fractal stochastic and deterministic components that can provide a valuable basis for the study of complex systems with long-term correlations. The fractal stochastic component is assumed to be a fractional Brownian motion process and the deterministic component is assumed to be a band-limited signal. We also provide a method that, under the assumptions of this model, is able to characterize the fractal stochastic component and to provide an estimate of the deterministic components present in a given time series. The method is based on a Bayesian wavelet shrinkage procedure that exploits the self-similar properties of the fractal processes in the wavelet domain. This method has been validated over simulated signals and over real signals of economic and biological origin. Real examples illustrate how our model may be useful for exploring the deterministic-stochastic duality of complex systems, and uncovering interesting patterns present in time series.
Numerical modeling of the transmission dynamics of drug-sensitive and drug-resistant HSV-2
NASA Astrophysics Data System (ADS)
Gumel, A. B.
2001-03-01
A competitive finite-difference method will be constructed and used to solve a modified deterministic model for the spread of herpes simplex virus type-2 (HSV-2) within a given population. The model monitors the transmission dynamics and control of drug-sensitive and drug-resistant HSV-2. Unlike the fourth-order Runge-Kutta method (RK4), which fails when the discretization parameters exceed certain values, the novel numerical method to be developed in this paper gives convergent results for all parameter values.
Mutant number distribution in an exponentially growing population
NASA Astrophysics Data System (ADS)
Keller, Peter; Antal, Tibor
2015-01-01
We present an explicit solution to a classic model of cell-population growth introduced by Luria and Delbrück (1943 Genetics 28 491-511) 70 years ago to study the emergence of mutations in bacterial populations. In this model a wild-type population is assumed to grow exponentially in a deterministic fashion. Proportional to the wild-type population size, mutants arrive randomly and initiate new sub-populations of mutants that grow stochastically according to a supercritical birth and death process. We give an exact expression for the generating function of the total number of mutants at a given wild-type population size. We present a simple expression for the probability of finding no mutants, and a recursion formula for the probability of finding a given number of mutants. In the ‘large population-small mutation’ limit we recover recent results of Kessler and Levine (2014 J. Stat. Phys. doi:10.1007/s10955-014-1143-3) for a fully stochastic version of the process.
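The model structure described above is easy to simulate directly, which provides a useful check on the analytical generating-function results: the wild type grows deterministically, mutation events arrive as an inhomogeneous Poisson process with rate proportional to the wild-type size, and each mutant clone then grows stochastically. The sketch below uses a pure-birth (Yule) clone in place of the general birth-death clone, and the mutation rate and final population size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)

# Illustrative assumptions: wild type grows deterministically as N(t) = e^t,
# mutants arrive as a Poisson process with rate mu * N(t), and each mutant
# clone grows as a pure-birth (Yule) process at rate beta.
mu, beta = 2e-7, 1.0

def mutant_count(N_final=1e7):
    T = np.log(N_final)                              # time to reach N_final from N(0) = 1
    # number of mutation events: Poisson with mean mu * integral of e^t on [0, T]
    n_events = rng.poisson(mu * (N_final - 1.0))
    # arrival times have density proportional to e^t on [0, T]
    u = rng.random(n_events)
    t_arrivals = np.log(1.0 + u * (N_final - 1.0))
    # a Yule clone founded at time t has, at time T, a Geometric(e^{-beta (T-t)}) size
    p = np.exp(-beta * (T - t_arrivals))
    sizes = rng.geometric(p) if n_events else np.array([], dtype=int)
    return sizes.sum()

counts = np.array([mutant_count() for _ in range(2000)])
print("P(no mutants):", np.mean(counts == 0),
      "  mean:", round(counts.mean(), 2), "  median:", np.median(counts))
```

The heavy right tail of the simulated counts (mean far above the median) is the classic Luria-Delbrück signature.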
Cancer dormancy and criticality from a game theory perspective.
Wu, Amy; Liao, David; Kirilin, Vlamimir; Lin, Ke-Chih; Torga, Gonzalo; Qu, Junle; Liu, Liyu; Sturm, James C; Pienta, Kenneth; Austin, Robert
2018-01-01
The physics of cancer dormancy, the time between initial cancer treatment and re-emergence after a protracted period, is a puzzle. Cancer cells interact with host cells via complex, non-linear population dynamics, which can lead to very non-intuitive but perhaps deterministic and understandable progression dynamics of cancer and dormancy. We explore here the dynamics of host-cancer cell populations in the presence of (1) payoff gradients and (2) perturbations due to cell migration. We determine to what extent the time-dependence of the populations can be quantitatively understood in spite of the underlying complexity of the individual agents, and we model the phenomenon of dormancy.
Cognitive Diagnostic Analysis Using Hierarchically Structured Skills
ERIC Educational Resources Information Center
Su, Yu-Lan
2013-01-01
This dissertation proposes two modified cognitive diagnostic models (CDMs), the deterministic, inputs, noisy, "and" gate with hierarchy (DINA-H) model and the deterministic, inputs, noisy, "or" gate with hierarchy (DINO-H) model. Both models incorporate the hierarchical structures of the cognitive skills in the model estimation…
Boldin, Barbara; Kisdi, Éva
2016-03-01
Evolutionary suicide is a riveting phenomenon in which adaptive evolution drives a viable population to extinction. Gyllenberg and Parvinen (Bull Math Biol 63(5):981-993, 2001) showed that, in a wide class of deterministic population models, a discontinuous transition to extinction is a necessary condition for evolutionary suicide. An implicit assumption of their proof is that the invasion fitness of a rare strategy is well-defined also in the extinction state of the population. Epidemic models with frequency-dependent incidence, which are often used to model the spread of sexually transmitted infections or the dynamics of infectious diseases within herds, violate this assumption. In these models, evolutionary suicide can occur through a non-catastrophic bifurcation whereby pathogen adaptation leads to a continuous decline of host (and consequently pathogen) population size to zero. Evolutionary suicide of pathogens with frequency-dependent transmission can occur in two ways, with pathogen strains evolving either higher or lower virulence.
The re-incarnation, re-interpretation and re-demise of the transition probability model.
Koch, A L
1999-05-28
There are two classes of models for the cell cycle that have both a deterministic and a stochastic part; they are the transition probability (TP) models and the sloppy size control (SSC) models. The hallmarks of the basic TP model are two graphs: the alpha and beta plots. The former is the semi-logarithmic plot of the percentage of cell divisions yet to occur; this results in a horizontal line segment at 100% corresponding to the deterministic phase and a straight sloping tail corresponding to the stochastic part. The beta plot concerns the differences in the age-at-division of sisters (the beta curve) and gives a straight line parallel to the tail of the alpha curve. For the SSC models the deterministic part is the time needed for the cell to accumulate a critical amount of some substance(s). The variable part differs among the variants of the general model, but these variants do not give alpha and beta curves with linear tails as postulated by the TP model. This paper argues against TP and for an elaboration of the SSC type of model. The main argument against TP is that it assumes that the probability of the transition from the stochastic phase is time-invariant even though it is certain that the cells are growing and metabolizing throughout the cell cycle, a fact that should make the transition probability variable. The SSC models presume that cell division is triggered by the cell's success in growing and not simply the result of elapsed time. The extended model proposed here, which reconciles the predictions of the SSC with the straight-tailed parts of the alpha and beta plots, depends on the existence of a few percent of the cells in a growing culture that are not growing normally; these are growing much more slowly or are temporarily quiescent. The bulk of the cells, however, grow nearly exponentially. Evidence for a slow-growing component comes from experimental analyses of population size distributions for a variety of cell types by the Collins-Richmond technique. The existence of these subpopulations is consistent with the new concept that there is a large class of rapidly reversible mutations occurring in many organisms and at many loci, serving a large range of purposes to enable the cell to survive environmental challenges. These mutations yield special subpopulations of cells within a population. The reversible mutational changes, relevant to the elaboration of SSC models, produce slow-growing cells that are either very large or very small in size; these later revert to normal growth and division. The subpopulations, however, distort the population distribution in such a way as to better fit the exponential tails of the alpha and beta curves of the TP model.
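The alpha plot described above is straightforward to reproduce for the basic TP model: draw ages at division as a fixed deterministic phase plus an exponentially distributed stochastic phase, then tabulate the percentage of divisions yet to occur. The duration and rate below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)

# Illustrative assumptions for the basic TP model: age at division equals a
# fixed deterministic phase T_det plus an exponential stochastic phase.
T_det, rate = 1.0, 2.0
ages = T_det + rng.exponential(1.0 / rate, size=10_000)

t_grid = np.linspace(0, 4, 41)
alpha = [100.0 * np.mean(ages > t) for t in t_grid]   # % of divisions yet to occur

for t, a in zip(t_grid[::5], alpha[::5]):
    print(f"t = {t:4.1f}   alpha = {a:6.2f} %")
# On a semi-log plot, alpha stays at 100% until t = T_det and then decays as a
# straight line with slope -rate: the horizontal segment plus linear tail of the
# TP model's alpha curve.
```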
NASA Astrophysics Data System (ADS)
Scherstjanoi, M.; Kaplan, J. O.; Thürig, E.; Lischke, H.
2013-02-01
Models of vegetation dynamics that are designed for application at spatial scales larger than individual forest gaps suffer from several limitations. Typically, either a population average approximation is used that results in unrealistic tree allometry and forest stand structure, or models have a high computational demand because they need to simulate both a series of age-based cohorts and a number of replicate patches to account for stochastic gap-scale disturbances. The detail required by the latter method increases the number of calculations by two to three orders of magnitude compared to the less realistic population average approach. In an effort to increase the efficiency of dynamic vegetation models without sacrificing realism, and to explore patterns of spatial scaling in forests, we developed a new method for simulating stand-replacing disturbances that is both accurate and 10-50x faster than approaches that use replicate patches. The GAPPARD (approximating GAP model results with a Probabilistic Approach to account for stand Replacing Disturbances) method works by postprocessing the output of deterministic, undisturbed simulations of a cohort-based vegetation model by deriving the distribution of patch ages at any point in time on the basis of a disturbance probability. With this distribution, the expected value of any output variable can be calculated from the output values of the deterministic undisturbed run at the time corresponding to the patch age. To account for temporal changes in model forcing, e.g., as a result of climate change, GAPPARD performs a series of deterministic simulations and interpolates between the results in the postprocessing step. We integrated the GAPPARD method in the forest models LPJ-GUESS and TreeM-LPJ, and evaluated these in a series of simulations along an altitudinal transect of an inner-alpine valley. With GAPPARD applied to LPJ-GUESS, results were not significantly different from the output of the original LPJ-GUESS model using 100 replicate patches, but simulation time was reduced by approximately a factor of 10. Our new method is therefore highly suited to rapidly approximating LPJ-GUESS results; it provides the opportunity for future studies over large spatial domains and allows easier parameterization of tree species, faster identification of areas with interesting simulation results, and comparisons with large-scale datasets and forest models.
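The postprocessing step at the heart of GAPPARD can be sketched in a few lines: with an annual stand-replacing disturbance probability p, patch age is (approximately) geometrically distributed, and the landscape-level expectation of any output variable is the age-weighted average of the deterministic, undisturbed run. The disturbance probability and the stand-in regrowth curve below are assumptions for illustration only.

```python
import numpy as np

# Illustrative assumptions: an annual disturbance probability p, and a simple
# saturating regrowth curve standing in for the undisturbed model output
# (e.g. biomass) as a function of patch age.
p = 0.01
years = np.arange(0, 500)
biomass_undisturbed = 300.0 * (1.0 - np.exp(-years / 80.0))

age_weights = p * (1.0 - p) ** years        # P(patch age = a) under annual disturbance
age_weights /= age_weights.sum()            # renormalise over the truncated age range

expected_biomass = np.sum(age_weights * biomass_undisturbed)
print("landscape expectation of biomass:", round(expected_biomass, 1),
      " vs old-growth value:", round(biomass_undisturbed[-1], 1))
```

The same weighting applies to any other output variable of the undisturbed run, which is why the method only needs deterministic simulations.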
Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology.
Schaff, James C; Gao, Fei; Li, Ye; Novak, Igor L; Slepchenko, Boris M
2016-12-01
Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium 'sparks' as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell.
Dynamics of stochastic SEIS epidemic model with varying population size
NASA Astrophysics Data System (ADS)
Liu, Jiamin; Wei, Fengying
2016-12-01
In this paper we introduce stochasticity into a deterministic model with susceptible-exposed-infected state variables and varying population size. The infected individuals can return to the susceptible compartment after recovering. We show that the stochastic model possesses a unique global solution by constructing a suitable Lyapunov function and using the generalized Itô formula. The densities of the exposed and infected individuals tend to extinction when certain conditions hold. Moreover, conditions for persistence of the global solution are derived when the parameters satisfy some simple criteria. The stochastic model admits a stationary distribution around the endemic equilibrium, which means that the disease will prevail. To check the validity of the main results, numerical simulations are presented at the end of this contribution.
Application of Stochastic and Deterministic Approaches to Modeling Interstellar Chemistry
NASA Astrophysics Data System (ADS)
Pei, Yezhe
This work is about simulations of interstellar chemistry using the deterministic rate equation (RE) method and the stochastic moment equation (ME) method. Our interest is in the primordial, metal-poor interstellar medium (ISM); the so-called “Population II” stars could have formed in this environment during the “Epoch of Reionization” in the young universe. We build a gas-phase model using the RE scheme to describe the ionization-powered interstellar chemistry. We demonstrate that OH replaces CO as the most abundant metal-bearing molecule in such interstellar clouds of the early universe. Grain-surface reactions play an important role in astrochemistry, but the lack of an accurate yet effective simulation method still presents a challenge, especially for large, practical gas-grain systems. We develop a hybrid scheme of moment equations and rate equations (HMR) for large gas-grain networks to model astrochemical reactions in interstellar clouds. Specifically, we have used a large chemical gas-grain model, with stochastic moment equations to treat the surface chemistry and deterministic rate equations to treat the gas-phase chemistry, to simulate astrochemical systems such as the ISM in the Milky Way, the Large Magellanic Cloud (LMC) and the Small Magellanic Cloud (SMC). We compare the results to those of pure rate equations and modified rate equations and discuss how moment equations improve our theoretical modeling and how the abundances of the assorted species change with metallicity. We also model the observed composition of H2O, CO and CO2 ices toward Young Stellar Objects in the LMC and show that the HMR method gives a better match to the observations than the pure RE method.
Mayo, Christie; Shelley, Courtney; MacLachlan, N. James; Gardner, Ian; Hartley, David; Barker, Christopher
2016-01-01
The global distribution of bluetongue virus (BTV) has been changing recently, perhaps as a result of climate change. To evaluate the risk of BTV infection and transmission in a BTV-endemic region of California, sentinel dairy cows were evaluated for BTV infection, and populations of Culicoides vectors were collected at different sites using carbon dioxide. A deterministic model was developed to quantify risk and guide future mitigation strategies to reduce BTV infection in California dairy cattle. The greatest risk of BTV transmission was predicted within the warm Central Valley of California that contains the highest density of dairy cattle in the United States. Temperature and parameters associated with Culicoides vectors (transmission probabilities, carrying capacity, and survivorship) had the greatest effect on BTV’s basic reproduction number, R0. Based on these analyses, optimal control strategies for reducing BTV infection risk in dairy cattle will be highly reliant upon early efforts to reduce vector abundance during the months prior to peak transmission. PMID:27812161
A Stochastic Differential Equation Model for the Spread of HIV amongst People Who Inject Drugs.
Liang, Yanfeng; Greenhalgh, David; Mao, Xuerong
2016-01-01
We introduce stochasticity into the deterministic differential equation model for the spread of HIV amongst people who inject drugs (PWIDs) studied by Greenhalgh and Hay (1997). This was based on the original model constructed by Kaplan (1989), which analyses the behaviour of HIV/AIDS amongst a population of PWIDs. We derive a stochastic differential equation (SDE) for the fraction of PWIDs who are infected with HIV at time t. The stochasticity is introduced using the well-known standard technique of parameter perturbation. We first prove that the resulting SDE for the fraction of infected PWIDs has a unique solution in (0, 1) provided that some infected PWIDs are initially present, and we then construct the conditions required for extinction and persistence. Furthermore, we show that there exists a stationary distribution in the persistence case. Simulations using realistic parameter values are then constructed to illustrate and support our theoretical results. Our results provide new insight into the spread of HIV amongst PWIDs. The results show that the introduction of stochastic noise into a model for the spread of HIV amongst PWIDs can cause the disease to die out in scenarios where deterministic models predict disease persistence.
Weinberg, Seth H.; Smith, Gregory D.
2012-01-01
Cardiac myocyte calcium signaling is often modeled using deterministic ordinary differential equations (ODEs) and mass-action kinetics. However, spatially restricted “domains” associated with calcium influx are small enough (e.g., 10^-17 liters) that local signaling may involve 1–100 calcium ions. Is it appropriate to model the dynamics of subspace calcium using deterministic ODEs or, alternatively, do we require stochastic descriptions that account for the fundamentally discrete nature of these local calcium signals? To address this question, we constructed a minimal Markov model of a calcium-regulated calcium channel and associated subspace. We compared the expected value of fluctuating subspace calcium concentration (a result that accounts for the small subspace volume) with the corresponding deterministic model (an approximation that assumes large system size). When subspace calcium did not regulate calcium influx, the deterministic and stochastic descriptions agreed. However, when calcium binding altered channel activity in the model, the continuous deterministic description often deviated significantly from the discrete stochastic model, unless the subspace volume is unrealistically large and/or the kinetics of the calcium binding are sufficiently fast. This principle was also demonstrated using a physiologically realistic model of calmodulin regulation of L-type calcium channels introduced by Yue and coworkers. PMID:23509597
Environmental influences on the 137Cs kinetics of the yellow-bellied turtle (Trachemys scripta)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peters, E.L.; Brisbin, L.I. Jr.
1996-02-01
Assessments of ecological risk require accurate predictions of contaminant dynamics in natural populations. However, simple deterministic models that assume constant uptake rates and elimination fractions may compromise both their ecological realism and their general application to animals with variable metabolism or diets. In particular, the temperature dependence of metabolic rates characteristic of ectotherms may lead to significant differences between observed and predicted contaminant kinetics. We examined the influence of a seasonally variable thermal environment on predicting the uptake and annual cycling of contaminants by ectotherms, using a temperature-dependent model of 137Cs kinetics in free-living yellow-bellied turtles, Trachemys scripta. We compared predictions from this model with those of a deterministic negative exponential model and a flexibly shaped Richards sigmoidal model. Concentrations of 137Cs in a population of this species in Pond B, a radionuclide-contaminated nuclear reactor cooling reservoir, and 137Cs uptake by uncontaminated turtles held captive in Pond B for 4 yr confirmed both the pattern of uptake and the equilibrium concentrations predicted by the temperature-dependent model. Almost 90% of the variance in the predicted time-integrated 137Cs concentration was explainable by linear relationships with model parameters. The model was also relatively insensitive to uncertainties in the estimates of ambient temperature, suggesting that adequate estimates of temperature-dependent ingestion and elimination may require relatively few measurements of ambient conditions at sites of interest. Analyses of Richards sigmoidal models of 137Cs uptake indicated significant differences from a negative exponential trajectory in the 1st yr after the turtles' release into Pond B. 76 refs., 7 figs., 5 tabs.
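A minimal sketch of a temperature-dependent uptake-elimination model of the kind compared above: both ingestion and elimination of 137Cs are scaled by a Q10-type temperature factor driven by a seasonal sinusoid, and the result is contrasted with a constant-rate (negative exponential) model. All parameter values are illustrative assumptions, not those estimated for Trachemys scripta.

```python
import numpy as np

# Illustrative assumptions: Q10 temperature scaling of uptake and elimination,
# a sinusoidal seasonal temperature, and arbitrary reference rates.
Q10, T_ref = 2.5, 20.0
I_ref, k_ref = 5.0, 0.02          # uptake (Bq/day) and elimination (1/day) at T_ref

def temp(day):
    return 20.0 + 10.0 * np.sin(2 * np.pi * day / 365.0)

def q10(T):
    return Q10 ** ((T - T_ref) / 10.0)

def simulate(days=4 * 365, dt=1.0):
    c_var, c_const = 0.0, 0.0
    for d in range(days):
        f = q10(temp(d))
        c_var += dt * (I_ref * f - k_ref * f * c_var)     # temperature-dependent model
        c_const += dt * (I_ref - k_ref * c_const)         # constant-rate (exponential) model
        if d % 365 == 364:
            print(f"end of year {d // 365 + 1}: "
                  f"temperature-dependent {c_var:7.1f}   constant-rate {c_const:7.1f}")

simulate()
```

Because both rates share the same temperature factor here, the two models approach similar equilibria but differ in their seasonal cycling and approach rate, which is the qualitative contrast drawn in the study.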
LEGEND, a LEO-to-GEO Environment Debris Model
NASA Technical Reports Server (NTRS)
Liou, Jer Chyi; Hall, Doyle T.
2013-01-01
LEGEND (LEO-to-GEO Environment Debris model) is a three-dimensional orbital debris evolutionary model that is capable of simulating the historical and future debris populations in the near-Earth environment. The historical component in LEGEND adopts a deterministic approach to mimic the known historical populations. Launched rocket bodies, spacecraft, and mission-related debris (rings, bolts, etc.) are added to the simulated environment. Known historical breakup events are reproduced, and fragments down to 1 mm in size are created. The LEGEND future projection component adopts a Monte Carlo approach and uses an innovative pair-wise collision probability evaluation algorithm to simulate the future breakups and the growth of the debris populations. This algorithm is based on a new "random sampling in time" approach that preserves characteristics of the traditional approach and captures the rapidly changing nature of the orbital debris environment. LEGEND is a Fortran 90-based numerical simulation program. It operates in a UNIX/Linux environment.
A stochastic-field description of finite-size spiking neural networks
Longtin, André
2017-01-01
Neural network dynamics are governed by the interaction of spiking neurons. Stochastic aspects of single-neuron dynamics propagate up to the network level and shape the dynamical and informational properties of the population. Mean-field models of population activity disregard the finite-size stochastic fluctuations of network dynamics and thus offer a deterministic description of the system. Here, we derive a stochastic partial differential equation (SPDE) describing the temporal evolution of the finite-size refractory density, which represents the proportion of neurons in a given refractory state at any given time. The population activity—the density of active neurons per unit time—is easily extracted from this refractory density. The SPDE includes finite-size effects through a two-dimensional Gaussian white noise that acts both in time and along the refractory dimension. For an infinite number of neurons the standard mean-field theory is recovered. A discretization of the SPDE along its characteristic curves allows direct simulations of the activity of large but finite spiking networks; this constitutes the main advantage of our approach. Linearizing the SPDE with respect to the deterministic asynchronous state allows the theoretical investigation of finite-size activity fluctuations. In particular, analytical expressions for the power spectrum and autocorrelation of activity fluctuations are obtained. Moreover, our approach can be adapted to incorporate multiple interacting populations and quasi-renewal single-neuron dynamics. PMID:28787447
Sensitive response of a model of symbiotic ecosystem to seasonal periodic drive
NASA Astrophysics Data System (ADS)
Rekker, A.; Lumi, N.; Mankin, R.
2014-11-01
A symbiotic ecosystem (metapopulation) is studied by means of the stochastic Lotka-Volterra model with generalized Verhulst self-regulation. The effect of a variable environment on the carrying capacities of populations is taken into account as an asymmetric dichotomous noise and as a deterministic periodic stimulus. In the framework of the mean-field theory an explicit self-consistency equation for the system in the long-time limit is presented. Also, expressions for the probability distribution and for the moments of the population size are found. In certain cases the mean population size exhibits large oscillations in time, even if the amplitude of the seasonal environmental drive is small. Particularly, it is shown that the occurrence of large oscillations of the mean population size can be controlled by noise parameters (such as amplitude and correlation time) and by the coupling strength of the symbiotic interaction between species.
Deterministic models for traffic jams
NASA Astrophysics Data System (ADS)
Nagel, Kai; Herrmann, Hans J.
1993-10-01
We study several deterministic one-dimensional traffic models. For integer positions and velocities we find the typical high- and low-density phases separated by a simple transition. If positions and velocities are continuous variables, the model shows self-organized criticality driven by the slowest car.
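For the integer case, a deterministic rule of this kind can be stated in a few lines: each car accelerates by one unit up to a maximum speed but never moves further than the gap to the car ahead, and all cars move in parallel. The road length, density and maximum speed below are illustrative choices, not the exact models of the paper.

```python
import numpy as np

# A minimal deterministic traffic rule with integer positions and velocities
# (no random braking) on a ring road of L cells.
L, v_max, density, steps = 100, 5, 0.2, 200

n_cars = int(density * L)
pos = np.sort(np.random.default_rng(10).choice(L, n_cars, replace=False))
vel = np.zeros(n_cars, dtype=int)

for _ in range(steps):
    gaps = (np.roll(pos, -1) - pos - 1) % L          # empty cells to the car ahead
    vel = np.minimum(np.minimum(vel + 1, v_max), gaps)
    pos = (pos + vel) % L                            # all cars move in parallel

flow = vel.mean() * density
print(f"density {density}: mean speed {vel.mean():.2f}, flow {flow:.3f}")
```

Sweeping the density in such a sketch reproduces the free-flow and jammed phases separated by a simple transition.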
Benedetti-Cecchi, Lisandro; Canepa, Antonio; Fuentes, Veronica; Tamburello, Laura; Purcell, Jennifer E; Piraino, Stefano; Roberts, Jason; Boero, Ferdinando; Halpin, Patrick
2015-01-01
Jellyfish outbreaks are increasingly viewed as a deterministic response to escalating levels of environmental degradation and climate extremes. However, a comprehensive understanding of the influence of deterministic drivers and stochastic environmental variations favouring population renewal processes has remained elusive. This study quantifies the deterministic and stochastic components of environmental change that lead to outbreaks of the jellyfish Pelagia noctiluca in the Mediterranean Sea. Using data on jellyfish abundance collected at 241 sites along the Catalan coast from 2007 to 2010 we: (1) tested hypotheses about the influence of time-varying and spatial predictors of jellyfish outbreaks; (2) evaluated the relative importance of stochastic vs. deterministic forcing of outbreaks through the environmental bootstrap method; and (3) quantified return times of extreme events. Outbreaks were common in May and June and less likely in other summer months, which resulted in a negative relationship between outbreaks and SST. Cross- and along-shore advection by geostrophic flow were important concentrating forces of jellyfish, but most outbreaks occurred in the proximity of two canyons in the northern part of the study area. This result supported the recent hypothesis that canyons can funnel P. noctiluca blooms towards shore during upwelling. This can be a general, yet unappreciated mechanism leading to outbreaks of holoplanktonic jellyfish species. The environmental bootstrap indicated that stochastic environmental fluctuations have negligible effects on return times of outbreaks. Our analysis emphasized the importance of deterministic processes leading to jellyfish outbreaks compared to the stochastic component of environmental variation. A better understanding of how environmental drivers affect demographic and population processes in jellyfish species will increase the ability to anticipate jellyfish outbreaks in the future.
Hartmann, Christoph; Lazar, Andreea; Nessler, Bernhard; Triesch, Jochen
2015-01-01
Even in the absence of sensory stimulation the brain is spontaneously active. This background “noise” seems to be the dominant cause of the notoriously high trial-to-trial variability of neural recordings. Recent experimental observations have extended our knowledge of trial-to-trial variability and spontaneous activity in several directions: 1. Trial-to-trial variability systematically decreases following the onset of a sensory stimulus or the start of a motor act. 2. Spontaneous activity states in sensory cortex outline the region of evoked sensory responses. 3. Across development, spontaneous activity aligns itself with typical evoked activity patterns. 4. The spontaneous brain activity prior to the presentation of an ambiguous stimulus predicts how the stimulus will be interpreted. At present it is unclear how these observations relate to each other and how they arise in cortical circuits. Here we demonstrate that all of these phenomena can be accounted for by a deterministic self-organizing recurrent neural network model (SORN), which learns a predictive model of its sensory environment. The SORN comprises recurrently coupled populations of excitatory and inhibitory threshold units and learns via a combination of spike-timing dependent plasticity (STDP) and homeostatic plasticity mechanisms. Similar to balanced network architectures, units in the network show irregular activity and variable responses to inputs. Additionally, however, the SORN exhibits sequence learning abilities matching recent findings from visual cortex and the network’s spontaneous activity reproduces the experimental findings mentioned above. Intriguingly, the network’s behaviour is reminiscent of sampling-based probabilistic inference, suggesting that correlates of sampling-based inference can develop from the interaction of STDP and homeostasis in deterministic networks. We conclude that key observations on spontaneous brain activity and the variability of neural responses can be accounted for by a simple deterministic recurrent neural network which learns a predictive model of its sensory environment via a combination of generic neural plasticity mechanisms. PMID:26714277
Hpm of Estrogen Model on the Dynamics of Breast Cancer
NASA Astrophysics Data System (ADS)
Govindarajan, A.; Balamuralitharan, S.; Sundaresan, T.
2018-04-01
We develop a deterministic mathematical model of the dynamics of breast cancer with an immune response. This population model includes classes for normal cells, tumor cells, immune cells and estrogen. The effects of estrogen are incorporated in the model; they indicate that the presence of excess estrogen increases the risk of developing breast cancer. Furthermore, approximate solutions of the nonlinear differential equations are obtained by the Homotopy Perturbation Method (HPM). He's HPM is an effective and accurate technique for solving nonlinear differential equations directly. The approximate solutions obtained with this method agree well with the actual behaviour of the model.
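For reference, the generic homotopy construction on which HPM rests, written for an abstract equation L(u) + N(u) = f(r) with linear part L and nonlinear part N; this is the standard textbook form of He's method, not the specific breast-cancer system of the paper.

```latex
% He's homotopy perturbation construction (generic form)
H(v,p) = (1-p)\,\bigl[L(v) - L(u_0)\bigr] + p\,\bigl[L(v) + N(v) - f(r)\bigr] = 0,
\qquad p \in [0,1],

v = v_0 + p\,v_1 + p^{2} v_2 + \cdots,
\qquad
u = \lim_{p \to 1} v = v_0 + v_1 + v_2 + \cdots
```

The zeroth-order problem reproduces the chosen initial approximation u_0, and equating powers of p yields the successive corrections v_1, v_2, ..., which are summed at p = 1 to form the approximate solution.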
Seven challenges for metapopulation models of epidemics, including households models.
Ball, Frank; Britton, Tom; House, Thomas; Isham, Valerie; Mollison, Denis; Pellis, Lorenzo; Scalia Tomba, Gianpaolo
2015-03-01
This paper considers metapopulation models in the general sense, i.e. where the population is partitioned into sub-populations (groups, patches,...), irrespective of the biological interpretation they have, e.g. spatially segregated large sub-populations, small households or hosts themselves modelled as populations of pathogens. This framework has traditionally provided an attractive approach to incorporating more realistic contact structure into epidemic models, since it often preserves analytic tractability (in stochastic as well as deterministic models) but also captures the most salient structural inhomogeneity in contact patterns in many applied contexts. Despite the progress that has been made in both the theory and application of such metapopulation models, we present here several major challenges that remain for future work, focusing on models that, in contrast to agent-based ones, are amenable to mathematical analysis. The challenges range from clarifying the usefulness of systems of weakly-coupled large sub-populations in modelling the spread of specific diseases to developing a theory for endemic models with household structure. They include also developing inferential methods for data on the emerging phase of epidemics, extending metapopulation models to more complex forms of human social structure, developing metapopulation models to reflect spatial population structure, developing computationally efficient methods for calculating key epidemiological model quantities, and integrating within- and between-host dynamics in models. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
Modelling the effect of an alternative host population on the spread of citrus Huanglongbing
NASA Astrophysics Data System (ADS)
d'A. Vilamiu, Raphael G.; Ternes, Sonia; Laranjeira, Francisco F.; de C. Santos, Tâmara T.
2013-10-01
The objective of this work was to model the spread of citrus Huanglongbing (HLB) considering the presence of a population of alternative hosts (Murraya paniculata). We developed a compartmental deterministic mathematical model for representing the dynamics of HLB disease in a citrus orchard, including delays in the latency and incubation phases of the disease in the plants and a delay period for the nymphal stage of Diaphorina citri, the insect vector of HLB in Brazil. The results of numerical simulations indicate that alternative hosts should not play a crucial role in HLB dynamics considering a typical scenario for the Recôncavo Baiano region in Brazil. Also, the current policy of removing symptomatic plants every three months should not be expected to significantly hinder HLB spread.
Chigidi, Esther; Lungu, Edward M
2009-07-01
We formulate an HIV/AIDS deterministic model which incorporates differential infectivity and disease progression for treatment-naive and treatment-experienced HIV/AIDS infectives. To illustrate our model, we have applied it to estimate adult HIV prevalence, the HIV population, the number of new infectives and the number of AIDS deaths for Botswana for the period 1984 to 2012. It is found that the prevalence peaked in the year 2000 and the HIV population is now decreasing. We have also found that, under the current conditions, the reproduction number is Rc approximately 1.3, which is less than the 2004 estimate of Rc approximately equal to 4 by [11] and [13]. The results in this study suggest that the HAART program has yielded positive results for Botswana.
Kin groups and trait groups: population structure and epidemic disease selection.
Fix, A G
1984-10-01
A Monte Carlo simulation based on the population structure of a small-scale human population, the Semai Senoi of Malaysia, has been developed to study the combined effects of group, kin, and individual selection. The population structure resembles D.S. Wilson's structured deme model in that local breeding populations (Semai settlements) are subdivided into trait groups (hamlets) that may be kin-structured and are not themselves demes. Additionally, settlement breeding populations are connected by two-dimensional stepping-stone migration approaching 30% per generation. Group and kin-structured group selection occur among hamlets the survivors of which then disperse to breed within the settlement population. Genetic drift is modeled by the process of hamlet formation; individual selection as a deterministic process, and stepping-stone migration as either random or kin-structured migrant groups. The mechanism for group selection is epidemics of infectious disease that can wipe out small hamlets particularly if most adults become sick and social life collapses. Genetic resistance to a disease is an individual attribute; however, hamlet groups with several resistant adults are less likely to disintegrate and experience high social mortality. A specific human gene, hemoglobin E, which confers resistance to malaria, is studied as an example of the process. The results of the simulations show that high genetic variance among hamlet groups may be generated by moderate degrees of kin-structuring. This strong microdifferentiation provides the potential for group selection. The effect of group selection in this case is rapid increase in gene frequencies among the total set of populations. In fact, group selection in concert with individual selection produced a faster rate of gene frequency increase among a set of 25 populations than the rate within a single unstructured population subject to deterministic individual selection. Such rapid evolution with plausible rates of extinction, individual selection, and migration and a population structure realistic in its general form, has implications for specific human polymorphisms such as hemoglobin variants and for the more general problem of the tempo of evolution as well.
Realistic Simulation for Body Area and Body-To-Body Networks
Alam, Muhammad Mahtab; Ben Hamida, Elyes; Ben Arbia, Dhafer; Maman, Mickael; Mani, Francesco; Denis, Benoit; D’Errico, Raffaele
2016-01-01
In this paper, we present an accurate and realistic simulation for body area networks (BAN) and body-to-body networks (BBN) using deterministic and semi-deterministic approaches. First, in the semi-deterministic approach, a real-time measurement campaign is performed, which is further characterized through statistical analysis. It is able to generate link-correlated and time-varying realistic traces (i.e., with consistent mobility patterns) for on-body and body-to-body shadowing and fading, including body orientations and rotations, by means of stochastic channel models. The full deterministic approach is particularly targeted to enhance IEEE 802.15.6 proposed channel models by introducing space and time variations (i.e., dynamic distances) through biomechanical modeling. In addition, it helps to accurately model the radio link by identifying the link types and corresponding path loss factors for line of sight (LOS) and non-line of sight (NLOS). This approach is particularly important for links that vary over time due to mobility. It is also important to add that the communication and protocol stack, including the physical (PHY), medium access control (MAC) and networking models, is developed for BAN and BBN, and the IEEE 802.15.6 compliance standard is provided as a benchmark for future research works of the community. Finally, the two approaches are compared in terms of the successful packet delivery ratio, packet delay and energy efficiency. The results show that the semi-deterministic approach is the best option; however, for the diversity of the mobility patterns and scenarios applicable, biomechanical modeling and the deterministic approach are better choices. PMID:27104537
A parallel implementation of an off-lattice individual-based model of multicellular populations
NASA Astrophysics Data System (ADS)
Harvey, Daniel G.; Fletcher, Alexander G.; Osborne, James M.; Pitt-Francis, Joe
2015-07-01
As computational models of multicellular populations include ever more detailed descriptions of biophysical and biochemical processes, the computational cost of simulating such models limits their ability to generate novel scientific hypotheses and testable predictions. While developments in microchip technology continue to increase the power of individual processors, parallel computing offers an immediate increase in available processing power. To make full use of parallel computing technology, it is necessary to develop specialised algorithms. To this end, we present a parallel algorithm for a class of off-lattice individual-based models of multicellular populations. The algorithm divides the spatial domain between computing processes and comprises communication routines that ensure the model is correctly simulated on multiple processors. The parallel algorithm is shown to accurately reproduce the results of a deterministic simulation performed using a pre-existing serial implementation. We test the scaling of computation time, memory use and load balancing as more processes are used to simulate a cell population of fixed size. We find approximate linear scaling of both speed-up and memory consumption on up to 32 processor cores. Dynamic load balancing is shown to provide speed-up for non-regular spatial distributions of cells in the case of a growing population.
Quantifying and Mitigating the Effect of Preferential Sampling on Phylodynamic Inference
Karcher, Michael D.; Palacios, Julia A.; Bedford, Trevor; Suchard, Marc A.; Minin, Vladimir N.
2016-01-01
Phylodynamics seeks to estimate effective population size fluctuations from molecular sequences of individuals sampled from a population of interest. One way to accomplish this task is to formulate an observed sequence data likelihood that exploits a coalescent model for the sampled individuals’ genealogy and then to integrate over all possible genealogies via Monte Carlo or, less efficiently, to condition on one genealogy estimated from the sequence data. However, when analyzing sequences sampled serially through time, current methods implicitly assume either that sampling times are fixed deterministically by the data collection protocol or that their distribution does not depend on the size of the population. Through simulation, we first show that, when sampling times do probabilistically depend on effective population size, estimation methods may be systematically biased. To correct for this deficiency, we propose a new model that explicitly accounts for preferential sampling by modeling the sampling times as an inhomogeneous Poisson process dependent on effective population size. We demonstrate that in the presence of preferential sampling our new model not only reduces bias, but also improves estimation precision. Finally, we compare the performance of the currently used phylodynamic methods with our proposed model through clinically-relevant, seasonal human influenza examples. PMID:26938243
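The inhomogeneous Poisson sampling model described in the abstract above lends itself to a short illustration. The sketch below draws sampling times whose rate is proportional to an arbitrary effective-population-size trajectory using Lewis-Shedler thinning; the rate form beta*Ne(t), the seasonal Ne curve and all parameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sample_times_thinning(ne, t_max, beta, rng=None):
    """Draw sampling times from an inhomogeneous Poisson process with
    rate beta * ne(t), using Lewis-Shedler thinning.  `ne` is any
    callable returning effective population size at time t."""
    rng = np.random.default_rng(rng)
    # Upper bound on the rate over [0, t_max] (a coarse grid suffices here).
    grid = np.linspace(0.0, t_max, 1000)
    lam_max = beta * max(ne(t) for t in grid)
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)          # candidate event from homogeneous process
        if t > t_max:
            break
        if rng.uniform() < beta * ne(t) / lam_max:   # accept with probability lambda(t)/lam_max
            times.append(t)
    return np.array(times)

# Example: a seasonal effective population size concentrates sampling at its peaks.
ne = lambda t: 100.0 + 80.0 * np.sin(2 * np.pi * t)
print(sample_times_thinning(ne, t_max=3.0, beta=0.05, rng=1))
```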
Jewett, Ethan M.; Steinrücken, Matthias; Song, Yun S.
2016-01-01
Many approaches have been developed for inferring selection coefficients from time series data while accounting for genetic drift. These approaches have been motivated by the intuition that properly accounting for the population size history can significantly improve estimates of selective strengths. However, the improvement in inference accuracy that can be attained by modeling drift has not been characterized. Here, by comparing maximum likelihood estimates of selection coefficients that account for the true population size history with estimates that ignore drift by assuming allele frequencies evolve deterministically in a population of infinite size, we address the following questions: how much can modeling the population size history improve estimates of selection coefficients? How much can mis-inferred population sizes hurt inferences of selection coefficients? We conduct our analysis under the discrete Wright–Fisher model by deriving the exact probability of an allele frequency trajectory in a population of time-varying size and we replicate our results under the diffusion model. For both models, we find that ignoring drift leads to estimates of selection coefficients that are nearly as accurate as estimates that account for the true population history, even when population sizes are small and drift is high. This result is of interest because inference methods that ignore drift are widely used in evolutionary studies and can be many orders of magnitude faster than methods that account for population sizes. PMID:27550904
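As a companion to the comparison described above, the following sketch contrasts the two likelihood treatments for a single biallelic locus: a drift-aware trajectory probability under the discrete Wright-Fisher model (a deterministic selection step followed by binomial resampling) versus a purely deterministic frequency trajectory that ignores drift. The genic-selection parameterization and the toy data are assumptions for illustration only, not the authors' exact likelihood.

```python
import numpy as np
from scipy.stats import binom

def wf_traj_logprob(counts, N, s):
    """Log-probability of an allele-count trajectory under the discrete
    Wright-Fisher model with selection coefficient s and population size N."""
    lp = 0.0
    for x0, x1 in zip(counts[:-1], counts[1:]):
        p = x0 / N
        p_sel = p * (1 + s) / (1 + s * p)       # deterministic selection step
        lp += binom.logpmf(x1, N, p_sel)        # binomial drift step
    return lp

def deterministic_traj(p0, s, gens):
    """Allele-frequency trajectory ignoring drift (infinite population size)."""
    p, traj = p0, [p0]
    for _ in range(gens):
        p = p * (1 + s) / (1 + s * p)
        traj.append(p)
    return np.array(traj)

counts = [50, 60, 75, 90, 120]                  # toy allele counts over generations
print(wf_traj_logprob(counts, N=500, s=0.1))
print(deterministic_traj(0.1, 0.1, 4))
```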
The Total Exposure Model (TEM) uses deterministic and stochastic methods to estimate the exposure of a person performing daily activities of eating, drinking, showering, and bathing. There were 250 time histories generated, by subject with activities, for the three exposure ro...
Spread of a disease and its effect on population dynamics in an eco-epidemiological system
NASA Astrophysics Data System (ADS)
Upadhyay, Ranjit Kumar; Roy, Parimita
2014-12-01
In this paper, an eco-epidemiological model with a simple law of mass action and a modified Holling type II functional response has been proposed and analyzed to understand how a disease may spread among natural populations. The proposed model is a modification of the model presented by Upadhyay et al. (2008) [1]. The existence of the equilibria and their stability (linear and nonlinear) have been studied. The dynamical transitions in the model have been studied by identifying backward Hopf bifurcations and by demonstrating the period-doubling route to chaos when the death rate of the predator (μ1) and the growth rate of the susceptible prey population (r) are treated as bifurcation parameters. Our studies show that the system exhibits deterministic chaos when some control parameters attain their critical values. Chaotic dynamics is depicted using 2D parameter scans and bifurcation analysis. Possible implications of the results for disease eradication or its control are discussed.
Extinction in neutrally stable stochastic Lotka-Volterra models
NASA Astrophysics Data System (ADS)
Dobrinevski, Alexander; Frey, Erwin
2012-05-01
Populations of competing biological species exhibit a fascinating interplay between the nonlinear dynamics of evolutionary selection forces and random fluctuations arising from the stochastic nature of the interactions. The processes leading to extinction of species, whose understanding is a key component in the study of evolution and biodiversity, are influenced by both of these factors. Here, we investigate a class of stochastic population dynamics models based on generalized Lotka-Volterra systems. In the case of neutral stability of the underlying deterministic model, the impact of intrinsic noise on the survival of species is dramatic: It destroys coexistence of interacting species on a time scale proportional to the population size. We introduce a new method based on stochastic averaging which allows one to understand this extinction process quantitatively by reduction to a lower-dimensional effective dynamics. This is performed analytically for two highly symmetrical models and can be generalized numerically to more complex situations. The extinction probability distributions and other quantities of interest we obtain show excellent agreement with simulations.
Adaptation to Temporally Fluctuating Environments by the Evolution of Maternal Effects.
Dey, Snigdhadip; Proulx, Stephen R; Teotónio, Henrique
2016-02-01
All organisms live in temporally fluctuating environments. Theory predicts that the evolution of deterministic maternal effects (i.e., anticipatory maternal effects or transgenerational phenotypic plasticity) underlies adaptation to environments that fluctuate in a predictably alternating fashion over maternal-offspring generations. In contrast, randomizing maternal effects (i.e., diversifying and conservative bet-hedging) are expected to evolve in response to unpredictably fluctuating environments. Although maternal effects are common, evidence for their adaptive significance is equivocal since they can easily evolve as a correlated response to maternal selection and may or may not increase the future fitness of offspring. Using the hermaphroditic nematode Caenorhabditis elegans, we here show that the experimental evolution of maternal glycogen provisioning underlies adaptation to a fluctuating normoxia-anoxia hatching environment by increasing embryo survival under anoxia. In strictly alternating environments, we found that hermaphrodites evolved the ability to increase embryo glycogen provisioning when they experienced normoxia and to decrease embryo glycogen provisioning when they experienced anoxia. At odds with existing theory, however, populations facing irregularly fluctuating normoxia-anoxia hatching environments failed to evolve randomizing maternal effects. Instead, adaptation in these populations may have occurred through the evolution of fitness effects that percolate over multiple generations, as they maintained considerably high expected growth rates during experimental evolution despite evolving reduced fecundity and reduced embryo survival under one or two generations of anoxia. We develop theoretical models that explain why adaptation to a wide range of patterns of environmental fluctuations hinges on the existence of deterministic maternal effects, and that such deterministic maternal effects are more likely to contribute to adaptation than randomizing maternal effects.
NASA Astrophysics Data System (ADS)
Lambrou, George I.; Chatziioannou, Aristotelis; Vlahopoulos, Spiros; Moschovi, Maria; Chrousos, George P.
Biological systems are dynamic and possess properties that depend on two key elements: initial conditions and the response of the system over time. Conceptualizing this on tumor models will influence conclusions drawn with regard to disease initiation and progression. Alterations in initial conditions dynamically reshape the properties of proliferating tumor cells. The present work aims to test the hypothesis of Wolfrom et al. that proliferation shows evidence for deterministic chaos, in a manner such that subtle differences in the initial conditions give rise to non-linear response behavior of the system. Their hypothesis, tested on adherent Fao rat hepatoma cells, provides evidence that these cells manifest aperiodic oscillations in their proliferation rate. We have tested this hypothesis with some modifications to the proposed experimental setup. We have used the acute lymphoblastic leukemia cell line CCRF-CEM, as it provides an excellent substrate for modeling proliferation dynamics. Measurements were taken at time points varying from 24 h to 48 h, extending the assayed populations beyond those of previously published reports that dealt with the complex dynamic behavior of animal cell populations. We conducted flow cytometry studies to examine the apoptotic and necrotic rate of the system, as well as DNA content changes of the cells over time. The cells exhibited a proliferation rate of nonlinear nature, as this rate presented oscillatory behavior. The obtained data have been fitted to known growth models, such as the logistic and Gompertzian models.
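Since the abstract above reports fitting the observed counts to logistic and Gompertzian growth, here is a minimal curve-fitting sketch of that step, assuming standard three-parameter forms of both models; the synthetic data, parameter values and SciPy-based fitting routine are illustrative stand-ins rather than the study's actual analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, N0, r):
    # Logistic growth: N(0) = N0, carrying capacity K, rate r.
    return K / (1 + (K / N0 - 1) * np.exp(-r * t))

def gompertz(t, K, N0, r):
    # Gompertz growth with the same interpretation of K, N0 and r.
    return K * np.exp(np.log(N0 / K) * np.exp(-r * t))

# Synthetic cell counts at 24 h intervals (placeholder data, not the study's).
t = np.arange(0, 240, 24.0)
counts = logistic(t, 1e6, 5e4, 0.05) * np.random.default_rng(0).normal(1, 0.05, t.size)

for model in (logistic, gompertz):
    p, _ = curve_fit(model, t, counts, p0=(1e6, 5e4, 0.05), maxfev=10000)
    resid = counts - model(t, *p)
    print(model.__name__, p, "SSE =", float(resid @ resid))
```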
Grenfell, B T; Lonergan, M E; Harwood, J
1992-04-20
This paper uses simple mathematical models to examine the long-term dynamic consequences of the 1988 epizootic of phocine distemper virus (PDV) infection in Northern European common seal populations. In a preliminary analysis of single outbreaks of infection, deterministic compartmental models are used to estimate feasible ranges for the transmission rate of the infection and the level of disease-induced mortality. These results also indicate that the level of transmission in 1988 was probably sufficient to eradicate the infection throughout the Northern European common seal populations by the end of the first outbreak. An analysis of longer-term infection dynamics, which takes account of the density-dependent recovery of seal population levels, corroborates this finding. It also indicates that a reintroduction of the virus would be unlikely to cause an outbreak on the scale of the 1988 epizootic until the seal population had recovered for at least 10 years. The general ecological implications of these results are discussed.
NASA Astrophysics Data System (ADS)
Addor, N.; Jaun, S.; Fundel, F.; Zappa, M.
2012-04-01
The Sihl River flows through Zurich, Switzerland's most populated city, for which it represents the largest flood threat. To anticipate extreme discharge events and provide decision support in case of flood risk, a hydrometeorological ensemble prediction system (HEPS) was launched operationally in 2008. This model chain relies on deterministic (COSMO-7) and probabilistic (COSMO-LEPS) atmospheric forecasts, which are used to force a semi-distributed hydrological model (PREVAH) coupled to a hydraulic model (FLORIS). The resulting hydrological forecasts are eventually communicated to the stakeholders involved in the Sihl discharge management. This fully operational setting provides a real framework with which we assessed the potential of deterministic and probabilistic discharge forecasts for flood mitigation. To study the suitability of HEPS for small-scale basins and to quantify the added value conveyed by the probability information, a 31-month reforecast was produced for the Sihl catchment (336 km2). Several metrics support the conclusion that the performance gain amounts to up to 2 days of lead time for the catchment considered. Brier skill scores show that probabilistic hydrological forecasts outperform their deterministic counterparts for all the lead times and event intensities considered. The small size of the Sihl catchment does not prevent skillful discharge forecasts, but makes them particularly dependent on correct precipitation forecasts. Our evaluation stresses that the capacity of the model to provide confident and reliable mid-term probability forecasts for high discharges is limited. We finally highlight challenges for making decisions on the basis of hydrological predictions, and discuss the need for a tool to be used in addition to forecasts to compare the different mitigation actions possible in the Sihl catchment.
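The Brier skill score used above to compare probabilistic and deterministic forecasts has a simple form: BSS = 1 - BS/BS_ref, where BS is the mean squared difference between forecast probabilities and binary outcomes. A minimal sketch follows, with toy exceedance events and a deterministic 0/1 reference forecast chosen purely for illustration.

```python
import numpy as np

def brier_score(prob_forecast, outcome):
    """Mean squared error of probabilistic forecasts of a binary event."""
    prob_forecast, outcome = np.asarray(prob_forecast), np.asarray(outcome)
    return np.mean((prob_forecast - outcome) ** 2)

def brier_skill_score(prob_forecast, outcome, reference):
    """BSS = 1 - BS / BS_ref; positive values indicate skill over the reference."""
    return 1.0 - brier_score(prob_forecast, outcome) / brier_score(reference, outcome)

# Toy example: ensemble exceedance probabilities vs a deterministic (0/1) forecast.
outcome       = [0, 0, 1, 1, 0, 1, 0, 0]                 # discharge threshold exceeded?
ensemble_prob = [0.1, 0.2, 0.7, 0.9, 0.3, 0.6, 0.1, 0.2]
deterministic = [0, 0, 1, 1, 1, 0, 0, 0]                 # single-model yes/no forecast
print(brier_skill_score(ensemble_prob, outcome, deterministic))
```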
A kinetic theory for age-structured stochastic birth-death processes
NASA Astrophysics Data System (ADS)
Chou, Tom; Greenman, Chris
Classical age-structured mass-action models such as the McKendrick-von Foerster equation have been extensively studied but they are structurally unable to describe stochastic fluctuations or population-size-dependent birth and death rates. Conversely, current theories that include size-dependent population dynamics (e.g., carrying capacity) cannot be easily extended to take into account age-dependent birth and death rates. In this paper, we present a systematic derivation of a new fully stochastic kinetic theory for interacting age-structured populations. By defining multiparticle probability density functions, we derive a hierarchy of kinetic equations for the stochastic evolution of an aging population undergoing birth and death. We show that the fully stochastic age-dependent birth-death process precludes factorization of the corresponding probability densities, which then must be solved by using a BBGKY-like hierarchy. Our results generalize both deterministic models and existing master equation approaches by providing an intuitive and efficient way to simultaneously model age- and population-dependent stochastic dynamics applicable to the study of demography, stem cell dynamics, and disease evolution. NSF.
Stochastic and deterministic models for agricultural production networks.
Bai, P; Banks, H T; Dediu, S; Govan, A Y; Last, M; Lloyd, A L; Nguyen, H K; Olufsen, M S; Rempala, G; Slenning, B D
2007-07-01
An approach to modeling the impact of disturbances in an agricultural production network is presented. A stochastic model and its approximate deterministic model for averages over sample paths of the stochastic system are developed. Simulations, sensitivity and generalized sensitivity analyses are given. Finally, it is shown how diseases may be introduced into the network and corresponding simulations are discussed.
NASA Astrophysics Data System (ADS)
Gupta, Anubhav; Banerjee, Tanmoy; Dutta, Partha Sharathi
2017-10-01
Understanding the influence of the structure of a dispersal network on the species persistence and modeling a realistic species dispersal in nature are two central issues in spatial ecology. A realistic dispersal structure which favors the persistence of interacting ecological systems was studied [M. D. Holland and A. Hastings, Nature (London) 456, 792 (2008), 10.1038/nature07395], where it was shown that a randomization of the structure of a dispersal network in a metapopulation model of prey and predator increases the species persistence via clustering, prolonged transient dynamics, and amplitudes of population fluctuations. In this paper, by contrast, we show that a deterministic network topology in a metapopulation can also favor asynchrony and prolonged transient dynamics if species dispersal obeys a long-range interaction governed by a distance-dependent power law. To explore the effects of power-law coupling, we take a realistic ecological model, namely, the Rosenzweig-MacArthur model in each patch (node) of the network of oscillators, and show that the coupled system is driven from synchrony to asynchrony with an increase in the power-law exponent. Moreover, to understand the relationship between species persistence and variations in power-law exponent, we compute a correlation coefficient to characterize cluster formation, a synchrony order parameter, and median predator amplitude. We further show that smaller metapopulations with fewer patches are more vulnerable to extinction as compared to larger metapopulations with a higher number of patches. We believe that the present work improves our understanding of the interconnection between the random network and the deterministic network in theoretical ecology.
Characterizing Uncertainty and Variability in PBPK Models ...
Mode-of-action based risk and safety assessments can rely upon tissue dosimetry estimates in animals and humans obtained from physiologically-based pharmacokinetic (PBPK) modeling. However, risk assessment also increasingly requires characterization of uncertainty and variability; such characterization for PBPK model predictions represents a continuing challenge to both modelers and users. Current practices show significant progress in specifying deterministic biological models and the non-deterministic (often statistical) models, estimating their parameters using diverse data sets from multiple sources, and using them to make predictions and characterize uncertainty and variability. The International Workshop on Uncertainty and Variability in PBPK Models, held Oct 31-Nov 2, 2006, sought to identify the state-of-the-science in this area and recommend priorities for research and changes in practice and implementation. For the short term, these include: (1) multidisciplinary teams to integrate deterministic and non-deterministic/statistical models; (2) broader use of sensitivity analyses, including for structural and global (rather than local) parameter changes; and (3) enhanced transparency and reproducibility through more complete documentation of the model structure(s) and parameter values, the results of sensitivity and other analyses, and supporting, discrepant, or excluded data. Longer-term needs include: (1) theoretic and practical methodological impro...
Mejlholm, Ole; Bøknæs, Niels; Dalgaard, Paw
2015-02-01
A new stochastic model for the simultaneous growth of Listeria monocytogenes and lactic acid bacteria (LAB) was developed and validated on data from naturally contaminated samples of cold-smoked Greenland halibut (CSGH) and cold-smoked salmon (CSS). During industrial processing, acetic and/or lactic acids were added to these samples. The stochastic model was developed from an existing deterministic model including the effect of 12 environmental parameters and microbial interaction (O. Mejlholm and P. Dalgaard, Food Microbiology, submitted for publication). Observed maximum population density (MPD) values of L. monocytogenes in naturally contaminated samples of CSGH and CSS were accurately predicted by the stochastic model based on measured variability in product characteristics and storage conditions. Results comparable to those from the stochastic model were obtained when product characteristics of the least and most preserved samples of CSGH and CSS were used as input for the existing deterministic model. For both modelling approaches, it was shown that lag time and the effect of microbial interaction need to be included to accurately predict MPD values of L. monocytogenes. Addition of organic acids to CSGH and CSS was confirmed as a suitable mitigation strategy against the risk of growth by L. monocytogenes, as both types of products were in compliance with the EU regulation on ready-to-eat foods. Copyright © 2014 Elsevier Ltd. All rights reserved.
Gérard, Claude; Gonze, Didier; Lemaigre, Frédéric; Novák, Béla
2014-01-01
Recently, a molecular pathway linking inflammation to cell transformation has been discovered. This molecular pathway rests on a positive inflammatory feedback loop between NF-κB, Lin28, Let-7 microRNA and IL6, which leads to an epigenetic switch allowing cell transformation. A transient activation of an inflammatory signal, mediated by the oncoprotein Src, activates NF-κB, which elicits the expression of Lin28. Lin28 decreases the expression of Let-7 microRNA, which results in higher level of IL6 than achieved directly by NF-κB. In turn, IL6 can promote NF-κB activation. Finally, IL6 also elicits the synthesis of STAT3, which is a crucial activator for cell transformation. Here, we propose a computational model to account for the dynamical behavior of this positive inflammatory feedback loop. By means of a deterministic model, we show that an irreversible bistable switch between a transformed and a non-transformed state of the cell is at the core of the dynamical behavior of the positive feedback loop linking inflammation to cell transformation. The model indicates that inhibitors (tumor suppressors) or activators (oncogenes) of this positive feedback loop regulate the occurrence of the epigenetic switch by modulating the threshold of inflammatory signal (Src) needed to promote cell transformation. Both stochastic simulations and deterministic simulations of a heterogeneous cell population suggest that random fluctuations (due to molecular noise or cell-to-cell variability) are able to trigger cell transformation. Moreover, the model predicts that oncogenes/tumor suppressors respectively decrease/increase the robustness of the non-transformed state of the cell towards random fluctuations. Finally, the model accounts for the potential effect of competing endogenous RNAs, ceRNAs, on the dynamics of the epigenetic switch. Depending on their microRNA targets, the model predicts that ceRNAs could act as oncogenes or tumor suppressors by regulating the occurrence of cell transformation. PMID:24499937
Estimating the epidemic threshold on networks by deterministic connections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Kezan, E-mail: lkzzr@sohu.com; Zhu, Guanghu; Fu, Xinchu
2014-12-15
For many epidemic networks some connections between nodes are treated as deterministic, while the remainder are random and have different connection probabilities. By applying spectral analysis to several constructed models, we find that one can estimate the epidemic thresholds of these networks by investigating information from only the deterministic connections. Nonetheless, in these models, generic nonuniform stochastic connections and heterogeneous community structure are also considered. The estimation of epidemic thresholds is achieved via inequalities with upper and lower bounds, which are found to be in very good agreement with numerical simulations. Since these deterministic connections are easier to detect than the stochastic connections, this work provides a feasible and effective method to estimate the epidemic thresholds in real epidemic networks.
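A rough sense of how a threshold can be estimated from the deterministic connections alone is given by the classical mean-field result for SIS dynamics on a network, (beta/gamma)_c ≈ 1/lambda_max(A). The sketch below applies this to a small deterministic backbone and to the same backbone augmented with the expected stochastic links; the bracketing shown is only in the spirit of the upper and lower bounds mentioned above, not the paper's actual inequalities, and the network and probability p are made up for illustration.

```python
import numpy as np

def sis_threshold_estimate(adj):
    """Classical mean-field estimate of the SIS epidemic threshold on a
    network: (beta/gamma)_c ~= 1 / lambda_max(A)."""
    lam_max = np.max(np.real(np.linalg.eigvals(adj)))
    return 1.0 / lam_max

# Deterministic backbone (always-present links) of a small 4-node network.
A_det = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)

# Adding the *expected* stochastic links (probability p on the remaining pairs)
# gives a denser matrix whose threshold estimate is smaller, so the two values
# bracket the intermediate cases (illustrative only).
p = 0.2
A_full = A_det + p * ((1 - A_det) - np.eye(4))

print("deterministic links only    :", sis_threshold_estimate(A_det))
print("plus expected stochastic ones:", sis_threshold_estimate(A_full))
```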
Stochastic dynamics and logistic population growth
NASA Astrophysics Data System (ADS)
Méndez, Vicenç; Assaf, Michael; Campos, Daniel; Horsthemke, Werner
2015-06-01
The Verhulst model is probably the best known macroscopic rate equation in population ecology. It depends on two parameters, the intrinsic growth rate and the carrying capacity. These parameters can be estimated for different populations and are related to the reproductive fitness and the competition for limited resources, respectively. We investigate analytically and numerically the simplest possible microscopic scenarios that give rise to the logistic equation in the deterministic mean-field limit. We provide a definition of the two parameters of the Verhulst equation in terms of microscopic parameters. In addition, we derive the conditions for extinction or persistence of the population by employing either the momentum-space spectral theory or the real-space Wentzel-Kramers-Brillouin approximation to determine the probability distribution function and the mean time to extinction of the population. Our analytical results agree well with numerical simulations.
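One of the simplest microscopic scenarios of the kind discussed above is a birth-death process with per-capita birth rate b and a density-dependent death rate, whose mean-field limit is the Verhulst equation with r = b - d and K = (b - d)/c. A minimal Gillespie-type sketch under that assumed parameterization (not necessarily one of the specific scenarios analyzed in the paper):

```python
import numpy as np

def gillespie_logistic(n0, b, d, c, t_max, rng=None):
    """Exact stochastic simulation of birth rate b*n and death rate
    d*n + c*n*(n-1); its mean-field limit is dn/dt = (b-d)*n - c*n**2,
    i.e. the Verhulst equation with r = b - d and K = (b - d)/c."""
    rng = np.random.default_rng(rng)
    t, n, ts, ns = 0.0, n0, [0.0], [n0]
    while t < t_max and n > 0:
        birth = b * n
        death = d * n + c * n * (n - 1)
        total = birth + death
        t += rng.exponential(1.0 / total)            # waiting time to next event
        n += 1 if rng.uniform() < birth / total else -1
        ts.append(t); ns.append(n)
    return np.array(ts), np.array(ns)

b, d, c = 2.0, 1.0, 0.01                             # illustrative rates: r = 1, K = 100
ts, ns = gillespie_logistic(n0=5, b=b, d=d, c=c, t_max=20.0, rng=0)
print("final population:", ns[-1], "(deterministic carrying capacity:", (b - d) / c, ")")
```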
Complex Population Dynamics and the Coalescent Under Neutrality
Volz, Erik M.
2012-01-01
Estimates of the coalescent effective population size Ne can be poorly correlated with the true population size. The relationship between Ne and the population size is sensitive to the way in which birth and death rates vary over time. The problem of inference is exacerbated when the mechanisms underlying population dynamics are complex and depend on many parameters. In instances where nonparametric estimators of Ne such as the skyline struggle to reproduce the correct demographic history, model-based estimators that can draw on prior information about population size and growth rates may be more efficient. A coalescent model is developed for a large class of populations such that the demographic history is described by a deterministic nonlinear dynamical system of arbitrary dimension. This class of demographic model differs from those typically used in population genetics. Birth and death rates are not fixed, and no assumptions are made regarding the fraction of the population sampled. Furthermore, the population may be structured in such a way that gene copies reproduce both within and across demes. For this large class of models, it is shown how to derive the rate of coalescence, as well as the likelihood of a gene genealogy with heterochronous sampling and labeled taxa, and how to simulate a coalescent tree conditional on a complex demographic history. This theoretical framework encapsulates many of the models used by ecologists and epidemiologists and should facilitate the integration of population genetics with the study of mathematical population dynamics. PMID:22042576
Effect of sample volume on metastable zone width and induction time
NASA Astrophysics Data System (ADS)
Kubota, Noriaki
2012-04-01
The metastable zone width (MSZW) and the induction time measured for a large sample (say, >0.1 L) are reproducible and deterministic, while for a small sample (say, <1 mL) these values are irreproducible and stochastic. Such behaviors of MSZW and induction time were discussed theoretically with both stochastic and deterministic models. Equations for the distributions of the stochastic MSZW and induction time were derived. The average values of the stochastic MSZW and induction time both decreased with an increase in sample volume, while the deterministic MSZW and induction time remained unchanged. These different behaviors with variation in sample volume were explained in terms of the detection sensitivity of crystallization events. The average values of MSZW and induction time in the stochastic model were compared with the deterministic MSZW and induction time, respectively. Literature data reported for paracetamol aqueous solution were explained theoretically with the presented models.
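For orientation, a generic single-nucleation description consistent with the qualitative behavior reported above (and not necessarily the exact equations derived in the paper) is the following: with a stationary nucleation rate J per unit volume, the stochastic induction time of a sample of volume V is exponentially distributed,

```latex
\Pr(t_{\mathrm{ind}} > t) \;=\; e^{-J V t},
\qquad
\langle t_{\mathrm{ind}} \rangle \;=\; \frac{1}{J V},
```

so the average stochastic induction time shrinks as 1/V, whereas a deterministic induction time defined through a fixed detectable number density of crystals is volume-independent.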
Distinguishing Error from Chaos in Ecological Time Series
NASA Astrophysics Data System (ADS)
Sugihara, George; Grenfell, Bryan; May, Robert M.
1990-11-01
Over the years, there has been much discussion about the relative importance of environmental and biological factors in regulating natural populations. Often it is thought that environmental factors are associated with stochastic fluctuations in population density, and biological ones with deterministic regulation. We revisit these ideas in the light of recent work on chaos and nonlinear systems. We show that completely deterministic regulatory factors can lead to apparently random fluctuations in population density, and we then develop a new method (that can be applied to limited data sets) to make practical distinctions between apparently noisy dynamics produced by low-dimensional chaos and population variation that in fact derives from random (high-dimensional) noise, such as environmental stochasticity or sampling error. To show its practical use, the method is first applied to models where the dynamics are known. We then apply the method to several sets of real data, including newly analysed data on the incidence of measles in the United Kingdom. Here the additional problems of secular trends and spatial effects are explored. In particular, we find that on a city-by-city scale measles exhibits low-dimensional chaos (as has previously been found for measles in New York City), whereas on a larger, country-wide scale the dynamics appear as a noisy two-year cycle. In addition to shedding light on the basic dynamics of some nonlinear biological systems, this work dramatizes how the scale on which data is collected and analysed can affect the conclusions drawn.
Failed rib region prediction in a human body model during crash events with precrash braking.
Guleyupoglu, B; Koya, B; Barnard, R; Gayzik, F S
2018-02-28
The objective of this study is twofold. We used a validated human body finite element model to study the predicted chest injury (focusing on rib fracture as a function of element strain) based on varying levels of simulated precrash braking. Furthermore, we compare deterministic and probabilistic methods of rib injury prediction in the computational model. The Global Human Body Models Consortium (GHBMC) M50-O model was gravity settled in the driver position of a generic interior equipped with an advanced 3-point belt and airbag. Twelve cases were investigated with permutations for failure, precrash braking system, and crash severity. The severities used were median (17 kph), severe (34 kph), and New Car Assessment Program (NCAP; 56.4 kph). Cases with failure enabled removed rib cortical bone elements once 1.8% effective plastic strain was exceeded. Alternatively, a probabilistic framework found in the literature was used to predict rib failure. Both the probabilistic and deterministic methods take into consideration location (anterior, lateral, and posterior). The deterministic method is based on a rubric that defines failed rib regions dependent on a threshold for contiguous failed elements. The probabilistic method depends on age-based strain and failure functions. Kinematics between both methods were similar (peak max deviation: ΔX head = 17 mm; ΔZ head = 4 mm; ΔX thorax = 5 mm; ΔZ thorax = 1 mm). Seat belt forces at the time of probabilistic failed region initiation were lower than those at deterministic failed region initiation. The probabilistic method predicted more failed regions in the rib (an analog for fracture) than the deterministic method in all but one case, where they were equal. The failed region patterns between the methods are similar; however, differences arise because element elimination reduces stress, causing probabilistic failed regions to continue to accumulate after the point at which no further deterministic failed regions would be predicted. Both the probabilistic and deterministic methods indicate similar trends with regard to the effect of precrash braking; however, there are tradeoffs. The deterministic failed region method is more spatially sensitive to failure and is more sensitive to belt loads. The probabilistic failed region method allows for increased capability in postprocessing with respect to age. The probabilistic failed region method predicted more failed regions than the deterministic failed region method due to force distribution differences.
Ibrahim, Ahmad M.; Wilson, Paul P.H.; Sawan, Mohamed E.; ...
2015-06-30
The CADIS and FW-CADIS hybrid Monte Carlo/deterministic techniques dramatically increase the efficiency of neutronics modeling, but their use in the accurate design analysis of very large and geometrically complex nuclear systems has been limited by the large number of processors and memory requirements for their preliminary deterministic calculations and final Monte Carlo calculation. Three mesh adaptivity algorithms were developed to reduce the memory requirements of CADIS and FW-CADIS without sacrificing their efficiency improvement. First, a macromaterial approach enhances the fidelity of the deterministic models without changing the mesh. Second, a deterministic mesh refinement algorithm generates meshes that capture as much geometric detail as possible without exceeding a specified maximum number of mesh elements. Finally, a weight window coarsening algorithm decouples the weight window mesh and energy bins from the mesh and energy group structure of the deterministic calculations in order to remove the memory constraint of the weight window map from the deterministic mesh resolution. The three algorithms were used to enhance an FW-CADIS calculation of the prompt dose rate throughout the ITER experimental facility. Using these algorithms resulted in a 23.3% increase in the number of mesh tally elements in which the dose rates were calculated in a 10-day Monte Carlo calculation and, additionally, increased the efficiency of the Monte Carlo simulation by a factor of at least 3.4. The three algorithms enabled this difficult calculation to be accurately solved using an FW-CADIS simulation on a regular computer cluster, eliminating the need for a world-class super computer.
Deterministic and stochastic models for middle east respiratory syndrome (MERS)
NASA Astrophysics Data System (ADS)
Suryani, Dessy Rizki; Zevika, Mona; Nuraini, Nuning
2018-03-01
World Health Organization (WHO) data state that, since September 2012, there have been 1,733 cases of Middle East Respiratory Syndrome (MERS) with 628 deaths, occurring in 27 countries. MERS was first identified in Saudi Arabia in 2012, and the largest outbreak of MERS outside Saudi Arabia occurred in South Korea in 2015. MERS is a disease that attacks the respiratory system and is caused by infection with MERS-CoV. MERS-CoV transmission occurs either directly, through contact between an infected individual and a non-infected individual, or indirectly, through objects contaminated by the free virus. It is suspected that MERS can spread quickly because of the free virus in the environment. Mathematical modeling is used to describe the transmission of MERS using a deterministic model and a stochastic model. The deterministic model is used to investigate the temporal dynamics of the system and to analyze its steady states. The stochastic model, formulated as a continuous-time Markov chain (CTMC), is used to predict future states by means of random variables. For the models that were built, the threshold value for the deterministic and stochastic models is obtained in the same form, and the probability of disease extinction can be computed from the stochastic model. Simulations of both models using several different parameter values are shown, and the probability of disease extinction is compared for several initial conditions.
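To make the deterministic/CTMC contrast concrete, the sketch below uses a plain SIR structure as a stand-in (the paper's MERS model additionally tracks free virus in the environment, which is omitted here) and estimates the probability of disease extinction from repeated CTMC runs, comparing it with the branching-process approximation (1/R0)^I0. All rates and the minor-outbreak criterion are illustrative assumptions.

```python
import numpy as np

def ctmc_sir_extinct(beta, gamma, s0, i0, n_runs=1000, rng=0):
    """Continuous-time Markov chain SIR (embedded jump chain); returns the
    fraction of runs in which the infection dies out without a major outbreak."""
    rng = np.random.default_rng(rng)
    minor = 0
    for _ in range(n_runs):
        s, i, cum = s0, i0, i0
        while i > 0:
            inf_rate, rec_rate = beta * s * i / (s0 + i0), gamma * i
            if rng.uniform() < inf_rate / (inf_rate + rec_rate):
                s -= 1; i += 1; cum += 1              # infection event
            else:
                i -= 1                                # recovery event
        minor += cum < 0.1 * s0                       # crude definition of "extinction"
    return minor / n_runs

beta, gamma, i0 = 0.5, 0.25, 2                        # R0 = 2, two initial infectives
print("simulated extinction probability     :", ctmc_sir_extinct(beta, gamma, s0=500, i0=i0))
print("branching-process prediction (1/R0)^i0:", (gamma / beta) ** i0)
```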
Spatial dynamics of invasion: the geometry of introduced species.
Korniss, Gyorgy; Caraco, Thomas
2005-03-07
Many exotic species combine low probability of establishment at each introduction with rapid population growth once introduction does succeed. To analyse this phenomenon, we note that invaders often cluster spatially when rare, and consequently an introduced exotic's population dynamics should depend on locally structured interactions. Ecological theory for spatially structured invasion relies on deterministic approximations, and determinism does not address the observed uncertainty of the exotic-introduction process. We take a new approach to the population dynamics of invasion and, by extension, to the general question of invasibility in any spatial ecology. We apply the physical theory for nucleation of spatial systems to a lattice-based model of competition between plant species, a resident and an invader, and the analysis reaches conclusions that differ qualitatively from the standard ecological theories. Nucleation theory distinguishes between dynamics of single- and multi-cluster invasion. Low introduction rates and small system size produce single-cluster dynamics, where success or failure of introduction is inherently stochastic. Single-cluster invasion occurs only if the cluster reaches a critical size, typically preceded by a number of failed attempts. For this case, we identify the functional form of the probability distribution of time elapsing until invasion succeeds. Although multi-cluster invasion for sufficiently large systems exhibits spatial averaging and almost-deterministic dynamics of the global densities, an analytical approximation from nucleation theory, known as Avrami's law, describes our simulation results far better than standard ecological approximations.
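For reference, the Avrami (Kolmogorov-Johnson-Mehl-Avrami) approximation mentioned above can be written, for continuous nucleation at rate I and constant radial growth speed v in d dimensions (a generic textbook form; the paper's ecological parameterization may differ), as

```latex
\rho(t) \;=\; 1 - \exp\!\left( -\, c_d \, I \, v^{\,d} \, t^{\,d+1} \right),
```

where c_d is a dimension-dependent geometric constant (c_2 = pi/3 in two dimensions); the occupied fraction of a large, multi-cluster system then rises almost deterministically even though each individual nucleation event is stochastic.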
Aspen succession in the Intermountain West: A deterministic model
Dale L. Bartos; Frederick R. Ward; George S. Innis
1983-01-01
A deterministic model of succession in aspen forests was developed using existing data and intuition. The degree of uncertainty, which was determined by allowing the parameter values to vary at random within limits, was larger than desired. This report presents results of an analysis of model sensitivity to changes in parameter values. These results have indicated...
Impact of refining the assessment of dietary exposure to cadmium in the European adult population.
Ferrari, Pietro; Arcella, Davide; Heraud, Fanny; Cappé, Stefano; Fabiansson, Stefan
2013-01-01
Exposure assessment constitutes an important step in any risk assessment of potentially harmful substances present in food. The European Food Safety Authority (EFSA) first assessed dietary exposure to cadmium in Europe using a deterministic framework, resulting in mean values of exposure in the range of health-based guidance values. Since then, the characterisation of foods has been refined to better match occurrence and consumption data, and a new strategy to handle left-censoring in occurrence data was devised. A probabilistic assessment was performed and compared with deterministic estimates, using occurrence values at the European level and consumption data from 14 national dietary surveys. Mean estimates in the probabilistic assessment ranged from 1.38 (95% CI = 1.35-1.44) to 2.08 (1.99-2.23) µg kg⁻¹ bodyweight (bw) week⁻¹ across the different surveys, which were less than 10% lower than deterministic (middle bound) mean values that ranged from 1.50 to 2.20 µg kg⁻¹ bw week⁻¹. Probabilistic 95th percentile estimates of dietary exposure ranged from 2.65 (2.57-2.72) to 4.99 (4.62-5.38) µg kg⁻¹ bw week⁻¹, which were, with the exception of one survey, between 3% and 17% higher than middle-bound deterministic estimates. Overall, the proportion of subjects exceeding the tolerable weekly intake of 2.5 µg kg⁻¹ bw ranged from 14.8% (13.6-16.0%) to 31.2% (29.7-32.5%) according to the probabilistic assessment. The results of this work indicate that mean values of dietary exposure to cadmium in the European population were of similar magnitude under deterministic and probabilistic assessments. For higher exposure levels, probabilistic estimates were almost consistently larger than their deterministic counterparts, thus reflecting the impact of using the full distribution of occurrence values to determine exposure levels. It is considered prudent to use probabilistic methodology should exposure estimates be close to or exceed health-based guidance values.
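A stylized version of the deterministic-versus-probabilistic comparison reported above is sketched below: exposure is simulated by Monte Carlo from assumed consumption, occurrence and body-weight distributions and summarized by its mean, 95th percentile and the fraction above the tolerable weekly intake, alongside a single point estimate built from the means. All distributions and numbers are placeholders, not EFSA's occurrence or consumption data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                                                         # simulated individuals

# Illustrative placeholder distributions (not EFSA's data).
consumption = rng.lognormal(mean=np.log(2.0), sigma=0.4, size=n)    # kg food / week
occurrence  = rng.lognormal(mean=np.log(0.05), sigma=0.6, size=n)   # mg Cd / kg food
bodyweight  = rng.normal(70.0, 12.0, size=n).clip(40, None)         # kg bw

exposure = consumption * occurrence / bodyweight * 1000.0           # ug / kg bw / week

deterministic = 2.0 * 0.05 / 70.0 * 1000.0                          # point estimate from the means
print("deterministic estimate :", round(deterministic, 2), "ug/kg bw/week")
print("probabilistic mean     :", round(exposure.mean(), 2))
print("probabilistic P95      :", round(float(np.percentile(exposure, 95)), 2))
print("fraction above TWI 2.5 :", round(float((exposure > 2.5).mean()), 3))
```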
Comparison of Deterministic and Probabilistic Radial Distribution Systems Load Flow
NASA Astrophysics Data System (ADS)
Gupta, Atma Ram; Kumar, Ashwani
2017-12-01
Distribution system networks today face the challenge of meeting increased load demands from the industrial, commercial and residential sectors. The pattern of load is highly dependent on consumer behavior and temporal factors such as season of the year, day of the week or time of day. For deterministic radial distribution load flow studies, load is taken as constant. However, load varies continually and with a high degree of uncertainty, so there is a need to model probable realistic load. Monte Carlo simulation is used to model the probable realistic load by generating random values of active and reactive power load from the mean and standard deviation of the load and by solving a deterministic radial load flow with these values. The probabilistic solution is reconstructed from the deterministic data obtained for each simulation. The main contributions of the work are: finding the impact of probable realistic ZIP load modeling on balanced radial distribution load flow; finding the impact of probable realistic ZIP load modeling on unbalanced radial distribution load flow; and comparing the voltage profile and losses under probable realistic ZIP load modeling for balanced and unbalanced radial distribution load flow.
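A minimal sketch of the probabilistic load flow idea described above, reduced to a single radial branch so it stays self-contained: a deterministic load flow with a ZIP load is solved repeatedly for active and reactive power values sampled from normal distributions, and the voltage statistics are compared with the deterministic solution at the mean load. The branch impedance, per-unit conversion, ZIP coefficients and load statistics are illustrative assumptions, not the paper's test feeders.

```python
import numpy as np

def two_bus_load_flow(p_kw, q_kvar, z_pu=0.03 + 0.06j, v1=1.0, s_base_kva=100.0,
                      zip_coeffs=(0.3, 0.4, 0.3), tol=1e-8):
    """Deterministic load flow for a single radial branch with a ZIP load.
    Returns the receiving-end voltage magnitude (per unit)."""
    a, b, c = zip_coeffs                          # constant-Z, constant-I, constant-P shares
    s0 = (p_kw + 1j * q_kvar) / s_base_kva        # load at nominal voltage, per unit
    v2 = v1
    for _ in range(100):                          # fixed-point (sweep-style) iteration
        s = s0 * (a * abs(v2) ** 2 + b * abs(v2) + c)   # ZIP load at the current voltage
        v2_new = v1 - z_pu * np.conj(s / v2)
        if abs(v2_new - v2) < tol:
            break
        v2 = v2_new
    return abs(v2)

# Probabilistic load flow: Monte Carlo over normally distributed P and Q.
rng = np.random.default_rng(0)
p = rng.normal(50.0, 10.0, size=5000)             # kW   (placeholder statistics)
q = rng.normal(20.0, 5.0, size=5000)              # kvar
v = np.array([two_bus_load_flow(pi, qi) for pi, qi in zip(p, q)])
print("deterministic |V2| at mean load:", round(two_bus_load_flow(50.0, 20.0), 4))
print("probabilistic  |V2| mean / P5  :", round(float(v.mean()), 4), "/",
      round(float(np.percentile(v, 5)), 4))
```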
Hierarchical spatiotemporal matrix models for characterizing invasions
Hooten, M.B.; Wikle, C.K.; Dorazio, R.M.; Royle, J. Andrew
2007-01-01
The growth and dispersal of biotic organisms is an important subject in ecology. Ecologists are able to accurately describe survival and fecundity in plant and animal populations and have developed quantitative approaches to study the dynamics of dispersal and population size. Of particular interest are the dynamics of invasive species. Such nonindigenous animals and plants can levy significant impacts on native biotic communities. Effective models for relative abundance have been developed; however, a better understanding of the dynamics of actual population size (as opposed to relative abundance) in an invasion would be beneficial to all branches of ecology. In this article, we adopt a hierarchical Bayesian framework for modeling the invasion of such species while addressing the discrete nature of the data and uncertainty associated with the probability of detection. The nonlinear dynamics between discrete time points are intuitively modeled through an embedded deterministic population model with density-dependent growth and dispersal components. Additionally, we illustrate the importance of accommodating spatially varying dispersal rates. The method is applied to the specific case of the Eurasian Collared-Dove, an invasive species at mid-invasion in the United States at the time of this writing.
NASA Astrophysics Data System (ADS)
Tolson, B.; Matott, L. S.; Gaffoor, T. A.; Asadzadeh, M.; Shafii, M.; Pomorski, P.; Xu, X.; Jahanpour, M.; Razavi, S.; Haghnegahdar, A.; Craig, J. R.
2015-12-01
We introduce asynchronous parallel implementations of the Dynamically Dimensioned Search (DDS) family of algorithms, including DDS, discrete DDS, PA-DDS and DDS-AU. These parallel algorithms differ from most existing parallel optimization algorithms in the water resources field in that parallel DDS is asynchronous and does not require an entire population (set of candidate solutions) to be evaluated before generating and then sending a new candidate solution for evaluation. One key advance in this study is developing the first parallel PA-DDS multi-objective optimization algorithm. The other key advance is enhancing the computational efficiency of solving optimization problems (such as model calibration) by combining a parallel optimization algorithm with the deterministic model pre-emption concept. These two efficiency techniques can only be combined because of the asynchronous nature of parallel DDS. Model pre-emption terminates simulation model runs early (for example, prior to completely simulating the model calibration period) when intermediate results indicate that the candidate solution is so poor that it will definitely have no influence on the generation of further candidate solutions. The computational savings of deterministic model pre-emption available in serial implementations of population-based algorithms (e.g., PSO) disappear in synchronous parallel implementations of these algorithms. In addition to the key advances above, we implement the algorithms across a range of computation platforms (Windows and Unix-based operating systems, from multi-core desktops to a supercomputer system) and package them for future modellers within the model-independent calibration software package Ostrich, as well as in MATLAB versions. Results across multiple platforms and multiple case studies (from 4 to 64 processors) demonstrate the vast improvement over serial DDS-based algorithms and highlight the important role model pre-emption plays in the performance of parallel, pre-emptable DDS algorithms. Case studies include single- and multiple-objective optimization problems in water resources model calibration, and in many cases linear or near-linear speedups are observed.
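For readers unfamiliar with DDS, the sketch below shows the core perturbation scheme (a shrinking random subset of decision variables perturbed with scaled normal steps) together with a deterministically pre-emptable objective that aborts as soon as its partial cost exceeds the incumbent. It is a serial illustration only, under assumed toy calibration data; the asynchronous master-worker parallelization and the Ostrich implementation described above are not reproduced here.

```python
import math
import numpy as np

def dds_minimize(objective, lo, hi, max_iter=500, r=0.2, rng=0):
    """Serial sketch of Dynamically Dimensioned Search with deterministic model
    pre-emption: `objective(x, best)` may stop early and return inf once its
    partial cost already exceeds `best`."""
    rng = np.random.default_rng(rng)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    x = lo + rng.uniform(size=lo.size) * (hi - lo)
    fx = objective(x, math.inf)
    for i in range(1, max_iter + 1):
        p_include = 1.0 - math.log(i) / math.log(max_iter)   # shrinking neighbourhood
        mask = rng.uniform(size=x.size) < p_include
        if not mask.any():
            mask[rng.integers(x.size)] = True                # always perturb at least one variable
        cand = x.copy()
        cand[mask] += r * (hi - lo)[mask] * rng.standard_normal(mask.sum())
        cand = np.clip(cand, lo, hi)                         # simple bound handling (DDS proper reflects)
        fc = objective(cand, fx)                             # pre-emptable evaluation
        if fc < fx:
            x, fx = cand, fc
    return x, fx

def sse_preemptable(x, best, data=np.linspace(-1, 1, 200)):
    """Sum of squared errors accumulated term by term; aborts (pre-emption)
    as soon as the partial sum can no longer beat the incumbent."""
    total = 0.0
    for d in data:
        total += (x[0] + x[1] * d + x[2] * d ** 2 - np.sin(3 * d)) ** 2
        if total >= best:
            return math.inf
    return total

print(dds_minimize(sse_preemptable, lo=[-5, -5, -5], hi=[5, 5, 5]))
```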
Price-Dynamics of Shares and Bohmian Mechanics: Deterministic or Stochastic Model?
NASA Astrophysics Data System (ADS)
Choustova, Olga
2007-02-01
We apply the mathematical formalism of Bohmian mechanics to describe the dynamics of share prices. The main distinguishing feature of the financial Bohmian model is the possibility to take into account market psychology by describing the expectations of traders by the pilot wave. We also discuss some objections (coming from the conventional financial mathematics of stochastic processes) against the deterministic Bohmian model, in particular the objection that such a model contradicts the efficient market hypothesis, which is the cornerstone of modern market ideology. Another objection is of a purely mathematical nature: it is related to the quadratic variation of price trajectories. One possibility to reply to this critique is to consider the stochastic Bohm-Vigier model instead of the deterministic one. We do this in the present note.
Modeling heterogeneous responsiveness of intrinsic apoptosis pathway
2013-01-01
Background: Apoptosis is a cell suicide mechanism that enables multicellular organisms to maintain homeostasis and to eliminate individual cells that threaten the organism’s survival. Dependent on the type of stimulus, apoptosis can be propagated by the extrinsic pathway or the intrinsic pathway. The comprehensive understanding of the molecular mechanism of apoptotic signaling allows for development of mathematical models, aiming to elucidate dynamical and systems properties of apoptotic signaling networks. There have been extensive efforts in modeling deterministic apoptosis networks accounting for the average behavior of a population of cells. Cellular networks, however, are inherently stochastic, and significant cell-to-cell variability in the apoptosis response has been observed at the single-cell level. Results: To address the inevitable randomness in the intrinsic apoptosis mechanism, we develop a theoretical and computational modeling framework of the intrinsic apoptosis pathway at the single-cell level, accounting for both deterministic and stochastic behavior. Our deterministic model, adapted from the well-accepted Fussenegger model, shows that an additional positive feedback between the executioner caspase and the initiator caspase plays a fundamental role in yielding the desired property of bistability. We then examine the impact of intrinsic fluctuations of biochemical reactions, viewed as intrinsic noise, and natural variation of protein concentrations, viewed as extrinsic noise, on the behavior of the intrinsic apoptosis network. Histograms of the steady-state output at varying input levels show that the intrinsic noise could elicit a wider region of bistability than that of the deterministic model. However, the system stochasticity due to intrinsic fluctuations, such as the noise of the steady-state response and the randomness of the response delay, shows that intrinsic noise in general is insufficient to produce significant cell-to-cell variations at physiologically relevant levels of molecular numbers. Furthermore, the extrinsic noise, represented by random variations of two key apoptotic proteins, namely Cytochrome C and inhibitor of apoptosis proteins (IAP), is modeled separately or in combination with intrinsic noise. The resultant stochasticity in the timing of the intrinsic apoptosis response shows that the fluctuating protein variations can induce cell-to-cell stochastic variability at a quantitative level agreeing with experiments. Finally, simulations illustrate that the mean abundance of the fluctuating IAP protein is positively correlated with the degree of cellular stochasticity of the intrinsic apoptosis pathway. Conclusions: Our theoretical and computational study shows that the pronounced non-genetic heterogeneity in intrinsic apoptosis responses among individual cells plausibly arises from an extrinsic rather than intrinsic origin of fluctuations. In addition, it predicts that the IAP protein could serve as a potential therapeutic target for suppression of the cell-to-cell variation in intrinsic apoptosis responsiveness. PMID:23875784
Erbe, Malena; Gredler, Birgit; Seefried, Franz Reinhold; Bapst, Beat; Simianer, Henner
2013-01-01
Prediction of genomic breeding values is of major practical relevance in dairy cattle breeding. Deterministic equations have been suggested to predict the accuracy of genomic breeding values in a given design which are based on training set size, reliability of phenotypes, and the number of independent chromosome segments ([Formula: see text]). The aim of our study was to find a general deterministic equation for the average accuracy of genomic breeding values that also accounts for marker density and can be fitted empirically. Two data sets of 5'698 Holstein Friesian bulls genotyped with 50 K SNPs and 1'332 Brown Swiss bulls genotyped with 50 K SNPs and imputed to ∼600 K SNPs were available. Different k-fold (k = 2-10, 15, 20) cross-validation scenarios (50 replicates, random assignment) were performed using a genomic BLUP approach. A maximum likelihood approach was used to estimate the parameters of different prediction equations. The highest likelihood was obtained when using a modified form of the deterministic equation of Daetwyler et al. (2010), augmented by a weighting factor (w) based on the assumption that the maximum achievable accuracy is [Formula: see text]. The proportion of genetic variance captured by the complete SNP sets ([Formula: see text]) was 0.76 to 0.82 for Holstein Friesian and 0.72 to 0.75 for Brown Swiss. When modifying the number of SNPs, w was found to be proportional to the log of the marker density up to a limit which is population and trait specific and was found to be reached with ∼20'000 SNPs in the Brown Swiss population studied.
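As an illustration of the deterministic prediction equation discussed above, the following Python sketch evaluates the Daetwyler et al. (2010) form of the expected accuracy, augmented by a generic multiplicative weighting factor w standing in for the marker-density adjustment described in the abstract; the function name and all numeric values are illustrative assumptions, not taken from the study.

    import numpy as np

    def expected_accuracy(n_train, h2, m_e, w=1.0):
        """Expected accuracy of genomic breeding values.

        Base form follows Daetwyler et al. (2010):
            r = sqrt(N * h2 / (N * h2 + Me))
        where N is the training set size, h2 the reliability of the phenotypes
        and Me the number of independent chromosome segments. The multiplicative
        factor `w` stands in for the marker-density weighting described in the
        abstract (illustrative only).
        """
        r_base = np.sqrt(n_train * h2 / (n_train * h2 + m_e))
        return w * r_base

    # Illustrative values (not taken from the study)
    print(expected_accuracy(n_train=5698, h2=0.9, m_e=1000, w=0.85))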
Two-strain competition in quasineutral stochastic disease dynamics.
Kogan, Oleg; Khasin, Michael; Meerson, Baruch; Schneider, David; Myers, Christopher R
2014-10-01
We develop a perturbation method for studying quasineutral competition in a broad class of stochastic competition models and apply it to the analysis of fixation of competing strains in two epidemic models. The first model is a two-strain generalization of the stochastic susceptible-infected-susceptible (SIS) model. Here we extend previous results due to Parsons and Quince [Theor. Popul. Biol. 72, 468 (2007)], Parsons et al. [Theor. Popul. Biol. 74, 302 (2008)], and Lin, Kim, and Doering [J. Stat. Phys. 148, 646 (2012)]. The second model, a two-strain generalization of the stochastic susceptible-infected-recovered (SIR) model with population turnover, has not been studied previously. In each of the two models, when the basic reproduction numbers of the two strains are identical, a system with an infinite population size approaches a point on the deterministic coexistence line (CL): a straight line of fixed points in the phase space of subpopulation sizes. Shot noise drives one of the strain populations to fixation, and the other to extinction, on a time scale proportional to the total population size. Our perturbation method explicitly tracks the dynamics of the probability distribution of the subpopulations in the vicinity of the CL. We argue that, whereas the slow strain has a competitive advantage for mathematically "typical" initial conditions, it is the fast strain that is more likely to win in the important situation when a few infectives of both strains are introduced into a susceptible population.
Stochastic and deterministic model of microbial heat inactivation.
Corradini, Maria G; Normand, Mark D; Peleg, Micha
2010-03-01
Microbial inactivation is described by a model based on the changing survival probabilities of individual cells or spores. It is presented in a stochastic and discrete form for small groups, and as a continuous deterministic model for larger populations. If the underlying mortality probability function remains constant throughout the treatment, the model generates first-order ("log-linear") inactivation kinetics. Otherwise, it produces survival patterns that include Weibullian ("power-law") with upward or downward concavity, tailing with a residual survival level, complete elimination, flat "shoulder" with linear or curvilinear continuation, and sigmoid curves. In both forms, the same algorithm or model equation applies to isothermal and dynamic heat treatments alike. Constructing the model does not require assuming a kinetic order or knowledge of the inactivation mechanism. The general features of its underlying mortality probability function can be deduced from the experimental survival curve's shape. Once identified, the function's coefficients, the survival parameters, can be estimated directly from the experimental survival ratios by regression. The model is testable in principle but matching the estimated mortality or inactivation probabilities with those of the actual cells or spores can be a technical challenge. The model is not intended to replace current models to calculate sterility. Its main value, apart from connecting the various inactivation patterns to underlying probabilities at the cellular level, might be in simulating the irregular survival patterns of small groups of cells and spores. In principle, it can also be used for nonthermal methods of microbial inactivation and their combination with heat.
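A minimal sketch of the Weibullian ("power-law") survival curve referred to above; with shape n = 1 it reduces to first-order ("log-linear") inactivation. Parameter values are illustrative and the function name is an assumption.

    import numpy as np

    def log_survival_ratio(t, b, n):
        """Weibullian survival model: log10 S(t) = -b * t**n.

        n == 1 recovers first-order ("log-linear") inactivation;
        n > 1 gives downward concavity, n < 1 upward concavity.
        """
        return -b * np.power(t, n)

    times = np.linspace(0.0, 10.0, 6)                 # minutes (illustrative)
    print(log_survival_ratio(times, b=0.3, n=1.0))    # log-linear
    print(log_survival_ratio(times, b=0.3, n=1.5))    # downward concavity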
Tularosa Basin Play Fairway Analysis Data and Models
Nash, Greg
2017-07-11
This submission includes raster datasets for each layer of evidence used for weights of evidence analysis as well as the deterministic play fairway analysis (PFA). Data representative of heat, permeability and groundwater comprises some of the raster datasets. Additionally, the final deterministic PFA model is provided along with a certainty model. All of these datasets are best used with an ArcGIS software package, specifically Spatial Data Modeler.
How to reach linguistic consensus: a proof of convergence for the naming game.
De Vylder, Bart; Tuyls, Karl
2006-10-21
In this paper we introduce a mathematical model of naming games. Naming games have been widely used within research on the origins and evolution of language. Despite the many interesting empirical results these studies have produced, most of this research lacks a formal elucidating theory. In this paper we show how a population of agents can reach linguistic consensus, i.e. learn to use one common language to communicate with one another. Our approach differs from existing formal work in two important ways: one, we relax the too strong assumption that an agent samples infinitely often during each time interval. This assumption is usually made to guarantee convergence of an empirical learning process to a deterministic dynamical system. Two, we provide a proof that under these new realistic conditions, our model converges to a common language for the entire population of agents. Finally the model is experimentally validated.
Modeling approaches in avian conservation and the role of field biologists
Beissinger, Steven R.; Walters, J.R.; Catanzaro, D.G.; Smith, Kimberly G.; Dunning, J.B.; Haig, Susan M.; Noon, Barry; Stith, Bradley M.
2006-01-01
This review grew out of our realization that models play an increasingly important role in conservation but are rarely used in the research of most avian biologists. Modelers are creating models that are more complex and mechanistic and that can incorporate more of the knowledge acquired by field biologists. Such models require field biologists to provide more specific information, larger sample sizes, and sometimes new kinds of data, such as habitat-specific demography and dispersal information. Field biologists need to support model development by testing key model assumptions and validating models. The best conservation decisions will occur where cooperative interaction enables field biologists, modelers, statisticians, and managers to contribute effectively. We begin by discussing the general form of ecological models—heuristic or mechanistic, "scientific" or statistical—and then highlight the structure, strengths, weaknesses, and applications of six types of models commonly used in avian conservation: (1) deterministic single-population matrix models, (2) stochastic population viability analysis (PVA) models for single populations, (3) metapopulation models, (4) spatially explicit models, (5) genetic models, and (6) species distribution models. We end by considering their unique attributes, determining whether the assumptions that underlie the structure are valid, and testing the ability of the model to predict the future correctly.
Lahodny, G E; Gautam, R; Ivanek, R
2015-01-01
Indirect transmission through the environment, pathogen shedding by infectious hosts, replication of free-living pathogens within the environment, and environmental decontamination are suspected to play important roles in the spread and control of environmentally transmitted infectious diseases. To account for these factors, the classic Susceptible-Infectious-Recovered-Susceptible epidemic model is modified to include a compartment representing the amount of free-living pathogen within the environment. The model accounts for host demography, direct and indirect transmission, replication of free-living pathogens in the environment, and removal of free-living pathogens by natural death or environmental decontamination. Based on the assumptions of the deterministic model, a continuous-time Markov chain model is developed. An estimate for the probability of disease extinction or a major outbreak is obtained by approximating the Markov chain with a multitype branching process. Numerical simulations illustrate important differences between the deterministic and stochastic counterparts, relevant for outbreak prevention, that depend on indirect transmission, pathogen shedding by infectious hosts, replication of free-living pathogens, and environmental decontamination. The probability of a major outbreak is computed for salmonellosis in a herd of dairy cattle as well as cholera in a human population. An explicit expression for the probability of disease extinction or a major outbreak in terms of the model parameters is obtained for systems with no direct transmission or replication of free-living pathogens.
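For orientation, the snippet below shows the classical single-type branching-process approximation to the probability of disease extinction after introducing i0 infectious hosts; it is a simplified stand-in, not the multitype branching-process calculation (with indirect transmission and a pathogen compartment) used in the study.

    def extinction_probability(R0, i0):
        """Single-type branching-process approximation (illustrative).

        For a supercritical process (R0 > 1) each initial infectious host fails
        to start a major outbreak with probability 1/R0, so the introduction
        fades out with probability (1/R0)**i0. For R0 <= 1 extinction is certain.
        """
        if R0 <= 1.0:
            return 1.0
        return (1.0 / R0) ** i0

    print(extinction_probability(R0=2.5, i0=3))   # ~0.064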
Leirs, H.; Stenseth, N.C.; Nichols, J.D.; Hines, J.E.; Verhagen, R.; Verheyen, W.
1997-01-01
Ecology has long been troubled by the controversy over how populations are regulated. Some ecologists focus on the role of environmental effects, whereas others argue that density-dependent feedback mechanisms are central. The relative importance of both processes is still hotly debated, but clear examples of both processes acting in the same population are rare. Key-factor analysis (regression of population changes on possible causal factors) and time-series analysis are often used to investigate the presence of density dependence, but such approaches may be biased and provide no information on actual demographic rates. Here we report on both density-dependent and density-independent effects in a murid rodent pest species, the multimammate rat Mastomys natalensis (Smith, 1834), using statistical capture-recapture models. Both effects occur simultaneously, but we also demonstrate that they do not affect all demographic rates in the same way. We have incorporated the obtained estimates of demographic rates in a population dynamics model and show that the observed dynamics are affected by stabilizing nonlinear density-dependent components coupled with strong deterministic and stochastic seasonal components.
Combining Deterministic structures and stochastic heterogeneity for transport modeling
NASA Astrophysics Data System (ADS)
Zech, Alraune; Attinger, Sabine; Dietrich, Peter; Teutsch, Georg
2017-04-01
Contaminant transport in highly heterogeneous aquifers is extremely challenging and a subject of current scientific debate. Tracer plumes often show non-symmetric, highly skewed shapes. Predicting such transport behavior using the classical advection-dispersion equation (ADE) in combination with a stochastic description of aquifer properties requires a dense measurement network, which is in contrast to the information available for most aquifers. A new conceptual aquifer structure model is presented which combines large-scale deterministic information with a stochastic approach for incorporating sub-scale heterogeneity. The conceptual model is designed to allow for a goal-oriented, site-specific transport analysis making use of as few data as possible. The basic idea is to reproduce highly skewed tracer plumes in heterogeneous media by incorporating deterministic contrasts and connectivity effects instead of using unimodal heterogeneous models with high variances. The conceptual model consists of deterministic blocks of mean hydraulic conductivity, which might be measured by pumping tests and may differ by orders of magnitude. Sub-scale heterogeneity is introduced within every block and can be modeled as bimodal or log-normally distributed. The impact of input parameters, structure and conductivity contrasts is investigated in a systematic manner. Furthermore, a first successful implementation of the model was achieved for the well-known MADE site.
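A minimal numpy sketch of the conceptual structure described above: deterministic blocks of mean hydraulic conductivity with log-normal sub-scale heterogeneity generated inside each block. The block means, variance and discretization are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    # Deterministic large-scale structure: three blocks with mean hydraulic
    # conductivities differing by orders of magnitude (illustrative values, m/s).
    block_means = [1e-3, 1e-5, 1e-4]
    cells_per_block = 100

    field = []
    for k_mean in block_means:
        # Log-normal sub-scale heterogeneity within each block: the
        # log-conductivity is Gaussian around ln(k_mean) with variance sigma2.
        sigma2 = 1.0
        ln_k = rng.normal(np.log(k_mean), np.sqrt(sigma2), size=cells_per_block)
        field.append(np.exp(ln_k))

    conductivity = np.concatenate(field)   # composite 1D field (300 cells)
    print(conductivity.shape, conductivity.min(), conductivity.max())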
Modeling the within-host dynamics of cholera: bacterial-viral interaction.
Wang, Xueying; Wang, Jin
2017-08-01
Novel deterministic and stochastic models are proposed in this paper for the within-host dynamics of cholera, with a focus on the bacterial-viral interaction. The deterministic model is a system of differential equations describing the interaction among the two types of vibrios and the viruses. The stochastic model is a system of Markov jump processes that is derived based on the dynamics of the deterministic model. The multitype branching process approximation is applied to estimate the extinction probability of bacteria and viruses within a human host during the early stage of the bacterial-viral infection. Accordingly, a closed-form expression is derived for the disease extinction probability, and analytic estimates are validated with numerical simulations. The local and global dynamics of the bacterial-viral interaction are analysed using the deterministic model, and the result indicates that there is a sharp disease threshold characterized by the basic reproduction number [Formula: see text]: if [Formula: see text], vibrios ingested from the environment into human body will not cause cholera infection; if [Formula: see text], vibrios will grow with increased toxicity and persist within the host, leading to human cholera. In contrast, the stochastic model indicates, more realistically, that there is always a positive probability of disease extinction within the human host.
Parallel Stochastic discrete event simulation of calcium dynamics in neuron.
Ishlam Patoary, Mohammad Nazrul; Tropper, Carl; McDougal, Robert A; Zhongwei, Lin; Lytton, William W
2017-09-26
The intra-cellular calcium signaling pathways of a neuron depend on both biochemical reactions and diffusion. Some quasi-isolated compartments (e.g. spines) are so small, and calcium concentrations so low, that a single extra molecule diffusing in by chance can make a nontrivial percentage difference in the local concentration. These rare events can affect the dynamics discretely, in such a way that they cannot be captured by a deterministic simulation. Stochastic models of such a system provide a more detailed understanding than existing deterministic models because they capture behavior at the molecular level. Our research focuses on the development of a high-performance parallel discrete event simulation environment, Neuron Time Warp (NTW), which is intended for the parallel simulation of stochastic reaction-diffusion systems such as intracellular calcium signaling. NTW is integrated with NEURON, a simulator that is widely used within the neuroscience community. We simulate two models, a calcium buffer model and a calcium wave model. The calcium buffer model is employed to verify the correctness and performance of NTW by comparing it with a serial deterministic simulation in NEURON. We also derived a discrete event calcium wave model from a deterministic model using the stochastic IP3R structure.
Kelly, Benjamin J; Fitch, James R; Hu, Yangqiu; Corsmeier, Donald J; Zhong, Huachun; Wetzel, Amy N; Nordquist, Russell D; Newsom, David L; White, Peter
2015-01-20
While advances in genome sequencing technology make population-scale genomics a possibility, current approaches for analysis of these data rely upon parallelization strategies that have limited scalability, complex implementation and lack reproducibility. Churchill, a balanced regional parallelization strategy, overcomes these challenges, fully automating the multiple steps required to go from raw sequencing reads to variant discovery. Through implementation of novel deterministic parallelization techniques, Churchill allows computationally efficient analysis of a high-depth whole genome sample in less than two hours. The method is highly scalable, enabling full analysis of the 1000 Genomes raw sequence dataset in a week using cloud resources. http://churchill.nchri.org/.
Schreiber, Sebastian J; Rosenheim, Jay A; Williams, Neal W; Harder, Lawrence D
2015-01-01
Variation in resource availability can select for traits that reduce the negative impacts of this variability on mean fitness. Such selection may be particularly potent for seed production in flowering plants, as they often experience variation in pollen receipt among individuals and among flowers within individuals. Using analytically tractable models, we examine the optimal allocations for producing ovules, attracting pollen, and maturing seeds in deterministic and stochastic pollen environments. In deterministic environments, the optimal strategy attracts sufficient pollen to fertilize every ovule and mature every zygote into a seed. Stochastic environments select for allocations proportional to the risk of seed production being limited by zygotes or seed maturation. When producing an ovule is cheap and maturing a seed is expensive, among-plant variation selects for attracting more pollen at the expense of producing fewer ovules and having fewer resources for seed maturation. Despite this increased allocation, such populations are likely to be pollen limited. In contrast, within-plant variation generally selects for an overproduction of ovules and, to a lesser extent, pollen attraction. Such populations are likely to be resource limited and exhibit low seed-to-ovule ratios. These results highlight the importance of multiscale variation in the evolution and ecology of resource allocations.
Granato, Gregory E.; Jones, Susan C.
2014-01-01
In cooperation with FHWA, the U.S. Geological Survey developed the stochastic empirical loading and dilution model (SELDM) to supersede the 1990 FHWA runoff quality model. The SELDM tool is designed to transform disparate and complex scientific data into meaningful information about the adverse risks of runoff on receiving waters, the potential need for mitigation measures, and the potential effectiveness of such measures for reducing such risks. The SELDM tool is easy to use because much of the information and data needed to run it are embedded in the model and obtained by defining the site location and five simple basin properties. Information and data from thousands of sites across the country were compiled to facilitate the use of the SELDM tool. A case study illustrates how to use the SELDM tool for conducting the types of sensitivity analyses needed to properly assess water quality risks. For example, the use of deterministic values to model upstream stormflows instead of representative variations in prestorm flow and runoff may substantially overestimate the proportion of highway runoff in downstream flows. Also, the risks for total phosphorus excursions are substantially affected by the selected criteria and the modeling methods used. For example, if a single deterministic concentration is used rather than a stochastic population of values to model upstream concentrations, then the percentage of water quality excursions in the downstream receiving waters may depend entirely on the selected upstream concentration.
Entrepreneurs, chance, and the deterministic concentration of wealth.
Fargione, Joseph E; Lehman, Clarence; Polasky, Stephen
2011-01-01
In many economies, wealth is strikingly concentrated. Entrepreneurs--individuals with ownership in for-profit enterprises--comprise a large portion of the wealthiest individuals, and their behavior may help explain patterns in the national distribution of wealth. Entrepreneurs are less diversified and more heavily invested in their own companies than is commonly assumed in economic models. We present an intentionally simplified individual-based model of wealth generation among entrepreneurs to assess the role of chance and determinism in the distribution of wealth. We demonstrate that chance alone, combined with the deterministic effects of compounding returns, can lead to unlimited concentration of wealth, such that the percentage of all wealth owned by a few entrepreneurs eventually approaches 100%. Specifically, concentration of wealth results when the rate of return on investment varies by entrepreneur and by time. This result is robust to inclusion of realities such as differing skill among entrepreneurs. The most likely overall growth rate of the economy decreases as businesses become less diverse, suggesting that high concentrations of wealth may adversely affect a country's economic growth. We show that a tax on large inherited fortunes, applied to a small portion of the most fortunate in the population, can efficiently arrest the concentration of wealth at intermediate levels.
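A minimal individual-based sketch of the mechanism described above: wealth compounds with rates of return that vary by entrepreneur and by year, and the share held by the single richest entrepreneur is tracked. All parameter values are illustrative.

    import numpy as np

    rng = np.random.default_rng(1)

    n_entrepreneurs = 1000
    n_years = 200
    wealth = np.ones(n_entrepreneurs)          # equal starting wealth

    for _ in range(n_years):
        # Rate of return varies by entrepreneur and by year (illustrative:
        # mean 5%, s.d. 30%); compounding does the rest.
        returns = rng.normal(loc=0.05, scale=0.30, size=n_entrepreneurs)
        wealth *= np.maximum(1.0 + returns, 0.0)   # wealth cannot go negative

    top_share = wealth.max() / wealth.sum()
    print(f"share of total wealth held by the single richest: {top_share:.2%}")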
Scott, B R; Lyzlov, A F; Osovets, S V
1998-05-01
During a Phase-I effort, studies were planned to evaluate deterministic (nonstochastic) effects of chronic exposure of nuclear workers at the Mayak atomic complex in the former Soviet Union to relatively high levels (> 0.25 Gy) of ionizing radiation. The Mayak complex has been used, since the late 1940s, to produce plutonium for nuclear weapons. Workers at Site A of the complex were involved in plutonium breeding using nuclear reactors, and some were exposed to relatively large doses of gamma rays plus relatively small neutron doses. The Weibull normalized-dose model, which has been set up to evaluate the risk of specific deterministic effects of combined, continuous exposure of humans to alpha, beta, and gamma radiations, is here adapted for chronic exposure to gamma rays and neutrons during repeated 6-h work shifts, as occurred for some nuclear workers at Site A. Using the adapted model, key conclusions were reached that will facilitate a Phase-II study of deterministic effects among Mayak workers. These conclusions include the following: (1) neutron doses may be more important for Mayak workers than for Japanese A-bomb victims in Hiroshima and can be accounted for using an adjusted dose (which accounts for neutron relative biological effectiveness); (2) to account for dose-rate effects, the normalized dose X (a dimensionless fraction of an LD50 or ED50) can be evaluated in terms of an adjusted dose; (3) nonlinear dose-response curves for the risk of death via the hematopoietic mode can be converted to linear dose-response curves (for low levels of risk) using a newly proposed dimensionless dose, D = X^V, in units of Oklad (where D is pronounced "deh"), and V is the shape parameter in the Weibull model; (4) for X ≤ X0, where X0 is the threshold normalized dose, D = 0; (5) unlike absorbed dose, the dose D can be averaged over different Mayak workers in order to calculate the average risk of death via the hematopoietic mode for the population exposed at Site A; and (6) the expected cases of death via the hematopoietic syndrome mode for Mayak workers chronically exposed during work shifts at Site A to gamma rays and neutrons can be predicted using ln(2)·B·M[D], where B (pronounced "beh") is the number of workers at risk (criticality accident victims excluded) and M[D] is the average (mean) value of D (averaged over the worker population at risk, for Site A, for the time period considered). These results can be used to facilitate a Phase-II study of deterministic radiation effects among Mayak workers chronically exposed to gamma rays and neutrons.
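The sketch below encodes one plausible reading of the Weibull normalized-dose model summarized above (risk = 1 - 2^(-X^V) above the threshold X0, which linearizes to ln(2)·D with D = X^V at low risk); the exact functional form is an assumption based on the abstract, and the parameter values are illustrative.

    import numpy as np

    def deterministic_effect_risk(X, V, X0=0.0):
        """Weibull normalized-dose risk (a plausible reading of the abstract).

        X  : normalized dose (dimensionless fraction of an ED50/LD50)
        V  : Weibull shape parameter
        X0 : threshold normalized dose below which the risk is zero
        Risk = 1 - 2**(-D) with D = X**V; for small D this is ~ ln(2) * D.
        """
        X = np.asarray(X, dtype=float)
        D = np.where(X <= X0, 0.0, X ** V)
        return 1.0 - 2.0 ** (-D)

    X = np.array([0.1, 0.5, 1.0, 1.5])
    print(deterministic_effect_risk(X, V=3.0, X0=0.2))   # risk is 0.5 at X = 1 by construction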
Understanding resistant effect of mosquito on fumigation strategy in dengue control program
NASA Astrophysics Data System (ADS)
Aldila, D.; Situngkir, N.; Nareswari, K.
2018-01-01
A mathematical model of dengue disease transmission incorporating a fumigation intervention in the mosquito population will be introduced in this talk. The worsening effect of uncontrolled fumigation, in the form of mosquito resistance to the fumigation chemicals, is also included in the model to capture conditions in the field. A deterministic approach using a nine-dimensional system of ordinary differential equations is used. Analytical results on the existence and local stability of the equilibrium points, together with the basic reproduction number, are discussed. Numerical results are presented for several scenarios to give a better interpretation of the analytical results.
A model for a spatially structured metapopulation accounting for within patch dynamics.
Smith, Andrew G; McVinish, Ross; Pollett, Philip K
2014-01-01
We develop a stochastic metapopulation model that accounts for spatial structure as well as within patch dynamics. Using a deterministic approximation derived from a functional law of large numbers, we develop conditions for extinction and persistence of the metapopulation in terms of the birth, death and migration parameters. Interestingly, we observe the Allee effect in a metapopulation comprising two patches of greatly different sizes, despite there being decreasing patch specific per-capita birth rates. We show that the Allee effect is due to the way the migration rates depend on the population density of the patches. Copyright © 2013 Elsevier Inc. All rights reserved.
The Diffusion Model Is Not a Deterministic Growth Model: Comment on Jones and Dzhafarov (2014)
Smith, Philip L.; Ratcliff, Roger; McKoon, Gail
2015-01-01
Jones and Dzhafarov (2014) claim that several current models of speeded decision making in cognitive tasks, including the diffusion model, can be viewed as special cases of other general models or model classes. The general models can be made to match any set of response time (RT) distribution and accuracy data exactly by a suitable choice of parameters and so are unfalsifiable. The implication of their claim is that models like the diffusion model are empirically testable only by artificially restricting them to exclude unfalsifiable instances of the general model. We show that Jones and Dzhafarov’s argument depends on enlarging the class of “diffusion” models to include models in which there is little or no diffusion. The unfalsifiable models are deterministic or near-deterministic growth models, from which the effects of within-trial variability have been removed or in which they are constrained to be negligible. These models attribute most or all of the variability in RT and accuracy to across-trial variability in the rate of evidence growth, which is permitted to be distributed arbitrarily and to vary freely across experimental conditions. In contrast, in the standard diffusion model, within-trial variability in evidence is the primary determinant of variability in RT. Across-trial variability, which determines the relative speed of correct responses and errors, is theoretically and empirically constrained. Jones and Dzhafarov’s attempt to include the diffusion model in a class of models that also includes deterministic growth models misrepresents and trivializes it and conveys a misleading picture of cognitive decision-making research. PMID:25347314
Sardanyés, Josep; Arderiu, Andreu; Elena, Santiago F; Alarcón, Tomás
2018-05-01
Evolutionary and dynamical investigations into real viral populations indicate that RNA replication can range between the two extremes represented by so-called 'stamping machine replication' (SMR) and 'geometric replication' (GR). The impact of asymmetries in replication for single-stranded (+) sense RNA viruses has been mainly studied with deterministic models. However, viral replication should be better described by including stochasticity, as the cell infection process is typically initiated with a very small number of RNA macromolecules, and thus largely influenced by intrinsic noise. Under appropriate conditions, deterministic theoretical descriptions of viral RNA replication predict a quasi-neutral coexistence scenario, with a line of fixed points involving different strands' equilibrium ratios depending on the initial conditions. Recent research into the quasi-neutral coexistence in two competing populations reveals that stochastic fluctuations fundamentally alter the mean-field scenario, and one of the two species outcompetes the other. In this article, we study this phenomenon for viral RNA replication modes by means of stochastic simulations and a diffusion approximation. Our results reveal that noise has a strong impact on the amplification of viral RNAs, also causing the emergence of noise-induced bistability. We provide analytical criteria for the dominance of (+) sense strands depending on the initial populations on the line of equilibria, which are in agreement with direct stochastic simulation results. The biological implications of this noise-driven mechanism are discussed within the framework of the evolutionary dynamics of RNA viruses with different modes of replication. © 2018 The Author(s).
On the usage of ultrasound computational models for decision making under ambiguity
NASA Astrophysics Data System (ADS)
Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron
2018-04-01
Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantify the differences using maximum amplitude and power spectrum density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.
The Solar System large planets' influence on a new Maunder Minimum
NASA Astrophysics Data System (ADS)
Yndestad, Harald; Solheim, Jan-Erik
2016-04-01
In the 1890s, G. Spörer and E. W. Maunder reported that solar activity had stopped for a period of 70 years, from 1645 to 1715. Later reconstructions of solar activity confirm the grand minima Maunder (1640-1720), Spörer (1390-1550) and Wolf (1270-1340), and the minima Oort (1010-1070) and Dalton (1785-1810), since the year 1000 A.D. (Usoskin et al. 2007). These minimum periods have been associated with reduced irradiation from the Sun and cold climate periods on Earth. The identification of three grand Maunder-type periods and two Dalton-type periods within a thousand years indicates that sooner or later a new Maunder- or Dalton-type period will bring a colder climate on Earth. The cause of these minimum periods is not well understood. An expectation of a new Maunder-type period rests on the properties of solar variability: if solar variability has a deterministic element, we can better estimate the next Maunder grand minimum, whereas a purely random solar variability can only explain the past. This investigation is based on the simple idea that if solar variability has a deterministic property, it must have a deterministic source as a first cause; if this source is known, we can compute better estimates of the next expected Maunder grand minimum period. The study is based on a TSI ACRIM data series from 1700, a TSI ACRIM data series from 1000 A.D., a sunspot data series from 1611 and a solar barycenter orbit data series from 1000. The analysis is based on wavelet spectrum analysis to identify stationary periods, coincident periods and their phase relations. The results show that TSI variability and sunspot variability have deterministic oscillations controlled by the large planets Jupiter, Uranus and Neptune as the first cause. A deterministic model of TSI and sunspot variability reproduces the known minimum and grand minimum periods since 1000. From this deterministic model we may expect a new Maunder-type sunspot minimum period from about 2018 to 2055. The deterministic model of the TSI ACRIM data series from 1700 computes a new Maunder-type grand minimum period from 2015 to 2071, and a model of the longer TSI ACRIM data series from 1000 computes a new Dalton- to Maunder-type minimum irradiation period from 2047 to 2068.
Analysis of a model of gambiense sleeping sickness in humans and cattle.
Ndondo, A M; Munganga, J M W; Mwambakana, J N; Saad-Roy, C M; van den Driessche, P; Walo, R O
2016-01-01
Human African Trypanosomiasis (HAT) and Nagana in cattle, commonly called sleeping sickness, is caused by trypanosome protozoa transmitted by bites of infected tsetse flies. We present a deterministic model for the transmission of HAT caused by Trypanosoma brucei gambiense between human hosts, cattle hosts and tsetse flies. The model takes into account the growth of the tsetse fly, from its larval stage to the adult stage. Disease in the tsetse fly population is modeled by three compartments, and both the human and cattle populations are modeled by four compartments incorporating the two stages of HAT. We provide a rigorous derivation of the basic reproduction number R0. For R0 < 1, the disease free equilibrium is globally asymptotically stable, thus HAT dies out; whereas (assuming no return to susceptibility) for R0 >1, HAT persists. Elasticity indices for R0 with respect to different parameters are calculated with baseline parameter values appropriate for HAT in West Africa; indicating parameters that are important for control strategies to bring R0 below 1. Numerical simulations with R0 > 1 show values for the infected populations at the endemic equilibrium, and indicate that with certain parameter values, HAT could not persist in the human population in the absence of cattle.
Susdorf, R; Salama, N K G; Lusseau, D
2017-11-21
Atlantic salmon Salmo salar is an iconic species of high conservation and economic importance. At sea, individuals typically are subject to sea lice infestation, which can have detrimental effects on their host. Over recent decades, the body condition and marine survival in NE Atlantic stocks have generally decreased, reflected in fewer adults returning to rivers, which is partly attributable to sea lice. We developed a deterministic stage-structured population model to assess condition-mediated population dynamics resulting in changing fecundity, age at sexual maturation and marine survival rate. The model is parameterized using data from the North Esk system, north-east Scotland. Both constant and density-dependent juvenile survival rates are considered. We show that even small sea lice-mediated changes in mean body condition of MSW can cause substantial population declines, whereas 1SW condition is less influential. Density dependence alleviates the condition-mediated population effect. The resilience of the population to demographic perturbations declines as adult condition is reduced. Indirect demographic changes in salmonid life-history traits (e.g., body condition) are often considered unimportant for population trajectory. The model shows that Atlantic salmon population dynamics can be highly responsive to sea lice-mediated effects on adult body condition, thus highlighting the importance of non-lethal parasitic long-term effects. © 2017 The Authors Journal of Fish Diseases Published by John Wiley & Sons Ltd.
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
2002-03-02
An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the deterministic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
Dannemann, Teodoro; Boyer, Denis; Miramontes, Octavio
2018-04-10
Multiple-scale mobility is ubiquitous in nature and has become instrumental for understanding and modeling animal foraging behavior. However, the impact of individual movements on the long-term stability of populations remains largely unexplored. We analyze deterministic and stochastic Lotka-Volterra systems, where mobile predators consume scarce resources (prey) confined in patches. In fragile systems (that is, those unfavorable to species coexistence), the predator species has a maximized abundance and is resilient to degraded prey conditions when individual mobility is multiple scaled. Within the Lévy flight model, highly superdiffusive foragers rarely encounter prey patches and go extinct, whereas normally diffusing foragers tend to proliferate within patches, causing extinctions by overexploitation. Lévy flights of intermediate index allow a sustainable balance between patch exploitation and regeneration over wide ranges of demographic rates. Our analytical and simulated results can explain field observations and suggest that scale-free random movements are an important mechanism by which entire populations adapt to scarcity in fragmented ecosystems.
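A small sketch of the multiple-scale movement ingredient discussed above: sampling Lévy-flight step lengths from a power law p(l) ∝ l^(-mu), l ≥ l_min, by inverse-transform sampling. The function name and parameter values are assumptions for illustration; the study's full predator-prey dynamics are not reproduced here.

    import numpy as np

    def levy_steps(n, mu, l_min=1.0, rng=None):
        """Sample n step lengths from p(l) ~ l**(-mu), l >= l_min (1 < mu <= 3).

        Inverse-transform sampling: l = l_min * u**(-1/(mu - 1)).
        mu near 3 approaches normally diffusive (Brownian-like) movement,
        mu near 1 gives nearly ballistic, highly superdiffusive movement.
        """
        if rng is None:
            rng = np.random.default_rng()
        u = rng.random(n)
        return l_min * u ** (-1.0 / (mu - 1.0))

    print(levy_steps(5, mu=2.0, rng=np.random.default_rng(42)))   # intermediate Lévy index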
Impacts of Considering Climate Variability on Investment Decisions in Ethiopia
NASA Astrophysics Data System (ADS)
Strzepek, K.; Block, P.; Rosegrant, M.; Diao, X.
2005-12-01
In Ethiopia, climate extremes, inducing droughts or floods, are not unusual. Monitoring the effects of these extremes, and climate variability in general, is critical for economic prediction and assessment of the country's future welfare. The focus of this study involves adding climate variability to a deterministic, mean climate-driven agro-economic model, in an attempt to understand its effects and degree of influence on general economic prediction indicators for Ethiopia. Four simulations are examined, including a baseline simulation and three investment strategies: simulations of irrigation investment, roads investment, and a combination investment of both irrigation and roads. The deterministic model is transformed into a stochastic model by dynamically adding year-to-year climate variability through climate-yield factors. Nine sets of actual, historic, variable climate data are individually assembled and implemented into the 12-year stochastic model simulation, producing an ensemble of economic prediction indicators. This ensemble allows for a probabilistic approach to planning and policy making, allowing decision makers to consider risk. The economic indicators from the deterministic and stochastic approaches, including rates of return to investments, are significantly different. The predictions of the deterministic model appreciably overestimate the future welfare of Ethiopia; the predictions of the stochastic model, utilizing actual climate data, tend to give a better semblance of what may be expected. Inclusion of climate variability is vital for proper analysis of the predictor values from this agro-economic model.
NASA Astrophysics Data System (ADS)
Ramos, José A.; Mercère, Guillaume
2016-12-01
In this paper, we present an algorithm for identifying two-dimensional (2D) causal, recursive and separable-in-denominator (CRSD) state-space models in the Roesser form with deterministic-stochastic inputs. The algorithm implements the N4SID, PO-MOESP and CCA methods, which are well known in the literature on 1D system identification, but here we do so for the 2D CRSD Roesser model. The algorithm solves the 2D system identification problem by maintaining the constraint structure imposed by the problem (i.e. Toeplitz and Hankel) and computes the horizontal and vertical system orders, system parameter matrices and covariance matrices of a 2D CRSD Roesser model. From a computational point of view, the algorithm has been presented in a unified framework, where the user can select which of the three methods to use. Furthermore, the identification task is divided into three main parts: (1) computing the deterministic horizontal model parameters, (2) computing the deterministic vertical model parameters and (3) computing the stochastic components. Specific attention has been paid to the computation of a stabilised Kalman gain matrix and a positive real solution when required. The efficiency and robustness of the unified algorithm have been demonstrated via a thorough simulation example.
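As a pointer to the kind of structured data matrices these subspace methods (N4SID, PO-MOESP, CCA) rely on, the snippet below builds a generic block-Hankel matrix from an output sequence; it is a 1D-style illustration only and does not reproduce the 2D CRSD Roesser-specific arrangement described in the paper.

    import numpy as np

    def block_hankel(y, rows):
        """Build a block-Hankel matrix from a data sequence.

        y    : (n_samples, n_outputs) data sequence
        rows : number of block rows
        Returns a (rows * n_outputs, n_samples - rows + 1) matrix whose k-th
        block column stacks y[k], y[k+1], ..., y[k+rows-1].
        """
        y = np.asarray(y, dtype=float)
        if y.ndim == 1:
            y = y[:, None]
        n_samples, n_outputs = y.shape
        cols = n_samples - rows + 1
        H = np.empty((rows * n_outputs, cols))
        for i in range(rows):
            H[i * n_outputs:(i + 1) * n_outputs, :] = y[i:i + cols, :].T
        return H

    y = np.arange(10.0)              # toy scalar output sequence
    print(block_hankel(y, rows=3))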
NASA Astrophysics Data System (ADS)
Fischer, P.; Jardani, A.; Lecoq, N.
2018-02-01
In this paper, we present a novel inverse modeling method called Discrete Network Deterministic Inversion (DNDI) for mapping the geometry and properties of the discrete network of conduits and fractures in karstified aquifers. The DNDI algorithm is based on a coupled discrete-continuum concept to simulate water flow numerically in a model, and on a deterministic optimization algorithm to invert a set of observed piezometric data recorded during multiple pumping tests. In this method, the model is partitioned into subspaces piloted by a set of parameters (matrix transmissivity, and geometry and equivalent transmissivity of the conduits) that are considered unknown. In this way, the deterministic optimization process can iteratively correct the geometry of the network and the values of the properties until it converges to a global network geometry in a solution model able to reproduce the data set. An uncertainty analysis of this result can be performed from the maps of posterior uncertainties on the network geometry or on the property values. The method has been successfully tested on three theoretical, simplified study cases with hydraulic response data generated from hypothetical karstic models of increasing complexity in network geometry and matrix heterogeneity.
Khammash, Mustafa
2014-01-01
Reaction networks are systems in which the populations of a finite number of species evolve through predefined interactions. Such networks are found as modeling tools in many biological disciplines such as biochemistry, ecology, epidemiology, immunology, systems biology and synthetic biology. It is now well-established that, for small population sizes, stochastic models for biochemical reaction networks are necessary to capture randomness in the interactions. The tools for analyzing such models, however, still lag far behind their deterministic counterparts. In this paper, we bridge this gap by developing a constructive framework for examining the long-term behavior and stability properties of the reaction dynamics in a stochastic setting. In particular, we address the problems of determining ergodicity of the reaction dynamics, which is analogous to having a globally attracting fixed point for deterministic dynamics. We also examine when the statistical moments of the underlying process remain bounded with time and when they converge to their steady state values. The framework we develop relies on a blend of ideas from probability theory, linear algebra and optimization theory. We demonstrate that the stability properties of a wide class of biological networks can be assessed from our sufficient theoretical conditions that can be recast as efficient and scalable linear programs, well-known for their tractability. It is notably shown that the computational complexity is often linear in the number of species. We illustrate the validity, the efficiency and the wide applicability of our results on several reaction networks arising in biochemistry, systems biology, epidemiology and ecology. The biological implications of the results as well as an example of a non-ergodic biological network are also discussed. PMID:24968191
Stochastic simulations on a model of circadian rhythm generation.
Miura, Shigehiro; Shimokawa, Tetsuya; Nomura, Taishin
2008-01-01
Biological phenomena are often modeled by differential equations, in which the states of a model system are described by continuous real values. When we treat the concentrations of molecules as dynamical variables for a set of biochemical reactions, we implicitly assume that the numbers of molecules are large enough that their changes can be regarded as continuous and described deterministically. For a system with small numbers of molecules, however, changes in their numbers are clearly discrete and molecular noise becomes significant. In such cases, models based on deterministic differential equations may be inappropriate, and the reactions must be described by stochastic equations. In this study, we focus on clock gene expression for circadian rhythm generation, a system known to involve small numbers of molecules. It is therefore appropriate to model the system with stochastic equations and to analyze it with stochastic simulation methodologies. The interlocked feedback model proposed by Ueda et al. as a set of deterministic ordinary differential equations provides the basis of our analyses. We apply two stochastic simulation methods, namely Gillespie's direct method and the stochastic differential equation method also by Gillespie, to the interlocked feedback model. To this end, we first reformulate the original differential equations back into elementary chemical reactions. With those reactions, we simulate and analyze the dynamics of the model using the two methods, compare the results with the dynamics obtained from the original deterministic model, and characterize how the dynamics depend on the simulation methodology.
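For readers unfamiliar with the methodology, the following is a minimal implementation of Gillespie's direct method for a toy two-reaction (synthesis/degradation) system; it is a stand-in for the elementary reactions of the interlocked feedback model, with assumed rate constants.

    import numpy as np

    def gillespie_birth_death(k_syn=10.0, k_deg=0.5, x0=0, t_end=50.0, seed=0):
        """Gillespie's direct method for  0 -> mRNA (k_syn),  mRNA -> 0 (k_deg).

        At each step, draw the waiting time from an exponential distribution
        with rate equal to the total propensity, then pick a reaction with
        probability proportional to its propensity.
        """
        rng = np.random.default_rng(seed)
        t, x = 0.0, x0
        times, counts = [t], [x]
        while t < t_end:
            a1 = k_syn            # synthesis propensity
            a2 = k_deg * x        # degradation propensity
            a0 = a1 + a2
            t += rng.exponential(1.0 / a0)
            if rng.random() * a0 < a1:
                x += 1            # synthesis event
            else:
                x -= 1            # degradation event
            times.append(t)
            counts.append(x)
        return np.array(times), np.array(counts)

    t, x = gillespie_birth_death()
    print(x[-5:])   # copy numbers fluctuate around k_syn / k_deg = 20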
NASA Astrophysics Data System (ADS)
Lye, Ribin; Tan, James Peng Lung; Cheong, Siew Ann
2012-11-01
We describe a bottom-up framework, based on the identification of appropriate order parameters and determination of phase diagrams, for understanding progressively refined agent-based models and simulations of financial markets. We illustrate this framework by starting with a deterministic toy model, whereby N independent traders buy and sell M stocks through an order book that acts as a clearing house. The price of a stock increases whenever it is bought and decreases whenever it is sold. Price changes are updated by the order book before the next transaction takes place. In this deterministic model, all traders based their buy decisions on a call utility function, and all their sell decisions on a put utility function. We then make the agent-based model more realistic, by either having a fraction fb of traders buy a random stock on offer, or a fraction fs of traders sell a random stock in their portfolio. Based on our simulations, we find that it is possible to identify useful order parameters from the steady-state price distributions of all three models. Using these order parameters as a guide, we find three phases: (i) the dead market; (ii) the boom market; and (iii) the jammed market in the phase diagram of the deterministic model. Comparing the phase diagrams of the stochastic models against that of the deterministic model, we realize that the primary effect of stochasticity is to eliminate the dead market phase.
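A compact sketch in the spirit of the toy model described above: traders buy and sell stocks through a simple price-impact rule, with a fraction f_b of buys made at random as in the stochastic variant. The "utility" rules, tick size and all parameters here are placeholders, not the authors' specification.

    import numpy as np

    rng = np.random.default_rng(3)

    n_traders, n_stocks, n_steps = 50, 10, 1000
    prices = np.ones(n_stocks)
    tick = 0.01                     # price impact of a single trade (assumed)
    f_b = 0.1                       # fraction of traders buying a random stock

    for _ in range(n_steps):
        for trader in range(n_traders):
            if rng.random() < f_b:
                buy = rng.integers(n_stocks)       # random buy (stochastic variant)
            else:
                buy = np.argmin(prices)            # stand-in "call utility": buy the cheapest
            prices[buy] += tick                    # order book: buying raises the price
            sell = np.argmax(prices)               # stand-in "put utility": sell the dearest
            prices[sell] = max(prices[sell] - tick, tick)   # selling lowers it, floor at one tick

    print(np.sort(prices))          # inspect the steady-state price distribution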
Distinguishing between stochasticity and determinism: Examples from cell cycle duration variability.
Pearl Mizrahi, Sivan; Sandler, Oded; Lande-Diner, Laura; Balaban, Nathalie Q; Simon, Itamar
2016-01-01
We describe a recent approach for distinguishing between stochastic and deterministic sources of variability, focusing on the mammalian cell cycle. Variability between cells is often attributed to stochastic noise, although it may be generated by deterministic components. Interestingly, lineage information can be used to distinguish between variability and determinism. Analysis of correlations within a lineage of the mammalian cell cycle duration revealed its deterministic nature. Here, we discuss the sources of such variability and the possibility that the underlying deterministic process is due to the circadian clock. Finally, we discuss the "kicked cell cycle" model and its implication on the study of the cell cycle in healthy and cancerous tissues. © 2015 WILEY Periodicals, Inc.
Plenary: Progress in Regional Landslide Hazard Assessment—Examples from the USA
Baum, Rex L.; Schulz, William; Brien, Dianne L.; Burns, William J.; Reid, Mark E.; Godt, Jonathan W.
2014-01-01
Landslide hazard assessment at local and regional scales contributes to mitigation of landslides in developing and densely populated areas by providing information for (1) land development and redevelopment plans and regulations, (2) emergency preparedness plans, and (3) economic analysis to (a) set priorities for engineered mitigation projects and (b) define areas of similar levels of hazard for insurance purposes. US Geological Survey (USGS) research on landslide hazard assessment has explored a range of methods that can be used to estimate temporal and spatial landslide potential and probability for various scales and purposes. Cases taken primarily from our work in the U.S. Pacific Northwest illustrate and compare a sampling of methods, approaches, and progress. For example, landform mapping using high-resolution topographic data resulted in identification of about four times more landslides in Seattle, Washington, than previous efforts using aerial photography. Susceptibility classes based on the landforms captured 93 % of all historical landslides (all types) throughout the city. A deterministic model for rainfall infiltration and shallow landslide initiation, TRIGRS, was able to identify locations of 92 % of historical shallow landslides in southwest Seattle. The potentially unstable areas identified by TRIGRS occupied only 26 % of the slope areas steeper than 20°. Addition of an unsaturated infiltration model to TRIGRS expands the applicability of the model to areas of highly permeable soils. Replacement of the single cell, 1D factor of safety with a simple 3D method of columns improves accuracy of factor of safety predictions for both saturated and unsaturated infiltration models. A 3D deterministic model for large, deep landslides, SCOOPS, combined with a three-dimensional model for groundwater flow, successfully predicted instability in steep areas of permeable outwash sand and topographic reentrants. These locations are consistent with locations of large, deep, historically active landslides. For an area in Seattle, a composite of the three maps illustrates how maps produced by different approaches might be combined to assess overall landslide potential. Examples from Oregon, USA, illustrate how landform mapping and deterministic analysis for shallow landslide potential have been adapted into standardized methods for efficiently producing detailed landslide inventory and shallow landslide susceptibility maps that have consistent content and format statewide.
Erickson, Richard A.; Thogmartin, Wayne E.; Szymanski, Jennifer A.
2014-01-01
Background: Myotis species of bats such as the Indiana Bat and Little Brown Bat are facing population declines because of White-nose syndrome (WNS). These species also face threats from anthropogenic activities such as wind energy development. Population models may be used to provide insights into threats facing these species. We developed a population model, BatTool, as an R package to help decision makers and natural resource managers examine factors influencing the dynamics of these species. The R package includes two components: 1) a deterministic and stochastic model that are accessible from the command line and 2) a graphical user interface (GUI). Results: BatTool is an R package allowing natural resource managers and decision makers to understand Myotis spp. population dynamics. Through the use of a GUI, the model allows users to understand how WNS and other take events may affect the population. The results are saved both graphically and as data files. Additionally, R-savvy users may access the population functions through the command line and reuse the code as part of future research. This R package could also be used as part of a population dynamics or wildlife management course. Conclusions: BatTool provides access to a Myotis spp. population model. This tool can help natural resource managers and decision makers with the Endangered Species Act deliberations for these species and with issuing take permits as part of regulatory decision making. The tool is available online as part of this publication.
Dalhoff, A; Ambrose, P G; Mouton, J W
2009-08-01
Since the origin of an "International Collaborative Study on Antibiotic Sensitivity Testing" in 1971, considerable advancement has been made to standardize clinical susceptibility testing procedures of antimicrobial agents. However, a consensus on the methods to be used and interpretive criteria was not reached, so the results of susceptibility testing were discrepant. Recently, the European Committee on Antimicrobial Susceptibility Testing achieved a harmonization of existing methods for susceptibility testing and now co-ordinates the process for setting breakpoints. Previously, breakpoints were set by adjusting the mean pharmacokinetic parameters derived from healthy volunteers to the susceptibilities of a population of potential pathogens expressed as the mean minimum inhibitory concentration (MIC) or MIC90. Breakpoints derived by the deterministic approach tend to be too high, since this procedure does not take the variabilities of drug exposure and the susceptibility patterns into account. Therefore, first-step mutants or borderline-susceptible bacteria may be considered as fully susceptible. As the drug exposure of such sub-populations is inadequate, resistance development will increase and eradication rates will decrease, resulting in clinical failure. The science of pharmacokinetics/pharmacodynamics integrates all possible drug exposures for standard dosage regimens and all MIC values likely to be found for the clinical isolates into the breakpoint definitions. Ideally, the data sets used originate from patients suffering from the disease to be treated. Probability density functions for both the pharmacokinetic and microbiological variables are determined, and a large number of MIC/drug exposure scenarios are calculated. Therefore, this method is defined as the probabilistic approach. The breakpoints thus derived are lower than the ones defined deterministically, as the entire range of probable drug exposures from low to high is modeled. Therefore, the amplification of drug-resistant sub-populations will be reduced. It has been a long journey since the first attempts in 1971 to define breakpoints. Clearly, this implies that none of the various approaches is right or wrong, and that the different approaches reflect different philosophies and mirror the tremendous progress made in the understanding of the pharmacodynamic properties of different classes of antimicrobials.
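The probabilistic breakpoint logic described above can be made concrete with a short Monte Carlo sketch: draw a drug exposure and a pathogen MIC for many simulated patients and compute how often a pharmacodynamic target is attained. The distributions, the AUC/MIC >= 100 target, and all numbers below are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Monte-Carlo sketch of the probabilistic breakpoint logic: sample drug exposure
# (here the 24-h AUC) and pathogen MICs from assumed distributions and compute how
# often a pharmacodynamic target such as AUC/MIC >= 100 is attained. All
# distributions, the target, and the numbers are illustrative, not from the paper.
rng = np.random.default_rng(10)
n = 100_000
auc = rng.lognormal(mean=np.log(40.0), sigma=0.35, size=n)        # mg*h/L, between-patient PK
mic_values = np.array([0.06, 0.12, 0.25, 0.5, 1.0, 2.0])          # mg/L
mic_freqs = np.array([0.10, 0.25, 0.30, 0.20, 0.10, 0.05])        # assumed wild-type MIC distribution
mic = rng.choice(mic_values, size=n, p=mic_freqs)

target_attained = (auc / mic) >= 100.0
print(f"overall probability of target attainment: {target_attained.mean():.2f}")
for m in mic_values:                                               # per-MIC PTA, the basis for a breakpoint
    print(f"  MIC {m:>4}: PTA = {target_attained[mic == m].mean():.2f}")
```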
The threshold of a stochastic delayed SIR epidemic model with temporary immunity
NASA Astrophysics Data System (ADS)
Liu, Qun; Chen, Qingmei; Jiang, Daqing
2016-05-01
This paper is concerned with the asymptotic properties of a stochastic delayed SIR epidemic model with temporary immunity. Sufficient conditions for extinction and persistence in the mean of the epidemic are established. The threshold between persistence in the mean and extinction of the epidemic is obtained. Compared with the corresponding deterministic model, the threshold affected by the white noise is smaller than the basic reproduction number R0 of the deterministic system.
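As a rough illustration of how white noise can depress an epidemic threshold below R0, the following sketch integrates a noisy SIR system with the Euler-Maruyama scheme. It omits the delay and temporary immunity of the cited model, the parameters are invented, and the noise-corrected threshold shown is one common form from the stochastic-epidemic literature rather than the exact threshold derived in the paper.

```python
import numpy as np

# Euler-Maruyama sketch of an SIR system whose transmission term is perturbed by
# white noise. It omits the delay and temporary immunity of the cited model; the
# parameters and the noise-corrected threshold below are illustrative assumptions.
beta, gamma, mu, sigma = 0.4, 0.1, 0.02, 0.15     # transmission, recovery, demography, noise
dt, steps = 0.01, 100_000
rng = np.random.default_rng(1)

S, I = 0.99, 0.01
for _ in range(steps):
    dW = rng.normal(0.0, np.sqrt(dt))
    new_inf = beta * S * I * dt + sigma * S * I * dW   # stochastic incidence term
    S += mu * dt - mu * S * dt - new_inf
    I += new_inf - (gamma + mu) * I * dt
    I = max(I, 0.0)

R0 = beta / (gamma + mu)                        # deterministic basic reproduction number
R0_noise = R0 - sigma**2 / (2 * (gamma + mu))   # one common noise-reduced threshold form
print(f"R0 = {R0:.2f}, noise-adjusted threshold ~ {R0_noise:.2f}, final I = {I:.4f}")
```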
Mapping flood hazards under uncertainty through probabilistic flood inundation maps
NASA Astrophysics Data System (ADS)
Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.
2017-12-01
Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods utilized to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic reflecting uncertainty quantified through Monte-Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and range of variability in flood extents resulting from Monte-Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches of portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
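The Monte Carlo idea behind probabilistic inundation mapping can be sketched in a few lines: propagate random draws of discharge and roughness through a hydraulic model and record, cell by cell, how often each location is wet. The hydraulic_model function below is a placeholder stage-discharge relation, not a real hydrodynamic solver, and all numbers are illustrative.

```python
import numpy as np

# Monte-Carlo wrapper around a stand-in "hydraulic model": for each draw of uncertain
# discharge and roughness, mark which cells of a 1-D transect are wet, then report the
# per-cell probability of inundation. hydraulic_model() is a placeholder relation.
rng = np.random.default_rng(0)
ground = np.linspace(0.0, 3.0, 50)                    # transect elevations (m), illustrative

def hydraulic_model(discharge, roughness):
    """Placeholder stage-discharge relation returning a water-surface elevation."""
    return 1.2 * (roughness * discharge) ** 0.3

n_runs = 5_000
wet_counts = np.zeros_like(ground)
for _ in range(n_runs):
    q = rng.lognormal(mean=np.log(200.0), sigma=0.3)  # uncertain discharge (m3/s)
    n = rng.uniform(0.03, 0.08)                       # uncertain Manning roughness
    wet_counts += (ground < hydraulic_model(q, n))

inundation_prob = wet_counts / n_runs                 # probabilistic "map" (1-D profile here)
deterministic = ground < hydraulic_model(200.0, 0.05) # single regulatory scenario
print("cells certainly wet:", int((inundation_prob > 0.99).sum()),
      "| possibly wet:", int(((inundation_prob > 0.01) & (inundation_prob <= 0.99)).sum()),
      "| deterministically wet:", int(deterministic.sum()))
```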
A queuing model for road traffic simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.
We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state-dependent queuing model, and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state-dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model in order to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme.
Noise-induced extinction for a ratio-dependent predator-prey model with strong Allee effect in prey
NASA Astrophysics Data System (ADS)
Mandal, Partha Sarathi
2018-04-01
In this paper, we study a stochastically forced ratio-dependent predator-prey model with a strong Allee effect in the prey population. In the deterministic case, we show that the model exhibits either a stable interior equilibrium point or a limit cycle corresponding to the co-existence of both species. We investigate a probabilistic mechanism of noise-induced extinction in a zone of the stable interior equilibrium point. Computational methods based on the stochastic sensitivity function technique are applied for the analysis of the dispersion of random states near the stable interior equilibrium point. This method allows us to construct a confidence domain and estimate the threshold value of the noise intensity for a transition from coexistence to extinction.
Effect of nonlinearity in hybrid kinetic Monte Carlo-continuum models.
Balter, Ariel; Lin, Guang; Tartakovsky, Alexandre M
2012-01-01
Recently there has been interest in developing efficient ways to model heterogeneous surface reactions with hybrid computational models that couple a kinetic Monte Carlo (KMC) model for a surface to a finite-difference model for bulk diffusion in a continuous domain. We consider two representative problems that validate a hybrid method and show that this method captures the combined effects of nonlinearity and stochasticity. We first validate a simple deposition-dissolution model with a linear rate showing that the KMC-continuum hybrid agrees with both a fully deterministic model and its analytical solution. We then study a deposition-dissolution model including competitive adsorption, which leads to a nonlinear rate, and show that in this case the KMC-continuum hybrid and fully deterministic simulations do not agree. However, we are able to identify the difference as a natural result of the stochasticity coming from the KMC surface process. Because KMC captures inherent fluctuations, we consider it to be more realistic than a purely deterministic model. Therefore, we consider the KMC-continuum hybrid to be more representative of a real system.
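A minimal kinetic Monte Carlo sketch of the linear deposition-dissolution case, compared with its deterministic mean-field limit, illustrates why the two agree when rates are linear. The rate constants and lattice size below are arbitrary choices, not values from the study.

```python
import numpy as np

# Gillespie-type kinetic Monte Carlo for linear deposition/dissolution on a lattice of
# surface sites, compared with the deterministic mean-field coverage from
# d(theta)/dt = ka*(1 - theta) - kd*theta. Rates and lattice size are arbitrary.
rng = np.random.default_rng(2)
n_sites, ka, kd, t_end = 1_000, 1.0, 0.5, 5.0

def kmc_run():
    t, occupied = 0.0, 0
    while t < t_end:
        r_dep = ka * (n_sites - occupied)   # deposition propensity
        r_dis = kd * occupied               # dissolution propensity
        total = r_dep + r_dis
        t += rng.exponential(1.0 / total)   # waiting time to the next event
        occupied += 1 if rng.random() < r_dep / total else -1
    return occupied / n_sites

theta_kmc = np.mean([kmc_run() for _ in range(20)])
theta_det = ka / (ka + kd)                  # deterministic steady-state coverage
print(f"KMC coverage ~ {theta_kmc:.3f}, deterministic coverage = {theta_det:.3f}")
```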
Kinetic theory of age-structured stochastic birth-death processes
NASA Astrophysics Data System (ADS)
Greenman, Chris D.; Chou, Tom
2016-01-01
Classical age-structured mass-action models such as the McKendrick-von Foerster equation have been extensively studied but are unable to describe stochastic fluctuations or population-size-dependent birth and death rates. Stochastic theories that treat semi-Markov age-dependent processes using, e.g., the Bellman-Harris equation do not resolve a population's age structure and are unable to quantify population-size dependencies. Conversely, current theories that include size-dependent population dynamics (e.g., mathematical models that include carrying capacity such as the logistic equation) cannot be easily extended to take into account age-dependent birth and death rates. In this paper, we present a systematic derivation of a new, fully stochastic kinetic theory for interacting age-structured populations. By defining multiparticle probability density functions, we derive a hierarchy of kinetic equations for the stochastic evolution of an aging population undergoing birth and death. We show that the fully stochastic age-dependent birth-death process precludes factorization of the corresponding probability densities, which then must be solved by using a Bogoliubov--Born--Green--Kirkwood--Yvon-like hierarchy. Explicit solutions are derived in three limits: no birth, no death, and steady state. These are then compared with their corresponding mean-field results. Our results generalize both deterministic models and existing master equation approaches by providing an intuitive and efficient way to simultaneously model age- and population-dependent stochastic dynamics applicable to the study of demography, stem cell dynamics, and disease evolution.
Hahl, Sayuri K; Kremling, Andreas
2016-01-01
In the mathematical modeling of biochemical reactions, a convenient standard approach is to use ordinary differential equations (ODEs) that follow the law of mass action. However, this deterministic ansatz is based on simplifications; in particular, it neglects noise, which is inherent to biological processes. In contrast, the stochasticity of reactions is captured in detail by the discrete chemical master equation (CME). Therefore, the CME is frequently applied to mesoscopic systems, where copy numbers of involved components are small and random fluctuations are thus significant. Here, we compare those two common modeling approaches, aiming at identifying parallels and discrepancies between deterministic variables and possible stochastic counterparts like the mean or modes of the state space probability distribution. To that end, a mathematically flexible reaction scheme of autoregulatory gene expression is translated into the corresponding ODE and CME formulations. We show that in the thermodynamic limit, deterministic stable fixed points usually correspond well to the modes in the stationary probability distribution. However, this connection might be disrupted in small systems. The discrepancies are characterized and systematically traced back to the magnitude of the stoichiometric coefficients and to the presence of nonlinear reactions. These factors are found to synergistically promote large and highly asymmetric fluctuations. As a consequence, bistable but unimodal, and monostable but bimodal systems can emerge. This clearly challenges the role of ODE modeling in the description of cellular signaling and regulation, where some of the involved components usually occur in low copy numbers. Nevertheless, systems whose bimodality originates from deterministic bistability are found to sustain a more robust separation of the two states compared to bimodal, but monostable systems. In regulatory circuits that require precise coordination, ODE modeling is thus still expected to provide relevant indications on the underlying dynamics.
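The contrast between ODE fixed points and the behavior of the stochastic process can be reproduced with a short Gillespie simulation of a positively autoregulated gene. The Hill-type production rate and all parameter values below are invented so that the deterministic system is bistable; they are not the reaction scheme analyzed in the paper.

```python
import numpy as np

# Gillespie (SSA) sketch of a positively autoregulated gene contrasted with the fixed
# points of the mass-action ODE dn/dt = production(n) - d*n. Parameters are invented
# and chosen so that the deterministic system is bistable.
rng = np.random.default_rng(3)
a0, a1, K, h, d = 4.0, 60.0, 25.0, 4, 1.0

def production(n):
    return a0 + a1 * n**h / (K**h + n**h)   # basal plus Hill-type positive feedback

# Deterministic fixed points: sign changes of the drift on an integer grid.
grid = np.arange(0, 200)
drift = production(grid) - d * grid
fixed_points = grid[:-1][np.sign(drift[:-1]) != np.sign(drift[1:])]

# Stochastic simulation of the same scheme (protein copy number).
samples, n, t = [], 0, 0.0
while t < 500.0:
    rates = np.array([production(n), d * n])
    total = rates.sum()
    t += rng.exponential(1.0 / total)
    n += 1 if rng.random() < rates[0] / total else -1
    samples.append(n)

print("ODE fixed points near:", fixed_points)
print("SSA mean copy number:", round(float(np.mean(samples)), 1))
```

Depending on the noise level, the resulting copy-number histogram may or may not place its modes at the deterministic fixed points, which is the kind of discrepancy the study characterizes.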
Deterministic Mean-Field Ensemble Kalman Filtering
Law, Kody J. H.; Tembine, Hamidou; Tempone, Raul
2016-05-03
The proof of convergence of the standard ensemble Kalman filter (EnKF) from Le Gland, Monbet, and Tran [Large sample asymptotics for the ensemble Kalman filter, in The Oxford Handbook of Nonlinear Filtering, Oxford University Press, Oxford, UK, 2011, pp. 598--631] is extended to non-Gaussian state-space models. In this paper, a density-based deterministic approximation of the mean-field limit EnKF (DMFEnKF) is proposed, consisting of a PDE solver and a quadrature rule. Given a certain minimal order of convergence κ between the two, this extends to the deterministic filter approximation, which is therefore asymptotically superior to standard EnKF for dimension d
Yin, Shen; Gao, Huijun; Qiu, Jianbin; Kaynak, Okyay
2017-11-01
Data-driven fault detection plays an important role in industrial systems due to its applicability in case of unknown physical models. In fault detection, disturbances must be taken into account as an inherent characteristic of processes. Nevertheless, fault detection for nonlinear processes with deterministic disturbances still receive little attention, especially in data-driven field. To solve this problem, a just-in-time learning-based data-driven (JITL-DD) fault detection method for nonlinear processes with deterministic disturbances is proposed in this paper. JITL-DD employs JITL scheme for process description with local model structures to cope with processes dynamics and nonlinearity. The proposed method provides a data-driven fault detection solution for nonlinear processes with deterministic disturbances, and owns inherent online adaptation and high accuracy of fault detection. Two nonlinear systems, i.e., a numerical example and a sewage treatment process benchmark, are employed to show the effectiveness of the proposed method.
A hybrid model for predicting carbon monoxide from vehicular exhausts in urban environments
NASA Astrophysics Data System (ADS)
Gokhale, Sharad; Khare, Mukesh
Several deterministic-based air quality models evaluate and predict the frequently occurring pollutant concentrations well but, in general, are incapable of predicting the 'extreme' concentrations. In contrast, the statistical distribution models overcome the above limitation of the deterministic models and predict the 'extreme' concentrations. However, the environmental damages are caused by both extremes as well as by the sustained average concentration of pollutants. Hence, the model should predict not only the 'extreme' ranges but also the 'middle' ranges of pollutant concentrations, i.e. the entire range. Hybrid modelling is one of the techniques that estimates/predicts the 'entire range' of the distribution of pollutant concentrations by combining the deterministic-based models with suitable statistical distribution models (Jakeman et al., 1988). In the present paper, a hybrid model has been developed to predict the carbon monoxide (CO) concentration distributions at one of the traffic intersections, Income Tax Office (ITO), in the Delhi city, where the traffic is heterogeneous in nature, consisting of light vehicles, heavy vehicles, three-wheelers (auto rickshaws) and two-wheelers (scooters, motorcycles, etc.), and the meteorology is 'tropical'. The model combines the general finite line source model (GFLSM) as its deterministic component and the log-logistic distribution (LLD) model as its statistical component. The hybrid (GFLSM-LLD) model is then applied at the ITO intersection. The results show that the hybrid model predictions match the observed CO concentration data within the 5-99 percentile range. The model is further validated at a different street location, i.e. the Sirifort roadway. The validation results show that the model predicts CO concentrations fairly well (d=0.91) in the 10-95 percentile range. A regulatory compliance criterion is also developed to estimate the probability of exceedance of hourly CO concentrations beyond the National Ambient Air Quality Standards (NAAQS) of India.
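The hybrid idea, a deterministic estimate anchoring the centre of the distribution and a statistical model supplying the tails, can be sketched as follows. The data are synthetic, the limit value is hypothetical, and the deterministic anchor is a simple median rather than an actual GFLSM run.

```python
import numpy as np
from scipy import stats

# Sketch of the hybrid idea: a deterministic estimate fixes the centre of the
# concentration distribution, while a log-logistic (Fisk) distribution fitted to the
# data supplies the full range and the exceedance probability of a limit value. The
# "observed" data are synthetic and the limit value is hypothetical.
rng = np.random.default_rng(4)
observed_co = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=1_000)   # mg/m3, synthetic

deterministic_estimate = np.median(observed_co)      # stand-in for a GFLSM prediction
shape, loc, scale = stats.fisk.fit(observed_co, floc=0.0, fscale=deterministic_estimate)

limit = 4.0                                           # hypothetical 1-h CO standard, mg/m3
p_exceed = stats.fisk.sf(limit, shape, loc=loc, scale=scale)
print(f"P(CO > {limit} mg/m3) ~ {p_exceed:.3f}")
```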
Dini-Andreote, Francisco; Stegen, James C; van Elsas, Jan Dirk; Salles, Joana Falcão
2015-03-17
Ecological succession and the balance between stochastic and deterministic processes are two major themes within microbial ecology, but these conceptual domains have mostly developed independent of each other. Here we provide a framework that integrates shifts in community assembly processes with microbial primary succession to better understand mechanisms governing the stochastic/deterministic balance. Synthesizing previous work, we devised a conceptual model that links ecosystem development to alternative hypotheses related to shifts in ecological assembly processes. Conceptual model hypotheses were tested by coupling spatiotemporal data on soil bacterial communities with environmental conditions in a salt marsh chronosequence spanning 105 years of succession. Analyses within successional stages showed community composition to be initially governed by stochasticity, but as succession proceeded, there was a progressive increase in deterministic selection correlated with increasing sodium concentration. Analyses of community turnover among successional stages--which provide a larger spatiotemporal scale relative to within stage analyses--revealed that changes in the concentration of soil organic matter were the main predictor of the type and relative influence of determinism. Taken together, these results suggest scale-dependency in the mechanisms underlying selection. To better understand mechanisms governing these patterns, we developed an ecological simulation model that revealed how changes in selective environments cause shifts in the stochastic/deterministic balance. Finally, we propose an extended--and experimentally testable--conceptual model integrating ecological assembly processes with primary and secondary succession. This framework provides a priori hypotheses for future experiments, thereby facilitating a systematic approach to understand assembly and succession in microbial communities across ecosystems.
Modeling sandhill crane population dynamics
Johnson, D.H.
1979-01-01
The impact of sport hunting on the Central Flyway population of sandhill cranes (Grus canadensis) has been a subject of controversy for several years. A recent study (Buller 1979) presented new and important information on sandhill crane population dynamics. The present report is intended to incorporate that and other information into a mathematical model for the purpose of assessing the long-range impact of hunting on the population of sandhill cranes.The model is a simple deterministic system that embodies density-dependent rates of survival and recruitment. The model employs four kinds of data: (1) spring population size of sandhill cranes, estimated from aerial surveys to be between 250,000 and 400,000 birds; (2) age composition in fall, estimated for 1974-76 to be 11.3% young; (3) annual harvest of cranes, estimated from a variety of sources to be about 5 to 7% of the spring population; and (4) age composition of harvested cranes, which was difficult to estimate but suggests that immatures were 2 to 4 times as vulnerable to hunting as adults.Because the true nature of sandhill crane population dynamics remains so poorly understood, it was necessary to try numerous (768 in all) combinations of survival and recruitment functions, and focus on the relatively few (37) that yielded population sizes and age structures comparable to those extant in the real population. Hunting was then applied to those simulated populations. In all combinations, hunting resulted in a lower asymptotic crane population, the decline ranging from 5 to 54%. The median decline was 22%, which suggests that a hunted sandhill crane population might be about three-fourths as large as it would be if left unhunted. Results apply to the aggregate of the three subspecies in the Central Flyway; individual subspecies or populations could be affected to a greater or lesser degree.
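A stripped-down version of such a deterministic, density-dependent projection with harvest can be written in a few lines. The recruitment function, the fixed survival rate, and the 6% harvest rate below are illustrative stand-ins rather than the functions fitted in the report, but they reproduce the qualitative conclusion that a modest harvest depresses the asymptotic population size.

```python
# Deterministic, density-dependent projection with harvest, in the spirit of the
# crane model described above. The recruitment function, the fixed survival rate,
# and the 6% harvest rate are illustrative stand-ins, not the fitted functions.
def project(n0=300_000, harvest_rate=0.06, years=200):
    n = n0
    for _ in range(years):
        recruitment = 0.395 - 0.316 * (n / 400_000)   # young per adult, density dependent
        n = n * (1.0 + recruitment) * 0.90            # fall population after annual survival
        n *= (1.0 - harvest_rate)                     # harvest applied to the fall population
    return n

unhunted = project(harvest_rate=0.0)
hunted = project(harvest_rate=0.06)
print(f"asymptotic size: unhunted {unhunted:,.0f}, hunted {hunted:,.0f}, "
      f"decline {100 * (1 - hunted / unhunted):.0f}%")
```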
Risk assessment for furan contamination through the food chain in Belgian children.
Scholl, Georges; Huybrechts, Inge; Humblet, Marie-France; Scippo, Marie-Louise; De Pauw, Edwin; Eppe, Gauthier; Saegerman, Claude
2012-08-01
Young, old, pregnant and immuno-compromised persons are of great concern for risk assessors as they represent the sub-populations most at risk. The present paper focuses on risk assessment linked to furan exposure in children. Only the Belgian population was considered because individual contamination and consumption data that are required for accurate risk assessment were available for Belgian children only. Two risk assessment approaches, the so-called deterministic and probabilistic, were applied and the results were compared for the estimation of daily intake. A significant difference between the average Estimated Daily Intake (EDI) was underlined between the deterministic (419 ng kg⁻¹ body weight (bw) day⁻¹) and the probabilistic (583 ng kg⁻¹ bw day⁻¹) approaches, which results from the mathematical treatment of the null consumption and contamination data. The risk was characterised by two ways: (1) the classical approach by comparison of the EDI to a reference dose (RfD(chronic-oral)) and (2) the most recent approach, namely the Margin of Exposure (MoE) approach. Both reached similar conclusions: the risk level is not of a major concern, but is neither negligible. In the first approach, only 2.7 or 6.6% (respectively in the deterministic and in the probabilistic way) of the studied population presented an EDI above the RfD(chronic-oral). In the second approach, the percentage of children displaying a MoE above 10,000 and below 100 is 3-0% and 20-0.01% in the deterministic and probabilistic modes, respectively. In addition, children were compared to adults and significant differences between the contamination patterns were highlighted. While major contamination was linked to coffee consumption in adults (55%), no item predominantly contributed to the contamination in children. The most important were soups (19%), dairy products (17%), pasta and rice (11%), fruit and potatoes (9% each).
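The difference between deterministic and probabilistic exposure assessment comes down to whether intake is computed from single point values or from full distributions. The following sketch draws consumption, contamination and body weight per simulated child and summarizes the resulting Estimated Daily Intake; every distribution and the reference-dose and benchmark-dose values are hypothetical placeholders, not the Belgian survey data.

```python
import numpy as np

# Probabilistic exposure sketch: draw consumption, contamination and body weight per
# simulated child, form the Estimated Daily Intake (EDI), and compare it with a
# reference dose and a Margin of Exposure. Every distribution and the RfD/BMDL
# values are hypothetical placeholders, not the Belgian survey data.
rng = np.random.default_rng(5)
n_children = 100_000
body_weight = rng.normal(25.0, 5.0, n_children).clip(10.0, 60.0)       # kg
consumption = rng.lognormal(np.log(150.0), 0.6, n_children)            # g food per day
contamination = rng.lognormal(np.log(60.0), 0.8, n_children)           # ng furan per g food

edi = consumption * contamination / body_weight                        # ng/kg bw/day

rfd = 1_000.0        # hypothetical chronic oral reference dose, ng/kg bw/day
bmdl = 1_000_000.0   # hypothetical benchmark dose lower bound, ng/kg bw/day
moe = bmdl / edi

print(f"mean EDI {edi.mean():.0f}, 97.5th percentile {np.percentile(edi, 97.5):.0f} ng/kg bw/day")
print(f"share above RfD: {100 * np.mean(edi > rfd):.1f}%, median MoE: {np.median(moe):,.0f}")
```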
A model for cancer tissue heterogeneity.
Mohanty, Anwoy Kumar; Datta, Aniruddha; Venkatraj, Vijayanagaram
2014-03-01
An important problem in the study of cancer is the understanding of the heterogeneous nature of the cell population. The clonal evolution of the tumor cells results in the tumors being composed of multiple subpopulations. Each subpopulation reacts differently to any given therapy. This calls for the development of novel (regulatory network) models, which can accommodate heterogeneity in cancerous tissues. In this paper, we present a new approach to model heterogeneity in cancer. We model heterogeneity as an ensemble of deterministic Boolean networks based on prior pathway knowledge. We develop the model considering the use of qPCR data. By observing gene expressions when the tissue is subjected to various stimuli, the compositional breakup of the tissue under study can be determined. We demonstrate the viability of this approach by using our model on synthetic data, and real-world data collected from fibroblasts.
Estimating golden-cheeked warbler immigration: Implications for the spatial scale of conservation
Duarte, A.; Weckerly, F.W.; Schaub, M.; Hatfield, Jeffrey S.
2016-01-01
Understanding the factors that drive population dynamics is fundamental to species conservation and management. Since the golden-cheeked warbler Setophaga chrysoparia was first listed as endangered, much effort has taken place to monitor warbler abundance, occupancy, reproduction and survival. Yet, despite being directly related to local population dynamics, movement rates have not been estimated for the species. We used an integrated population model to investigate the relationship between immigration rate, fledging rate, survival probabilities and population growth rate for warblers in central Texas, USA. Furthermore, using a deterministic projection model, we examined the response required by vital rates to maintain a viable population across varying levels of immigration. Warbler abundance fluctuated with an overall positive trend across years. In the absence of immigration, the abundance would have decreased. However, the population could remain viable without immigration if both adult and juvenile survival increased by almost half or if juvenile survival more than doubled. We also investigated the response required by fledging rates across a range of immigration in order to maintain a viable population. Overall, we found that immigration was required to maintain warbler target populations, indicating that warbler conservation and management programs need to be implemented at larger spatial scales than current efforts to be effective. This study also demonstrates that by using limited data within integrated population models, biologists are able to monitor multiple key demographic parameters simultaneously to gauge the efficacy of strategies designed to maximize warbler viability in a changing landscape.
Epidemic spreading on adaptively weighted scale-free networks.
Sun, Mengfeng; Zhang, Haifeng; Kang, Huiyan; Zhu, Guanghu; Fu, Xinchu
2017-04-01
We introduce three modified SIS models on scale-free networks that take into account variable population size, nonlinear infectivity, adaptive weights, behavior inertia and time delay, so as to better characterize the actual spread of epidemics. We develop new mathematical methods and techniques to study the dynamics of the models, including the basic reproduction number, and the global asymptotic stability of the disease-free and endemic equilibria. We show the disease-free equilibrium cannot undergo a Hopf bifurcation. We further analyze the effects of local information of diseases and various immunization schemes on epidemic dynamics. We also perform some stochastic network simulations which yield quantitative agreement with the deterministic mean-field approach.
Early signatures of regime shifts in gene expression dynamics
NASA Astrophysics Data System (ADS)
Pal, Mainak; Pal, Amit Kumar; Ghosh, Sayantari; Bose, Indrani
2013-06-01
Recently, a large number of studies have been carried out on the early signatures of sudden regime shifts in systems as diverse as ecosystems, financial markets, population biology and complex diseases. The signatures of regime shifts in gene expression dynamics are less systematically investigated. In this paper, we consider sudden regime shifts in the gene expression dynamics described by a fold-bifurcation model involving bistability and hysteresis. We consider two alternative models, models 1 and 2, of competence development in the bacterial population B. subtilis and determine some early signatures of the regime shifts between competence and noncompetence. We use both deterministic and stochastic formalisms for the purpose of our study. The early signatures studied include the critical slowing down as a transition point is approached, rising variance and the lag-1 autocorrelation function, skewness and a ratio of two mean first passage times. Some of the signatures could provide the experimental basis for distinguishing between bistability and excitability as the correct mechanism for the development of competence.
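Two of the generic indicators mentioned above, rising variance and rising lag-1 autocorrelation, are easy to compute in sliding windows. The sketch below uses a toy double-well system slowly driven toward a fold bifurcation, not the competence circuit models of the paper.

```python
import numpy as np

# Two generic early-warning indicators, variance and lag-1 autocorrelation, computed
# in sliding windows over a time series that drifts toward a fold bifurcation. The
# underlying toy double-well system is illustrative, not the competence circuit.
rng = np.random.default_rng(6)
dt, steps = 0.01, 60_000
x, series = 1.0, []
for i in range(steps):
    a = 1.0 - 1.6 * i / steps                         # control parameter slowly approaching the fold
    x += (-x**3 + x + a) * dt + 0.05 * np.sqrt(dt) * rng.normal()
    series.append(x)
series = np.asarray(series)

def window_stats(y, width=5_000):
    var, ac1 = [], []
    for start in range(0, len(y) - width, width):
        w = y[start:start + width]
        var.append(w.var())
        ac1.append(np.corrcoef(w[:-1], w[1:])[0, 1])  # lag-1 autocorrelation
    return np.array(var), np.array(ac1)

variance, autocorr = window_stats(series)
print("variance per window:", variance.round(4))
print("lag-1 autocorrelation per window:", autocorr.round(3))
```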
A mathematical study of a model for childhood diseases with non-permanent immunity
NASA Astrophysics Data System (ADS)
Moghadas, S. M.; Gumel, A. B.
2003-08-01
Protecting children from diseases that can be prevented by vaccination is a primary goal of health administrators. Since vaccination is considered to be the most effective strategy against childhood diseases, the development of a framework that would predict the optimal vaccine coverage level needed to prevent the spread of these diseases is crucial. This paper provides this framework via qualitative and quantitative analysis of a deterministic mathematical model for the transmission dynamics of a childhood disease in the presence of a preventive vaccine that may wane over time. Using global stability analysis of the model, based on constructing a Lyapunov function, it is shown that the disease can be eradicated from the population if the vaccination coverage level exceeds a certain threshold value. It is also shown that the disease will persist within the population if the coverage level is below this threshold. These results are verified numerically by constructing, and then simulating, a robust semi-explicit second-order finite-difference method.
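For orientation, the classical eradication condition for a perfect, lifelong vaccine is p > 1 - 1/R0; with waning immunity only part of the vaccinated cohort remains protected, so the required coverage rises. The correction below assumes a single dose at birth with exponential waning and no boosters, which is a simplification relative to the model in the paper, and the numbers are illustrative.

```python
# Back-of-envelope vaccination thresholds. For a perfect, lifelong vaccine the
# classical condition is p > 1 - 1/R0. If immunity wanes at rate theta while the
# per-capita mortality rate is mu, only a fraction mu/(mu + theta) of children
# vaccinated at birth remain protected at steady state, which raises the required
# coverage. This assumes a single dose at birth with no boosters; numbers are
# illustrative, not those of the cited model.
R0 = 10.0
mu = 1.0 / 70.0       # per-year mortality (70-year life expectancy)
theta = 1.0 / 30.0    # per-year waning rate (30-year mean duration of protection)

p_perfect = 1.0 - 1.0 / R0
p_waning = p_perfect * (mu + theta) / mu
note = "  (eradication impossible by birth vaccination alone)" if p_waning > 1.0 else ""
print(f"perfect vaccine: p > {p_perfect:.2f}; waning vaccine: p > {min(p_waning, 1.0):.2f}{note}")
```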
Stochastic Multi-Timescale Power System Operations With Variable Wind Generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Hongyu; Krad, Ibrahim; Florita, Anthony
This paper describes a novel set of stochastic unit commitment and economic dispatch models that consider stochastic loads and variable generation at multiple operational timescales. The stochastic model includes four distinct stages: stochastic day-ahead security-constrained unit commitment (SCUC), stochastic real-time SCUC, stochastic real-time security-constrained economic dispatch (SCED), and deterministic automatic generation control (AGC). These sub-models are integrated together such that they are continually updated with decisions passed from one to another. The progressive hedging algorithm (PHA) is applied to solve the stochastic models to maintain the computational tractability of the proposed models. Comparative case studies with deterministic approaches are conducted in low wind and high wind penetration scenarios to highlight the advantages of the proposed methodology, one with perfect forecasts and the other with current state-of-the-art but imperfect deterministic forecasts. The effectiveness of the proposed method is evaluated with sensitivity tests using both economic and reliability metrics to provide a broader view of its impact.
Stochastic and Deterministic Models for the Metastatic Emission Process: Formalisms and Crosslinks.
Gomez, Christophe; Hartung, Niklas
2018-01-01
Although the detection of metastases radically changes prognosis of and treatment decisions for a cancer patient, clinically undetectable micrometastases hamper a consistent classification into localized or metastatic disease. This chapter discusses mathematical modeling efforts that could help to estimate the metastatic risk in such a situation. We focus on two approaches: (1) a stochastic framework describing metastatic emission events at random times, formalized via Poisson processes, and (2) a deterministic framework describing the micrometastatic state through a size-structured density function in a partial differential equation model. Three aspects are addressed in this chapter. First, a motivation for the Poisson process framework is presented and modeling hypotheses and mechanisms are introduced. Second, we extend the Poisson model to account for secondary metastatic emission. Third, we highlight an inherent crosslink between the stochastic and deterministic frameworks and discuss its implications. For increased accessibility the chapter is split into an informal presentation of the results using a minimum of mathematical formalism and a rigorous mathematical treatment for more theoretically interested readers.
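The Poisson-process framework can be illustrated by drawing emission times from an intensity that tracks the primary tumour size, here lambda(t) = mu*S(t)^alpha with times generated by thinning. The growth law, exponent and rate constant below are illustrative assumptions, not parameters from the chapter.

```python
import numpy as np

# Metastatic emission as an inhomogeneous Poisson process whose intensity follows the
# primary tumour size, lambda(t) = mu * S(t)**alpha, with emission times drawn by
# thinning. The growth law and all parameter values are illustrative assumptions.
rng = np.random.default_rng(7)
mu, alpha, t_end = 1e-7, 0.66, 1_500.0            # emission coefficient, exponent, horizon (days)

def tumour_size(t, s0=1.0, a=0.012):
    return s0 * np.exp(a * t)                     # exponential primary growth, illustrative

lam_max = mu * tumour_size(t_end) ** alpha        # intensity is increasing, so its maximum is at t_end
times, t = [], 0.0
while True:
    t += rng.exponential(1.0 / lam_max)           # candidate event from the bounding homogeneous process
    if t > t_end:
        break
    if rng.random() < mu * tumour_size(t) ** alpha / lam_max:   # thinning acceptance test
        times.append(t)

print(f"{len(times)} emission event(s); first at day {times[0]:.0f}" if times
      else "no emission before t_end in this realization")
```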
Robust Sensitivity Analysis for Multi-Attribute Deterministic Hierarchical Value Models
2002-03-01
such as the weighted sum method, the weighted product method, and the Analytic Hierarchy Process (AHP). This research focuses on only weighted sum...different groups. They can be termed as deterministic, stochastic, or fuzzy multi-objective decision methods if they are classified according to the...weighted product model (WPM), and analytic hierarchy process (AHP). His method attempts to identify the most important criteria weight and the most...
Dynamic analysis of a stochastic rumor propagation model
NASA Astrophysics Data System (ADS)
Jia, Fangju; Lv, Guangying
2018-01-01
The rapid development of the Internet, especially the emergence of the social networks, leads rumor propagation into a new media era. In this paper, we are concerned with a stochastic rumor propagation model. Sufficient conditions for extinction and persistence in the mean of the rumor are established. The threshold between persistence in the mean and extinction of the rumor is obtained. Compared with the corresponding deterministic model, the threshold affected by the white noise is smaller than the basic reproduction number R0 of the deterministic system.
Miller, David A; Clark, William R; Arnold, Stevan J; Bronikowski, Anne M
2011-08-01
Comparative evaluations of population dynamics in species with temporal and spatial variation in life-history traits are rare because they require long-term demographic time series from multiple populations. We present such an analysis using demographic data collected during the interval 1978-1996 for six populations of western terrestrial garter snakes (Thamnophis elegans) from two evolutionarily divergent ecotypes. Three replicate populations from a slow-living ecotype, found in mountain meadows of northeastern California, were characterized by individuals that develop slowly, mature late, reproduce infrequently with small reproductive effort, and live longer than individuals of three populations of a fast-living ecotype found at lakeshore locales. We constructed matrix population models for each of the populations based on 8-13 years of data per population and analyzed both deterministic dynamics based on mean annual vital rates and stochastic dynamics incorporating annual variation in vital rates. (1) Contributions of highly variable vital rates to fitness (lambda(s)) were buffered against the negative effects of stochastic variation, and this relationship was consistent with differences between the meadow (M-slow) and lakeshore (L-fast) ecotypes. (2) Annual variation in the proportion of gravid females had the greatest negative effect among all vital rates on lambda(s). The magnitude of variation in the proportion of gravid females and its effect on lambda(s) was greater in M-slow than L-fast populations. (3) Variation in the proportion of gravid females, in turn, depended on annual variation in prey availability, and its effect on lambda(s) was 4 23 times greater in M-slow than L-fast populations. In addition to differences in stochastic dynamics between ecotypes, we also found higher mean mortality rates across all age classes in the L-fast populations. Our results suggest that both deterministic and stochastic selective forces have affected the evolution of divergent life-history traits in the two ecotypes, which, in turn, affect population dynamics. M-slow populations have evolved life-history traits that buffer fitness against direct effects of variation in reproduction and that spread lifetime reproduction across a greater number of reproductive bouts. These results highlight the importance of long-term demographic and environmental monitoring and of incorporating temporal dynamics into empirical studies of life-history evolution.
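The distinction between deterministic and stochastic growth rates in such matrix models can be illustrated with two invented annual matrices standing in for good and poor prey years; the stochastic growth rate computed from random matrix products is generally lower than the dominant eigenvalue of the mean matrix.

```python
import numpy as np

# Deterministic versus stochastic growth rate for a two-stage matrix model. The two
# invented matrices stand in for "good" and "poor" prey years; they are not the
# garter-snake vital rates estimated in the study.
rng = np.random.default_rng(8)
good = np.array([[0.00, 1.60], [0.45, 0.80]])   # row 1: fertility; row 2: survival
bad = np.array([[0.00, 0.20], [0.45, 0.80]])    # few gravid females in a poor prey year

mean_matrix = 0.5 * (good + bad)
lambda_det = np.max(np.real(np.linalg.eigvals(mean_matrix)))   # deterministic growth rate

n, log_growth, years = np.array([0.5, 0.5]), 0.0, 20_000
for _ in range(years):
    A = good if rng.random() < 0.5 else bad     # environment drawn independently each year
    n = A @ n
    step_growth = n.sum()
    log_growth += np.log(step_growth)
    n /= step_growth                            # renormalize to avoid overflow
lambda_stoch = np.exp(log_growth / years)       # stochastic growth rate

print(f"deterministic lambda = {lambda_det:.3f}, stochastic lambda_s = {lambda_stoch:.3f}")
```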
Lunelli, Antonella; Pugliese, Andrea; Rizzo, Caterina
2009-07-01
Due to the recent emergence of H5N1 virus, the modelling of pandemic influenza has become a relevant issue. Here we present an SEIR model formulated to simulate a possible outbreak in Italy, analysing its structure and, more generally, the effect of including specific details into a model. These details regard population heterogeneities, such as age and spatial distribution, as well as stochasticity, that regulates the epidemic dynamics when the number of infectives is low. We discuss and motivate the specific modelling choices made when building the model and investigate how the model details influence the predicted dynamics. Our analysis may help in deciding which elements of complexity are worth including in the design of a deterministic model for pandemic influenza, in a balance between, on the one hand, keeping the model computationally efficient and the number of parameters low and, on the other hand, maintaining the necessary realistic features.
NASA Astrophysics Data System (ADS)
Krogh-Madsen, Trine; Kold Taylor, Louise; Skriver, Anne D.; Schaffer, Peter; Guevara, Michael R.
2017-09-01
The transmembrane potential is recorded from small isopotential clusters of 2-4 embryonic chick ventricular cells spontaneously generating action potentials. We analyze the cycle-to-cycle fluctuations in the time between successive action potentials (the interbeat interval or IBI). We also convert an existing model of electrical activity in the cluster, which is formulated as a Hodgkin-Huxley-like deterministic system of nonlinear ordinary differential equations describing five individual ionic currents, into a stochastic model consisting of a population of ˜20 000 independently and randomly gating ionic channels, with the randomness being set by a real physical stochastic process (radio static). This stochastic model, implemented using the Clay-DeFelice algorithm, reproduces the fluctuations seen experimentally: e.g., the coefficient of variation (standard deviation/mean) of IBI is 4.3% in the model vs. the 3.9% average value of the 17 clusters studied. The model also replicates all but one of several other quantitative measures of the experimental results, including the power spectrum and correlation integral of the voltage, as well as the histogram, Poincaré plot, serial correlation coefficients, power spectrum, detrended fluctuation analysis, approximate entropy, and sample entropy of IBI. The channel noise from one particular ionic current (IKs), which has channel kinetics that are relatively slow compared to that of the other currents, makes the major contribution to the fluctuations in IBI. Reproduction of the experimental coefficient of variation of IBI by adding a Gaussian white noise-current into the deterministic model necessitates using an unrealistically high noise-current amplitude. Indeed, a major implication of the modelling results is that, given the wide range of time-scales over which the various species of channels open and close, only a cell-specific stochastic model that is formulated taking into consideration the widely different ranges in the frequency content of the channel-noise produced by the opening and closing of several different types of channels will be able to reproduce precisely the various effects due to membrane noise seen in a particular electrophysiological preparation.
Frasca, Mattia; Sharkey, Kieran J
2016-06-21
Understanding the dynamics of spread of infectious diseases between individuals is essential for forecasting the evolution of an epidemic outbreak or for defining intervention policies. The problem is addressed by many approaches including stochastic and deterministic models formulated at diverse scales (individuals, populations) and different levels of detail. Here we consider discrete-time SIR (susceptible-infectious-removed) dynamics propagated on contact networks. We derive a novel set of 'discrete-time moment equations' for the probability of the system states at the level of individual nodes and pairs of nodes. These equations form a set which we close by introducing appropriate approximations of the joint probabilities appearing in them. For the example case of SIR processes, we formulate two types of model, one assuming statistical independence at the level of individuals and one at the level of pairs. From the pair-based model we then derive a model at the level of the population which captures the behavior of epidemics on homogeneous random networks. With respect to their continuous-time counterparts, the models include a larger number of possible transitions from one state to another and joint probabilities with a larger number of individuals. The approach is validated through numerical simulation over different network topologies.
Sailem, Heba; Bousgouni, Vicky; Cooper, Sam; Bakal, Chris
2014-01-22
One goal of cell biology is to understand how cells adopt different shapes in response to varying environmental and cellular conditions. Achieving a comprehensive understanding of the relationship between cell shape and environment requires a systems-level understanding of the signalling networks that respond to external cues and regulate the cytoskeleton. Classical biochemical and genetic approaches have identified thousands of individual components that contribute to cell shape, but it remains difficult to predict how cell shape is generated by the activity of these components using bottom-up approaches because of the complex nature of their interactions in space and time. Here, we describe the regulation of cellular shape by signalling systems using a top-down approach. We first exploit the shape diversity generated by systematic RNAi screening and comprehensively define the shape space a migratory cell explores. We suggest a simple Boolean model involving the activation of Rac and Rho GTPases in two compartments to explain the basis for all cell shapes in the dataset. Critically, we also generate a probabilistic graphical model to show how cells explore this space in a deterministic, rather than a stochastic, fashion. We validate the predictions made by our model using live-cell imaging. Our work explains how cross-talk between Rho and Rac can generate different cell shapes, and thus morphological heterogeneity, in genetically identical populations.
NASA Astrophysics Data System (ADS)
Sinner, K.; Teasley, R. L.
2016-12-01
Groundwater models serve as integral tools for understanding flow processes and informing stakeholders and policy makers in management decisions. Historically, these models tended towards a deterministic nature, relying on historical data to predict and inform future decisions based on model outputs. This research works towards developing a stochastic method of modeling recharge inputs from pipe main break predictions in an existing groundwater model, which subsequently generates desired outputs incorporating future uncertainty rather than deterministic data. The case study for this research is the Barton Springs segment of the Edwards Aquifer near Austin, Texas. Researchers and water resource professionals have modeled the Edwards Aquifer for decades due to its high water quality, fragile ecosystem, and stakeholder interest. The original case study and model that this research is built upon was developed as a co-design problem with regional stakeholders, and the model outcomes are generated specifically for communication with policy makers and managers. Recently, research in the Barton Springs segment demonstrated a significant contribution of urban, or anthropogenic, recharge to the aquifer, particularly during dry periods, using deterministic data sets. Due to the social and ecological importance of urban water loss to recharge, this study develops an evaluation method to predict pipe breaks and their related recharge contribution within the Barton Springs segment of the Edwards Aquifer. To benefit groundwater management decision processes, the performance measures captured in the model results, such as springflow, head levels, storage, and others, were determined by previous problem-framing elicitation work identifying stakeholder interests and concerns. The results of the previous deterministic model and the stochastic model are compared to determine gains to stakeholder knowledge through the additional modeling.
Saito, Hiroshi; Katahira, Kentaro; Okanoya, Kazuo; Okada, Masato
2014-01-01
The decision making behaviors of humans and animals adapt and then satisfy an "operant matching law" in certain types of tasks. This was first pointed out by Herrnstein in his foraging experiments on pigeons. The matching law has been one landmark for elucidating the underlying processes of decision making and its learning in the brain. An interesting question is whether decisions are made deterministically or probabilistically. Conventional learning models of the matching law are based on the latter idea; they assume that subjects learn choice probabilities of respective alternatives and decide stochastically with the probabilities. However, it is unknown whether the matching law can be accounted for by a deterministic strategy or not. To answer this question, we propose several deterministic Bayesian decision making models that have certain incorrect beliefs about an environment. We claim that a simple model produces behavior satisfying the matching law in static settings of a foraging task but not in dynamic settings. We found that the model that has a belief that the environment is volatile works well in the dynamic foraging task and exhibits undermatching, which is a slight deviation from the matching law observed in many experiments. This model also demonstrates the double-exponential reward history dependency of a choice and a heavier-tailed run-length distribution, as has recently been reported in experiments on monkeys.
Necessary conditions for the emergence of homochirality via autocatalytic self-replication
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stich, Michael; Ribó, Josep M.; Blackmond, Donna G., E-mail: blackmond@scripps.edu
We analyze a recent proposal for spontaneous mirror symmetry breaking based on the coupling of first-order enantioselective autocatalysis and direct production of the enantiomers that invokes a critical role for intrinsic reaction noise. For isolated systems, the racemic state is the unique stable outcome for both stochastic and deterministic dynamics when the system is in compliance with the constraints dictated by the thermodynamics of chemical reaction processes. In open systems, the racemic outcome also results for both stochastic and deterministic dynamics when driving the autocatalysis unidirectionally by external reagents. Nonracemic states can result in the latter only if the reverse reactions are strictly zero: these are kinetically controlled outcomes for small populations and volumes, and can be simulated by stochastic dynamics. However, the stability of the thermodynamic limit proves that the racemic outcome is the unique stable state for strictly irreversible externally driven autocatalysis. These findings contradict the suggestion that the inhibition requirement of the Frank autocatalytic model for the emergence of homochirality may be relaxed in a noise-induced mechanism.
Developing a tuberculosis transmission model that accounts for changes in population health.
Oxlade, Olivia; Schwartzman, Kevin; Benedetti, Andrea; Pai, Madhukar; Heymann, Jody; Menzies, Dick
2011-01-01
Simulation models are useful in policy planning for tuberculosis (TB) control. To accurately assess interventions, important modifiers of the epidemic should be accounted for in evaluative models. Improvements in population health were associated with the declining TB epidemic in the pre-antibiotic era and may be relevant today. The objective of this study was to develop and validate a TB transmission model that accounted for changes in population health. We developed a deterministic TB transmission model, using reported data from the pre-antibiotic era in England. Change in adjusted life expectancy, used as a proxy for general health, was used to determine the rate of change of key epidemiological parameters. Predicted outcomes included risk of TB infection and TB mortality. The model was validated in the setting of the Netherlands and then applied to modern Peru. The model, developed in the setting of England, predicted TB trends in the Netherlands very accurately. The R(2) value for correlation between observed and predicted data was 0.97 and 0.95 for TB infection and mortality, respectively. In Peru, the predicted decline in incidence prior to the expansion of "Directly Observed Treatment Short Course" (The DOTS strategy) was 3.7% per year (observed = 3.9% per year). After DOTS expansion, the predicted decline was very similar to the observed decline of 5.8% per year. We successfully developed and validated a TB model, which uses a proxy for population health to estimate changes in key epidemiology parameters. Population health contributed significantly to improvement in TB outcomes observed in Peru. Changing population health should be incorporated into evaluative models for global TB control.
Oldring, P.K.T.; Castle, L.; O'Mahony, C.; Dixon, J.
2013-01-01
The FACET tool is a probabilistic model to estimate exposure to chemicals in foodstuffs, originating from flavours, additives and food contact materials. This paper demonstrates the use of the FACET tool to estimate exposure to BPA (bisphenol A) from light metal packaging. For exposure to migrants from food packaging, FACET uses industry-supplied data on the occurrence of substances in the packaging, their concentrations and construction of the packaging, which were combined with data from a market research organisation and food consumption data supplied by national database managers. To illustrate the principles, UK packaging data were used together with consumption data from the UK National Diet and Nutrition Survey (NDNS) dietary survey for 19–64 year olds for a refined deterministic verification. The UK data were chosen mainly because the consumption surveys are detailed, data for UK packaging at a detailed level were available and, arguably, the UK population is composed of high consumers of packaged foodstuffs. Exposures were run for each food category that could give rise to BPA from light metal packaging. Consumer loyalty to a particular type of packaging, commonly referred to as packaging loyalty, was set. The BPA extraction levels used for the 15 types of coating chemistries that could release BPA were in the range of 0.00005–0.012 mg dm−2. The estimates of exposure to BPA using FACET for the total diet were 0.0098 (mean) and 0.0466 (97.5th percentile) mg/person/day, corresponding to 0.00013 (mean) and 0.00059 (97.5th percentile) mg kg−1 body weight day−1 for consumers of foods packed in light metal packaging. This is well below the current EFSA (and other recognised bodies) TDI of 0.05 mg kg−1 body weight day. These probabilistic estimates were compared with estimates using a refined deterministic approach drawing on the same input data. The results from FACET for the mean, 95th and 97.5th percentile exposures to BPA lay between the lowest and the highest estimates from the refined deterministic calculations. Since this should be the case, for a fully probabilistic compared with a deterministic approach, it is concluded that the FACET tool has been verified in this example. A recent EFSA draft opinion on exposure to BPA from different sources showed that canned foods were a major contributor and compared results from various models, including those from FACET. The results from FACET were overall conservative. PMID:24405320
Role of demographic stochasticity in a speciation model with sexual reproduction
NASA Astrophysics Data System (ADS)
Lafuerza, Luis F.; McKane, Alan J.
2016-03-01
Recent theoretical studies have shown that demographic stochasticity can greatly increase the tendency of asexually reproducing phenotypically diverse organisms to spontaneously evolve into localized clusters, suggesting a simple mechanism for sympatric speciation. Here we study the role of demographic stochasticity in a model of competing organisms subject to assortative mating. We find that in models with sexual reproduction, noise can also lead to the formation of phenotypic clusters in parameter ranges where deterministic models would lead to a homogeneous distribution. In some cases, noise can have a sizable effect, rendering the deterministic modeling insufficient to understand the phenotypic distribution.
Pudda, Catherine; Boizot, François; Verplanck, Nicolas; Revol-Cavalier, Frédéric; Berthier, Jean; Thuaire, Aurélie
2018-01-01
Particle separation in microfluidic devices is a common problem in sample preparation for biology. Deterministic lateral displacement (DLD) is efficiently implemented as a size-based fractionation technique to separate two populations of particles around a specific size. However, real biological samples contain components of many different sizes, and a single DLD separation step is not sufficient to purify these complex samples. When connecting several DLD modules in series, pressure balancing at the DLD outlets of each step becomes critical to ensure an optimal separation efficiency. A generic microfluidic platform is presented in this paper to optimize pressure balancing when DLD separation is connected either to another DLD module or to a different microfluidic function. This is made possible by generating droplets at T-junctions connected to the DLD outlets. Droplets act as pressure controllers, which simultaneously encapsulate the DLD-sorted particles and balance the output pressures. The optimized pressures to apply on DLD modules and on T-junctions are determined by a general model that ensures the equilibrium of the entire platform. The proposed separation platform is completely modular and reconfigurable since the same predictive model applies to any cascaded DLD modules of the droplet-based cartridge. PMID:29768490
Entrepreneurs, Chance, and the Deterministic Concentration of Wealth
Fargione, Joseph E.; Lehman, Clarence; Polasky, Stephen
2011-01-01
In many economies, wealth is strikingly concentrated. Entrepreneurs–individuals with ownership in for-profit enterprises–comprise a large portion of the wealthiest individuals, and their behavior may help explain patterns in the national distribution of wealth. Entrepreneurs are less diversified and more heavily invested in their own companies than is commonly assumed in economic models. We present an intentionally simplified individual-based model of wealth generation among entrepreneurs to assess the role of chance and determinism in the distribution of wealth. We demonstrate that chance alone, combined with the deterministic effects of compounding returns, can lead to unlimited concentration of wealth, such that the percentage of all wealth owned by a few entrepreneurs eventually approaches 100%. Specifically, concentration of wealth results when the rate of return on investment varies by entrepreneur and by time. This result is robust to inclusion of realities such as differing skill among entrepreneurs. The most likely overall growth rate of the economy decreases as businesses become less diverse, suggesting that high concentrations of wealth may adversely affect a country's economic growth. We show that a tax on large inherited fortunes, applied to a small portion of the most fortunate in the population, can efficiently arrest the concentration of wealth at intermediate levels. PMID:21814540
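The core mechanism described in this abstract (compounding of returns that vary by entrepreneur and by time) can be illustrated with a minimal simulation sketch; the population size, horizon and lognormal return distribution below are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(0)

n_entrepreneurs = 10_000           # illustrative population size (assumption)
n_years = 200                      # illustrative horizon (assumption)
wealth = np.ones(n_entrepreneurs)  # everyone starts with equal wealth

for _ in range(n_years):
    # Rate of return varies by entrepreneur and by year (lognormal, assumed).
    annual_return = rng.lognormal(mean=0.0, sigma=0.3, size=n_entrepreneurs)
    wealth *= annual_return        # deterministic compounding of a chance return

top_share = np.sort(wealth)[-n_entrepreneurs // 100:].sum() / wealth.sum()
print(f"Share of total wealth held by the top 1%: {top_share:.1%}")
```

Because the log of wealth performs a random walk under multiplicative returns, the spread of the distribution grows with the horizon, so the top share keeps rising even though all simulated entrepreneurs are statistically identical.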
Scaling theory for the quasideterministic limit of continuous bifurcations.
Kessler, David A; Shnerb, Nadav M
2012-05-01
Deterministic rate equations are widely used in the study of stochastic interacting particle systems. This approach assumes that the inherent noise, associated with the discreteness of the elementary constituents, may be neglected when the number of particles N is large. Accordingly, it fails close to the extinction transition, when the amplitude of stochastic fluctuations is comparable with the size of the population. Here we present a general scaling theory of the transition regime for spatially extended systems. We demonstrate this through a detailed study of two fundamental models for out-of-equilibrium phase transitions: the Susceptible-Infected-Susceptible (SIS) model, which belongs to the directed percolation equivalence class, and the Susceptible-Infected-Recovered (SIR) model, which belongs to the dynamic percolation class. Implementing the Ginzburg criteria, we show that the width of the fluctuation-dominated region scales like N^{-κ}, where N is the number of individuals per site, κ = 2/(d_u - d), and d_u is the upper critical dimension. Other exponents that control the approach to the deterministic limit are shown to be calculable once κ is known. The theory is extended to include the corrections to the front velocity above the transition. It is supported by the results of extensive numerical simulations for systems of various dimensionalities.
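As a worked example of the quoted scaling (taking the textbook upper critical dimensions d_u = 4 for directed percolation and d_u = 6 for dynamic percolation, values assumed from the general literature rather than stated in the abstract):

```latex
\[
  \kappa = \frac{2}{d_u - d}:\qquad
  \text{SIS (DP), } d = 1:\ \kappa = \frac{2}{4-1} = \frac{2}{3};
  \qquad
  \text{SIR (dynamic percolation), } d = 2:\ \kappa = \frac{2}{6-2} = \frac{1}{2},
\]
% so the width of the fluctuation-dominated region shrinks as N^{-2/3} and
% N^{-1/2}, respectively, as the number of individuals per site N grows.
```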
2012-01-01
Background: The impact of weather and climate on malaria transmission has attracted considerable attention in recent years, yet uncertainties around future disease trends under climate change remain. Mathematical models provide powerful tools for addressing such questions and understanding the implications for interventions and eradication strategies, but these require realistic modeling of the vector population dynamics and its response to environmental variables. Methods: Published and unpublished field and experimental data are used to develop new formulations for modeling the relationships between key aspects of vector ecology and environmental variables. These relationships are integrated within a validated deterministic model of Anopheles gambiae s.s. population dynamics to provide a valuable tool for understanding vector response to biotic and abiotic variables. Results: A novel, parsimonious framework for assessing the effects of rainfall, cloudiness, wind speed, desiccation, temperature, relative humidity and density-dependence on vector abundance is developed, allowing ease of construction, analysis, and integration into malaria transmission models. Model validation shows good agreement with longitudinal vector abundance data from Tanzania, suggesting that recent malaria reductions in certain areas of Africa could be due to changing environmental conditions affecting vector populations. Conclusions: Mathematical models provide a powerful, explanatory means of understanding the role of environmental variables on mosquito populations and hence for predicting future malaria transmission under global change. The framework developed provides a valuable advance in this respect, but also highlights key research gaps that need to be resolved if we are to better understand future malaria risk in vulnerable communities. PMID:22877154
Optimal control of population recovery--the role of economic restoration threshold.
Lampert, Adam; Hastings, Alan
2014-01-01
A variety of ecological systems around the world have been damaged in recent years, either by natural factors such as invasive species, storms and global change or by direct human activities such as overfishing and water pollution. Restoration of these systems to provide ecosystem services entails significant economic benefits. Thus, choosing how and when to restore in an optimal fashion is important, but has not been well studied. Here we examine a general model where population growth can be induced or accelerated by investing in active restoration. We show that the most cost-effective method to restore an ecosystem dictates investment until the population approaches an 'economic restoration threshold', a density above which the ecosystem should be left to recover naturally. Therefore, determining this threshold is a key general approach for guiding efficient restoration management, and we demonstrate how to calculate this threshold for both deterministic and stochastic ecosystems. © 2013 John Wiley & Sons Ltd/CNRS.
Mwanga, Gasper G; Haario, Heikki; Capasso, Vicenzo
2015-03-01
The main scope of this paper is to study the optimal control practices of malaria, by discussing the implementation of a catalog of optimal control strategies in the presence of parameter uncertainties, which is typical of infectious disease data. In this study we focus on a deterministic mathematical model for the transmission of malaria, including in particular asymptomatic carriers and two age classes in the human population. A partial qualitative analysis of the relevant ODE system has been carried out, leading to a realistic threshold parameter. For the deterministic model under consideration, four possible control strategies have been analyzed: the use of long-lasting insecticide-treated mosquito nets, indoor residual spraying, and screening and treatment of symptomatic and asymptomatic individuals. The numerical results show that using optimal control the disease can be brought to a stable disease-free equilibrium when all four controls are used. The Incremental Cost-Effectiveness Ratio (ICER) for all possible combinations of the disease-control measures is determined. The numerical simulations of the optimal control in the presence of parameter uncertainty demonstrate the robustness of the optimal control: the main conclusions of the optimal control remain unchanged, even if inevitable variability remains in the control profiles. The results provide a promising framework for designing cost-effective strategies for disease control with multiple interventions, even under considerable uncertainty of model parameters. Copyright © 2014 Elsevier Inc. All rights reserved.
Jenouvrier, Stéphanie; Holland, Marika; Stroeve, Julienne; Barbraud, Christophe; Weimerskirch, Henri; Serreze, Mark; Caswell, Hal
2012-09-01
Sea ice conditions in the Antarctic affect the life cycle of the emperor penguin (Aptenodytes forsteri). We present a population projection for the emperor penguin population of Terre Adélie, Antarctica, by linking demographic models (stage-structured, seasonal, nonlinear, two-sex matrix population models) to sea ice forecasts from an ensemble of IPCC climate models. Based on maximum likelihood capture-mark-recapture analysis, we find that seasonal sea ice concentration anomalies (SICa) affect adult survival and breeding success. Demographic models show that both deterministic and stochastic population growth rates are maximized at intermediate values of annual SICa, because neither the complete absence of sea ice, nor heavy and persistent sea ice, would provide satisfactory conditions for the emperor penguin. We show that under some conditions the stochastic growth rate is positively affected by the variance in SICa. We identify an ensemble of five general circulation climate models whose output closely matches the historical record of sea ice concentration in Terre Adélie. The output of this ensemble is used to produce stochastic forecasts of SICa, which in turn drive the population model. Uncertainty is included by incorporating multiple climate models and by a parametric bootstrap procedure that includes parameter uncertainty due to both model selection and estimation error. The median of these simulations predicts a decline of the Terre Adélie emperor penguin population of 81% by the year 2100. We find a 43% chance of an even greater decline, of 90% or more. The uncertainty in population projections reflects large differences among climate models in their forecasts of future sea ice conditions. One such model predicts population increases over much of the century, but overall, the ensemble of models predicts that population declines are far more likely than population increases. We conclude that climate change is a significant risk for the emperor penguin. Our analytical approach, in which demographic models are linked to IPCC climate models, is powerful and generally applicable to other species and systems. © 2012 Blackwell Publishing Ltd.
Robust planning of dynamic wireless charging infrastructure for battery electric buses
Liu, Zhaocai; Song, Ziqi
2017-10-01
Battery electric buses with zero tailpipe emissions have great potential in improving environmental sustainability and livability of urban areas. However, the problems of high cost and limited range associated with on-board batteries have substantially limited the popularity of battery electric buses. The technology of dynamic wireless power transfer (DWPT), which provides bus operators with the ability to charge buses while in motion, may be able to effectively alleviate the drawbacks of electric buses. In this paper, we address the problem of simultaneously selecting the optimal location of the DWPT facilities and designing the optimal battery sizes of electric buses for a DWPT electric bus system. The problem is first constructed as a deterministic model in which the uncertainty of energy consumption and travel time of electric buses is neglected. The methodology of robust optimization (RO) is then adopted to address the uncertainty of energy consumption and travel time. The affinely adjustable robust counterpart (AARC) of the deterministic model is developed, and its equivalent tractable mathematical programming is derived. Both the deterministic model and the robust model are demonstrated with a real-world bus system. The results of our study demonstrate that the proposed deterministic model can effectively determine the allocation of DWPT facilities and the battery sizes of electric buses for a DWPT electric bus system; and the robust model can further provide optimal designs that are robust against the uncertainty of energy consumption and travel time for electric buses.
Multi-parametric variational data assimilation for hydrological forecasting
NASA Astrophysics Data System (ADS)
Alvarado-Montero, R.; Schwanenberg, D.; Krahe, P.; Helmke, P.; Klein, B.
2017-12-01
Ensemble forecasting is increasingly applied in flow forecasting systems to provide users with a better understanding of forecast uncertainty and consequently to make better-informed decisions. A common practice in probabilistic streamflow forecasting is to force a deterministic hydrological model with an ensemble of numerical weather predictions. This approach aims at representing meteorological uncertainty but neglects the uncertainty of the hydrological model as well as that of its initial conditions. Complementary approaches use probabilistic data assimilation techniques to obtain a variety of initial states, or represent model uncertainty by model pools instead of single deterministic models. This paper introduces a novel approach that extends a variational data assimilation based on Moving Horizon Estimation to enable the assimilation of observations into multi-parametric model pools. It results in a probabilistic estimate of initial model states that takes into account the parametric model uncertainty in the data assimilation. The assimilation technique is applied to the uppermost area of the River Main in Germany. We use different parametric pools, each of them with five parameter sets, to assimilate streamflow data as well as remotely sensed data from the H-SAF project. We assess the impact of the assimilation on the lead-time performance of perfect forecasts (i.e. observed data as forcing variables) as well as of deterministic and probabilistic forecasts from ECMWF. The multi-parametric assimilation shows an improvement of up to 23% in CRPS performance and approximately 20% in Brier Skill Scores with respect to the deterministic approach. It also improves the skill of the forecast in terms of rank histogram and produces a narrower ensemble spread.
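For readers unfamiliar with the skill measure cited above, the sketch below shows how the CRPS of an ensemble forecast against a single observation is typically computed; the ensemble members and observation are invented numbers, not values from the River Main study:

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample CRPS of an ensemble forecast against one observation:
    mean |x_i - y| - 0.5 * mean |x_i - x_j| (standard ensemble estimator)."""
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2

# Toy usage: a five-member streamflow forecast in m3/s (illustrative values).
ensemble = [102.0, 110.0, 95.0, 120.0, 105.0]
print(f"CRPS = {crps_ensemble(ensemble, obs=100.0):.2f} m3/s")
```

Lower CRPS is better; improvements such as the 23% reported above are relative reductions of this score averaged over forecast dates and lead times.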
Martinez, Alexander S.; Faist, Akasha M.
2016-01-01
Background: Understanding patterns of biodiversity is a longstanding challenge in ecology. Similar to other biotic groups, arthropod community structure can be shaped by deterministic and stochastic processes, with limited understanding of what moderates the relative influence of these processes. Disturbances have been noted to alter the relative influence of deterministic and stochastic processes on community assembly in various study systems, implicating ecological disturbances as a potential moderator of these forces. Methods: Using a disturbance gradient along a 5-year chronosequence of insect-induced tree mortality in a subalpine forest of the southern Rocky Mountains, Colorado, USA, we examined changes in community structure and relative influences of deterministic and stochastic processes in the assembly of aboveground (surface and litter-active species) and belowground (species active in organic and mineral soil layers) arthropod communities. Arthropods were sampled for all years of the chronosequence via pitfall traps (aboveground community) and modified Winkler funnels (belowground community) and sorted to morphospecies. Community structure of both communities was assessed via comparisons of morphospecies abundance, diversity, and composition. Assembly processes were inferred from a mixture of linear models and matrix correlations testing for community associations with environmental properties, and from null-deviation models comparing observed vs. expected levels of species turnover (beta diversity) among samples. Results: Tree mortality altered community structure in both aboveground and belowground arthropod communities, but null models suggested that aboveground communities experienced greater relative influences of deterministic processes, while the relative influence of stochastic processes increased for belowground communities. Additionally, Mantel tests and linear regression models revealed significant associations between the aboveground arthropod communities and vegetation and soil properties, but no significant association among belowground arthropod communities and environmental factors. Discussion: Our results suggest context-dependent influences of stochastic and deterministic community assembly processes across different fractions of a spatially co-occurring ground-dwelling arthropod community following disturbance. This variation in assembly may be linked to contrasting ecological strategies and dispersal rates within above- and below-ground communities. Our findings add to a growing body of evidence indicating concurrent influences of stochastic and deterministic processes in community assembly, and highlight the need to consider potential variation across different fractions of biotic communities when testing community ecology theory and considering conservation strategies. PMID:27761333
Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel
2013-06-01
Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.
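The final step of any such exact method is a non-domination filter over the pooled candidate solutions; a minimal sketch of that filter is given below (the k-best enumeration for each single-objective problem, which the paper requires for its completeness guarantees, is not shown, and the candidate values are invented):

```python
def pareto_front(candidates):
    """Return the non-dominated subset of a list of objective vectors
    (minimization in every objective). This is only the final filtering
    step; completeness of the front depends on how the candidate pool
    was generated."""
    front = []
    for c in candidates:
        dominated = any(
            all(o <= ci for o, ci in zip(other, c)) and other != c
            for other in candidates
        )
        if not dominated:
            front.append(c)
    return front

# Toy usage: pooled candidates for two objectives (illustrative values).
candidates = [(1, 9), (2, 7), (3, 8), (4, 4), (6, 3), (9, 1), (9, 9)]
print(pareto_front(candidates))   # [(1, 9), (2, 7), (4, 4), (6, 3), (9, 1)]
```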
NASA Astrophysics Data System (ADS)
Wang, Fengyu
Traditional deterministic reserve requirements rely on ad-hoc, rule-of-thumb methods to determine adequate reserves in order to ensure a reliable unit commitment. Since congestion and uncertainties exist in the system, both the quantity and the location of reserves are essential to ensure system reliability and market efficiency. Existing deterministic reserve requirements acquire operating reserves on a zonal basis and do not fully capture the impact of congestion. The purpose of a reserve zone is to ensure that operating reserves are spread across the network. Operating reserves are shared inside each reserve zone, but intra-zonal congestion may block the deliverability of operating reserves within a zone. Thus, improving reserve policies such as reserve zones may improve the location and deliverability of reserves. As more non-dispatchable renewable resources are integrated into the grid, it will become increasingly difficult to predict the transfer capabilities and the network congestion. At the same time, renewable resources require operators to acquire more operating reserves. With existing deterministic reserve requirements unable to ensure optimal reserve locations, the importance of reserve location and reserve deliverability will increase. While stochastic programming can be used to determine reserves by explicitly modelling uncertainties, there are still scalability as well as pricing issues. Therefore, new methods to improve existing deterministic reserve requirements are desired. One key barrier to improving existing deterministic reserve requirements is the potential market impact. A metric, quality of service, is proposed in this thesis to evaluate the price signal and market impacts of proposed hourly reserve zones. The three main goals of this thesis are: 1) to develop a theoretical and mathematical model to better locate reserves while maintaining the deterministic unit commitment and economic dispatch structure, especially with the consideration of renewables; 2) to develop a market settlement scheme for the proposed dynamic reserve policies such that market efficiency is improved; and 3) to evaluate the market impacts and price signals of the proposed dynamic reserve policies.
Forecasting extinction risk with nonstationary matrix models.
Gotelli, Nicholas J; Ellison, Aaron M
2006-02-01
Matrix population growth models are standard tools for forecasting population change and for managing rare species, but they are less useful for predicting extinction risk in the face of changing environmental conditions. Deterministic models provide point estimates of lambda, the finite rate of increase, as well as measures of matrix sensitivity and elasticity. Stationary matrix models can be used to estimate extinction risk in a variable environment, but they assume that the matrix elements are randomly sampled from a stationary (i.e., non-changing) distribution. Here we outline a method for using nonstationary matrix models to construct realistic forecasts of population fluctuation in changing environments. Our method requires three pieces of data: (1) field estimates of transition matrix elements, (2) experimental data on the demographic responses of populations to altered environmental conditions, and (3) forecasting data on environmental drivers. These three pieces of data are combined to generate a series of sequential transition matrices that emulate a pattern of long-term change in environmental drivers. Realistic estimates of population persistence and extinction risk can be derived from stochastic permutations of such a model. We illustrate the steps of this analysis with data from two populations of Sarracenia purpurea growing in northern New England. Sarracenia purpurea is a perennial carnivorous plant that is potentially at risk of local extinction because of increased nitrogen deposition. Long-term monitoring records or models of environmental change can be used to generate time series of driver variables under different scenarios of changing environments. Both manipulative and natural experiments can be used to construct a linking function that describes how matrix parameters change as a function of the environmental driver. This synthetic modeling approach provides quantitative estimates of extinction probability that have an explicit mechanistic basis.
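A hedged sketch of the three-ingredient recipe described above, for a toy two-stage population; the matrix entries, linking function, and driver trajectory are invented placeholders rather than the Sarracenia estimates:

```python
import numpy as np

rng = np.random.default_rng(1)

def transition_matrix(driver):
    """Toy linking function: matrix elements respond linearly to an
    environmental driver (e.g. nitrogen deposition). Purely illustrative."""
    fecundity = max(0.0, 2.0 - 1.5 * driver)
    survival_juv = 0.30
    survival_adult = max(0.0, 0.85 - 0.10 * driver)
    return np.array([[0.0,          fecundity],
                     [survival_juv, survival_adult]])

def falls_below(drivers, n0, quasi_extinction=10.0):
    """Project the population through a sequence of driver values and
    report whether it drops below a quasi-extinction threshold."""
    n = np.asarray(n0, dtype=float)
    for d in drivers:
        n = transition_matrix(d) @ n
        if n.sum() < quasi_extinction:
            return True
    return False

# A forecast driver trajectory plus stochastic permutations of it, mirroring
# the "sequential matrices + stochastic permutations" idea in the abstract.
baseline = np.linspace(0.2, 1.0, 50)            # gradually worsening driver
risk = np.mean([falls_below(rng.permutation(baseline), n0=[50, 100])
                for _ in range(1000)])
print(f"Estimated quasi-extinction risk: {risk:.2f}")
```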
Felton, Shilo K.; Hostetter, Nathan J.; Pollock, Kenneth H.; Simons, Theodore R.
2017-01-01
In populations of long-lived species, adult survival typically has a relatively high influence on population growth. From a management perspective, however, adult survival can be difficult to increase in some instances, so other component rates must be considered to reverse population declines. In North Carolina, USA, management to conserve the American Oystercatcher (Haematopus palliatus) targets component vital rates related to fecundity, specifically nest and chick survival. The effectiveness of such a management approach in North Carolina was assessed by creating a three-stage female-based deterministic matrix model. Isoclines were produced from the matrix model to evaluate the minimum nest and chick survival rates necessary to reverse population decline, assuming all other vital rates remained stable at mean values. Assuming accurate vital rates, breeding populations within North Carolina appear to be declining. To reverse this decline, combined nest and chick survival would need to increase from 0.14 to at least 0.27, a rate that appears to be attainable based on historical estimates. Results are heavily dependent on assumptions of other vital rates, most notably adult survival, revealing the need for accurate estimates of all vital rates to inform management actions. This approach provides valuable insights for evaluating conservation goals for species of concern.
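A minimal sketch of the isocline idea for a toy three-stage, female-based matrix; every vital rate below is an invented placeholder (the study's actual estimates are not reproduced here), so the threshold it prints is illustrative only:

```python
import numpy as np

def growth_rate(nest_surv, chick_surv, eggs_per_female=1.2,
                juvenile_surv=0.6, subadult_surv=0.8, adult_surv=0.85):
    """Dominant eigenvalue (lambda) of a toy juvenile/subadult/adult matrix.
    Fecundity is female fledglings produced per adult female per year."""
    fecundity = eggs_per_female * nest_surv * chick_surv
    A = np.array([[0.0,           0.0,           fecundity ],
                  [juvenile_surv, 0.0,           0.0       ],
                  [0.0,           subadult_surv, adult_surv]])
    return max(abs(np.linalg.eigvals(A)))

# Scan the combined nest x chick survival product until lambda first reaches 1
# (the matrix-model analogue of the isoclines described in the abstract).
for combined in np.arange(0.10, 0.40, 0.01):
    if growth_rate(np.sqrt(combined), np.sqrt(combined)) >= 1.0:
        print(f"lambda reaches 1 at combined nest x chick survival ~ {combined:.2f}")
        break
```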
Study on the evaluation method for fault displacement based on characterized source model
NASA Astrophysics Data System (ADS)
Tonagi, M.; Takahama, T.; Matsumoto, Y.; Inoue, N.; Irikura, K.; Dalguer, L. A.
2016-12-01
IAEA Specific Safety Guide (SSG) 9 states that probabilistic methods for evaluating fault displacement should be used if the deterministic methodology provides no sufficient basis to decide conclusively that the fault is not capable. In addition, the International Seismic Safety Centre has compiled an ANNEX to SSG-9 on the realization of seismic hazard for nuclear facilities, which shows the utility of deterministic and probabilistic evaluation methods for fault displacement. In Japan, important nuclear facilities are required to be established on ground where fault displacement will not arise when earthquakes occur in the future. Under these circumstances, and based on these requirements, we need to develop evaluation methods for fault displacement to enhance the safety of nuclear facilities. We are studying deterministic and probabilistic methods with tentative analyses using observed records, such as surface fault displacements and near-fault strong ground motions, from inland crustal earthquakes in which fault displacements arose. In this study, we introduce the concept of evaluation methods for fault displacement. We then show part of the tentative analysis results for the deterministic method as follows: (1) For the 1999 Chi-Chi earthquake, referring to the slip distribution estimated by waveform inversion, we construct a characterized source model (Miyake et al., 2003, BSSA) that can explain the observed near-fault broadband strong ground motions. (2) Referring to the characterized source model constructed in (1), we study an evaluation method for surface fault displacement using a hybrid method that combines the particle method and the distinct element method. Finally, we suggest a deterministic method to evaluate fault displacement based on a characterized source model. This research was part of the 2015 research project `Development of evaluating method for fault displacement` by the Secretariat of the Nuclear Regulation Authority (S/NRA), Japan.
Effect of Uncertainty on Deterministic Runway Scheduling
NASA Technical Reports Server (NTRS)
Gupta, Gautam; Malik, Waqar; Jung, Yoon C.
2012-01-01
Active runway scheduling involves scheduling departures for takeoffs and arrivals for runway crossing subject to numerous constraints. This paper evaluates the effect of uncertainty on a deterministic runway scheduler. The evaluation is done against a first-come-first-served (FCFS) scheme. In particular, the sequence from the deterministic scheduler is frozen and the times are adjusted to satisfy all separation criteria; this approach is tested against FCFS. The comparison is done for both system performance (throughput and system delay) and predictability, and varying levels of congestion are considered. Uncertainty is modeled in two ways: as equal uncertainty in runway availability for all aircraft, and as increasing uncertainty for later aircraft. Results indicate that the deterministic approach consistently performs better than first-come-first-served in both system performance and predictability.
Stochastic and deterministic multiscale models for systems biology: an auxin-transport case study.
Twycross, Jamie; Band, Leah R; Bennett, Malcolm J; King, John R; Krasnogor, Natalio
2010-03-26
Stochastic and asymptotic methods are powerful tools in developing multiscale systems biology models; however, little has been done in this context to compare the efficacy of these methods. The majority of current systems biology modelling research, including that of auxin transport, uses numerical simulations to study the behaviour of large systems of deterministic ordinary differential equations, with little consideration of alternative modelling frameworks. In this case study, we solve an auxin-transport model using analytical methods, deterministic numerical simulations and stochastic numerical simulations. Although the three approaches in general predict the same behaviour, the approaches provide different information that we use to gain distinct insights into the modelled biological system. We show in particular that the analytical approach readily provides straightforward mathematical expressions for the concentrations and transport speeds, while the stochastic simulations naturally provide information on the variability of the system. Our study provides a constructive comparison which highlights the advantages and disadvantages of each of the considered modelling approaches. This will prove helpful to researchers when weighing up which modelling approach to select. In addition, the paper goes some way to bridging the gap between these approaches, which in the future we hope will lead to integrative hybrid models.
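A minimal sketch of the kind of deterministic/stochastic comparison described above, for a single generic transport step A → B rather than the paper's full auxin-transport model; the rate constant, copy number and horizon are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
k = 0.1            # transport rate per molecule per unit time (assumption)
A0, T = 100, 30.0  # initial copy number and horizon (assumptions)

# Deterministic solution of dA/dt = -k*A for the single transport step.
A_det_final = A0 * np.exp(-k * T)

def gillespie(A0, k, T):
    """Exact stochastic simulation of the single reaction A -> B."""
    t, A = 0.0, A0
    while A > 0:
        t += rng.exponential(1.0 / (k * A))   # waiting time, propensity k*A
        if t > T:
            break
        A -= 1
    return A

finals = [gillespie(A0, k, T) for _ in range(500)]
print(f"deterministic A(T) = {A_det_final:.1f}, "
      f"stochastic mean = {np.mean(finals):.1f}, sd = {np.std(finals):.1f}")
```

The stochastic runs track the deterministic mean but additionally report the run-to-run variability, which is the kind of extra information the abstract attributes to stochastic simulation.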
Cao, Pengxing; Tan, Xiahui; Donovan, Graham; Sanderson, Michael J; Sneyd, James
2014-08-01
The inositol trisphosphate receptor (IP3R) is one of the most important cellular components responsible for oscillations in the cytoplasmic calcium concentration. Over the past decade, two major questions about the IP3R have arisen. Firstly, how best should the IP3R be modeled? In other words, what fundamental properties of the IP3R allow it to perform its function, and what are their quantitative properties? Secondly, although calcium oscillations are caused by the stochastic opening and closing of small numbers of IP3Rs, is it possible for a deterministic model to be a reliable predictor of calcium behavior? Here, we answer these two questions, using airway smooth muscle cells (ASMC) as a specific example. Firstly, we show that periodic calcium waves in ASMC, as well as the statistics of calcium puffs in other cell types, can be quantitatively reproduced by a two-state model of the IP3R, and thus the behavior of the IP3R is essentially determined by its modal structure. The structure within each mode is irrelevant for function. Secondly, we show that, although calcium waves in ASMC are generated by a stochastic mechanism, IP3R stochasticity is not essential for a qualitative prediction of how oscillation frequency depends on model parameters, and thus deterministic IP3R models demonstrate the same level of predictive capability as do stochastic models. We conclude that, firstly, calcium dynamics can be accurately modeled using simplified IP3R models, and, secondly, to obtain qualitative predictions of how oscillation frequency depends on parameters it is sufficient to use a deterministic model.
Observations, theoretical ideas and modeling of turbulent flows: Past, present and future
NASA Technical Reports Server (NTRS)
Chapman, G. T.; Tobak, M.
1985-01-01
Turbulence was analyzed in a historical context featuring the interactions between observations, theoretical ideas, and modeling within three successive movements. These are identified as predominantly statistical, structural and deterministic. The statistical movement is criticized for its failure to deal with the structural elements observed in turbulent flows. The structural movement is criticized for its failure to embody observed structural elements within a formal theory. The deterministic movement is described as having the potential of overcoming these deficiencies by allowing structural elements to exhibit chaotic behavior that is nevertheless embodied within a theory. Four major ideas of this movement are described: bifurcation theory, strange attractors, fractals, and the renormalization group. A framework for the future study of turbulent flows is proposed, based on the premises of the deterministic movement.
Gascuel, Fanny; Choisy, Marc; Duplantier, Jean-Marc; Débarre, Florence; Brouat, Carine
2013-01-01
Although bubonic plague is an endemic zoonosis in many countries around the world, the factors responsible for the persistence of this highly virulent disease remain poorly known. Classically, the endemic persistence of plague is suspected to be due to the coexistence of plague resistant and plague susceptible rodents in natural foci, and/or to a metapopulation structure of reservoirs. Here, we test separately the effect of each of these factors on the long-term persistence of plague. We analyse the dynamics and equilibria of a model of plague propagation, consistent with plague ecology in Madagascar, a major focus where this disease is endemic since the 1920s in central highlands. By combining deterministic and stochastic analyses of this model, and including sensitivity analyses, we show that (i) endemicity is favoured by intermediate host population sizes, (ii) in large host populations, the presence of resistant rats is sufficient to explain long-term persistence of plague, and (iii) the metapopulation structure of susceptible host populations alone can also account for plague endemicity, thanks to both subdivision and the subsequent reduction in the size of subpopulations, and extinction-recolonization dynamics of the disease. In the light of these results, we suggest scenarios to explain the localized presence of plague in Madagascar. PMID:23675291
Billiard, Sylvain; Castric, Vincent; Vekemans, Xavier
2007-03-01
We developed a general model of sporophytic self-incompatibility under negative frequency-dependent selection allowing complex patterns of dominance among alleles. We used this model deterministically to investigate the effects on equilibrium allelic frequencies of the number of dominance classes, the number of alleles per dominance class, the asymmetry in dominance expression between pollen and pistil, and whether selection acts on male fitness only or both on male and on female fitnesses. We show that the so-called "recessive effect" occurs under a wide variety of situations. We found emerging properties of finite population models with several alleles per dominance class such as that higher numbers of alleles are maintained in more dominant classes and that the number of dominance classes can evolve. We also investigated the occurrence of homozygous genotypes and found that substantial proportions of those can occur for the most recessive alleles. We used the model for two species with complex dominance patterns to test whether allelic frequencies in natural populations are in agreement with the distribution predicted by our model. We suggest that the model can be used to test explicitly for additional, allele-specific, selective forces.
Two Strain Dengue Model with Temporary Cross Immunity and Seasonality
NASA Astrophysics Data System (ADS)
Aguiar, Maíra; Ballesteros, Sebastien; Stollenwerk, Nico
2010-09-01
Models on dengue fever epidemiology have previously shown critical fluctuations with power-law distributions and also deterministic chaos in some parameter regions due to the multi-strain structure of the disease pathogen. In our first model including well-known biological features, we found a rich dynamical structure including limit cycles, symmetry-breaking bifurcations, torus bifurcations, coexisting attractors including isola solutions, and deterministic chaos (as indicated by positive Lyapunov exponents) in a much larger parameter region, which is also biologically more plausible than the previous results of other researchers. Based on these findings we will investigate the model structures further, including seasonality.
The threshold of a stochastic delayed SIR epidemic model with vaccination
NASA Astrophysics Data System (ADS)
Liu, Qun; Jiang, Daqing
2016-11-01
In this paper, we study the threshold dynamics of a stochastic delayed SIR epidemic model with vaccination. We obtain sufficient conditions for extinction and persistence in the mean of the epidemic. The threshold between persistence in the mean and extinction of the stochastic system is also obtained. Compared with the corresponding deterministic model, the threshold affected by the white noise is smaller than the basic reproduction number R̄0 of the deterministic system. Results show that time delay has important effects on the persistence and extinction of the epidemic.
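A hedged Euler-Maruyama sketch of how such a deterministic/stochastic comparison can be set up; the delay and vaccination terms of the paper's model are omitted here, and every parameter value (including the noise intensity) is an invented placeholder:

```python
import numpy as np

rng = np.random.default_rng(3)

beta, gamma, mu = 0.4, 0.2, 0.02   # transmission, recovery, demography (assumed)
dt, n_steps = 0.01, 100_000        # time step and horizon (assumed)

def final_infected(sigma):
    """Integrate a noisy SIR model in population fractions; sigma = 0
    recovers the deterministic trajectory."""
    S, I = 0.99, 0.01
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))
        new_inf = beta * S * I * dt + sigma * S * I * dW   # perturbed transmission
        S += mu * dt - new_inf - mu * S * dt
        I += new_inf - (gamma + mu) * I * dt
        S, I = min(max(S, 0.0), 1.0), max(I, 0.0)
    return I

print(f"deterministic R0 = {beta / (gamma + mu):.2f}")
print(f"final infected fraction, sigma = 0.0: {final_infected(0.0):.4f}")
print(f"final infected fraction, sigma = 0.8: {final_infected(0.8):.4f}")
```

Sweeping sigma in a sketch like this gives a direct numerical feel for the paper's point that white noise pushes the persistence threshold below the deterministic reproduction number.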
Stomp, A M
1994-01-01
To meet the demands for goods and services of an exponentially growing human population, global ecosystems will come under increasing human management. The hallmark of successful ecosystem management will be long-term ecosystem stability. Ecosystems and the genetic information and processes which underlie interactions of organisms with the environment in populations and communities exhibit behaviors which have nonlinear characteristics. Nonlinear mathematical formulations describing deterministic chaos have been used successfully to model such systems in physics, chemistry, economics, physiology, and epidemiology. This approach can be extended to ecotoxicology and can be used to investigate how changes in genetic information determine the behavior of populations and communities. This article seeks to provide the arguments for such an approach and to give initial direction to the search for the boundary conditions within which lies ecosystem stability. The identification of a theoretical framework for ecotoxicology and the parameters which drive the underlying model is a critical component in the formulation of a prioritized research agenda and appropriate ecosystem management policy and regulation. PMID:7713038
On the deterministic and stochastic use of hydrologic models
Farmer, William H.; Vogel, Richard M.
2016-01-01
Environmental simulation models, such as precipitation-runoff watershed models, are increasingly used in a deterministic manner for environmental and water resources design, planning, and management. In operational hydrology, simulated responses are now routinely used to plan, design, and manage a very wide class of water resource systems. However, all such models are calibrated to existing data sets and retain some residual error. This residual, typically unknown in practice, is often ignored, implicitly trusting simulated responses as if they are deterministic quantities. In general, ignoring the residuals will result in simulated responses with distributional properties that do not mimic those of the observed responses. This discrepancy has major implications for the operational use of environmental simulation models as is shown here. Both a simple linear model and a distributed-parameter precipitation-runoff model are used to document the expected bias in the distributional properties of simulated responses when the residuals are ignored. The systematic reintroduction of residuals into simulated responses in a manner that produces stochastic output is shown to improve the distributional properties of the simulated responses. Every effort should be made to understand the distributional behavior of simulation residuals and to use environmental simulation models in a stochastic manner.
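A minimal sketch of the residual-reintroduction idea, with a deliberately imperfect linear model standing in for a calibrated watershed model; all series below are synthetic and only meant to show how ignoring residuals shrinks the spread of simulated responses:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic "observed" streamflow and an imperfect linear stand-in model.
precip = rng.gamma(shape=2.0, scale=5.0, size=2000)
observed = 0.6 * precip + rng.normal(0.0, 2.0, size=2000)
simulated = 0.55 * precip + 1.0                 # deterministic simulation

residuals = observed - simulated                # retained calibration error

# Deterministic use: trust `simulated` as-is.
# Stochastic use: reintroduce resampled residuals into the simulation.
stochastic = simulated + rng.choice(residuals, size=simulated.size, replace=True)

for name, series in [("observed", observed), ("deterministic", simulated),
                     ("stochastic", stochastic)]:
    print(f"{name:13s} std = {series.std():.2f}, "
          f"95th pct = {np.percentile(series, 95):.2f}")
```

The deterministic output understates the spread and upper quantiles of the observations, while the residual-augmented output recovers distributional properties closer to those observed, which is the behavior the abstract describes.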
Ordinal optimization and its application to complex deterministic problems
NASA Astrophysics Data System (ADS)
Yang, Mike Shang-Yu
1998-10-01
We present in this thesis a new perspective to approach a general class of optimization problems characterized by large deterministic complexities. Many problems of real-world concern today lack analyzable structures and almost always involve a high level of difficulty and complexity in the evaluation process. Advances in computer technology allow us to build computer models to simulate the evaluation process through numerical means, but the burden of high complexity remains, taxing the simulation with an exorbitant computing cost for each evaluation. Such a resource requirement makes local fine-tuning of a known design difficult under most circumstances, let alone global optimization. Kolmogorov equivalence of complexity and randomness in computation theory is introduced to resolve this difficulty by converting the complex deterministic model to a stochastic pseudo-model composed of a simple deterministic component and a white-noise-like stochastic term. The resulting randomness is then dealt with by a noise-robust approach called Ordinal Optimization. Ordinal Optimization utilizes Goal Softening and Ordinal Comparison to achieve an efficient and quantifiable selection of designs in the initial search process. The approach is substantiated by a case study of the turbine blade manufacturing process. The problem involves the optimization of the manufacturing process of the integrally bladed rotor in the turbine engines of U.S. Air Force fighter jets. The intertwining interactions among the material, thermomechanical, and geometrical changes make the current FEM approach prohibitively uneconomical in the optimization process. The generalized OO approach to complex deterministic problems is applied here with great success. Empirical results indicate a saving of nearly 95% in the computing cost.
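A minimal sketch of the goal-softening and ordinal-comparison ideas named above; the search-space size, noise level and set sizes are arbitrary assumptions, chosen only to show that ordinal selection stays informative even when the cheap surrogate model is very noisy:

```python
import numpy as np

rng = np.random.default_rng(5)

n_designs, g, s = 1000, 50, 50   # search space, "good enough" set, selected set
true_cost = rng.normal(0.0, 1.0, n_designs)               # unknown in practice
noisy_cost = true_cost + rng.normal(0.0, 2.0, n_designs)  # crude, cheap model

# Goal softening: settle for any of the top-g designs rather than the single best.
good_enough = set(np.argsort(true_cost)[:g])
# Ordinal comparison: rank designs with the noisy model and keep the top-s.
selected = set(np.argsort(noisy_cost)[:s])

overlap = len(good_enough & selected)
print(f"{overlap} of the selected {s} designs are truly in the top {g}")
```

Even with evaluation noise twice the spread of the true costs, the overlap is typically several times what blind random selection of s designs would give, which is the alignment-probability argument behind ordinal comparison.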
Structural stability of nonlinear population dynamics.
Cenci, Simone; Saavedra, Serguei
2018-01-01
In population dynamics, the concept of structural stability has been used to quantify the tolerance of a system to environmental perturbations. Yet, measuring the structural stability of nonlinear dynamical systems remains a challenging task. For the classic Lotka-Volterra dynamics, the linearity of the functional response has made it possible to measure the conditions compatible with a structurally stable system. However, the functional response of biological communities is not always well approximated by deterministic linear functions. Thus, it is unclear to what extent this linear approach can be generalized to other population dynamics models. Here, we show that the same approach used to investigate the classic Lotka-Volterra dynamics, which is called the structural approach, can be applied to a much larger class of nonlinear models. This class covers a large number of nonlinear functional responses that have been intensively investigated both theoretically and experimentally. We also investigate the applicability of the structural approach to stochastic dynamical systems and we provide a measure of structural stability for finite populations. Overall, we show that the structural approach can provide reliable and tractable information about the qualitative behavior of many nonlinear dynamical systems.
Palmer, Tim N.; O’Shea, Michael
2015-01-01
How is the brain configured for creativity? What is the computational substrate for ‘eureka’ moments of insight? Here we argue that creative thinking arises ultimately from a synergy between low-energy stochastic and energy-intensive deterministic processing, and is a by-product of a nervous system whose signal-processing capability per unit of available energy has become highly energy optimised. We suggest that the stochastic component has its origin in thermal (ultimately quantum decoherent) noise affecting the activity of neurons. Without this component, deterministic computational models of the brain are incomplete. PMID:26528173
Heart rate variability as determinism with jump stochastic parameters.
Zheng, Jiongxuan; Skufca, Joseph D; Bollt, Erik M
2013-08-01
We use measured heart rate information (RR intervals) to develop a one-dimensional nonlinear map that describes short-term deterministic behavior in the data. Our study suggests that there is a stochastic parameter with persistence which causes the heart rate and rhythm system to wander about a bifurcation point. We propose a modified circle map with a jump-process noise term as a model which can qualitatively capture this behavior of low-dimensional transient determinism with occasional (stochastically defined) jumps from one deterministic system to another within a one-parameter family of deterministic systems.
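A minimal sketch of a circle map whose frequency parameter occasionally jumps and then persists, in the spirit of the model described above; the map gain, jump probability and parameter range are invented, not values fitted to RR-interval data:

```python
import numpy as np

rng = np.random.default_rng(6)

def circle_map(theta, omega, K=0.9):
    """One step of a circle map; K and the jump statistics below are
    illustrative choices, not estimated parameters."""
    return (theta + omega - (K / (2 * np.pi)) * np.sin(2 * np.pi * theta)) % 1.0

theta, omega = 0.2, 0.30
series = []
for _ in range(2000):
    # Jump process: with small probability the parameter jumps to a new member
    # of the one-parameter family of deterministic maps and then persists.
    if rng.random() < 0.01:
        omega = rng.uniform(0.25, 0.40)
    theta = circle_map(theta, omega)
    series.append(theta)

print(f"mean = {np.mean(series):.3f}, sd = {np.std(series):.3f}")
```

Between jumps the dynamics are purely deterministic; the occasional parameter jumps reproduce the "wandering about a bifurcation point" behavior qualitatively.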
Population profiling in China by gender and age: implication for HIV incidences.
Pan, Yuanyi; Wu, Jianhong
2009-11-18
With the world's largest population, HIV spread in China has been closely watched and widely studied by its government and the international community. One important factor that might contribute to the epidemic is China's large surplus of men, due to its imbalanced sex ratio in newborns. However, the sex ratio in the human population is often assumed to be 1:1 in most studies of sexually transmitted diseases (STDs). Here, a mathematical model is proposed to estimate the population size in each gender and within different stages of reproduction and sexual activity. This population profiling by age and gender will assist in more precise prediction of HIV incidences. The total population is divided into 6 subgroups by gender and age. A deterministic compartmental model is developed to describe birth, death, age and the interactions among different subgroups, with a focus on the preference for newborn boys and its impact on the sex ratios. Data from 2003 to 2007 are used to estimate model parameters, and simulations predict short-term and long-term population profiles. The population of China will move onto a descending track around 2030. Despite the possibly underestimated number of newborns in the last couple of years, model-based simulations show that there will be about 28 million male individuals in 2055 without female partners during their sexually active stages. The birth rate in China must be increased to keep the population viable. But increasing the birth rate without balancing the sex ratio in newborns is problematic, as this will generate a large number of surplus males. Besides other social, economic and psychological issues, the impact of this surplus of males on STD incidences, including HIV infections, must be dealt with as early as possible.
Tag-mediated cooperation with non-deterministic genotype-phenotype mapping
NASA Astrophysics Data System (ADS)
Zhang, Hong; Chen, Shu
2016-01-01
Tag-mediated cooperation provides a helpful framework for resolving evolutionary social dilemmas. However, most of the previous studies have not taken into account genotype-phenotype distinction in tags, which may play an important role in the process of evolution. To take this into consideration, we introduce non-deterministic genotype-phenotype mapping into a tag-based model with spatial prisoner's dilemma. By our definition, the similarity between genotypic tags does not directly imply the similarity between phenotypic tags. We find that the non-deterministic mapping from genotypic tag to phenotypic tag has non-trivial effects on tag-mediated cooperation. Although we observe that high levels of cooperation can be established under a wide variety of conditions especially when the decisiveness is moderate, the uncertainty in the determination of phenotypic tags may have a detrimental effect on the tag mechanism by disturbing the homophilic interaction structure which can explain the promotion of cooperation in tag systems. Furthermore, the non-deterministic mapping may undermine the robustness of the tag mechanism with respect to various factors such as the structure of the tag space and the tag flexibility. This observation warns us about the danger of applying the classical tag-based models to the analysis of empirical phenomena if genotype-phenotype distinction is significant in real world. Non-deterministic genotype-phenotype mapping thus provides a new perspective to the understanding of tag-mediated cooperation.
Joyce Maschinski
2001-01-01
Small populations are threatened with deterministic and stochastic events that can drive the number of individuals below a critical threshold for survival. Long-term studies allow us to increase our understanding of processes required for their conservation. In the past 7 years, the population of the federally endangered Holy Ghost ipomopsis (Ipomopsis sancti-spiritus...
Casciano, Roman; Chulikavit, Maruit; Di Lorenzo, Giuseppe; Liu, Zhimei; Baladi, Jean-Francois; Wang, Xufang; Robertson, Justin; Garrison, Lou
2011-01-01
A recent indirect comparison study showed that sunitinib-refractory metastatic renal cell carcinoma (mRCC) patients treated with everolimus are expected to have improved overall survival outcomes compared to patients treated with sorafenib. This analysis examines the likely cost-effectiveness of everolimus versus sorafenib in this setting from a US payer perspective. A Markov model was developed to simulate a cohort of sunitinib-refractory mRCC patients and to estimate the cost per incremental life-years gained (LYG) and quality-adjusted life-years (QALYs) gained. Markov states included are stable disease without adverse events, stable disease with adverse events, disease progression, and death. Transition probabilities were estimated using a subset of the RECORD-1 patient population receiving everolimus after sunitinib, and a comparable population receiving sorafenib in a single-arm phase II study. Costs of antitumor therapies were based on wholesale acquisition cost. Health state costs accounted for physician visits, tests, adverse events, postprogression therapy, and end-of-life care. The model extrapolated beyond the trial time horizon for up to 6 years based on published trial data. Deterministic and probabilistic sensitivity analyses were conducted. The estimated gain over sorafenib treatment was 1.273 LYs (0.916 QALYs) at an incremental cost of $81,643. The deterministic analysis resulted in an incremental cost-effectiveness ratio (ICER) of $64,155/LYG ($89,160/QALY). The probabilistic sensitivity analysis demonstrated that results were highly consistent across simulations. As the ICER fell within the cost per QALY range for many other widely used oncology medicines, everolimus is projected to be a cost-effective treatment relative to sorafenib for sunitinib-refractory mRCC. Copyright © 2011 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Time Domain and Frequency Domain Deterministic Channel Modeling for Tunnel/Mining Environments.
Zhou, Chenming; Jacksha, Ronald; Yan, Lincan; Reyes, Miguel; Kovalchik, Peter
2017-01-01
Understanding wireless channels in complex mining environments is critical for designing optimized wireless systems operated in these environments. In this paper, we propose two physics-based, deterministic ultra-wideband (UWB) channel models for characterizing wireless channels in mining/tunnel environments - one in the time domain and the other in the frequency domain. For the time domain model, a general Channel Impulse Response (CIR) is derived and the result is expressed in the classic UWB tapped delay line model. The derived time domain channel model takes into account major propagation controlling factors including tunnel or entry dimensions, frequency, polarization, electrical properties of the four tunnel walls, and transmitter and receiver locations. For the frequency domain model, a complex channel transfer function is derived analytically. Based on the proposed physics-based deterministic channel models, channel parameters such as delay spread, multipath component number, and angular spread are analyzed. It is found that, despite the presence of heavy multipath, both channel delay spread and angular spread for tunnel environments are relatively small compared to those of typical indoor environments. The results and findings in this paper have application in the design and deployment of wireless systems in underground mining environments.
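To make the tapped delay line representation concrete, the short sketch below builds a channel impulse response from a handful of hypothetical taps and computes the mean excess delay and RMS delay spread quoted as channel parameters above; the gains and delays are placeholders, not values derived from the paper's analytical model.

```python
import numpy as np

# Tapped delay line CIR: h(t) = sum_k a_k * delta(t - tau_k), with made-up taps.
gains = np.array([1.0, 0.6, 0.35, 0.2, 0.1])          # tap amplitudes a_k
delays_ns = np.array([0.0, 4.0, 9.0, 15.0, 22.0])     # tap delays tau_k in nanoseconds

power = gains ** 2
mean_delay = np.sum(power * delays_ns) / np.sum(power)                    # mean excess delay
rms_spread = np.sqrt(np.sum(power * (delays_ns - mean_delay) ** 2) / np.sum(power))
print(f"mean excess delay = {mean_delay:.2f} ns, RMS delay spread = {rms_spread:.2f} ns")
```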
Travelling Wave Solutions in Multigroup Age-Structured Epidemic Models
NASA Astrophysics Data System (ADS)
Ducrot, Arnaud; Magal, Pierre; Ruan, Shigui
2010-01-01
Age-structured epidemic models have been used to describe either the age of individuals or the age of infection of certain diseases and to determine how these characteristics affect the outcomes and consequences of epidemiological processes. Most results on age-structured epidemic models focus on the existence, uniqueness, and convergence to disease equilibria of solutions. In this paper we investigate the existence of travelling wave solutions in a deterministic age-structured model describing the circulation of a disease within a population consisting of multiple groups. Individuals of each group move by a random walk, modelled by classical Fickian diffusion, and are classified into two subclasses, susceptible and infective. A susceptible individual in a given group can be crisscross infected by direct contact with infective individuals of possibly any group. This process of transmission can depend upon the age of the disease in infected individuals. The goal of this paper is to provide sufficient conditions that ensure the existence of travelling wave solutions for the age-structured epidemic model. The case of two population groups is numerically investigated, which applies to the crisscross transmission of feline immunodeficiency virus (FIV) and some sexually transmitted diseases.
Guardini, Ilario; Deroma, Laura; Salmaso, Daniele; Palese, Alvisa
2011-01-01
The aging of the nursing workforce is a phenomenon that several industrialized countries have been facing for at least a decade. In Italy, for the period between 2011 and 2021, the issue associated with the nursing workforce will not be one of shortage but rather one of aging. The main objective of this study was to estimate aging trends in the employed nurse population for the period 2009-2035 in two Teaching Hospitals (TH1 and TH2) located in the North of Italy. A deterministic mathematical model was developed in order to obtain aging projections for the nursing workforce from 2009 until 2035. Within the next six years, the nurse populations at TH1 and TH2 will show a steady increase in nurses aged over 45 with respect to 2008; specifically, increases of 29.4% vs. 34.1% and of 21.6% vs. 41.4%, respectively. It is hypothesized that TH1 will have the highest proportion of nurses aged over 45 in 2014, whereas for TH2 this trend is estimated to continue until 2021, when nurses aged over 45 will make up 48.8% of the nurse population. The trend may lead to an increase in the number of experienced nurses; however, it should be viewed with concern with respect to physical unsuitability. Nurses aged over 45 currently represent 20% of the workforce at both hospitals. If the trend predicted by the model occurs in the coming years, the problem of nursing workforce aging will have to be addressed, because it involves different expectations but also the perception of different work skills. Nursing management will need to test new strategies for managing staff, and nurses' careers will also need to be redesigned, because contract law looks primarily at the initial stage of working life (specializations, university education, career opportunities) while neglecting the final one (from the 50th year of age to retirement).
Horn, Johannes; Damm, Oliver; Greiner, Wolfgang; Hengel, Hartmut; Kretzschmar, Mirjam E; Siedler, Anette; Ultsch, Bernhard; Weidemann, Felix; Wichmann, Ole; Karch, André; Mikolajczyk, Rafael T
2018-01-09
Epidemiological studies suggest that reduced exposure to varicella might lead to an increased risk for herpes zoster (HZ). Reduction of exposure to varicella is a consequence of varicella vaccination but also of demographic changes. We analyzed how the combination of vaccination programs and demographic dynamics will affect the epidemiology of varicella and HZ in Germany over the next 50 years. We used a deterministic dynamic compartmental model to assess the impact of different varicella and HZ vaccination strategies on varicella and HZ epidemiology in three demographic scenarios, namely the projected population for Germany, the projected population additionally accounting for increased immigration as observed in 2015/2016, and a stationary population. Projected demographic changes alone result in an increase of annual HZ cases by 18.3% and a decrease of varicella cases by 45.7% between 1990 and 2060. Independently of the demographic scenario, varicella vaccination reduces the cumulative number of varicella cases until 2060 by approximately 70%, but also increases HZ cases by 10%. Unlike the currently licensed live attenuated HZ vaccine, the new subunit vaccine candidate might completely counteract this effect. Relative vaccine effects were consistent across all demographic scenarios. Demographic dynamics will be a major determinant of HZ epidemiology in the next 50 years. While stationary population models are appropriate for assessing vaccination impact, models incorporating realistic population structures allow a direct comparison to surveillance data and can thus provide additional input for immunization decision-making and resource planning.
Guidelines 13 and 14—Prediction uncertainty
Hill, Mary C.; Tiedeman, Claire
2005-01-01
An advantage of using optimization for model development and calibration is that optimization provides methods for evaluating and quantifying prediction uncertainty. Both deterministic and statistical methods can be used. Guideline 13 discusses using regression and post-audits, which we classify as deterministic methods. Guideline 14 discusses inferential statistics and Monte Carlo methods, which we classify as statistical methods.
A data-driven model for influenza transmission incorporating media effects.
Mitchell, Lewis; Ross, Joshua V
2016-10-01
Numerous studies have attempted to model the effect of mass media on the transmission of diseases such as influenza; however, quantitative data on media engagement has until recently been difficult to obtain. With the recent explosion of 'big data' coming from online social media and the like, large volumes of data on a population's engagement with mass media during an epidemic are becoming available to researchers. In this study, we combine an online dataset comprising millions of shared messages relating to influenza with traditional surveillance data on flu activity to suggest a functional form for the relationship between the two. Using this data, we present a simple deterministic model for influenza dynamics incorporating media effects, and show that such a model helps explain the dynamics of historical influenza outbreaks. Furthermore, through model selection we show that the proposed media function fits historical data better than other media functions proposed in earlier studies.
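A minimal sketch of the kind of model described above is given below, assuming a standard SIR core in which the transmission rate is damped by a media-activity term. The exponential form of the media function and all parameter values are placeholders, not the functional form the authors infer from the social-media data.

```python
import numpy as np
from scipy.integrate import solve_ivp

beta0, gamma, k = 0.5, 0.25, 0.05       # baseline transmission, recovery, media sensitivity

def media(t):
    # Hypothetical media-activity curve (e.g., volume of shared flu-related messages).
    return 10.0 * np.exp(-0.1 * t)

def sir_with_media(t, y):
    S, I, R = y
    beta = beta0 * np.exp(-k * media(t))  # media coverage suppresses transmission
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

sol = solve_ivp(sir_with_media, (0.0, 120.0), [0.99, 0.01, 0.0], max_step=0.5)
print("peak infected fraction:", sol.y[1].max())
```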
Designing a podiatry service to meet the needs of the population: a service simulation.
Campbell, Jackie A
2007-02-01
A model of a podiatry service has been developed which takes into consideration the effect of changing access criteria, skill mix and staffing levels (among others) given fixed local staffing budgets and the foot-health characteristics of the local community. A spreadsheet-based deterministic model was chosen to allow maximum transparency of programming. This work models a podiatry service in England, but could be adapted for other settings and, with some modification, for other community-based services. This model enables individual services to see the effect on outcome parameters such as number of patients treated, number discharged and size of waiting lists of various service configurations, given their individual local data profile. The process of designing the model has also had spin-off benefits for the participants in making explicit many of the implicit rules used in managing their services.
Hybrid Forecasting of Daily River Discharges Considering Autoregressive Heteroscedasticity
NASA Astrophysics Data System (ADS)
Szolgayová, Elena Peksová; Danačová, Michaela; Komorniková, Magda; Szolgay, Ján
2017-06-01
It is widely acknowledged that in the hydrological and meteorological communities, there is a continuing need to improve the quality of quantitative rainfall and river flow forecasts. A hybrid (combined deterministic-stochastic) modelling approach is proposed here that combines the advantages offered by modelling the system dynamics with a deterministic model and a deterministic forecasting error series with a data-driven model in parallel. Since the processes to be modelled are generally nonlinear and the model error series may exhibit nonstationarity and heteroscedasticity, GARCH-type nonlinear time series models are considered here. The fitting, forecasting and simulation performance of such models have to be explored on a case-by-case basis. The goal of this paper is to test and develop an appropriate methodology for model fitting and forecasting of daily river discharge forecast error data using the GARCH family of time series models. We concentrated on verifying whether the use of a GARCH-type model is suitable for modelling and forecasting a hydrological model error time series on the Hron and Morava Rivers in Slovakia. For this purpose we verified the presence of heteroscedasticity in the simulation error series of the KLN multilinear flow routing model; then we fitted the GARCH-type models to the data and compared their fit with that of an ARMA-type model. We produced one-step-ahead forecasts from the fitted models and again provided comparisons of the models' performance.
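A minimal sketch of the fitting step described above is shown below, using the third-party Python `arch` package (assumed available) to fit an AR(1)-GARCH(1,1) model to a stand-in forecast-error series; the synthetic errors are placeholders for the KLN routing-model error series.

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(1)
errors = rng.standard_normal(1000)          # placeholder for the daily model error series

am = arch_model(errors, mean="AR", lags=1, vol="GARCH", p=1, q=1)
res = am.fit(disp="off")
fc = res.forecast(horizon=1)                # one-step-ahead forecast of the error process
print(res.params)
print("one-step-ahead conditional variance:", float(fc.variance.iloc[-1, 0]))
```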
The meta-Gaussian Bayesian Processor of forecasts and associated preliminary experiments
NASA Astrophysics Data System (ADS)
Chen, Fajing; Jiao, Meiyan; Chen, Jing
2013-04-01
Public weather services are trending toward providing users with probabilistic weather forecasts, in place of traditional deterministic forecasts. Probabilistic forecasting techniques are continually being improved to optimize available forecasting information. The Bayesian Processor of Forecast (BPF), a new statistical method for probabilistic forecast, can transform a deterministic forecast into a probabilistic forecast according to the historical statistical relationship between observations and forecasts generated by that forecasting system. This technique accounts for the typical forecasting performance of a deterministic forecasting system in quantifying the forecast uncertainty. The meta-Gaussian likelihood model is suitable for a variety of stochastic dependence structures with monotone likelihood ratios. The meta-Gaussian BPF adopting this kind of likelihood model can therefore be applied across many fields, including meteorology and hydrology. The Bayes theorem with two continuous random variables and the normal-linear BPF are briefly introduced. The meta-Gaussian BPF for a continuous predictand using a single predictor is then presented and discussed. The performance of the meta-Gaussian BPF is tested in a preliminary experiment. Control forecasts of daily surface temperature at 0000 UTC at Changsha and Wuhan stations are used as the deterministic forecast data. These control forecasts are taken from ensemble predictions with a 96-h lead time generated by the National Meteorological Center of the China Meteorological Administration, the European Centre for Medium-Range Weather Forecasts, and the US National Centers for Environmental Prediction during January 2008. The results of the experiment show that the meta-Gaussian BPF can transform a deterministic control forecast of surface temperature from any one of the three ensemble predictions into a useful probabilistic forecast of surface temperature. These probabilistic forecasts quantify the uncertainty of the control forecast; accordingly, the performance of the probabilistic forecasts differs based on the source of the underlying deterministic control forecasts.
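A minimal sketch of the normal-linear special case mentioned above: with a Gaussian climatological prior on the predictand and a linear-Gaussian likelihood for the deterministic forecast, the posterior is Gaussian in closed form. The regression coefficients and climatological moments below are invented for illustration; the meta-Gaussian version would first transform both variables to standard normals via their marginal distributions.

```python
def normal_linear_bpf(x, m, s2, a, b, sigma2):
    """Posterior of predictand W given forecast x, with prior W ~ N(m, s2) and
    likelihood X | W = w ~ N(a + b*w, sigma2)."""
    post_prec = 1.0 / s2 + (b * b) / sigma2
    post_var = 1.0 / post_prec
    post_mean = post_var * (m / s2 + b * (x - a) / sigma2)
    return post_mean, post_var

# Hypothetical numbers: climatological surface temperature N(2, 16) degC; regressing
# past forecasts on observations gives intercept 0.5, slope 0.9, residual variance 4.
mean, var = normal_linear_bpf(x=-3.0, m=2.0, s2=16.0, a=0.5, b=0.9, sigma2=4.0)
print(f"probabilistic forecast: N({mean:.2f}, {var:.2f})")
```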
From Weakly Chaotic Dynamics to Deterministic Subdiffusion via Copula Modeling
NASA Astrophysics Data System (ADS)
Nazé, Pierre
2018-03-01
Copula modeling consists in finding a probabilistic distribution, called copula, whereby its coupling with the marginal distributions of a set of random variables produces their joint distribution. The present work aims to use this technique to connect the statistical distributions of weakly chaotic dynamics and deterministic subdiffusion. More precisely, we decompose the jumps distribution of Geisel-Thomae map into a bivariate one and determine the marginal and copula distributions respectively by infinite ergodic theory and statistical inference techniques. We verify therefore that the characteristic tail distribution of subdiffusion is an extreme value copula coupling Mittag-Leffler distributions. We also present a method to calculate the exact copula and joint distributions in the case where weakly chaotic dynamics and deterministic subdiffusion statistical distributions are already known. Numerical simulations and consistency with the dynamical aspects of the map support our results.
The noisy edge of traveling waves
Hallatschek, Oskar
2011-01-01
Traveling waves are ubiquitous in nature and control the speed of many important dynamical processes, including chemical reactions, epidemic outbreaks, and biological evolution. Despite their fundamental role in complex systems, traveling waves remain elusive because they are often dominated by rare fluctuations in the wave tip, which have defied any rigorous analysis so far. Here, we show that by adjusting nonlinear model details, noisy traveling waves can be solved exactly. The moment equations of these tuned models are closed and have a simple analytical structure resembling the deterministic approximation supplemented by a nonlocal cutoff term. The peculiar form of the cutoff shapes the noisy edge of traveling waves and is critical for the correct prediction of the wave speed and its fluctuations. Our approach is illustrated and benchmarked using the example of fitness waves arising in simple models of microbial evolution, which are highly sensitive to number fluctuations. We demonstrate explicitly how these models can be tuned to account for finite population sizes and determine how quickly populations adapt as a function of population size and mutation rates. More generally, our method is shown to apply to a broad class of models, in which number fluctuations are generated by branching processes. Because of this versatility, the method of model tuning may serve as a promising route toward unraveling universal properties of complex discrete particle systems. PMID:21187435
Gasche, Loïc; Mahévas, Stéphanie; Marchal, Paul
2013-01-01
Ecosystems are usually complex, nonlinear and strongly influenced by poorly known environmental variables. Among these systems, marine ecosystems have high uncertainties: marine populations in general are known to exhibit large levels of natural variability and the intensity of fishing efforts can change rapidly. These uncertainties are a source of risks that threaten the sustainability of both fish populations and fishing fleets targeting them. Appropriate management measures have to be found in order to reduce these risks and decrease sensitivity to uncertainties. Methods have been developed within decision theory that aim at allowing decision making under severe uncertainty. One of these methods is the information-gap decision theory. The info-gap method has started to permeate ecological modelling, with recent applications to conservation. However, these practical applications have so far been restricted to simple models with analytical solutions. Here we implement a deterministic approach based on decision theory in a complex model of the Eastern English Channel. Using the ISIS-Fish modelling platform, we model populations of sole and plaice in this area. We test a wide range of values for ecosystem, fleet and management parameters. From these simulations, we identify management rules controlling fish harvesting that allow reaching management goals recommended by ICES (International Council for the Exploration of the Sea) working groups while providing the highest robustness to uncertainties on ecosystem parameters. PMID:24204873
Public health consequences of macrolide use in food animals: a deterministic risk assessment.
Hurd, H Scott; Doores, Stephanie; Hayes, Dermot; Mathew, Alan; Maurer, John; Silley, Peter; Singer, Randall S; Jones, Ronald N
2004-05-01
The potential impact on human health from antibiotic-resistant bacteria selected by use of antibiotics in food animals has resulted in many reports and recommended actions. The U.S. Food and Drug Administration Center for Veterinary Medicine has issued Guidance Document 152, which advises veterinary drug sponsors of one potential process for conducting a qualitative risk assessment of drug use in food animals. Using this guideline, we developed a deterministic model to assess the risk from two macrolide antibiotics, tylosin and tilmicosin. The scope of modeling included all label claim uses of both macrolides in poultry, swine, and beef cattle. The Guidance Document was followed to define the hazard, which is illness (i) caused by foodborne bacteria with a resistance determinant, (ii) attributed to a specified animal-derived meat commodity, and (iii) treated with a human use drug of the same class. Risk was defined as the probability of this hazard combined with the consequence of treatment failure due to resistant Campylobacter spp. or Enterococcus faecium. A binomial event model was applied to estimate the annual risk for the U.S. general population. Parameters were derived from industry drug use surveys, scientific literature, medical guidelines, and government documents. This unique farm-to-patient risk assessment demonstrated that use of tylosin and tilmicosin in food animals presents a very low risk of human treatment failure, with an approximate annual probability of less than 1 in 10 million for Campylobacter-derived risk and approximately 1 in 3 billion for E. faecium-derived risk.
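A minimal sketch of the binomial event model mentioned above: if each serving of the commodity independently carries a very small probability of the full hazard chain (resistant organism, attributable illness, macrolide treatment, treatment failure), the annual individual risk is one minus the probability of no event across all servings. Both numbers below are illustrative placeholders, not the parameters of the published assessment.

```python
p_per_serving = 5e-10       # hypothetical per-serving probability of the complete hazard chain
servings_per_year = 200     # hypothetical annual number of servings

annual_risk = 1.0 - (1.0 - p_per_serving) ** servings_per_year
print(f"annual individual risk ~ {annual_risk:.1e}")   # ~1e-7, i.e. about 1 in 10 million
```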
NASA Astrophysics Data System (ADS)
Deco, Gustavo; Martí, Daniel
2007-03-01
The analysis of transitions in stochastic neurodynamical systems is essential to understand the computational principles that underlie those perceptual and cognitive processes involving multistable phenomena, like decision making and bistable perception. To investigate the role of noise in a multistable neurodynamical system described by coupled differential equations, one usually considers numerical simulations, which are time consuming because of the need for sufficiently many trials to capture the statistics of the influence of the fluctuations on that system. An alternative analytical approach involves the derivation of deterministic differential equations for the moments of the distribution of the activity of the neuronal populations. However, the application of the method of moments is restricted by the assumption that the distribution of the state variables of the system takes on a unimodal Gaussian shape. We extend in this paper the classical moments method to the case of bimodal distribution of the state variables, such that a reduced system of deterministic coupled differential equations can be derived for the desired regime of multistability.
Deterministic modelling and stochastic simulation of biochemical pathways using MATLAB.
Ullah, M; Schmidt, H; Cho, K H; Wolkenhauer, O
2006-03-01
The analysis of complex biochemical networks is conducted in two popular conceptual frameworks for modelling. The deterministic approach requires the solution of ordinary differential equations (ODEs, reaction rate equations) with concentrations as continuous state variables. The stochastic approach involves the simulation of differential-difference equations (chemical master equations, CMEs) with probabilities as variables. This is to generate counts of molecules for chemical species as realisations of random variables drawn from the probability distribution described by the CMEs. Although there are numerous tools available, many of them free, the modelling and simulation environment MATLAB is widely used in the physical and engineering sciences. We describe a collection of MATLAB functions to construct and solve ODEs for deterministic simulation and to implement realisations of CMEs for stochastic simulation using advanced MATLAB coding (Release 14). The program was successfully applied to pathway models from the literature for both cases. The results were compared to implementations using alternative tools for dynamic modelling and simulation of biochemical networks. The aim is to provide a concise set of MATLAB functions that encourage the experimentation with systems biology models. All the script files are available from www.sbi.uni-rostock.de/publications_matlab-paper.html.
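The two frameworks contrasted above can be illustrated on a one-species birth-death process (production at constant rate k, first-order degradation at rate d). The sketch below, written in Python rather than the MATLAB of the paper, integrates the reaction rate equation and runs one realisation of Gillespie's stochastic simulation algorithm; the rates are arbitrary.

```python
import numpy as np
from scipy.integrate import solve_ivp

k, d, x0, t_end = 10.0, 0.1, 0, 100.0

# Deterministic reaction rate equation: dx/dt = k - d*x.
ode = solve_ivp(lambda t, x: k - d * x, (0.0, t_end), [float(x0)])

# One exact realisation of the chemical master equation via Gillespie's algorithm.
rng = np.random.default_rng(0)
t, x = 0.0, x0
times, counts = [t], [x]
while t < t_end:
    a_prod, a_deg = k, d * x          # reaction propensities
    a_total = a_prod + a_deg
    t += rng.exponential(1.0 / a_total)
    x += 1 if rng.random() < a_prod / a_total else -1
    times.append(t); counts.append(x)

print("ODE end value:", ode.y[0, -1], "| SSA end count:", counts[-1], "| theory k/d =", k / d)
```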
Modeling disease transmission near eradication: An equation free approach
NASA Astrophysics Data System (ADS)
Williams, Matthew O.; Proctor, Joshua L.; Kutz, J. Nathan
2015-01-01
Although disease transmission in the near eradication regime is inherently stochastic, deterministic quantities such as the probability of eradication are of interest to policy makers and researchers. Rather than running large ensembles of discrete stochastic simulations over long intervals in time to compute these deterministic quantities, we create a data-driven and deterministic "coarse" model for them using the Equation Free (EF) framework. In lieu of deriving an explicit coarse model, the EF framework approximates any needed information, such as coarse time derivatives, by running short computational experiments. However, the choice of the coarse variables (i.e., the state of the coarse system) is critical if the resulting model is to be accurate. In this manuscript, we propose a set of coarse variables that result in an accurate model in the endemic and near eradication regimes, and demonstrate this on a compartmental model representing the spread of Poliomyelitis. When combined with adaptive time-stepping coarse projective integrators, this approach can yield over a factor of two speedup compared to direct simulation, and due to its lower dimensionality, could be beneficial when conducting systems level tasks such as designing eradication or monitoring campaigns.
Changes in seasonal climate outpace compensatory density-dependence in eastern brook trout
Bassar, Ronald D.; Letcher, Benjamin H.; Nislow, Keith H.; Whiteley, Andrew R.
2016-01-01
Understanding how multiple extrinsic (density-independent) factors and intrinsic (density-dependent) mechanisms influence population dynamics has become increasingly urgent in the face of rapidly changing climates. It is particularly unclear how multiple extrinsic factors with contrasting effects among seasons are related to declines in population numbers and changes in mean body size and whether there is a strong role for density-dependence. The primary goal of this study was to identify the roles of seasonal variation in climate driven environmental direct effects (mean stream flow and temperature) versus density-dependence on population size and mean body size in eastern brook trout (Salvelinus fontinalis). We use data from a 10-year capture-mark-recapture study of eastern brook trout in four streams in Western Massachusetts, USA to parameterize a discrete-time population projection model. The model integrates matrix modeling techniques used to characterize discrete population structures (age, habitat type and season) with integral projection models (IPMs) that characterize demographic rates as continuous functions of organismal traits (in this case body size). Using both stochastic and deterministic analyses we show that decreases in population size are due to changes in stream flow and temperature and that these changes are larger than what can be compensated for through density-dependent responses. We also show that the declines are due mostly to increasing mean stream temperatures decreasing the survival of the youngest age class. In contrast, increases in mean body size over the same period are the result of indirect changes in density with a lesser direct role of climate-driven environmental change.
A deterministic model of electron transport for electron probe microanalysis
NASA Astrophysics Data System (ADS)
Bünger, J.; Richter, S.; Torrilhon, M.
2018-01-01
Within the last decades significant improvements in the spatial resolution of electron probe microanalysis (EPMA) were obtained by instrumental enhancements. In contrast, the quantification procedures essentially remained unchanged. As the classical procedures assume either homogeneity or a multi-layered structure of the material, they limit the spatial resolution of EPMA. The possibilities of improving the spatial resolution through more sophisticated quantification procedures are therefore almost untouched. We investigate a new analytical model (M1 model) for the quantification procedure based on fast and accurate modelling of electron-X-ray-matter interactions in complex materials using a deterministic approach to solve the electron transport equations. We outline the derivation of the model from the Boltzmann equation for electron transport using the method of moments with a minimum entropy closure and present first numerical results for three different test cases (homogeneous, thin film and interface). Taking Monte Carlo as a reference, the results for the three test cases show that the M1 model is able to reproduce the electron dynamics in EPMA applications very well. Compared to classical analytical models like XPP and PAP, the M1 model is more accurate and far more flexible, which indicates the potential of deterministic models of electron transport to further increase the spatial resolution of EPMA.
Exploiting Fast-Variables to Understand Population Dynamics and Evolution
NASA Astrophysics Data System (ADS)
Constable, George W. A.; McKane, Alan J.
2018-07-01
We describe a continuous-time modelling framework for biological population dynamics that accounts for demographic noise. In the spirit of the methodology used by statistical physicists, transitions between the states of the system are caused by individual events while the dynamics are described in terms of the time-evolution of a probability density function. In general, the application of the diffusion approximation still leaves a description that is quite complex. However, in many biological applications one or more of the processes happen slowly relative to the system's other processes, and the dynamics can be approximated as occurring within a slow low-dimensional subspace. We review these time-scale separation arguments and analyse the more simple stochastic dynamics that result in a number of cases. We stress that it is important to retain the demographic noise derived in this way, and emphasise this point by showing that it can alter the direction of selection compared to the prediction made from an analysis of the corresponding deterministic model.
Deterministic Migration-Based Separation of White Blood Cells.
Kim, Byeongyeon; Choi, Young Joon; Seo, Hyekyung; Shin, Eui-Cheol; Choi, Sungyoung
2016-10-01
Functional and phenotypic analyses of peripheral white blood cells provide useful clinical information. However, separation of white blood cells from peripheral blood requires a time-consuming, inconvenient process and thus analyses of separated white blood cells are limited in clinical settings. To overcome this limitation, a microfluidic separation platform is developed to enable deterministic migration of white blood cells, directing the cells into designated positions according to a ridge pattern. The platform uses slant ridge structures on the channel top to induce the deterministic migration, which allows efficient and high-throughput separation of white blood cells from unprocessed whole blood. The extent of the deterministic migration under various rheological conditions is explored, enabling highly efficient migration of white blood cells in whole blood and achieving high-throughput separation of the cells (processing 1 mL of whole blood less than 7 min). In the separated cell population, the composition of lymphocyte subpopulations is well preserved, and T cells secrete cytokines without any functional impairment. On the basis of the results, this microfluidic platform is a promising tool for the rapid enrichment of white blood cells, and it is useful for functional and phenotypic analyses of peripheral white blood cells. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Zhao, Yingming; Kocovsky, Patrick M.; Madenjian, Charles P.
2013-01-01
We developed an updated stock–recruitment relationship for Lake Erie Walleye Sander vitreus using the Akaike information criterion model selection approach. Our best stock–recruitment relationship was a Ricker spawner–recruit function to which spring warming rate was added as an environmental variable, and this regression model explained 39% of the variability in Walleye recruitment over the 1978 through 2006 year-classes. Thus, most of the variability in Lake Erie Walleye recruitment appeared to be attributable to factors other than spawning stock size and spring warming rate. The abundance of age-0 Gizzard Shad Dorosoma cepedianum, which was an important term in previous models, may still be an important factor for Walleye recruitment, but poorer ability to monitor Gizzard Shad since the late 1990s could have led to that term failing to appear in our best model. Secondly, we used numerical simulation to demonstrate how to use the stock recruitment relationship to characterize the population dynamics (such as stable age structure, carrying capacity, and maximum sustainable yield) and some biological reference points (such as fishing rates at different important biomass or harvest levels) for an age-structured population in a deterministic way.
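A minimal sketch of the environmentally modified Ricker spawner-recruit form described above, recruits = a*S*exp(-b*S + c*W) with spawning stock S and spring warming rate W, fitted here to synthetic data; the data and starting values are placeholders, not the Lake Erie Walleye series.

```python
import numpy as np
from scipy.optimize import curve_fit

def ricker_env(X, a, b, c):
    S, W = X
    return a * S * np.exp(-b * S + c * W)

rng = np.random.default_rng(2)
S = rng.uniform(10.0, 60.0, 29)                   # spawner index for 29 year-classes
W = rng.normal(0.3, 0.1, 29)                      # spring warming rate covariate
R = ricker_env((S, W), 2.0, 0.03, 3.0) * rng.lognormal(0.0, 0.3, 29)   # synthetic recruits

params, _ = curve_fit(ricker_env, (S, W), R, p0=[1.0, 0.01, 1.0])
print("fitted (a, b, c):", params)
# Competing forms (with/without W, Beverton-Holt, etc.) could then be ranked by AIC.
```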
Forecasting the Change of Renal Stone Occurrence Rates in Astronauts
NASA Technical Reports Server (NTRS)
Myers, J.; Goodenow, D.; Gokoglu, S.; Kassemi, M.
2016-01-01
Changes in urine chemistry, during and post flight, potentially increase the risk of renal stones in astronauts. Although much is known about the effects of space flight on urine chemistry, no in-flight incidences of renal stones in US astronauts exist, and the question "How much does this risk change with space flight?" remains difficult to accurately quantify. In this discussion, we tackle this question utilizing a combination of deterministic and probabilistic modeling that implements the physics behind free stone growth and agglomeration, speciation of urine chemistry, and published observations of population renal stone incidences to estimate changes in the rate of renal stone occurrence.
NASA Astrophysics Data System (ADS)
Soltanzadeh, I.; Azadi, M.; Vakili, G. A.
2011-07-01
Using Bayesian Model Averaging (BMA), an attempt was made to obtain calibrated probabilistic numerical forecasts of 2-m temperature over Iran. The ensemble employs three limited area models (WRF, MM5 and HRM), with WRF used with five different configurations. Initial and boundary conditions for MM5 and WRF are obtained from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) and for HRM the initial and boundary conditions come from analysis of Global Model Europe (GME) of the German Weather Service. The resulting ensemble of seven members was run for a period of 6 months (from December 2008 to May 2009) over Iran. The 48-h raw ensemble outputs were calibrated using BMA technique for 120 days using a 40 days training sample of forecasts and relative verification data. The calibrated probabilistic forecasts were assessed using rank histogram and attribute diagrams. Results showed that application of BMA improved the reliability of the raw ensemble. Using the weighted ensemble mean forecast as a deterministic forecast it was found that the deterministic-style BMA forecasts performed usually better than the best member's deterministic forecast.
NASA Astrophysics Data System (ADS)
Yan, Y.; Barth, A.; Beckers, J. M.; Candille, G.; Brankart, J. M.; Brasseur, P.
2015-07-01
Sea surface height, sea surface temperature, and temperature profiles at depth collected between January and December 2005 are assimilated into a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. Sixty ensemble members are generated by adding realistic noise to the forcing parameters related to the temperature. The ensemble is diagnosed and validated by comparison between the ensemble spread and the model/observation difference, as well as by rank histogram, before the assimilation experiments. An incremental analysis update scheme is applied in order to reduce spurious oscillations due to the model state correction. The results of the assimilation are assessed according to both deterministic and probabilistic metrics with independent/semi-independent observations. For deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations in order to diagnose the ensemble distribution properties in a deterministic way. For probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system according to reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centered random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system. The improvement of the assimilation is demonstrated using these validation metrics. Finally, the deterministic validation and the probabilistic validation are analyzed jointly. The consistency and complementarity between both validations are highlighted.
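For reference, the sketch below shows the standard sample estimator of the CRPS for a single ensemble forecast against one observation, CRPS = E|X - y| - 0.5*E|X - X'|; in practice the score is averaged over many verification cases. The ensemble members and observation are invented numbers.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample CRPS estimator for one ensemble forecast and one verifying observation."""
    m = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(m - obs))                              # E|X - y|
    term2 = 0.5 * np.mean(np.abs(m[:, None] - m[None, :]))        # 0.5 * E|X - X'|
    return term1 - term2

print(crps_ensemble([19.2, 20.1, 20.8, 21.5, 22.0], obs=20.5))
```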
A Random Variable Approach to Nuclear Targeting and Survivability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Undem, Halvor A.
We demonstrate a common mathematical formalism for analyzing problems in nuclear survivability and targeting. This formalism, beginning with a random variable approach, can be used to interpret past efforts in nuclear-effects analysis, including targeting analysis. It can also be used to analyze new problems brought about by the post Cold War Era, such as the potential effects of yield degradation in a permanently untested nuclear stockpile. In particular, we illustrate the formalism through four natural case studies or illustrative problems, linking these to actual past data, modeling, and simulation, and suggesting future uses. In the first problem, we illustrate the case of a deterministically modeled weapon used against a deterministically responding target. Classic "Cookie Cutter" damage functions result. In the second problem, we illustrate, with actual target test data, the case of a deterministically modeled weapon used against a statistically responding target. This case matches many of the results of current nuclear targeting modeling and simulation tools, including the result of distance damage functions as complementary cumulative lognormal functions in the range variable. In the third problem, we illustrate the case of a statistically behaving weapon used against a deterministically responding target. In particular, we show the dependence of target damage on weapon yield for an untested nuclear stockpile experiencing yield degradation. Finally, and using actual unclassified weapon test data, we illustrate in the fourth problem the case of a statistically behaving weapon used against a statistically responding target.
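As a small illustration of the second problem's result, the sketch below evaluates a distance-damage function expressed as a complementary cumulative lognormal in the range variable; the median damage range and logarithmic spread are illustrative, not values from any weapon-effects dataset.

```python
from math import log
from scipy.stats import norm

def damage_probability(r, r50=1000.0, sigma=0.3):
    """P(damage) at range r: complementary CDF of a lognormal with median r50."""
    return norm.sf((log(r) - log(r50)) / sigma)

for r in (500.0, 1000.0, 1500.0, 2000.0):
    print(f"range {r:6.0f} m -> P(damage) = {damage_probability(r):.3f}")
```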
Hybrid stochastic simulations of intracellular reaction-diffusion systems.
Kalantzis, Georgios
2009-06-01
With the observation that stochasticity is important in biological systems, chemical kinetics have begun to receive wider interest. While Monte Carlo discrete event simulations most accurately capture the variability of molecular species, they become computationally costly for complex reaction-diffusion systems with large populations of molecules. On the other hand, continuous time models are computationally efficient but they fail to capture any variability in the molecular species. In this study a hybrid stochastic approach is introduced for simulating reaction-diffusion systems. We developed an adaptive partitioning strategy in which processes with high frequency are simulated with deterministic rate-based equations, and those with low frequency using the exact stochastic algorithm of Gillespie. Therefore the stochastic behavior of cellular pathways is preserved while being able to apply it to large populations of molecules. We describe our method and demonstrate its accuracy and efficiency compared with the Gillespie algorithm for two different systems. First, a model of intracellular viral kinetics with two steady states and second, a compartmental model of the postsynaptic spine head for studying the dynamics of Ca2+ and NMDA receptors.
Classification and unification of the microscopic deterministic traffic models.
Yang, Bo; Monterola, Christopher
2015-10-01
We identify a universal mathematical structure in microscopic deterministic traffic models (with identical drivers), and thus we show that all such existing models in the literature, including both the two-phase and three-phase models, can be understood as special cases of a master model by expansion around a set of well-defined ground states. This allows any two traffic models to be properly compared and identified. The three-phase models are characterized by the vanishing of leading orders of expansion within a certain density range, and as an example the popular intelligent driver model is shown to be equivalent to a generalized optimal velocity (OV) model. We also explore the diverse solutions of the generalized OV model that can be important both for understanding human driving behaviors and algorithms for autonomous driverless vehicles.
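A minimal sketch of the classic optimal velocity (OV) car-following model, the family that the generalized model above extends: each vehicle relaxes toward an optimal velocity determined by its headway, here on a ring road with a Bando-type velocity function and arbitrary parameters.

```python
import numpy as np
from scipy.integrate import solve_ivp

N, L, a = 20, 200.0, 1.2                 # vehicles, ring length (m), driver sensitivity (1/s)

def V(h):
    # Bando-type optimal velocity function of the headway h (m).
    return 15.0 * (np.tanh(0.1 * (h - 20.0)) + np.tanh(2.0)) / (1.0 + np.tanh(2.0))

def rhs(t, y):
    x, v = y[:N], y[N:]
    headway = np.roll(x, -1) - x
    headway[-1] += L                     # wrap around the ring
    return np.concatenate([v, a * (V(headway) - v)])

rng = np.random.default_rng(3)
x0 = np.linspace(0.0, L, N, endpoint=False) + 0.5 * rng.standard_normal(N)  # perturbed spacing
v0 = np.full(N, V(L / N))
sol = solve_ivp(rhs, (0.0, 300.0), np.concatenate([x0, v0]), max_step=0.5)
# Depending on the sensitivity a and the density N/L, the perturbation either decays
# (stable flow) or grows into stop-and-go waves.
```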
A stochastic model for correlated protein motions
NASA Astrophysics Data System (ADS)
Karain, Wael I.; Qaraeen, Nael I.; Ajarmah, Basem
2006-06-01
A one-dimensional Langevin-type stochastic difference equation is used to find the deterministic and Gaussian contributions of time series representing the projections of a Bovine Pancreatic Trypsin Inhibitor (BPTI) protein molecular dynamics simulation along different eigenvector directions determined using principal component analysis. The deterministic part shows a distinct nonlinear behavior only for eigenvectors contributing significantly to the collective protein motion.
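The drift (deterministic) part of such a one-dimensional Langevin-type difference equation can be estimated by conditional averaging of increments. The sketch below applies this to a synthetic series with a known cubic drift, standing in for a principal-component projection of an MD trajectory.

```python
import numpy as np

rng = np.random.default_rng(4)
dt, n = 0.01, 200_000
x = np.empty(n)
x[0] = 0.0
for i in range(n - 1):                              # synthetic series: dx = -x^3 dt + noise
    x[i + 1] = x[i] - x[i] ** 3 * dt + 0.5 * np.sqrt(dt) * rng.standard_normal()

bins = np.linspace(-1.5, 1.5, 31)
centres = 0.5 * (bins[:-1] + bins[1:])
idx = np.digitize(x[:-1], bins)
increments = np.diff(x)
drift = np.array([increments[idx == k].mean() / dt if np.any(idx == k) else np.nan
                  for k in range(1, len(bins))])
# 'drift' should recover approximately -centres**3, the deterministic part of the dynamics.
```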
Harris, Julianne E.; Hightower, Joseph E.
2012-01-01
American shad Alosa sapidissima are in decline in their native range, and modeling possible management scenarios could help guide their restoration. We developed a density-dependent, deterministic, stage-based matrix model to predict the population-level results of transporting American shad to suitable spawning habitat upstream of dams on the Roanoke River, North Carolina and Virginia. We used data on sonic-tagged adult American shad and oxytetracycline-marked American shad fry both above and below dams on the Roanoke River with information from other systems to estimate a starting population size and vital rates. We modeled the adult female population over 30 years under plausible scenarios of adult transport, effective fecundity (egg production), and survival of adults (i.e., to return to spawn the next year) and juveniles (from spawned egg to age 1). We also evaluated the potential effects of increased survival for adults and juveniles. The adult female population size in the Roanoke River was estimated to be 5,224. With no transport, the model predicted a slow population increase over the next 30 years. Predicted population increases were highest when survival was improved during the first year of life. Transport was predicted to benefit the population only if high rates of effective fecundity and juvenile survival could be achieved. Currently, transported adults and young are less likely to successfully out-migrate than individuals below the dams, and the estimated adult population size is much smaller than either of two assumed values of carrying capacity for the lower river; therefore, transport is not predicted to help restore the stock under present conditions. Research on survival rates, density-dependent processes, and the impacts of structures to increase out-migration success would improve evaluation of the potential benefits of access to additional spawning habitat for American shad.
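A minimal density-independent sketch of the stage-based projection underlying such a model: a stage matrix with effective fecundity and stage-specific survival is iterated forward from an initial vector. All rates and stage definitions below are illustrative placeholders, not the Roanoke River estimates, and the published model additionally includes density dependence.

```python
import numpy as np

# Stages: age-0 (eggs/fry), subadults, adult females. Rates are invented for illustration.
A = np.array([
    [0.0,    0.0,  7000.0],   # effective fecundity: female eggs per adult female
    [0.0002, 0.0,  0.0   ],   # egg-to-subadult survival
    [0.0,    0.5,  0.35  ],   # subadult-to-adult transition, adult survival (repeat spawning)
])
n = np.array([3.6e7, 7.0e3, 5.2e3])    # initial abundances by stage

for _ in range(30):                    # project 30 years forward
    n = A @ n
lam = np.max(np.abs(np.linalg.eigvals(A)))
print(f"adult females after 30 years: {n[2]:.0f}, asymptotic growth rate lambda = {lam:.3f}")
```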
Perry, Russell W.; Plumb, John M.; Jones, Edward C.; Som, Nicholas A.; Hetrick, Nicholas J.; Hardy, Thomas B.
2018-04-06
Fisheries and water managers often use population models to aid in understanding the effect of alternative water management or restoration actions on anadromous fish populations. We developed the Stream Salmonid Simulator (S3) to help resource managers evaluate the effect of management alternatives on juvenile salmonid populations. S3 is a deterministic stage-structured population model that tracks daily growth, movement, and survival of juvenile salmon. A key theme of the model is that river flow affects habitat availability and capacity, which in turn drives density-dependent population dynamics. To explicitly link population dynamics to habitat quality and quantity, the river environment is constructed as a one-dimensional series of linked habitat units, each of which has an associated daily time series of discharge, water temperature, and usable habitat area or carrying capacity. The physical characteristics of each habitat unit and the number of fish occupying each unit, in turn, drive survival and growth within each habitat unit and movement of fish among habitat units. The purpose of this report is to outline the underlying general structure of the S3 model that is common among different applications of the model. We have developed applications of the S3 model for juvenile fall Chinook salmon (Oncorhynchus tshawytscha) in the lower Klamath River. Thus, this report is a companion to the current application of the S3 model to the Trinity River (in review). The general S3 model structure provides a biological and physical framework for the salmonid freshwater life cycle. This framework captures important demographics of juvenile salmonids aimed at translating management alternatives into simulated population responses. Although the S3 model is built on this common framework, the model has been constructed to allow much flexibility in application of the model to specific river systems. The ability for practitioners to include system-specific information for the physical stream structure, survival, growth, and movement processes ensures that simulations provide results that are relevant to the questions asked about the population under study.
Probabilistic Finite Element Analysis & Design Optimization for Structural Designs
NASA Astrophysics Data System (ADS)
Deivanayagam, Arumugam
This study focuses on incorporating the probabilistic nature of material properties (Kevlar® 49) into the existing deterministic finite element analysis (FEA) of fabric-based engine containment systems through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering design through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis focusing on probabilistic distribution models which characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure pattern and exit velocities of the models, and the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data give a better perspective on the results than a single deterministic simulation. The next part of the research is to implement the probabilistic material properties in engineering designs. The main aim of structural design is to obtain optimal solutions. However, even though the structures obtained from a deterministic optimization are cost effective, they become highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented or considered in the solution process. A reliable and optimal solution can be obtained by performing reliability optimization along with the deterministic optimization, which is RBDO. In the RBDO problem formulation, in addition to structural performance constraints, reliability constraints are also considered. This part of the research starts with an introduction to reliability analysis, such as first-order and second-order reliability analysis, followed by simulation techniques that are performed to obtain the probability of failure and reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation with sensitivity analysis, which is performed to remove the highly reliable constraints in the RBDO, thereby reducing the computational time and function evaluations. Finally, the reliability analysis concepts and RBDO are implemented in finite element 2D truss problems and a planar beam problem, and the results are presented and discussed.
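The Monte Carlo step described above reduces, in its simplest form, to sampling the uncertain inputs, evaluating a limit-state function, and counting failures. The sketch below uses a closed-form stress-versus-strength check in place of the LS-DYNA fabric simulation; all distributions are illustrative placeholders, not the fitted Kevlar® 49 properties.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
n_samples = 100_000

strength = rng.lognormal(mean=np.log(3.0), sigma=0.12, size=n_samples)  # capacity (GPa), illustrative
demand = rng.normal(loc=2.2, scale=0.25, size=n_samples)                # applied stress (GPa), illustrative

pf = np.mean(demand >= strength)     # failure when the limit state g = strength - demand <= 0
print(f"P(failure) ~ {pf:.4f}, reliability index beta ~ {norm.ppf(1.0 - pf):.2f}")
```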
NASA Astrophysics Data System (ADS)
Matte, Simon; Boucher, Marie-Amélie; Boucher, Vincent; Fortier Filion, Thomas-Charles
2017-06-01
A large effort has been made over the past 10 years to promote the operational use of probabilistic or ensemble streamflow forecasts. Numerous studies have shown that ensemble forecasts are of higher quality than deterministic ones. Many studies also conclude that decisions based on ensemble rather than deterministic forecasts lead to better decisions in the context of flood mitigation. Hence, it is believed that ensemble forecasts possess a greater economic and social value for both decision makers and the general population. However, the vast majority of, if not all, existing hydro-economic studies rely on a cost-loss ratio framework that assumes a risk-neutral decision maker. To overcome this important flaw, this study borrows from economics and evaluates the economic value of early warning flood systems using the well-known Constant Absolute Risk Aversion (CARA) utility function, which explicitly accounts for the level of risk aversion of the decision maker. This new framework allows for the full exploitation of the information related to a forecast's uncertainty, making it especially suited for the economic assessment of ensemble or probabilistic forecasts. Rather than comparing deterministic and ensemble forecasts, this study focuses on comparing different types of ensemble forecasts. There are multiple ways of assessing and representing forecast uncertainty. Consequently, there exist many different means of building an ensemble forecasting system for future streamflow. One such possibility is to dress deterministic forecasts using the statistics of past forecast errors. Such dressing methods are popular among operational agencies because of their simplicity and intuitiveness. Another approach is the use of ensemble meteorological forecasts for precipitation and temperature, which are then provided as inputs to one or many hydrological model(s). In this study, three concurrent ensemble streamflow forecasting systems are compared: simple statistically dressed deterministic forecasts, forecasts based on meteorological ensembles, and a variant of the latter that also includes an estimation of state variable uncertainty. This comparison takes place for the Montmorency River, a small flood-prone watershed in southern central Quebec, Canada. The assessment of forecasts is performed for lead times of 1 to 5 days, both in terms of forecast quality (relative to the corresponding record of observations) and in terms of economic value, using the new proposed framework based on the CARA utility function. It is found that the economic value of a forecast for a risk-averse decision maker is closely linked to the forecast reliability in predicting the upper tail of the streamflow distribution. Hence, post-processing forecasts to avoid over-forecasting could help improve both the quality and the value of forecasts.
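A minimal sketch of the CARA-based valuation described above, under invented outcome distributions: utility u(x) = (1 - exp(-a*x))/a with risk-aversion coefficient a, and forecasting systems compared through the certainty equivalent of their economic outcomes. The outcome samples and the value of a are placeholders.

```python
import numpy as np

A = 1e-5   # coefficient of absolute risk aversion (hypothetical)

def cara_utility(x, a=A):
    return (1.0 - np.exp(-a * np.asarray(x, dtype=float))) / a

def certainty_equivalent(outcomes, a=A):
    eu = cara_utility(outcomes, a).mean()        # expected utility over sampled outcomes
    return -np.log(1.0 - a * eu) / a             # invert u to get the certainty equivalent

losses_with_forecast = np.array([-5_000.0, -2_000.0, 0.0, 0.0, 0.0])      # mitigation mostly succeeds
losses_without_forecast = np.array([-60_000.0, -20_000.0, 0.0, 0.0, 0.0])
value = certainty_equivalent(losses_with_forecast) - certainty_equivalent(losses_without_forecast)
print(f"economic value of the forecast for this decision maker ~ ${value:,.0f}")
```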
Oldring, P K T; Castle, L; O'Mahony, C; Dixon, J
2014-01-01
The FACET tool is a probabilistic model to estimate exposure to chemicals in foodstuffs, originating from flavours, additives and food contact materials. This paper demonstrates the use of the FACET tool to estimate exposure to BPA (bisphenol A) from light metal packaging. For exposure to migrants from food packaging, FACET uses industry-supplied data on the occurrence of substances in the packaging, their concentrations and construction of the packaging, which were combined with data from a market research organisation and food consumption data supplied by national database managers. To illustrate the principles, UK packaging data were used together with consumption data from the UK National Diet and Nutrition Survey (NDNS) dietary survey for 19-64 year olds for a refined deterministic verification. The UK data were chosen mainly because the consumption surveys are detailed, data for UK packaging at a detailed level were available and, arguably, the UK population is composed of high consumers of packaged foodstuffs. Exposures were run for each food category that could give rise to BPA from light metal packaging. Consumer loyalty to a particular type of packaging, commonly referred to as packaging loyalty, was set. The BPA extraction levels used for the 15 types of coating chemistries that could release BPA were in the range of 0.00005-0.012 mg dm(-2). The estimates of exposure to BPA using FACET for the total diet were 0.0098 (mean) and 0.0466 (97.5th percentile) mg/person/day, corresponding to 0.00013 (mean) and 0.00059 (97.5th percentile) mg kg(-1) body weight day(-1) for consumers of foods packed in light metal packaging. This is well below the current EFSA (and other recognised bodies) TDI of 0.05 mg kg(-1) body weight day(-1). These probabilistic estimates were compared with estimates using a refined deterministic approach drawing on the same input data. The results from FACET for the mean, 95th and 97.5th percentile exposures to BPA lay between the lowest and the highest estimates from the refined deterministic calculations. Since this should be the case, for a fully probabilistic compared with a deterministic approach, it is concluded that the FACET tool has been verified in this example. A recent EFSA draft opinion on exposure to BPA from different sources showed that canned foods were a major contributor and compared results from various models, including those from FACET. The results from FACET were overall conservative.
Mathematical model for HIV spreads control program with ART treatment
NASA Astrophysics Data System (ADS)
Maimunah; Aldila, Dipo
2018-03-01
In this article, using a deterministic approach based on a seven-dimensional system of nonlinear ordinary differential equations, we establish a mathematical model for the spread of HIV with an ART treatment intervention. In a simplified model without ART treatment, the disease-free and endemic equilibrium points are established analytically along with the basic reproduction number. The local stability criteria of the disease-free equilibrium and the existence criteria of the endemic equilibrium are analyzed. We find that the endemic equilibrium exists when the basic reproduction number is larger than one. From a sensitivity analysis of the basic reproduction number of the complete model (with ART treatment), we find that increasing the number of infected humans who follow the ART treatment program reduces the basic reproduction number. We also simulate the autonomous system numerically to show how the treatment intervention reduces the infected population during the intervention period.
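The abstract does not reproduce the full seven-dimensional system, so the following is only a much-reduced three-compartment sketch (hypothetical parameters; susceptible, untreated infected, and on-ART classes) of the qualitative mechanism described: moving infected individuals onto ART, with reduced infectiousness, lowers the reproduction number.

```python
from scipy.integrate import solve_ivp

# Hypothetical parameters; the published model is seven-dimensional and differs in detail.
Lambda, mu = 100.0, 0.02       # recruitment and natural death rates
beta, eta = 5e-5, 0.1          # transmission rate; relative infectiousness under ART
delta = 0.05                   # extra death rate of untreated infected individuals

def R0(alpha):
    """Next-generation-style reproduction number of this reduced model,
    where alpha is the per-capita rate of ART uptake."""
    S0 = Lambda / mu
    return beta * S0 / (mu + delta + alpha) * (1.0 + eta * alpha / mu)

def rhs(t, y, alpha):
    S, I, T = y                # susceptible, untreated infected, infected on ART
    force = beta * (I + eta * T) * S
    return [Lambda - force - mu * S,
            force - (mu + delta + alpha) * I,
            alpha * I - mu * T]

for alpha in (0.05, 0.30):     # low versus high ART uptake
    sol = solve_ivp(rhs, (0.0, 400.0), [Lambda / mu, 10.0, 0.0], args=(alpha,))
    print(f"alpha={alpha:.2f}  R0={R0(alpha):.2f}  "
          f"untreated infected at t=400: {sol.y[1, -1]:.1f}")
```

With these assumed values, raising the ART uptake rate lowers R0 because treated individuals are taken to be much less infectious, mirroring the sensitivity result stated above.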
Schmitt, Walter; Auteri, Domenica; Bastiansen, Finn; Ebeling, Markus; Liu, Chun; Luttik, Robert; Mastitsky, Sergey; Nacci, Diane; Topping, Chris; Wang, Magnus
2016-01-01
This article presents a case study demonstrating the application of 3 individual-based, spatially explicit population models (IBMs, also known as agent-based models) in ecological risk assessments to predict long-term effects of a pesticide on populations of small mammals. The 3 IBMs each used a hypothetical fungicide (FungicideX) in different scenarios: spraying in cereals (common vole, Microtus arvalis), spraying in orchards (field vole, Microtus agrestis), and cereal seed treatment (wood mouse, Apodemus sylvaticus). Each scenario used existing model landscapes, which differed greatly in size and structural complexity. The toxicological profile of FungicideX was defined so that the deterministic long-term first tier risk assessment would result in high risk to small mammals, thus providing the opportunity to use the IBMs for risk assessment refinement (i.e., higher tier risk assessment). Despite differing internal model design and scenarios, results indicated low population sensitivity in all 3 cases unless FungicideX was applied at very high (×10) rates. Recovery from local population impacts was generally fast. Only when patch extinctions occurred in simulations of intentionally high acute toxic effects were recovery periods, then determined by recolonization, of any concern. Conclusions include recommendations for the most important input considerations, including the selection of exposure levels, duration of simulations, statistically robust number of replicates, and endpoints to report. However, further investigation and agreement are needed to develop recommendations for landscape attributes such as size, structure, and crop rotation to define appropriate regulatory risk assessment scenarios. Overall, the application of IBMs provides multiple advantages to higher tier ecological risk assessments for small mammals, including consistent and transparent direct links to specific protection goals, and the consideration of more realistic scenarios. © 2015 SETAC.
NASA Astrophysics Data System (ADS)
Chisholm, Rebecca H.; Lorenzi, Tommaso; Desvillettes, Laurent; Hughes, Barry D.
2016-08-01
Epigenetic mechanisms are increasingly recognised as integral to the adaptation of species that face environmental changes. In particular, empirical work has provided important insights into the contribution of epigenetic mechanisms to the persistence of clonal species, from which a number of verbal explanations have emerged that are suited to logical testing by proof-of-concept mathematical models. Here, we present a stochastic agent-based model and a related deterministic integrodifferential equation model for the evolution of a phenotype-structured population composed of asexually-reproducing and competing organisms which are exposed to novel environmental conditions. This setting has relevance to the study of biological systems where colonising asexual populations must survive and rapidly adapt to hostile environments, like pathogenesis, invasion and tumour metastasis. We explore how evolution might proceed when epigenetic variation in gene expression can change the reproductive capacity of individuals within the population in the new environment. Simulations and analyses of our models clarify the conditions under which certain evolutionary paths are possible and illustrate that while epigenetic mechanisms may facilitate adaptation in asexual species faced with environmental change, they can also lead to a type of "epigenetic load" and contribute to extinction. Moreover, our results offer a formal basis for the claim that constant environments favour individuals with low rates of stochastic phenotypic variation. Finally, our model provides a "proof of concept" of the verbal hypothesis that phenotypic stability is a key driver in rescuing the adaptive potential of an asexual lineage and supports the notion that intense selection pressure can, to an extent, offset the deleterious effects of high phenotypic instability and biased epimutations, and steer an asexual population back from the brink of an evolutionary dead end.
NASA Astrophysics Data System (ADS)
Reynders, Edwin P. B.; Langley, Robin S.
2018-08-01
The hybrid deterministic-statistical energy analysis method has proven to be a versatile framework for modeling built-up vibro-acoustic systems. The stiff system components are modeled deterministically, e.g., using the finite element method, while the wave fields in the flexible components are modeled as diffuse. In the present paper, the hybrid method is extended such that not only the ensemble mean and variance of the harmonic system response can be computed, but also of the band-averaged system response. This variance represents the uncertainty that is due to the assumption of a diffuse field in the flexible components of the hybrid system. The developments start with a cross-frequency generalization of the reciprocity relationship between the total energy in a diffuse field and the cross spectrum of the blocked reverberant loading at the boundaries of that field. By making extensive use of this generalization in a first-order perturbation analysis, explicit expressions are derived for the cross-frequency and band-averaged variance of the vibrational energies in the diffuse components and for the cross-frequency and band-averaged variance of the cross spectrum of the vibro-acoustic field response of the deterministic components. These expressions are extensively validated against detailed Monte Carlo analyses of coupled plate systems in which diffuse fields are simulated by randomly distributing small point masses across the flexible components, and good agreement is found.
Hybrid stochastic and deterministic simulations of calcium blips.
Rüdiger, S; Shuai, J W; Huisinga, W; Nagaiah, C; Warnecke, G; Parker, I; Falcke, M
2007-09-15
Intracellular calcium release is a prime example for the role of stochastic effects in cellular systems. Recent models consist of deterministic reaction-diffusion equations coupled to stochastic transitions of calcium channels. The resulting dynamics is of multiple time and spatial scales, which complicates far-reaching computer simulations. In this article, we introduce a novel hybrid scheme that is especially tailored to accurately trace events with essential stochastic variations, while deterministic concentration variables are efficiently and accurately traced at the same time. We use finite elements to efficiently resolve the extreme spatial gradients of concentration variables close to a channel. We describe the algorithmic approach and we demonstrate its efficiency compared to conventional methods. Our single-channel model matches experimental data and results in intriguing dynamics if calcium is used as charge carrier. Random openings of the channel accumulate in bursts of calcium blips that may be central for the understanding of cellular calcium dynamics.
Detecting and disentangling nonlinear structure from solar flux time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.; Roszman, L.
1992-01-01
Interest in solar activity has grown in the past two decades for many reasons. Most importantly for flight dynamics, solar activity changes the atmospheric density, which has important implications for spacecraft trajectory and lifetime prediction. Building upon the previously developed Rayleigh-Benard nonlinear dynamic solar model, which exhibits many dynamic behaviors observed in the Sun, this work introduces new chaotic solar forecasting techniques. Our attempt to use recently developed nonlinear chaotic techniques to model and forecast solar activity has uncovered highly entangled dynamics. Numerical techniques are presented for decoupling additive and multiplicative white noise from deterministic dynamics and for examining the falloff of the power spectrum at high frequencies as a possible means of distinguishing deterministic chaos from noise that is spectrally white or colored. The power spectral techniques presented are less cumbersome than current methods for identifying deterministic chaos, which require more computationally intensive calculations, such as those involving Lyapunov exponents and attractor dimension.
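As an illustrative sketch of the spectral idea (not the authors' procedure; the Lorenz system and the band comparison below are stand-ins), one can compare the steep high-frequency falloff of the power spectrum of a chaotic signal with the flat spectrum of white noise:

```python
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, y, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, v, z = y
    return [sigma * (v - x), x * (rho - z) - v, x * v - beta * z]

dt, n = 0.01, 2**14
t = np.arange(n) * dt
sol = solve_ivp(lorenz, (0.0, t[-1]), [1.0, 1.0, 1.0], t_eval=t, rtol=1e-8)
chaotic = sol.y[0]
noisy = np.random.default_rng(1).normal(size=n)   # spectrally white reference

def power_spectrum(x, dt):
    x = x - x.mean()
    f = np.fft.rfftfreq(x.size, d=dt)
    p = np.abs(np.fft.rfft(x))**2
    return f[1:], p[1:]

f_c, p_c = power_spectrum(chaotic, dt)
f_n, p_n = power_spectrum(noisy, dt)

# Deterministic chaos from a smooth flow tends to show a steep (roughly exponential)
# falloff at high frequencies, whereas white noise stays flat; compare band ratios.
hi, lo = slice(-200, None), slice(100, 300)
print("high/low band power, chaotic:", p_c[hi].mean() / p_c[lo].mean())
print("high/low band power, noise  :", p_n[hi].mean() / p_n[lo].mean())
```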
Population dynamics in the presence of quasispecies effects and changing environments
NASA Astrophysics Data System (ADS)
Forster, Robert Burke
2006-12-01
This thesis explores how natural selection acts on organisms such as viruses that have either highly error-prone reproduction or face variable environmental conditions or both. By modeling population dynamics under these conditions, we gain a better understanding of the selective forces at work, both in our simulations and hopefully also in real organisms. With an understanding of the important factors in natural selection we can forecast not only the immediate fate of an existing population but also in what directions such a population might evolve in the future. We demonstrate that the concept of a quasispecies is relevant to evolution in a neutral fitness landscape. Motivated by RNA viruses such as HIV, we use RNA secondary structure as our model system and find that quasispecies effects arise both rapidly and in realistically small populations. We discover that the evolutionary effects of neutral drift, punctuated equilibrium and the selection for mutational robustness extend to the concept of a quasispecies. In our study of periodic environments, we consider the tradeoffs faced by quasispecies in adapting to environmental change. We develop an analytical model to predict whether evolution favors short-term or long-term adaptation and validate our model through simulation. Our results bear directly on the population dynamics of viruses such as West Nile that alternate between two host species. More generally, we discover that a selective pressure exists under these conditions to fuse or split genes with complementary environmental functions. Lastly, we study the general effects of frequency-dependent selection on two strains competing in a periodic environment. Under very general assumptions, we prove that stable coexistence rather than extinction is the likely outcome. The population dynamics of this system may be as simple as stable equilibrium or as complex as deterministic chaos.
International migration beyond gravity: A statistical model for use in population projections
Cohen, Joel E.; Roig, Marta; Reuman, Daniel C.; GoGwilt, Cai
2008-01-01
International migration will play an increasing role in the demographic future of most nations if fertility continues to decline globally. We developed an algorithm to project future numbers of international migrants from any country or region to any other. The proposed generalized linear model (GLM) used geographic and demographic independent variables only (the population and area of origins and destinations of migrants, the distance between origin and destination, the calendar year, and indicator variables to quantify nonrandom characteristics of individual countries). The dependent variable, yearly numbers of migrants, was quantified by 43653 reports from 11 countries of migration from 228 origins and to 195 destinations during 1960–2004. The final GLM based on all data was selected by the Bayesian information criterion. The number of migrants per year from origin to destination was proportional to (population of origin)^0.86 × (area of origin)^−0.21 × (population of destination)^0.36 × (distance)^−0.97, multiplied by functions of year and country-specific indicator variables. The number of emigrants from an origin depended on both its population and its population density. For a variable initial year and a fixed terminal year 2004, the parameter estimates appeared stable. Multiple R^2, the fraction of variation in log numbers of migrants accounted for by the starting model, improved gradually with recentness of the data: R^2 = 0.57 for data from 1960 to 2004, R^2 = 0.59 for 1985–2004, R^2 = 0.61 for 1995–2004, and R^2 = 0.64 for 2000–2004. The migration estimates generated by the model may be embedded in deterministic or stochastic population projections. PMID:18824693
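The fitted power-law part of the GLM is given explicitly above, so it can be evaluated directly; the sketch below does exactly that, with a placeholder proportionality constant because the year and country-indicator terms are not reported here (the example origin-destination values are hypothetical):

```python
def predicted_migrants(pop_origin, area_origin, pop_dest, distance, scale=1.0):
    """Power-law part of the fitted GLM reported in the abstract:
    migrants/year ~ P_o^0.86 * A_o^-0.21 * P_d^0.36 * D^-0.97.
    The proportionality constant, calendar-year term and country indicators
    are not reported here, so `scale` is only a placeholder."""
    return (scale * pop_origin**0.86 * area_origin**(-0.21)
            * pop_dest**0.36 * distance**(-0.97))

# Hypothetical origin/destination pair (populations in persons, area in km^2, distance in km).
flow = predicted_migrants(pop_origin=5e7, area_origin=3e5, pop_dest=8e7, distance=1500.0)
print("relative predicted yearly flow:", flow)
```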
Klassen, Waldemar; Adams, Jean V.; Twohey, Michael B.
2004-01-01
The suppressive effects of trapping adult sea lampreys, Petromyzon marinus Linnaeus, and releasing sterile males (SMRT) or females (SFRT) into a closed system were expressed in deterministic models. Suppression was modeled as a function of the proportion of the population removed by trapping, the number of sterile animals released, the reproductive rate and sex ratio of the population, and (for the SFRT) the rate of polygyny. Releasing sterile males reduced populations more quickly than did the release of sterile females. For a population in which 30% are trapped, sterile animals are initially released at a ratio of 10 sterile to 1 fertile animal, 5 adult progeny are produced per fertile mating, 60% are male, and males mate with an average of 1.65 females, the initial population is reduced 87% by SMRT and 68% by SFRT in one generation. The extent of suppression achieved is most sensitive to changes in the initial sterile release ratio. Given the current status of sea lamprey populations and trapping operations in the Great Lakes, the sterile-male-release technique has the best chance for success on a lake-wide basis if implemented in Lake Michigan. The effectiveness of the sterile-female-release technique should be investigated in a controlled study. Advancing trapping technology should be a high priority in the near term, and artificial rearing of sea lampreys to the adult stage should be a high priority in the long term. The diligent pursuit of sea lamprey suppression over a period of several decades can be expected to yield great benefits.
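One plausible bookkeeping of the quantities listed above (trapping fraction, sterile release ratio, progeny per fertile mating, sex ratio and polygyny) is sketched below; it is not the published model, but with these inputs it happens to give one-generation reductions close to the 87% (SMRT) and 68% (SFRT) figures quoted:

```python
def smrt_one_generation(N, trap=0.30, sterile_ratio=10.0, progeny=5.0, male_frac=0.60):
    """Sterile-male release: next-generation adults under one plausible bookkeeping
    of the quantities in the abstract (the published equations may differ)."""
    fertile = (1.0 - trap) * N                     # adults surviving trapping
    females = (1.0 - male_frac) * fertile
    p_fertile_mate = 1.0 / (1.0 + sterile_ratio)   # chance a female mates a fertile male
    return females * p_fertile_mate * progeny

def sfrt_one_generation(N, trap=0.30, sterile_ratio=10.0, progeny=5.0,
                        male_frac=0.60, polygyny=1.65):
    """Sterile-female release: matings by males are split between fertile and
    sterile females in proportion to their numbers."""
    fertile = (1.0 - trap) * N
    males, females = male_frac * fertile, (1.0 - male_frac) * fertile
    sterile_females = sterile_ratio * females
    matings = males * polygyny
    fertile_matings = matings * females / (females + sterile_females)
    return min(fertile_matings, females) * progeny

N0 = 1000.0
for name, step in [("SMRT", smrt_one_generation), ("SFRT", sfrt_one_generation)]:
    N1 = step(N0)
    print(f"{name}: reduction after one generation = {100 * (1 - N1 / N0):.0f}%")
```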
NASA Astrophysics Data System (ADS)
Beakes, M.; Satterthwaite, W.; Petrik, C.; Hendrix, N.; Danner, E.; Lindley, S. T.
2016-02-01
In past decades there has been a heavy reliance on the production of hatchery-reared fish to supplement declining population numbers of Pacific salmon. In some cases, the benefits of hatchery supplementation have been negligible despite concerted long-term stocking efforts. The management and conservation of depressed salmon populations, via hatchery practices or otherwise, can be improved by expanding our understanding of the dissimilarities between hatchery and wild salmon and how each interacts with the environment. In this study we use a stage-structured salmon life-cycle model to explore the population consequences of disparate survival and behavior between hatchery and wild-origin fall-run Chinook Salmon (Oncorhynchus tshawytscha) in the California Central Valley. We couple empirically-based statistical functions with deterministic theoretical models to identify how environmental conditions (e.g., water temperature, flow) and habitat drive the survival and abundance of both hatchery and wild salmon as they integrate across riverscapes and cross marine and freshwater ecosystem boundaries during their life cycle. Results from this study suggest that hatchery practices can lead to dissimilar interactions between hatchery and wild salmon and the environmental conditions they experience. As such, the population dynamics of fall-run Chinook Salmon in the California Central Valley are partly dependent on the composition of individuals that make up their populations. In total, this study improves our ability to conserve imperiled salmonids by identifying mechanistic linkages between the natal origin of salmon, survival and behavior, and the environment at spatiotemporal scales relevant to salmon populations and fisheries management.
Opinion formation and distribution in a bounded-confidence model on various networks
NASA Astrophysics Data System (ADS)
Meng, X. Flora; Van Gorder, Robert A.; Porter, Mason A.
2018-02-01
In the social, behavioral, and economic sciences, it is important to predict which individual opinions eventually dominate in a large population, whether there will be a consensus, and how long it takes for a consensus to form. Such ideas have been studied heavily both in physics and in other disciplines, and the answers depend strongly both on how one models opinions and on the network structure on which opinions evolve. One model that was created to study consensus formation quantitatively is the Deffuant model, in which the opinion distribution of a population evolves via sequential random pairwise encounters. To consider heterogeneity of interactions in a population along with social influence, we study the Deffuant model on various network structures (deterministic synthetic networks, random synthetic networks, and social networks constructed from Facebook data). We numerically simulate the Deffuant model and conduct regression analyses to investigate the dependence of the time to reach steady states on various model parameters, including a confidence bound for opinion updates, the number of participating entities, and their willingness to compromise. We find that network structure and parameter values both have important effects on the convergence time and the number of steady-state opinion groups. For some network architectures, we observe that the relationship between the convergence time and model parameters undergoes a transition at a critical value of the confidence bound. For some networks, the steady-state opinion distribution also changes from consensus to multiple opinion groups at this critical value.
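A minimal simulation of the Deffuant bounded-confidence update on a network is sketched below (the random-graph choice, parameter values and the gap-based cluster count are illustrative assumptions, not the study's setup): at each step a random edge is selected, and the two endpoint opinions move toward each other by a compromise fraction mu only if they differ by less than the confidence bound d.

```python
import numpy as np
import networkx as nx

def deffuant(G, d=0.2, mu=0.5, steps=200_000, seed=0):
    """Deffuant bounded-confidence dynamics on a graph G: at each step a random
    edge is chosen and, if the two endpoint opinions differ by less than the
    confidence bound d, both opinions move toward each other by a fraction mu."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 1.0, size=G.number_of_nodes())
    edges = np.array(G.edges())
    for _ in range(steps):
        i, j = edges[rng.integers(len(edges))]
        diff = x[j] - x[i]
        if abs(diff) < d:
            x[i] += mu * diff
            x[j] -= mu * diff
    return x

G = nx.erdos_renyi_graph(200, 0.05, seed=1)      # one illustrative network
opinions = np.sort(deffuant(G, d=0.2, mu=0.5))
# Count opinion groups by splitting the sorted opinions wherever a gap exceeds d = 0.2.
n_groups = 1 + int((np.diff(opinions) > 0.2).sum())
print("steady-state opinion groups:", n_groups)
```

Sweeping d, mu, the number of nodes and the network generator in a sketch like this is the kind of parameter exploration on which the convergence-time regressions described above are based.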
Stochastic Analysis and Probabilistic Downscaling of Soil Moisture
NASA Astrophysics Data System (ADS)
Deshon, J. P.; Niemann, J. D.; Green, T. R.; Jones, A. S.
2017-12-01
Soil moisture is a key variable for rainfall-runoff response estimation, ecological and biogeochemical flux estimation, and biodiversity characterization, each of which is useful for watershed condition assessment. These applications require not only accurate, fine-resolution soil-moisture estimates but also confidence limits on those estimates and soil-moisture patterns that exhibit realistic statistical properties (e.g., variance and spatial correlation structure). The Equilibrium Moisture from Topography, Vegetation, and Soil (EMT+VS) model downscales coarse-resolution (9-40 km) soil moisture from satellite remote sensing or land-surface models to produce fine-resolution (10-30 m) estimates. The model was designed to produce accurate deterministic soil-moisture estimates at multiple points, but the resulting patterns do not reproduce the variance or spatial correlation of observed soil-moisture patterns. The primary objective of this research is to generalize the EMT+VS model to produce a probability density function (pdf) for soil moisture at each fine-resolution location and time. Each pdf has a mean that is equal to the deterministic soil-moisture estimate, and the pdf can be used to quantify the uncertainty in the soil-moisture estimates and to simulate soil-moisture patterns. Different versions of the generalized model are hypothesized based on how uncertainty enters the model, whether the uncertainty is additive or multiplicative, and which distributions describe the uncertainty. These versions are then tested by application to four catchments with detailed soil-moisture observations (Tarrawarra, Satellite Station, Cache la Poudre, and Nerrigundah). The performance of the generalized models is evaluated by comparing the statistical properties of the simulated soil-moisture patterns to those of the observations and the deterministic EMT+VS model. The versions of the generalized EMT+VS model with normally distributed stochastic components produce soil-moisture patterns with more realistic statistical properties than the deterministic model. Additionally, the results suggest that the variance and spatial correlation of the stochastic soil-moisture variations do not vary consistently with the spatial-average soil moisture.
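To make the additive-versus-multiplicative distinction concrete, the sketch below perturbs a hypothetical deterministic fine-resolution soil-moisture field with either normally distributed additive noise or a mean-one log-normal multiplier; the field, noise levels and clipping are assumptions for illustration, not the EMT+VS parameterization.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical deterministic fine-resolution soil-moisture estimates (volumetric fraction).
theta_det = np.clip(rng.normal(0.25, 0.05, size=(100, 100)), 0.05, 0.45)

def additive_realization(theta, sigma=0.03):
    """Additive, normally distributed uncertainty around the deterministic estimate."""
    return np.clip(theta + rng.normal(0.0, sigma, size=theta.shape), 0.0, 1.0)

def multiplicative_realization(theta, cv=0.15):
    """Multiplicative, log-normally distributed uncertainty (mean-one multiplier)."""
    s = np.sqrt(np.log(1.0 + cv**2))
    return np.clip(theta * rng.lognormal(-0.5 * s**2, s, size=theta.shape), 0.0, 1.0)

add = additive_realization(theta_det)
mul = multiplicative_realization(theta_det)
print("variance: deterministic %.4f, additive %.4f, multiplicative %.4f"
      % (theta_det.var(), add.var(), mul.var()))
```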
Stochastic modelling of microstructure formation in solidification processes
NASA Astrophysics Data System (ADS)
Nastac, Laurentiu; Stefanescu, Doru M.
1997-07-01
To relax many of the assumptions used in continuum approaches, a general stochastic model has been developed. The stochastic model can be used not only for an accurate description of the fraction of solid evolution, and therefore accurate cooling curves, but also for simulation of microstructure formation in castings. The advantage of using the stochastic approach is to give a time- and space-dependent description of solidification processes. Time- and space-dependent processes can also be described by partial differential equations. Unlike a differential formulation which, in most cases, has to be transformed into a difference equation and solved numerically, the stochastic approach is essentially a direct numerical algorithm. The stochastic model is comprehensive, since the competition between various phases is considered. Furthermore, grain impingement is directly included through the structure of the model. In the present research, all grain morphologies are simulated with this procedure. The relevance of the stochastic approach is that the simulated microstructures can be directly compared with microstructures obtained from experiments. The computer becomes a "dynamic metallographic microscope". A comparison between deterministic and stochastic approaches has been performed. An important objective of this research was to answer the following general questions: (1) "Would fully deterministic approaches continue to be useful in solidification modelling?" and (2) "Would stochastic algorithms be capable of entirely replacing purely deterministic models?"
Multi-Scale Modeling of the Gamma Radiolysis of Nitrate Solutions.
Horne, Gregory P; Donoclift, Thomas A; Sims, Howard E; Orr, Robin M; Pimblott, Simon M
2016-11-17
A multiscale modeling approach has been developed for the extended time scale long-term radiolysis of aqueous systems. The approach uses a combination of stochastic track structure and track chemistry as well as deterministic homogeneous chemistry techniques and involves four key stages: radiation track structure simulation, the subsequent physicochemical processes, nonhomogeneous diffusion-reaction kinetic evolution, and homogeneous bulk chemistry modeling. The first three components model the physical and chemical evolution of an isolated radiation chemical track and provide radiolysis yields, within the extremely low dose isolated track paradigm, as the input parameters for a bulk deterministic chemistry model. This approach to radiation chemical modeling has been tested by comparison with the experimentally observed yield of nitrite from the gamma radiolysis of sodium nitrate solutions. This is a complex radiation chemical system which is strongly dependent on secondary reaction processes. The concentration of nitrite is not just dependent upon the evolution of radiation track chemistry and the scavenging of the hydrated electron and its precursors but also on the subsequent reactions of the products of these scavenging reactions with other water radiolysis products. Without the inclusion of intratrack chemistry, the deterministic component of the multiscale model is unable to correctly predict experimental data, highlighting the importance of intratrack radiation chemistry in the chemical evolution of the irradiated system.
A deterministic width function model
NASA Astrophysics Data System (ADS)
Puente, C. E.; Sivakumar, B.
Use of a deterministic fractal-multifractal (FM) geometric method to model width functions of natural river networks, as derived distributions of simple multifractal measures via fractal interpolating functions, is reported. It is first demonstrated that the FM procedure may be used to simulate natural width functions, preserving their most relevant features like their overall shape and texture and their observed power-law scaling on their power spectra. It is then shown, via two natural river networks (Racoon and Brushy creeks in the United States), that the FM approach may also be used to closely approximate existing width functions.
A Statistical Method to Distinguish Functional Brain Networks
Fujita, André; Vidal, Maciel C.; Takahashi, Daniel Y.
2017-01-01
One major problem in neuroscience is the comparison of functional brain networks of different populations, e.g., distinguishing the networks of controls and patients. Traditional algorithms are based on search for isomorphism between networks, assuming that they are deterministic. However, biological networks present randomness that cannot be well modeled by those algorithms. For instance, functional brain networks of distinct subjects of the same population can be different due to individual characteristics. Moreover, networks of subjects from different populations can be generated through the same stochastic process. Thus, a better hypothesis is that networks are generated by random processes. In this case, subjects from the same group are samples from the same random process, whereas subjects from different groups are generated by distinct processes. Using this idea, we developed a statistical test called ANOGVA to test whether two or more populations of graphs are generated by the same random graph model. Our simulations' results demonstrate that we can precisely control the rate of false positives and that the test is powerful to discriminate random graphs generated by different models and parameters. The method also showed to be robust for unbalanced data. As an example, we applied ANOGVA to an fMRI dataset composed of controls and patients diagnosed with autism or Asperger. ANOGVA identified the cerebellar functional sub-network as statistically different between controls and autism (p < 0.001). PMID:28261045
Probabilistic Modeling of the Renal Stone Formation Module
NASA Technical Reports Server (NTRS)
Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.
2013-01-01
The Integrated Medical Model (IMM) is a probabilistic tool, used in mission planning decision making and medical systems risk assessments. The IMM project maintains a database of over 80 medical conditions that could occur during a spaceflight, documenting an incidence rate and end case scenarios for each. In some cases, where observational data are insufficient to adequately define the inflight medical risk, the IMM utilizes external probabilistic modules to model and estimate the event likelihoods. One such medical event of interest is an unpassed renal stone. Due to a high salt diet and high concentrations of calcium in the blood (due to bone depletion caused by unloading in the microgravity environment), astronauts are at a considerably elevated risk of developing renal calculi (nephrolithiasis) while in space. The lack of observed incidences of nephrolithiasis has led HRP to initiate the development of the Renal Stone Formation Module (RSFM) to create a probabilistic simulator capable of estimating the likelihood of symptomatic renal stone presentation in astronauts on exploration missions. The model consists of two major parts. The first is the probabilistic component, which utilizes probability distributions to assess the range of urine electrolyte parameters and a multivariate regression to transform estimated crystal density and size distributions to the likelihood of the presentation of nephrolithiasis symptoms. The second is a deterministic physical and chemical model of renal stone growth in the kidney developed by Kassemi et al. The probabilistic component of the renal stone model couples the input probability distributions describing the urine chemistry, astronaut physiology, and system parameters with the physical and chemical outputs and inputs to the deterministic stone growth model. These two parts of the model are necessary to capture the uncertainty in the likelihood estimate. The model will be driven by Monte Carlo simulations, repeatedly drawing random samples from the probability distributions of the electrolyte concentrations and system parameters that are inputs into the deterministic model. The total urine chemistry concentrations are used to determine the urine chemistry activity using the Joint Expert Speciation System (JESS), a biochemistry model. Information from JESS is then fed into the deterministic growth model. Outputs from JESS and the deterministic model are passed back to the probabilistic model, where a multivariate regression is used to assess the likelihood of a stone forming and the likelihood of a stone requiring clinical intervention. The parameters used to quantify these risks include: relative supersaturation (RS) of calcium oxalate, citrate/calcium ratio, crystal number density, total urine volume, pH, magnesium excretion, maximum stone width, and ureteral location. Methods and Validation: The RSFM is designed to perform a Monte Carlo simulation to generate probability distributions of clinically significant renal stones, as well as provide an associated uncertainty in the estimate. Initially, early versions will be used to test integration of the components and assess component validation and verification (V&V), with later versions used to address questions regarding design reference mission scenarios. Once integrated with the deterministic component, the credibility assessment of the integrated model will follow NASA STD 7009 requirements.
NASA Astrophysics Data System (ADS)
Grossi, Giovanna; Caronna, Paolo; Ranzi, Roberto
2014-05-01
Within the framework of risk communication, the goal of an early warning system is to support the interaction between technicians and authorities (and subsequently the population) as a prevention measure. The methodology proposed in the KULTURisk FP7 project aimed to build a closer collaboration between these actors, with a view to promoting pro-active actions to mitigate the effects of flood hazards. The transnational (Slovenia/Italy) Soča/Isonzo case study focused on this concept of cooperation between stakeholders and hydrological forecasters. The DIMOSHONG_VIP hydrological model was calibrated for the Vipava/Vipacco River (650 km2), a tributary of the Soča/Isonzo River, on the basis of flood events that occurred between 1998 and 2012. The European Centre for Medium-Range Weather Forecasts (ECMWF) provided the past meteorological forecasts, both deterministic (1 forecast) and probabilistic (51 ensemble members). The resolution of the ECMWF grid is currently about 15 km (Deterministic-DET) and 30 km (Ensemble Prediction System-EPS). A verification was conducted to validate the flood-forecast outputs of the DIMOSHONG_VIP+ECMWF early warning system. Basic descriptive statistics, like event probability, probability of a forecast occurrence and frequency bias, were determined. Some performance measures were calculated, such as hit rate (probability of detection) and false alarm rate (probability of false detection). Relative Operating Characteristic (ROC) curves were generated for both deterministic and probabilistic forecasts. These analyses showed a good performance of the early warning system, given the small size of the sample. Particular attention was paid to the design of flood-forecasting output charts, involving and inquiring stakeholders (Alto Adriatico River Basin Authority), hydrology specialists in the field, and common people. Graph types for both forecasted precipitation and discharge were set. Three different risk thresholds were identified ("attention", "pre-alarm" or "alert", "alarm"), with an "icon-style" representation, suitable for communication to civil protection stakeholders or the public. Aiming at showing probabilistic representations in a "user-friendly" way, we opted for the visualization of the single deterministic forecasted hydrograph together with the 5%, 25%, 50%, 75% and 95% percentile bands of the Hydrological Ensemble Prediction System (HEPS). HEPS is generally used for 3-5 day hydrological forecasts, where the error due to incorrect initial data is comparable to the error due to the lower resolution with respect to the deterministic forecast. In short-term forecasting (12-48 hours) the HEPS members obviously show a similar tendency; in this case, considering its higher resolution, the deterministic forecast is expected to be more effective. The plot of different forecasts in the same chart allows the use of model outputs from 4-5 days to a few hours before a potential flood event. This framework was built to help stakeholders, such as a mayor or a civil protection authority, in flood control and management operations, and was designed to be included in a wider decision support system.
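A small sketch of the chart ingredients described above is given below; the synthetic 51-member hydrograph ensemble and the three discharge thresholds are purely hypothetical stand-ins used to show how the percentile bands and threshold-exceedance probabilities can be computed:

```python
import numpy as np

# Hypothetical ensemble of forecasted hydrographs: 51 members x 120 hourly time steps.
rng = np.random.default_rng(3)
t = np.arange(120)
members = (60.0 + 40.0 * np.exp(-0.5 * ((t - 72.0) / 18.0)**2)
           * rng.lognormal(0.0, 0.25, size=(51, 1)))

# Percentile bands like those used in the chart design described above.
bands = np.percentile(members, [5, 25, 50, 75, 95], axis=0)

# Hypothetical discharge thresholds (m3/s) standing in for the three risk levels.
thresholds = {"attention": 80.0, "pre-alarm": 110.0, "alarm": 140.0}
for name, q in thresholds.items():
    prob = (members.max(axis=1) >= q).mean()     # fraction of members exceeding the threshold
    print(f"P(peak discharge >= {q:5.0f} m3/s) [{name}]: {prob:.2f}")
```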
NASA Astrophysics Data System (ADS)
Lumi, Neeme; Laas, Katrin; Mankin, Romi
2015-11-01
The long-time limit behavior of the stochastic Lotka-Volterra model of a symbiotic metapopulation subjected to generalized Verhulst self-regulation is considered. The influence of a time-variable environment on the carrying capacities of subpopulations is modeled as a periodic deterministic part and a symmetric dichotomous noise. Relying on the mean-field approach, it is established that at certain parameter regimes the mean field (average subpopulation size) exhibits hysteresis with respect to the noise correlation time, manifested in the appearance of colored-noise-induced discontinuous transitions. In particular, it is shown that the relative fluctuation of the subpopulation sizes exhibits an accelerated increase prior to abrupt transitions of the metapopulation state. Moreover, in certain cases the autocorrelation function of the population sizes demonstrates anticorrelation at some values of the lag time.
Stochastic Processes in Physics: Deterministic Origins and Control
NASA Astrophysics Data System (ADS)
Demers, Jeffery
Stochastic processes are ubiquitous in the physical sciences and engineering. While often used to model imperfections and experimental uncertainties in the macroscopic world, stochastic processes can attain deeper physical significance when used to model the seemingly random and chaotic nature of the underlying microscopic world. Nowhere is this notion more prevalent than in the field of stochastic thermodynamics - a modern systematic framework used to describe mesoscale systems in strongly fluctuating thermal environments, which has revolutionized our understanding of, for example, molecular motors, DNA replication, far-from-equilibrium systems, and the laws of macroscopic thermodynamics as they apply to the mesoscopic world. With progress, however, come further challenges and deeper questions, most notably in the thermodynamics of information processing and feedback control. Here it is becoming increasingly apparent that, due to divergences and subtleties of interpretation, the deterministic foundations of the stochastic processes themselves must be explored and understood. This thesis presents a survey of stochastic processes in physical systems, the deterministic origins of their emergence, and the subtleties associated with controlling them. First, we study time-dependent billiards in the quivering limit - a limit where a billiard system is indistinguishable from a stochastic system, and where the simplified stochastic system allows us to view issues associated with deterministic time-dependent billiards in a new light and address some long-standing problems. Then, we embark on an exploration of the deterministic microscopic Hamiltonian foundations of non-equilibrium thermodynamics, and we find that important results from mesoscopic stochastic thermodynamics have simple microscopic origins which would not be apparent without the benefit of both the micro and meso perspectives. Finally, we study the problem of stabilizing a stochastic Brownian particle with feedback control, and we find that in order to avoid paradoxes involving the first law of thermodynamics, we need a model for the fine details of the thermal driving noise. The underlying theme of this thesis is the argument that the deterministic microscopic perspective and the stochastic mesoscopic perspective are both important and useful, and when used together, we can more deeply and satisfyingly understand the physics occurring over either scale.
Nanopore Current Oscillations: Nonlinear Dynamics on the Nanoscale.
Hyland, Brittany; Siwy, Zuzanna S; Martens, Craig C
2015-05-21
In this Letter, we describe theoretical modeling of an experimentally realized nanoscale system that exhibits the general universal behavior of a nonlinear dynamical system. In particular, we consider the description of voltage-induced current fluctuations through a single nanopore from the perspective of nonlinear dynamics. We briefly review the experimental system and its behavior observed and then present a simple phenomenological nonlinear model that reproduces the qualitative behavior of the experimental data. The model consists of a two-dimensional deterministic nonlinear bistable oscillator experiencing both dissipation and random noise. The multidimensionality of the model and the interplay between deterministic and stochastic forces are both required to obtain a qualitatively accurate description of the physical system.
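A generic illustration of the model class named above (a two-dimensional deterministic bistable oscillator with dissipation and random noise) is sketched below using a damped double-well oscillator integrated with the Euler-Maruyama scheme; the potential, parameter values and well-residence diagnostic are assumptions for illustration rather than the fitted nanopore model:

```python
import numpy as np

def simulate(a=1.0, b=1.0, gamma=0.5, D=0.4, dt=1e-3, steps=200_000, seed=7):
    """Damped double-well oscillator  x'' = a*x - b*x**3 - gamma*x' + noise,
    integrated with the Euler-Maruyama scheme as the pair (x, v) in two dimensions."""
    rng = np.random.default_rng(seed)
    x, v = 1.0, 0.0
    xs = np.empty(steps)
    for k in range(steps):
        force = a * x - b * x**3 - gamma * v
        x += v * dt
        v += force * dt + np.sqrt(2.0 * D * dt) * rng.normal()
        xs[k] = x
    return xs

xs = simulate()
# Residence statistics in the two wells (x > 0 vs x < 0) as a crude signature of
# noise-induced switching between the deterministically stable states.
print("fraction of time in right-hand well:", (xs > 0).mean())
```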
Effects of deterministic and random refuge in a prey-predator model with parasite infection.
Mukhopadhyay, B; Bhattacharyya, R
2012-09-01
Most natural ecosystem populations suffer from various infectious diseases, and the resulting host-pathogen dynamics depends on the host's characteristics. On the other hand, empirical evidence shows that for most host-pathogen systems, a part of the host population always forms a refuge. To study the role of refuge in the host-pathogen interaction, we study a predator-prey-pathogen model where the susceptible and the infected prey can enter refugia of constant size to evade predator attack. The stability aspects of the model system are investigated from a local and global perspective. The study reveals that the refuge sizes for the susceptible and the infected prey are the key parameters that control possible predator extinction as well as species co-existence. Next we perform a global study of the model system using Lyapunov functions and show the existence of a global attractor. Finally, we perform a stochastic extension of the basic model to study the phenomenon of random refuge arising from various intrinsic, habitat-related and environmental factors. The stochastic model is analyzed for exponential mean square stability. Numerical study of the stochastic model shows that increasing the refuge rates has a stabilizing effect on the stochastic dynamics. Copyright © 2012 Elsevier Inc. All rights reserved.
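The key ingredient, a constant-size refuge that shelters part of the prey from predation, can be sketched in a deliberately reduced form; the two-species system and parameter values below are illustrative assumptions (the paper's model additionally splits the prey into susceptible and infected classes with separate refuges):

```python
from scipy.integrate import solve_ivp

# Hypothetical parameters; m is the constant refuge size.
r, K, a, c, d, m = 1.0, 100.0, 0.05, 0.4, 0.3, 20.0

def rhs(t, y):
    prey, pred = y
    exposed = max(prey - m, 0.0)          # only prey outside the refuge are attacked
    dprey = r * prey * (1.0 - prey / K) - a * exposed * pred
    dpred = c * a * exposed * pred - d * pred
    return [dprey, dpred]

sol = solve_ivp(rhs, (0.0, 200.0), [50.0, 5.0], max_step=0.1)
print("final prey, predator:", sol.y[:, -1])
```

Raising m in this sketch shrinks the exposed prey pool, which is the mechanism by which refuge size can drive predator extinction or permit coexistence in the full model.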
NASA Astrophysics Data System (ADS)
Contreras, Arturo Javier
This dissertation describes a novel Amplitude-versus-Angle (AVA) inversion methodology to quantitatively integrate pre-stack seismic data, well logs, geologic data, and geostatistical information. Deterministic and stochastic inversion algorithms are used to characterize flow units of deepwater reservoirs located in the central Gulf of Mexico. A detailed fluid/lithology sensitivity analysis was conducted to assess the nature of AVA effects in the study area. Standard AVA analysis indicates that the shale/sand interface represented by the top of the hydrocarbon-bearing turbidite deposits generate typical Class III AVA responses. Layer-dependent Biot-Gassmann analysis shows significant sensitivity of the P-wave velocity and density to fluid substitution, indicating that presence of light saturating fluids clearly affects the elastic response of sands. Accordingly, AVA deterministic and stochastic inversions, which combine the advantages of AVA analysis with those of inversion, have provided quantitative information about the lateral continuity of the turbidite reservoirs based on the interpretation of inverted acoustic properties and fluid-sensitive modulus attributes (P-Impedance, S-Impedance, density, and LambdaRho, in the case of deterministic inversion; and P-velocity, S-velocity, density, and lithotype (sand-shale) distributions, in the case of stochastic inversion). The quantitative use of rock/fluid information through AVA seismic data, coupled with the implementation of co-simulation via lithotype-dependent multidimensional joint probability distributions of acoustic/petrophysical properties, provides accurate 3D models of petrophysical properties such as porosity, permeability, and water saturation. Pre-stack stochastic inversion provides more realistic and higher-resolution results than those obtained from analogous deterministic techniques. Furthermore, 3D petrophysical models can be more accurately co-simulated from AVA stochastic inversion results. By combining AVA sensitivity analysis techniques with pre-stack stochastic inversion, geologic data, and awareness of inversion pitfalls, it is possible to substantially reduce the risk in exploration and development of conventional and non-conventional reservoirs. From the final integration of deterministic and stochastic inversion results with depositional models and analogous examples, the M-series reservoirs have been interpreted as stacked terminal turbidite lobes within an overall fan complex (the Miocene MCAVLU Submarine Fan System); this interpretation is consistent with previous core data interpretations and regional stratigraphic/depositional studies.
Dynamical Characteristics Common to Neuronal Competition Models
Shpiro, Asya; Curtu, Rodica; Rinzel, John; Rubin, Nava
2009-01-01
Models implementing neuronal competition by reciprocally inhibitory populations are widely used to characterize bistable phenomena such as binocular rivalry. We find common dynamical behavior in several models of this general type, which differ in their architecture, in the form of their gain functions, and in how they implement the slow process that underlies alternating dominance. We focus on examining the effect of the input strength on the rate (and existence) of oscillations. In spite of their differences, all considered models possess similar qualitative features, some of which we report here for the first time. Experimentally, dominance durations have been reported to decrease monotonically with increasing stimulus strength (such as Levelt's “Proposition IV”). The models predict this behavior; however, they also predict that at a lower range of input strength dominance durations increase with increasing stimulus strength. The nonmonotonic dependency of duration on stimulus strength is common to both deterministic and stochastic models. We conclude that additional experimental tests of Levelt's Proposition IV are needed to reconcile models and perception. PMID:17065254
Bonsall, Michael B; Dooley, Claire A; Kasparson, Anna; Brereton, Tom; Roy, David B; Thomas, Jeremy A
2014-01-01
Conservation of endangered species necessitates a full appreciation of the ecological processes affecting the regulation, limitation, and persistence of populations. These processes are influenced by birth, death, and dispersal events, and characterizing them requires careful accounting of both the deterministic and stochastic processes operating at both local and regional population levels. We combined ecological theory and observations on Allee effects by linking mathematical analysis and the spatial and temporal population dynamics patterns of a highly endangered butterfly, the high brown fritillary, Argynnis adippe. Our theoretical analysis showed that the role of density-dependent feedbacks in the presence of local immigration can influence the strength of Allee effects. Linking this theory to the analysis of the population data revealed strong evidence for both negative density dependence and Allee effects at the landscape or regional scale. These regional dynamics are predicted to be highly influenced by immigration. Using a Bayesian state-space approach, we characterized the local-scale births, deaths, and dispersal effects together with measurement and process uncertainty in the metapopulation. Some form of an Allee effect influenced almost three-quarters of these local populations. Our joint analysis of the deterministic and stochastic dynamics suggests that a conservation priority for this species would be to increase resource availability in currently occupied and, more importantly, in unoccupied sites.
Projecting the success of plant restoration with population viability analysis
Bell, T.J.; Bowles, M.L.; McEachern, A.K.; Brigham, C.A.; Schwartz, M.W.
2003-01-01
Conserving viable populations of plant species requires that they have high probabilities of long-term persistence within natural habitats, such as a chance of extinction in 100 years of less than 5% (Menges 1991, 1998; Brown 1994; Pavlik 1994; Chap. 1, this Vol.). For endangered and threatened species that have been severely reduced in range and whose habitats have been fragmented, important species conservation strategies may include augmenting existing populations or restoring new viable populations (Bowles and Whelan 1994; Chap. 2, this Vol.). Restoration objectives may include increasing population numbers to reduce extinction probability, deterministic manipulations to develop a staged cohort structure, or more complex restoration of a desired genetic structure to allow outcrossing or increase effective population size (DeMauro 1993, 1994; Bowles et al. 1993, 1998; Pavlik 1994; Knapp and Dyer 1998; Chap. 2, this Vol.). These efforts may require translocation of propagules from existing (in situ) populations, or from ex situ botanic gardens or seed storage facilities (Falk et al. 1996; Guerrant and Pavlik 1998; Chap. 2, this Vol.). Population viability analysis (PVA) can provide a critical foundation for plant restoration, as it models demographic projections used to evaluate the probability of population persistence and links plant life history with restoration strategies. It is unknown how well artificially created populations will meet demographic modeling requirements (e.g., due to artificial cohort transitions) and few, if any, PVAs have been applied to restorations. To guide application of PVA to restored populations and to illustrate potential difficulties, we examine effects of planting different life stages, model initial population sizes needed to achieve population viability, and compare demographic characteristics between natural and restored populations. We develop and compare plant population restoration viability analysis (PRVA) case studies of two plant species listed in the USA for which federal recovery planning calls for population restoration: Cirsium pitcheri, a short-lived semelparous herb, and Asclepias meadii, a long-lived iteroparous herb.
Stochastic switching in biology: from genotype to phenotype
NASA Astrophysics Data System (ADS)
Bressloff, Paul C.
2017-03-01
There has been a resurgence of interest in non-equilibrium stochastic processes in recent years, driven in part by the observation that the number of molecules (genes, mRNA, proteins) involved in gene expression are often of order 1-1000. This means that deterministic mass-action kinetics tends to break down, and one needs to take into account the discrete, stochastic nature of biochemical reactions. One of the major consequences of molecular noise is the occurrence of stochastic biological switching at both the genotypic and phenotypic levels. For example, individual gene regulatory networks can switch between graded and binary responses, exhibit translational/transcriptional bursting, and support metastability (noise-induced switching between states that are stable in the deterministic limit). If random switching persists at the phenotypic level then this can confer certain advantages to cell populations growing in a changing environment, as exemplified by bacterial persistence in response to antibiotics. Gene expression at the single-cell level can also be regulated by changes in cell density at the population level, a process known as quorum sensing. In contrast to noise-driven phenotypic switching, the switching mechanism in quorum sensing is stimulus-driven and thus noise tends to have a detrimental effect. A common approach to modeling stochastic gene expression is to assume a large but finite system and to approximate the discrete processes by continuous processes using a system-size expansion. However, there is a growing need to have some familiarity with the theory of stochastic processes that goes beyond the standard topics of chemical master equations, the system-size expansion, Langevin equations and the Fokker-Planck equation. Examples include stochastic hybrid systems (piecewise deterministic Markov processes), large deviations and the Wentzel-Kramers-Brillouin (WKB) method, adiabatic reductions, and queuing/renewal theory. The major aim of this review is to provide a self-contained survey of these mathematical methods, mainly within the context of biological switching processes at both the genotypic and phenotypic levels. However, applications to other examples of biological switching are also discussed, including stochastic ion channels, diffusion in randomly switching environments, bacterial chemotaxis, and stochastic neural networks.
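As a self-contained illustration of the kind of discrete, stochastic gene-expression dynamics discussed above (transcriptional bursting from a two-state promoter), a minimal Gillespie simulation of the telegraph model is sketched below; the rate constants are hypothetical:

```python
import numpy as np

def telegraph_ssa(k_on=0.05, k_off=0.15, k_tx=10.0, k_deg=1.0, t_end=500.0, seed=11):
    """Gillespie simulation of the two-state (telegraph) gene model:
    the promoter switches OFF<->ON and mRNA is transcribed only from the ON state."""
    rng = np.random.default_rng(seed)
    t, gene_on, mrna = 0.0, 0, 0
    times, counts = [0.0], [0]
    while t < t_end:
        rates = np.array([k_on * (1 - gene_on),   # promoter turns on
                          k_off * gene_on,        # promoter turns off
                          k_tx * gene_on,         # transcription
                          k_deg * mrna])          # mRNA degradation
        total = rates.sum()
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)
        reaction = rng.choice(4, p=rates / total)
        if reaction == 0:
            gene_on = 1
        elif reaction == 1:
            gene_on = 0
        elif reaction == 2:
            mrna += 1
        else:
            mrna -= 1
        times.append(t)
        counts.append(mrna)
    return np.array(times), np.array(counts)

times, counts = telegraph_ssa()
print("time-averaged mRNA copy number:",
      np.average(counts[:-1], weights=np.diff(times)))
```

In the limit of fast promoter switching and large copy numbers such a simulation approaches the deterministic mass-action description, which is precisely the regime in which the system-size expansion mentioned above applies.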
Christensen, Jette; El Allaki, Farouk; Vallières, André
2014-05-01
Scenario tree models with temporal discounting have been applied in four continents to support claims of freedom from animal disease. Recently, a second (new) model was developed for the same population and disease. This is a natural development because surveillance is a dynamic process that needs to adapt to changing circumstances - the difficulty is the justification for, documentation of, presentation of and the acceptance of the changes. Our objective was to propose a systematic approach to present changes to an existing scenario tree model for freedom from disease. We used the example of how we adapted the deterministic Canadian Notifiable Avian Influenza scenario tree model published in 2011 to a stochastic scenario tree model where the definition of sub-populations and the estimation of probability of introduction of the pathogen were modified. We found that the standardized approach by Vanderstichel et al. (2013) with modifications provided a systematic approach to make and present changes to an existing scenario tree model. We believe that the new 2013 CanNAISS scenario tree model is a better model than the 2011 model because the 2013 model included more surveillance data. In particular, the new data on Notifiable Avian Influenza in Canada from the last 5 years were used to improve input parameters and model structure. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
Ecological drivers of guanaco recruitment: variable carrying capacity and density dependence.
Marino, Andrea; Pascual, Miguel; Baldi, Ricardo
2014-08-01
Ungulates living in predator-free reserves offer the opportunity to study the influence of food limitation on population dynamics without the potentially confounding effects of top-down regulation or livestock competition. We assessed the influence of relative forage availability and population density on guanaco recruitment in two predator-free reserves in eastern Patagonia, with contrasting scenarios of population density. We also explored the relative contribution of the observed recruitment to population growth using a deterministic linear model to test the assumption that the studied populations were closed units. The observed densities increased twice as fast as our theoretical populations, indicating that marked immigration has taken place during the recovery phase experienced by both populations, thus we rejected the closed-population assumption. Regarding the factors driving variation in recruitment, in the low- to medium-density setting, we found a positive linear relationship between recruitment and surrogates of annual primary production, whereas no density dependence was detected. In contrast, in the high-density scenario, both annual primary production and population density showed marked effects, indicating a positive relationship between recruitment and per capita food availability above a food-limitation threshold. Our results support the idea that environmental carrying capacity fluctuates in response to climatic variation, and that these fluctuations have relevant consequences for herbivore dynamics, such as amplifying density dependence in drier years. We conclude that including the coupling between environmental variability in resources and density dependence is crucial to model ungulate population dynamics; to overlook temporal changes in carrying capacity may even mask density dependence as well as other important processes.
Identification of gene regulation models from single-cell data
NASA Astrophysics Data System (ADS)
Weber, Lisa; Raymond, William; Munsky, Brian
2018-09-01
In quantitative analyses of biological processes, one may use many different scales of models (e.g. spatial or non-spatial, deterministic or stochastic, time-varying or at steady-state) or many different approaches to match models to experimental data (e.g. model fitting or parameter uncertainty/sloppiness quantification with different experiment designs). These different analyses can lead to surprisingly different results, even when applied to the same data and the same model. We use a simplified gene regulation model to illustrate many of these concerns, especially for ODE analyses of deterministic processes, chemical master equation and finite state projection analyses of heterogeneous processes, and stochastic simulations. For each analysis, we employ MATLAB and PYTHON software to consider a time-dependent input signal (e.g. a kinase nuclear translocation) and several model hypotheses, along with simulated single-cell data. We illustrate different approaches (e.g. deterministic and stochastic) to identify the mechanisms and parameters of the same model from the same simulated data. For each approach, we explore how uncertainty in parameter space varies with respect to the chosen analysis approach or specific experiment design. We conclude with a discussion of how our simulated results relate to the integration of experimental and computational investigations to explore signal-activated gene expression models in yeast (Neuert et al 2013 Science 339 584–7) and human cells (Senecal et al 2014 Cell Rep. 8 75–83).
NASA Astrophysics Data System (ADS)
Wei, Y.; Thomas, S.; Zhou, H.; Arcas, D.; Titov, V. V.
2017-12-01
The increasing potential tsunami hazards pose great challenges for infrastructure along the coastlines of the U.S. Pacific Northwest. Tsunami impact at a coastal site is usually assessed from deterministic scenarios based on 10,000 years of geological records in the Cascadia Subduction Zone (CSZ). Aside from these deterministic methods, the new ASCE 7-16 tsunami provisions provide engineering design criteria for tsunami loads on buildings based on a probabilistic approach. This work develops a site-specific model near Newport, OR using high-resolution grids, and computes tsunami inundation depths and velocities at the study site resulting from credible probabilistic and deterministic earthquake sources in the Cascadia Subduction Zone. Three Cascadia scenarios, namely two deterministic scenarios (XXL1 and L1) and a 2,500-yr probabilistic scenario compliant with the new ASCE 7-16 standard, are simulated using a combination of a depth-averaged shallow water model for offshore propagation and a Boussinesq-type model for onshore inundation. We discuss the methods and procedure used to obtain the 2,500-year probabilistic scenario for Newport that is compliant with the ASCE 7-16 tsunami provisions. We provide details of the model results, particularly the inundation depth and flow speed, for a new building at Newport, Oregon, which will also be designated as a tsunami vertical evacuation shelter. We show that the ASCE 7-16 consistent hazards lie between those obtained from the deterministic L1 and XXL1 scenarios, and that the greatest impact on the building may come from later waves. As a further step, we use the inundation model results to numerically compute tracks of large vessels in the vicinity of the building site and estimate whether these vessels would impact the building site during the extreme XXL1 and ASCE 7-16 hazard-consistent scenarios. A two-step study is carried out, first tracking massless particles and then large vessels with assigned mass, accounting for drag force, inertial force, ship grounding and mooring. The simulation results show that none of the large vessels impact the building site in any of the tested scenarios.
NASA Astrophysics Data System (ADS)
Qiu, Kang; Wang, Li-Fang; Shen, Jian; Yousif, Alssadig A. M.; He, Peng; Shao, Dan-Dan; Zhang, Xiao-Min; Kirunda, John B.; Jia, Ya
2016-11-01
Based on a deterministic continuous model of cell population dynamics in the colonic crypt and in colorectal cancer, we propose four combinations of feedback mechanisms in the differentiation from stem cells (SCs) to transit cells (TCs) and then to differentiated cells (DCs): the double linear (LL), the linear and saturating (LS), the saturating and linear (SL), and the double saturating (SS) feedbacks. The relative fluctuations of the populations of SCs, TCs, and DCs around their equilibrium states under the four feedback mechanisms are studied using the Langevin method. As the net growth rate of TCs increases, the Fano factors of TCs and DCs rise to a peak in a transient phase and then increase again to infinity in the cases of LS and SS feedbacks. This “up-down-up” characteristic of the Fano factor (resembling the van der Waals loop) demonstrates that a transient phase exists between the normal and cancerous phases; these findings suggest that a mathematical model with LS or SS feedback may better elucidate the dynamics of the normal and abnormal (cancerous) phases.
Demographic variation and conservation of the narrow endemic plant Ranunculus weyleri
NASA Astrophysics Data System (ADS)
Cursach, Joana; Besnard, Aurélien; Rita, Juan; Fréville, Hélène
2013-11-01
Ranunculus weyleri is a narrow endemic protected plant from Majorca Island. It is known from only five populations located in two mountain areas 48 km apart. Using demographic data collected from 2007 to 2010, we assessed the demographic status of two populations, font des Coloms (FC) and talaia Moreia (TM), using Integral Projection Models (IPMs). We showed that neither population was declining under a deterministic model. Population FC was stable (λ = 1.026, CI95% = 0.965-1.093), while population TM showed signs of demographic expansion (λ = 1.113, CI95% = 1.032-1.219). Plant survival, flowering probability and the mean number of seedlings per floral peduncle were lower in TM, whereas growth and the number of floral peduncles per reproductive plant were lower in FC. Elasticity analyses showed that management strategies increasing plant survival and growth would be the most efficient at increasing λ in both populations. Herbivory pressure by goats has been shown to be high in TM, resulting in a high predation rate on floral peduncles. Controlling goat pressure may thus represent a promising management option, provided that a negative impact of goat herbivory can be demonstrated on survival and growth, which are the most critical parts of the life cycle in this species. Meanwhile, initiating long-term monitoring is of crucial importance to gain more insight into the relationships between environmental variation, plant performance and population dynamics.
Dynamics of the double-crested cormorant population on Lake Ontario
Blackwell, Bradley F.; Stapanian, Martin A.; Weseloh, D.V. Chip
2002-01-01
After nearly 30 years of recolonization and expansion across North America, the double-crested cormorant (Phalacrocorax auritus) occupies the role of a perceived and, in some situations, realized threat to fish stocks and other resources. However, population data necessary to plan, defend, and implement management of this species are few. Our purpose was to gain insight into the relative contribution of various population parameters to the overall rate of population growth and identify data needs critical to improving our understanding of the dynamics of double-crested cormorant populations. We demonstrated the construction of a biologically reasonable representation of cormorant population growth on Lake Ontario (1979-2000) by referencing literature values for fertility, age at first breeding, and survival. These parameters were incorporated into a deterministic stage-classified matrix model. By calculating the elasticity of matrix elements (i.e., stage-specific fertility and survival), we found that cormorant population growth on Lake Ontario was most sensitive to survival of birds about to turn age 3 and older. Finally, we demonstrated how this information could be used to evaluate management scenarios and direct future research by simulating potential environmental effects on fertility and survival, as well as a 5-year egg-oiling program. We also demonstrated that survival of older birds exerts more effective population control than changes in fertility.
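The elasticity calculation referred to above follows the standard matrix-population recipe: elasticities measure the proportional contribution of each matrix element to the dominant eigenvalue (the asymptotic growth rate). The sketch below uses placeholder vital rates, not the cormorant estimates.

```python
# Generic sketch of elasticity analysis for a stage-classified matrix model.
# The matrix entries below are placeholders, not the cormorant values.
import numpy as np

A = np.array([[0.0, 0.0, 1.5, 2.0],    # stage-specific fertilities (top row)
              [0.5, 0.0, 0.0, 0.0],    # survival/transition probabilities
              [0.0, 0.7, 0.0, 0.0],
              [0.0, 0.0, 0.8, 0.85]])

vals, W = np.linalg.eig(A)
i = np.argmax(vals.real)
lam = vals.real[i]
w = np.abs(W[:, i].real)                       # right eigenvector: stable stage distribution
vals_t, V = np.linalg.eig(A.T)
j = np.argmax(vals_t.real)
v = np.abs(V[:, j].real)                       # left eigenvector: reproductive values

# Sensitivities s_ij = v_i w_j / <v, w>; elasticities e_ij = (a_ij / lambda) * s_ij
S = np.outer(v, w) / (v @ w)
E = (A / lam) * S
print("lambda =", round(lam, 3))
print("elasticity matrix (entries sum to 1 over the whole matrix):")
print(np.round(E, 3))
```

Comparing the elasticities of the survival entries against those of the fertility entries is what underlies the conclusion that adult survival exerts more control over growth than fertility.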
Economic analysis of interventions to improve village chicken production in Myanmar.
Henning, J; Morton, J; Pym, R; Hla, T; Sunn, K; Meers, J
2013-07-01
A cost-benefit analysis using deterministic and stochastic modelling was conducted to identify the net benefits for households that adopt (1) vaccination of individual birds against Newcastle disease (ND) or (2) improved management of chick rearing by providing coops for the protection of chicks from predation and chick starter feed inside a creep feeder to support chicks' nutrition in village chicken flocks in Myanmar. Partial budgeting was used to assess the additional costs and benefits associated with each of the two interventions tested relative to neither strategy. In the deterministic model, over the first 3 years after the introduction of the interventions, the cumulative sum of the net differences from neither strategy was 13,189Kyat for ND vaccination and 77,645Kyat for improved chick management (effective exchange rate in 2005: 1000Kyat=1$US). Both interventions were also profitable after discounting over a 10-year period; Net Present Values for ND vaccination and improved chick management were 30,791 and 167,825Kyat, respectively. The Benefit-Cost Ratio for ND vaccination was very high (28.8). This was lower for improved chick management, due to greater costs of the intervention, but still favourable at 4.7. Using both interventions concurrently yielded a Net Present Value of 470,543Kyat and a Benefit-Cost Ratio of 11.2 over the 10-year period in the deterministic model. Using the stochastic model, for the first 3 years following the introduction of the interventions, the mean cumulative sums of the net difference were similar to those values obtained from the deterministic model. Sensitivity analysis indicated that the cumulative net differences were strongly influenced by grower bird sale income, particularly under improved chick management. The effects of the strategies on odds of households selling and consuming birds after 7 months, and numbers of birds being sold or consumed after this period also influenced profitability. Cost variations for equipment used under improved chick management were not markedly associated with profitability. Net Present Values and Benefit-Cost Ratios discounted over a 10-year period were also similar to the deterministic model when mean values obtained through stochastic modelling were used. In summary, the study showed that ND vaccination and improved chick management can improve the viability and profitability of village chicken production in Myanmar. Copyright © 2013 Elsevier B.V. All rights reserved.
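The Net Present Value and Benefit-Cost Ratio figures quoted above come from standard discounting of yearly incremental benefits and costs. The sketch below shows that arithmetic with placeholder cash flows and an assumed discount rate; it does not reproduce the study's partial budgets.

```python
# Sketch of the partial-budget arithmetic behind NPV and Benefit-Cost Ratio.
# Yearly incremental benefits/costs and the discount rate are placeholders,
# not the values used in the Myanmar study.
benefits = [20000.0] * 10          # extra income per year (Kyat), years 1..10
costs    = [5000.0] * 10           # extra costs per year (Kyat)
rate = 0.10                        # annual discount rate (assumed)

disc_b = sum(b / (1 + rate) ** (t + 1) for t, b in enumerate(benefits))
disc_c = sum(c / (1 + rate) ** (t + 1) for t, c in enumerate(costs))

npv = disc_b - disc_c              # Net Present Value of the intervention
bcr = disc_b / disc_c              # Benefit-Cost Ratio
print(f"NPV over 10 years: {npv:,.0f} Kyat   Benefit-Cost Ratio: {bcr:.1f}")
```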
Spatio-Temporal Data Analysis at Scale Using Models Based on Gaussian Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stein, Michael
Gaussian processes are the most commonly used statistical model for spatial and spatio-temporal processes that vary continuously. They are broadly applicable in the physical sciences and engineering and are also frequently used to approximate the output of complex computer models, deterministic or stochastic. We undertook research related to theory, computation, and applications of Gaussian processes as well as some work on estimating extremes of distributions for which a Gaussian process assumption might be inappropriate. Our theoretical contributions include the development of new classes of spatial-temporal covariance functions with desirable properties and new results showing that certain covariance models lead to predictions with undesirable properties. To understand how Gaussian process models behave when applied to deterministic computer models, we derived what we believe to be the first significant results on the large sample properties of estimators of parameters of Gaussian processes when the actual process is a simple deterministic function. Finally, we investigated some theoretical issues related to maxima of observations with varying upper bounds and found that, depending on the circumstances, standard large sample results for maxima may or may not hold. Our computational innovations include methods for analyzing large spatial datasets when observations fall on a partially observed grid and methods for estimating parameters of a Gaussian process model from observations taken by a polar-orbiting satellite. In our application of Gaussian process models to deterministic computer experiments, we carried out some matrix computations that would have been infeasible using even extended precision arithmetic by focusing on special cases in which all elements of the matrices under study are rational and using exact arithmetic. The applications we studied include total column ozone as measured from a polar-orbiting satellite, sea surface temperatures over the Pacific Ocean, and annual temperature extremes at a site in New York City. In each of these applications, our theoretical and computational innovations were directly motivated by the challenges posed by analyzing these and similar types of data.
Bailey, Michael M.; Zydlewski, Joseph D.
2013-01-01
Hatchery supplementation has been widely used as a restoration technique for American Shad Alosa sapidissima on the East Coast of the USA, but results have been equivocal. In the Penobscot River, Maine, dam removals and other improvements to fish passage will likely reestablish access to the majority of this species’ historic spawning habitat. Additional efforts being considered include the stocking of larval American Shad. The decision about whether to stock a river system undergoing restoration should be made after evaluating the probability of natural recolonization and examining the costs and benefits of potentially accelerating recovery using a stocking program. However, appropriate evaluation can be confounded by a dearth of information about the starting population size and age structure of the remnant American Shad spawning run in the river. We used the Penobscot River as a case study to assess the theoretical sensitivity of recovery time to either scenario (stocking or not) by building a deterministic model of an American Shad population. This model is based on the best available estimates of size at age, fecundity, rate of iteroparity, and recruitment. Density dependence was imposed, such that the population reached a plateau at an arbitrary recovery goal of 633,000 spawning adults. Stocking had a strong accelerating effect on the time to modeled recovery (as measured by the time to reach 50% of the recovery goal) in the base model, but stocking had diminishing effects with larger population sizes. There is a diminishing return to stocking when the starting population is modestly increased. With a low starting population (a spawning run of 1,000), supplementation with 12 million larvae annually accelerated modeled recovery by 12 years. Only a 2-year acceleration was observed if the starting population was 15,000. Such a heuristic model may aid managers in assessing the costs and benefits of stocking by incorporating a structured decision framework.
Finney, Charles E.; Kaul, Brian C.; Daw, C. Stuart; ...
2015-02-18
Here we review developments in the understanding of cycle to cycle variability in internal combustion engines, with a focus on spark-ignited and premixed combustion conditions. Much of the research on cyclic variability has focused on stochastic aspects, that is, features that can be modeled as inherently random with no short term predictability. In some cases, models of this type appear to work very well at describing experimental observations, but the lack of predictability limits control options. Also, even when the statistical properties of the stochastic variations are known, it can be very difficult to discern their underlying physical causes and thus mitigate them. Some recent studies have demonstrated that under some conditions, cyclic combustion variations can have a relatively high degree of low dimensional deterministic structure, which implies some degree of predictability and potential for real time control. These deterministic effects are typically more pronounced near critical stability limits (e.g. near tipping points associated with ignition or flame propagation), such as during highly dilute fueling or near the onset of homogeneous charge compression ignition. We review recent progress in experimental and analytical characterization of cyclic variability where low dimensional, deterministic effects have been observed. We describe some theories about the sources of these dynamical features and discuss prospects for interactive control and improved engine designs. In conclusion, taken as a whole, the research summarized here implies that the deterministic component of cyclic variability will become a pivotal issue (and potential opportunity) as engine manufacturers strive to meet aggressive emissions and fuel economy regulations in the coming decades.
NASA Astrophysics Data System (ADS)
Ma, Zhanshan (Sam)
In evolutionary computing (EC), population size is one of the critical parameters that a researcher has to deal with. Hence, it was no surprise that the pioneers of EC, such as De Jong (1975) and Holland (1975), had already studied population sizing from the very beginning of EC. What is perhaps surprising is that more than three decades later, we still largely depend on experience or an ad-hoc trial-and-error approach to set the population size. For example, in a recent monograph, Eiben and Smith (2003) indicated: "In almost all EC applications, the population size is constant and does not change during the evolutionary search." Despite enormous research on this issue in recent years, we still lack a well accepted theory for population sizing. In this paper, I propose to develop a population dynamics theory for EC with inspiration from the population dynamics theory of biological populations in nature. Essentially, the EC population is considered as a dynamic system over time (generations) and space (search space or fitness landscape), similar to the spatial and temporal dynamics of biological populations in nature. With this conceptual mapping, I propose to 'transplant' the biological population dynamics theory to EC via three steps: (i) experimentally test the feasibility, i.e. whether or not emulating natural population dynamics improves EC performance; (ii) comparatively study the underlying mechanisms, i.e. why there are improvements, primarily via statistical modeling analysis; (iii) conduct theoretical analysis with theoretical models such as percolation theory and extended evolutionary game theory that are generally applicable to both EC and natural populations. This article is a summary of a series of studies we have performed to achieve the general goal [27][30]-[32]. In the following, I start with an extremely brief introduction to the theory and models of natural population dynamics (Sections 1 & 2). In Sections 4 to 6, I briefly discuss three categories of population dynamics models: deterministic modeling with the logistic chaos map as an example, stochastic modeling with spatial distribution patterns as an example, as well as survival analysis and extended evolutionary game theory (EEGT) modeling. Sample experiment results with genetic algorithms (GAs) are presented to demonstrate the applications of these models. The proposed EC population dynamics approach also makes survival selection largely unnecessary or much simplified, since individuals are naturally selected (controlled) by the mathematical models for EC population dynamics.
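As a concrete example of the deterministic modeling category mentioned above, the logistic chaos map can drive generation-to-generation changes in a population size. The sketch below is a minimal illustration with assumed bounds, not the experimental setup of the cited studies.

```python
# Minimal sketch of the logistic map x_{t+1} = r * x_t * (1 - x_t), the
# deterministic-chaos example mentioned above; here it is used to scale an
# EC population size between assumed bounds (all parameter values illustrative).
def logistic_map(r, x0, n):
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

min_pop, max_pop = 20, 200                      # assumed population-size bounds
sizes = [int(min_pop + (max_pop - min_pop) * x) for x in logistic_map(3.9, 0.5, 30)]
print(sizes)   # chaotic, yet fully deterministic, sequence of population sizes
```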
Effect of Dedifferentiation on Time to Mutation Acquisition in Stem Cell-Driven Cancers
Jilkine, Alexandra; Gutenkunst, Ryan N.
2014-01-01
Accumulating evidence suggests that many tumors have a hierarchical organization, with the bulk of the tumor composed of relatively differentiated short-lived progenitor cells that are maintained by a small population of undifferentiated long-lived cancer stem cells. It is unclear, however, whether cancer stem cells originate from normal stem cells or from dedifferentiated progenitor cells. To address this, we mathematically modeled the effect of dedifferentiation on carcinogenesis. We considered a hybrid stochastic-deterministic model of mutation accumulation in both stem cells and progenitors, including dedifferentiation of progenitor cells to a stem cell-like state. We performed exact computer simulations of the emergence of tumor subpopulations with two mutations, and we derived semi-analytical estimates for the waiting time distribution to fixation. Our results suggest that dedifferentiation may play an important role in carcinogenesis, depending on how stem cell homeostasis is maintained. If the stem cell population size is held strictly constant (due to all divisions being asymmetric), we found that dedifferentiation acts like a positive selective force in the stem cell population and thus speeds carcinogenesis. If the stem cell population size is allowed to vary stochastically with density-dependent reproduction rates (allowing both symmetric and asymmetric divisions), we found that dedifferentiation beyond a critical threshold leads to exponential growth of the stem cell population. Thus, dedifferentiation may play a crucial role, the common modeling assumption of constant stem cell population size may not be adequate, and further progress in understanding carcinogenesis demands a more detailed mechanistic understanding of stem cell homeostasis. PMID:24603301
Variational principles for stochastic fluid dynamics
Holm, Darryl D.
2015-01-01
This paper derives stochastic partial differential equations (SPDEs) for fluid dynamics from a stochastic variational principle (SVP). The paper proceeds by taking variations in the SVP to derive stochastic Stratonovich fluid equations; writing their Itô representation; and then investigating the properties of these stochastic fluid models in comparison with each other, and with the corresponding deterministic fluid models. The circulation properties of the stochastic Stratonovich fluid equations are found to closely mimic those of the deterministic ideal fluid models. As with deterministic ideal flows, motion along the stochastic Stratonovich paths also preserves the helicity of the vortex field lines in incompressible stochastic flows. However, these Stratonovich properties are not apparent in the equivalent Itô representation, because they are disguised by the quadratic covariation drift term arising in the Stratonovich to Itô transformation. This term is a geometric generalization of the quadratic covariation drift term already found for scalar densities in Stratonovich's famous 1966 paper. The paper also derives motion equations for two examples of stochastic geophysical fluid dynamics; namely, the Euler–Boussinesq and quasi-geostrophic approximations. PMID:27547083
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutjahr, A.L.; Kincaid, C.T.; Mercer, J.W.
1987-04-01
The objective of this report is to summarize the various modeling approaches that were used to simulate solute transport in a variably saturated medium. In particular, the technical strengths and weaknesses of each approach are discussed, and conclusions and recommendations for future studies are made. Five models are considered: (1) one-dimensional analytical and semianalytical solutions of the classical deterministic convection-dispersion equation (van Genuchten, Parker, and Kool, this report); (2) one-dimensional simulation using a continuous-time Markov process (Knighton and Wagenet, this report); (3) one-dimensional simulation using the time domain method and the frequency domain method (Duffy and Al-Hassan, this report); (4) one-dimensional numerical approach that combines a solution of the classical deterministic convection-dispersion equation with a chemical equilibrium speciation model (Cederberg, this report); and (5) three-dimensional numerical solution of the classical deterministic convection-dispersion equation (Huyakorn, Jones, Parker, Wadsworth, and White, this report). As part of the discussion, the input data and modeling results are summarized. The models were used in a data analysis mode, as opposed to a predictive mode. Thus, the following discussion will concentrate on the data analysis aspects of model use. Also, all the approaches were similar in that they were based on a convection-dispersion model of solute transport. Each discussion addresses the modeling approaches in the order listed above.
Bayesian probabilistic population projections for all countries.
Raftery, Adrian E; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K
2012-08-28
Projections of countries' future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a Bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using Bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out-of-sample experiment in which data from 1950-1990 are used for estimation and applied to predict 1990-2010. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of the oldest old from about 2050 and that they underestimate uncertainty for high-fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20-64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades.
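A heavily simplified way to see how probabilistic vital-rate inputs translate into probabilistic population outputs is to wrap a deterministic cohort-component (Leslie-matrix) projection in a Monte Carlo loop. The sketch below uses made-up age classes and rates; it is not the Bayesian hierarchical model described above.

```python
# Highly simplified sketch of a probabilistic projection: sample vital rates,
# run a deterministic cohort-component (Leslie-matrix) projection for each
# sample, and read off prediction intervals.  Age structure, rates, and their
# spreads are illustrative, not UN data.
import numpy as np

rng = np.random.default_rng(1)
start = np.array([100.0, 80.0, 40.0])        # initial population by age class (thousands)

totals = []
for _ in range(1000):                        # Monte Carlo over uncertain rates
    fert = rng.normal([0.0, 0.9, 0.2], 0.1).clip(0)      # births per person per step
    surv = rng.normal([0.98, 0.95], 0.01).clip(0, 1)     # survival to the next class
    L = np.zeros((3, 3))
    L[0, :] = fert
    L[1, 0], L[2, 1] = surv
    pop = start.copy()
    for _ in range(4):                       # project four steps ahead
        pop = L @ pop
    totals.append(pop.sum())

lo, med, hi = np.percentile(totals, [5, 50, 95])
print(f"projected total: median {med:.0f}, 90% interval [{lo:.0f}, {hi:.0f}]")
```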
Assortative mating without assortative preference
Xie, Yu; Cheng, Siwei; Zhou, Xiang
2015-01-01
Assortative mating—marriage of a man and a woman with similar social characteristics—is a commonly observed phenomenon. In the existing literature in both sociology and economics, this phenomenon has mainly been attributed to individuals’ conscious preferences for assortative mating. In this paper, we show that patterns of assortative mating may arise from another structural source even if individuals do not have assortative preferences or possess complementary attributes: dynamic processes of marriages in a closed system. For a given cohort of youth in a finite population, as the percentage of married persons increases, unmarried persons who newly enter marriage are systematically different from those who married earlier, giving rise to the phenomenon of assortative mating. We use microsimulation methods to illustrate this dynamic process, using first the conventional deterministic Gale–Shapley model, then a probabilistic Gale–Shapley model, and then two versions of the encounter mating model. PMID:25918366
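The deterministic Gale-Shapley model mentioned above is the classic deferred-acceptance algorithm. The sketch below shows a minimal proposer-optimal implementation on toy preference lists; it is not the authors' microsimulation code.

```python
# Minimal deferred-acceptance (Gale-Shapley) sketch of the matching step such
# a microsimulation could build on; the preference lists are toy data.
def gale_shapley(men_prefs, women_prefs):
    """Men propose; returns a stable matching as {woman: man}."""
    free = list(men_prefs)                       # currently unmatched men
    next_prop = {m: 0 for m in men_prefs}        # index of next woman to propose to
    rank = {w: {m: i for i, m in enumerate(p)} for w, p in women_prefs.items()}
    engaged = {}                                 # woman -> man
    while free:
        m = free.pop(0)
        w = men_prefs[m][next_prop[m]]
        next_prop[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:   # w prefers the new proposer
            free.append(engaged[w])
            engaged[w] = m
        else:
            free.append(m)                       # w rejects m; he proposes again later
    return engaged

men = {"m1": ["w1", "w2"], "m2": ["w1", "w2"]}
women = {"w1": ["m2", "m1"], "w2": ["m1", "m2"]}
print(gale_shapley(men, women))   # {'w1': 'm2', 'w2': 'm1'}
```

In the dynamic-marriage argument above, running such a matching step repeatedly on a shrinking pool of unmarried persons is what produces assortative patterns even without assortative preferences.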
From statistical proofs of the Kochen-Specker theorem to noise-robust noncontextuality inequalities
NASA Astrophysics Data System (ADS)
Kunjwal, Ravi; Spekkens, Robert W.
2018-05-01
The Kochen-Specker theorem rules out models of quantum theory wherein projective measurements are assigned outcomes deterministically and independently of context. This notion of noncontextuality is not applicable to experimental measurements because these are never free of noise and thus never truly projective. For nonprojective measurements, therefore, one must drop the requirement that an outcome be assigned deterministically in the model and merely require that it be assigned a distribution over outcomes in a manner that is context-independent. By demanding context independence in the representation of preparations as well, one obtains a generalized principle of noncontextuality that also supports a quantum no-go theorem. Several recent works have shown how to derive inequalities on experimental data which, if violated, demonstrate the impossibility of finding a generalized-noncontextual model of this data. That is, these inequalities do not presume quantum theory and, in particular, they make sense without requiring an operational analog of the quantum notion of projectiveness. We here describe a technique for deriving such inequalities starting from arbitrary proofs of the Kochen-Specker theorem. It extends significantly previous techniques that worked only for logical proofs, which are based on sets of projective measurements that fail to admit of any deterministic noncontextual assignment, to the case of statistical proofs, which are based on sets of projective measurements that do admit of some deterministic noncontextual assignments, but not enough to explain the quantum statistics.
Convergence studies of deterministic methods for LWR explicit reflector methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Canepa, S.; Hursin, M.; Ferroukhi, H.
2013-07-01
The standard approach in modern 3-D core simulators, employed either for steady-state or transient simulations, is to use Albedo coefficients or explicit reflectors at the core axial and radial boundaries. In the latter approach, few-group homogenized nuclear data are a priori produced with lattice transport codes using 2-D reflector models. Recently, the explicit reflector methodology of the deterministic CASMO-4/SIMULATE-3 code system was identified to potentially constitute one of the main sources of errors for core analyses of the Swiss operating LWRs, which all belong to the GII design. Considering that some of the new GIII designs will rely on very different reflector concepts, a review and assessment of the reflector methodology for various LWR designs appeared relevant. Therefore, the purpose of this paper is to first recall the concepts of the explicit reflector modelling approach as employed by CASMO/SIMULATE. Then, for selected reflector configurations representative of both GII and GIII designs, a benchmarking of the few-group nuclear data produced with the deterministic lattice code CASMO-4 and its successor CASMO-5 is conducted. On this basis, a convergence study with regard to geometrical requirements when using deterministic methods with 2-D homogeneous models is conducted, and the effect on the downstream 3-D core analysis accuracy is evaluated for a typical GII reflector design in order to assess the results against available plant measurements. (authors)
The Deterministic Information Bottleneck
NASA Astrophysics Data System (ADS)
Strouse, D. J.; Schwab, David
2015-03-01
A fundamental and ubiquitous task that all organisms face is prediction of the future based on past sensory experience. Since an individual's memory resources are limited and costly, however, there is a tradeoff between memory cost and predictive payoff. The information bottleneck (IB) method (Tishby, Pereira, & Bialek 2000) formulates this tradeoff as a mathematical optimization problem using an information theoretic cost function. IB encourages storing as few bits of past sensory input as possible while selectively preserving the bits that are most predictive of the future. Here we introduce an alternative formulation of the IB method, which we call the deterministic information bottleneck (DIB). First, we argue for an alternative cost function, which better represents the biologically-motivated goal of minimizing required memory resources. Then, we show that this seemingly minor change has the dramatic effect of converting the optimal memory encoder from stochastic to deterministic. Next, we propose an iterative algorithm for solving the DIB problem. Additionally, we compare the IB and DIB methods on a variety of synthetic datasets, and examine the performance of retinal ganglion cell populations relative to the optimal encoding strategy for each problem.
Building a population-based diabetes register: an Italian experience.
Ballotari, Paola; Chiatamone Ranieri, Sofia; Vicentini, Massimo; Caroli, Stefania; Gardini, Andrea; Rodolfi, Rossella; Crucco, Roberto; Greci, Marina; Manicardi, Valeria; Giorgi Rossi, Paolo
2014-01-01
To describe the methodology used to set up the Reggio Emilia (northern Italy) Diabetes Register. The prevalence estimates on December 31st, 2009 are also provided. The Diabetes Register covers all residents in the Reggio Emilia province. The register was created by deterministic linkage of six routinely collected data sources through a definite algorithm able to ascertain cases and to distinguish type of diabetes and model of care: Hospital Discharge, Drug Dispensation, Biochemistry Laboratory, Disease-specific Exemption, Diabetes Outpatient Clinics, and Mortality databases. Using these data, we estimated crude prevalence on December 31st, 2009 by sex, age groups, and type of diabetes. There were 25,425 ascertained prevalent cases on December 31st, 2009. Drug Dispensation and Exemption databases made the greatest contribution to prevalence. Analyzing overlapping sources, more than 80% of cases were reported by at least two sources. Crude prevalence was 4.8% and 5.9% for the whole population and for people aged 18 years and over, respectively. Males accounted for 53.6%. Type 1 diabetes accounted for 3.8% of cases, while people with Type 2 diabetes were the overriding majority (91.2%), and Diabetes Outpatient Clinics treated 75.4% of people with Type 2 diabetes. The Register is able to quantify the burden of disease, the first step in planning, implementing, and monitoring appropriate interventions. All data sources contributed to completeness and/or accuracy of the Register. Although all cases are identified by deterministic record linkage, manual revision and General Practitioner involvement are still necessary when information is insufficient or conflicting. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
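Deterministic record linkage of the kind described joins sources on exact identifiers and applies a fixed ascertainment rule. The sketch below is a toy illustration with hypothetical column names and a simple two-source rule, not the register's actual algorithm.

```python
# Toy sketch of deterministic record linkage: sources are joined on an exact
# person identifier and a case is ascertained when it appears in at least two
# sources.  Column names and the two-source rule are illustrative only.
import pandas as pd

drugs = pd.DataFrame({"person_id": [1, 2, 3], "antidiabetic_rx": [True, True, True]})
exemptions = pd.DataFrame({"person_id": [2, 3, 4], "diabetes_exemption": [True, True, True]})
discharges = pd.DataFrame({"person_id": [3, 5], "diabetes_dx": [True, True]})

sources = {"drug": drugs, "exemption": exemptions, "hospital": discharges}
all_ids = sorted(set().union(*(df["person_id"] for df in sources.values())))
flags = pd.DataFrame({"person_id": all_ids})
for name, df in sources.items():
    flags[name] = flags["person_id"].isin(df["person_id"])   # exact-key linkage

flags["n_sources"] = flags[list(sources)].sum(axis=1)
ascertained = flags[flags["n_sources"] >= 2]                 # illustrative ascertainment rule
print(ascertained)
```

Records that fall below the rule, or that carry conflicting information across sources, are exactly the ones the abstract says still require manual review and General Practitioner involvement.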
Mathematical models of behavior of individual animals.
Tsibulsky, Vladimir L; Norman, Andrew B
2007-01-01
This review is focused on mathematical modeling of behaviors of a whole organism, with special emphasis on models with a clearly scientific approach to the problem that helps to understand the mechanisms underlying behavior. The aim is to provide an overview of old and contemporary mathematical models without complex mathematical details. Only deterministic and stochastic, but not statistical, models are reviewed. All mathematical models of behavior can be divided into two main classes. First, models that are based on the principle of teleological determinism assume that subjects choose the behavior that will lead them to a better payoff in the future. Examples are game theories and operant behavior models, both of which are based on the matching law. The second class of models is based on the principle of causal determinism, which assumes that subjects do not choose from a set of possibilities but rather are compelled to perform a predetermined behavior in response to specific stimuli. Examples are perception and discrimination models, drug effects models and individual-based population models. A brief overview of the utility of each mathematical model is provided in each section.
Pigache, Francois; Messine, Frédéric; Nogarede, Bertrand
2007-07-01
This paper deals with a deterministic and rational way to design piezoelectric transformers in radial mode. The proposed approach is based on the study of the inverse problem of design and on its reformulation as a mixed constrained global optimization problem. The methodology relies on the association of the analytical models for describing the corresponding optimization problem and on an exact global optimization software, named IBBA and developed by the second author to solve it. Numerical experiments are presented and compared in order to validate the proposed approach.
Rare event computation in deterministic chaotic systems using genealogical particle analysis
NASA Astrophysics Data System (ADS)
Wouters, J.; Bouchet, F.
2016-09-01
In this paper we address the use of rare event computation techniques to estimate small over-threshold probabilities of observables in deterministic dynamical systems. We demonstrate that genealogical particle analysis algorithms can be successfully applied to a toy model of atmospheric dynamics, the Lorenz ’96 model. We furthermore use the Ornstein-Uhlenbeck system to illustrate a number of implementation issues. We also show how a time-dependent objective function based on the fluctuation path to a high threshold can greatly improve the performance of the estimator compared to a fixed-in-time objective function.
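For orientation, the sketch below writes out the Lorenz '96 dynamics and estimates an over-threshold probability by plain direct sampling, which is the brute-force baseline that genealogical particle algorithms are designed to beat; it is not the genealogical algorithm itself, and the forcing, threshold and run lengths are arbitrary choices.

```python
# Sketch of the Lorenz '96 system and a crude direct Monte Carlo estimate of an
# over-threshold probability (brute-force baseline only, not the paper's method).
import numpy as np

def l96_rhs(x, F=8.0):
    """dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F with periodic indexing."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt=0.05):
    k1 = l96_rhs(x)
    k2 = l96_rhs(x + 0.5 * dt * k1)
    k3 = l96_rhs(x + 0.5 * dt * k2)
    k4 = l96_rhs(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

rng = np.random.default_rng(0)
threshold, hits, n_runs = 10.0, 0, 500
for _ in range(n_runs):
    x = rng.normal(0.0, 1.0, 40)          # random initial condition, 40 sites
    for _ in range(400):                  # integrate for 20 time units
        x = rk4_step(x)
    hits += x[0] > threshold              # observable: final value at site 0
# For genuinely rare events this direct estimate is mostly zeros, which is the
# shortcoming that importance splitting / genealogical algorithms address.
print("direct-sampling estimate of P(x_0 > threshold):", hits / n_runs)
```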
Pyrotechnic modeling for the NSI and pin puller
NASA Technical Reports Server (NTRS)
Powers, Joseph M.; Gonthier, Keith A.
1993-01-01
A discussion concerning the modeling of pyrotechnically driven actuators is presented in viewgraph format. The following topics are discussed: literature search, constitutive data for full-scale model, simple deterministic model, observed phenomena, and results from simple model.
Deterministic multi-zone ice accretion modeling
NASA Technical Reports Server (NTRS)
Yamaguchi, K.; Hansman, R. John, Jr.; Kazmierczak, Michael
1991-01-01
The focus here is on a deterministic model of the surface roughness transition behavior of glaze ice. The initial smooth/rough transition location, bead formation, and the propagation of the transition location are analyzed. Based on the hypothesis that the smooth/rough transition location coincides with the laminar/turbulent boundary layer transition location, a multizone model is implemented in the LEWICE code. In order to verify the effectiveness of the model, ice accretion predictions for simple cylinders calculated by the multizone LEWICE are compared to experimental ice shapes. The glaze ice shapes are found to be sensitive to the laminar surface roughness and bead thickness parameters controlling the transition location, while the ice shapes are found to be insensitive to the turbulent surface roughness.
Combining deterministic and stochastic velocity fields in the analysis of deep crustal seismic data
NASA Astrophysics Data System (ADS)
Larkin, Steven Paul
Standard crustal seismic modeling obtains deterministic velocity models which ignore the effects of wavelength-scale heterogeneity, known to exist within the Earth's crust. Stochastic velocity models are a means to include wavelength-scale heterogeneity in the modeling. These models are defined by statistical parameters obtained from geologic maps of exposed crystalline rock, and are thus tied to actual geologic structures. Combining both deterministic and stochastic velocity models into a single model allows a realistic full wavefield (2-D) to be computed. By comparing these simulations to recorded seismic data, the effects of wavelength-scale heterogeneity can be investigated. Combined deterministic and stochastic velocity models are created for two datasets, the 1992 RISC seismic experiment in southeastern California and the 1986 PASSCAL seismic experiment in northern Nevada. The RISC experiment was located in the transition zone between the Salton Trough and the southern Basin and Range province. A high-velocity body previously identified beneath the Salton Trough is constrained to pinch out beneath the Chocolate Mountains to the northeast. The lateral extent of this body is evidence for the ephemeral nature of rifting loci as a continent is initially rifted. Stochastic modeling of wavelength-scale structures above this body indicate that little more than 5% mafic intrusion into a more felsic continental crust is responsible for the observed reflectivity. Modeling of the wide-angle RISC data indicates that coda waves following PmP are initially dominated by diffusion of energy out of the near-surface basin as the wavefield reverberates within this low-velocity layer. At later times, this coda consists of scattered body waves and P to S conversions. Surface waves do not play a significant role in this coda. Modeling of the PASSCAL dataset indicates that a high-gradient crust-mantle transition zone or a rough Moho interface is necessary to reduce precritical PmP energy. Possibly related, inconsistencies in published velocity models are rectified by hypothesizing the existence of large, elongate, high-velocity bodies at the base of the crust oriented to and of similar scale as the basins and ranges at the surface. This structure would result in an anisotropic lower crust.
Gutiérrez, Simón; Fernandez, Carlos; Barata, Carlos; Tarazona, José Vicente
2009-12-20
This work presents a computer model for Risk Assessment of Basins by Ecotoxicological Evaluation (RABETOX). The model is based on whole effluent toxicity testing and water flows along a specific river basin. It is capable of estimating the risk along a river segment using deterministic and probabilistic approaches. The Henares River Basin was selected as a case study to demonstrate the importance of seasonal hydrological variations in Mediterranean regions. As model inputs, two different ecotoxicity tests (the miniaturized Daphnia magna acute test and the D.magna feeding test) were performed on grab samples from 5 waste water treatment plant effluents. Also used as model inputs were flow data from the past 25 years, water velocity measurements and precise distance measurements using Geographical Information Systems (GIS). The model was implemented into a spreadsheet and the results were interpreted and represented using GIS in order to facilitate risk communication. To better understand the bioassays results, the effluents were screened through SPME-GC/MS analysis. The deterministic model, performed each month during one calendar year, showed a significant seasonal variation of risk while revealing that September represents the worst-case scenario with values up to 950 Risk Units. This classifies the entire area of study for the month of September as "sublethal significant risk for standard species". The probabilistic approach using Monte Carlo analysis was performed on 7 different forecast points distributed along the Henares River. A 0% probability of finding "low risk" was found at all forecast points with a more than 50% probability of finding "potential risk for sensitive species". The values obtained through both the deterministic and probabilistic approximations reveal the presence of certain substances, which might be causing sublethal effects in the aquatic species present in the Henares River.
NASA Astrophysics Data System (ADS)
Ghodsi, Seyed Hamed; Kerachian, Reza; Estalaki, Siamak Malakpour; Nikoo, Mohammad Reza; Zahmatkesh, Zahra
2016-02-01
In this paper, two deterministic and stochastic multilateral, multi-issue, non-cooperative bargaining methodologies are proposed for urban runoff quality management. In the proposed methodologies, a calibrated Storm Water Management Model (SWMM) is used to simulate stormwater runoff quantity and quality for different urban stormwater runoff management scenarios, which have been defined considering several Low Impact Development (LID) techniques. In the deterministic methodology, the best management scenario, representing location and area of LID controls, is identified using the bargaining model. In the stochastic methodology, uncertainties of some key parameters of SWMM are analyzed using the info-gap theory. For each water quality management scenario, robustness and opportuneness criteria are determined based on utility functions of different stakeholders. Then, to find the best solution, the bargaining model is performed considering a combination of robustness and opportuneness criteria for each scenario based on utility function of each stakeholder. The results of applying the proposed methodology in the Velenjak urban watershed located in the northeastern part of Tehran, the capital city of Iran, illustrate its practical utility for conflict resolution in urban water quantity and quality management. It is shown that the solution obtained using the deterministic model cannot outperform the result of the stochastic model considering the robustness and opportuneness criteria. Therefore, it can be concluded that the stochastic model, which incorporates the main uncertainties, could provide more reliable results.
Deterministic Stress Modeling of Hot Gas Segregation in a Turbine
NASA Technical Reports Server (NTRS)
Busby, Judy; Sondak, Doug; Staubach, Brent; Davis, Roger
1998-01-01
Simulation of unsteady viscous turbomachinery flowfields is presently impractical as a design tool due to the long run times required. Designers rely predominantly on steady-state simulations, but these simulations do not account for some of the important unsteady flow physics. Unsteady flow effects can be modeled as source terms in the steady flow equations. These source terms, referred to as Lumped Deterministic Stresses (LDS), can be used to drive steady flow solution procedures to reproduce the time-average of an unsteady flow solution. The goal of this work is to investigate the feasibility of using inviscid lumped deterministic stresses to model unsteady combustion hot streak migration effects on the turbine blade tip and outer air seal heat loads using a steady computational approach. The LDS model is obtained from an unsteady inviscid calculation. The LDS model is then used with a steady viscous computation to simulate the time-averaged viscous solution. Both two-dimensional and three-dimensional applications are examined. The inviscid LDS model produces good results for the two-dimensional case and requires less than 10% of the CPU time of the unsteady viscous run. For the three-dimensional case, the LDS model does a good job of reproducing the time-averaged viscous temperature migration and separation as well as heat load on the outer air seal at a CPU cost that is 25% of that of an unsteady viscous computation.
Monitoring the Heavens, Today, and Tomorrow
NASA Technical Reports Server (NTRS)
Johnson, Nicholas L.
2006-01-01
The current Earth satellite population in LEO for all sizes is relatively well-established by a combination of deterministic and statistical means. At higher altitudes, the population of satellites with diameters of less than 1 m is not well defined. Although a few new sensors might become operational in the near- to mid-term, no major improvement in environment characterization is anticipated during this period. With the increasing deployment of micro- and pico-satellites and with the continued growth of the small debris population, a need exists for better space surveillance to support spacecraft design and operations.
Li, Longxiang; Xue, Donglin; Deng, Weijie; Wang, Xu; Bai, Yang; Zhang, Feng; Zhang, Xuejun
2017-11-10
In deterministic computer-controlled optical surfacing, accurate dwell time execution by computer numeric control machines is crucial in guaranteeing a high-convergence ratio for the optical surface error. It is necessary to consider the machine dynamics limitations in the numerical dwell time algorithms. In this paper, these constraints on dwell time distribution are analyzed, and a model of the equal extra material removal is established. A positive dwell time algorithm with minimum equal extra material removal is developed. Results of simulations based on deterministic magnetorheological finishing demonstrate the necessity of considering machine dynamics performance and illustrate the validity of the proposed algorithm. Indeed, the algorithm effectively facilitates the determinacy of sub-aperture optical surfacing processes.
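Dwell time computation in deterministic sub-aperture surfacing is, at its core, a non-negatively constrained deconvolution of the target removal map by the tool influence function. The 1-D sketch below illustrates only that positivity constraint with a Gaussian footprint; it does not include the paper's equal-extra-removal or machine-dynamics constraints, and all numbers are illustrative.

```python
# 1-D sketch of the non-negative dwell-time problem behind deterministic
# sub-aperture surfacing: removal = (tool influence function) applied at each
# dwell position, solved with non-negative least squares.
import numpy as np
from scipy.optimize import nnls

n = 80
x = np.linspace(-1.0, 1.0, n)
sigma = 0.06
# A[i, j]: removal at point i per unit dwell time with the tool centred at j
A = np.exp(-0.5 * ((x[:, None] - x[None, :]) / sigma) ** 2)

target = 1.0 + 0.3 * np.sin(4 * np.pi * x)    # desired removal map (arbitrary units)
dwell, resid = nnls(A, target)                # dwell >= 0 enforced by NNLS
print("max dwell:", round(dwell.max(), 3), " residual norm:", round(resid, 4))
```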
Kucza, Witold
2013-07-25
Stochastic and deterministic simulations of dispersion in cylindrical channels under Poiseuille flow have been presented. The random walk (stochastic) and the uniform dispersion (deterministic) models have been used for computation of flow injection analysis responses. These methods, coupled with the genetic algorithm and the Levenberg-Marquardt optimization methods, respectively, have been applied for determination of diffusion coefficients. The diffusion coefficients of fluorescein sodium, potassium hexacyanoferrate and potassium dichromate have been determined by means of the presented methods and FIA responses that are available in the literature. The best-fit results agree with each other and with experimental data, thus validating both presented approaches. Copyright © 2013 The Author. Published by Elsevier B.V. All rights reserved.
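The random-walk (stochastic) dispersion picture can be sketched directly: tracers are advected by the parabolic Poiseuille profile and take diffusive radial steps with reflection at the wall. The code below is a simplified 1-D radial walk with illustrative parameters; it is not the paper's model or its genetic-algorithm fitting procedure.

```python
# Random-walk sketch of dispersion in Poiseuille flow through a cylinder: each
# tracer is advected by u(r) = 2*U*(1 - (r/R)^2) and takes simplified 1-D radial
# diffusive steps with reflection at the axis and at the wall.
import numpy as np

rng = np.random.default_rng(0)
R, U, D = 1e-4, 1e-3, 1e-9          # radius (m), mean velocity (m/s), diffusivity (m^2/s)
dt, n_steps, n_part = 1e-2, 5000, 2000

r = R * np.sqrt(rng.uniform(size=n_part))      # uniform over the cross-section
z = np.zeros(n_part)
for _ in range(n_steps):
    z += 2.0 * U * (1.0 - (r / R) ** 2) * dt   # axial advection
    r += np.sqrt(2.0 * D * dt) * rng.normal(size=n_part)
    r = np.abs(r)                              # reflect at the axis
    r = np.where(r > R, 2.0 * R - r, r)        # reflect at the wall
print("mean axial position:", z.mean(), "m;  axial spread (std):", z.std(), "m")
```

The histogram of final axial positions from such a walk is what would be compared against the deterministic uniform-dispersion response when estimating diffusion coefficients.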
Deterministic SLIR model for tuberculosis disease mapping
NASA Astrophysics Data System (ADS)
Aziz, Nazrina; Diah, Ijlal Mohd; Ahmad, Nazihah; Kasim, Maznah Mat
2017-11-01
Tuberculosis (TB) occurs worldwide. It can be transmitted to others directly through the air when persons with active TB sneeze, cough or spit. In Malaysia, it was reported that TB has been recognized as one of the most infectious diseases leading to death. Disease mapping is one of the methods that can be used as a prevention strategy, since it displays a clear picture of high- and low-risk areas. An important quantity to consider when studying disease occurrence is the relative risk. The transmission of TB is studied here through a mathematical model. Therefore, in this study, deterministic SLIR models are used to estimate the relative risk of TB transmission.
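A standard deterministic SLIR system of the kind referred to above moves individuals from susceptible to latent to infectious to recovered compartments. The sketch below integrates such a system with assumed, uncalibrated rates; it is not the study's fitted model.

```python
# Minimal sketch of a standard deterministic SLIR (susceptible-latent-
# infectious-recovered) system for TB-like transmission, integrated with
# forward Euler.  Parameter values are illustrative, not the study's.
beta, sigma, gamma = 0.3, 0.05, 0.1      # transmission, activation, recovery rates
S, L, I, R = 0.99, 0.0, 0.01, 0.0        # initial fractions of the population
dt, T = 0.1, 400.0

for _ in range(int(T / dt)):
    new_inf = beta * S * I               # new latent infections
    dS = -new_inf
    dL = new_inf - sigma * L             # latent individuals progress at rate sigma
    dI = sigma * L - gamma * I
    dR = gamma * I
    S, L, I, R = S + dt * dS, L + dt * dL, I + dt * dI, R + dt * dR

print(f"final fractions  S={S:.3f}  L={L:.3f}  I={I:.3f}  R={R:.3f}")
```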
Sampled-Data Consensus of Linear Multi-agent Systems With Packet Losses.
Zhang, Wenbing; Tang, Yang; Huang, Tingwen; Kurths, Jurgen
In this paper, the consensus problem is studied for a class of multi-agent systems with sampled data and packet losses, where random and deterministic packet losses are considered, respectively. For random packet losses, a Bernoulli-distributed white sequence is used to describe packet dropouts among agents in a stochastic way. For deterministic packet losses, a switched system with stable and unstable subsystems is employed to model packet dropouts in a deterministic way. The purpose of this paper is to derive consensus criteria, such that linear multi-agent systems with sampled-data and packet losses can reach consensus. By means of the Lyapunov function approach and the decomposition method, the design problem of a distributed controller is solved in terms of convex optimization. The interplay among the allowable bound of the sampling interval, the probability of random packet losses, and the rate of deterministic packet losses are explicitly derived to characterize consensus conditions. The obtained criteria are closely related to the maximum eigenvalue of the Laplacian matrix versus the second minimum eigenvalue of the Laplacian matrix, which reveals the intrinsic effect of communication topologies on consensus performance. Finally, simulations are given to show the effectiveness of the proposed results.
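The setting above can be illustrated by a toy simulation in which single-integrator agents update at sampling instants and each transmitted neighbour state is lost with a Bernoulli probability. The sketch below shows only this modeling setup, not the paper's Lyapunov-based controller synthesis or convex-optimization design, and all gains and probabilities are assumed values.

```python
# Toy simulation of sampled-data consensus with Bernoulli packet losses:
# single-integrator agents update at sampling instants using only the
# neighbour measurements whose packets arrive.
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 1],           # adjacency of an undirected ring of 4 agents
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])
x = rng.normal(0, 5, 4)               # initial agent states
h, gain, p_loss = 0.1, 0.4, 0.3       # sampling interval, gain, drop probability

for _ in range(200):                  # 200 sampling instants
    arrive = (rng.uniform(size=A.shape) < 1 - p_loss) * A   # links surviving this step
    u = gain * (arrive * (x[None, :] - x[:, None])).sum(axis=1)
    x = x + h * u                     # zero-order hold between samples
print("state spread after 200 samples:", x.max() - x.min())
```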
The Bilinear Product Model of Hysteresis Phenomena
NASA Astrophysics Data System (ADS)
Kádár, György
1989-01-01
In ferromagnetic materials, non-reversible magnetization processes are represented by rather complex hysteresis curves. The phenomenological description of such curves requires the use of multi-valued, yet unambiguous, deterministic functions. The history-dependent calculation of consecutive Everett integrals of the two-variable Preisach function can account for the main features of hysteresis curves in uniaxial magnetic materials. The traditional Preisach model has recently been modified on the basis of population dynamics considerations, removing the unrealistic congruency property of the model. The Preisach function was proposed to be a product of two factors of distinct physical significance: a magnetization-dependent function taking into account the overall magnetization state of the body, and a bilinear form of a single-variable, magnetic-field-dependent switching probability function. The most important statement of the bilinear product model is that the switching process of individual particles is to be separated from the book-keeping procedure of their states. This empirical model of hysteresis can easily be extended to other irreversible physical processes, such as first-order phase transitions.
Interactive Reliability Model for Whisker-toughened Ceramics
NASA Technical Reports Server (NTRS)
Palko, Joseph L.
1993-01-01
Wider use of ceramic matrix composites (CMC) will require the development of advanced structural analysis technologies. The use of an interactive model to predict the time-independent reliability of a component subjected to multiaxial loads is discussed. The deterministic, three-parameter Willam-Warnke failure criterion serves as the theoretical basis for the reliability model. The strength parameters defining the model are assumed to be random variables, thereby transforming the deterministic failure criterion into a probabilistic criterion. The ability of the model to account for multiaxial stress states with the same unified theory is an improvement over existing models. The new model was coupled with a public-domain finite element program through an integrated design program. This allows a design engineer to predict the probability of failure of a component. A simple structural problem is analyzed using the new model, and the results are compared to existing models.
Nonclassical point of view of the Brownian motion generation via fractional deterministic model
NASA Astrophysics Data System (ADS)
Gilardi-Velázquez, H. E.; Campos-Cantón, E.
In this paper, we present a dynamical system based on the Langevin equation without a stochastic term and using fractional derivatives that exhibits properties of Brownian motion, i.e. a deterministic model to generate Brownian motion is proposed. The stochastic process is replaced by considering an additional degree of freedom in the second-order Langevin equation. Thus, it is transformed into a system of three first-order linear differential equations; additionally, α-fractional derivatives are considered, which allow us to obtain better statistical properties. Switching surfaces are established as part of the fluctuating acceleration. The final system of three α-order linear differential equations does not contain a stochastic term, so the system generates motion in a deterministic way. Nevertheless, from the time series analysis, we found that the behavior of the system exhibits statistical properties of Brownian motion, such as a linear growth in time of the mean square displacement and a Gaussian distribution. Furthermore, we use detrended fluctuation analysis to confirm the Brownian character of this motion.
Murakami, Masayoshi; Shteingart, Hanan; Loewenstein, Yonatan; Mainen, Zachary F
2017-05-17
The selection and timing of actions are subject to determinate influences such as sensory cues and internal state as well as to effectively stochastic variability. Although stochastic choice mechanisms are assumed by many theoretical models, their origin and mechanisms remain poorly understood. Here we investigated this issue by studying how neural circuits in the frontal cortex determine action timing in rats performing a waiting task. Electrophysiological recordings from two regions necessary for this behavior, medial prefrontal cortex (mPFC) and secondary motor cortex (M2), revealed an unexpected functional dissociation. Both areas encoded deterministic biases in action timing, but only M2 neurons reflected stochastic trial-by-trial fluctuations. This differential coding was reflected in distinct timescales of neural dynamics in the two frontal cortical areas. These results suggest a two-stage model in which stochastic components of action timing decisions are injected by circuits downstream of those carrying deterministic bias signals. Copyright © 2017 Elsevier Inc. All rights reserved.
Bubonic plague: a metapopulation model of a zoonosis.
Keeling, M J; Gilligan, C A
2000-01-01
Bubonic plague (Yersinia pestis) is generally thought of as a historical disease; however, it is still responsible for around 1000-3000 deaths each year worldwide. This paper expands the analysis of a model for bubonic plague that encompasses the disease dynamics in rat, flea and human populations. Some key variables of the deterministic model, including the force of infection to humans, are shown to be robust to changes in the basic parameters, although variation in the flea searching efficiency, and the movement rates of rats and fleas will be considered throughout the paper. The stochastic behaviour of the corresponding metapopulation model is discussed, with attention focused on the dynamics of rats and the force of infection at the local spatial scale. Short-lived local epidemics in rats govern the invasion of the disease and produce an irregular pattern of human cases similar to those observed. However, the endemic behaviour in a few rat subpopulations allows the disease to persist for many years. This spatial stochastic model is also used to identify the criteria for the spread to human populations in terms of the rat density. Finally, the full stochastic model is reduced to the form of a probabilistic cellular automaton, which allows the analysis of a large number of replicated epidemics in large populations. This simplified model enables us to analyse the spatial properties of rat epidemics and the effects of movement rates, and also to test whether the emergent metapopulation behaviour is a property of the local dynamics rather than the precise details of the model. PMID:11413636
Developing deterioration models for Wyoming bridges.
DOT National Transportation Integrated Search
2016-05-01
Deterioration models for the Wyoming Bridge Inventory were developed using both stochastic and deterministic models. : The selection of explanatory variables is investigated and a new method using LASSO regression to eliminate human bias : in explana...
Short-range solar radiation forecasts over Sweden
NASA Astrophysics Data System (ADS)
Landelius, Tomas; Lindskog, Magnus; Körnich, Heiner; Andersson, Sandra
2018-04-01
In this article, the performance of short-range solar radiation forecasts from the global deterministic and ensemble models of the European Centre for Medium-Range Weather Forecasts (ECMWF) is compared with that of an ensemble of the regional mesoscale model HARMONIE-AROME used by the national meteorological services in Sweden, Norway and Finland. Note, however, that only the control members and the ensemble means are included in the comparison. The models' resolutions differ considerably: 18 km for the ECMWF ensemble, 9 km for the ECMWF deterministic model, and 2.5 km for the HARMONIE-AROME ensemble. The models share the same radiation code. It turns out that they all systematically underestimate the Direct Normal Irradiance (DNI) under clear-sky conditions. Except for this shortcoming, the HARMONIE-AROME ensemble model shows the best agreement with the distribution of observed Global Horizontal Irradiance (GHI) and DNI values. During midday, the HARMONIE-AROME ensemble mean performs best. The control member of the HARMONIE-AROME ensemble also scores better than the global deterministic ECMWF model. This is an interesting result, since mesoscale models have so far not shown good results when compared to the ECMWF models. Three days with clear, mixed and cloudy skies are used to illustrate the possible added value of a probabilistic forecast. It is shown that in these cases the mesoscale ensemble could provide decision support to a grid operator in terms of forecasts of both the amount of solar power and its probabilities.
Hierarchical mark-recapture models: a framework for inference about demographic processes
Link, W.A.; Barker, R.J.
2004-01-01
The development of sophisticated mark-recapture models over the last four decades has provided fundamental tools for the study of wildlife populations, allowing reliable inference about population sizes and demographic rates based on clearly formulated models for the sampling processes. Mark-recapture models are now routinely described by large numbers of parameters. These large models provide the next challenge to wildlife modelers: the extraction of signal from noise in large collections of parameters. Pattern among parameters can be described by strong, deterministic relations (as in ultrastructural models) but is more flexibly and credibly modeled using weaker, stochastic relations. Trend in survival rates is not likely to be manifest by a sequence of values falling precisely on a given parametric curve; rather, if we could somehow know the true values, we might anticipate a regression relation between parameters and explanatory variables, in which true value equals signal plus noise. Hierarchical models provide a useful framework for inference about collections of related parameters. Instead of regarding parameters as fixed but unknown quantities, we regard them as realizations of stochastic processes governed by hyperparameters. Inference about demographic processes is based on investigation of these hyperparameters. We advocate the Bayesian paradigm as a natural, mathematically and scientifically sound basis for inference about hierarchical models. We describe analysis of capture-recapture data from an open population based on hierarchical extensions of the Cormack-Jolly-Seber model. In addition to recaptures of marked animals, we model first captures of animals and losses on capture, and are thus able to estimate survival probabilities w (i.e., the complement of death or permanent emigration) and per capita growth rates f (i.e., the sum of recruitment and immigration rates). Covariation in these rates, a feature of demographic interest, is explicitly described in the model.
Kaveh, Kamran; Veller, Carl; Nowak, Martin A
2016-08-21
Evolutionary game dynamics are often studied in the context of different population structures. Here we propose a new population structure that is inspired by simple multicellular life forms. In our model, cells reproduce but can stay together after reproduction. They reach complexes of a certain size, n, before producing single cells again. The cells within a complex derive payoff from an evolutionary game by interacting with each other. The reproductive rate of cells is proportional to their payoff. We consider all two-strategy games. We study deterministic evolutionary dynamics with mutations, and derive exact conditions for selection to favor one strategy over another. Our main result has the same symmetry as the well-known sigma condition, which has been proven for stochastic game dynamics and weak selection. For a maximum complex size of n=2 our result holds for any intensity of selection. For n≥3 it holds for weak selection. As specific examples we study the prisoner's dilemma and hawk-dove games. Our model advances theoretical work on multicellularity by allowing for frequency-dependent interactions within groups. Copyright © 2016 Elsevier Ltd. All rights reserved.
The effect of loss of immunity on noise-induced sustained oscillations in epidemics.
Chaffee, J; Kuske, R
2011-11-01
The effect of loss of immunity on sustained population oscillations about an endemic equilibrium is studied via a multiple scales analysis of a SIRS model. The analysis captures the key elements supporting the nearly regular oscillations of the infected and susceptible populations, namely, the interaction of the deterministic and stochastic dynamics together with the separation of time scales of the damping and the period of these oscillations. The derivation of a nonlinear stochastic amplitude equation describing the envelope of the oscillations yields two criteria providing explicit parameter ranges where they can be observed. These conditions are similar to those found for other applications in the context of coherence resonance, in which noise drives nearly regular oscillations in a system that is quiescent without noise. In this context the criteria indicate how loss of immunity and other factors can lead to a significant increase in the parameter range for prevalence of the sustained oscillations, without any external driving forces. Comparison of the power spectral densities of the full model and the approximation confirms that the multiple scales analysis captures nonlinear features of the oscillations.
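For readers who want to see the noise-sustained oscillations this abstract refers to, a direct stochastic simulation is a useful complement to the paper's multiple scales analysis. The sketch below is a standard Gillespie simulation of an SIRS model with our own illustrative parameters, not those of the paper; the stochastic trajectory keeps oscillating about the endemic level, whereas the corresponding deterministic system would damp toward it.

```python
import numpy as np

def gillespie_sirs(N=5000, beta=1.0, gamma=0.2, alpha=0.02, t_end=500.0, seed=1):
    """Stochastic SIRS: infection S->I, recovery I->R, loss of immunity R->S."""
    rng = np.random.default_rng(seed)
    S, I, R = N - 50, 50, 0
    t, out = 0.0, [(0.0, S, I)]
    while t < t_end and I > 0:
        rates = np.array([beta * S * I / N,   # infection
                          gamma * I,          # recovery
                          alpha * R])         # loss of immunity
        total = rates.sum()
        t += rng.exponential(1.0 / total)
        event = rng.choice(3, p=rates / total)
        if event == 0:
            S, I = S - 1, I + 1
        elif event == 1:
            I, R = I - 1, R + 1
        else:
            R, S = R - 1, S + 1
        out.append((t, S, I))
    return np.array(out)

traj = gillespie_sirs()
# The infected series fluctuates persistently around the endemic level instead of
# settling, which is the coherence-resonance-like behaviour discussed in the abstract.
print("mean infected after transient:", traj[len(traj) // 2:, 2].mean())
```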
NASA Astrophysics Data System (ADS)
Yan, Yajing; Barth, Alexander; Beckers, Jean-Marie; Candille, Guillem; Brankart, Jean-Michel; Brasseur, Pierre
2015-04-01
Sea surface height, sea surface temperature and temperature profiles at depth collected between January and December 2005 are assimilated into a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. 60 ensemble members are generated by adding realistic noise to the forcing parameters related to the temperature. The ensemble is diagnosed and validated by comparison between the ensemble spread and the model/observation difference, as well as by a rank histogram, before the assimilation experiments. An incremental analysis update scheme is applied in order to reduce spurious oscillations due to the model state correction. The results of the assimilation are assessed according to both deterministic and probabilistic metrics, with observations used in the assimilation experiments and independent observations, which goes further than most previous studies and constitutes one of the original points of this paper. Regarding the deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations in order to diagnose the ensemble distribution properties in a deterministic way. Regarding the probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system in terms of reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centred random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system. The improvement from the assimilation is demonstrated using these validation metrics. Finally, the deterministic validation and the probabilistic validation are analysed jointly. The consistency and complementarity between both validations are highlighted. Highly reliable situations, in which the RMS error and the CRPS give the same information, are identified for the first time in this paper.
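The two probabilistic scores used here, the empirical CRPS and the RCRV bias/dispersion decomposition, are generic and can be computed from any ensemble/observation pair. The sketch below is a minimal illustration on synthetic data with our own assumed observation error of 0.1, not the assimilation system of the paper.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Empirical CRPS of one ensemble forecast against a scalar observation."""
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2

def rcrv(ens_mean, ens_spread, obs, obs_err):
    """Reduced centred random variable; bias ~ 0 and dispersion ~ 1 indicate reliability."""
    y = (obs - ens_mean) / np.sqrt(obs_err ** 2 + ens_spread ** 2)
    return y.mean(), y.std(ddof=1)

rng = np.random.default_rng(0)
n_cases, n_members = 200, 60                            # 60 members, as in the experiment
center = rng.normal(0.0, 1.0, n_cases)                  # forecast centers
truth = center + rng.normal(0.0, 0.5, n_cases)          # truth drawn from the forecast pdf
ens = center[:, None] + rng.normal(0.0, 0.5, (n_cases, n_members))
obs = truth + rng.normal(0.0, 0.1, n_cases)             # observations with error sigma_o = 0.1

mean_crps = np.mean([crps_ensemble(ens[i], obs[i]) for i in range(n_cases)])
bias, dispersion = rcrv(ens.mean(axis=1), ens.std(axis=1, ddof=1), obs, 0.1)
print(f"CRPS = {mean_crps:.3f}  RCRV bias = {bias:.2f}  dispersion = {dispersion:.2f}")
```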
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jie; Draxl, Caroline; Hopson, Thomas
Numerical weather prediction (NWP) models have been widely used for wind resource assessment. Model runs with higher spatial resolution are generally more accurate, yet extremely computationally expensive. An alternative approach is to use data generated by a low-resolution NWP model in conjunction with statistical methods. In order to analyze the accuracy and computational efficiency of different types of NWP-based wind resource assessment methods, this paper performs a comparison of three deterministic and probabilistic NWP-based wind resource assessment methodologies: (i) a coarse-resolution (0.5 degrees x 0.67 degrees) global reanalysis data set, the Modern-Era Retrospective Analysis for Research and Applications (MERRA); (ii) an analog ensemble methodology based on the MERRA, which provides both deterministic and probabilistic predictions; and (iii) a fine-resolution (2-km) NWP data set, the Wind Integration National Dataset (WIND) Toolkit, based on the Weather Research and Forecasting model. Results show that: (i) as expected, the analog ensemble and WIND Toolkit perform significantly better than MERRA, confirming their ability to downscale coarse estimates; (ii) the analog ensemble provides the best estimate of the multi-year wind distribution at seven of the nine sites, while the WIND Toolkit is the best at one site; (iii) the WIND Toolkit is more accurate in estimating the distribution of hourly wind speed differences, which characterizes the wind variability, at five of the available sites, with the analog ensemble being best at the remaining four locations; and (iv) the analog ensemble computational cost is negligible, whereas the WIND Toolkit requires large computational resources. Future efforts could focus on the combination of the analog ensemble with intermediate-resolution (e.g., 10-15 km) NWP estimates, to considerably reduce the computational burden while providing accurate deterministic estimates and reliable probabilistic assessments.
Terçariol, César Augusto Sangaletti; Martinez, Alexandre Souto
2005-08-01
Consider a medium characterized by $N$ points whose coordinates are randomly generated by a uniform distribution along the edges of a unitary $d$-dimensional hypercube. A walker leaves from each point of this disordered medium and moves according to the deterministic rule of going to the nearest point which has not been visited in the preceding $\mu$ steps (deterministic tourist walk). Each trajectory generated by this dynamics has an initial non-periodic part of $t$ steps (transient) and a final periodic part of $p$ steps (attractor). The neighborhood rank probabilities are parametrized by the normalized incomplete beta function $I_d = I_{1/4}[1/2,(d+1)/2]$. The joint distribution $S^{(N)}_{\mu,d}(t,p)$ is relevant, and the marginal distributions previously studied are particular cases. We show that, for the memoryless deterministic tourist walk in Euclidean space, this distribution is $S^{(\infty)}_{1,d}(t,p) = [\Gamma(1+I_d^{-1})\,(t+I_d^{-1})/\Gamma(t+p+I_d^{-1})]\,\delta_{p,2}$, where $t = 0, 1, 2, \ldots, \infty$, $\Gamma(z)$ is the gamma function and $\delta_{i,j}$ is the Kronecker delta. The mean-field models are the random link models, which correspond to $d \to \infty$, and the random map model which, even for $\mu = 0$, presents a nontrivial cycle distribution [$S^{(N)}_{0,\mathrm{rm}}(p) \propto p^{-1}$]: $S^{(N)}_{0,\mathrm{rm}}(t,p) = \Gamma(N)/\{\Gamma[N+1-(t+p)]\,N^{t+p}\}$. The fundamental quantities are the number of explored points $n_e = t+p$ and $I_d$. Although the obtained distributions are simple, they do not follow straightforwardly and they have been validated by numerical experiments.
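The walk itself is simple to simulate directly and to compare against the analytical results, for instance the $\delta_{p,2}$ factor for memory $\mu = 1$. The sketch below is our own minimal implementation with illustrative parameter choices; it measures the transient length $t$ and the attractor period $p$ for every starting point.

```python
import numpy as np

def tourist_walk(dist, start, mu=1, max_steps=10_000):
    """Deterministic tourist walk with memory mu >= 1 on a precomputed distance matrix.
    Returns (transient length t, attractor period p)."""
    history = [start]
    seen = {}                                     # window of recent sites -> step index
    for step in range(max_steps):
        state = tuple(history[-mu:])              # the last mu sites determine the next move
        if state in seen:
            first = seen[state]
            return first, step - first
        seen[state] = step
        d = dist[history[-1]].copy()
        d[history[-mu:]] = np.inf                 # sites visited in the preceding mu steps are forbidden
        history.append(int(np.argmin(d)))
    raise RuntimeError("no attractor found")

rng = np.random.default_rng(0)
N, d_dim = 500, 2
pts = rng.uniform(size=(N, d_dim))
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

results = np.array([tourist_walk(dist, s, mu=1) for s in range(N)])
# With mu=1 every trajectory is trapped by a pair of mutually nearest neighbours,
# i.e. the attractor period p is 2, consistent with the delta_{p,2} factor above.
print("fraction of trajectories with p = 2:", np.mean(results[:, 1] == 2))
print("mean transient length t:", results[:, 0].mean())
```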
Modeling metapopulation dynamics for single species of seabirds
Buckley, P.A.; Downer, R.; McCullough, D.R.; Barrett, R.H.
1992-01-01
Seabirds share many characteristics setting them apart from other birds. Importantly, they breed more or less obligatorily in local clusters of colonies that can move regularly from site to site, and they routinely exchange breeders. The properties of such metapopulations have only recently begun to be examined, often with models that are occupancy-based (using only colony presence or absence data) and deterministic (using single, empirically determined values for each of several population biology parameters). Some recent models are now frequency-based (using actual population sizes at each site), as well as stochastic (randomly varying critical parameters between biologically realistic limits), yielding better estimates of the behavior of future populations. Using two such models designed to quantify relative risks of population changes under different future scenarios (RAMAS/stage and RAMAS/space), we have examined probable future population dynamics for three hypothetical seabirds -- an albatross, a cormorant, and a tern. With real parameters and ranges of values we alternatively modelled each species with and without density dependence, as well as with their numbers in a single, large colony, or in many smaller ones, distributed evenly or lognormally. We produced a series of species-typical lines for different population risks over the 50 years we simulated. We call these curves Instantaneous Threat Assessments (ITAs), and their shapes mirror the varying life history characteristics of our three species. We also demonstrated (by a process known as sensitivity analysis) that the most important parameters determining future population fates of all three species were correlation of mean growth rate among colonies; dispersal rate of present and future breeders; subadult survivorship; and the number of subpopulations (=colonies) - in roughly that descending order of importance. In addition, density dependence was found to markedly alter ITA line shape and position, dramatically in the tern. Finally, we show that for each of our three seabirds, a substantial reduction in the risk of the entire population's going to extinction was provided by a metapopulation (i.e. colonial) breeding structure -- thus comfortably confirming what avian ecologists have long known but about which population modellers are sometimes still unsure.
Discrete bivariate population balance modelling of heteroaggregation processes.
Rollié, Sascha; Briesen, Heiko; Sundmacher, Kai
2009-08-15
Heteroaggregation in binary particle mixtures was simulated with a discrete population balance model in terms of two internal coordinates describing the particle properties. The considered particle species are of different size and zeta-potential. Property space is reduced with a semi-heuristic approach to enable an efficient solution. Aggregation rates are based on deterministic models for Brownian motion and stability, under consideration of DLVO interaction potentials. A charge-balance kernel is presented, relating the electrostatic surface potential to the property space by a simple charge balance. Parameter sensitivity with respect to the fractal dimension, aggregate size, hydrodynamic correction, ionic strength and absolute particle concentration was assessed. Results were compared to simulations with the literature kernel based on geometric coverage effects for clusters with heterogeneous surface properties. In both cases electrostatic phenomena, which dominate the aggregation process, show identical trends: impeded cluster-cluster aggregation at low particle mixing ratio (1:1), restabilisation at high mixing ratios (100:1) and formation of complex clusters for intermediate ratios (10:1). The particle mixing ratio controls the surface coverage extent of the larger particle species. Simulation results are compared to experimental flow cytometric data and show very satisfactory agreement.
Modelling ecosystem service flows under uncertainty with stochastic SPAN
Johnson, Gary W.; Snapp, Robert R.; Villa, Ferdinando; Bagstad, Kenneth J.
2012-01-01
Ecosystem service models are increasingly in demand for decision making. However, the data required to run these models are often patchy, missing, outdated, or untrustworthy. Further, communication of data and model uncertainty to decision makers is often either absent or unintuitive. In this work, we introduce a systematic approach to addressing both the data gap and the difficulty in communicating uncertainty through a stochastic adaptation of the Service Path Attribution Networks (SPAN) framework. The SPAN formalism assesses ecosystem services through a set of up to 16 maps, which characterize the services in a study area in terms of flow pathways between ecosystems and human beneficiaries. Although the SPAN algorithms were originally defined deterministically, we present them here in a stochastic framework which combines probabilistic input data with a stochastic transport model in order to generate probabilistic spatial outputs. This enables a novel feature among ecosystem service models: the ability to spatially visualize uncertainty in the model results. The stochastic SPAN model can analyze areas where data limitations are prohibitive for deterministic models. Greater uncertainty in the model inputs (including missing data) should lead to greater uncertainty expressed in the model’s output distributions. By using Bayesian belief networks to fill data gaps and expert-provided trust assignments to augment untrustworthy or outdated information, we can account for uncertainty in input data, producing a model that is still able to run and provide information where strictly deterministic models could not. Taken together, these attributes enable more robust and intuitive modelling of ecosystem services under uncertainty.
Deterministic Assembly of Complex Bacterial Communities in Guts of Germ-Free Cockroaches
Mikaelyan, Aram; Thompson, Claire L.; Hofer, Markus J.
2015-01-01
The gut microbiota of termites plays important roles in the symbiotic digestion of lignocellulose. However, the factors shaping the microbial community structure remain poorly understood. Because termites cannot be raised under axenic conditions, we established the closely related cockroach Shelfordella lateralis as a germ-free model to study microbial community assembly and host-microbe interactions. In this study, we determined the composition of the bacterial assemblages in cockroaches inoculated with the gut microbiota of termites and mice using pyrosequencing analysis of their 16S rRNA genes. Although the composition of the xenobiotic communities was influenced by the lineages present in the foreign inocula, their structure resembled that of conventional cockroaches. Bacterial taxa abundant in conventional cockroaches but rare in the foreign inocula, such as Dysgonomonas and Parabacteroides spp., were selectively enriched in the xenobiotic communities. Donor-specific taxa, such as endomicrobia or spirochete lineages restricted to the gut microbiota of termites, however, either were unable to colonize germ-free cockroaches or formed only small populations. The exposure of xenobiotic cockroaches to conventional adults restored their normal microbiota, which indicated that autochthonous lineages outcompete foreign ones. Our results provide experimental proof that the assembly of a complex gut microbiota in insects is deterministic. PMID:26655763
Tveito, Aslak; Lines, Glenn T; Edwards, Andrew G; McCulloch, Andrew
2016-07-01
Markov models are ubiquitously used to represent the function of single ion channels. However, solving the inverse problem to construct a Markov model of single channel dynamics from bilayer or patch-clamp recordings remains challenging, particularly for channels involving complex gating processes. Methods for solving the inverse problem are generally based on data from voltage clamp measurements. Here, we describe an alternative approach to this problem based on measurements of voltage traces. The voltage traces define probability density functions of the functional states of an ion channel. These probability density functions can also be computed by solving a deterministic system of partial differential equations. The inversion is based on tuning the rates of the Markov models used in the deterministic system of partial differential equations such that the solution mimics the properties of the probability density function gathered from (pseudo) experimental data as well as possible. The optimization is done by defining a cost function to measure the difference between the deterministic solution and the solution based on experimental data. By invoking the properties of this function, it is possible to infer whether the rates of the Markov model are identifiable by our method. We present applications to Markov models well known from the literature. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Invasion of cooperators in lattice populations: linear and non-linear public good games.
Vásárhelyi, Zsóka; Scheuring, István
2013-08-01
A generalized version of the N-person volunteer's dilemma (NVD) Game has been suggested recently for illustrating the problem of N-person social dilemmas. Using standard replicator dynamics it can be shown that coexistence of cooperators and defectors is typical in this model. However, the question of how a rare mutant cooperator could invade a population of defectors is still open. Here we examined the dynamics of individual based stochastic models of the NVD. We analyze the dynamics in well-mixed and viscous populations. We show in both cases that coexistence between cooperators and defectors is possible; moreover, spatial aggregation of types in viscous populations can easily lead to pure cooperation. Furthermore we analyze the invasion of cooperators in populations consisting predominantly of defectors. In accordance with analytical results, in deterministic systems, we found the invasion of cooperators successful in the well-mixed case only if their initial concentration was higher than a critical threshold, defined by the replicator dynamics of the NVD. In the viscous case, however, not the initial concentration but the initial number determines the success of invasion. We show that even a single mutant cooperator can invade with a high probability, because the local density of aggregated cooperators exceeds the threshold defined by the game. Comparing the results to models using different benefit functions (linear or sigmoid), we show that the role of the benefit function is much more important in the well-mixed than in the viscous case. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
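The well-mixed invasion threshold described here can be illustrated with standard replicator dynamics. The sketch below uses a generalized volunteer's dilemma in which at least k volunteers are needed for the group benefit; this payoff structure and all parameter values are our own illustrative choices, not the benefit functions analysed in the paper.

```python
from math import comb

def payoff_difference(x, N=5, k=3, b=6.0, c=1.0):
    """W_C - W_D in a generalized N-person volunteer's dilemma needing >= k volunteers."""
    diff = 0.0
    for j in range(N):                       # j cooperators among the N-1 co-players
        pj = comb(N - 1, j) * x ** j * (1 - x) ** (N - 1 - j)
        wc = b * (j + 1 >= k) - c            # focal cooperator pays c, may secure the benefit
        wd = b * (j >= k)                    # focal defector free-rides
        diff += pj * (wc - wd)
    return diff

def replicator(x0, steps=20_000, dt=0.01, **kw):
    """Euler integration of the replicator equation dx/dt = x(1-x)(W_C - W_D)."""
    x = x0
    for _ in range(steps):
        x += dt * x * (1 - x) * payoff_difference(x, **kw)
        x = min(max(x, 0.0), 1.0)
    return x

# Rare cooperators die out, but above a critical initial frequency the population
# converges to a stable mixed equilibrium -- the deterministic invasion threshold.
for x0 in (0.1, 0.2, 0.3, 0.6):
    print(f"x0 = {x0:.2f} -> x(final) = {replicator(x0):.3f}")
```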
Lai, C.; Tsay, T.-K.; Chien, C.-H.; Wu, I.-L.
2009-01-01
Researchers at the Hydroinformatic Research and Development Team (HIRDT) of National Taiwan University undertook a project to create a real-time flood forecasting model, with the aim of predicting currents in the Tamsui River Basin. The model was designed using a deterministic approach, with mathematical modeling of complex phenomena and specific parameter values used to produce a discrete result. The project also devised a rainfall-stage model that relates the rate of upland rainfall directly to the change in the state of the river, and is further related to another typhoon-rainfall model. The geographic information system (GIS) data, based on a precise contour model of the terrain, are used to estimate the regions perilous to flooding. In response to the project's progress, the HIRDT also applied a deterministic model of unsteady flow to help river authorities issue timely warnings and take other emergency measures.
Fuzzy linear model for production optimization of mining systems with multiple entities
NASA Astrophysics Data System (ADS)
Vujic, Slobodan; Benovic, Tomo; Miljanovic, Igor; Hudej, Marjan; Milutinovic, Aleksandar; Pavlovic, Petar
2011-12-01
Planning and production optimization in mining systems with multiple mines or several work sites (entities) were studied using fuzzy linear programming (LP). LP is one of the most commonly used operations research methods in mining engineering. After an introductory review of the properties and limitations of applying LP, short reviews of the general settings of deterministic and fuzzy LP models are presented. For the purpose of comparative analysis, the application of both LP models is presented using the example of the Bauxite Basin Niksic with five mines. The assessment shows that LP is an efficient mathematical modeling tool for production planning and for solving many other single-criteria optimization problems in mining engineering. After comparing the advantages and deficiencies of the deterministic and fuzzy LP models, the conclusion presents the benefits of the fuzzy LP model, while also noting that finding an optimal production plan requires an overall analysis encompassing both LP modeling approaches.
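For orientation, a deterministic production-planning LP of the kind discussed here can be posed in a few lines; the fuzzy variant would then soften the crisp constraints with membership functions. The sketch below uses scipy's linprog with entirely hypothetical data for five mines, and is not the Bauxite Basin Niksic model.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: annual output (kt) of five mines/work sites (illustrative data).
profit = np.array([12.0, 10.5, 9.0, 11.0, 8.5])       # profit per kt
capacity = np.array([300, 250, 400, 200, 350])         # per-mine capacity, kt
total_demand = 1000                                     # system-wide output target, kt
haulage = np.array([1.0, 1.2, 0.8, 1.5, 0.9])           # shared haulage resource per kt
haulage_budget = 1100

res = linprog(
    c=-profit,                                          # linprog minimises, so negate profit
    A_ub=np.vstack([haulage]), b_ub=[haulage_budget],   # shared haulage constraint
    A_eq=np.ones((1, 5)), b_eq=[total_demand],          # meet the production target exactly
    bounds=[(0, cap) for cap in capacity],
    method="highs",
)
print("optimal outputs (kt):", np.round(res.x, 1))
print("total profit:", -res.fun)
```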
Spatio-temporal modelling of rainfall in the Murray-Darling Basin
NASA Astrophysics Data System (ADS)
Nowak, Gen; Welsh, A. H.; O'Neill, T. J.; Feng, Lingbing
2018-02-01
The Murray-Darling Basin (MDB) is a large geographical region in southeastern Australia that contains many rivers and creeks, including Australia's three longest rivers, the Murray, the Murrumbidgee and the Darling. Understanding rainfall patterns in the MDB is very important due to the significant impact major events such as droughts and floods have on agricultural and resource productivity. We propose a model for modelling a set of monthly rainfall data obtained from stations in the MDB and for producing predictions in both the spatial and temporal dimensions. The model is a hierarchical spatio-temporal model fitted to geographical data that utilises both deterministic and data-derived components. Specifically, rainfall data at a given location are modelled as a linear combination of these deterministic and data-derived components. A key advantage of the model is that it is fitted in a step-by-step fashion, enabling appropriate empirical choices to be made at each step.
Stochastic Watershed Models for Risk Based Decision Making
NASA Astrophysics Data System (ADS)
Vogel, R. M.
2017-12-01
Over half a century ago, the Harvard Water Program introduced the field of operational or synthetic hydrology, providing stochastic streamflow models (SSMs) that could generate ensembles of synthetic streamflow traces useful for hydrologic risk management. The application of SSMs, based on streamflow observations alone, revolutionized water resources planning activities, yet has fallen out of favor due, in part, to their inability to account for the now nearly ubiquitous anthropogenic influences on streamflow. This commentary advances the modern equivalent of SSMs, termed 'stochastic watershed models' (SWMs), useful as input to nearly all modern risk-based water resource decision-making approaches. SWMs are deterministic watershed models implemented using stochastic meteorological series, model parameters and model errors to generate ensembles of streamflow traces that represent the variability in possible future streamflows. SWMs combine deterministic watershed models, which are ideally suited to accounting for anthropogenic influences, with recent developments in uncertainty analysis and principles of stochastic simulation.
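The core idea, running a deterministic watershed model under stochastic forcing and model error to obtain an ensemble of streamflow traces, can be illustrated with a toy example. The bucket model, the forcing perturbations, and all parameter values below are our own stand-ins, not anything from the commentary.

```python
import numpy as np

def bucket_model(precip, pet=2.0, capacity=150.0, k=0.05):
    """Toy deterministic watershed model: one soil-moisture bucket with linear outflow."""
    storage, flow = 60.0, []
    for p in precip:
        storage = max(storage + p - pet, 0.0)
        quick = max(storage - capacity, 0.0)      # saturation excess becomes quick runoff
        storage -= quick
        base = k * storage                        # slow linear baseflow release
        storage -= base
        flow.append(quick + base)
    return np.array(flow)

rng = np.random.default_rng(42)
n_days, n_traces = 365, 100
base_precip = rng.gamma(shape=0.4, scale=12.0, size=n_days)   # one historical-like year

ensemble = []
for _ in range(n_traces):
    met = base_precip * rng.lognormal(0.0, 0.3, n_days)        # stochastic meteorological forcing
    q = bucket_model(met)
    q *= rng.lognormal(0.0, 0.2, n_days)                       # multiplicative model error
    ensemble.append(q)
ensemble = np.array(ensemble)

annual = ensemble.sum(axis=1)
print("median annual flow:", np.median(annual))
print("5-95% range:", np.percentile(annual, [5, 95]))
```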
Sensitivity analysis in a Lassa fever deterministic mathematical model
NASA Astrophysics Data System (ADS)
Abdullahi, Mohammed Baba; Doko, Umar Chado; Mamuda, Mamman
2015-05-01
Lassa virus, which causes Lassa fever, is on the list of potential bio-weapons agents. It was recently imported into Germany, the Netherlands, the United Kingdom and the United States as a consequence of the rapid growth of international traffic. A model with five mutually exclusive compartments related to Lassa fever is presented and the basic reproduction number analyzed. A sensitivity analysis of the deterministic model is performed in order to determine the relative importance of the model parameters to disease transmission. The result of the sensitivity analysis shows that the most sensitive parameter is human immigration, followed by the human recovery rate, and then person-to-person contact. This suggests that control strategies should target human immigration, effective drugs for treatment, and education to reduce person-to-person contact.
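The usual tool behind this kind of analysis is the normalized forward sensitivity index, S_p = (dR0/dp)(p/R0). The sketch below computes it by central differences for a placeholder reproduction number; the expression and parameter values are ours for illustration only, since the paper's five-compartment R0 is not reproduced here.

```python
def r0(params):
    """Placeholder basic reproduction number (not the paper's actual expression)."""
    b, c, g, m = params["beta"], params["contact"], params["recovery"], params["mortality"]
    return b * c / (g + m)

def normalized_sensitivity(params, name, h=1e-6):
    """Forward sensitivity index S_p = (dR0/dp) * (p / R0), via central differences."""
    p0 = params[name]
    up = dict(params, **{name: p0 * (1 + h)})
    down = dict(params, **{name: p0 * (1 - h)})
    deriv = (r0(up) - r0(down)) / (2 * h * p0)
    return deriv * p0 / r0(params)

baseline = {"beta": 0.3, "contact": 2.0, "recovery": 0.1, "mortality": 0.02}
for p in baseline:
    # |S_p| ranks how strongly a relative change in parameter p moves R0
    print(f"S_{p} = {normalized_sensitivity(baseline, p):+.3f}")
```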
NASA Technical Reports Server (NTRS)
Smialek, James L.
2002-01-01
An equation has been developed to model the iterative scale growth and spalling process that occurs during cyclic oxidation of high-temperature materials. Parabolic scale growth and spalling of a constant surface area fraction have been assumed. Interfacial spallation of only the thickest segments was also postulated. This simplicity allowed representation by a simple deterministic summation series. Inputs are the parabolic growth rate constant, the spall area fraction, the oxide stoichiometry, and the cycle duration. Outputs include the net weight change behavior, as well as the total amount of oxygen and metal consumed, the total amount of oxide spalled, and the mass fraction of oxide spalled. The outputs all follow typical well-behaved trends with the inputs and are in good agreement with previous interfacial models.
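The cycle-by-cycle bookkeeping behind such models (parabolic growth, then spall of the thickest area fraction) is easy to mimic numerically. The sketch below is a toy implementation in that spirit, not the paper's exact summation series; the rate constant, spall fraction and oxygen mass fraction are illustrative placeholders.

```python
import numpy as np

def cyclic_oxidation(kp=0.01, spall_frac=0.1, cycle_h=1.0, n_cycles=500, f_oxygen=0.47):
    """Toy cyclic oxidation bookkeeping: parabolic scale growth, then interfacial spall
    of the thickest area segment each cycle. Returns net specimen weight change per cycle."""
    n_seg = int(round(1.0 / spall_frac))          # surface tracked as equal-area segments
    oxide = np.zeros(n_seg)                       # oxide mass per unit area on each segment
    oxygen_gained, oxide_spalled, net = 0.0, 0.0, []
    for _ in range(n_cycles):
        grown = np.sqrt(oxide ** 2 + kp * cycle_h)             # parabolic growth in mass terms
        oxygen_gained += f_oxygen * (grown - oxide).sum() / n_seg
        oxide = grown
        thickest = np.argmax(oxide)                            # only the thickest segment spalls
        oxide_spalled += oxide[thickest] / n_seg
        oxide[thickest] = 0.0
        net.append(oxygen_gained - oxide_spalled)              # net specimen mass change so far
    return np.array(net)

w = cyclic_oxidation()
# Typical behaviour: early mass gain, a maximum, then steady mass loss once spalling dominates.
print("max gain:", w.max(), " final change:", w[-1])
```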
Microsatellite analysis of genetic divergence among populations of giant Galápagos tortoises.
Ciofi, Claudio; Milinkovitch, Michel C; Gibbs, James P; Caccone, Adalgisa; Powell, Jeffrey R
2002-11-01
Giant Galápagos tortoises represent an interesting model for the study of patterns of genetic divergence and adaptive differentiation related to island colonization events. Recent mitochondrial DNA work elucidated the evolutionary history of the species and helped to clarify aspects of nomenclature. We used 10 microsatellite loci to assess levels of genetic divergence among and within island populations. In particular, we described the genetic structure of tortoises on the island of Isabela, where discrimination of different taxa is still subject of debate. Individual island populations were all genetically distinct. The island of Santa Cruz harboured two distinct populations. On Isabela, populations of Volcan Wolf, Darwin and Alcedo were significantly different from each other. On the other hand, Volcan Wolf showed allelic similarity with the island of Santiago. On Southern Isabela, lower genetic divergence was found between Northeast Sierra Negra and Volcan Alcedo, while patterns of gene flow were recorded among tortoises of Cerro Azul and Southeast Sierra Negra. These tortoises have endured heavy exploitation during the last three centuries and recently attracted much concern due to the current number of stochastic and deterministic threats to extant populations. Our study complements previous investigation based on mtDNA diversity and provides further information that may help devising tortoise management plans.
Mai, Tam V-T; Duong, Minh V; Nguyen, Hieu T; Lin, Kuang C; Huynh, Lam K
2017-04-27
An integrated deterministic and stochastic model within the master equation/Rice-Ramsperger-Kassel-Marcus (ME/RRKM) framework was first used to characterize temperature- and pressure-dependent behaviors of thermal decomposition of acetic anhydride in a wide range of conditions (i.e., 300-1500 K and 0.001-100 atm). Particularly, using potential energy surface and molecular properties obtained from high-level electronic structure calculations at CCSD(T)/CBS, macroscopic thermodynamic properties and rate coefficients of the title reaction were derived with corrections for hindered internal rotation and tunneling treatments. Being in excellent agreement with the scattered experimental data, the results from deterministic and stochastic frameworks confirmed and complemented each other to reveal that the main decomposition pathway proceeds via a 6-membered-ring transition state with the 0 K barrier of 35.2 kcal·mol -1 . This observation was further understood and confirmed by the sensitivity analysis on the time-resolved species profiles and the derived rate coefficients with respect to the ab initio barriers. Such an agreement suggests the integrated model can be confidently used for a wide range of conditions as a powerful postfacto and predictive tool in detailed chemical kinetic modeling and simulation for the title reaction and thus can be extended to complex chemical reactions.
Policy options for alcohol price regulation: the importance of modelling population heterogeneity.
Meier, Petra Sylvia; Purshouse, Robin; Brennan, Alan
2010-03-01
Context and aims: Internationally, the repertoire of alcohol pricing policies has expanded to include targeted taxation, inflation-linked taxation, taxation based on alcohol-by-volume (ABV), minimum pricing policies (general or targeted), bans of below-cost selling and restricting price-based promotions. Policy makers clearly need to consider how options compare in reducing harms at the population level, but are also required to demonstrate proportionality of their actions, which necessitates a detailed understanding of policy effects on different population subgroups. This paper presents selected findings from a policy appraisal for the UK government and discusses the importance of accounting for population heterogeneity in such analyses. Method: We have built a causal, deterministic, epidemiological model which takes account of differential preferences by population subgroups defined by age, gender and level of drinking (moderate, hazardous, harmful). We consider purchasing preferences in terms of the types and volumes of alcoholic beverages, prices paid and the balance between bars, clubs and restaurants as opposed to supermarkets and off-licenses. Results: Age, sex and level of drinking fundamentally affect beverage preferences, drinking location, prices paid, price sensitivity and tendency to substitute for other beverage types. Pricing policies vary in their impact on different product types, price points and venues, thus having distinctly different effects on subgroups. Because population subgroups also have substantially different risk profiles for harms, policies are differentially effective in reducing health, crime, work-place absence and unemployment harms. Conclusion: Policy appraisals must account for population heterogeneity and complexity if resulting interventions are to be well considered, proportionate, effective and cost-effective.
NASA Astrophysics Data System (ADS)
Krumholz, Mark R.; Fumagalli, Michele; da Silva, Robert L.; Rendahl, Theodore; Parra, Jonathan
2015-09-01
Stellar population synthesis techniques for predicting the observable light emitted by a stellar population have extensive applications in numerous areas of astronomy. However, accurate predictions for small populations of young stars, such as those found in individual star clusters, star-forming dwarf galaxies, and small segments of spiral galaxies, require that the population be treated stochastically. Conversely, accurate deductions of the properties of such objects also require consideration of stochasticity. Here we describe a comprehensive suite of modular, open-source software tools for tackling these related problems. These include the following: a greatly-enhanced version of the SLUG code introduced by da Silva et al., which computes spectra and photometry for stochastically or deterministically sampled stellar populations with nearly arbitrary star formation histories, clustering properties, and initial mass functions; CLOUDY_SLUG, a tool that automatically couples SLUG-computed spectra with the CLOUDY radiative transfer code in order to predict stochastic nebular emission; BAYESPHOT, a general-purpose tool for performing Bayesian inference on the physical properties of stellar systems based on unresolved photometry; and CLUSTER_SLUG and SFR_SLUG, a pair of tools that use BAYESPHOT on a library of SLUG models to compute the mass, age, and extinction of mono-age star clusters, and the star formation rate of galaxies, respectively. The latter two tools make use of an extensive library of pre-computed stellar population models, which are included in the software. The complete package is available at http://www.slugsps.com.
Spatiotemporal pattern formation in a prey-predator model under environmental driving forces
NASA Astrophysics Data System (ADS)
Sirohi, Anuj Kumar; Banerjee, Malay; Chakraborti, Anirban
2015-09-01
Many existing studies on pattern formation in reaction-diffusion systems rely on deterministic models. However, environmental noise is often a major factor leading to significant changes in the spatiotemporal dynamics. In this paper, we focus on the spatiotemporal patterns produced by a predator-prey model with a ratio-dependent functional response and a density-dependent death rate of the predator. We obtain the reaction-diffusion equations by incorporating self-diffusion terms, corresponding to random movement of the individuals within two-dimensional habitats, into the growth equations for the prey and predator populations. To obtain the noise-added model, small-amplitude heterogeneous perturbations to the linear intrinsic growth rates are introduced using uncorrelated Gaussian white noise terms. For the noise-added system, we then observe spatial patterns for parameter values lying outside the Turing instability region. With thorough numerical simulations, we characterize the patterns corresponding to the Turing and Turing-Hopf domains and study their dependence on different system parameters such as the noise intensity.
Probabilistic dose-response modeling: case study using dichloromethane PBPK model results.
Marino, Dale J; Starr, Thomas B
2007-12-01
A revised assessment of dichloromethane (DCM) has recently been reported that examines the influence of human genetic polymorphisms on cancer risks using deterministic PBPK and dose-response modeling in the mouse combined with probabilistic PBPK modeling in humans. This assessment utilized Bayesian techniques to optimize kinetic variables in mice and humans with mean values from posterior distributions used in the deterministic modeling in the mouse. To supplement this research, a case study was undertaken to examine the potential impact of probabilistic rather than deterministic PBPK and dose-response modeling in mice on subsequent unit risk factor (URF) determinations. Four separate PBPK cases were examined based on the exposure regimen of the NTP DCM bioassay. These were (a) Same Mouse (single draw of all PBPK inputs for both treatment groups); (b) Correlated BW-Same Inputs (single draw of all PBPK inputs for both treatment groups except for bodyweights (BWs), which were entered as correlated variables); (c) Correlated BW-Different Inputs (separate draws of all PBPK inputs for both treatment groups except that BWs were entered as correlated variables); and (d) Different Mouse (separate draws of all PBPK inputs for both treatment groups). Monte Carlo PBPK inputs reflect posterior distributions from Bayesian calibration in the mouse that had been previously reported. A minimum of 12,500 PBPK iterations were undertaken, in which dose metrics, i.e., mg DCM metabolized by the GST pathway/L tissue/day for lung and liver were determined. For dose-response modeling, these metrics were combined with NTP tumor incidence data that were randomly selected from binomial distributions. Resultant potency factors (0.1/ED(10)) were coupled with probabilistic PBPK modeling in humans that incorporated genetic polymorphisms to derive URFs. Results show that there was relatively little difference, i.e., <10% in central tendency and upper percentile URFs, regardless of the case evaluated. Independent draws of PBPK inputs resulted in the slightly higher URFs. Results were also comparable to corresponding values from the previously reported deterministic mouse PBPK and dose-response modeling approach that used LED(10)s to derive potency factors. This finding indicated that the adjustment from ED(10) to LED(10) in the deterministic approach for DCM compensated for variability resulting from probabilistic PBPK and dose-response modeling in the mouse. Finally, results show a similar degree of variability in DCM risk estimates from a number of different sources including the current effort even though these estimates were developed using very different techniques. Given the variety of different approaches involved, 95th percentile-to-mean risk estimate ratios of 2.1-4.1 represent reasonable bounds on variability estimates regarding probabilistic assessments of DCM.
Stochastic and deterministic processes regulate spatio-temporal variation in seed bank diversity
Alejandro A. Royo; Todd E. Ristau
2013-01-01
Seed banks often serve as reservoirs of taxonomic and genetic diversity that buffer plant populations and influence post-disturbance vegetation trajectories; yet evaluating their importance requires understanding how their composition varies within and across spatial and temporal scales (α- and β-diversity). Shifts in seed bank diversity are strongly...
Current issues and actions in radiation protection of patients.
Holmberg, Ola; Malone, Jim; Rehani, Madan; McLean, Donald; Czarwinski, Renate
2010-10-01
Medical application of ionizing radiation is a massive and increasing activity globally. While the use of ionizing radiation in medicine brings tremendous benefits to the global population, the associated risks due to stochastic and deterministic effects make it necessary to protect patients from potential harm. Current issues in radiation protection of patients include not only the rapidly increasing collective dose to the global population from medical exposure, but also that a substantial percentage of diagnostic imaging examinations are unnecessary, and the cumulative dose to individuals from medical exposure is growing. In addition to this, continued reports on deterministic injuries from safety related events in the medical use of ionizing radiation are raising awareness on the necessity for accident prevention measures. The International Atomic Energy Agency is engaged in several activities to reverse the negative trends of these current issues, including improvement of the justification process, the tracking of radiation history of individual patients, shared learning of safety significant events, and the use of comprehensive quality audits in the clinical environment. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Methods For Self-Organizing Software
Bouchard, Ann M.; Osbourn, Gordon C.
2005-10-18
A method for dynamically self-assembling and executing software is provided, containing machines that self-assemble execution sequences and data structures. In addition to ordered functions calls (found commonly in other software methods), mutual selective bonding between bonding sites of machines actuates one or more of the bonding machines. Two or more machines can be virtually isolated by a construct, called an encapsulant, containing a population of machines and potentially other encapsulants that can only bond with each other. A hierarchical software structure can be created using nested encapsulants. Multi-threading is implemented by populations of machines in different encapsulants that are interacting concurrently. Machines and encapsulants can move in and out of other encapsulants, thereby changing the functionality. Bonding between machines' sites can be deterministic or stochastic with bonding triggering a sequence of actions that can be implemented by each machine. A self-assembled execution sequence occurs as a sequence of stochastic binding between machines followed by their deterministic actuation. It is the sequence of bonding of machines that determines the execution sequence, so that the sequence of instructions need not be contiguous in memory.
Chaotic sources of noise in machine acoustics
NASA Astrophysics Data System (ADS)
Moon, F. C., Prof.; Broschart, Dipl.-Ing. T.
1994-05-01
In this paper a model is posited for deterministic, random-like noise in machines with sliding rigid parts impacting linear continuous machine structures. Such problems occur in gear transmission systems. A mathematical model is proposed to explain the random-like structure-borne and air-borne noise from such systems when the input is a periodic deterministic excitation of the quasi-rigid impacting parts. An experimental study is presented which supports the model. A thin circular plate is impacted by a chaotically vibrating mass excited by a sinusoidal moving base. The results suggest that the plate vibrations might be predicted by replacing the chaotic vibrating mass with a probabilistic forcing function. Prechaotic vibrations of the impacting mass show classical period doubling phenomena.
Operational value of ensemble streamflow forecasts for hydropower production: A Canadian case study
NASA Astrophysics Data System (ADS)
Boucher, Marie-Amélie; Tremblay, Denis; Luc, Perreault; François, Anctil
2010-05-01
Ensemble and probabilistic forecasts have many advantages over deterministic ones, both in meteorology and hydrology (e.g. Krzysztofowicz, 2001). Mainly, they inform the user on the uncertainty linked to the forecast. It has been brought to attention that such additional information could lead to improved decision making (e.g. Wilks and Hamill, 1995; Mylne, 2002; Roulin, 2007), but very few studies concentrate on operational situations involving the use of such forecasts. In addition, many authors have demonstrated that ensemble forecasts outperform deterministic forecasts in terms of performance (e.g. Jaun et al., 2005; Velazquez et al., 2009; Laio and Tamea, 2007). However, such performance is mostly assessed on the basis of numerical scoring rules, which compare the forecasts to the observations, and seldom in terms of management gains. The proposed case study adopts an operational point of view, on the basis that a novel forecasting system has value only if it leads to increase monetary and societal gains (e.g. Murphy, 1994; Laio and Tamea, 2007). More specifically, Environment Canada operational ensemble precipitation forecasts are used to drive the HYDROTEL distributed hydrological model (Fortin et al., 1995), calibrated on the Gatineau watershed located in Québec, Canada. The resulting hydrological ensemble forecasts are then incorporated into Hydro-Québec SOHO stochastic management optimization tool that automatically search for optimal operation decisions for the all reservoirs and hydropower plants located on the basin. The timeline of the study is the fall season of year 2003. This period is especially relevant because of high precipitations that nearly caused a major spill, and forced the preventive evacuation of a portion of the population located near one of the dams. We show that the use of the ensemble forecasts would have reduced the occurrence of spills and flooding, which is of particular importance for dams located in populous area, and increased hydropower production. The ensemble precipitation forecasts extend from March 1st of 2002 to December 31st of 2003. They were obtained using two atmospheric models, SEF (8 members plus the control deterministic forecast) and GEM (8 members). The corresponding deterministic precipitation forecast issued by SEF model is also used within HYDROTEL in order to compare ensemble streamflow forecasts with their deterministic counterparts. Although this study does not incorporate all the sources of uncertainty, precipitation is certainly the most important input for hydrological modeling and conveys a great portion of the total uncertainty. References: Fortin, J.P., Moussa, R., Bocquillon, C. and Villeneuve, J.P. 1995: HYDROTEL, un modèle hydrologique distribué pouvant bénéficier des données fournies par la télédétection et les systèmes d'information géographique, Revue des Sciences de l'Eau, 8(1), 94-124. Jaun, S., Ahrens, B., Walser, A., Ewen, T. and Schaer, C. 2008: A probabilistic view on the August 2005 floods in the upper Rhine catchment, Natural Hazards and Earth System Sciences, 8 (2), 281-291. Krzysztofowicz, R. 2001: The case for probabilistic forecasting in hydrology, Journal of Hydrology, 249, 2-9. Murphy, A.H. 1994: Assessing the economic value of weather forecasts: An overview of methods, results and issues, Meteorological Applications, 1, 69-73. Mylne, K.R. 2002: Decision-Making from probability forecasts based on forecast value, Meteorological Applications, 9, 307-315. Laio, F. and Tamea, S. 
2007: Verification tools for probabilistic forecasts of continuous hydrological variables, Hydrology and Earth System Sciences, 11, 1267-1277. Roulin, E. 2007: Skill and relative economic value of medium-range hydrological ensemble predictions, Hydrology and Earth System Sciences, 11, 725-737. Velazquez, J.-A., Petit, T., Lavoie, A., Boucher, M.-A., Turcotte, R., Fortin, V. and Anctil, F. 2009: An evaluation of the Canadian global meteorological ensemble prediction system for short-term hydrological forecasting, Hydrology and Earth System Sciences, 13(11), 2221-2231. Wilks, D.S. and Hamill, T.M. 1995: Potential economic value of ensemble-based surface weather forecasts, Monthly Weather Review, 123(12), 3565-3575.
Statistically Qualified Neuro-Analytic System and Method for Process Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
1998-11-04
An apparatus and method for monitoring a process involves development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.
Johnson, Riegardt M; Ramond, Jean-Baptiste; Gunnigle, Eoin; Seely, Mary; Cowan, Don A
2017-03-01
The central Namib Desert is hyperarid, where limited plant growth ensures that biogeochemical processes are largely driven by microbial populations. Recent research has shown that niche partitioning is critically involved in the assembly of Namib Desert edaphic communities. However, these studies have mainly focussed on the Domain Bacteria. Using microbial community fingerprinting, we compared the assembly of the bacterial, fungal and archaeal populations of microbial communities across nine soil niches from four Namib Desert soil habitats (riverbed, dune, gravel plain and salt pan). Permutational multivariate analysis of variance indicated that the nine soil niches presented significantly different physicochemistries (R2 = 0.8306, P ≤ 0.0001) and that bacterial, fungal and archaeal populations were soil niche specific (R2 ≥ 0.64, P ≤ 0.001). However, the abiotic drivers of community structure were Domain-specific (P < 0.05), with P, clay and sand fraction, and NH4+ influencing bacterial, fungal and archaeal communities, respectively. Soil physicochemistry and soil niche explained over 50% of the variation in community structure, and communities displayed strong non-random patterns of co-occurrence. Taken together, these results demonstrate that in central Namib Desert soil microbial communities, assembly is principally driven by deterministic processes.
Nonlinear unitary quantum collapse model with self-generated noise
NASA Astrophysics Data System (ADS)
Geszti, Tamás
2018-04-01
Collapse models including some external noise of unknown origin are routinely used to describe phenomena on the quantum-classical border; in particular, quantum measurement. Although containing nonlinear dynamics and thereby exposed to the possibility of superluminal signaling in individual events, such models are widely accepted on the basis of fully reproducing the non-signaling statistical predictions of quantum mechanics. Here we present a deterministic nonlinear model without any external noise, in which randomness—instead of being universally present—emerges in the measurement process, from deterministic irregular dynamics of the detectors. The treatment is based on a minimally nonlinear von Neumann equation for a Stern–Gerlach or Bell-type measuring setup, containing coordinate and momentum operators in a self-adjoint skew-symmetric, split scalar product structure over the configuration space. The microscopic states of the detectors act as a nonlocal set of hidden parameters, controlling individual outcomes. The model is shown to display pumping of weights between setup-defined basis states, with a single winner randomly selected and the rest collapsing to zero. Environmental decoherence has no role in the scenario. Through stochastic modelling, based on Pearle’s ‘gambler’s ruin’ scheme, outcome probabilities are shown to obey Born’s rule under a no-drift or ‘fair-game’ condition. This fully reproduces quantum statistical predictions, implying that the proposed non-linear deterministic model satisfies the non-signaling requirement. Our treatment is still vulnerable to hidden signaling in individual events, which remains to be handled by future research.
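The link between the no-drift ('fair-game') condition and Born's rule can be illustrated with a bare-bones gambler's-ruin simulation in the spirit of Pearle's scheme: a weight performs an unbiased walk between two absorbing outcomes, and the probability of ending at the 'up' outcome equals the initial weight. The discretization and parameters below are ours, purely for illustration.

```python
import numpy as np

def measure_once(w0, n_levels=50, rng=None):
    """Zero-drift weight pumping on a grid of n_levels steps; absorb at 0 or 1."""
    rng = rng or np.random.default_rng()
    k = round(w0 * n_levels)                     # current weight = k / n_levels
    while 0 < k < n_levels:
        k += 1 if rng.random() < 0.5 else -1     # unbiased 'gambler's ruin' step
    return k // n_levels                         # 1 if the 'up' outcome absorbed the weight

rng = np.random.default_rng(7)
w0, n_runs = 0.3, 5_000
wins = sum(measure_once(w0, rng=rng) for _ in range(n_runs))
# Under the fair-game (martingale) condition the absorption probability equals the
# initial weight, i.e. Born's rule: P(up) ~ w0.
print(f"empirical P(up) = {wins / n_runs:.3f}  (Born rule predicts {w0})")
```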
DIETARY EXPOSURES OF YOUNG CHILDREN, PART 3: MODELLING
A deterministic model was used to model dietary exposure of young children. Parameters included pesticide residue on food before handling, surface pesticide loading, transfer efficiencies and children's activity patterns. Three components of dietary pesticide exposure were includ...
Mesoscopic and continuum modelling of angiogenesis
Spill, F.; Guerrero, P.; Alarcon, T.; Maini, P. K.; Byrne, H. M.
2016-01-01
Angiogenesis is the formation of new blood vessels from pre-existing ones in response to chemical signals secreted by, for example, a wound or a tumour. In this paper, we propose a mesoscopic lattice-based model of angiogenesis, in which processes that include proliferation and cell movement are considered as stochastic events. By studying the dependence of the model on the lattice spacing and the number of cells involved, we are able to derive the deterministic continuum limit of our equations and compare it to similar existing models of angiogenesis. We further identify conditions under which the use of continuum models is justified, and others for which stochastic or discrete effects dominate. We also compare different stochastic models for the movement of endothelial tip cells which have the same macroscopic, deterministic behaviour, but lead to markedly different behaviour in terms of production of new vessel cells. PMID:24615007
Modeled Impacts of Chronic Wasting Disease on White-Tailed Deer in a Semi-Arid Environment.
Foley, Aaron M; Hewitt, David G; DeYoung, Charles A; DeYoung, Randy W; Schnupp, Matthew J
2016-01-01
White-tailed deer are a culturally and economically important game species in North America, especially in South Texas. The recent discovery of chronic wasting disease (CWD) in captive deer facilities in Texas has increased concern about the potential emergence of CWD in free-ranging deer. The concern is exacerbated because much of the South Texas region is a semi-arid environment with variable rainfall, where precipitation is strongly correlated with fawn recruitment. Further, the marginally productive rangelands, in combination with erratic fawn recruitment, results in populations that are frequently density-independent, and thus sensitive to additive mortality. It is unknown how a deer population in semi-arid regions would respond to the presence of CWD. We used long-term empirical datasets from a lightly harvested (2% annual harvest) population in conjunction with 3 prevalence growth rates from CWD afflicted areas (0.26%, 0.83%, and 2.3% increases per year) via a multi-stage partially deterministic model to simulate a deer population for 25 years under four scenarios: 1) without CWD and without harvest, 2) with CWD and without harvest, 3) with CWD and male harvest only, and 4) with CWD and harvest of both sexes. The modeled populations without CWD and without harvest averaged a 1.43% annual increase over 25 years; incorporation of 2% annual harvest of both sexes resulted in a stable population. The model with slowest CWD prevalence rate growth (0.26% annually) without harvest resulted in stable populations but the addition of 1% harvest resulted in population declines. Further, the male age structure in CWD models became skewed to younger age classes. We incorporated fawn:doe ratios from three CWD afflicted areas in Wisconsin and Wyoming into the model with 0.26% annual increase in prevalence and populations did not begin to decline until ~10%, ~16%, and ~26% of deer were harvested annually. Deer populations in variable environments rely on high adult survivorship to buffer the low and erratic fawn recruitment rates. The increase in additive mortality rates for adults via CWD negatively impacted simulated population trends to the extent that hunter opportunity would be greatly reduced. Our results improve understanding of the potential influences of CWD on deer populations in semi-arid environments with implications for deer managers, disease ecologists, and policy makers.
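The qualitative structure of such a projection (erratic rainfall-driven recruitment, baseline adult survival, and additive CWD and harvest mortality) can be sketched in a few lines. The toy model below uses entirely illustrative parameter values of our own, not the fitted stage-structured model of the paper, and is meant only to show how additive CWD mortality erodes growth in a recruitment-limited herd.

```python
import numpy as np

def project_deer(years=25, n0=10_000, cwd_growth=0.0026, harvest=0.02,
                 adult_survival=0.85, cwd_mortality=0.3, seed=0):
    """Toy annual projection with rainfall-dependent recruitment and additive
    CWD and harvest mortality on adults (illustrative parameters only)."""
    rng = np.random.default_rng(seed)
    n, prevalence, trajectory = float(n0), 0.0, [float(n0)]
    for _ in range(years):
        fawns_per_doe = rng.choice([0.2, 0.4, 0.8], p=[0.4, 0.4, 0.2])  # erratic rainfall years
        recruits = 0.5 * n * fawns_per_doe            # roughly half the herd are does
        survival = adult_survival - prevalence * cwd_mortality - harvest
        n = max(n * max(survival, 0.0) + recruits, 0.0)
        prevalence = min(prevalence + cwd_growth, 1.0)
        trajectory.append(n)
    return np.array(trajectory)

no_cwd = project_deer(cwd_growth=0.0)        # same rainfall sequence (same seed), no disease
with_cwd = project_deer(cwd_growth=0.0026)   # slowest prevalence growth rate from the abstract
print("25-yr change without CWD: %+.1f%%" % (100 * (no_cwd[-1] / no_cwd[0] - 1)))
print("25-yr change with CWD:    %+.1f%%" % (100 * (with_cwd[-1] / with_cwd[0] - 1)))
```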
Modeled Impacts of Chronic Wasting Disease on White-Tailed Deer in a Semi-Arid Environment
Foley, Aaron M.; Hewitt, David G.; DeYoung, Charles A.; DeYoung, Randy W.; Schnupp, Matthew J.
2016-01-01
White-tailed deer are a culturally and economically important game species in North America, especially in South Texas. The recent discovery of chronic wasting disease (CWD) in captive deer facilities in Texas has increased concern about the potential emergence of CWD in free-ranging deer. The concern is exacerbated because much of the South Texas region is a semi-arid environment with variable rainfall, where precipitation is strongly correlated with fawn recruitment. Further, the marginally productive rangelands, in combination with erratic fawn recruitment, result in populations that are frequently density-independent, and thus sensitive to additive mortality. It is unknown how a deer population in semi-arid regions would respond to the presence of CWD. We used long-term empirical datasets from a lightly harvested (2% annual harvest) population in conjunction with 3 prevalence growth rates from CWD-afflicted areas (0.26%, 0.83%, and 2.3% increases per year) via a multi-stage, partially deterministic model to simulate a deer population for 25 years under four scenarios: 1) without CWD and without harvest, 2) with CWD and without harvest, 3) with CWD and male harvest only, and 4) with CWD and harvest of both sexes. The modeled populations without CWD and without harvest averaged a 1.43% annual increase over 25 years; incorporation of 2% annual harvest of both sexes resulted in a stable population. The model with the slowest CWD prevalence growth rate (0.26% annually) and without harvest resulted in stable populations, but the addition of 1% harvest resulted in population declines. Further, the male age structure in the CWD models became skewed toward younger age classes. We incorporated fawn:doe ratios from three CWD-afflicted areas in Wisconsin and Wyoming into the model with a 0.26% annual increase in prevalence, and populations did not begin to decline until ~10%, ~16%, and ~26% of deer were harvested annually. Deer populations in variable environments rely on high adult survivorship to buffer the low and erratic fawn recruitment rates. The increase in additive mortality rates for adults via CWD negatively impacted simulated population trends to the extent that hunter opportunity would be greatly reduced. Our results improve understanding of the potential influences of CWD on deer populations in semi-arid environments, with implications for deer managers, disease ecologists, and policy makers. PMID:27711208
Bayesian probabilistic population projections for all countries
Raftery, Adrian E.; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K.
2012-01-01
Projections of countries’ future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a Bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using Bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out of sample experiment in which data from 1950–1990 are used for estimation, and applied to predict 1990–2010. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest old from about 2050 and that they underestimate uncertainty for high fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20–64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades. PMID:22908249
Santangelo, James S; Johnson, Marc T J; Ness, Rob W
2018-05-16
Urban environments offer the opportunity to study the role of adaptive and non-adaptive evolutionary processes on an unprecedented scale. While the presence of parallel clines in heritable phenotypic traits is often considered strong evidence for the role of natural selection, non-adaptive evolutionary processes can also generate clines, and this may be more likely when traits have a non-additive genetic basis due to epistasis. In this paper, we use spatially explicit simulations modelled according to the cyanogenesis (hydrogen cyanide, HCN) polymorphism in white clover (Trifolium repens) to examine the formation of phenotypic clines along urbanization gradients under varying levels of drift, gene flow and selection. HCN results from an epistatic interaction between two Mendelian-inherited loci. Our results demonstrate that the genetic architecture of this trait makes natural populations susceptible to decreases in HCN frequencies via drift. Gradients in the strength of drift across a landscape resulted in phenotypic clines with lower frequencies of HCN in strongly drifting populations, giving the misleading appearance of deterministic adaptive changes in the phenotype. Studies of heritable phenotypic change in urban populations should generate null models of phenotypic evolution based on the genetic architecture underlying focal traits prior to invoking selection's role in generating adaptive differentiation. © 2018 The Author(s).
Accounting for randomness in measurement and sampling in studying cancer cell population dynamics.
Ghavami, Siavash; Wolkenhauer, Olaf; Lahouti, Farshad; Ullah, Mukhtar; Linnebacher, Michael
2014-10-01
Knowing the expected temporal evolution of the proportion of different cell types in sample tissues gives an indication of the progression of the disease and its possible response to drugs. Such systems have been modelled using Markov processes. We here consider an experimentally realistic scenario in which transition probabilities are estimated from noisy cell population size measurements. Using aggregated data of FACS measurements, we develop MMSE and ML estimators and formulate two problems to find the minimum number of required samples and measurements to guarantee the accuracy of predicted population sizes. Our numerical results show that the transition probabilities and steady states to which the estimates converge differ widely from the real values if one uses the standard deterministic approach for noisy measurements. This supports our argument that, for the analysis of FACS data, one should treat the observed state as a random variable. The second problem we address concerns the consequences of estimating the probability of a cell being in a particular state from measurements of a small population of cells. We show how the uncertainty arising from small sample sizes can be captured by a distribution for the state probability.
Fishing Quotas, Induced Allee Effect, and Fluctuation-Driven Extinction.
Hastings, Harold M; Radin, Michael; Wiandt, Tamas
2017-01-01
We explore the potential of modifications to standard fishery models (for example, Gordon-Schaefer-Munro) to help understand events such as the collapse of the North Atlantic cod fishery. In particular, we find that quota-driven and similar harvesting strategies induce an effective strong Allee effect (collapse if the population falls below a critical level). In the presence of environmental noise, the fish population dynamics resemble a random walk with (non-linear) drift. The expected survival time (first-passage time to collapse) is shown to depend sensitively upon the amount of environmental noise and the size of the 'safe zone' between the deterministic steady-state population and the critical population level at which the system collapses; more precisely, it is exponential in the cube of the size of the safe zone divided by the variance of the noise process. Similar scaling can be expected for survival in more general systems with multiple steady states. Our calculations imply an amplification effect under which small increases in harvest yield large decreases in expected survival time, and one should be cautious about changes in harvesting, especially in fisheries with poor or limited data and fisheries affected by climate change.
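The mechanism can be sketched in a few lines: a fixed quota subtracted from logistic growth creates an upper stable equilibrium and a lower unstable (Allee-type) threshold, and environmental noise drives occasional first passages below that threshold. The parameter values below are illustrative assumptions, not the authors' model or fishery data.

    import numpy as np

    def mean_collapse_time(sigma, r=1.0, K=1.0, h=0.2, dt=0.02, t_max=5000.0,
                           n_rep=200, seed=0):
        """Euler-Maruyama ensemble for dx = (r*x*(1 - x/K) - h) dt + sigma dW.
        Returns the mean first time the stock crosses the quota-induced collapse
        threshold (censored at t_max for runs that never collapse)."""
        rng = np.random.default_rng(seed)
        disc = np.sqrt(1.0 - 4.0 * h / (r * K))
        x_upper = K * (1.0 + disc) / 2.0     # stable harvested equilibrium
        x_crit = K * (1.0 - disc) / 2.0      # quota-induced Allee threshold
        x = np.full(n_rep, x_upper)
        hit = np.full(n_rep, t_max)          # first-passage times (censored)
        alive = np.ones(n_rep, dtype=bool)
        for i in range(1, int(t_max / dt) + 1):
            dw = rng.standard_normal(n_rep) * np.sqrt(dt)
            x = np.where(alive, x + (r * x * (1 - x / K) - h) * dt + sigma * dw, x)
            newly = alive & (x <= x_crit)
            hit[newly] = i * dt
            alive &= ~newly
            if not alive.any():
                break
        return hit.mean()

    if __name__ == "__main__":
        for sigma in (0.08, 0.12, 0.16):
            print(f"sigma={sigma}: mean collapse time ~ {mean_collapse_time(sigma):.0f}")

Raising the noise intensity (or the quota h, which narrows the safe zone between x_upper and x_crit) shortens the mean collapse time dramatically, in the spirit of the amplification effect described above.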
Study of a tri-trophic prey-dependent food chain model of interacting populations.
Haque, Mainul; Ali, Nijamuddin; Chakravarty, Santabrata
2013-11-01
The current paper accounts for the influence of intra-specific competition among predators in a prey-dependent tri-trophic food chain model of interacting populations. We offer a detailed mathematical analysis of the proposed food chain model to illustrate some of the significant results that have arisen from the interplay of deterministic ecological phenomena and processes. Biologically feasible equilibria of the system are obtained and the behaviour of the system around each of them is described. In particular, persistence, stability (local and global) and bifurcation (saddle-node, transcritical, Hopf-Andronov) analyses of this model are presented. Relevant results from previous well-known food chain models are compared with the current findings. Global stability analysis is also carried out by constructing appropriate Lyapunov functions. Numerical simulations show that the present system is capable of producing chaotic dynamics when the rate of self-interaction is very low, and that such chaotic behaviour disappears above a certain value of the rate of self-interaction. In addition, numerical simulations with experimental parameter values confirm the analytical results and show that intra-specific competition plays a potential role in controlling the chaotic dynamics of the system; the role of self-interaction in food chain models is thus illustrated for the first time. Finally, a discussion of the ecological applications of the analytical and numerical findings concludes the paper. Copyright © 2013 Elsevier Inc. All rights reserved.
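To make the kind of system described above concrete, here is a sketch of a prey-dependent three-species chain with an added intraspecific-competition (self-interaction) term -c*z^2 for the top predator. The functional forms and parameter values (a Hastings-Powell-style parameterization) are illustrative assumptions, not the equations analysed in the paper.

    import numpy as np
    from scipy.integrate import solve_ivp

    def food_chain(t, u, a1=5.0, b1=3.0, a2=0.1, b2=2.0, d1=0.4, d2=0.01, c=0.0):
        """Prey-dependent tri-trophic chain with an extra intraspecific-competition
        term -c*z**2 acting on the top predator z."""
        x, y, z = u
        f1 = a1 * x / (1.0 + b1 * x)        # prey -> intermediate predator response
        f2 = a2 * y / (1.0 + b2 * y)        # intermediate -> top predator response
        return [x * (1.0 - x) - f1 * y,
                f1 * y - f2 * z - d1 * y,
                f2 * z - d2 * z - c * z * z]

    if __name__ == "__main__":
        u0 = [0.8, 0.2, 8.0]
        for c in (0.0, 0.02):               # no self-interaction vs. weak self-interaction
            sol = solve_ivp(food_chain, (0.0, 2000.0), u0,
                            args=(5.0, 3.0, 0.1, 2.0, 0.4, 0.01, c),
                            dense_output=True, rtol=1e-8, atol=1e-10)
            z_tail = sol.sol(np.linspace(1000, 2000, 2000))[2]
            print(f"c={c}: top-predator range on the attractor ~ "
                  f"[{z_tail.min():.2f}, {z_tail.max():.2f}]")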
NASA Astrophysics Data System (ADS)
Castaneda-Lopez, Homero
A methodology for detecting and locating defects or discontinuities on the outside covering of coated metal underground pipelines subjected to cathodic protection has been addressed. On the basis of wide range AC impedance signals for various frequencies applied to a steel-coated pipeline system and by measuring its corresponding transfer function under several laboratory simulation scenarios, a physical laboratory setup of an underground cathodic-protected, coated pipeline was built. This model included different variables and elements that exist under real conditions, such as soil resistivity, soil chemical composition, defect (holiday) location in the pipeline covering, defect area and geometry, and level of cathodic protection. The AC impedance data obtained under different working conditions were used to fit an electrical transmission line model. This model was then used as a tool to fit the impedance signal for different experimental conditions and to establish trends in the impedance behavior without the necessity of further experimental work. However, due to the chaotic nature of the transfer function response of this system under several conditions, it is believed that non-deterministic models based on pattern recognition algorithms are suitable for field condition analysis. A non-deterministic approach was used for experimental analysis by applying an artificial neural network (ANN) algorithm based on classification analysis capable of studying the pipeline system and differentiating the variables that can change impedance conditions. These variables include level of cathodic protection, location of discontinuities (holidays), and severity of corrosion. This work demonstrated a proof-of-concept for a well-known technique and a novel algorithm capable of classifying impedance data for experimental results to predict the exact location of the active holidays and defects on the buried pipelines. Laboratory findings from this procedure are promising, and efforts to develop it for field conditions should continue.
Diaz-Ruelas, Alvaro; Jeldtoft Jensen, Henrik; Piovani, Duccio; Robledo, Alberto
2016-12-01
It is well known that low-dimensional nonlinear deterministic maps close to a tangent bifurcation exhibit intermittency, and this circumstance has been exploited, e.g., by Procaccia and Schuster [Phys. Rev. A 28, 1210 (1983)], to develop a general theory of 1/f spectra. This suggests it is interesting to study the extent to which the behavior of a high-dimensional stochastic system can be described by such tangent maps. The Tangled Nature (TaNa) Model of evolutionary ecology is an ideal candidate for such a study, a significant model as it is capable of reproducing a broad range of the phenomenology of macroevolution and ecosystems. The TaNa model exhibits strong intermittency reminiscent of punctuated equilibrium and, like the fossil record of mass extinction, the intermittency in the model is found to be non-stationary, a feature typical of many complex systems. We derive a mean-field version for the evolution of the likelihood function controlling the reproduction of species and find a local map close to tangency. This mean-field map, owing to the local nature of our approximation, is able to describe qualitatively only one episode of the intermittent dynamics of the full TaNa model. To complement this result, we construct a complete nonlinear dynamical system model consisting of successive tangent bifurcations that generates time evolution patterns resembling those of the full TaNa model on macroscopic scales. The switch from one tangent bifurcation to the next in the sequences produced in this model is stochastic in nature, based on criteria obtained from the local mean-field approximation, and capable of imitating the changing set of types of species and total population in the TaNa model. The model combines fully deterministic dynamics with instantaneous parameter random jumps at stochastically drawn times. In spite of the limitations of our approach, which entails a drastic collapse of degrees of freedom, the description of a high-dimensional model system in terms of a low-dimensional one appears to be illuminating.
Hidden order in crackling noise during peeling of an adhesive tape.
Kumar, Jagadish; Ciccotti, M; Ananthakrishna, G
2008-04-01
We address the longstanding problem of recovering dynamical information from noisy acoustic emission signals arising from the peeling of an adhesive tape subject to constant traction velocity. Using the phase-space reconstruction procedure, we demonstrate deterministic chaotic dynamics by establishing the existence of a correlation dimension as well as a positive Lyapunov exponent in a midrange of traction velocities. The results are explained on the basis of a model that also emphasizes the deterministic origin of acoustic emission by clarifying its connection to stick-slip dynamics.
MARKOV: A methodology for the solution of infinite time horizon MARKOV decision processes
Williams, B.K.
1988-01-01
Algorithms are described for determining optimal policies for finite state, finite action, infinite discrete time horizon Markov decision processes. Both value-improvement and policy-improvement techniques are used in the algorithms. Computing procedures are also described. The algorithms are appropriate for processes that are either finite or infinite, deterministic or stochastic, discounted or undiscounted, in any meaningful combination of these features. Computing procedures are described in terms of initial data processing, bound improvements, process reduction, and testing and solution. Application of the methodology is illustrated with an example involving natural resource management. Management implications of certain hypothesized relationships between mallard survival and harvest rates are addressed by applying the optimality procedures to mallard population models.
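For concreteness, here is a generic value-iteration sketch for a small finite-state, finite-action, discounted Markov decision process. The states, transition probabilities and rewards are hypothetical placeholders loosely in the spirit of a harvest-management example; this is not the MARKOV software or the mallard model described above.

    import numpy as np

    def value_iteration(P, R, gamma=0.95, tol=1e-8):
        """P[a, s, s'] = transition probabilities, R[s, a] = expected reward.
        Returns the optimal value function and a greedy (optimal) policy."""
        V = np.zeros(P.shape[1])
        while True:
            # Q[s, a] = R[s, a] + gamma * sum_s' P[a, s, s'] * V[s']
            Q = R + gamma * np.einsum("ast,t->sa", P, V)
            V_new = Q.max(axis=1)
            if np.max(np.abs(V_new - V)) < tol:
                return V_new, Q.argmax(axis=1)
            V = V_new

    if __name__ == "__main__":
        # Hypothetical 3-state population (low/medium/high), 2 actions (light/heavy harvest).
        P = np.array([[[0.7, 0.3, 0.0], [0.2, 0.6, 0.2], [0.0, 0.3, 0.7]],   # light harvest
                      [[0.9, 0.1, 0.0], [0.5, 0.4, 0.1], [0.1, 0.5, 0.4]]])  # heavy harvest
        R = np.array([[0.0, 0.0], [1.0, 2.0], [2.0, 4.0]])                   # reward per (state, action)
        V, policy = value_iteration(P, R)
        print("optimal values:", np.round(V, 2), "policy (0=light, 1=heavy):", policy)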
3D Dynamic Rupture Simulations along Dipping Faults, with a focus on the Wasatch Fault Zone, Utah
NASA Astrophysics Data System (ADS)
Withers, K.; Moschetti, M. P.
2017-12-01
We study dynamic rupture and ground motion from dip-slip faults in regions that have high seismic hazard, such as the Wasatch fault zone, Utah. Previous numerical simulations have modeled deterministic ground motion along segments of this fault in the heavily populated regions near Salt Lake City but were restricted to low frequencies (≤ 1 Hz). We seek to better understand the rupture process and assess broadband ground motions and variability from the Wasatch Fault Zone by extending deterministic ground motion prediction to higher frequencies (up to 5 Hz). We perform simulations along a dipping normal fault (40 x 20 km along strike and width, respectively) with characteristics derived from geologic observations to generate a suite of ruptures > Mw 6.5. This approach utilizes dynamic simulations (fully physics-based models, where the initial stress drop and friction law are imposed) using a summation-by-parts (SBP) method. The simulations include rough-fault topography following a self-similar fractal distribution (over length scales from 100 m to the size of the fault) in addition to off-fault plasticity. Energy losses from heat and other mechanisms, modeled as anelastic attenuation, are also included, as is free-surface topography, which can significantly affect ground motion patterns. We compare the effects that material structure and both rate-and-state and slip-weakening friction laws have on rupture propagation. The simulations show reduced slip and moment release in the near surface with the inclusion of plasticity, in better agreement with observations of shallow slip deficit. Long-wavelength fault geometry imparts a non-uniform stress distribution along both dip and strike, influencing the preferred rupture direction and hypocenter location, which is potentially important for seismic hazard estimation.
Toward a Deterministic Model of Planetary Formation. IV. Effects of Type I Migration
NASA Astrophysics Data System (ADS)
Ida, S.; Lin, D. N. C.
2008-01-01
In a further development of a deterministic planet formation model (Ida & Lin), we consider the effect of type I migration of protoplanetary embryos due to their tidal interaction with their nascent disks. During the early phase of protostellar disks, although embryos rapidly emerge in regions interior to the ice line, uninhibited type I migration leads to their efficient self-clearing. But embryos continue to form from residual planetesimals, repeatedly migrate inward, and provide a main channel of heavy-element accretion onto their host stars. During the advanced stages of disk evolution (a few Myr), the gas surface density declines to values comparable to or smaller than that of the minimum mass nebula model, and type I migration is no longer effective for Mars-mass embryos. Over wide ranges of initial disk surface densities and type I migration efficiencies, the surviving population of embryos interior to the ice line has a total mass of several M⊕. With this reservoir, there is an adequate inventory of residual embryos to subsequently assemble into rocky planets similar to those around the Sun. However, the onset of efficient gas accretion requires the emergence and retention of cores more massive than a few M⊕ prior to the severe depletion of the disk gas. The formation probability of gas giant planets and hence the predicted mass and semimajor axis distributions of extrasolar gas giants are sensitively determined by the strength of type I migration. We suggest that the distributions consistent with observations can be reproduced only if the actual type I migration timescale is at least an order of magnitude longer than that deduced from linear theories.
The general situation (exemplified here in urban areas) in which a significant degree of sub-grid variability (SGV) exists in grid models poses problems when comparing grid-based air quality modeling results with observations. Typically, grid models ignore or parameterize processes ...
Corrêa, Nilton Lavatori; de Sá, Lidia Vasconcellos; de Mello, Rossana Corbo Ramalho
2017-02-01
An increase in the incidence of second primary cancers is the late effect of greatest concern that could occur in differentiated thyroid carcinoma (DTC) patients treated with radioactive iodine (RAI). The decision to treat a patient with RAI should therefore incorporate a careful risk-benefit analysis. The objective of this work was to adapt the risk-estimation models developed by the Biological Effects of Ionizing Radiation Committee to local epidemiological characteristics in order to assess the carcinogenesis risk from radiation in a population of Brazilian DTC patients treated with RAI. Absorbed radiation doses in critical organs were also estimated to determine whether they exceeded the thresholds for deterministic effects. A total of 416 DTC patients treated with RAI were retrospectively studied. Four organs were selected for absorbed dose estimation and subsequent calculation of carcinogenic risk: the kidney, stomach, salivary glands, and bone marrow. Absorbed doses were calculated using dose factors (absorbed dose per unit activity administered) previously established and based on standard human models. The lifetime attributable risk (LAR) of cancer incidence as a function of age, sex, and organ-specific dose was estimated and related to the activity of RAI administered in the initial treatment. The salivary glands received the greatest absorbed doses of radiation, followed by the stomach, kidney, and bone marrow. None of these, however, surpassed the threshold for deterministic effects for a single administration of RAI. Younger patients received the same level of absorbed dose in the critical organs as older patients did. The lifetime attributable risk for stomach cancer incidence was by far the highest, followed in descending order by salivary-gland cancer, leukemia, and kidney cancer. RAI in a single administration is safe in terms of deterministic effects because even high administered activities do not result in absorbed doses that exceed the thresholds for significant tissue reactions. The Biological Effects of Ionizing Radiation Committee mathematical models are a practical method of quantifying the risks of a second primary cancer, demonstrating a marked decrease in risk for younger patients with the administration of lower RAI activities and suggesting that only the smallest activities necessary to promote an effective ablation should be administered in low-risk DTC patients.
A variational method for analyzing limit cycle oscillations in stochastic hybrid systems
NASA Astrophysics Data System (ADS)
Bressloff, Paul C.; MacLaurin, James
2018-06-01
Many systems in biology can be modeled through ordinary differential equations that are piecewise continuous and switch between different states according to a Markov jump process; such a system is known as a stochastic hybrid system or piecewise deterministic Markov process (PDMP). In the fast-switching limit, the dynamics converges to a deterministic ODE. In this paper, we develop a phase reduction method for stochastic hybrid systems that support a stable limit cycle in the deterministic limit. A classic example is the Morris-Lecar model of a neuron, where the switching Markov process is the number of open ion channels and the continuous process is the membrane voltage. We outline a variational principle for the phase reduction, yielding an exact analytic expression for the resulting phase dynamics. We demonstrate that this decomposition is accurate over timescales that are exponential in the switching rate ε⁻¹. That is, we show that for a constant C, the probability that the expected time to leave an O(a) neighborhood of the limit cycle is less than T scales as T exp(-Ca/ε).
Analysis of stochastic model for non-linear volcanic dynamics
NASA Astrophysics Data System (ADS)
Alexandrov, D.; Bashkirtseva, I.; Ryashko, L.
2014-12-01
Motivated by important geophysical applications, we consider a dynamic model of the magma-plug system previously derived by Iverson et al. (2006) under the influence of stochastic forcing. Due to the strong nonlinearity of the friction force for the solid plug along its margins, the initial deterministic system exhibits impulsive oscillations. Two types of dynamic behavior of the system under the influence of the parametric stochastic forcing have been found: random trajectories are either scattered on both sides of the deterministic cycle or grouped on its internal side only. It is shown that dispersions are highly inhomogeneous along cycles in the presence of noise. The effects of noise-induced shifts, pressure stabilization and localization of random trajectories are revealed as the noise intensity increases. The plug velocity, pressure and displacement are highly dependent on the noise intensity as well. These new stochastic phenomena are related to the nonlinear peculiarities of the deterministic phase portrait. It is demonstrated that the repetitive stick-slip motions of the magma-plug system in the case of stochastic forcing can be connected with drumbeat earthquakes.
Modelling the interaction between flooding events and economic growth
NASA Astrophysics Data System (ADS)
Grames, J.; Prskawetz, A.; Grass, D.; Blöschl, G.
2015-06-01
Socio-hydrology describes the interaction between the socio-economy and water. Recent models analyze the interplay of community risk-coping culture, flooding damage and economic growth (Di Baldassarre et al., 2013; Viglione et al., 2014). These models descriptively explain the feedbacks between socio-economic development and natural disasters such as floods. In contrast to these descriptive models, our approach develops an optimization model in which the intertemporal decision of an economic agent interacts with the hydrological system. In order to build this first economic growth model describing the interaction between the consumption and investment decisions of an economic agent and the occurrence of flooding events, we transform an existing descriptive stochastic model into an optimal deterministic model. The intermediate step is to formulate and simulate a descriptive deterministic model. We develop a periodic water function to approximate the former discrete stochastic time series of rainfall events. Due to the non-autonomous, exogenous periodic rainfall function, the long-term path of consumption and investment will be periodic.
Automated Calibration For Numerical Models Of Riverflow
NASA Astrophysics Data System (ADS)
Fernandez, Betsaida; Kopmann, Rebekka; Oladyshkin, Sergey
2017-04-01
Calibration of numerical models has been fundamental since the beginning of all types of hydro-system modeling, in order to approximate the parameters that can mimic the overall system behavior. Thus, an assessment of different deterministic and stochastic optimization methods is undertaken to compare their robustness, computational feasibility, and global search capacity. Also, the uncertainty of the most suitable methods is analyzed. These optimization methods minimize an objective function that compares synthetic measurements and simulated data. Synthetic measurement data replace the observed data set to guarantee an existing parameter solution. The input data for the objective function derive from a hydro-morphological dynamics numerical model which represents a 180-degree bend channel. The hydro-morphological numerical model shows a high level of ill-posedness in the mathematical problem. The minimization of the objective function by the different candidate optimization methods indicates a failure of some of the gradient-based methods, such as Newton Conjugate Gradient and BFGS. Others show only partial convergence, such as Nelder-Mead, Polak and Ribière, L-BFGS-B, Truncated Newton Conjugate Gradient, and Trust-Region Newton Conjugate Gradient. Further ones yield parameter solutions that lie outside the physical limits, such as Levenberg-Marquardt and LeastSquareRoot. Moreover, there is a significant computational demand for genetic optimization methods, such as Differential Evolution and Basin-Hopping, as well as for Brute Force methods. The deterministic Sequential Least Squares Programming method and the stochastic Bayesian inference approach produce the best optimization results. Keywords: automated calibration of a hydro-morphological dynamic numerical model, Bayesian inference theory, deterministic optimization methods.
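The comparison of candidate optimizers can be mimicked on a small synthetic problem. The objective below is a hypothetical nonlinear least-squares surrogate standing in for the hydro-morphological bend-channel model, and the method list only echoes part of the study's candidates; nothing here reproduces the study's setup.

    import numpy as np
    from scipy.optimize import minimize, differential_evolution

    rng = np.random.default_rng(1)
    true_params = np.array([0.025, 1.8])               # hypothetical "friction" and "transport" factors
    x_grid = np.linspace(0.1, 1.0, 20)

    def simulated_data(params):
        """Stand-in for the numerical model: a simple nonlinear response."""
        a, b = params
        return a * np.exp(b * x_grid)

    synthetic_obs = simulated_data(true_params) + rng.normal(0.0, 1e-3, x_grid.size)

    def objective(params):
        """Sum of squared differences between synthetic measurements and simulated data."""
        return np.sum((simulated_data(params) - synthetic_obs) ** 2)

    bounds = [(1e-3, 0.1), (0.5, 3.0)]
    x0 = np.array([0.05, 1.0])

    for name in ("Nelder-Mead", "L-BFGS-B", "TNC"):
        res = minimize(objective, x0, method=name,
                       bounds=bounds if name != "Nelder-Mead" else None)
        print(f"{name:12s} -> params {np.round(res.x, 4)}, objective {res.fun:.2e}")

    res = differential_evolution(objective, bounds, seed=1)   # stochastic/global search
    print(f"{'DiffEvol':12s} -> params {np.round(res.x, 4)}, objective {res.fun:.2e}")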
Forecasting residential electricity demand in provincial China.
Liao, Hua; Liu, Yanan; Gao, Yixuan; Hao, Yu; Ma, Xiao-Wei; Wang, Kan
2017-03-01
In China, more than 80% of electricity comes from coal, which dominates CO2 emissions. Residential electricity demand forecasting plays a significant role in electricity infrastructure planning and energy policy design, but it is challenging to make an accurate forecast for developing countries. This paper forecasts the provincial residential electricity consumption of China in the 13th Five-Year-Plan (2016-2020) period using panel data. To overcome the limitations of widely used prediction models that rely on unreliable prior knowledge of functional forms, a robust piecewise linear model in reduced form is utilized to capture the non-deterministic relationship between income and residential electricity consumption. The forecast results suggest that the growth rates of developed provinces will slow down, while the less developed provinces will still grow quickly. The national residential electricity demand will increase at 6.6% annually during 2016-2020, and populous provinces such as Guangdong will be the main contributors to the increments.
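A reduced-form piecewise linear fit of the sort alluded to above can be sketched as follows. The income-consumption data are synthetic and the single-knot specification is an illustrative simplification, not the panel model estimated in the paper.

    import numpy as np

    def fit_one_knot(x, y, n_grid=100):
        """Least-squares fit of y ~ b0 + b1*x + b2*max(x - k, 0), searching the knot k on a grid."""
        best = None
        for k in np.linspace(x.min(), x.max(), n_grid)[1:-1]:
            X = np.column_stack([np.ones_like(x), x, np.maximum(x - k, 0.0)])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            sse = np.sum((X @ beta - y) ** 2)
            if best is None or sse < best[0]:
                best = (sse, k, beta)
        return best[1], best[2]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        income = rng.uniform(1, 10, 300)                      # synthetic per-capita income (arbitrary units)
        electricity = np.where(income < 5, 0.8 * income,      # demand growth slows at higher income
                               4.0 + 0.25 * (income - 5)) + rng.normal(0, 0.2, 300)
        knot, beta = fit_one_knot(income, electricity)
        print(f"estimated breakpoint ~ {knot:.2f}; slopes {beta[1]:.2f} (below) "
              f"and {beta[1] + beta[2]:.2f} (above)")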
Rabies epidemic model with uncertainty in parameters: crisp and fuzzy approaches
NASA Astrophysics Data System (ADS)
Ndii, M. Z.; Amarti, Z.; Wiraningsih, E. D.; Supriatna, A. K.
2018-03-01
A deterministic mathematical model is formulated to investigate the transmission dynamics of rabies. In particular, we investigate the effects of vaccination, carrying capacity and the transmission rate on rabies epidemics, and we allow for uncertainty in the parameters. We apply both crisp and fuzzy approaches. We find that, in the case of crisp parameters, rabies epidemics may be interrupted when the carrying capacity and the transmission rate are not high. Our findings suggest that limiting the growth of the dog population and reducing the potential contact between susceptible and infectious dogs may aid in interrupting rabies epidemics. We extend the work by considering a fuzzy carrying capacity, allowing for low, medium, and high levels of carrying capacity. The results confirm those obtained using a crisp carrying capacity: when the carrying capacity is not too high, vaccination can confine the disease effectively.
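A minimal compartmental sketch in the spirit of the deterministic model described above: logistic dog-population recruitment towards a carrying capacity K, mass-action transmission, and vaccination moving susceptibles into an immune class. The structure and all parameter values are illustrative assumptions, not the exact equations or parameters of the paper.

    import numpy as np
    from scipy.integrate import solve_ivp

    def rabies(t, y, r=1.0, K=5000.0, beta=5e-4, mu=0.08, delta=1.0, v=0.3):
        """S = susceptible, I = infectious, R = vaccinated/immune dogs.
        r: birth rate, K: carrying capacity, beta: transmission rate,
        mu: natural death rate, delta: rabies-induced death rate, v: vaccination rate."""
        S, I, R = y
        N = S + I + R
        births = r * N * (1.0 - N / K)        # logistic recruitment into the susceptible class
        dS = births - beta * S * I - v * S - mu * S
        dI = beta * S * I - (mu + delta) * I
        dR = v * S - mu * R
        return [dS, dI, dR]

    if __name__ == "__main__":
        y0 = [4000.0, 10.0, 0.0]
        for K, v in [(5000.0, 0.0), (5000.0, 0.3), (20000.0, 0.3)]:
            sol = solve_ivp(rabies, (0.0, 50.0), y0, args=(1.0, K, 5e-4, 0.08, 1.0, v),
                            rtol=1e-8, max_step=0.1)
            print(f"K={K:.0f}, v={v}: infectious dogs after 50 yr ~ {sol.y[1, -1]:.1f}")

Varying K and the vaccination rate v in this toy system illustrates the qualitative claim above: a larger carrying capacity sustains transmission, while vaccination and limited population growth suppress it.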
Salgia, Ravi; Mambetsariev, Isa; Hewelt, Blake; Achuthan, Srisairam; Li, Haiqing; Poroyko, Valeriy; Wang, Yingyu; Sattler, Martin
2018-05-25
Mathematical cancer models are immensely powerful tools that are based in part on the fractal nature of biological structures, such as the geometry of the lung. Cancers of the lung provide an opportune model in which to develop and apply algorithms that capture changes and disease phenotypes. We reviewed mathematical models that have been developed for the biological sciences and applied them in the context of small cell lung cancer (SCLC) growth, mutational heterogeneity, and mechanisms of metastasis. The ultimate goal is to characterize the stochastic and deterministic nature of this disease, to link this comprehensive set of tools back to its fractal nature, and to provide a platform for accurate biomarker development. These techniques may be particularly useful in the context of drug development research, such as in combination with existing omics approaches. The integration of these tools will be important to further understand the biology of SCLC and ultimately to develop novel therapeutics.
MIMICKING COUNTERFACTUAL OUTCOMES TO ESTIMATE CAUSAL EFFECTS.
Lok, Judith J
2017-04-01
In observational studies, treatment may be adapted to covariates at several times without a fixed protocol, in continuous time. Treatment influences covariates, which influence treatment, which influences covariates, and so on. Then even time-dependent Cox-models cannot be used to estimate the net treatment effect. Structural nested models have been applied in this setting. Structural nested models are based on counterfactuals: the outcome a person would have had had treatment been withheld after a certain time. Previous work on continuous-time structural nested models assumes that counterfactuals depend deterministically on observed data, while conjecturing that this assumption can be relaxed. This article proves that one can mimic counterfactuals by constructing random variables, solutions to a differential equation, that have the same distribution as the counterfactuals, even given past observed data. These "mimicking" variables can be used to estimate the parameters of structural nested models without assuming the treatment effect to be deterministic.
NASA Astrophysics Data System (ADS)
Li, Fei; Subramanian, Kartik; Chen, Minghan; Tyson, John J.; Cao, Yang
2016-06-01
The asymmetric cell division cycle in Caulobacter crescentus is controlled by an elaborate molecular mechanism governing the production, activation and spatial localization of a host of interacting proteins. In previous work, we proposed a deterministic mathematical model for the spatiotemporal dynamics of six major regulatory proteins. In this paper, we study a stochastic version of the model, which takes into account molecular fluctuations of these regulatory proteins in space and time during the early stages of the cell cycle of wild-type Caulobacter cells. We test the stochastic model against experimental observations of increased variability of cycle time in cells depleted of the divJ gene product. The deterministic model predicts that overexpression of the divK gene blocks cell cycle progression in the stalked stage; however, stochastic simulations suggest that a small fraction of the mutant cells do complete the cell cycle normally.
Circular Conditional Autoregressive Modeling of Vector Fields
Modlin, Danny; Fuentes, Montse; Reich, Brian
2013-01-01
As hurricanes approach landfall, there are several hazards for which coastal populations must be prepared. Damaging winds, torrential rains, and tornadoes play havoc with both the coast and inland areas, but the biggest seaside menace to life and property is the storm surge. Wind fields are used as the primary forcing for numerical forecasts of the coastal ocean response to hurricane-force winds, such as the height of the storm surge and the degree of coastal flooding. Unfortunately, developments in deterministic modeling of these forcings have been hindered by computational expense. In this paper, we present a multivariate spatial model for vector fields that we apply to hurricane winds. We parameterize the wind vector at each site in polar coordinates and specify a circular conditional autoregressive (CCAR) model for the vector direction, and a spatial CAR model for speed. We apply our framework for vector fields to hurricane surface wind fields for Hurricane Floyd of 1999 and compare our CCAR model to prior methods that decompose wind speed and direction into their N-S and W-E cardinal components. PMID:24353452
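A small sketch of the parameterization step only: converting W-E (u) and N-S (v) wind components to the speed and direction variables a circular model would work with, and back. This is just the coordinate change (using the mathematical convention of an angle counterclockwise from east), not the CCAR/CAR spatial model itself, and the sample values are made up.

    import numpy as np

    def to_polar(u, v):
        """u = W-E (eastward) and v = N-S (northward) components -> (speed, direction in radians)."""
        speed = np.hypot(u, v)
        direction = np.arctan2(v, u)       # circular variable in (-pi, pi]
        return speed, direction

    def to_cartesian(speed, direction):
        return speed * np.cos(direction), speed * np.sin(direction)

    if __name__ == "__main__":
        u = np.array([10.0, -3.0, 0.5])    # made-up surface winds (m/s)
        v = np.array([2.0, 7.0, -6.0])
        s, d = to_polar(u, v)
        print("speed:", np.round(s, 2), "direction (rad):", np.round(d, 2))
        uu, vv = to_cartesian(s, d)
        print("round trip ok:", np.allclose(uu, u) and np.allclose(vv, v))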
ERIC Educational Resources Information Center
Denison, Stephanie; Trikutam, Pallavi; Xu, Fei
2014-01-01
A rich tradition in developmental psychology explores physical reasoning in infancy. However, no research to date has investigated whether infants can reason about physical objects that behave probabilistically, rather than deterministically. Physical events are often quite variable, in that similar-looking objects can be placed in similar…
Nonlinear forecasting as a way of distinguishing chaos from measurement error in time series
NASA Astrophysics Data System (ADS)
Sugihara, George; May, Robert M.
1990-04-01
An approach is presented for making short-term predictions about the trajectories of chaotic dynamical systems. The method is applied to data on measles, chickenpox, and marine phytoplankton populations, to show how apparent noise associated with deterministic chaos can be distinguished from sampling error and other sources of externally induced environmental noise.
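A toy illustration of the idea, under the assumption that a simple distance-weighted nearest-neighbour (method-of-analogues) predictor stands in for the authors' procedure: for a chaotic series, short-horizon forecast skill is high and decays with the prediction interval, whereas for uncorrelated noise it is poor at all horizons. The series and parameters below are synthetic choices, not the epidemiological or plankton data.

    import numpy as np

    def nn_forecast_skill(series, E=3, horizon=1, n_neighbors=4):
        """Forecast the second half of `series` from delay vectors built on the first half;
        returns the correlation between predictions and observations."""
        lib_end = len(series) // 2
        lib = np.array([series[i - E:i] for i in range(E, lib_end - horizon)])
        targets = np.array([series[i + horizon - 1] for i in range(E, lib_end - horizon)])
        preds, truth = [], []
        for i in range(lib_end + E, len(series) - horizon):
            query = series[i - E:i]
            d = np.linalg.norm(lib - query, axis=1)
            idx = np.argsort(d)[:n_neighbors]
            w = np.exp(-d[idx] / (d[idx][0] + 1e-12))       # distance-weighted analogues
            preds.append(np.sum(w * targets[idx]) / np.sum(w))
            truth.append(series[i + horizon - 1])
        return np.corrcoef(preds, truth)[0, 1]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        x = np.empty(2000)
        x[0] = 0.4
        for i in range(1, 2000):                            # chaotic logistic map
            x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])
        noise = rng.standard_normal(2000)
        for h in (1, 5, 10):
            print(f"horizon {h}: chaos rho={nn_forecast_skill(x, horizon=h):.2f}, "
                  f"noise rho={nn_forecast_skill(noise, horizon=h):.2f}")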
Amplification of intrinsic fluctuations by the Lorenz equations
NASA Astrophysics Data System (ADS)
Fox, Ronald F.; Elston, T. C.
1993-07-01
Macroscopic systems (e.g., hydrodynamics, chemical reactions, electrical circuits, etc.) manifest intrinsic fluctuations of molecular and thermal origin. When the macroscopic dynamics is deterministically chaotic, the intrinsic fluctuations may become amplified by several orders of magnitude. Numerical studies of this phenomenon are presented in detail for the Lorenz model. Amplification to macroscopic scales is exhibited, and quantitative methods (binning and a difference-norm) are presented for measuring macroscopically subliminal amplification effects. In order to test the quality of the numerical results, noise induced chaos is studied around a deterministically nonchaotic state, where the scaling law relating the Lyapunov exponent to noise strength obtained for maps is confirmed for the Lorenz model, a system of ordinary differential equations.
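The amplification effect can be reproduced qualitatively in a few lines: integrate the Lorenz equations twice from identical initial conditions, once without and once with a very weak additive noise term, and watch the separation grow to macroscopic size in the chaotic regime. The parameters are the classical chaotic values; the noise amplitude is an arbitrary illustrative choice, and this is not the binning/difference-norm analysis of the paper.

    import numpy as np

    def lorenz_step(state, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0, noise=0.0, rng=None):
        """One Euler(-Maruyama) step of the Lorenz equations with optional additive noise."""
        x, y, z = state
        drift = np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
        step = state + drift * dt
        if noise > 0.0:
            step += noise * np.sqrt(dt) * rng.standard_normal(3)
        return step

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        dt, n = 1e-3, 40_000
        a = b = np.array([1.0, 1.0, 1.0])
        sep = []
        for _ in range(n):
            a = lorenz_step(a, dt)                          # deterministic reference
            b = lorenz_step(b, dt, noise=1e-6, rng=rng)     # tiny "intrinsic" fluctuations
            sep.append(np.linalg.norm(a - b))
        for t in (5.0, 15.0, 30.0):
            print(f"t={t:>4}: separation ~ {sep[int(t / dt) - 1]:.3g}")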
Automated Assume-Guarantee Reasoning by Abstraction Refinement
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Giannakopoulou, Dimitra
2008-01-01
Current automated approaches for compositional model checking in the assume-guarantee style are based on learning of assumptions as deterministic automata. We propose an alternative approach based on abstraction refinement. Our new method computes the assumptions for the assume-guarantee rules as conservative and not necessarily deterministic abstractions of some of the components, and refines those abstractions using counter-examples obtained from model checking them together with the other components. Our approach also exploits the alphabets of the interfaces between components and performs iterative refinement of those alphabets as well as of the abstractions. We show experimentally that our preliminary implementation of the proposed alternative achieves similar or better performance than a previous learning-based implementation.
Deterministic Impulsive Vacuum Foundations for Quantum-Mechanical Wavefunctions
NASA Astrophysics Data System (ADS)
Valentine, John S.
2013-09-01
By assuming that a fermion de-constitutes immediately at source, that its constituents, as bosons, propagate uniformly as scalar vacuum terms with phase (radial) symmetry, and that fermions are unique solutions for specific phase conditions, we find a model that self-quantizes matter from continuous waves, unifying bosons and fermion ontologies in a single basis, in a constitution-invariant process. Vacuum energy has a wavefunction context, as a mass-energy term that enables wave collapse and increases its amplitude, with gravitational field as the gradient of the flux density. Gravitational and charge-based force effects emerge as statistics without special treatment. Confinement, entanglement, vacuum statistics, forces, and wavefunction terms emerge from the model's deterministic foundations.
The Constitutive Modeling of Thin Films with Random Material Wrinkles
NASA Technical Reports Server (NTRS)
Murphey, Thomas W.; Mikulas, Martin M.
2001-01-01
Material wrinkles drastically alter the structural constitutive properties of thin films. Normally linear elastic materials, when wrinkled, become highly nonlinear and initially inelastic. Stiffnesses reduced by 99% and negative Poisson's ratios are typically observed. This paper presents an effective continuum constitutive model for the elastic effects of material wrinkles in thin films. The model considers general two-dimensional stress and strain states (simultaneous bi-axial and shear stress/strain) and neglects out-of-plane bending. The constitutive model is derived from a traditional mechanics analysis of an idealized physical model of random material wrinkles. Model parameters are the directly measurable wrinkle characteristics of amplitude and wavelength. For these reasons, the equations are mechanistic and deterministic. The model is compared with bi-axial tensile test data for wrinkled Kapton (Registered Trademark) HN and is shown to deterministically predict strain as a function of stress with an average RMS error of 22%. On average, fitting the model to test data yields an RMS error of 1.2%.
INTEGRATED PLANNING MODEL - EPA APPLICATIONS
The Integrated Planning Model (IPM) is a multi-regional, dynamic, deterministic linear programming (LP) model of the electric power sector in the continental lower 48 states and the District of Columbia. It provides forecasts up to year 2050 of least-cost capacity expansion, elec...
NASA Astrophysics Data System (ADS)
Mannattil, Manu; Pandey, Ambrish; Verma, Mahendra K.; Chakraborty, Sagar
2017-12-01
Constructing simpler models, either stochastic or deterministic, for exploring the phenomenon of flow reversals in fluid systems is in vogue across disciplines. Using direct numerical simulations and nonlinear time series analysis, we illustrate that the basic nature of flow reversals in convecting fluids can depend on the dimensionless parameters describing the system. Specifically, we find evidence of low-dimensional behavior in flow reversals occurring at zero Prandtl number, whereas we fail to find such signatures for reversals at infinite Prandtl number. Thus, even in a single system, as one varies the system parameters, one can encounter reversals that are fundamentally different in nature. Consequently, we conclude that a single general low-dimensional deterministic model cannot faithfully characterize flow reversals for every set of parameter values.
Dinov, Martin; Leech, Robert
2017-01-01
Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty. Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural network-driven approach to microstate analysis is likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses. PMID:29163110
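A compact fuzzy C-means sketch of the clustering step described above, written directly in NumPy so the membership and centroid update rules are explicit. The "GFP-peak topographies" here are random synthetic vectors, the fuzzifier m and cluster count are conventional defaults, and nothing below reproduces the authors' EEG pipeline or the MLP labeling stage.

    import numpy as np

    def fuzzy_c_means(X, n_clusters=4, m=2.0, n_iter=100, tol=1e-6, seed=0):
        """X: (n_samples, n_channels). Returns (centers, memberships U) with U[i, k] in [0, 1]."""
        rng = np.random.default_rng(seed)
        U = rng.dirichlet(np.ones(n_clusters), size=len(X))       # random soft assignment
        for _ in range(n_iter):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]        # membership-weighted means
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
            U_new = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
            if np.max(np.abs(U_new - U)) < tol:
                return centers, U_new
            U = U_new
        return centers, U

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        true_maps = rng.standard_normal((4, 64))                  # 4 synthetic "template topographies"
        labels = rng.integers(0, 4, 500)
        X = true_maps[labels] + 0.3 * rng.standard_normal((500, 64))  # noisy GFP-peak "maps"
        centers, U = fuzzy_c_means(X)
        print("mean max membership (cluster 'crispness'):", round(float(U.max(axis=1).mean()), 3))

Unlike a hard K-means assignment, the membership matrix U quantifies the labeling uncertainty that the probabilistic analysis above argues should be propagated through the rest of the microstate pipeline.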
NASA Astrophysics Data System (ADS)
Song, Yiliao; Qin, Shanshan; Qu, Jiansheng; Liu, Feng
2015-10-01
The issue of air quality regarding PM pollution levels in China is a focus of public attention. To address that issue, a series of studies is in progress to date, including PM monitoring programs, PM source apportionment, and the enactment of new ambient air quality index standards. However, related research concerning computer modeling for the estimation of future PM trends is rare, despite its significance for forecasting and early warning systems. Therefore, a study of deterministic and interval forecasts of PM is performed. In this study, data on hourly and 12 h-averaged air pollutants are used to forecast PM concentrations within the Yangtze River Delta (YRD) region of China. The characteristics of PM emissions are first examined and analyzed using different distribution functions. To improve the distribution fitting, which is crucial for estimating PM levels, an artificial intelligence algorithm is incorporated to select the optimal parameters. Following that step, an ANF model is used to produce deterministic forecasts of PM. With the identified distributions and deterministic forecasts, PM intervals at different levels are estimated. The results indicate that the lognormal or gamma distributions are highly representative of the recorded PM data, with a goodness-of-fit R^2 of approximately 0.998. Furthermore, the evaluation metrics (MSE, MAPE and CP, AW) also show high accuracy of the deterministic and interval forecasts of PM, indicating that this method enables the informative and effective quantification of future PM trends.
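The distribution-fitting step can be illustrated as follows: fit lognormal and gamma distributions to (synthetic) PM concentration data with scipy and score each fit by an R^2 between the empirical and fitted cumulative frequencies. The data below are random draws, not the YRD measurements, and the intelligent parameter-selection and ANF forecasting stages are not reproduced.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    pm = rng.lognormal(mean=3.8, sigma=0.5, size=2000)    # synthetic hourly PM concentrations (ug/m3)

    def cdf_r2(data, dist, params):
        """R^2 between the empirical CDF and the fitted CDF evaluated at the sorted data."""
        x = np.sort(data)
        ecdf = np.arange(1, len(x) + 1) / (len(x) + 1)
        fcdf = dist.cdf(x, *params)
        ss_res = np.sum((ecdf - fcdf) ** 2)
        ss_tot = np.sum((ecdf - ecdf.mean()) ** 2)
        return 1.0 - ss_res / ss_tot

    for name, dist in [("lognormal", stats.lognorm), ("gamma", stats.gamma)]:
        params = dist.fit(pm, floc=0)                     # fix location at zero for concentrations
        print(f"{name:10s} parameters {np.round(params, 3)}, goodness-of-fit R^2 = "
              f"{cdf_r2(pm, dist, params):.4f}")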
Griffin, James S; Wells, George F
2017-01-01
Seasonal community structure and regionally synchronous population dynamics have been observed in natural microbial ecosystems, but have not been well documented in wastewater treatment bioreactors. Few studies of community dynamics in full-scale activated sludge systems facing similar meteorological conditions have been done to compare the importance of deterministic and neutral community assembly mechanisms. We subjected weekly activated sludge samples from six regional full-scale bioreactors at four wastewater treatment plants obtained over 1 year to Illumina sequencing of 16S ribosomal RNA genes, resulting in a library of over 17 million sequences. All samples derived from reactors treating primarily municipal wastewater. Despite variation in operational characteristics and location, communities displayed temporal synchrony at the individual operational taxonomic unit (OTU), broad phylogenetic affiliation and community-wide scale. Bioreactor communities were dominated by 134 abundant and highly regionally synchronized OTU populations that accounted for over 50% of the total reads. Non-core OTUs displayed abundance-dependent population synchrony. Alpha diversity varied by reactor, but showed a highly reproducible and synchronous seasonal fluctuation. Community similarity was dominated by seasonal changes, but individual reactors maintained minor stable differences after 1 year. Finally, the impacts of mass migration driven by direct biomass transfers between reactors was investigated, but had no significant effect on community similarity or diversity in the sink community. Our results show that population dynamics in activated sludge bioreactors are consistent with niche-driven assembly guided by seasonal temperature fluctuations. PMID:27996980
Projecting the Population-level Effects of Mercury on the Common Loon in the Northeast
NASA Astrophysics Data System (ADS)
Evers, D. C.; Mitro, M. G.; Gleason, T. R.
2001-05-01
The Common Loon (Gavia immer) is a top-level predator in aquatic systems and is at risk to mercury contamination. This risk is of particular concern in the Northeast, the region of North America in which loons have the highest mean body concentration of methylmercury (MeHg). We used matrix population models to project the population-level effects of mercury on loons in four states in the Northeast (New York, Vermont, New Hampshire, and Maine) exhibiting different levels of risk to MeHg. Four categories of risk to MeHg (low, moderate, high, and extra high) were established based on MeHg levels observed in loons and associated effects observed at the individual and population levels in the field (e.g., behavior and reproductive success). We parameterized deterministic matrix population models using survival estimates from a 12-year band-resight data set and productivity estimates from a 25-year data set of nesting loon observations in NH. The juvenile loon survival rate was 0.55 (minimum) and 0.63 (maximum) (ages 1-3), and the adult loon survival rate was 0.95 (ages 4-30). The mean age at first reproduction was 7. The mean fertility was 0.26 fledgelings per individual at low to moderate risk; there were 53% fewer fledged young per individual at high to extra high risk. Productivity was weighted by risk for each state. The portion of the breeding population at high to extra high risk was 10% in NY, 15% in VT, 17% in NH, and 28% in ME. We also constructed a stochastic model in which productivity was randomly selected in each time step from the 25 estimates in the NH data set. Model results indicated a negative population growth rate for some states. There was a decreasing trend in population growth rate as the percentage of the loon population at high to extra high risk increased. The stochastic model showed that the population growth rate varied over a range of about 0.05 from year to year, and this range decreased as the percentage of the loon population at high to extra high risk increased. These results suggest that an increase in risk to mercury that effects a change in reproductive success may have a negative population-level effect on loons.
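The deterministic projection can be sketched with a Leslie matrix built from the vital rates quoted above (juvenile survival 0.55-0.63 for ages 1-3, adult survival 0.95 for ages 4-30, first reproduction at age 7, fertility 0.26 fledged young per individual); the asymptotic growth rate is the dominant eigenvalue of that matrix. This is a simplified reconstruction for illustration only: it omits the risk-weighted productivity, first-year survival handling and the stochastic variant used in the study, so its absolute growth rates will not match the study's projections.

    import numpy as np

    def loon_leslie(juv_survival=0.63, adult_survival=0.95, fertility=0.26,
                    age_first_repro=7, max_age=30):
        """Age-structured (Leslie) projection matrix for age classes 1..max_age."""
        n = max_age
        A = np.zeros((n, n))
        A[0, age_first_repro - 1:] = fertility                # breeders contribute fledged young
        for age in range(1, n):
            A[age, age - 1] = juv_survival if age <= 3 else adult_survival
        return A

    if __name__ == "__main__":
        for juv, frac_fert in [(0.63, 1.00), (0.63, 0.47), (0.55, 0.47)]:
            # frac_fert = 0.47 mimics the reported 53% reduction in fledged young at high MeHg risk
            A = loon_leslie(juv_survival=juv, fertility=0.26 * frac_fert)
            lam = np.max(np.real(np.linalg.eigvals(A)))       # dominant eigenvalue = growth rate
            print(f"juvenile survival {juv}, fertility x{frac_fert}: lambda ~ {lam:.3f}")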
Laws of Large Numbers and Langevin Approximations for Stochastic Neural Field Equations
2013-01-01
In this study, we consider limit theorems for microscopic stochastic models of neural fields. We show that the Wilson–Cowan equation can be obtained as the limit in uniform convergence on compacts in probability for a sequence of microscopic models when the number of neuron populations distributed in space and the number of neurons per population tend to infinity. This result also allows to obtain limits for qualitatively different stochastic convergence concepts, e.g., convergence in the mean. Further, we present a central limit theorem for the martingale part of the microscopic models which, suitably re-scaled, converges to a centred Gaussian process with independent increments. These two results provide the basis for presenting the neural field Langevin equation, a stochastic differential equation taking values in a Hilbert space, which is the infinite-dimensional analogue of the chemical Langevin equation in the present setting. On a technical level, we apply recently developed law of large numbers and central limit theorems for piecewise deterministic processes taking values in Hilbert spaces to a master equation formulation of stochastic neuronal network models. These theorems are valid for processes taking values in Hilbert spaces, and by this are able to incorporate spatial structures of the underlying model. Mathematics Subject Classification (2000): 60F05, 60J25, 60J75, 92C20. PMID:23343328
McVERNON, J.; RAMSAY, M. E.; McLEAN, A. R.
2008-01-01
SUMMARY A rise in invasive Haemophilus influenzae type b (Hib) infections occurred 8 years after vaccine introduction in the United Kingdom. Aspects of Hib vaccine delivery unique to the United Kingdom have been implicated. The authors developed a fully age-structured deterministic susceptible–infected–resistant–susceptible mathematical model, expressed as a set of partial differential equations, to better understand the causes of declining vaccine effectiveness. We also investigated the consequences of the vaccine's impact on reducing Hib transmission for maintenance of immunity. Our findings emphasized the importance of maintaining high post-immunization antibody titres among age groups at greatest risk of invasive infections. In keeping with UK population-based estimates, low direct efficacy of immunological memory against disease was found, cautioning against over-reliance on evidence of priming alone as a correlate of population protection. The contribution of herd immunity to disease control was reinforced. Possible intervention strategies will be explored in subsequent work. PMID:17678559
The Solid Rocket Motor Slag Population: Results of a Radar-Based Regressive Statistical Evaluation
NASA Technical Reports Server (NTRS)
Horstman, Matthew F.; Xu, Yu-Lin
2008-01-01
Solid rocket motor (SRM) slag has been identified as a potential source of man-made orbital debris. The possibility that SRMs (in addition to generating dust particles in the sub-millimeter range) may generate particles up to centimeters in size has caused concern regarding their contribution to the debris environment. Returned surfaces from space do not have sufficient area or exposure time to provide a clear picture of the SRM millimeter and centimeter debris population. Currently, radar observation is probably the only way to collect data showing the debris contribution from SRMs. Such observation is used to sample the debris environment, but it is difficult to obtain accurate orbital elements for the detected debris objects. NASA has developed several models to describe the different orbital debris populations, based on assumed debris production mechanisms to create clouds of debris objects that can be propagated in time. The NASA model, LEGEND (LEO-to-GEO Environment Debris), functions as a time-tested debris model for most debris sources. However, the current LEGEND model does not include contributions from the SRM population. An SRM model has recently been developed by NASA, based on purely theoretical details of SRM production and known SRM launches, but verification with hard data is needed. Because the detections of individual SRM objects cannot be deterministically separated from the total debris observed by radar, the validation of the SRM model can only be done by combining it with the LEGEND breakup model and comparing it with data. By applying observational constraints, the degree of SRM slag contribution to the environment may be estimated. This serves as an observationally sound method from which to calibrate a purely theoretical model into something more realistic. For this study, we use the populations observed by the Haystack radar from 1996 to present. For the SRM debris, we use a historical database of SRM launches, propellant masses, and estimated locations and times of tailoff to produce and propagate the SRM debris clouds. Comparisons with radar data from the ensuing years were made, and the SRM model was altered with respect to size and mass production of slag particles to reflect the populations estimated from the data. The result is a model SRM population that fits within the bounds of the observed environment and estimates of the production and contribution of SRM debris to the environment.
An analytical framework to assist decision makers in the use of forest ecosystem model predictions
USDA-ARS?s Scientific Manuscript database
The predictions of most terrestrial ecosystem models originate from deterministic simulations. Relatively few uncertainty evaluation exercises in model outputs are performed by either model developers or users. This issue has important consequences for decision makers who rely on models to develop n...
A random walk on water (Henry Darcy Medal Lecture)
NASA Astrophysics Data System (ADS)
Koutsoyiannis, D.
2009-04-01
Randomness and uncertainty had been well appreciated in hydrology and water resources engineering in their initial steps as scientific disciplines. However, this changed through the years and, following other geosciences, hydrology adopted a naïve view of randomness in natural processes. Such a view separates natural phenomena into two mutually exclusive types, random or stochastic, and deterministic. When a classification of a specific process into one of these two types fails, then a separation of the process into two different, usually additive, parts is typically devised, each of which may be further subdivided into subparts (e.g., deterministic subparts such as periodic and aperiodic or trends). This dichotomous logic is typically combined with a manichean perception, in which the deterministic part supposedly represents cause-effect relationships and thus is physics and science (the "good"), whereas randomness has little relationship with science and no relationship with understanding (the "evil"). Probability theory and statistics, which traditionally provided the tools for dealing with randomness and uncertainty, have been regarded by some as the "necessary evil" but not as an essential part of hydrology and geophysics. Some took a step further to banish them from hydrology, replacing them with deterministic sensitivity analysis and fuzzy-logic representations. Others attempted to demonstrate that irregular fluctuations observed in natural processes are au fond manifestations of underlying chaotic deterministic dynamics with low dimensionality, thus attempting to render probabilistic descriptions unnecessary. Some of the above recent developments are simply flawed because they make erroneous use of probability and statistics (which, remarkably, provide the tools for such analyses), whereas the entire underlying logic is just a false dichotomy. To see this, it suffices to recall that Pierre Simon Laplace, perhaps the most famous proponent of determinism in the history of philosophy of science (cf. Laplace's demon), is, at the same time, one of the founders of probability theory, which he regarded as "nothing but common sense reduced to calculation". This harmonizes with James Clerk Maxwell's view that "the true logic for this world is the calculus of Probabilities" and was more recently and epigrammatically formulated in the title of Edwin Thompson Jaynes's book "Probability Theory: The Logic of Science" (2003). Abandoning dichotomous logic, either on ontological or epistemic grounds, we can identify randomness or stochasticity with unpredictability. Admitting that (a) uncertainty is an intrinsic property of nature; (b) causality implies dependence of natural processes in time and thus suggests predictability; but, (c) even the tiniest uncertainty (e.g., in initial conditions) may result in unpredictability after a certain time horizon, we may shape a stochastic representation of natural processes that is consistent with Karl Popper's indeterministic world view. In this representation, probability quantifies uncertainty according to the Kolmogorov system, in which probability is a normalized measure, i.e., a function that maps sets (areas where the initial conditions or the parameter values lie) to real numbers (in the interval [0, 1]). 
In such a representation, predictability (suggested by deterministic laws) and unpredictability (randomness) coexist, are not separable or additive components, and it is a matter of specifying the time horizon of prediction to decide which of the two dominates. An elementary numerical example has been devised to illustrate the above ideas and demonstrate that they offer a pragmatic and useful guide for practice, rather than just pertaining to philosophical discussions. A chaotic model, with fully and a priori known deterministic dynamics and deterministic inputs (without any random agent), is assumed to represent the hydrological balance in an area partly covered by vegetation. Experimentation with this toy model demonstrates, inter alia, that: (1) for short time horizons the deterministic dynamics is able to give good predictions; but (2) these predictions become extremely inaccurate and useless for long time horizons; (3) for such horizons a naïve statistical prediction (average of past data) which fully neglects the deterministic dynamics is more skilful; and (4) if this statistical prediction, in addition to past data, is combined with probability theory (the principle of maximum entropy, in particular), it can provide a more informative prediction. Also, the toy model shows that the trajectories of the system state (and derivative properties thereof) resemble neither a regular (e.g., periodic) deterministic process nor a purely random process, but exhibit patterns indicating anti-persistence and persistence (where the latter statistically complies with a Hurst-Kolmogorov behaviour). If the process is averaged over long time scales, the anti-persistent behaviour improves predictability, whereas the persistent behaviour substantially deteriorates it. A stochastic representation of this deterministic system, which incorporates dynamics, is not only possible, but also powerful as it provides good predictions for both short and long horizons and helps to decide when the deterministic dynamics should be considered or neglected. Obviously, a natural system is far more complex than this simple toy model and hence unpredictability is naturally even more prominent in the former. In addition, in a complex natural system, we can never know the exact dynamics and we must infer it from past data, which implies additional uncertainty and an additional role of stochastics in the process of formulating the system equations and estimating the involved parameters. Data also offer the only solid grounds to test any hypothesis about the dynamics, and failure to perform such testing against evidence from data renders the hypothesised dynamics worthless. If this perception of natural phenomena is adequately plausible, then it may help in studying interesting fundamental questions regarding the current state and the trends of hydrological and water resources research and their promising future paths. For instance: (i) Will it ever be possible to achieve a fully "physically based" modelling of hydrological systems that will not depend on data or stochastic representations? (ii) To what extent can hydrological uncertainty be reduced and what are the effective means for such reduction? (iii) Are current stochastic methods in hydrology consistent with observed natural behaviours? What paths should we explore for their advancement? (iv) Can deterministic methods provide solid scientific grounds for water resources engineering and management?
In particular, can there be risk-free hydraulic engineering and water management? (v) Is the current (particularly important) interface between hydrology and climate satisfactory? In particular, should hydrology rely on climate models that are not properly validated (i.e., for periods and scales not used in calibration)? In effect, is the evolution of climate and its impacts on water resources deterministically predictable?
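The lecture's toy model is not specified in the abstract; as a stand-in under that caveat, the sketch below uses a logistic map with fully known deterministic dynamics to reproduce the argument numerically: a tiny initial-condition error leaves short-horizon deterministic predictions accurate, while at long horizons the naive statistical prediction (the long-run mean of past data) is more skilful.

```python
# Stand-in toy model (assumed, not from the lecture): the chaotic logistic map.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x = 0.2
past = []
for _ in range(1000):            # spin up and record past "observations"
    x = logistic(x)
    past.append(x)
climatology = sum(past) / len(past)

for horizon in (5, 50):
    truth, model = x, x + 1e-9   # the model starts with a tiny initial-condition error
    for _ in range(horizon):
        truth, model = logistic(truth), logistic(model)
    print(f"horizon {horizon:3d}: |deterministic error| = {abs(truth - model):.2e}, "
          f"|climatology error| = {abs(truth - climatology):.2e}")
```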
NASA Astrophysics Data System (ADS)
Wong, B.; Kilthau, W.; Knopf, D. A.
2017-12-01
Immersion freezing is recognized as the most important ice crystal formation process in mixed-phase cloud environments. It is well established that mineral dust species can act as efficient ice nucleating particles. Previous research has focused on determination of the ice nucleation propensity of individual mineral dust species. In this study, the focus is placed on how different mineral dust species, such as illite, kaolinite and feldspar, initiate freezing of water droplets when present in internal and external mixtures. The frozen fraction data for single and multicomponent mineral dust droplet mixtures are recorded under identical cooling rates. Additionally, the time dependence of freezing is explored. Externally and internally mixed mineral dust droplet samples are exposed to constant temperatures (isothermal freezing experiments) and frozen fraction data are recorded at time intervals. Analyses of single and multicomponent mineral dust droplet samples include different stochastic and deterministic models such as the derivation of the heterogeneous ice nucleation rate coefficient (Jhet), the single contact angle (α) description, the α-PDF model, active sites representation, and the deterministic model. Parameter sets derived from freezing data of single-component mineral dust samples are evaluated for prediction of cooling-rate-dependent and isothermal freezing of multicomponent externally or internally mixed mineral dust samples. The atmospheric implications of our findings are discussed.
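To make the contrast between the two limiting descriptions concrete, the sketch below compares a time-dependent (stochastic) frozen fraction driven by a heterogeneous nucleation rate coefficient with a time-independent (deterministic, "singular") frozen fraction driven by an ice-active site density. The functional forms are the commonly used exponential expressions; the parameter values are illustrative assumptions, not values from this study.

```python
import numpy as np

A = 1e-5                     # particle surface area per droplet, cm^2 (assumed)

def frozen_fraction_stochastic(J_het, t):
    """Time-dependent: f(t) = 1 - exp(-J_het * A * t); J_het in cm^-2 s^-1, t in s."""
    return 1.0 - np.exp(-J_het * A * t)

def frozen_fraction_deterministic(n_s):
    """Time-independent ("singular"): f = 1 - exp(-n_s * A); n_s in cm^-2."""
    return 1.0 - np.exp(-n_s * A)

t = np.array([60.0, 600.0, 3600.0])          # isothermal hold times, s
print("stochastic   :", frozen_fraction_stochastic(J_het=1e3, t=t))
print("deterministic:", frozen_fraction_deterministic(n_s=1e5) * np.ones_like(t))
```

In an isothermal experiment the first description predicts the frozen fraction keeps rising with hold time, whereas the second predicts it stays fixed, which is the distinction such experiments are designed to probe.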
Godt, J.W.; Baum, R.L.; Savage, W.Z.; Salciarini, D.; Schulz, W.H.; Harp, E.L.
2008-01-01
Application of transient deterministic shallow landslide models over broad regions for hazard and susceptibility assessments requires information on rainfall, topography and the distribution and properties of hillside materials. We survey techniques for generating the spatial and temporal input data for such models and present an example using a transient deterministic model that combines an analytic solution to assess the pore-pressure response to rainfall infiltration with an infinite-slope stability calculation. Pore pressures and factors of safety are computed on a cell-by-cell basis and can be displayed or manipulated in a grid-based GIS. Input data are high-resolution (1.8 m) topographic information derived from LiDAR data and simple descriptions of initial pore-pressure distribution and boundary conditions for a study area north of Seattle, Washington. Rainfall information is taken from a previously defined empirical rainfall intensity-duration threshold and material strength and hydraulic properties were measured both in the field and laboratory. Results are tested by comparison with a shallow landslide inventory. Comparison of results with those from static infinite-slope stability analyses assuming fixed water-table heights shows that the spatial prediction of shallow landslide susceptibility is improved using the transient analyses; moreover, results can be depicted in terms of the rainfall intensity and duration known to trigger shallow landslides in the study area.
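The per-cell calculation reduces to an infinite-slope factor of safety that falls as infiltration raises the pressure head at the failure depth. The sketch below uses the commonly cited Iverson-type form of that expression; the parameter values are illustrative and are not those of the Seattle-area study.

```python
import math

def factor_of_safety(slope_deg, depth_m, psi_m, cohesion_pa=4000.0,
                     phi_deg=33.0, gamma_s=19000.0, gamma_w=9810.0):
    """FS = tan(phi)/tan(beta) + (c - psi*gamma_w*tan(phi)) / (gamma_s*z*sin(beta)*cos(beta)).

    psi_m is the pressure head (m) at depth z; it rises during infiltration,
    lowering FS. FS < 1 indicates predicted instability for that cell.
    """
    beta, phi = math.radians(slope_deg), math.radians(phi_deg)
    frictional = math.tan(phi) / math.tan(beta)
    cohesive = (cohesion_pa - psi_m * gamma_w * math.tan(phi)) / (
        gamma_s * depth_m * math.sin(beta) * math.cos(beta))
    return frictional + cohesive

for psi in (0.0, 0.5, 1.0):       # increasing pressure head during a storm
    print(f"pressure head {psi:.1f} m -> FS = {factor_of_safety(40.0, 1.5, psi):.2f}")
```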
Deterministic Squeezed States with Collective Measurements and Feedback.
Cox, Kevin C; Greve, Graham P; Weiner, Joshua M; Thompson, James K
2016-03-04
We demonstrate the creation of entangled, spin-squeezed states using a collective, or joint, measurement and real-time feedback. The pseudospin state of an ensemble of N=5×10^{4} laser-cooled ^{87}Rb atoms is deterministically driven to a specified population state with angular resolution that is a factor of 5.5(8) [7.4(6) dB] in variance below the standard quantum limit for unentangled atoms, comparable to the best enhancements using only unitary evolution. Without feedback, conditioning on the outcome of the joint premeasurement, we directly observe up to 59(8) times [17.7(6) dB] improvement in quantum phase variance relative to the standard quantum limit for N=4×10^{5} atoms. This is one of the largest reported entanglement enhancements to date in any system.
NASA Astrophysics Data System (ADS)
Lin, Tsungpo
Performance engineers face a major challenge in modeling and simulating after-market power systems because of system degradation and measurement errors. Currently, most of the power generation industry utilizes deterministic data matching to calibrate the model and cascade system degradation, which causes significant calibration uncertainty and thus risk in providing performance guarantees. In this research work, a maximum-likelihood based simultaneous data reconciliation and model calibration (SDRMC) approach is used for power system modeling and simulation. By replacing the current deterministic data matching with SDRMC, one can reduce the calibration uncertainty and mitigate the error propagation to the performance simulation. A modeling and simulation environment for a complex power system with certain degradation has been developed. In this environment multiple data sets are imported when carrying out simultaneous data reconciliation and model calibration. Calibration uncertainties are estimated through error analyses and propagated to the performance simulation by using the principle of error propagation. System degradation is then quantified by performance comparison between the calibrated model and its expected new & clean status. To mitigate smearing effects caused by gross errors, gross error detection (GED) is carried out in two stages. The first stage is a screening stage, in which serious gross errors are eliminated in advance. The GED techniques used in the screening stage are based on multivariate data analysis (MDA), including multivariate data visualization and principal component analysis (PCA). Subtle gross errors are treated at the second stage, in which serial bias compensation or a robust M-estimator is engaged. To achieve a better efficiency in the combined scheme of the least squares based data reconciliation and the GED technique based on hypothesis testing, the Levenberg-Marquardt (LM) algorithm is utilized as the optimizer. To reduce the computation time and stabilize the problem solving for a complex power system such as a combined cycle power plant, meta-modeling using the response surface equation (RSE) and system/process decomposition are incorporated with the simultaneous scheme of SDRMC. The goal of this research work is to reduce the calibration uncertainties and, thus, the risks of providing performance guarantees arising from uncertainties in performance simulation.
NASA Astrophysics Data System (ADS)
Zarindast, Atousa; Seyed Hosseini, Seyed Mohamad; Pishvaee, Mir Saman
2017-06-01
A robust supplier selection problem is proposed in a scenario-based approach, where demand and exchange rates are subject to uncertainty. First, a deterministic multi-objective mixed integer linear programming model is developed; then, the robust counterpart of the proposed mixed integer linear program is presented using recent extensions in robust optimization theory. Decision variables are examined, respectively, with a two-stage stochastic planning model, a robust stochastic optimization planning model that integrates the worst-case scenario into the modeling approach, and an equivalent deterministic planning model. An experimental study is carried out to compare the performance of the three models. The robust model resulted in remarkable cost savings and illustrated that, to cope with such uncertainties, they should be considered in advance during planning. In our case study, different suppliers were selected because of these uncertainties; since supplier selection is a strategic decision, it is crucial to account for them in the planning approach.
Robust Planning for Effects-Based Operations
2006-06-01
[Table-of-contents excerpt: Robust Optimization Literature; Protecting Against…; Deterministic EBO Model Formulation; Deterministic EBO Model Example and Performance; Greedy Algorithm; Conclusions on Robust EBO Model Performance; Greedy Algorithm versus EBO Models.]
Application of Wavelet Filters in an Evaluation of Photochemical Model Performance
Air quality model evaluation can be enhanced with time-scale specific comparisons of outputs and observations. For example, high-frequency (hours to one day) time scale information in observed ozone is not well captured by deterministic models and its incorporation into model pe...
Barthel, Erik R; Pierce, James R; Goodhue, Catherine J; Ford, Henri R; Grikscheit, Tracy C; Upperman, Jeffrey S
2011-10-12
The concept of disaster surge has arisen in recent years to describe the phenomenon of severely increased demands on healthcare systems resulting from catastrophic mass casualty events (MCEs) such as natural disasters and terrorist attacks. The major challenge in dealing with a disaster surge is the efficient triage and utilization of the healthcare resources appropriate to the magnitude and character of the affected population in terms of its demographics and the types of injuries that have been sustained. In this paper a deterministic population kinetics model is used to predict the effect of the availability of a pediatric trauma center (PTC) upon the response to an arbitrary disaster surge as a function of the rates of pediatric patients' admission to adult and pediatric centers and the corresponding discharge rates of these centers. We find that adding a hypothetical pediatric trauma center to the response documented in an historical example (the Israeli Defense Forces field hospital that responded to the Haiti earthquake of 2010) would have allowed for a significant increase in the overall rate of admission of the pediatric surge cohort. This would have reduced the time to treatment in this example by approximately half. The time needed to completely treat all children affected by the disaster would have decreased by slightly more than a third, with the caveat that the PTC would have to have been approximately as fast as the adult center in discharging its patients. Lastly, if disaster death rates from other events reported in the literature are included in the model, availability of a PTC would result in a relative mortality risk reduction of 37%. Our model provides a mathematical justification for aggressive inclusion of PTCs in planning for disasters by public health agencies.
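A minimal sketch of the idea, assuming (as a simplification not taken from the paper) first-order kinetics in which a waiting cohort is admitted to an adult trauma center and, optionally, a pediatric trauma center, each discharging treated patients at its own rate. The rate constants and cohort size are invented for illustration; only the qualitative effect of adding a PTC is the point.

```python
# Highly simplified "population kinetics" sketch of a pediatric disaster surge.
def time_to_treat(frac_treated=0.95, k_adult=0.05, k_ptc=0.0,
                  d_adult=0.10, d_ptc=0.10, w0=100.0, dt=0.01):
    W, A, P, treated, t = w0, 0.0, 0.0, 0.0, 0.0   # waiting, adult center, PTC, treated
    while treated < frac_treated * w0:
        adm_a, adm_p = k_adult * W, k_ptc * W      # admissions per unit time
        dis_a, dis_p = d_adult * A, d_ptc * P      # discharges per unit time
        W += dt * (-adm_a - adm_p)
        A += dt * (adm_a - dis_a)
        P += dt * (adm_p - dis_p)
        treated += dt * (dis_a + dis_p)
        t += dt
    return t

print("adult center only :", round(time_to_treat(k_ptc=0.0), 1), "time units")
print("adult center + PTC:", round(time_to_treat(k_ptc=0.05), 1), "time units")
```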
FACTORS INFLUENCING TOTAL DIETARY EXPOSURES OF YOUNG CHILDREN
A deterministic model was developed to identify the critical input parameters needed to assess dietary intakes of young children. The model was used as a framework for understanding the important factors in data collection and data analysis. Factors incorporated into the model i...
Risk of DDT residue in maize consumed by infants as complementary diet in southwest Ethiopia.
Mekonen, Seblework; Lachat, Carl; Ambelu, Argaw; Steurbaut, Walter; Kolsteren, Patrick; Jacxsens, Liesbeth; Wondafrash, Mekitie; Houbraken, Michael; Spanoghe, Pieter
2015-04-01
Infants in Ethiopia are consuming food items such as maize as a complementary diet. However, this may expose infants to toxic contaminants like DDT. Maize samples were collected from the households visited during a consumption survey and from markets in Jimma zone, southwestern Ethiopia. The residues of total DDT and its metabolites were analyzed using the Quick, Easy, Cheap, Effective, Rugged and Safe (QuEChERS) method combined with dispersive solid phase extraction cleanup (d-SPE). Deterministic and probabilistic methods of analysis were applied to determine the consumer exposure of infants to total DDT. The results from the exposure assessment were compared with the health-based guidance value, in this case the provisional tolerable daily intake (PTDI). All maize samples (n=127) were contaminated by DDT, with a mean concentration of 1.770 mg/kg, which was far above the maximum residue limit (MRL). The mean and 97.5th percentile (P97.5) estimated daily intakes of total DDT for consumers were, respectively, 0.011 and 0.309 mg/kg bw/day for the deterministic and 0.011 and 0.083 mg/kg bw/day for the probabilistic exposure assessment. For the total infant population (consumers and non-consumers), the 97.5th percentile estimated daily intakes were 0.265 and 0.032 mg/kg bw/day from the deterministic and probabilistic exposure assessments, respectively. Health risk estimation revealed that the mean and 97.5th percentile estimated daily intakes for consumers, and the 97.5th percentile estimated daily intake of total DDT for the total population, were above the PTDI. Therefore, in Ethiopia, the use of maize as complementary food for infants may pose a health risk due to DDT residue. Copyright © 2014 Elsevier B.V. All rights reserved.
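The two exposure approaches differ only in how the inputs of the same estimated-daily-intake (EDI) formula, EDI = residue concentration x intake / body weight, are treated: point values versus sampled distributions. The sketch below uses the reported mean concentration but assumed (illustrative) intake, body-weight and variability figures, not the survey data.

```python
import numpy as np

rng = np.random.default_rng(0)
conc_mean = 1.770            # mg/kg, mean total DDT reported in the abstract
intake_point = 0.05          # kg maize per day (assumed point value)
bw_point = 8.0               # kg body weight (assumed point value)

# Deterministic: single point estimate.
edi_det = conc_mean * intake_point / bw_point
print(f"deterministic EDI: {edi_det:.4f} mg/kg bw/day")

# Probabilistic: Monte Carlo over assumed input distributions.
n = 100_000
conc = rng.lognormal(mean=np.log(conc_mean), sigma=0.8, size=n)   # assumed spread
intake = rng.normal(0.05, 0.015, size=n).clip(min=0.0)
bw = rng.normal(8.0, 1.5, size=n).clip(min=3.0)
edi = conc * intake / bw
print(f"probabilistic EDI: mean={edi.mean():.4f}, "
      f"P97.5={np.percentile(edi, 97.5):.4f} mg/kg bw/day")
```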
Huttunen, K-L; Mykrä, H; Oksanen, J; Astorga, A; Paavola, R; Muotka, T
2017-05-03
One of the key challenges to understanding patterns of β diversity is to disentangle deterministic patterns from stochastic ones. Stochastic processes may mask the influence of deterministic factors on community dynamics, hindering identification of the mechanisms causing variation in community composition. We studied temporal β diversity (among-year dissimilarity) of macroinvertebrate communities in near-pristine boreal streams across 14 years. To assess whether the observed β diversity deviates from that expected by chance, and to identify processes (deterministic vs. stochastic) through which different explanatory factors affect community variability, we used a null model approach. We observed that at the majority of sites temporal β diversity was low indicating high community stability. When stochastic variation was unaccounted for, connectivity was the only variable explaining temporal β diversity, with weakly connected sites exhibiting higher community variability through time. After accounting for stochastic effects, connectivity lost importance, suggesting that it was related to temporal β diversity via random colonization processes. Instead, β diversity was best explained by in-stream vegetation, community variability decreasing with increasing bryophyte cover. These results highlight the potential of stochastic factors to dampen the influence of deterministic processes, affecting our ability to understand and predict changes in biological communities through time.
Deterministic generation of remote entanglement with active quantum feedback
Martin, Leigh; Motzoi, Felix; Li, Hanhan; ...
2015-12-10
We develop and study protocols for deterministic remote entanglement generation using quantum feedback, without relying on an entangling Hamiltonian. In order to formulate the most effective experimentally feasible protocol, we introduce the notion of average-sense locally optimal feedback protocols, which do not require real-time quantum state estimation, a difficult component of real-time quantum feedback control. We use this notion of optimality to construct two protocols that can deterministically create maximal entanglement: a semiclassical feedback protocol for low-efficiency measurements and a quantum feedback protocol for high-efficiency measurements. The latter reduces to direct feedback in the continuous-time limit, whose dynamics can be modeled by a Wiseman-Milburn feedback master equation, which yields an analytic solution in the limit of unit measurement efficiency. Our formalism can smoothly interpolate between continuous-time and discrete-time descriptions of feedback dynamics and we exploit this feature to derive a superior hybrid protocol for arbitrary nonunit measurement efficiency that switches between quantum and semiclassical protocols. Lastly, we show using simulations incorporating experimental imperfections that deterministic entanglement of remote superconducting qubits may be achieved with current technology using the continuous-time feedback protocol alone.
Xia, J.; Franseen, E.K.; Miller, R.D.; Weis, T.V.
2004-01-01
We successfully applied deterministic deconvolution to real ground-penetrating radar (GPR) data by using the source wavelet that was generated in and transmitted through air as the operator. The GPR data were collected with 400-MHz antennas on a bench adjacent to a cleanly exposed quarry face. The quarry site is characterized by horizontally bedded carbonate strata with shale partings. In order to provide groundtruth for this deconvolution approach, 23 conductive rods were drilled into the quarry face at key locations. The steel rods provided critical information for: (1) correlation between reflections on GPR data and geologic features exposed in the quarry face, (2) GPR resolution limits, (3) accuracy of velocities calculated from common midpoint data and (4) identifying any multiples. Comparing the results of deconvolved data with non-deconvolved data demonstrates the effectiveness of deterministic deconvolution in low dielectric-loss media for increased accuracy of velocity models (improved at least 10-15% in our study after deterministic deconvolution), increased vertical and horizontal resolution of specific geologic features and more accurate representation of geologic features as confirmed from detailed study of the adjacent quarry wall. © 2004 Elsevier B.V. All rights reserved.
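One common way to implement deterministic deconvolution with a known source wavelet is spectral division stabilized by a small "water level"; the study's exact operator design may differ, and the wavelet and trace below are synthetic stand-ins rather than the air-wave wavelet used in the paper.

```python
import numpy as np

def deterministic_deconvolve(trace, wavelet, water_level=1e-2):
    """Frequency-domain deconvolution with a known wavelet and water-level damping."""
    n = len(trace)
    W = np.fft.rfft(wavelet, n)
    R = np.fft.rfft(trace, n)
    denom = np.abs(W) ** 2 + water_level * np.max(np.abs(W) ** 2)
    return np.fft.irfft(R * np.conj(W) / denom, n)

# Synthetic example: reflectivity spikes convolved with a Ricker-like wavelet.
dt = 0.1e-9                                   # 0.1 ns sample interval (illustrative)
t = np.arange(256) * dt
f0 = 400e6                                    # 400 MHz centre frequency
arg = (np.pi * f0 * (t - 20 * dt)) ** 2
wavelet = (1 - 2 * arg) * np.exp(-arg)        # Ricker wavelet
reflectivity = np.zeros(256)
reflectivity[[60, 90, 150]] = [1.0, -0.6, 0.4]
trace = np.convolve(reflectivity, wavelet)[:256]

recovered = deterministic_deconvolve(trace, wavelet)
print("largest recovered spikes at samples:", np.argsort(np.abs(recovered))[-3:])
```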
NASA Astrophysics Data System (ADS)
Khaki, M.; Hoteit, I.; Kuhn, M.; Awange, J.; Forootan, E.; van Dijk, A. I. J. M.; Schumacher, M.; Pattiaratchi, C.
2017-09-01
The time-variable terrestrial water storage (TWS) products from the Gravity Recovery And Climate Experiment (GRACE) have been increasingly used in recent years to improve the simulation of hydrological models by applying data assimilation techniques. In this study, for the first time, we assess the performance of the most popular sequential data assimilation techniques for integrating GRACE TWS into the World-Wide Water Resources Assessment (W3RA) model. We implement and test stochastic and deterministic ensemble-based Kalman filters (EnKF), as well as Particle filters (PF) using two different resampling approaches, Multinomial Resampling and Systematic Resampling. These choices provide various opportunities for weighting observations and model simulations during the assimilation and also for accounting for error distributions. In particular, the deterministic EnKF is tested to avoid perturbing observations before assimilation (as is done in an ordinary EnKF). Gaussian-based random updates in the EnKF approaches likely do not fully represent the statistical properties of the model simulations and TWS observations. Therefore, the fully non-Gaussian PF is also applied to estimate more realistic updates. Monthly GRACE TWS are assimilated into W3RA covering all of Australia. To evaluate the filters' performances and analyze their impact on model simulations, their estimates are validated against independent in-situ measurements. Our results indicate that all implemented filters improve the estimation of water storage simulations of W3RA. The best results are obtained using two versions of the deterministic EnKF, i.e. the Square Root Analysis (SQRA) scheme and the Ensemble Square Root Filter (EnSRF), which reduce the model's groundwater estimation errors by 34% and 31%, respectively, compared to a model run without assimilation. Applying the PF along with Systematic Resampling successfully decreases the model estimation error by 23%.
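For orientation, the fragment below sketches the stochastic (perturbed-observation) EnKF update mentioned in the abstract on a toy scalar storage state; the real assimilation operates on the full gridded W3RA state, and all numbers here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n_ens = 30
ensemble = rng.normal(loc=250.0, scale=20.0, size=n_ens)   # prior TWS members, mm
obs, obs_err = 280.0, 15.0                                  # GRACE TWS obs and error, mm

prior_var = np.var(ensemble, ddof=1)
gain = prior_var / (prior_var + obs_err ** 2)               # Kalman gain (obs operator = identity)
perturbed_obs = obs + rng.normal(0.0, obs_err, size=n_ens)  # stochastic EnKF perturbs the obs
analysis = ensemble + gain * (perturbed_obs - ensemble)

print(f"prior mean {ensemble.mean():.1f} mm -> analysis mean {analysis.mean():.1f} mm")
print(f"prior spread {ensemble.std(ddof=1):.1f} mm -> analysis spread {analysis.std(ddof=1):.1f} mm")
```

The deterministic EnKF variants (SQRA, EnSRF) achieve the same mean update while adjusting the ensemble spread algebraically instead of perturbing the observations.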
Relation Between the Cell Volume and the Cell Cycle Dynamics in Mammalian cell
NASA Astrophysics Data System (ADS)
Magno, A. C. G.; Oliveira, I. L.; Hauck, J. V. S.
2016-08-01
The main goal of this work is to add and analyze an equation that represents the volume in a dynamical model of the mammalian cell cycle proposed by Gérard and Goldbeter (2011) [1]. Cell division occurs when the cyclin B/Cdk1 complex is totally degraded (Tyson and Novak, 2011) [2] and reaches a minimum value. At this point, the cell is divided into two newborn daughter cells, each containing half of the cytoplasmic content of the mother cell. The equations of our base model are only valid if the cell volume, where the reactions occur, is constant. If the cell volume is not constant, that is, if the rate of change of its volume with respect to time is explicitly taken into account in the mathematical model, then the equations of the original model are no longer valid. Therefore, all equations were modified from the mass conservation principle to consider a volume that changes with time. Through this approach, the cell volume affects all model variables. Two different dynamic simulation methods were employed: deterministic and stochastic. In the stochastic simulation, the volume affects all model parameters that have molar units, whereas in the deterministic one, it is incorporated into the differential equations. In deterministic simulation, the biochemical species may be in concentration units, while in stochastic simulation such species must be converted to numbers of molecules, which are directly proportional to the cell volume. In an effort to understand the influence of the new equation, a stability analysis was performed. This elucidates how the growth factor impacts the stability of the model's limit cycles. In conclusion, a more precise model, in comparison to the base model, was created for the cell cycle, as it now takes into consideration the cell volume variation.
A Population Genetics Model of Marker-Assisted Selection
Luo, Z. W.; Thompson, R.; Woolliams, J. A.
1997-01-01
A deterministic two-loci model was developed to predict genetic response to marker-assisted selection (MAS) in one generation and in multiple generations. Formulas were derived to relate linkage disequilibrium in a population to the proportion of additive genetic variance used by MAS, and in turn to an extra improvement in genetic response over phenotypic selection. Predictions of the response were compared to those predicted by using an infinite-loci model and the factors affecting efficiency of MAS were examined. Theoretical analyses of the present study revealed the nonlinearity between the selection intensity and genetic response in MAS. In addition to the heritability of the trait and the proportion of the marker-associated genetic variance, the frequencies of the selectively favorable alleles at the two loci, one marker and one quantitative trait locus, were found to play an important role in determining both the short- and long-term efficiencies of MAS. The evolution of linkage disequilibrium and thus the genetic response over several generations were predicted theoretically and examined by simulation. MAS dissipated the disequilibrium more quickly than drift alone. In some cases studied, the rate of dissipation was as large as that to be expected in the circumstance where the true recombination fraction was increased by three times and selection was absent. PMID:9215918
Cassels, Susan; Pearson, Cynthia R; Kurth, Ann E; Martin, Diane P; Simoni, Jane M; Matediana, Eduardo; Gloyd, Stephen
2009-07-01
Mathematical models are increasingly used in social and behavioral studies of HIV transmission; however, model structures must be chosen carefully to best answer the question at hand and conclusions must be interpreted cautiously. In Pearson et al. (2007), we presented a simple analytically tractable deterministic model to estimate the number of secondary HIV infections stemming from a population of HIV-positive Mozambicans and to evaluate how the estimate would change under different treatment and behavioral scenarios. In a subsequent application of the model with a different data set, we observed that the model produced an unduly conservative estimate of the number of new HIV-1 infections. In this brief report, our first aim is to describe a revision of the model to correct for this underestimation. Specifically, we recommend adjusting the population-level sexually transmitted infection (STI) parameters to be applicable to the individual-level model specification by accounting for the proportion of individuals uninfected with an STI. In applying the revised model to the original data, we noted an estimated 40 infections/1000 HIV-positive persons per year (versus the original 23 infections/1000 HIV-positive persons per year). In addition, the revised model estimated that highly active antiretroviral therapy (HAART) along with syphilis and herpes simplex virus type 2 (HSV-2) treatments combined could reduce HIV-1 transmission by 72% (versus 86% according to the original model). The second aim of this report is to discuss the advantages and disadvantages of mathematical models in the field and the implications of model interpretation. We caution that simple models should be used for heuristic purposes only. Since these models do not account for heterogeneity in the population and significantly simplify HIV transmission dynamics, they should be used to describe general characteristics of the epidemic and demonstrate the importance or sensitivity of parameters in the model.
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
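A generic sketch of the bootstrap-based probabilistic sensitivity analysis the paper advocates: on each Monte Carlo iteration the model inputs are re-estimated from resampled patient-level data and the decision-analytic output is recomputed, yielding an empirical distribution of results. The stand-in data, toy model and cost figures are invented for illustration; they are not from the H. pylori analysis.

```python
import numpy as np

rng = np.random.default_rng(3)
cure_obs = rng.binomial(1, 0.85, size=200)              # stand-in patient-level cure outcomes
cost_obs = rng.gamma(shape=4.0, scale=50.0, size=200)   # stand-in per-patient costs

def model_output(cure_rate, mean_cost):
    """Toy decision model: cost per patient cured."""
    return mean_cost / cure_rate

results = []
for _ in range(5000):
    idx = rng.integers(0, 200, size=200)                # bootstrap resample (with replacement)
    results.append(model_output(cure_obs[idx].mean(), cost_obs[idx].mean()))

results = np.array(results)
print(f"cost per cure: median {np.median(results):.1f}, "
      f"95% interval ({np.percentile(results, 2.5):.1f}, {np.percentile(results, 97.5):.1f})")
```

Because the inputs are drawn by resampling the observed data rather than from assumed theoretical distributions, the resulting interval reflects the empirical correlation and skewness of the data, which is the advantage the authors highlight.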
Romps, David M.
2016-03-01
Convective entrainment is a process that is poorly represented in existing convective parameterizations. By many estimates, convective entrainment is the leading source of error in global climate models. As a potential remedy, an Eulerian implementation of the Stochastic Parcel Model (SPM) is presented here as a convective parameterization that treats entrainment in a physically realistic and computationally efficient way. Drawing on evidence that convecting clouds comprise air parcels subject to Poisson-process entrainment events, the SPM calculates the deterministic limit of an infinite number of such parcels. For computational efficiency, the SPM groups parcels at each height by their purity, which is a measure of their total entrainment up to that height. This reduces the calculation of convective fluxes to a sequence of matrix multiplications. The SPM is implemented in a single-column model and compared with a large-eddy simulation of deep convection.
Spatial scaling patterns and functional redundancies in a changing boreal lake landscape
Angeler, David G.; Allen, Craig R.; Uden, Daniel R.; Johnson, Richard K.
2015-01-01
Global transformations extend beyond local habitats; therefore, larger-scale approaches are needed to assess community-level responses and resilience to unfolding environmental changes. Using long-term data (1996–2011), we evaluated spatial patterns and functional redundancies in the littoral invertebrate communities of 85 Swedish lakes, with the objective of assessing their potential resilience to environmental change at regional scales (that is, spatial resilience). Multivariate spatial modeling was used to differentiate groups of invertebrate species exhibiting spatial patterns in composition and abundance (that is, deterministic species) from those lacking spatial patterns (that is, stochastic species). We then determined the functional feeding attributes of the deterministic and stochastic invertebrate species, to infer resilience. Between one and three distinct spatial patterns in invertebrate composition and abundance were identified in approximately one-third of the species; the remainder were stochastic. We observed substantial differences in metrics between deterministic and stochastic species. Functional richness and diversity decreased over time in the deterministic group, suggesting a loss of resilience in regional invertebrate communities. However, taxon richness and redundancy increased monotonically in the stochastic group, indicating the capacity of regional invertebrate communities to adapt to change. Our results suggest that a refined picture of spatial resilience emerges if patterns of both the deterministic and stochastic species are accounted for. Spatially extensive monitoring may help increase our mechanistic understanding of community-level responses and resilience to regional environmental change, insights that are critical for developing management and conservation agendas in this current period of rapid environmental transformation.
Production scheduling and rescheduling with genetic algorithms.
Bierwirth, C; Mattfeld, D C
1999-01-01
A general model for job shop scheduling is described which applies to static, dynamic and non-deterministic production environments. Next, a Genetic Algorithm is presented which solves the job shop scheduling problem. This algorithm is tested in a dynamic environment under different workload situations. Thereby, a highly efficient decoding procedure is proposed which strongly improves the quality of schedules. Finally, this technique is tested for scheduling and rescheduling in a non-deterministic environment. It is shown by experiment that conventional methods of production control are clearly outperformed at reasonable run-time costs.
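For readers unfamiliar with the setup, the sketch below shows a genetic algorithm on a tiny job-shop instance using the common operation-based (repeated job index) encoding and a semi-active decoder. The instance, GA settings and operators are illustrative assumptions; the paper's particular decoding procedure and its dynamic, non-deterministic extensions are not reproduced here.

```python
import random

random.seed(0)

# jobs[j] = list of (machine, processing_time) in technological order
jobs = [[(0, 3), (1, 2), (2, 2)],
        [(0, 2), (2, 1), (1, 4)],
        [(1, 4), (2, 3), (0, 1)]]
n_jobs, n_machines = len(jobs), 3

def makespan(chromosome):
    """Decode a repeated-job-index chromosome into a semi-active schedule."""
    next_op = [0] * n_jobs
    job_ready = [0] * n_jobs
    mach_ready = [0] * n_machines
    for j in chromosome:
        machine, proc = jobs[j][next_op[j]]
        start = max(job_ready[j], mach_ready[machine])
        job_ready[j] = mach_ready[machine] = start + proc
        next_op[j] += 1
    return max(job_ready)

def random_chromosome():
    genes = [j for j in range(n_jobs) for _ in jobs[j]]
    random.shuffle(genes)
    return genes

def crossover(a, b):
    """Keep a's genes up to a cut, fill the remaining multiset in b's order."""
    cut = random.randrange(1, len(a))
    child, needed = a[:cut], list(a[cut:])
    for g in b:
        if g in needed:
            child.append(g)
            needed.remove(g)
    return child

def mutate(chrom, rate=0.2):
    """Swap two positions with small probability (keeps the gene multiset valid)."""
    if random.random() < rate:
        i, j = random.randrange(len(chrom)), random.randrange(len(chrom))
        chrom[i], chrom[j] = chrom[j], chrom[i]
    return chrom

population = [random_chromosome() for _ in range(30)]
for _ in range(100):
    population.sort(key=makespan)
    parents = population[:10]
    population = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                            for _ in range(20)]
print("best makespan found:", makespan(min(population, key=makespan)))
```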
Origin and Diversification Dynamics of Self-Incompatibility Haplotypes
Gervais, Camille E.; Castric, Vincent; Ressayre, Adrienne; Billiard, Sylvain
2011-01-01
Self-incompatibility (SI) is a genetic system found in some hermaphrodite plants. Recognition of pollen by pistils expressing cognate specificities at two linked genes leads to rejection of self pollen and pollen from close relatives, i.e., to avoidance of self-fertilization and inbred matings, and thus increased outcrossing. These genes generally have many alleles, yet the conditions allowing the evolution of new alleles remain mysterious. Evolutionary changes are clearly necessary in both genes, since any mutation affecting only one of them would result in a nonfunctional self-compatible haplotype. Here, we study diversification at the S-locus (i.e., a stable increase in the total number of SI haplotypes in the population, through the incorporation of new SI haplotypes), both deterministically (by investigating analytically the fate of mutations in an infinite population) and by simulations of finite populations. We show that the conditions allowing diversification are far less stringent in finite populations with recurrent mutations of the pollen and pistil genes, suggesting that diversification is possible in a panmictic population. We find that new SI haplotypes emerge fastest in populations with few SI haplotypes, and we discuss some implications for empirical data on S-alleles. However, allele numbers in our simulations never reach values as high as observed in plants whose SI systems have been studied, and we suggest extensions of our models that may reconcile the theory and data. PMID:21515570
NASA Astrophysics Data System (ADS)
Lemarchand, A.; Lesne, A.; Mareschal, M.
1995-05-01
The reaction-diffusion equation associated with the Fisher chemical model A+B-->2A admits wave-front solutions in which an unstable stationary state is replaced by a stable one. The deterministic analysis concludes that their propagation velocity is not prescribed by the dynamics. For a large class of initial conditions the velocity which is spontaneously selected is equal to the minimum allowed velocity v_min, as predicted by the marginal stability criterion. In order to test the relevance of this deterministic description, we investigate the macroscopic consequences, on the velocity and the width of the front, of the intrinsic stochasticity due to the underlying microscopic dynamics. We solve numerically the Langevin equations, deduced analytically from the master equation within a system-size expansion procedure. We show that the mean profile associated with the stochastic solution propagates faster than the deterministic solution, at a velocity up to 25% greater than v_min.
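For the deterministic baseline the abstract refers to, the front of u_t = D u_xx + k u(1-u) started from a steep initial condition propagates at the marginal-stability speed v_min = 2*sqrt(D*k). The finite-difference sketch below reproduces that deterministic selection; the paper's Langevin-level stochastic corrections are not reproduced here.

```python
import numpy as np

D, k = 1.0, 1.0
dx, dt = 0.5, 0.05
x = np.arange(0, 500, dx)
u = (x < 10).astype(float)            # steep (step) initial condition

def front_position(u):
    return x[np.argmin(np.abs(u - 0.5))]

positions, times = [], []
for step in range(1, 4001):
    lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    lap[0] = lap[-1] = 0.0            # crude fixed boundaries (u=1 on the left, u=0 on the right)
    u = u + dt * (D * lap + k * u * (1 - u))
    if step % 400 == 0:
        positions.append(front_position(u))
        times.append(step * dt)

v_measured = np.polyfit(times[2:], positions[2:], 1)[0]
print(f"measured front speed ~ {v_measured:.2f}, v_min = {2*np.sqrt(D*k):.2f}")
```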
NASA Astrophysics Data System (ADS)
Han, Jiang; Chen, Ye-Hwa; Zhao, Xiaomin; Dong, Fangfang
2018-04-01
A novel fuzzy dynamical system approach to the control design of flexible joint manipulators with mismatched uncertainty is proposed. Uncertainties of the system are assumed to lie within prescribed fuzzy sets. The desired system performance includes a deterministic phase and a fuzzy phase. First, by creatively implanting a fictitious control, a robust control scheme is constructed to render the system uniformly bounded and uniformly ultimately bounded. Both the manipulator modelling and the control scheme are deterministic and not based on IF-THEN heuristic rules. Next, a fuzzy-based performance index is proposed. An optimal design problem for a control design parameter is formulated as a constrained optimisation problem. The global solution to this problem can be obtained by solving two quartic equations. The fuzzy dynamical system approach is systematic and is able to assure deterministic performance as well as to minimise the fuzzy performance index.
Expansion Under Climate Change: The Genetic Consequences.
Garnier, Jimmy; Lewis, Mark A
2016-11-01
Range expansion and range shifts are crucial population responses to climate change. Genetic consequences are not well understood but are clearly coupled to ecological dynamics that, in turn, are driven by shifting climate conditions. We model a population with a deterministic reaction-diffusion model coupled to a heterogeneous environment that develops in time due to climate change. We decompose the resulting travelling wave solution into neutral genetic components to analyse the spatio-temporal dynamics of its genetic structure. Our analysis shows that range expansions and range shifts under slow climate change preserve genetic diversity. This is because slow climate change creates range boundaries that promote spatial mixing of genetic components. Mathematically, the mixing leads to so-called pushed travelling wave solutions. This mixing phenomenon is not seen in spatially homogeneous environments, where range expansion reduces genetic diversity through gene surfing arising from pulled travelling wave solutions. However, the preservation of diversity is diminished when climate change occurs too quickly. Using diversity indices, we show that fast expansions and range shifts erode genetic diversity more than slow range expansions and range shifts. Our study provides analytical insight into the dynamics of travelling wave solutions in heterogeneous environments.
Abstract: Two physically based and deterministic models, CASC2-D and KINEROS are evaluated and compared for their performances on modeling sediment movement on a small agricultural watershed over several events. Each model has different conceptualization of a watershed. CASC...
Identifiability Of Systems With Modeling Errors
NASA Technical Reports Server (NTRS)
Hadaegh, Yadolah "Fred"
1988-01-01
Advances in theory of modeling errors reported. Recent paper on errors in mathematical models of deterministic linear or weakly nonlinear systems. Extends theoretical work described in NPO-16661 and NPO-16785. Presents concrete way of accounting for difference in structure between mathematical model and physical process or system that it represents.
The goal of achieving verisimilitude of air quality simulations to observations is problematic. Chemical transport models such as the Community Multi-Scale Air Quality (CMAQ) modeling system produce volume averages of pollutant concentration fields. When grid sizes are such tha...
NASA Astrophysics Data System (ADS)
Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria
2014-05-01
Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic risk is increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of mega cities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructures, and undertaken with insufficient knowledge of the regional seismicity peculiarities and seismic hazard. The assessment of seismic hazard and generation of earthquake scenarios is the first link in the prevention chain and the first step in the evaluation of the seismic risk. The earthquake scenarios are intended as a basic input for developing detailed earthquake damage scenarios for the cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, which is the most populated (more than 1.2 million inhabitants), industrial and cultural region of Bulgaria and faces considerable earthquake risk. The available historical documents prove the occurrence of destructive earthquakes during the 15th-18th centuries in the Sofia zone. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century the strongest event in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century later (95 years), an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia on May 22nd, 2012. In the present study, the deterministic scenario event considered is a damaging earthquake with a higher probability of occurrence that could affect the city with intensity less than or equal to VIII. Usable and realistic ground motion maps for urban areas are generated either from the assumption of a "reference earthquake" or directly, showing values of macroseismic intensity generated by a damaging, real earthquake. In this study, applying a deterministic approach, an earthquake scenario in macroseismic intensity (a "model" earthquake scenario) is generated for the city of Sofia. The deterministic "model" intensity scenario based on the assumption of a "reference earthquake" is compared with a scenario based on observed macroseismic effects caused by the damaging 2012 earthquake (MW5.6). The difference between observed (Io) and predicted (Ip) intensity values is analyzed.
Individual Genetic Susceptibility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eric J. Hall
2008-12-08
Risk estimates derived from epidemiological studies of exposed populations, as well as the maximum permissible doses allowed for occupational exposure and exposure of the public to ionizing radiation, are all based on the assumption that the human population is uniform in its radiosensitivity, except for a small number of individuals, such as ATM homozygotes, who are easily identified by their clinical symptoms. The hypothesis upon which this proposal is based is that the human population is not homogeneous in radiosensitivity, but that radiosensitive sub-groups exist which are not easy to identify. These individuals would suffer an increased incidence of detrimental radiation effects, and distort the shape of the dose response relationship. The radiosensitivity of these groups depends on the expression levels of specific proteins. The plan was to investigate the effect of 3 relatively rare, highly penetrant genes available in mice, namely Atm, mRad9 & Brca1. The purpose of radiation protection is to prevent deterministic effects of clinical significance and limit stochastic effects to acceptable levels. We plan, therefore, to compare with wild type animals the radiosensitivity of mice heterozygous for each of the genes mentioned above, as well as double heterozygotes for pairs of genes, using two biological endpoints: a) ocular cataracts as an important and relevant deterministic effect, and b) oncogenic transformation in cultured embryo fibroblasts, as a surrogate for carcinogenesis, the most relevant stochastic effect.
Understanding Rasch Measurement: Rasch Models Overview.
ERIC Educational Resources Information Center
Wright, Benjamin D.; Mok, Magdalena
2000-01-01
Presents an overview of Rasch measurement models that begins with a conceptualization of continuous experiences often captured as discrete observations. Discusses the mathematical properties of the Rasch family of models that allow the transformation of discrete deterministic counts into continuous probabilistic abstractions. Also discusses six of…
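For readers unfamiliar with the family, the dichotomous Rasch model (standard formulation; the notation below is the conventional one, not taken from the article) expresses the probability that person n answers item i correctly as a logistic function of the difference between person ability and item difficulty:

```latex
P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)}
```

It is this mapping that converts discrete, deterministic counts of observed responses into positions on a continuous, probabilistic scale, as the overview describes.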
NASA Astrophysics Data System (ADS)
Gao, Yi
The development and utilization of wind energy for satisfying electrical demand has received considerable attention in recent years due to its tremendous environmental, social and economic benefits, together with public support and government incentives. Electric power generation from wind energy behaves quite differently from that of conventional sources. The fundamentally different operating characteristics of wind energy facilities therefore affect power system reliability in a different manner than those of conventional systems. The reliability impact of such a highly variable energy source is an important aspect that must be assessed when the wind power penetration is significant. The focus of the research described in this thesis is on the utilization of state sampling Monte Carlo simulation in wind integrated bulk electric system reliability analysis and the application of these concepts in system planning and decision making. Load forecast uncertainty is an important factor in long range planning and system development. This thesis describes two approximate approaches developed to reduce the number of steps in a load duration curve which includes load forecast uncertainty, and to provide reasonably accurate generating and bulk system reliability index predictions. The developed approaches are illustrated by application to two composite test systems. A method of generating correlated random numbers with uniform distributions and a specified correlation coefficient in the state sampling method is proposed and used to conduct adequacy assessment in generating systems and in bulk electric systems containing correlated wind farms in this thesis. The studies described show that it is possible to use the state sampling Monte Carlo simulation technique to quantitatively assess the reliability implications associated with adding wind power to a composite generation and transmission system including the effects of multiple correlated wind sites. This is an important development as it permits correlated wind farms to be incorporated in large practical system studies without requiring excessive increases in computer solution time. The procedures described in this thesis for creating monthly and seasonal wind farm models should prove useful in situations where time period models are required to incorporate scheduled maintenance of generation and transmission facilities. There is growing interest in combining deterministic considerations with probabilistic assessment in order to evaluate the quantitative system risk and conduct bulk power system planning. A relatively new approach that incorporates deterministic and probabilistic considerations in a single risk assessment framework has been designated as the joint deterministic-probabilistic approach. The research work described in this thesis illustrates that the joint deterministic-probabilistic approach can be effectively used to integrate wind power in bulk electric system planning. The studies described in this thesis show that the application of the joint deterministic-probabilistic method provides more stringent results for a system with wind power than the traditional deterministic N-1 method because the joint deterministic-probabilistic technique is driven by the deterministic N-1 criterion with an added probabilistic perspective which recognizes the power output characteristics of a wind turbine generator.
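As a minimal illustration of the state-sampling Monte Carlo idea at the heart of the thesis, the sketch below samples conventional units in or out of service from their forced outage rates, samples a wind farm output from a simple multi-state model, and estimates the loss-of-load probability against a fixed load. The unit data, wind states and load are invented for illustration; correlation between wind sites, load forecast uncertainty and the transmission network are all omitted.

```python
import numpy as np

rng = np.random.default_rng(4)
unit_capacity = np.array([200, 200, 150, 150, 100], dtype=float)   # MW
unit_for = np.array([0.05, 0.05, 0.04, 0.04, 0.03])                # forced outage rates
wind_states = np.array([0.0, 30.0, 60.0, 100.0])                   # MW output states
wind_probs = np.array([0.35, 0.30, 0.25, 0.10])
load = 600.0                                                        # MW

n_samples = 200_000
unit_up = rng.random((n_samples, unit_capacity.size)) > unit_for    # True = in service
conventional = unit_up.astype(float) @ unit_capacity
wind = rng.choice(wind_states, size=n_samples, p=wind_probs)
lolp = np.mean(conventional + wind < load)
print(f"estimated loss-of-load probability: {lolp:.4f}")
```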
Huisman, J.A.; Breuer, L.; Bormann, H.; Bronstert, A.; Croke, B.F.W.; Frede, H.-G.; Graff, T.; Hubrechts, L.; Jakeman, A.J.; Kite, G.; Lanini, J.; Leavesley, G.; Lettenmaier, D.P.; Lindstrom, G.; Seibert, J.; Sivapalan, M.; Viney, N.R.; Willems, P.
2009-01-01
An ensemble of 10 hydrological models was applied to the same set of land use change scenarios. There was general agreement about the direction of changes in the mean annual discharge and 90% discharge percentile predicted by the ensemble members, although a considerable range in the magnitude of predictions for the scenarios and catchments under consideration was obvious. Differences in the magnitude of the increase were attributed to the different mean annual actual evapotranspiration rates for each land use type. The ensemble of model runs was further analyzed with deterministic and probabilistic ensemble methods. The deterministic ensemble method based on a trimmed mean resulted in a single somewhat more reliable scenario prediction. The probabilistic reliability ensemble averaging (REA) method allowed a quantification of the model structure uncertainty in the scenario predictions. It was concluded that the use of a model ensemble has greatly increased our confidence in the reliability of the model predictions. © 2008 Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Bennett, J.; David, R. E.; Wang, Q.; Li, M.; Shrestha, D. L.
2016-12-01
Flood forecasting in Australia has historically relied on deterministic forecasting models run only when floods are imminent, with considerable forecaster input and interpretation. These now co-exist with a continually available 7-day streamflow forecasting service (also deterministic) aimed at operational water management applications such as environmental flow releases. The 7-day service is not optimised for flood prediction. We describe progress on developing a system for ensemble streamflow forecasting that is suitable for both flood prediction and water management applications. Precipitation uncertainty is handled through post-processing of Numerical Weather Prediction (NWP) output with a Bayesian rainfall post-processor (RPP). The RPP corrects biases, downscales NWP output, and produces reliable ensemble spread. Ensemble precipitation forecasts are used to force a semi-distributed conceptual rainfall-runoff model. Uncertainty in precipitation forecasts is insufficient to reliably describe streamflow forecast uncertainty, particularly at shorter lead-times. We characterise hydrological prediction uncertainty separately with a 4-stage error model. The error model relies on data transformation to ensure residuals are homoscedastic and symmetrically distributed. To ensure streamflow forecasts are accurate and reliable, the residuals are modelled using a mixture-Gaussian distribution with distinct parameters for the rising and falling limbs of the forecast hydrograph. In a case study of the Murray River in south-eastern Australia, we show ensemble predictions of floods generally have lower errors than deterministic forecasting methods. We also discuss some of the challenges in operationalising short-term ensemble streamflow forecasts in Australia, including meeting the needs for accurate predictions across all flow ranges and comparing forecasts generated by event and continuous hydrological models.
Wu, Fei; Sioshansi, Ramteen
2017-05-04
Here, we develop a model to optimize the location of public fast charging stations for electric vehicles (EVs). A difficulty in planning the placement of charging stations is uncertainty in where EV charging demands appear. For this reason, we use a stochastic flow-capturing location model (SFCLM). A sample-average approximation method and an averaged two-replication procedure are used to solve the problem and estimate the solution quality. We demonstrate the use of the SFCLM with a Central Ohio-based case study. We find that most of the stations built are concentrated around the urban core of the region. As the number of stations built increases, some appear on the outskirts of the region to provide an extended charging network. We find that the sets of optimal charging station locations as a function of the number of stations built are approximately nested. We demonstrate the benefits of the charging-station network in terms of how many EVs are able to complete their daily trips by charging midday: six public charging stations allow at least 60% of the EVs that would otherwise not be able to complete their daily tours to do so. We finally compare the SFCLM to a deterministic model in which EV flows are set equal to their expected values. We show that if a limited number of charging stations are to be built, the SFCLM outperforms the deterministic model. As the number of stations to be built increases, the SFCLM and deterministic model select very similar station locations.
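The sample-average approximation idea behind the SFCLM can be shown on a toy network: sample scenarios of EV charging flows, then choose the station set that maximizes the average captured demand across the samples. The network, paths, demand model, and brute-force search below are invented for illustration; real instances require an integer-programming formulation rather than enumeration.

```python
# Sketch: sample-average approximation (SAA) for a toy flow-capturing
# location problem. A flow is "captured" if at least one chosen station
# lies on its path. Network, demands, and scenario model are hypothetical.
import itertools
import random

candidate_sites = ["A", "B", "C", "D", "E"]
paths = {                      # origin-destination paths over candidate sites
    "od1": {"A", "B"},
    "od2": {"B", "C"},
    "od3": {"C", "D", "E"},
    "od4": {"A", "E"},
}

def sample_scenario(rng):
    """Random charging demand (vehicles/day) appearing on each OD path."""
    return {od: rng.randint(0, 50) for od in paths}

def captured(stations, scenario):
    return sum(d for od, d in scenario.items() if paths[od] & set(stations))

rng = random.Random(42)
scenarios = [sample_scenario(rng) for _ in range(200)]   # SAA sample

k = 2   # number of stations to build
best = max(itertools.combinations(candidate_sites, k),
           key=lambda s: sum(captured(s, sc) for sc in scenarios))
print("best station pair:", best)
```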
Deterministic chaotic dynamics of Raba River flow (Polish Carpathian Mountains)
NASA Astrophysics Data System (ADS)
Kędra, Mariola
2014-02-01
Is the underlying dynamics of river flow random or deterministic? If it is deterministic, is it deterministic chaotic? This issue is still controversial. The application of several independent methods, techniques and tools to daily river flow data gives consistent, reliable and clear-cut answers to this question. The results indicate that the investigated discharge dynamics is not random but deterministic. Moreover, the results fully confirm the nonlinear deterministic chaotic nature of the studied process. The research was conducted on daily discharge data from two selected gauging stations on a mountain river in southern Poland, the Raba River.
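One standard tool in this kind of analysis (not necessarily among those used in the study) is the correlation sum computed on a time-delay embedding of the series; a correlation dimension that saturates as the embedding dimension grows is commonly taken as evidence of low-dimensional deterministic dynamics. A minimal sketch with synthetic data and assumed delay, embedding dimensions, and radius:

```python
# Sketch: time-delay embedding and correlation sum C(r), one common test
# for low-dimensional deterministic dynamics. The delay, embedding
# dimensions, and synthetic series are illustrative, not the study's setup.
import numpy as np
from scipy.spatial.distance import pdist

def embed(x, dim, tau):
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def correlation_sum(x, dim, tau, r):
    pts = embed(x, dim, tau)
    d = pdist(pts)               # pairwise distances between embedded points
    return np.mean(d < r)

x = np.sin(0.3 * np.arange(2000)) + 0.01 * np.random.default_rng(0).standard_normal(2000)
for m in (2, 4, 6):
    print(m, correlation_sum(x, dim=m, tau=10, r=0.5))
```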
NASA Astrophysics Data System (ADS)
Turnbull, Heather; Omenzetter, Piotr
2018-03-01
Difficulties associated with current health monitoring and inspection practices, combined with the harsh, often remote, operational environments of wind turbines, highlight the need for a non-destructive evaluation system capable of remotely monitoring the structural state of turbine blades. This research adopted a physics-based structural health monitoring methodology through calibration of a finite element model using inverse techniques. A 2.36 m blade from a 5 kW turbine was used as an experimental specimen, with operational modal analysis techniques utilised to identify the modal properties of the system. Modelling the experimental responses as fuzzy numbers using the sub-level technique, uncertainty in the response parameters was propagated back through the model and into the updating parameters. Initially, experimental responses of the blade were obtained, and a numerical model of the blade was created and updated. Deterministic updating was carried out through formulation and minimisation of a deterministic objective function using both the firefly algorithm and the virus optimisation algorithm. Uncertainty in the experimental responses was modelled using triangular membership functions, allowing membership functions of the updating parameters (Young's modulus and shear modulus) to be obtained. The firefly algorithm and virus optimisation algorithm were again utilised, this time in the solution of fuzzy objective functions. This enabled the uncertainty associated with the updating parameters to be quantified. Varying damage location and severity were simulated experimentally through the addition of small masses to the structure intended to cause a structural alteration. A damaged model was created, modelling four variable-magnitude non-structural masses at predefined points, and updated to provide a deterministic damage prediction and information on the uncertainty of the parameters via fuzzy updating.
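The deterministic updating step described above amounts to minimising the misfit between measured and model-predicted modal properties over the stiffness parameters. The sketch below uses a trivial closed-form frequency "model" and SciPy's differential evolution in place of the firefly and virus optimisation algorithms used in the study; all numbers are invented.

```python
# Sketch: deterministic FE model updating as minimisation of a modal-misfit
# objective. The two-parameter frequency surrogate and target frequencies are
# hypothetical; differential evolution stands in for the firefly / virus
# optimisation algorithms used in the study.
import numpy as np
from scipy.optimize import differential_evolution

measured_freqs = np.array([12.4, 38.7, 81.2])          # Hz, hypothetical blade modes

def predicted_freqs(E, G):
    """Toy surrogate: natural frequencies scale with the square root of stiffness."""
    base = np.array([12.0, 38.0, 80.0])
    return base * np.sqrt(0.7 * E / 40e9 + 0.3 * G / 4e9)

def objective(theta):
    E, G = theta
    rel_err = (predicted_freqs(E, G) - measured_freqs) / measured_freqs
    return np.sum(rel_err ** 2)                         # least-squares misfit

bounds = [(30e9, 50e9), (3e9, 5e9)]                     # Young's and shear moduli (Pa)
result = differential_evolution(objective, bounds, seed=1)
print("updated E, G:", result.x, "misfit:", result.fun)
```

In a fuzzy updating scheme, the same minimisation would be repeated at each membership level of the measured responses to recover membership functions for E and G.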
Simulation of anaerobic digestion processes using stochastic algorithm.
Palanichamy, Jegathambal; Palani, Sundarambal
2014-01-01
Anaerobic digestion (AD) processes involve numerous complex biological and chemical reactions occurring simultaneously, and appropriate, efficient models are needed to simulate anaerobic digestion systems. Although several models have been developed, most suffer from a lack of knowledge of the constants, from complexity, and from weak generalization. The basis of the deterministic approach for modelling the physico-chemical and biochemical reactions occurring in the AD system is the law of mass action, which gives a simple relationship between the reaction rates and the species concentrations. The assumptions made in the deterministic models do not hold true for reactions involving chemical species at low concentration. The stochastic behaviour of the physicochemical processes can be modelled at the mesoscopic level by applying stochastic algorithms. In this paper, a stochastic algorithm (the Gillespie tau-leap method) developed in MATLAB was applied to predict the concentrations of glucose, acids and methane formed at different time intervals, so that the performance of the digester system can be controlled. The processes given by ADM1 (Anaerobic Digestion Model 1) were taken for verification of the model. The proposed model was verified by comparing the results of Gillespie's algorithm with the deterministic solution for the conversion of glucose into methane through degraders. At higher values of τ (time step), the computational time required to reach the steady state is greater since the number of chosen reactions is smaller. When the simulation time step is reduced, the results are similar to those of the ODE solver. It was concluded that the stochastic algorithm is a suitable approach for the simulation of complex anaerobic digestion processes. The accuracy of the results depends on the optimum selection of the tau value.
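The tau-leaping idea can be sketched for a drastically simplified two-step chain (glucose to acid to methane): within each leap of length τ, the number of firings of each reaction is drawn from a Poisson distribution with mean equal to the propensity times τ. The rate constants, initial counts, and τ below are assumed values, and the chain is only a caricature of the ADM1 reaction network used in the paper.

```python
# Sketch: Gillespie tau-leaping for a toy glucose -> acid -> methane chain.
# Rate constants, initial counts, and tau are assumed; the study simulates
# the full ADM1 reaction set.
import numpy as np

rng = np.random.default_rng(0)
k1, k2 = 0.02, 0.01                     # assumed first-order rate constants (1/h)
state = np.array([1000, 0, 0])          # molecule counts: glucose, acid, methane
stoich = np.array([[-1, 1, 0],          # R1: glucose -> acid
                   [0, -1, 1]])         # R2: acid -> methane
tau, t, t_end = 0.5, 0.0, 400.0

while t < t_end:
    propensities = np.array([k1 * state[0], k2 * state[1]])
    firings = rng.poisson(propensities * tau)          # leap over many events at once
    state = np.maximum(state + firings @ stoich, 0)    # clip to avoid negative counts
    t += tau

print("final counts (glucose, acid, methane):", state)
```

Larger τ values take fewer, coarser steps per unit time, which is where the trade-off between speed and accuracy discussed in the abstract arises.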
Kim, Sung-Cheol; Wunsch, Benjamin H; Hu, Huan; Smith, Joshua T; Austin, Robert H; Stolovitzky, Gustavo
2017-06-27
Deterministic lateral displacement (DLD) is a technique for size fractionation of particles in continuous flow that has shown great potential for biological applications. Several theoretical models have been proposed, but experimental evidence has demonstrated that a rich class of intermediate migration behavior exists, which is not predicted. We present a unified theoretical framework to infer the path of particles in the whole array on the basis of trajectories in a unit cell. This framework explains many of the unexpected particle trajectories reported and can be used to design arrays for even nanoscale particle fractionation. We performed experiments that verify these predictions and used our model to develop a condenser array that achieves full particle separation with a single fluidic input.
Developing Stochastic Models as Inputs for High-Frequency Ground Motion Simulations
NASA Astrophysics Data System (ADS)
Savran, William Harvey
High-frequency ( 10 Hz) deterministic ground motion simulations are challenged by our understanding of the small-scale structure of the Earth's crust and of the rupture process during an earthquake. We will likely never obtain deterministic models that can accurately describe these processes down to the meter-scale lengths required for broadband wave propagation. Instead, we can attempt to explain the behavior, in a statistical sense, by including stochastic models defined by correlations observed in the natural earth and through physics-based simulations of the earthquake rupture process. Toward this goal, we develop stochastic models to address both of the primary considerations for deterministic ground motion simulations: the description of the material properties in the crust, and broadband earthquake source descriptions. Using borehole sonic log data recorded in the Los Angeles basin, we estimate the spatial correlation structure of the small-scale fluctuations in P-wave velocities by determining the best-fitting parameters of a von Karman correlation function. We find that Hurst exponents, nu, between 0.0 and 0.2, vertical correlation lengths, az, of 15-150 m, and a standard deviation, sigma, of about 5% characterize the variability in the borehole data. Using these parameters, we generated a stochastic model of velocity and density perturbations and combined it with leading seismic velocity models to perform a validation exercise for the 2008 Chino Hills, CA, earthquake using heterogeneous media. We find that models of velocity and density perturbations can have significant effects on the wavefield at frequencies as low as 0.3 Hz, with ensemble median values of various ground motion metrics varying by up to +/-50% at certain stations compared to those computed solely from the CVM. Finally, we develop a kinematic rupture generator based on dynamic rupture simulations on geometrically complex faults. We analyze 100 dynamic rupture simulations on strike-slip faults ranging from Mw 6.4 to 7.2. We find that our dynamic simulations follow empirical scaling relationships for inter-plate strike-slip events and provide source spectra comparable with an ω⁻² model. Our rupture generator reproduces GMPE medians and intra-event standard deviations of spectral accelerations for an ensemble of 10 Hz fully deterministic ground motion simulations, as compared to NGA-West2 GMPE relationships up to 0.2 seconds.
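The borehole-derived statistics above can be turned into a synthetic perturbation profile with a standard spectral method: filter white noise by the square root of a 1-D von Karman power spectrum. The sketch below picks representative parameter values from the ranges quoted in the abstract (az = 50 m, nu = 0.1, sigma = 5%); the implementation itself is illustrative and is not the dissertation's code.

```python
# Sketch: 1-D von Karman random field for small-scale velocity perturbations,
# generated by spectral filtering of white noise. Parameter values are picked
# from the ranges quoted above; the implementation is illustrative only.
import numpy as np

n, dz = 4096, 1.0                 # samples and vertical spacing (m)
a_z, nu, sigma = 50.0, 0.1, 0.05  # correlation length (m), Hurst exponent, std dev

k = np.fft.rfftfreq(n, d=dz) * 2 * np.pi                  # angular wavenumbers
psd = a_z / (1.0 + (k * a_z) ** 2) ** (nu + 0.5)          # 1-D von Karman spectrum shape

noise = np.fft.rfft(np.random.default_rng(3).standard_normal(n))
field = np.fft.irfft(noise * np.sqrt(psd), n=n)
field *= sigma / field.std()                              # rescale to target std dev
print("perturbation std (fractional dVp/Vp):", field.std())
```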
Probabilistic and deterministic evaluation of uncertainty in a local scale multi-risk analysis
NASA Astrophysics Data System (ADS)
Lari, S.; Frattini, P.; Crosta, G. B.
2009-04-01
We performed a probabilistic multi-risk analysis (QPRA) at the local scale for a 420 km2 area surrounding the town of Brescia (Northern Italy). We calculated the expected annual loss, in terms of economic damage and loss of life, for a set of risk scenarios of flood, earthquake and industrial accident with different occurrence probabilities and different intensities. The territorial unit used for the study was the census parcel, of variable area, for which a large amount of data was available. Due to the lack of information on the hazards, on the value of the exposed elements (e.g., residential and industrial areas, population, lifelines, and sensitive elements such as schools and hospitals), and on the process-specific vulnerability, as well as incomplete knowledge of the processes (floods, industrial accidents, earthquakes), we assigned an uncertainty to the input variables of the analysis. For some variables a homogeneous uncertainty was assigned over the whole study area, for instance for the number of buildings of various typologies and for the event occurrence probability. In other cases, as for phenomenon intensity (e.g., water depth during a flood) and probability of impact, the uncertainty was defined in relation to the census parcel area; by assuming some variables to be homogeneously distributed or averaged over the census parcels, we introduce a larger error for larger parcels. We propagated the uncertainty through the analysis using three different models, describing the reliability of the output (risk) as a function of the uncertainty of the inputs (scenarios and vulnerability functions): a probabilistic approach based on Monte Carlo simulation and two deterministic models, namely First Order Second Moment (FOSM) and Point Estimate (PE). In general, similar values of expected losses are obtained with the three models, and in all three cases the uncertainty of the final risk value is around 30% of the expected value. Each of the models, nevertheless, requires different assumptions and computational effort, and provides results with different levels of detail.
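The contrast between the probabilistic and the deterministic propagation schemes can be shown on a scalar toy risk function (exposed value times vulnerability times occurrence probability, all uncertain): Monte Carlo samples the inputs directly, while FOSM approximates the output mean and variance from a first-order Taylor expansion about the input means. The distributions and numbers below are invented and do not come from the study.

```python
# Sketch: propagating input uncertainty through a toy risk model
# risk = value * vulnerability * probability, by Monte Carlo and by the
# First Order Second Moment (FOSM) approximation. All numbers hypothetical.
import numpy as np

rng = np.random.default_rng(7)
mu = np.array([2.0e6, 0.30, 0.01])   # means: exposed value (EUR), vulnerability, annual prob.
cv = np.array([0.20, 0.30, 0.50])    # coefficients of variation
sd = mu * cv

def risk(x):
    return x[..., 0] * x[..., 1] * x[..., 2]

# Monte Carlo propagation (independent normal inputs, for simplicity)
samples = rng.normal(mu, sd, size=(100_000, 3))
r = risk(samples)
mc_mean, mc_std = r.mean(), r.std()

# FOSM: first-order Taylor expansion about the mean
grad = np.array([mu[1] * mu[2], mu[0] * mu[2], mu[0] * mu[1]])
fosm_mean = risk(mu)
fosm_std = np.sqrt(np.sum((grad * sd) ** 2))

print(f"Monte Carlo: {mc_mean:.0f} +/- {mc_std:.0f} EUR/yr")
print(f"FOSM:        {fosm_mean:.0f} +/- {fosm_std:.0f} EUR/yr")
```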
Critical thresholds for eventual extinction in randomly disturbed population growth models.
Peckham, Scott D; Waymire, Edward C; De Leenheer, Patrick
2018-02-16
This paper considers several single species growth models featuring a carrying capacity, which are subject to random disturbances that lead to instantaneous population reduction at the disturbance times. This is motivated in part by growing concerns about the impacts of climate change. Our main goal is to understand whether or not the species can persist in the long run. We consider the discrete-time stochastic process obtained by sampling the system immediately after the disturbances, and find various thresholds for several modes of convergence of this discrete process, including thresholds for the absence or existence of a positively supported invariant distribution. These thresholds are given explicitly in terms of the intensity and frequency of the disturbances on the one hand, and the population's growth characteristics on the other. We also perform a similar threshold analysis for the original continuous-time stochastic process, and obtain a formula that allows us to express the invariant distribution for this continuous-time process in terms of the invariant distribution of the discrete-time process, and vice versa. Examples illustrate that these distributions can differ, and this sends a cautionary message to practitioners who wish to parameterize these and related models using field data. Our analysis relies heavily on a particular feature shared by all the deterministic growth models considered here, namely that their solutions exhibit an exponentially weighted averaging property between a function of the initial condition, and the same function applied to the carrying capacity. This property is due to the fact that these systems can be transformed into affine systems.
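The exponentially weighted averaging property described above can be made concrete with the logistic model (our example, not necessarily one treated in the paper): the substitution x = 1/N turns the equation into an affine one, whose solution is an exponentially weighted average of a function of the initial condition and the same function of the carrying capacity.

```latex
% Logistic growth dN/dt = r N (1 - N/K). With x = 1/N the equation becomes
% affine, dx/dt = r(1/K - x), whose solution is an exponentially weighted
% average of h(N_0) = 1/N_0 and h(K) = 1/K:
\begin{align*}
  \frac{1}{N(t)} &= e^{-rt}\,\frac{1}{N_0} + \bigl(1 - e^{-rt}\bigr)\,\frac{1}{K},\\
  N(t) &= \frac{K N_0}{N_0 + (K - N_0)\,e^{-rt}}.
\end{align*}
```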