Sample records for spatial stochastic simulation

  1. Spatially explicit and stochastic simulation of forest landscape fire disturbance and succession

    Treesearch

    Hong S. He; David J. Mladenoff

    1999-01-01

    Understanding disturbance and recovery of forest landscapes is a challenge because of complex interactions over a range of temporal and spatial scales. Landscape simulation models offer an approach to studying such systems at broad scales. Fire can be simulated spatially using mechanistic or stochastic approaches. We describe the fire module in a spatially explicit,...

  2. Binomial tau-leap spatial stochastic simulation algorithm for applications in chemical kinetics.

    PubMed

    Marquez-Lago, Tatiana T; Burrage, Kevin

    2007-09-14

    In cell biology, cell signaling pathway problems are often tackled with deterministic temporal models, well-mixed stochastic simulators, and/or hybrid methods. In fact, however, three-dimensional stochastic spatial modeling of reactions happening inside the cell is needed to fully understand these signaling pathways, because noise effects, low molecular concentrations, and spatial heterogeneity can all affect the cellular dynamics. Nevertheless, important effects can be accounted for without going to the extent of using highly resolved spatial simulators (such as single-particle software), reducing the overall computation time significantly. We present a new coarse-grained, modified version of the next subvolume method that allows the user to consider both diffusion and reaction events over relatively long simulation time spans compared with the original method and other commonly used fully stochastic computational methods. Benchmarking of the simulation algorithm was performed through comparison with the next subvolume method and well-mixed models (MATLAB), as well as stochastic particle reaction and transport simulations (CHEMCELL, Sandia National Laboratories). Additionally, we construct a model based on a set of chemical reactions in the epidermal growth factor receptor pathway. For this particular application and a bistable chemical system example, we analyze and outline the advantages of the presented binomial tau-leap spatial stochastic simulation algorithm, in terms of efficiency and accuracy, in scenarios of both molecular homogeneity and heterogeneity.
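
    The central idea of binomial tau-leaping for diffusion can be sketched in a few lines. The toy below is a hypothetical 1-D illustration of that idea, not the authors' implementation: over a leap of length tau, the number of molecules leaving each subvolume is a binomial draw with leaving probability 1 - exp(-k*d*tau) (k neighbours, per-molecule hop propensity d), so a subvolume can never be overdrawn and counts stay non-negative by construction.

```python
import math
import random

def binomial_draw(n, p, rng):
    """Sample Binomial(n, p) by summing Bernoulli trials (fine for small n)."""
    return sum(rng.random() < p for _ in range(n))

def binomial_tau_leap_diffusion(counts, d, tau, rng):
    """One tau-leap of pure diffusion on a 1-D chain of subvolumes.

    counts : molecule numbers per subvolume (the pre-leap state)
    d      : per-molecule hop propensity to one neighbour (1/time)
    tau    : leap length; a molecule with k neighbours leaves during the
             leap with probability 1 - exp(-k * d * tau)
    """
    n = len(counts)
    new = list(counts)
    for i in range(n):
        nbrs = [j for j in (i - 1, i + 1) if 0 <= j < n]
        p_leave = 1.0 - math.exp(-len(nbrs) * d * tau)
        leaving = binomial_draw(counts[i], p_leave, rng)  # bounded by counts[i]
        new[i] -= leaving
        for _ in range(leaving):
            new[rng.choice(nbrs)] += 1  # leavers split uniformly over neighbours
    return new

rng = random.Random(1)
state = [100, 0, 0, 0]
for _ in range(50):
    state = binomial_tau_leap_diffusion(state, d=0.5, tau=0.1, rng=rng)
print(sum(state))  # binomial draws never overdraw a subvolume, so total stays 100
```

    Because the leaving count is bounded by the current population, this avoids the negative-population problem of naive Poisson tau-leaping applied to diffusion.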

  3. Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology.

    PubMed

    Schaff, James C; Gao, Fei; Li, Ye; Novak, Igor L; Slepchenko, Boris M

    2016-12-01

    Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium 'sparks' as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell.

  4. HRSSA - Efficient hybrid stochastic simulation for spatially homogeneous biochemical reaction networks

    NASA Astrophysics Data System (ADS)

    Marchetti, Luca; Priami, Corrado; Thanh, Vo Hong

    2016-07-01

    This paper introduces HRSSA (Hybrid Rejection-based Stochastic Simulation Algorithm), a new efficient hybrid stochastic simulation algorithm for spatially homogeneous biochemical reaction networks. HRSSA is built on top of RSSA, an exact stochastic simulation algorithm which relies on propensity bounds to select next reaction firings and to reduce the average number of reaction propensity updates needed during the simulation. HRSSA exploits the computational advantage of propensity bounds to manage time-varying transition propensities and to apply dynamic partitioning of reactions, which constitute the two most significant bottlenecks of hybrid simulation. A comprehensive set of simulation benchmarks is provided for evaluating performance and accuracy of HRSSA against other state-of-the-art algorithms.
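
    The propensity-bound rejection step that RSSA (and hence HRSSA) relies on can be sketched compactly. This is a simplified, hypothetical single-step illustration under assumed bound bookkeeping, not the authors' code: a candidate reaction is drawn from the upper bounds and accepted by rejection sampling, so the exact propensity is evaluated only when the cheap lower-bound test fails.

```python
import math
import random

def rssa_step(state, reactions, bounds, rng):
    """One firing of a rejection-based SSA with propensity bounds.

    reactions : list of (propensity_fn, state_change) pairs
    bounds    : list of (lo, hi) with lo <= a(state) <= hi as long as the
                state stays inside its current fluctuation interval
    Returns (tau, mu) and applies reaction mu's state change in place.
    """
    a_hi_sum = sum(hi for _, hi in bounds)
    tau = 0.0
    while True:
        tau += -math.log(rng.random()) / a_hi_sum  # each trial consumes time
        r = rng.random() * a_hi_sum                # candidate from upper bounds
        mu, acc = 0, bounds[0][1]
        while acc < r:
            mu += 1
            acc += bounds[mu][1]
        lo, hi = bounds[mu]
        u = rng.random()
        if u * hi <= lo:                           # cheap accept: no evaluation
            break
        if u * hi <= reactions[mu][0](state):      # exact propensity on demand
            break                                  # otherwise reject and redraw
    for species, change in reactions[mu][1].items():
        state[species] += change
    return tau, mu

# usage: birth-death process, X -> X+1 at rate 1.0 and X -> X-1 at rate 0.1*X,
# with bounds valid while 5 <= X <= 15 (an assumed fluctuation interval)
state = {"X": 10}
reactions = [(lambda s: 1.0, {"X": +1}), (lambda s: 0.1 * s["X"], {"X": -1})]
bounds = [(1.0, 1.0), (0.1 * 5, 0.1 * 15)]
tau, mu = rssa_step(state, reactions, bounds, rng=random.Random(0))
```

    The payoff is that, as long as the state stays inside its fluctuation interval, most trials are resolved against the precomputed bounds and exact propensities are recomputed only rarely.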

  5. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.

  6. Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology

    PubMed Central

    Gao, Fei; Li, Ye; Novak, Igor L.; Slepchenko, Boris M.

    2016-01-01

    Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium ‘sparks’ as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell. PMID:27959915

  7. HRSSA – Efficient hybrid stochastic simulation for spatially homogeneous biochemical reaction networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marchetti, Luca, E-mail: marchetti@cosbi.eu; Priami, Corrado, E-mail: priami@cosbi.eu; University of Trento, Department of Mathematics

    This paper introduces HRSSA (Hybrid Rejection-based Stochastic Simulation Algorithm), a new efficient hybrid stochastic simulation algorithm for spatially homogeneous biochemical reaction networks. HRSSA is built on top of RSSA, an exact stochastic simulation algorithm which relies on propensity bounds to select next reaction firings and to reduce the average number of reaction propensity updates needed during the simulation. HRSSA exploits the computational advantage of propensity bounds to manage time-varying transition propensities and to apply dynamic partitioning of reactions, which constitute the two most significant bottlenecks of hybrid simulation. A comprehensive set of simulation benchmarks is provided for evaluating performance and accuracy of HRSSA against other state-of-the-art algorithms.

  8. Analytical approximations for spatial stochastic gene expression in single cells and tissues

    PubMed Central

    Smith, Stephen; Cianci, Claudia; Grima, Ramon

    2016-01-01

    Gene expression occurs in an environment in which both stochastic and diffusive effects are significant. Spatial stochastic simulations are computationally expensive compared with their deterministic counterparts, and hence little is currently known of the significance of intrinsic noise in a spatial setting. Starting from the reaction–diffusion master equation (RDME) describing stochastic reaction–diffusion processes, we here derive expressions for the approximate steady-state mean concentrations which are explicit functions of the dimensionality of space, rate constants and diffusion coefficients. The expressions have a simple closed form when the system consists of one effective species. These formulae show that, even for spatially homogeneous systems, mean concentrations can depend on diffusion coefficients: this contradicts the predictions of deterministic reaction–diffusion processes, thus highlighting the importance of intrinsic noise. We confirm our theory by comparison with stochastic simulations, using the RDME and Brownian dynamics, of two models of stochastic and spatial gene expression in single cells and tissues. PMID:27146686

  9. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME

    PubMed Central

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948

  10. ML-Space: Hybrid Spatial Gillespie and Particle Simulation of Multi-Level Rule-Based Models in Cell Biology.

    PubMed

    Bittig, Arne T; Uhrmacher, Adelinde M

    2017-01-01

    Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial stochastic simulation algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically and hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact model descriptions and makes it easy to adapt the spatial resolution of a model.

  11. The Time Dependent Propensity Function for Acceleration of Spatial Stochastic Simulation of Reaction-Diffusion Systems

    PubMed Central

    Wu, Sheng; Li, Hong; Petzold, Linda R.

    2015-01-01

    The inhomogeneous stochastic simulation algorithm (ISSA) is a fundamental method for spatial stochastic simulation. However, when diffusion events occur more frequently than reaction events, simulating the diffusion events by ISSA is quite costly. To reduce this cost, we propose to use the time-dependent propensity function in each step. In this way we can avoid simulating individual diffusion events, and use the time interval between two adjacent reaction events as the simulation step size. We demonstrate that the new algorithm can achieve orders-of-magnitude efficiency gains over widely used exact algorithms, scales well with increasing grid resolution, and maintains a high level of accuracy. PMID:26609185

  12. The diffusive finite state projection algorithm for efficient simulation of the stochastic reaction-diffusion master equation.

    PubMed

    Drawert, Brian; Lawson, Michael J; Petzold, Linda; Khammash, Mustafa

    2010-02-21

    We have developed a computational framework for accurate and efficient simulation of stochastic spatially inhomogeneous biochemical systems. The new computational method employs a fractional step hybrid strategy. A novel formulation of the finite state projection (FSP) method, called the diffusive FSP method, is introduced for the efficient and accurate simulation of diffusive transport. Reactions are handled by the stochastic simulation algorithm.
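
    The reaction part of this framework ("handled by the stochastic simulation algorithm") is classic Gillespie. A minimal, self-contained sketch of the direct method, with an assumed toy dimerisation example for illustration:

```python
import math
import random

def ssa_direct(state, reactions, t_end, rng):
    """Gillespie's direct method: exact trajectory of a well-mixed system.

    state     : dict species -> count (modified in place)
    reactions : list of (propensity_fn, state_change_dict)
    """
    t = 0.0
    while True:
        props = [a(state) for a, _ in reactions]
        a0 = sum(props)
        if a0 == 0.0:
            return t                        # no reaction can fire any more
        t += -math.log(rng.random()) / a0   # waiting time ~ Exp(a0)
        if t > t_end:
            return t_end
        r, mu, acc = rng.random() * a0, 0, props[0]
        while acc < r:                      # pick reaction mu with prob props[mu]/a0
            mu += 1
            acc += props[mu]
        for species, change in reactions[mu][1].items():
            state[species] += change

# usage: irreversible dimerisation 2A -> B with an assumed rate constant c = 0.01
rng = random.Random(42)
state = {"A": 100, "B": 0}
reactions = [(lambda s: 0.01 * s["A"] * (s["A"] - 1) / 2, {"A": -2, "B": +1})]
ssa_direct(state, reactions, t_end=100.0, rng=rng)
print(state["A"] + 2 * state["B"])  # conservation law: this always prints 100
```

    Spatial methods such as the diffusive FSP wrap a per-subvolume version of exactly this loop inside a fractional-step treatment of diffusion.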

  13. Efficient Simulation of Tropical Cyclone Pathways with Stochastic Perturbations

    NASA Astrophysics Data System (ADS)

    Webber, R.; Plotkin, D. A.; Abbot, D. S.; Weare, J.

    2017-12-01

    Global Climate Models (GCMs) are known to statistically underpredict intense tropical cyclones (TCs) because they fail to capture the rapid intensification and high wind speeds characteristic of the most destructive TCs. Stochastic parametrization schemes have the potential to improve the accuracy of GCMs. However, current analysis of these schemes through direct sampling is limited by the computational expense of simulating a rare weather event at fine spatial gridding. The present work introduces a stochastically perturbed parametrization tendency (SPPT) scheme to increase simulated intensity of TCs. We adapt the Weighted Ensemble algorithm to simulate the distribution of TCs at a fraction of the computational effort required in direct sampling. We illustrate the efficiency of the SPPT scheme by comparing simulations at different spatial resolutions and stochastic parameter regimes. Stochastic parametrization and rare event sampling strategies have great potential to improve TC prediction and aid understanding of tropical cyclogenesis. Since rising sea surface temperatures are postulated to increase the intensity of TCs, these strategies can also improve predictions about climate change-related weather patterns. The rare event sampling strategies used in the current work are not only a novel tool for studying TCs, but they may also be applied to sampling any range of extreme weather events.

  14. Latin hypercube sampling and geostatistical modeling of spatial uncertainty in a spatially explicit forest landscape model simulation

    Treesearch

    Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu

    2005-01-01

    Geostatistical stochastic simulation is typically combined with the Monte Carlo method to quantify the uncertainty in spatial model simulations. However, because spatially explicit forest models are complex and therefore slow to run, it is often infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...
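
    Latin hypercube sampling itself is simple to state: each input dimension is cut into as many equal-probability strata as there are samples, and every stratum is used exactly once, so even a small sample covers the full range of each input. A minimal sketch on the unit hypercube (illustrative only; the paper combines LHS with geostatistical simulation of spatial uncertainty):

```python
import random

def latin_hypercube(n_samples, n_dims, rng):
    """Plain Latin hypercube sample on the unit hypercube.

    Each dimension is cut into n_samples equal strata; each stratum is hit
    exactly once, and strata are paired randomly across dimensions.
    """
    points = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)                          # random pairing across dims
        for i in range(n_samples):
            points[i][d] = (strata[i] + rng.random()) / n_samples  # jitter in stratum
    return points

rng = random.Random(7)
pts = latin_hypercube(10, 2, rng)
# every decile [k/10, (k+1)/10) of each coordinate contains exactly one point
```

    This stratification is why a few dozen LHS runs of an expensive landscape model can stand in for many hundreds of plain Monte Carlo draws.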

  15. Parallel STEPS: Large Scale Stochastic Spatial Reaction-Diffusion Simulation with High Performance Computers

    PubMed Central

    Chen, Weiliang; De Schutter, Erik

    2017-01-01

    Stochastic, spatial reaction-diffusion simulations have been widely used in systems biology and computational neuroscience. However, the increasing scale and complexity of models and morphologies have exceeded the capacity of any serial implementation. This led to the development of parallel solutions that benefit from the boost in performance of modern supercomputers. In this paper, we describe an MPI-based, parallel operator-splitting implementation for stochastic spatial reaction-diffusion simulations with irregular tetrahedral meshes. The performance of our implementation is first examined and analyzed with simulations of a simple model. We then demonstrate its application to real-world research by simulating the reaction-diffusion components of a published calcium burst model in both Purkinje neuron sub-branch and full dendrite morphologies. Simulation results indicate that our implementation is capable of achieving super-linear speedup for balanced loading simulations with reasonable molecule density and mesh quality. In the best scenario, a parallel simulation with 2,000 processes runs more than 3,600 times faster than its serial SSA counterpart, and achieves more than 20-fold speedup relative to parallel simulation with 100 processes. In a more realistic scenario with dynamic calcium influx and data recording, the parallel simulation with 1,000 processes and no load balancing is still 500 times faster than the conventional serial SSA simulation. PMID:28239346

  16. Parallel STEPS: Large Scale Stochastic Spatial Reaction-Diffusion Simulation with High Performance Computers.

    PubMed

    Chen, Weiliang; De Schutter, Erik

    2017-01-01

    Stochastic, spatial reaction-diffusion simulations have been widely used in systems biology and computational neuroscience. However, the increasing scale and complexity of models and morphologies have exceeded the capacity of any serial implementation. This led to the development of parallel solutions that benefit from the boost in performance of modern supercomputers. In this paper, we describe an MPI-based, parallel operator-splitting implementation for stochastic spatial reaction-diffusion simulations with irregular tetrahedral meshes. The performance of our implementation is first examined and analyzed with simulations of a simple model. We then demonstrate its application to real-world research by simulating the reaction-diffusion components of a published calcium burst model in both Purkinje neuron sub-branch and full dendrite morphologies. Simulation results indicate that our implementation is capable of achieving super-linear speedup for balanced loading simulations with reasonable molecule density and mesh quality. In the best scenario, a parallel simulation with 2,000 processes runs more than 3,600 times faster than its serial SSA counterpart, and achieves more than 20-fold speedup relative to parallel simulation with 100 processes. In a more realistic scenario with dynamic calcium influx and data recording, the parallel simulation with 1,000 processes and no load balancing is still 500 times faster than the conventional serial SSA simulation.

  17. Stochastic analysis of multiphase flow in porous media: II. Numerical simulations

    NASA Astrophysics Data System (ADS)

    Abin, A.; Kalurachchi, J. J.; Kemblowski, M. W.; Chang, C.-M.

    1996-08-01

    The first paper (Chang et al., 1995b) of this two-part series described the stochastic analysis, using a spectral/perturbation approach, of steady-state two-phase (water and oil) flow in a liquid-unsaturated, three-fluid-phase porous medium. In this paper, the results of numerical simulations are compared with closed-form expressions obtained using the perturbation approach. We present the solution to the one-dimensional, steady-state oil and water flow equations. The stochastic input processes are the spatially correlated log k, where k is the intrinsic permeability, and the soil retention parameter, α. These solutions are subsequently used in the numerical simulations to estimate the statistical properties of the key output processes. The comparison between the results of the perturbation analysis and the numerical simulations showed good agreement between the two methods over a wide range of log k variability with three different combinations of the input stochastic processes log k and soil parameter α. The results clearly demonstrated the importance of considering the spatial variability of key subsurface properties under a variety of physical scenarios. The variability of both capillary pressure and saturation is affected by the type of input stochastic process used to represent the spatial variability. The results also demonstrated the applicability of perturbation theory in predicting the system variability and defining effective fluid properties through the ergodic assumption.
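
    A spatially correlated log k input of the kind used here can be illustrated with a one-dimensional sketch. The AR(1) recursion below is an assumed stand-in for the paper's actual generator: it produces a stationary Gaussian series whose neighbour correlation exp(-dx/λ) mimics an exponential covariance model with correlation length λ.

```python
import math
import random

def correlated_logk_field(n, mean, std, corr_len, dx, rng):
    """1-D spatially correlated log-permeability via an AR(1) recursion.

    Successive cells satisfy corr(log k_i, log k_{i+1}) = exp(-dx / corr_len),
    a discrete stand-in for an exponential covariance model (assumed form).
    """
    rho = math.exp(-dx / corr_len)
    field = [rng.gauss(0.0, 1.0)]                     # stationary start, unit variance
    for _ in range(n - 1):
        innovation = math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        field.append(rho * field[-1] + innovation)    # keeps variance at 1
    return [mean + std * z for z in field]            # rescale to target moments

rng = random.Random(11)
logk = correlated_logk_field(200, mean=-12.0, std=1.0, corr_len=5.0, dx=1.0, rng=rng)
```

    Many such realizations, fed one at a time into the flow solver, yield the output statistics that the closed-form perturbation expressions are checked against.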

  18. Multi-Algorithm Particle Simulations with Spatiocyte.

    PubMed

    Arjunan, Satya N V; Takahashi, Koichi

    2017-01-01

    As quantitative biologists get more measurements of spatially regulated systems such as cell division and polarization, simulation of reaction and diffusion of proteins using the data is becoming increasingly relevant to uncover the mechanisms underlying the systems. Spatiocyte is a lattice-based stochastic particle simulator for biochemical reaction and diffusion processes. Simulations can be performed at single-molecule and compartment spatial scales simultaneously. Molecules can diffuse and react in 1D (filament), 2D (membrane), and 3D (cytosol) compartments. The implications of crowded regions in the cell can be investigated because each diffusing molecule has spatial dimensions. Spatiocyte adopts multi-algorithm and multi-timescale frameworks to simulate models that simultaneously employ deterministic, stochastic, and particle reaction-diffusion algorithms. Comparison of light microscopy images to simulation snapshots is supported by Spatiocyte microscopy visualization and molecule tagging features. Spatiocyte is open-source software and is freely available at http://spatiocyte.org.

  19. Rupture Propagation for Stochastic Fault Models

    NASA Astrophysics Data System (ADS)

    Favreau, P.; Lavallee, D.; Archuleta, R.

    2003-12-01

    The inversion of strong motion data from large earthquakes gives the spatial distribution of pre-stress on the ruptured faults, and it can be partially reproduced by stochastic models, but a fundamental question remains: how does rupture propagate in the presence of spatial heterogeneity? For this purpose we investigate how the underlying random variables that control the pre-stress spatial variability condition the propagation of the rupture. Two stochastic models of pre-stress distributions are considered, based on Cauchy and Gaussian random variables, respectively. The parameters of the two stochastic models have values corresponding to the slip distribution of the 1979 Imperial Valley earthquake. We use a finite difference code to simulate the spontaneous propagation of shear rupture on a flat fault in a 3D elastic continuum. The friction law is slip-dependent. The simulations show that the propagation of the rupture front is more complex, incoherent, or snake-like for a pre-stress distribution based on Cauchy random variables. This may be related to the presence of a higher number of asperities in this case. These simulations suggest that directivity is stronger in the Cauchy scenario, compared with the smoother rupture of the Gauss scenario.

  20. Spatial patterns and biodiversity in off-lattice simulations of a cyclic three-species Lotka-Volterra model

    NASA Astrophysics Data System (ADS)

    Avelino, P. P.; Bazeia, D.; Losano, L.; Menezes, J.; de Oliveira, B. F.

    2018-02-01

    Stochastic simulations of cyclic three-species spatial predator-prey models are usually performed in square lattices with nearest-neighbour interactions starting from random initial conditions. In this letter we describe the results of off-lattice Lotka-Volterra stochastic simulations, showing that the emergence of spiral patterns does occur for sufficiently high values of the (conserved) total density of individuals. We also investigate the dynamics in our simulations, finding an empirical relation characterizing the dependence of the characteristic peak frequency and amplitude on the total density. Finally, we study the impact of the total density on the extinction probability, showing how a low population density may jeopardize biodiversity.
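
    For contrast with the off-lattice simulations studied here, the square-lattice setup the letter refers to is easy to sketch. The update rules and rates below are assumed for illustration (predation and reproduction attempted with equal probability on a periodic lattice), not taken from the paper:

```python
import random

def rps_lattice_step(grid, rng):
    """One Monte Carlo sweep of a cyclic three-species lattice model.

    Cells hold 0 (empty) or species 1, 2, 3; species s preys on s % 3 + 1
    (so 1 beats 2, 2 beats 3, 3 beats 1). Each update picks a random site
    and a random nearest neighbour, then applies predation (the prey cell
    is emptied) or reproduction into an empty neighbour.
    """
    n = len(grid)
    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        s = grid[i][j]
        if s == 0:
            continue
        di, dj = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        ni, nj = (i + di) % n, (j + dj) % n   # periodic boundary conditions
        target = grid[ni][nj]
        if target == s % 3 + 1:               # predation on the cyclic prey
            grid[ni][nj] = 0
        elif target == 0:                     # reproduction into empty space
            grid[ni][nj] = s

rng = random.Random(3)
n = 20
grid = [[rng.randrange(4) for _ in range(n)] for _ in range(n)]
for _ in range(10):
    rps_lattice_step(grid, rng)
```

    In the off-lattice variant of the letter, individuals instead carry continuous positions and interact within a fixed radius, which is what allows the total density to be varied as a control parameter.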

  1. Hybrid stochastic and deterministic simulations of calcium blips.

    PubMed

    Rüdiger, S; Shuai, J W; Huisinga, W; Nagaiah, C; Warnecke, G; Parker, I; Falcke, M

    2007-09-15

    Intracellular calcium release is a prime example of the role of stochastic effects in cellular systems. Recent models consist of deterministic reaction-diffusion equations coupled to stochastic transitions of calcium channels. The resulting dynamics spans multiple time and spatial scales, which complicates far-reaching computer simulations. In this article, we introduce a novel hybrid scheme that is especially tailored to accurately trace events with essential stochastic variations, while deterministic concentration variables are efficiently and accurately traced at the same time. We use finite elements to efficiently resolve the extreme spatial gradients of concentration variables close to a channel. We describe the algorithmic approach and demonstrate its efficiency compared with conventional methods. Our single-channel model matches experimental data and results in intriguing dynamics if calcium is used as the charge carrier. Random openings of the channel accumulate in bursts of calcium blips that may be central to the understanding of cellular calcium dynamics.

  2. The multinomial simulation algorithm for discrete stochastic simulation of reaction-diffusion systems.

    PubMed

    Lampoudi, Sotiria; Gillespie, Dan T; Petzold, Linda R

    2009-03-07

    The Inhomogeneous Stochastic Simulation Algorithm (ISSA) is a variant of the stochastic simulation algorithm in which the spatially inhomogeneous volume of the system is divided into homogeneous subvolumes, and the chemical reactions in those subvolumes are augmented by diffusive transfers of molecules between adjacent subvolumes. The ISSA can be prohibitively slow when the system is such that diffusive transfers occur much more frequently than chemical reactions. In this paper we present the Multinomial Simulation Algorithm (MSA), which is designed to, on the one hand, outperform the ISSA when diffusive transfer events outnumber reaction events, and on the other, to handle small reactant populations with greater accuracy than deterministic-stochastic hybrid algorithms. The MSA treats reactions in the usual ISSA fashion, but uses appropriately conditioned binomial random variables for representing the net numbers of molecules diffusing from any given subvolume to a neighbor within a prescribed distance. Simulation results illustrate the benefits of the algorithm.
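
    The multinomial bookkeeping at the heart of the MSA can be illustrated for the simplest case of a uniform split over neighbours. The sketch below is hypothetical (the MSA proper conditions its binomials on distance-dependent transfer probabilities): it distributes a given number of leaving molecules by sequential conditioned binomial draws, which is exactly a uniform multinomial sample.

```python
import random

def binomial_draw(n, p, rng):
    """Binomial(n, p) by direct Bernoulli summation (adequate for small n)."""
    return sum(rng.random() < p for _ in range(n))

def multinomial_transfers(n_leaving, n_neighbours, rng):
    """Split n_leaving molecules uniformly over n_neighbours subvolumes.

    Sequential conditioned binomials: given the counts already assigned,
    the count for the next neighbour is Binomial(remaining, 1 / k) where k
    neighbours are still unassigned; the last neighbour takes the rest.
    """
    counts, remaining = [], n_leaving
    for k in range(n_neighbours, 0, -1):
        c = remaining if k == 1 else binomial_draw(remaining, 1.0 / k, rng)
        counts.append(c)
        remaining -= c
    return counts

rng = random.Random(5)
out = multinomial_transfers(30, 4, rng)
print(sum(out))  # every leaver is assigned to exactly one neighbour: prints 30
```

    Sampling net transfers in bulk like this is what lets the MSA skip the individual hop events that make the ISSA prohibitively slow in diffusion-dominated regimes.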

  3. An accelerated algorithm for discrete stochastic simulation of reaction-diffusion systems using gradient-based diffusion and tau-leaping.

    PubMed

    Koh, Wonryull; Blackwell, Kim T

    2011-04-21

    Stochastic simulation of reaction-diffusion systems enables the investigation of stochastic events arising from the small numbers and heterogeneous distribution of molecular species in biological cells. Stochastic variations in intracellular microdomains and in diffusional gradients play a significant part in the spatiotemporal activity and behavior of cells. Although an exact stochastic simulation that simulates every individual reaction and diffusion event gives a most accurate trajectory of the system's state over time, it can be too slow for many practical applications. We present an accelerated algorithm for discrete stochastic simulation of reaction-diffusion systems designed to improve the speed of simulation by reducing the number of time-steps required to complete a simulation run. This method is unique in that it employs two strategies that have not been incorporated in existing spatial stochastic simulation algorithms. First, diffusive transfers between neighboring subvolumes are based on concentration gradients. This treatment necessitates sampling of only the net or observed diffusion events from higher to lower concentration gradients rather than sampling all diffusion events regardless of local concentration gradients. Second, we extend the non-negative Poisson tau-leaping method that was originally developed for speeding up nonspatial or homogeneous stochastic simulation algorithms. This method calculates each leap time in a unified step for both reaction and diffusion processes while satisfying the leap condition that the propensities do not change appreciably during the leap and ensuring that leaping does not cause molecular populations to become negative. Numerical results are presented that illustrate the improvement in simulation speed achieved by incorporating these two new strategies.

  4. A novel stochastic modeling method to simulate cooling loads in residential districts

    DOE PAGES

    An, Jingjing; Yan, Da; Hong, Tianzhen; ...

    2017-09-04

    District cooling systems are widely used in urban residential communities in China. Most such systems are oversized, which leads to wasted investment, low operational efficiency and, thus, waste of energy. The accurate prediction of district cooling loads that can support the right-sizing of cooling plant equipment remains a challenge. This study develops a novel stochastic modeling method that consists of (1) six prototype house models representing most apartments in a district, (2) occupant behavior models of residential buildings reflecting their spatial and temporal diversity as well as their complexity, based on a large-scale residential survey in China, and (3) a stochastic sampling process to represent all apartments and occupants in the district. The stochastic method was applied to a case study using the Designer's Simulation Toolkit (DeST) to simulate the cooling loads of a residential district in Wuhan, China. The simulation results agreed well with the measured data based on five performance metrics representing the aggregated cooling consumption, the peak cooling loads, the spatial load distribution, the temporal load distribution and the load profiles. Two prevalent simulation methods were also employed to simulate the district cooling loads. Here, the results showed that oversimplified assumptions about occupant behavior could lead to significant overestimation of the peak cooling load and the total cooling loads in the district. Future work will aim to simplify the workflow and data requirements of the stochastic method for its application, and to explore its use in predicting district heating loads and in commercial or mixed-use districts.

  5. A novel stochastic modeling method to simulate cooling loads in residential districts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    An, Jingjing; Yan, Da; Hong, Tianzhen

    District cooling systems are widely used in urban residential communities in China. Most such systems are oversized, which leads to wasted investment, low operational efficiency and, thus, waste of energy. The accurate prediction of district cooling loads that can support the right-sizing of cooling plant equipment remains a challenge. This study develops a novel stochastic modeling method that consists of (1) six prototype house models representing most apartments in a district, (2) occupant behavior models of residential buildings reflecting their spatial and temporal diversity as well as their complexity, based on a large-scale residential survey in China, and (3) a stochastic sampling process to represent all apartments and occupants in the district. The stochastic method was applied to a case study using the Designer's Simulation Toolkit (DeST) to simulate the cooling loads of a residential district in Wuhan, China. The simulation results agreed well with the measured data based on five performance metrics representing the aggregated cooling consumption, the peak cooling loads, the spatial load distribution, the temporal load distribution and the load profiles. Two prevalent simulation methods were also employed to simulate the district cooling loads. Here, the results showed that oversimplified assumptions about occupant behavior could lead to significant overestimation of the peak cooling load and the total cooling loads in the district. Future work will aim to simplify the workflow and data requirements of the stochastic method for its application, and to explore its use in predicting district heating loads and in commercial or mixed-use districts.

  6. Stochastical analysis of surfactant-enhanced remediation of denser-than-water nonaqueous phase liquid (DNAPL)-contaminated soils.

    PubMed

    Zhang, Renduo; Wood, A Lynn; Enfield, Carl G; Jeong, Seung-Woo

    2003-01-01

    Stochastical analysis was performed to assess the effect of soil spatial variability and heterogeneity on the recovery of denser-than-water nonaqueous phase liquids (DNAPL) during the process of surfactant-enhanced remediation. UTCHEM, a three-dimensional, multicomponent, multiphase, compositional model, was used to simulate water flow and chemical transport processes in heterogeneous soils. Soil spatial variability and heterogeneity were accounted for by considering the soil permeability as a spatial random variable and a geostatistical method was used to generate random distributions of the permeability. The randomly generated permeability fields were incorporated into UTCHEM to simulate DNAPL transport in heterogeneous media and stochastical analysis was conducted based on the simulated results. From the analysis, an exponential relationship between average DNAPL recovery and soil heterogeneity (defined as the standard deviation of log of permeability) was established with a coefficient of determination (r2) of 0.991, which indicated that DNAPL recovery decreased exponentially with increasing soil heterogeneity. Temporal and spatial distributions of relative saturations in the water phase, DNAPL, and microemulsion in heterogeneous soils were compared with those in homogeneous soils and related to soil heterogeneity. Cleanup time and uncertainty to determine DNAPL distributions in heterogeneous soils were also quantified. The study would provide useful information to design strategies for the characterization and remediation of nonaqueous phase liquid-contaminated soils with spatial variability and heterogeneity.
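    An exponential recovery-heterogeneity relationship of the kind reported above can be fit by ordinary least squares on log-transformed recoveries. A minimal sketch with hypothetical (σ of log permeability, recovered fraction) pairs, not the paper's data:

```python
import math

def fit_exponential(x, y):
    """Least-squares fit of y = a * exp(b * x) via linear regression on ln(y)."""
    ly = [math.log(v) for v in y]
    n = len(x)
    mx = sum(x) / n
    my = sum(ly) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, ly))
         / sum((xi - mx) ** 2 for xi in x))
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical (sigma_lnK, recovered-fraction) pairs, NOT the study's data.
sigma = [0.0, 0.5, 1.0, 1.5, 2.0]
recovery = [0.95, 0.74, 0.58, 0.45, 0.35]
a, b = fit_exponential(sigma, recovery)   # b < 0: recovery decays with heterogeneity
```

    A negative fitted exponent b reproduces the qualitative finding that DNAPL recovery decreases exponentially with increasing soil heterogeneity.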

  7. Ground motion simulation for the 23 August 2011, Mineral, Virginia earthquake using physics-based and stochastic broadband methods

    USGS Publications Warehouse

    Sun, Xiaodan; Hartzell, Stephen; Rezaeian, Sanaz

    2015-01-01

    Three broadband simulation methods are used to generate synthetic ground motions for the 2011 Mineral, Virginia, earthquake and compare with observed motions. The methods include a physics‐based model by Hartzell et al. (1999, 2005), a stochastic source‐based model by Boore (2009), and a stochastic site‐based model by Rezaeian and Der Kiureghian (2010, 2012). The ground‐motion dataset consists of 40 stations within 600 km of the epicenter. Several metrics are used to validate the simulations: (1) overall bias of response spectra and Fourier spectra (from 0.1 to 10 Hz); (2) spatial distribution of residuals for GMRotI50 peak ground acceleration (PGA), peak ground velocity, and pseudospectral acceleration (PSA) at various periods; (3) comparison with ground‐motion prediction equations (GMPEs) for the eastern United States. Our results show that (1) the physics‐based model provides satisfactory overall bias from 0.1 to 10 Hz and produces more realistic synthetic waveforms; (2) the stochastic site‐based model also yields more realistic synthetic waveforms and performs superiorly for frequencies greater than about 1 Hz; (3) the stochastic source‐based model has larger bias at lower frequencies (<0.5  Hz) and cannot reproduce the varying frequency content in the time domain. The spatial distribution of GMRotI50 residuals shows that there is no obvious pattern with distance in the simulation bias, but there is some azimuthal variability. The comparison between synthetics and GMPEs shows similar fall‐off with distance for all three models, comparable PGA and PSA amplitudes for the physics‐based and stochastic site‐based models, and systematic lower amplitudes for the stochastic source‐based model at lower frequencies (<0.5  Hz).

  8. A framework for discrete stochastic simulation on 3D moving boundary domains

    DOE PAGES

    Drawert, Brian; Hellander, Stefan; Trogdon, Michael; ...

    2016-11-14

    We have developed a method for modeling spatial stochastic biochemical reactions in complex, three-dimensional, and time-dependent domains using the reaction-diffusion master equation formalism. In particular, we look to address the fully coupled problems that arise in systems biology where the shape and mechanical properties of a cell are determined by the state of the biochemistry and vice versa. To validate our method and characterize the error involved, we compare our results for a carefully constructed test problem to those of a microscale implementation. Finally, we demonstrate the effectiveness of our method by simulating a model of polarization and shmoo formation during the mating of yeast. The method is generally applicable to problems in systems biology where biochemistry and mechanics are coupled, and spatial stochastic effects are critical.

  9. Python-based geometry preparation and simulation visualization toolkits for STEPS

    PubMed Central

    Chen, Weiliang; De Schutter, Erik

    2014-01-01

    STEPS is a stochastic reaction-diffusion simulation engine that implements a spatial extension of Gillespie's Stochastic Simulation Algorithm (SSA) in complex tetrahedral geometries. An extensive Python-based interface is provided to STEPS so that it can interact with the large number of scientific packages in Python. However, a gap existed between the interfaces of these packages and the STEPS user interface, where supporting toolkits could reduce the amount of scripting required for research projects. This paper introduces two new supporting toolkits that support geometry preparation and visualization for STEPS simulations. PMID:24782754
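    STEPS builds on Gillespie's SSA. For orientation, a minimal well-mixed (non-spatial) SSA for a single bimolecular reaction might look like the following sketch; the reaction A + B → C and its rate are illustrative and not part of the STEPS API:

```python
import random

def gillespie_ssa(counts, rate, t_end, seed=0):
    """Exact SSA for the single bimolecular reaction A + B -> C.

    counts: initial copy numbers {'A': ..., 'B': ..., 'C': ...}
    rate:   stochastic rate constant (per A-B pair per unit time)
    """
    rng = random.Random(seed)
    a, b, c = counts['A'], counts['B'], counts['C']
    t = 0.0
    while True:
        propensity = rate * a * b          # a(x) = c * #A * #B for A + B -> C
        if propensity == 0.0:
            break                          # no reactant pairs left
        t += rng.expovariate(propensity)   # exponential waiting time to next event
        if t > t_end:
            break
        a, b, c = a - 1, b - 1, c + 1      # fire the reaction once
    return {'A': a, 'B': b, 'C': c}

final = gillespie_ssa({'A': 100, 'B': 80, 'C': 0}, rate=0.05, t_end=10.0)
```

    A spatial SSA such as the one in STEPS runs essentially this loop per tetrahedral subvolume, with diffusive jumps between subvolumes treated as additional events.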

  10. Regional-specific Stochastic Simulation of Spatially-distributed Ground-motion Time Histories using Wavelet Packet Analysis

    NASA Astrophysics Data System (ADS)

    Huang, D.; Wang, G.

    2014-12-01

    Stochastic simulation of spatially distributed ground-motion time histories is important for performance-based earthquake design of geographically distributed systems. In this study, we develop a novel technique to stochastically simulate regionalized ground-motion time histories using wavelet packet analysis. First, a transient acceleration time history is characterized by the wavelet-packet parameters proposed by Yamamoto and Baker (2013). The wavelet-packet parameters fully characterize ground-motion time histories in terms of energy content, time-frequency-domain characteristics and time-frequency nonstationarity. This study further investigates the spatial cross-correlations of wavelet-packet parameters based on geostatistical analysis of 1500 regionalized ground-motion records from eight well-recorded earthquakes in California, Mexico, Japan and Taiwan. The linear model of coregionalization (LMC) is used to develop a permissible spatial cross-correlation model for each parameter group. The geostatistical analysis of ground-motion data from different regions reveals significant dependence of the LMC structure on regional site conditions, which can be characterized by the correlation range of Vs30 in each region. In general, the spatial correlation and cross-correlation of wavelet-packet parameters are stronger if the site condition is more homogeneous. Using the region-specific spatial cross-correlation model and the cokriging technique, wavelet-packet parameters at unmeasured locations can be best estimated, and regionalized ground-motion time histories can be synthesized. Case studies and blind tests demonstrate that the simulated ground motions generally agree well with the actual recorded data if the influence of regional site conditions is considered. The developed method has great potential for use in computation-based seismic analysis and loss estimation on a regional scale.

  11. Stochastic simulation of biological reactions, and its applications for studying actin polymerization.

    PubMed

    Ichikawa, Kazuhisa; Suzuki, Takashi; Murata, Noboru

    2010-11-30

    Molecular events in biological cells occur in local subregions, where the molecules tend to be small in number. The cytoskeleton, which is important for both the structural changes of cells and their functions, is also a countable entity because of its long fibrous shape. To simulate the local environment using a computer, stochastic simulations should be run. We herein report a new method of stochastic simulation based on random walk and reaction by the collision of all molecules. The microscopic reaction rate P(r) is calculated from the macroscopic rate constant k. The formula involves only local parameters embedded for each molecule. The results of the stochastic simulations of simple second-order, polymerization, Michaelis-Menten-type and other reactions agreed quite well with those of deterministic simulations when the number of molecules was sufficiently large. An analysis of the theory indicated a relationship between variance and the number of molecules in the system, and results of multiple stochastic simulation runs confirmed this relationship. We simulated Ca²⁺ dynamics in a cell by inward flow from a point on the cell surface and the polymerization of G-actin forming F-actin. Our results showed that this theory and method can be used to simulate spatially inhomogeneous events.
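    The variance-versus-molecule-number relationship noted above reflects the familiar 1/√N scaling of relative fluctuations. A toy Monte Carlo check, using independent reaction events with a hypothetical per-molecule probability p rather than the authors' collision-based scheme:

```python
import random
import statistics

def relative_fluctuation(n_molecules, runs=400, p=0.3, seed=0):
    """Coefficient of variation of the number of 'reacted' molecules when each
    of N molecules reacts independently with probability p in one step
    (a toy stand-in, not the authors' collision-based method)."""
    rng = random.Random(seed)
    counts = [sum(rng.random() < p for _ in range(n_molecules))
              for _ in range(runs)]
    return statistics.stdev(counts) / statistics.mean(counts)

cv_small = relative_fluctuation(50)     # few molecules: large relative noise
cv_large = relative_fluctuation(5000)   # many molecules: noise shrinks ~ 1/sqrt(N)
```

    Increasing N by a factor of 100 should cut the coefficient of variation roughly tenfold, which is why stochastic treatment matters most at low copy numbers.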

  12. InterSpread Plus: a spatial and stochastic simulation model of disease in animal populations.

    PubMed

    Stevenson, M A; Sanson, R L; Stern, M W; O'Leary, B D; Sujau, M; Moles-Benfell, N; Morris, R S

    2013-04-01

    We describe the spatially explicit, stochastic simulation model of disease spread, InterSpread Plus, in terms of its epidemiological framework, operation, and mode of use. The input data required by the model, the method for simulating contact and infection spread, and methods for simulating disease control measures are described. Data and parameters that are essential for disease simulation modelling using InterSpread Plus are distinguished from those that are non-essential, and it is suggested that a rational approach to simulating disease epidemics using this tool is to start with core data and parameters, adding additional layers of complexity if and when the specific requirements of the simulation exercise require it. We recommend that simulation models of disease are best developed as part of epidemic contingency planning so decision makers are familiar with model outputs and assumptions and are well-positioned to evaluate their strengths and weaknesses to make informed decisions in times of crisis. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. Acceleration of discrete stochastic biochemical simulation using GPGPU.

    PubMed

    Sumiyoshi, Kei; Hirata, Kazuki; Hiroi, Noriko; Funahashi, Akira

    2015-01-01

    For systems made up of a small number of molecules, such as a biochemical network in a single cell, a simulation requires a stochastic approach, instead of a deterministic approach. The stochastic simulation algorithm (SSA) simulates the stochastic behavior of a spatially homogeneous system. Since stochastic approaches produce different results each time they are used, multiple runs are required in order to obtain statistical results; this results in a large computational cost. We have implemented a parallel method for using SSA to simulate a stochastic model; the method uses a graphics processing unit (GPU), which enables multiple realizations at the same time, and thus reduces the computational time and cost. During the simulation, for the purpose of analysis, each time course is recorded at each time step. A straightforward implementation of this method on a GPU is about 16 times faster than a sequential simulation on a CPU with hybrid parallelization; each of the multiple simulations is run simultaneously, and the computational tasks within each simulation are parallelized. We also implemented an improvement to the memory access and reduced the memory footprint, in order to optimize the computations on the GPU. We also implemented an asynchronous data transfer scheme to accelerate the time course recording function. To analyze the acceleration of our implementation on various sizes of model, we performed SSA simulations on different model sizes and compared these computation times to those for sequential simulations with a CPU. When used with the improved time course recording function, our method was shown to accelerate the SSA simulation by a factor of up to 130.
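    The core idea of this approach, running many independent SSA realizations and aggregating their statistics, can be sketched serially; since realizations share no state, each could equally run on its own GPU thread. A CPU sketch for an illustrative first-order decay reaction (not the authors' GPU code):

```python
import math
import random

def ssa_decay(n0, c, t_end, seed):
    """One exact SSA realization of first-order decay A -> 0 (rate constant c)."""
    rng = random.Random(seed)
    n, t = n0, 0.0
    while n > 0:
        t += rng.expovariate(c * n)   # waiting time to the next decay event
        if t > t_end:
            break
        n -= 1
    return n

def ensemble(n0, c, t_end, runs):
    """Independent realizations aggregated into mean and standard deviation.
    Each realization is stateless w.r.t. the others, which is what makes the
    one-thread-per-realization GPU mapping natural; here we just loop."""
    samples = [ssa_decay(n0, c, t_end, seed) for seed in range(runs)]
    mean = sum(samples) / runs
    var = sum((s - mean) ** 2 for s in samples) / (runs - 1)
    return mean, math.sqrt(var)

mean, std = ensemble(n0=500, c=0.1, t_end=5.0, runs=200)
```

    For this decay reaction the ensemble mean should track the analytical value n0·exp(-c·t), while the spread across realizations is exactly the statistical information that multiple runs are needed to recover.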

  14. Acceleration of discrete stochastic biochemical simulation using GPGPU

    PubMed Central

    Sumiyoshi, Kei; Hirata, Kazuki; Hiroi, Noriko; Funahashi, Akira

    2015-01-01

    For systems made up of a small number of molecules, such as a biochemical network in a single cell, a simulation requires a stochastic approach, instead of a deterministic approach. The stochastic simulation algorithm (SSA) simulates the stochastic behavior of a spatially homogeneous system. Since stochastic approaches produce different results each time they are used, multiple runs are required in order to obtain statistical results; this results in a large computational cost. We have implemented a parallel method for using SSA to simulate a stochastic model; the method uses a graphics processing unit (GPU), which enables multiple realizations at the same time, and thus reduces the computational time and cost. During the simulation, for the purpose of analysis, each time course is recorded at each time step. A straightforward implementation of this method on a GPU is about 16 times faster than a sequential simulation on a CPU with hybrid parallelization; each of the multiple simulations is run simultaneously, and the computational tasks within each simulation are parallelized. We also implemented an improvement to the memory access and reduced the memory footprint, in order to optimize the computations on the GPU. We also implemented an asynchronous data transfer scheme to accelerate the time course recording function. To analyze the acceleration of our implementation on various sizes of model, we performed SSA simulations on different model sizes and compared these computation times to those for sequential simulations with a CPU. When used with the improved time course recording function, our method was shown to accelerate the SSA simulation by a factor of up to 130. PMID:25762936

  15. Improved estimation of hydraulic conductivity by combining stochastically simulated hydrofacies with geophysical data.

    PubMed

    Zhu, Lin; Gong, Huili; Chen, Yun; Li, Xiaojuan; Chang, Xiang; Cui, Yijiao

    2016-03-01

    Hydraulic conductivity is a major parameter affecting the output accuracy of groundwater flow and transport models. The most commonly used semi-empirical formula for estimating conductivity is the Kozeny-Carman equation. However, this method alone does not work well with heterogeneous strata. Two important parameters, grain size and porosity, often show spatial variations at different scales. This study proposes a method for estimating conductivity distributions by combining a stochastic hydrofacies model with geophysical methods. The Markov chain model with a transition probability matrix was adopted to reconstruct the structures of hydrofacies for deriving spatial deposit information. The geophysical and hydro-chemical data were used to estimate the porosity distribution through Archie's law. Results show that the stochastically simulated hydrofacies model reflects the sedimentary features with an average model accuracy of 78% in comparison with borehole log data in the Chaobai alluvial fan. The estimated conductivity is reasonable and of the same order of magnitude as the outcomes of the pumping tests. The conductivity distribution is consistent with the sedimentary distributions. This study provides more reliable spatial distributions of the hydraulic parameters for further numerical modeling.
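    The two relations named in the abstract can be sketched directly. The Archie parameters (a, m) and the lumped water-property factor below are common textbook defaults, not values from the study:

```python
def archie_porosity(rho_bulk, rho_water, a=1.0, m=2.0):
    """Invert Archie's law, F = rho_bulk/rho_water = a * phi**(-m),
    to estimate porosity from bulk and pore-water resistivities."""
    formation_factor = rho_bulk / rho_water
    return (a / formation_factor) ** (1.0 / m)

def kozeny_carman_K(d_grain_m, phi, rho_g_over_mu=9.79e6):
    """Kozeny-Carman hydraulic conductivity K (m/s) from grain diameter (m)
    and porosity; rho*g/mu ~ 9.79e6 1/(m*s) lumps water properties at ~20 C."""
    k = d_grain_m ** 2 * phi ** 3 / (180.0 * (1.0 - phi) ** 2)   # permeability, m^2
    return rho_g_over_mu * k

phi = archie_porosity(rho_bulk=50.0, rho_water=0.5)   # F = 100 -> phi = 0.1
K = kozeny_carman_K(d_grain_m=2e-4, phi=0.3)          # fine sand, order 1e-4 m/s
```

    Feeding porosities estimated this way into Kozeny-Carman cell by cell, over a stochastically simulated hydrofacies field, is the essence of the combined workflow the abstract describes.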

  16. An agent-based stochastic Occupancy Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yixing; Hong, Tianzhen; Luo, Xuan

    Occupancy has significant impacts on building performance. However, in current building performance simulation programs, occupancy inputs are static and lack diversity, contributing to discrepancies between the simulated and actual building performance. This work presents an Occupancy Simulator that simulates the stochastic behavior of occupant presence and movement in buildings, capturing the spatial and temporal occupancy diversity. Each occupant and each space in the building are explicitly simulated as an agent with their profiles of stochastic behaviors. The occupancy behaviors are represented with three types of models: (1) the status transition events (e.g., first arrival in office) simulated with a probability distribution model, (2) the random moving events (e.g., from one office to another) simulated with a homogeneous Markov chain model, and (3) the meeting events simulated with a new stochastic model. A hierarchical data model was developed for the Occupancy Simulator, which reduces the amount of data input by using the concepts of occupant types and space types. Finally, a case study of a small office building is presented to demonstrate the use of the Simulator to generate detailed annual sub-hourly occupant schedules for individual spaces and the whole building. The Simulator is a web application freely available to the public and capable of performing a detailed stochastic simulation of occupant presence and movement in buildings. Future work includes enhancements in the meeting event model, consideration of personal absent days, verification and validation of the simulated occupancy results, and expansion for use with residential buildings.
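    The homogeneous Markov-chain movement model (type 2 above) can be sketched as a transition-matrix walk over spaces. The three-space matrix below is hypothetical, not the Occupancy Simulator's calibrated values:

```python
import random

def simulate_movement(transition, start, steps, seed=0):
    """Homogeneous Markov-chain walk over spaces: at each time step the next
    space is drawn from the current space's transition probabilities."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(steps):
        spaces, probs = zip(*transition[state].items())
        state = rng.choices(spaces, weights=probs)[0]
        path.append(state)
    return path

# Hypothetical three-space transition matrix (15-minute steps).
P = {
    'office':   {'office': 0.90, 'meeting': 0.05, 'corridor': 0.05},
    'meeting':  {'meeting': 0.80, 'office': 0.15, 'corridor': 0.05},
    'corridor': {'office': 0.70, 'meeting': 0.10, 'corridor': 0.20},
}
path = simulate_movement(P, 'office', steps=96)   # one working day of 15-min steps
```

    Running one such walk per occupant agent, with occupant-type-specific matrices, yields the diverse sub-hourly space schedules the abstract describes.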

  17. An agent-based stochastic Occupancy Simulator

    DOE PAGES

    Chen, Yixing; Hong, Tianzhen; Luo, Xuan

    2017-06-01

    Occupancy has significant impacts on building performance. However, in current building performance simulation programs, occupancy inputs are static and lack diversity, contributing to discrepancies between the simulated and actual building performance. This work presents an Occupancy Simulator that simulates the stochastic behavior of occupant presence and movement in buildings, capturing the spatial and temporal occupancy diversity. Each occupant and each space in the building are explicitly simulated as an agent with their profiles of stochastic behaviors. The occupancy behaviors are represented with three types of models: (1) the status transition events (e.g., first arrival in office) simulated with a probability distribution model, (2) the random moving events (e.g., from one office to another) simulated with a homogeneous Markov chain model, and (3) the meeting events simulated with a new stochastic model. A hierarchical data model was developed for the Occupancy Simulator, which reduces the amount of data input by using the concepts of occupant types and space types. Finally, a case study of a small office building is presented to demonstrate the use of the Simulator to generate detailed annual sub-hourly occupant schedules for individual spaces and the whole building. The Simulator is a web application freely available to the public and capable of performing a detailed stochastic simulation of occupant presence and movement in buildings. Future work includes enhancements in the meeting event model, consideration of personal absent days, verification and validation of the simulated occupancy results, and expansion for use with residential buildings.

  18. Stochastic simulation of the spray formation assisted by a high pressure

    NASA Astrophysics Data System (ADS)

    Gorokhovski, M.; Chtab-Desportes, A.; Voloshina, I.; Askarova, A.

    2010-03-01

    The stochastic model of spray formation in the vicinity of the injector and in the far field has been described and assessed by comparison with measurements under Diesel-like conditions. In the proposed mesh-free approach, the 3D configuration of the continuous liquid core is simulated stochastically by an ensemble of spatial trajectories of specifically introduced stochastic particles. The parameters of the stochastic process are presumed from the physics of primary atomization. The spray formation model consists of computing the spatial distribution of the probability of finding the non-fragmented liquid jet in the near-injector region. This model is combined with the KIVA II computation of an atomizing Diesel spray in two ways. First, simultaneously with the gas-phase RANS computation, the ensemble of stochastic particles is tracked and the probability field of their positions is calculated, which is used for sampling the initial locations of primary blobs. Second, the velocity increment of the gas due to the liquid injection is computed from the mean volume fraction of the simulated liquid core. Two novelties are proposed in the secondary atomization modeling. The first is due to unsteadiness of the injection velocity: when the injection velocity increment in time is decreasing, supplementary breakup may be induced, so the critical Weber number is based on this increment. Second, a new stochastic model of the secondary atomization is proposed, in which intermittent turbulent stretching is taken into account as the main mechanism. The measurements reported by Arcoumanis et al. (time histories of the mean axial centre-line droplet velocity and of the centre-line Sauter mean diameter) are compared with computations.

  19. Compartmental and Spatial Rule-Based Modeling with Virtual Cell.

    PubMed

    Blinov, Michael L; Schaff, James C; Vasilescu, Dan; Moraru, Ion I; Bloom, Judy E; Loew, Leslie M

    2017-10-03

    In rule-based modeling, molecular interactions are systematically specified in the form of reaction rules that serve as generators of reactions. This provides a way to account for all the potential molecular complexes and interactions among multivalent or multistate molecules. Recently, we introduced rule-based modeling into the Virtual Cell (VCell) modeling framework, permitting graphical specification of rules and merger of networks generated automatically (using the BioNetGen modeling engine) with hand-specified reaction networks. VCell provides a number of ordinary differential equation and stochastic numerical solvers for single-compartment simulations of the kinetic systems derived from these networks, and agent-based network-free simulation of the rules. In this work, compartmental and spatial modeling of rule-based models has been implemented within VCell. To enable rule-based deterministic and stochastic spatial simulations and network-free agent-based compartmental simulations, the BioNetGen and NFSim engines were each modified to support compartments. In the new rule-based formalism, every reactant and product pattern and every reaction rule are assigned locations. We also introduce the rule-based concept of molecular anchors. This assures that any species that has a molecule anchored to a predefined compartment will remain in this compartment. Importantly, in addition to formulation of compartmental models, this now permits VCell users to seamlessly connect reaction networks derived from rules to explicit geometries to automatically generate a system of reaction-diffusion equations. These may then be simulated using either the VCell partial differential equations deterministic solvers or the Smoldyn stochastic simulator. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  20. The role of stochastic storms on hillslope runoff generation and connectivity in a dryland basin

    NASA Astrophysics Data System (ADS)

    Michaelides, K.; Singer, M. B.; Mudd, S. M.

    2016-12-01

    Despite low annual rainfall, dryland basins can generate significant surface runoff during certain rainstorms, which can cause flash flooding and high rates of erosion. However, it remains challenging to anticipate the nature and frequency of runoff generation in hydrological systems which are driven by spatially and temporally stochastic rainstorms. In particular, the stochasticity of rainfall presents challenges to simulating the hydrological response of dryland basins and understanding flow connectivity from hillslopes to the channel. Here we simulate hillslope runoff generation using rainfall characteristics produced by a simple stochastic rainfall generator, which is based on a rich rainfall dataset from the Walnut Gulch Experimental Watershed (WGEW) in Arizona, USA. We assess hillslope runoff generation using the hydrological model, COUP2D, driven by a subset of characteristic output from multiple ensembles of decadal monsoonal rainfall from the stochastic rainfall generator. The rainfall generator operates across WGEW by simulating storms with areas smaller than the basin and enables explicit characterization of rainfall characteristics at any location. We combine the characteristics of rainfall intensity and duration with data on rainstorm area and location to model the surface runoff properties (depth, velocity, duration, distance downslope) on a range of hillslopes within the basin derived from LiDAR analysis. We also analyze connectivity of flow from hillslopes to the channel for various combinations of hillslopes and storms. This approach provides a framework for understanding spatial and temporal dynamics of runoff generation and connectivity that is faithful to the hydrological characteristics of dryland environments.
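    A minimal stand-in for a stochastic storm generator of the kind described, with Poisson storm arrivals and exponentially distributed depths, might look like this; the rate and depth parameters are illustrative, not WGEW-calibrated:

```python
import random

def storm_series(rate_per_day, mean_depth_mm, days, seed=0):
    """Marked Poisson process: storm arrivals with exponential inter-arrival
    times and exponentially distributed storm depths (a minimal stand-in for
    a calibrated monsoonal rainfall generator)."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_per_day)   # days until the next storm
        if t > days:
            break
        depth = rng.expovariate(1.0 / mean_depth_mm)
        events.append((t, depth))
    return events

storms = storm_series(rate_per_day=0.2, mean_depth_mm=8.0, days=90.0)
```

    A full generator of the kind the abstract describes would additionally sample storm intensity, duration, area, and location, but the marked-point-process backbone is the same.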

  1. Coupled stochastic spatial and non-spatial simulations of ErbB1 signaling pathways demonstrate the importance of spatial organization in signal transduction.

    PubMed

    Costa, Michelle N; Radhakrishnan, Krishnan; Wilson, Bridget S; Vlachos, Dionisios G; Edwards, Jeremy S

    2009-07-23

    The ErbB family of receptors activates intracellular signaling pathways that control cellular proliferation, growth, differentiation and apoptosis. Given these central roles, it is not surprising that overexpression of the ErbB receptors is often associated with carcinogenesis. Therefore, extensive laboratory studies have been devoted to understanding the signaling events associated with ErbB activation. Systems biology has contributed significantly to our current understanding of ErbB signaling networks. However, although computational models have grown in complexity over the years, little work has been done to consider the spatial-temporal dynamics of receptor interactions and to evaluate how spatial organization of membrane receptors influences signaling transduction. Herein, we explore the impact of spatial organization of the epidermal growth factor receptor (ErbB1/EGFR) on the initiation of downstream signaling. We describe the development of an algorithm that couples a spatial stochastic model of membrane receptors with a nonspatial stochastic model of the reactions and interactions in the cytosol. This novel algorithm provides a computationally efficient method to evaluate the effects of spatial heterogeneity on the coupling of receptors to cytosolic signaling partners. Mathematical models of signal transduction rarely consider the contributions of spatial organization due to high computational costs. A hybrid stochastic approach simplifies analyses of the spatio-temporal aspects of cell signaling and, as an example, demonstrates that receptor clustering contributes significantly to the efficiency of signal propagation from ligand-engaged growth factor receptors.

  2. An advanced stochastic weather generator for simulating 2-D high-resolution climate variables

    NASA Astrophysics Data System (ADS)

    Peleg, Nadav; Fatichi, Simone; Paschalis, Athanasios; Molnar, Peter; Burlando, Paolo

    2017-07-01

    A new stochastic weather generator, the Advanced WEather GENerator for a two-dimensional grid (AWE-GEN-2d), is presented. The model combines physical and stochastic approaches to simulate key meteorological variables at high spatial and temporal resolution: 2 km × 2 km and 5 min for precipitation and cloud cover and 100 m × 100 m and 1 h for near-surface air temperature, solar radiation, vapor pressure, atmospheric pressure, and near-surface wind. The model requires spatially distributed data for the calibration process, which can nowadays be obtained from remote sensing devices (weather radar and satellites), reanalysis data sets and ground stations. AWE-GEN-2d is parsimonious in terms of computational demand and therefore is particularly suitable for studies where exploring internal climatic variability at multiple spatial and temporal scales is fundamental. Applications of the model include models of environmental systems, such as hydrological and geomorphological models, where high-resolution spatial and temporal meteorological forcing is crucial. The weather generator was calibrated and validated for the Engelberg region, an area with complex topography in the Swiss Alps. Model tests show that the climate variables are generated by AWE-GEN-2d with a level of accuracy that is sufficient for many practical applications.

  3. ISIM3D: AN ANSI-C THREE-DIMENSIONAL MULTIPLE INDICATOR CONDITIONAL SIMULATION PROGRAM

    EPA Science Inventory

    The indicator conditional simulation technique provides stochastic simulations of a variable that (i) honor the initial data and (ii) can feature a richer family of spatial structures not limited by Gaussianity. The data are encoded into a series of indicators which are then used ...
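    The indicator encoding mentioned in the abstract replaces each datum with a vector of threshold indicators; a minimal sketch (the thresholds are illustrative):

```python
def indicator_transform(values, thresholds):
    """Encode each datum z as the indicator vector [I(z <= z_k)] over a set
    of thresholds z_k, the first step of indicator conditional simulation."""
    return [[1 if v <= z else 0 for z in thresholds] for v in values]

codes = indicator_transform([0.2, 1.5, 3.7], thresholds=[0.5, 1.0, 2.0, 4.0])
# -> [[1, 1, 1, 1], [0, 0, 1, 1], [0, 0, 0, 1]]
```

    Variograms are then modeled per indicator, which is what frees the simulated fields from the Gaussian assumption.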

  4. An adaptive tau-leaping method for stochastic simulations of reaction-diffusion systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Padgett, Jill M. A.; Ilie, Silvana, E-mail: silvana@ryerson.ca

    2016-03-15

    Stochastic modelling is critical for studying many biochemical processes in a cell, in particular when some reacting species have low population numbers. For many such cellular processes the spatial distribution of the molecular species plays a key role. The evolution of spatially heterogeneous biochemical systems with some species in low amounts is accurately described by the mesoscopic model of the Reaction-Diffusion Master Equation. The Inhomogeneous Stochastic Simulation Algorithm provides an exact strategy to numerically solve this model, but it is computationally very expensive on realistic applications. We propose a novel adaptive time-stepping scheme for the tau-leaping method for approximating the solution of the Reaction-Diffusion Master Equation. This technique combines effective strategies for variable time-stepping with path preservation to reduce the computational cost, while maintaining the desired accuracy. The numerical tests on various examples arising in applications show the improved efficiency achieved by the new adaptive method.
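    Tau-leaping freezes the propensities over a leap of length tau and draws a Poisson number of firings per reaction channel. A fixed-step (non-adaptive, non-spatial) sketch for a single decay reaction illustrates the core step the adaptive scheme builds on:

```python
import math
import random

def poisson(rng, lam):
    """Knuth's Poisson sampler (adequate for the small means used here)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def tau_leap_decay(a0, c, t_end, tau, seed=0):
    """Fixed-step tau-leaping for A -> 0: the propensity c*A is frozen over
    each leap of length tau and a Poisson number of firings is applied."""
    rng = random.Random(seed)
    a, t = a0, 0.0
    while t < t_end and a > 0:
        k = poisson(rng, c * a * tau)
        a = max(a - k, 0)   # clamp so the copy number never goes negative
        t += tau
    return a

a_final = tau_leap_decay(a0=1000, c=0.1, t_end=5.0, tau=0.01)
```

    An adaptive scheme like the one proposed chooses tau at each step from the current propensities (and, in the spatial setting, from the diffusion rates between voxels), rather than fixing it in advance.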

  5. Mesoscopic-microscopic spatial stochastic simulation with automatic system partitioning.

    PubMed

    Hellander, Stefan; Hellander, Andreas; Petzold, Linda

    2017-12-21

    The reaction-diffusion master equation (RDME) is a model that allows for efficient on-lattice simulation of spatially resolved stochastic chemical kinetics. Compared to off-lattice hard-sphere simulations with Brownian dynamics or Green's function reaction dynamics, the RDME can be orders of magnitude faster if the lattice spacing can be chosen coarse enough. However, strongly diffusion-controlled reactions mandate a very fine mesh resolution for acceptable accuracy. It is common that reactions in the same model differ in their degree of diffusion control and therefore require different degrees of mesh resolution. This renders mesoscopic simulation inefficient for systems with multiscale properties. Mesoscopic-microscopic hybrid methods address this problem by resolving the most challenging reactions with a microscale, off-lattice simulation. However, all methods to date require manual partitioning of a system, effectively limiting their usefulness as "black-box" simulation codes. In this paper, we propose a hybrid simulation algorithm with automatic system partitioning based on indirect a priori error estimates. We demonstrate the accuracy and efficiency of the method on models of diffusion-controlled networks in 3D.

  6. Improved estimation of hydraulic conductivity by combining stochastically simulated hydrofacies with geophysical data

    PubMed Central

    Zhu, Lin; Gong, Huili; Chen, Yun; Li, Xiaojuan; Chang, Xiang; Cui, Yijiao

    2016-01-01

    Hydraulic conductivity is a major parameter affecting the output accuracy of groundwater flow and transport models. The most commonly used semi-empirical formula for estimating conductivity is the Kozeny-Carman equation. However, this method alone does not work well with heterogeneous strata, because two key inputs, grain size and porosity, often vary spatially at different scales. This study proposes a method for estimating conductivity distributions by combining a stochastic hydrofacies model with geophysical methods. A Markov chain model with a transition probability matrix was adopted to reconstruct hydrofacies structures and derive spatial deposit information. Geophysical and hydro-chemical data were used to estimate the porosity distribution through Archie's law. Results show that the stochastic simulated hydrofacies model reflects the sedimentary features of the Chaobai alluvial fan, with an average model accuracy of 78% in comparison with borehole log data. The estimated conductivity is reasonable and of the same order of magnitude as the outcomes of the pumping tests, and its distribution is consistent with the sedimentary distributions. This study provides more reliable spatial distributions of the hydraulic parameters for further numerical modeling. PMID:26927886
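
For reference, Archie's law relates formation resistivity Rt and brine resistivity Rw to porosity φ through F = Rt/Rw = a·φ^(−m); inverting it gives porosity estimates of the kind used above. A minimal sketch, where the default `a` and `m` are generic clean-sand values, not the parameters calibrated in this study:

```python
def archie_porosity(rt, rw, a=1.0, m=2.0):
    """Invert Archie's law F = rt/rw = a * phi**(-m) for porosity phi.
    a (tortuosity factor) and m (cementation exponent) are lithology
    dependent; the defaults are common clean-sand values."""
    if rt <= 0.0 or rw <= 0.0:
        raise ValueError("resistivities must be positive")
    return (a * rw / rt) ** (1.0 / m)

# with a = 1, m = 2: phi = sqrt(0.1 / 10.0)
print(archie_porosity(10.0, 0.1))  # ≈ 0.1
```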

  7. Modeling precipitation-runoff relationships to determine water yield from a ponderosa pine forest watershed

    Treesearch

    Assefa S. Desta

    2006-01-01

    Stochastic precipitation-runoff modeling is used to estimate cold- and warm-season water yield from a ponderosa pine forested watershed in north-central Arizona. The model consists of two parts: simulation of the temporal and spatial distribution of precipitation using a stochastic, event-based approach, and estimation of water yield from the watershed...

  8. Stochastic simulation of spatially correlated geo-processes

    USGS Publications Warehouse

    Christakos, G.

    1987-01-01

    In this study, developments in the theory of stochastic simulation are discussed. The unifying element is the notion of Radon projection in Euclidean spaces. This notion provides a natural way of reconstructing the real process from a corresponding process observable on a reduced dimensionality space, where analysis is theoretically easier and computationally tractable. Within this framework, the concept of space transformation is defined and several of its properties, which are of significant importance within the context of spatially correlated processes, are explored. The turning bands operator is shown to follow from this. This strengthens considerably the theoretical background of the geostatistical method of simulation, and some new results are obtained in both the space and frequency domains. The inverse problem is solved generally and the applicability of the method is extended to anisotropic as well as integrated processes. Some ill-posed problems of the inverse operator are discussed. Effects of the measurement error and impulses at origin are examined. Important features of the simulated process as described by geomechanical laws, the morphology of the deposit, etc., may be incorporated in the analysis. The simulation may become a model-dependent procedure and this, in turn, may provide numerical solutions to spatial-temporal geologic models. Because the spatial simulation may be technically reduced to unidimensional simulations, various techniques of generating one-dimensional realizations are reviewed. To link theory and practice, an example is computed in detail. © 1987 International Association for Mathematical Geology.
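
The last point, that spatial simulation reduces to one-dimensional simulations, is commonly exploited with the spectral (random-phase cosine) method for generating the 1-D lines. A generic sketch for a Gaussian covariance model, not Christakos's specific operators:

```python
import math
import random

def spectral_1d_realization(xs, corr_length, n_harmonics=100, seed=0):
    """Random-phase spectral method: Z(x) = sqrt(2/N) * sum_i cos(w_i x + phi_i),
    with frequencies w_i ~ N(0, 1/L**2), yields a zero-mean, unit-variance
    stationary field with covariance exp(-h**2 / (2 L**2))."""
    rng = random.Random(seed)
    ws = [rng.gauss(0.0, 1.0 / corr_length) for _ in range(n_harmonics)]
    phis = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_harmonics)]
    amp = math.sqrt(2.0 / n_harmonics)
    return [amp * sum(math.cos(w * x + p) for w, p in zip(ws, phis)) for x in xs]
```

Averaging many such lines over random directions is the essence of the turning bands construction.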

  9. Constraining Stochastic Parametrisation Schemes Using High-Resolution Model Simulations

    NASA Astrophysics Data System (ADS)

    Christensen, H. M.; Dawson, A.; Palmer, T.

    2017-12-01

    Stochastic parametrisations are used in weather and climate models as a physically motivated way to represent model error due to unresolved processes. Designing new stochastic schemes has been the target of much innovative research over the last decade. While a focus has been on developing physically motivated approaches, many successful stochastic parametrisation schemes are very simple, such as the European Centre for Medium-Range Weather Forecasts (ECMWF) multiplicative scheme `Stochastically Perturbed Parametrisation Tendencies' (SPPT). The SPPT scheme improves the skill of probabilistic weather and seasonal forecasts, and so is widely used. However, little work has focused on assessing the physical basis of the SPPT scheme. We address this matter by using high-resolution model simulations to explicitly measure the `error' in the parametrised tendency that SPPT seeks to represent. The high resolution simulations are first coarse-grained to the desired forecast model resolution before they are used to produce initial conditions and forcing data needed to drive the ECMWF Single Column Model (SCM). By comparing SCM forecast tendencies with the evolution of the high resolution model, we can measure the `error' in the forecast tendencies. In this way, we provide justification for the multiplicative nature of SPPT, and for the temporal and spatial scales of the stochastic perturbations. However, we also identify issues with the SPPT scheme. It is therefore hoped these measurements will improve both holistic and process based approaches to stochastic parametrisation. Figure caption: Instantaneous snapshot of the optimal SPPT stochastic perturbation, derived by comparing high-resolution simulations with a low resolution forecast model.
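
The multiplicative idea behind SPPT is compact: the net parametrised tendency is multiplied by (1 + r), where r is a bounded, correlated random pattern. The single-column sketch below uses a scalar AR(1) pattern in time; the operational ECMWF scheme uses a 2-D spectral pattern with several space and time scales, so the names and defaults here are illustrative only.

```python
import math
import random

def sppt_perturb(tendencies, sigma=0.5, tau_corr=10.0, dt=1.0, clip=0.9, seed=0):
    """SPPT-style multiplicative perturbation: T'_k = (1 + r_k) * T_k, where
    r_k is an AR(1) process with stationary standard deviation sigma and
    decorrelation time tau_corr, clipped so 1 + r stays positive."""
    rng = random.Random(seed)
    phi = math.exp(-dt / tau_corr)                 # AR(1) coefficient
    noise_sd = sigma * math.sqrt(1.0 - phi * phi)  # keeps stationary sd = sigma
    r, out = 0.0, []
    for tend in tendencies:
        r = phi * r + rng.gauss(0.0, noise_sd)
        out.append((1.0 + max(-clip, min(clip, r))) * tend)
    return out
```

Because the clipped pattern has zero mean, the perturbation preserves the sign and (on average) the magnitude of the tendency.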

  10. Simulation of spatially evolving turbulence and the applicability of Taylor's hypothesis in compressible flow

    NASA Technical Reports Server (NTRS)

    Lee, Sangsan; Lele, Sanjiva K.; Moin, Parviz

    1992-01-01

    For the numerical simulation of inhomogeneous turbulent flows, a method is developed for generating stochastic inflow boundary conditions with a prescribed power spectrum. Turbulence statistics from spatial simulations using this method with a low fluctuation Mach number are in excellent agreement with the experimental data, which validates the procedure. Turbulence statistics from spatial simulations are also compared to those from temporal simulations using Taylor's hypothesis. Statistics such as turbulence intensity, vorticity, and velocity derivative skewness compare favorably with the temporal simulation. However, the statistics of dilatation show a significant departure from those obtained in the temporal simulation. To directly check the applicability of Taylor's hypothesis, space-time correlations of fluctuations in velocity, vorticity, and dilatation are investigated. Convection velocities based on vorticity and velocity fluctuations are computed as functions of the spatial and temporal separations. The profile of the space-time correlation of dilatation fluctuations is explained via a wave propagation model.
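
Taylor's hypothesis can be checked numerically by locating the peak of the space-time cross-correlation, as done for the convection velocities above. A toy periodic example with a frozen advected pattern, illustrative only and unrelated to the compressible-flow solver used in the paper:

```python
import math

def convection_velocity(field_t0, field_t1, dt, dx):
    """Estimate the convection speed between two snapshots of a periodic 1-D
    field: find the cyclic shift maximizing the cross-correlation, then
    U = shift * dx / dt (frozen-pattern assumption, i.e. Taylor's hypothesis)."""
    n = len(field_t0)
    best_shift, best_corr = 0, -float("inf")
    for s in range(n):
        c = sum(field_t0[i] * field_t1[(i + s) % n] for i in range(n))
        if c > best_corr:
            best_corr, best_shift = c, s
    return best_shift * dx / dt

# a frozen sinusoid advected by 3 cells per time unit is recovered exactly
n, dx, dt = 64, 1.0, 1.0
f0 = [math.sin(2.0 * math.pi * i / n) for i in range(n)]
f1 = [math.sin(2.0 * math.pi * ((i - 3) % n) / n) for i in range(n)]
print(convection_velocity(f0, f1, dt, dx))  # → 3.0
```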

  11. Smoldyn on graphics processing units: massively parallel Brownian dynamics simulations.

    PubMed

    Dematté, Lorenzo

    2012-01-01

    Space is a very important aspect of the simulation of biochemical systems; recently, the need for simulation algorithms able to cope with space has become more and more compelling. Complex and detailed models of biochemical systems need to deal with the movement of single molecules and particles, taking into consideration localized fluctuations, transport phenomena, and diffusion. A common drawback of spatial models lies in their complexity: models can become very large, and their simulation can be time consuming, especially if we want to capture the system's behavior in a reliable way using stochastic methods in conjunction with a high spatial resolution. In order to deliver on the promise made by systems biology to understand a system as a whole, we need to scale up the size of the models we are able to simulate, moving from sequential to parallel simulation algorithms. In this paper, we analyze Smoldyn, a widely used algorithm for stochastic simulation of chemical reactions with spatial resolution and single-molecule detail, and we propose an alternative, innovative implementation that exploits the parallelism of Graphics Processing Units (GPUs). The implementation executes the most computationally demanding steps (computation of diffusion, unimolecular and bimolecular reactions, and the most common cases of molecule-surface interaction) on the GPU, computing them in parallel for each molecule of the system. The implementation offers good speed-ups and real-time, high-quality graphics output.

  12. Stochastic simulation in systems biology

    PubMed Central

    Székely, Tamás; Burrage, Kevin

    2014-01-01

    Natural systems are, almost by definition, heterogeneous: this can be either a boon or an obstacle to be overcome, depending on the situation. When constructing mathematical models of these systems, heterogeneity has traditionally been ignored, despite its critical role. However, in recent years, stochastic computational methods have become commonplace in science. They are able to appropriately account for heterogeneity; indeed, they are based around the premise that systems inherently contain at least one source of heterogeneity (namely, intrinsic heterogeneity). In this mini-review, we give a brief introduction to theoretical modelling and simulation in systems biology and discuss the three different sources of heterogeneity in natural systems. Our main topic is an overview of stochastic simulation methods in systems biology. There are many different types of stochastic methods. We focus on one group that has become especially popular in systems biology, biochemistry, chemistry and physics. These discrete-state stochastic methods do not follow individuals over time; rather they track only total populations. They also assume that the volume of interest is spatially homogeneous. We give an overview of these methods, with a discussion of the advantages and disadvantages of each, and suggest when each is more appropriate to use. We also include references to software implementations of them, so that beginners can quickly start using stochastic methods for practical problems of interest. PMID:25505503
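
The discrete-state methods surveyed here build on Gillespie's direct method (the stochastic simulation algorithm). A minimal implementation for the standard production/degradation (birth-death) model:

```python
import math
import random

def gillespie_birth_death(k, gamma, x0, t_end, seed=0):
    """Gillespie direct method for 0 -> A (rate k), A -> 0 (rate gamma * x).
    Returns the time-averaged copy number, which approaches k / gamma."""
    rng = random.Random(seed)
    t, x, acc = 0.0, x0, 0.0
    while True:
        a_birth, a_death = k, gamma * x
        a_total = a_birth + a_death
        wait = -math.log(1.0 - rng.random()) / a_total  # exponential waiting time
        if t + wait >= t_end:
            return (acc + x * (t_end - t)) / t_end
        acc += x * wait
        t += wait
        x += 1 if rng.random() * a_total < a_birth else -1
```

At stationarity the copy-number distribution of this model is Poisson with mean k/gamma.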

  13. Accurate reaction-diffusion operator splitting on tetrahedral meshes for parallel stochastic molecular simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hepburn, I.; De Schutter, E., E-mail: erik@oist.jp; Theoretical Neurobiology & Neuroengineering, University of Antwerp, Antwerp 2610

    Spatial stochastic molecular simulations in biology are limited by the intense computation required to track molecules in space in either a discrete-time or discrete-space framework, which has led in recent years to the development of parallel methods that can take advantage of the power of modern supercomputers. We systematically test suggested components of stochastic reaction-diffusion operator splitting in the literature and discuss their effects on accuracy. We introduce an operator splitting implementation for irregular meshes that enhances accuracy with minimal performance cost. We test a range of models in small-scale MPI simulations, from simple diffusion models to realistic biological models, and find that multi-dimensional geometry partitioning is an important consideration for optimum performance. We demonstrate performance gains of 1-3 orders of magnitude in the parallel implementation, with peak performance strongly dependent on model specification.

  14. Global climate impacts of stochastic deep convection parameterization in the NCAR CAM5

    DOE PAGES

    Wang, Yong; Zhang, Guang J.

    2016-09-29

    In this paper, the stochastic deep convection parameterization of Plant and Craig (PC) is implemented in the Community Atmospheric Model version 5 (CAM5) to incorporate the stochastic processes of convection into the Zhang-McFarlane (ZM) deterministic deep convective scheme. Its impacts on deep convection, shallow convection, large-scale precipitation and associated dynamic and thermodynamic fields are investigated. Results show that with the introduction of the PC stochastic parameterization, deep convection is decreased while shallow convection is enhanced. The decrease in deep convection is mainly caused by the stochastic process and the spatial averaging of input quantities for the PC scheme. More detrained liquid water associated with more shallow convection leads to a significant increase in liquid water and ice water paths, which increases large-scale precipitation in tropical regions. Specific humidity, relative humidity, zonal wind in the tropics, and precipitable water are all improved. The simulation of shortwave cloud forcing (SWCF) is also improved. The PC stochastic parameterization decreases the global mean SWCF from -52.25 W/m² in the standard CAM5 to -48.86 W/m², close to the observed -47.16 W/m². The improvement in SWCF over the tropics is due to decreased low cloud fraction simulated by the stochastic scheme. Sensitivity tests of tuning parameters are also performed to investigate the sensitivity of simulated climatology to uncertain parameters in the stochastic deep convection scheme.

  16. Confinement and diffusion modulate bistability and stochastic switching in a reaction network with positive feedback

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mlynarczyk, Paul J.; Pullen, Robert H.; Abel, Steven M., E-mail: abel@utk.edu

    2016-01-07

    Positive feedback is a common feature in signal transduction networks and can lead to phenomena such as bistability and signal propagation by domain growth. Physical features of the cellular environment, such as spatial confinement and the mobility of proteins, play important but inadequately understood roles in shaping the behavior of signaling networks. Here, we use stochastic, spatially resolved kinetic Monte Carlo simulations to explore a positive feedback network as a function of system size, system shape, and mobility of molecules. We show that these physical properties can markedly alter characteristics of bistability and stochastic switching when compared with well-mixed simulations. Notably, systems of equal volume but different shapes can exhibit qualitatively different behaviors under otherwise identical conditions. We show that stochastic switching to a state maintained by positive feedback occurs by cluster formation and growth. Additionally, the frequency at which switching occurs depends nontrivially on the diffusion coefficient, which can promote or suppress switching relative to the well-mixed limit. Taken together, the results provide a framework for understanding how confinement and protein mobility influence emergent features of the positive feedback network by modulating molecular concentrations, diffusion-influenced rate parameters, and spatiotemporal correlations between molecules.

  17. How noise and coupling influence leading indicators of population extinction in a spatially extended ecological system.

    PubMed

    O'Regan, Suzanne M

    2018-12-01

    Anticipating critical transitions in spatially extended systems is a key topic of interest to ecologists. Gradually declining metapopulations are an important example of a spatially extended biological system that may exhibit a critical transition. Theory for spatially extended systems approaching extinction that accounts for environmental stochasticity and coupling is currently lacking. Here, we develop spatially implicit two-patch models with additive and multiplicative forms of environmental stochasticity that are slowly forced through population collapse by changing environmental conditions. We derive patch-specific expressions for candidate indicators of extinction and test their performance via a simulation study. Coupling and spatial heterogeneities decrease the magnitude of the proposed indicators in coupled populations relative to isolated populations, and the noise regime and the degree of coupling together determine trends in summary statistics. This theory may be readily applied to other spatially extended ecological systems, such as coupled infectious disease systems on the verge of elimination.
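
The candidate indicators usually studied in this literature are the variance and lag-1 autocorrelation of patch fluctuations, both of which grow as a patch approaches collapse. A linearized two-patch sketch, an illustrative AR(1) caricature rather than the authors' model:

```python
import random

def simulate_patch(phi, coupling, sigma, n, seed=0):
    """Two linearized patches with AR(1) self-dynamics (coefficient phi; closer
    to 1 means closer to collapse) plus symmetric dispersal coupling, driven
    by additive environmental noise. Returns the first patch's time series."""
    rng = random.Random(seed)
    x = y = 0.0
    xs = []
    for _ in range(n):
        x, y = (phi * x + coupling * (y - x) + rng.gauss(0.0, sigma),
                phi * y + coupling * (x - y) + rng.gauss(0.0, sigma))
        xs.append(x)
    return xs

def variance(v):
    m = sum(v) / len(v)
    return sum((u - m) ** 2 for u in v) / len(v)

def lag1_autocorr(v):
    m = sum(v) / len(v)
    num = sum((v[i] - m) * (v[i + 1] - m) for i in range(len(v) - 1))
    return num / sum((u - m) ** 2 for u in v)
```

Comparing phi = 0.5 (far from collapse) with phi = 0.95 (near collapse) shows both indicators rising, while stronger coupling damps them, consistent with the patch-specific expressions derived in the paper.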

  18. Identification of gene regulation models from single-cell data

    NASA Astrophysics Data System (ADS)

    Weber, Lisa; Raymond, William; Munsky, Brian

    2018-09-01

    In quantitative analyses of biological processes, one may use many different scales of models (e.g. spatial or non-spatial, deterministic or stochastic, time-varying or at steady-state) or many different approaches to match models to experimental data (e.g. model fitting or parameter uncertainty/sloppiness quantification with different experiment designs). These different analyses can lead to surprisingly different results, even when applied to the same data and the same model. We use a simplified gene regulation model to illustrate many of these concerns, especially for ODE analyses of deterministic processes, chemical master equation and finite state projection analyses of heterogeneous processes, and stochastic simulations. For each analysis, we employ MATLAB and PYTHON software to consider a time-dependent input signal (e.g. a kinase nuclear translocation) and several model hypotheses, along with simulated single-cell data. We illustrate different approaches (e.g. deterministic and stochastic) to identify the mechanisms and parameters of the same model from the same simulated data. For each approach, we explore how uncertainty in parameter space varies with respect to the chosen analysis approach or specific experiment design. We conclude with a discussion of how our simulated results relate to the integration of experimental and computational investigations to explore signal-activated gene expression models in yeast (Neuert et al 2013 Science 339 584–7) and human cells (Senecal et al 2014 Cell Rep. 8 75–83).

  19. Compensating for estimation smoothing in kriging

    USGS Publications Warehouse

    Olea, R.A.; Pawlowsky, Vera

    1996-01-01

    Smoothing is a characteristic inherent to all minimum mean-square-error spatial estimators such as kriging. Cross-validation can be used to detect and model such smoothing. Inversion of the model produces a new estimator, compensated kriging. A numerical comparison based on an exhaustive permeability sampling of a 4-ft² slab of Berea Sandstone shows that the estimation surface generated by compensated kriging has properties intermediate between those generated by ordinary kriging and stochastic realizations resulting from simulated annealing and sequential Gaussian simulation. The frequency distribution is well reproduced by the compensated kriging surface, which also approximates the experimental semivariogram well - better than ordinary kriging, but not as well as stochastic realizations. Compensated kriging produces surfaces that are more accurate than stochastic realizations, but not as accurate as ordinary kriging. © 1996 International Association for Mathematical Geology.
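
The smoothing effect itself is easy to reproduce: any local-averaging estimator has lower variance than the data it honors. A crude moving-average stand-in for a kriging surface, illustrative only:

```python
import random

def moving_average(values, half_width):
    """Local-mean 'estimator' standing in for a kriging surface: each estimate
    averages the observations within half_width cells (truncated at edges)."""
    n, out = len(values), []
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

def variance(v):
    m = sum(v) / len(v)
    return sum((u - m) ** 2 for u in v) / len(v)

rng = random.Random(3)
data = [rng.gauss(0.0, 1.0) for _ in range(5000)]
smoothed = moving_average(data, 5)
print(variance(smoothed) < variance(data))  # → True: smoothing shrinks variance
```

Compensated kriging inverts a model of exactly this variance deficit to restore the histogram and semivariogram of the data.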

  20. Two stochastic models useful in petroleum exploration

    NASA Technical Reports Server (NTRS)

    Kaufman, G. M.; Bradley, P. G.

    1972-01-01

    A model of the petroleum exploration process that tests empirically the hypothesis that, at an early stage in the exploration of a basin, the process behaves like sampling without replacement is proposed, along with a model of the spatial distribution of petroleum reservoirs that conforms to observed facts. In developing the model of discovery, the following topics are discussed: probabilistic proportionality, the likelihood function, and maximum likelihood estimation. In addition, the spatial model is described, which is defined as a stochastic process generating values of a sequence of random variables in a way that simulates the frequency distribution of the areal extent, geographic location, and shape of oil deposits.

  1. Synchronous parallel spatially resolved stochastic cluster dynamics

    DOE PAGES

    Dunn, Aaron; Dingreville, Rémi; Martínez, Enrique; ...

    2016-04-23

    In this work, a spatially resolved stochastic cluster dynamics (SRSCD) model for radiation damage accumulation in metals is implemented using a synchronous parallel kinetic Monte Carlo algorithm. The parallel algorithm is shown to significantly increase the size of representative volumes achievable in SRSCD simulations of radiation damage accumulation. Additionally, weak scaling performance of the method is tested in two cases: (1) an idealized case of Frenkel pair diffusion and annihilation, and (2) a characteristic example problem including defect cluster formation and growth in α-Fe. For the latter case, weak scaling is tested using both Frenkel pair and displacement cascade damage. To improve scaling of simulations with cascade damage, an explicit cascade implantation scheme is developed for cases in which fast-moving defects are created in displacement cascades. For the first time, simulation of radiation damage accumulation in nanopolycrystals can be achieved with a three-dimensional rendition of the microstructure, allowing demonstration of the effect of grain size on defect accumulation in Frenkel pair-irradiated α-Fe.

  2. Spatial modeling of cell signaling networks.

    PubMed

    Cowan, Ann E; Moraru, Ion I; Schaff, James C; Slepchenko, Boris M; Loew, Leslie M

    2012-01-01

    The shape of a cell, the sizes of subcellular compartments, and the spatial distribution of molecules within the cytoplasm can all control how molecules interact to produce a cellular behavior. This chapter describes how these spatial features can be included in mechanistic mathematical models of cell signaling. The Virtual Cell computational modeling and simulation software is used to illustrate the considerations required to build a spatial model. An explanation of how to appropriately choose between physical formulations that implicitly or explicitly account for cell geometry and between deterministic versus stochastic formulations for molecular dynamics is provided, along with a discussion of their respective strengths and weaknesses. As a first step toward constructing a spatial model, the geometry needs to be specified and associated with the molecules, reactions, and membrane flux processes of the network. Initial conditions, diffusion coefficients, velocities, and boundary conditions complete the specifications required to define the mathematics of the model. The numerical methods used to solve reaction-diffusion problems both deterministically and stochastically are then described and some guidance is provided in how to set up and run simulations. A study of cAMP signaling in neurons ends the chapter, providing an example of the insights that can be gained in interpreting experimental results through the application of spatial modeling. Copyright © 2012 Elsevier Inc. All rights reserved.
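
Choosing the deterministic, spatially explicit formulation described above means discretizing reaction-diffusion equations. The sketch below shows one explicit 1-D diffusion step with zero-flux boundaries, including the dt ≤ dx²/(2D) stability restriction such solvers must respect; it is a generic illustration, not Virtual Cell's solver.

```python
def diffuse_step(u, diff, dt, dx):
    """One forward-Euler step of du/dt = D * d2u/dx2 with zero-flux (mirror)
    boundaries. Explicit stepping is stable only for dt <= dx*dx / (2*D)."""
    assert dt <= dx * dx / (2.0 * diff), "explicit diffusion step too large"
    lam, n = diff * dt / (dx * dx), len(u)
    new = u[:]
    for i in range(n):
        left = u[i - 1] if i > 0 else u[i]
        right = u[i + 1] if i < n - 1 else u[i]
        new[i] = u[i] + lam * (left - 2.0 * u[i] + right)
    return new

# a point release spreads out while total mass is conserved
u = [0.0] * 50
u[25] = 1.0
for _ in range(200):
    u = diffuse_step(u, 1.0, 0.001, 0.1)
print(abs(sum(u) - 1.0) < 1e-9)  # → True
```

The mirror boundary makes the fluxes telescope, so the scheme conserves total mass to rounding error, a useful sanity check for any reaction-diffusion solver.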

  3. Regression and Geostatistical Techniques: Considerations and Observations from Experiences in NE-FIA

    Treesearch

    Rachel Riemann; Andrew Lister

    2005-01-01

    Maps of forest variables improve our understanding of the forest resource by allowing us to view and analyze it spatially. The USDA Forest Service's Northeastern Forest Inventory and Analysis unit (NE-FIA) has used geostatistical techniques, particularly stochastic simulation, to produce maps and spatial data sets of FIA variables. That work underscores the...

  4. Temporal acceleration of spatially distributed kinetic Monte Carlo simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatterjee, Abhijit; Vlachos, Dionisios G.

    The computational intensity of kinetic Monte Carlo (KMC) simulation is a major impediment to simulating large length and time scales. In recent work, an approximate method for KMC simulation of spatially uniform systems, termed the binomial τ-leap method, was introduced [A. Chatterjee, D.G. Vlachos, M.A. Katsoulakis, Binomial distribution based τ-leap accelerated stochastic simulation, J. Chem. Phys. 122 (2005) 024112], where molecular bundles instead of individual processes are executed over coarse-grained time increments. This temporal coarse-graining can lead to significant computational savings, but its generalization to spatial lattice KMC simulation had not yet been realized. Here we extend the binomial τ-leap method to lattice KMC simulations by combining it with spatially adaptive coarse-graining. Absolute stability and computational speed-up analyses for spatial systems, along with simulations, provide insights into the conditions under which accuracy and substantial acceleration of the new spatio-temporal coarse-graining method are ensured. Model systems demonstrate that the r-time increment criterion of Chatterjee et al. obeys the absolute stability limit for values of r up to near 1.
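
The binomial idea is that the number of firings in a leap is drawn as Binomial(x, p) rather than Poisson, so a leap can never consume more molecules than exist. A hedged sketch for pure decay, not the Chatterjee-Vlachos implementation; for this linear example the per-molecule survival probability makes the leap exact, whereas nonlinear propensities would use p ≈ a(x)·τ/x.

```python
import math
import random

def binomial_draw(n, p, rng):
    """Plain Bernoulli-sum binomial sampler (fine for the small n used here)."""
    return sum(1 for _ in range(n) if rng.random() < p)

def binomial_tau_leap_decay(x0, c, tau, t_end, seed=0):
    """A -> 0 at rate c*x, advanced in leaps of length tau. Firings per leap
    are Binomial(x, 1 - exp(-c*tau)), so the population can never go
    negative, unlike a Poisson leap."""
    rng = random.Random(seed)
    p = 1.0 - math.exp(-c * tau)
    x, t = x0, 0.0
    while t < t_end - 1e-12:
        x -= binomial_draw(x, p, rng)
        t += tau
    return x
```

An ensemble of such runs should track the exact mean x0·exp(-c·t).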

  5. Stochastic simulation of reaction-diffusion systems: A fluctuating-hydrodynamics approach

    NASA Astrophysics Data System (ADS)

    Kim, Changho; Nonaka, Andy; Bell, John B.; Garcia, Alejandro L.; Donev, Aleksandar

    2017-03-01

    We develop numerical methods for stochastic reaction-diffusion systems based on approaches used for fluctuating hydrodynamics (FHD). For hydrodynamic systems, the FHD formulation is formally described by stochastic partial differential equations (SPDEs). In the reaction-diffusion systems we consider, our model becomes similar to the reaction-diffusion master equation (RDME) description when our SPDEs are spatially discretized and reactions are modeled as a source term having Poisson fluctuations. However, unlike the RDME, which becomes prohibitively expensive for an increasing number of molecules, our FHD-based description naturally extends from the regime where fluctuations are strong, i.e., each mesoscopic cell has few (reactive) molecules, to regimes with moderate or weak fluctuations, and ultimately to the deterministic limit. By treating diffusion implicitly, we avoid the severe restriction on time step size that limits all methods based on explicit treatments of diffusion and construct numerical methods that are more efficient than RDME methods, without compromising accuracy. Guided by an analysis of the accuracy of the distribution of steady-state fluctuations for the linearized reaction-diffusion model, we construct several two-stage (predictor-corrector) schemes, where diffusion is treated using a stochastic Crank-Nicolson method, and reactions are handled by the stochastic simulation algorithm of Gillespie or a weakly second-order tau leaping method. We find that an implicit midpoint tau leaping scheme attains second-order weak accuracy in the linearized setting and gives an accurate and stable structure factor for a time step size of an order of magnitude larger than the hopping time scale of diffusing molecules. We study the numerical accuracy of our methods for the Schlögl reaction-diffusion model both in and out of thermodynamic equilibrium. 
We demonstrate and quantify the importance of thermodynamic fluctuations to the formation of a two-dimensional Turing-like pattern and examine the effect of fluctuations on three-dimensional chemical front propagation. By comparing stochastic simulations to deterministic reaction-diffusion simulations, we show that fluctuations accelerate pattern formation in spatially homogeneous systems and lead to a qualitatively different disordered pattern behind a traveling wave.

  7. Optoelectronic analogs of self-programming neural nets - Architecture and methodologies for implementing fast stochastic learning by simulated annealing

    NASA Technical Reports Server (NTRS)

    Farhat, Nabil H.

    1987-01-01

    Self-organization and learning are distinctive features of neural nets and processors that set them apart from conventional approaches to signal processing. They lead to self-programmability, which alleviates the problem of programming complexity in artificial neural nets. In this paper, architectures for partitioning an optoelectronic analog of a neural net into distinct layers with a prescribed interconnectivity pattern, to enable stochastic learning by simulated annealing in the context of a Boltzmann machine, are presented. Stochastic learning is of interest because of its relevance to the role of noise in biological neural nets. Practical considerations and methodologies for appreciably accelerating stochastic learning in such a multilayered net are described. These include the use of parallel optical computing of the global energy of the net, the use of fast nonvolatile programmable spatial light modulators to realize fast plasticity, optical generation of random number arrays, and an adaptive noisy thresholding scheme that also makes stochastic learning more biologically plausible. The findings reported predict optoelectronic chips that can be used in the realization of optical learning machines.
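Since this entry centers on stochastic learning by simulated annealing, a minimal software analogue may help fix ideas. The sketch below anneals a small symmetric binary network with single-unit Metropolis flips; the coupling matrix, cooling schedule, and network size are all illustrative, and the optoelectronic hardware described above is essentially a fast physical implementation of this kind of loop:

```python
import numpy as np

def anneal(J, s, T0=5.0, T_min=0.05, cooling=0.95, sweeps_per_T=20, seed=1):
    """Relax a symmetric binary network with energy E = -1/2 s^T J s by
    simulated annealing with single-unit Metropolis flips (J_ii = 0)."""
    rng = np.random.default_rng(seed)
    n, T = s.size, T0
    while T > T_min:
        for _ in range(sweeps_per_T * n):
            i = rng.integers(n)
            dE = 2.0 * s[i] * (J[i] @ s)          # energy cost of flipping unit i
            if dE < 0 or rng.random() < np.exp(-dE / T):
                s[i] = -s[i]                      # accept the flip
        T *= cooling                              # geometric cooling schedule
    return s, -0.5 * s @ J @ s

n = 8
J = np.ones((n, n)) - np.eye(n)                   # toy all-to-all ferromagnetic couplings
s0 = np.random.default_rng(0).choice([-1.0, 1.0], size=n)
s_final, e_final = anneal(J, s0)
```

For this toy coupling matrix the ground state is full consensus (all units aligned), which the annealing run reaches reliably.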

  8. A coupled stochastic inverse-management framework for dealing with nonpoint agriculture pollution under groundwater parameter uncertainty

    NASA Astrophysics Data System (ADS)

    Llopis-Albert, Carlos; Palacios-Marqués, Daniel; Merigó, José M.

    2014-04-01

    In this paper a methodology for the stochastic management of groundwater quality problems is presented, which can be used to provide agricultural advisory services. A stochastic algorithm to solve the coupled flow and mass transport inverse problem is combined with a stochastic management approach to develop methods for integrating uncertainty, thus obtaining more reliable policies on groundwater nitrate pollution control from agriculture. The stochastic inverse model identifies non-Gaussian parameters and reduces uncertainty in heterogeneous aquifers by constraining stochastic simulations to data. The management model determines the spatial and temporal distribution of fertilizer application rates that maximizes net benefits in agriculture, constrained by quality requirements in groundwater at various control sites. The quality constraints can be taken, for instance, from water laws such as the EU Water Framework Directive (WFD). Furthermore, the methodology quantifies the trade-off between higher economic returns and reliability in meeting the environmental standards. Therefore, this new technology can help stakeholders in the decision-making process under uncertainty. The methodology has been successfully applied to a 2D synthetic aquifer, where an uncertainty assessment has been carried out by means of Monte Carlo simulation techniques.

  9. Cox process representation and inference for stochastic reaction-diffusion processes

    NASA Astrophysics Data System (ADS)

    Schnoerr, David; Grima, Ramon; Sanguinetti, Guido

    2016-05-01

    Complex behaviour in many systems arises from the stochastic interactions of spatially distributed particles or agents. Stochastic reaction-diffusion processes are widely used to model such behaviour in disciplines ranging from biology to the social sciences, yet they are notoriously difficult to simulate and calibrate to observational data. Here we use ideas from statistical physics and machine learning to provide a solution to the inverse problem of learning a stochastic reaction-diffusion process from data. Our solution relies on a non-trivial connection between stochastic reaction-diffusion processes and spatio-temporal Cox processes, a well-studied class of models from computational statistics. This connection leads to an efficient and flexible algorithm for parameter inference and model selection. Our approach shows excellent accuracy on numeric and real data examples from systems biology and epidemiology. Our work provides both insights into spatio-temporal stochastic systems, and a practical solution to a long-standing problem in computational modelling.
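The Cox-process connection rests on the fact that, conditional on its (random) intensity, a Cox process is an inhomogeneous Poisson process. One standard way to sample such a process, Lewis-Shedler thinning, can be sketched as follows; the intensity function here is a toy choice, not one from the paper:

```python
import numpy as np

def thinning_sample(intensity, lam_max, region=(0.0, 1.0), rng=None):
    """Lewis-Shedler thinning: sample a homogeneous Poisson process at the
    bounding rate lam_max, then keep each point with prob intensity(x)/lam_max."""
    rng = rng or np.random.default_rng()
    a, b = region
    n = rng.poisson(lam_max * (b - a))            # number of candidate points
    x = rng.uniform(a, b, size=n)
    keep = rng.random(n) < intensity(x) / lam_max # thin against the true intensity
    return np.sort(x[keep])

rng = np.random.default_rng(42)
lam = lambda x: 50.0 * np.exp(-((x - 0.5) ** 2) / 0.02)   # toy intensity surface
pts = thinning_sample(lam, lam_max=50.0, rng=rng)
```

Drawing the intensity itself from a random field (e.g., a log-Gaussian surface) before thinning would turn this into a full Cox-process sample.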

  10. Estimation of Kubo number and correlation length of fluctuating magnetic fields and pressure in BOUT++ edge pedestal collapse simulation

    NASA Astrophysics Data System (ADS)

    Kim, Jaewook; Lee, W.-J.; Jhang, Hogun; Kaang, H. H.; Ghim, Y.-C.

    2017-10-01

    Stochastic magnetic fields are thought to be one of the possible mechanisms for anomalous transport of density, momentum and heat across the magnetic field lines. The Kubo number and Chirikov parameter quantify this stochasticity, and previous studies show that perpendicular transport strongly depends on the magnetic Kubo number (MKN). If the MKN is smaller than one, the diffusion process follows the Rechester-Rosenbluth model, whereas if it is larger than one, percolation theory dominates the diffusion process. Thus, estimating the Kubo number plays an important role in understanding the diffusion process caused by stochastic magnetic fields. However, spatially localized experimental measurement of fluctuating magnetic fields in a tokamak is difficult, so we attempt to estimate MKNs using BOUT++ simulation data with pedestal collapse. In addition, we calculate correlation lengths of fluctuating pressures and Chirikov parameters to investigate the variation of correlation lengths in the simulation. We then discuss how one may experimentally estimate MKNs.

  11. Spatial distribution and optimal harvesting of an age-structured population in a fluctuating environment.

    PubMed

    Engen, Steinar; Lee, Aline Magdalena; Sæther, Bernt-Erik

    2018-02-01

    We analyze a spatial age-structured model with density regulation, age specific dispersal, stochasticity in vital rates and proportional harvesting. We include two age classes, juveniles and adults, where juveniles are subject to logistic density dependence. There are environmental stochastic effects with arbitrary spatial scales on all birth and death rates, and individuals of both age classes are subject to density independent dispersal with given rates and specified distributions of dispersal distances. We show how to simulate the joint density fields of the age classes and derive results for the spatial scales of all spatial autocovariance functions for densities. A general result is that the squared scale has an additive term equal to the squared scale of the environmental noise, corresponding to the Moran effect, as well as additive terms proportional to the dispersal rate and variance of dispersal distance for the age classes and approximately inversely proportional to the strength of density regulation. We show that the optimal harvesting strategy in the deterministic case is to harvest only juveniles when their relative value (e.g. financial) is large, and otherwise only adults. With increasing environmental stochasticity there is an interval of increasing length of values of juveniles relative to adults where both age classes should be harvested. Harvesting generally tends to increase all spatial scales of the autocovariances of densities. Copyright © 2017. Published by Elsevier Inc.
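The verbal scaling result above can be written schematically in LaTeX. The exact coefficients are not given in the abstract, so an unspecified order-one constant is used; this is a reading aid, not the paper's equation:

```latex
% Schematic form of the stated result for the squared spatial scale of the
% density autocovariance of an age class (constant c is illustrative):
\[
  \ell^2 \;\approx\; \ell_e^2 \;+\; c\,\frac{m\,\sigma_d^2}{\gamma},
\]
% \ell_e     : spatial scale of the environmental noise (Moran effect term)
% m          : dispersal rate of the age class
% \sigma_d^2 : variance of dispersal distance
% \gamma     : strength of density regulation
```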

  12. A Coupled Approach with Stochastic Rainfall-Runoff Simulation and Hydraulic Modeling for Extreme Flood Estimation on Large Watersheds

    NASA Astrophysics Data System (ADS)

    Paquet, E.

    2015-12-01

    The SCHADEX method aims at estimating the distribution of peak and daily discharges up to extreme quantiles. It couples a precipitation probabilistic model based on weather patterns with a stochastic rainfall-runoff simulation process using a conceptual lumped model, allowing an exhaustive set of hydrological conditions and watershed responses to intense rainfall events to be explored. Since 2006, it has been widely applied in France to about one hundred watersheds for dam spillway design, and also abroad (Norway, Canada and central Europe among others). However, its application to large watersheds (above 10 000 km²) faces some significant issues, the most important being the spatial heterogeneity of rainfall and hydrological processes, and flood peak damping due to hydraulic effects (flood plains, natural or man-made embankments). This led to the development of an extreme flood simulation framework for large and heterogeneous watersheds, based on the SCHADEX method. Its main features are: (i) division of the large (or main) watershed into several smaller sub-watersheds, where the spatial homogeneity of the hydro-meteorological processes can reasonably be assumed and where the hydraulic effects can be neglected; (ii) identification of pilot watersheds where discharge data are available and rainfall-runoff models can therefore be calibrated, which serve as parameter donors to ungauged watersheds; (iii) spatially coherent stochastic simulations for all the sub-watersheds at the daily time step; (iv) selection of simulated events for a given return period (according to the distribution of runoff volumes at the scale of the main watershed); (v) generation of the complete hourly hydrographs at each of the sub-watershed outlets; and (vi) routing to the main outlet with 1D or 2D hydraulic models. The presentation is illustrated with the case study of the Isère watershed (9981 km²), a French snow-driven watershed.
The main novelties of the method are underlined, as well as its perspectives and future improvements.

  13. Modelling and simulation techniques for membrane biology.

    PubMed

    Burrage, Kevin; Hancock, John; Leier, André; Nicolau, Dan V

    2007-07-01

    One of the most important aspects of Computational Cell Biology is the understanding of the complicated dynamical processes that take place on plasma membranes. These processes are often so complicated that purely temporal models cannot always adequately capture the dynamics. On the other hand, spatial models can have large computational overheads. In this article, we review some of these issues with respect to chemistry, membrane microdomains and anomalous diffusion and discuss how to select appropriate modelling and simulation paradigms based on some or all of the following aspects: discrete, continuous, stochastic, delayed and complex spatial processes.

  14. Stochastic simulation of multiscale complex systems with PISKaS: A rule-based approach.

    PubMed

    Perez-Acle, Tomas; Fuenzalida, Ignacio; Martin, Alberto J M; Santibañez, Rodrigo; Avaria, Rodrigo; Bernardin, Alejandro; Bustos, Alvaro M; Garrido, Daniel; Dushoff, Jonathan; Liu, James H

    2018-03-29

    Computational simulation is a widely employed methodology to study the dynamic behavior of complex systems. Although common approaches are based either on ordinary differential equations or stochastic differential equations, these techniques make several assumptions which, when it comes to biological processes, could often lead to unrealistic models. Among other limitations, model approaches based on differential equations entangle kinetics and causality, fail when complexity increases, separate knowledge from models, and assume that the average behavior of the population encompasses any individual deviation. To overcome these limitations, simulations based on the Stochastic Simulation Algorithm (SSA) appear as a suitable approach to model complex biological systems. In this work, we review three different models executed in PISKaS: a rule-based framework to produce multiscale stochastic simulations of complex systems. These models span multiple time and spatial scales, ranging from gene regulation up to game theory. In the first example, we describe a model of the core regulatory network of gene expression in Escherichia coli, highlighting the continuous model improvement capacities of PISKaS. The second example describes a hypothetical outbreak of the Ebola virus occurring in a compartmentalized environment resembling cities and highways. Finally, in the last example, we illustrate a stochastic model for the prisoner's dilemma, a common approach from the social sciences describing complex interactions involving trust within human populations. As a whole, these models demonstrate the capabilities of PISKaS, providing fertile scenarios in which to explore the dynamics of complex systems. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
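The SSA mentioned above is Gillespie's direct method: draw the waiting time to the next reaction from an exponential with rate equal to the total propensity, then pick which reaction fires in proportion to its propensity. A bare-bones version for a toy birth-death network (not PISKaS's rule-based, multiscale engine) looks like this:

```python
import numpy as np

def gillespie(x0, stoich, prop, t_max, rng):
    """Gillespie direct-method SSA: exponential waiting times, reactions
    chosen with probability proportional to their propensities."""
    t, x = 0.0, np.array(x0, dtype=float)
    times, states = [t], [x.copy()]
    while t < t_max:
        a = prop(x)
        a0 = a.sum()
        if a0 == 0.0:                     # no reaction can fire
            break
        t += rng.exponential(1.0 / a0)    # time to next reaction
        j = rng.choice(len(a), p=a / a0)  # which reaction fires
        x += stoich[j]
        times.append(t); states.append(x.copy())
    return np.array(times), np.array(states)

# Birth-death example: 0 -> X at rate 10, X -> 0 at rate 0.5 per molecule
stoich = np.array([[+1.0], [-1.0]])
prop = lambda x: np.array([10.0, 0.5 * x[0]])
rng = np.random.default_rng(7)
t, xs = gillespie([0], stoich, prop, t_max=50.0, rng=rng)
```

At stationarity this toy network fluctuates around a mean copy number of 10 / 0.5 = 20.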

  15. Numerical simulation of backward erosion piping in heterogeneous fields

    NASA Astrophysics Data System (ADS)

    Liang, Yue; Yeh, Tian-Chyi Jim; Wang, Yu-Li; Liu, Mingwei; Wang, Junjie; Hao, Yonghong

    2017-04-01

    Backward erosion piping (BEP) is one of the major causes of seepage failures in levees. Seepage fields dictate BEP behavior and are influenced by the heterogeneity of soil properties. To investigate the effects of this heterogeneity on seepage failures, we develop a numerical algorithm and conduct simulations to study BEP progression in geologic media with spatially stochastic parameters. Specifically, the void ratio e, the hydraulic conductivity k, and the ratio of the particle contents r of the media are represented as stochastic variables, characterized by means and variances, spatial correlation structures, and the cross correlation between variables. Results of the simulations reveal that the heterogeneity accelerates the development of preferential flow paths, which profoundly increase the likelihood of seepage failures. To account for unknown heterogeneity, we define the probability of seepage instability (PI) to evaluate the failure potential of a given site. Using Monte Carlo simulation (MCS), we demonstrate that the PI value is significantly influenced by the mean and variance of ln k and its spatial correlation scales, whereas the other parameters, such as the means and variances of e and r and their cross correlation, have minor impacts. Based on PI analyses, we introduce a risk rating system to classify the field into different regions according to risk levels. This rating system is useful for seepage failure prevention and assists decision making when BEP occurs.
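The MCS workflow described, repeatedly drawing a spatially correlated parameter field and checking a failure criterion, can be sketched as below. Both the field generator (moving-average smoothing of white noise) and the instability criterion are illustrative stand-ins; the paper's criterion comes from a coupled seepage-erosion simulation, not a simple threshold:

```python
import numpy as np

def correlated_lnk(n, corr_len, rng):
    """Standardized, spatially correlated ln(k) profile via moving-average
    smoothing of white noise (a stand-in for a geostatistical simulator)."""
    w = np.ones(corr_len) / corr_len
    z = np.convolve(rng.standard_normal(n + corr_len - 1), w, mode="valid")
    return (z - z.mean()) / (z.std() + 1e-12)

def prob_instability(n_real=500, crit=-2.5, seed=3):
    """PI estimate: fraction of realizations containing an extreme
    low-conductivity pocket (illustrative criterion: min ln(k) < crit)."""
    rng = np.random.default_rng(seed)
    fails = sum(correlated_lnk(200, 10, rng).min() < crit
                for _ in range(n_real))
    return fails / n_real

pi = prob_instability()
```

The same loop structure applies when the threshold check is replaced by a full seepage simulation per realization.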

  16. Self-organization of the magnetization in ferromagnetic nanowires

    NASA Astrophysics Data System (ADS)

    Ivanov, A. A.; Orlov, V. A.

    2017-10-01

    In this work, using computer simulation of the magnetization in a polycrystalline ferromagnetic nanowire, we demonstrate the occurrence of a characteristic spatial scale in the magnetization distribution that is unrelated to the domain wall or crystallite size: the stochastic domain size. We show that this length enters the spectral density of the pinning force exerted on a domain wall by inhomogeneities of the crystallographic anisotropy. The anisotropy constant and the distribution of easy-axis directions of the effective anisotropy of a stochastic domain are calculated analytically.

  17. Stochastic dynamic modeling of regular and slow earthquakes

    NASA Astrophysics Data System (ADS)

    Aso, N.; Ando, R.; Ide, S.

    2017-12-01

    Both regular and slow earthquakes are slip phenomena on plate boundaries and can be simulated by (quasi-)dynamic modeling [Liu and Rice, 2005]. In these numerical simulations, spatial heterogeneity is usually considered not only for explaining real physical properties but also for evaluating the stability of the calculations or the sensitivity of the results to the conditions. However, even if we discretize the model space with small grids, heterogeneity at scales smaller than the grid size is not considered in models with deterministic governing equations. To evaluate the effect of heterogeneity at these smaller scales, we need to consider stochastic interactions between slip and stress in a dynamic modeling framework. Tidal stress is known to trigger or affect both regular and slow earthquakes [Yabe et al., 2015; Ide et al., 2016], and such a fluctuating external force can also be treated as a stochastic external force. A healing process of faults may also be stochastic, so we introduce a stochastic friction law. In the present study, we propose a stochastic dynamic model to explain both regular and slow earthquakes. We solve a mode III problem, which corresponds to rupture propagation along the strike direction. We use a BIEM (boundary integral equation method) scheme to simulate slip evolution, but we add stochastic perturbations to the governing equations, which are usually written in a deterministic manner. As the simplest type of perturbation, we adopt Gaussian deviations in the formulation of the slip-stress kernel, the external force, and the friction. By increasing the amplitude of the perturbations of the slip-stress kernel, we reproduce the complicated rupture processes of regular earthquakes, including unilateral and bilateral ruptures. By perturbing the external force, we reproduce slow rupture propagation at a scale of km/day.
The slow propagation generated by a combination of fast interactions at the S-wave velocity is analogous to the kinetic theory of gases: thermal diffusion appears much slower than the particle velocity of each molecule. The concept of stochastic triggering originates in the Brownian walk model [Ide, 2008], and the present study introduces such stochastic dynamics into dynamic simulations. The stochastic dynamic model has the potential to explain both regular and slow earthquakes more realistically.

  18. Ensemble modeling of stochastic unsteady open-channel flow in terms of its time-space evolutionary probability distribution - Part 1: theoretical development

    NASA Astrophysics Data System (ADS)

    Dib, Alain; Kavvas, M. Levent

    2018-03-01

    The Saint-Venant equations are commonly used as the governing equations for modeling spatially varied unsteady flow in open channels. The presence of uncertainties in the channel or flow parameters renders these equations stochastic, thus requiring their solution in a stochastic framework in order to quantify the ensemble behavior and the variability of the process. While the Monte Carlo approach can be used for such a solution, its computational expense and the large number of simulations it requires are significant disadvantages. This study proposes, explains, and derives a new methodology for solving the stochastic Saint-Venant equations in only one shot, without the need for a large number of simulations. The proposed methodology is derived by developing the nonlocal Lagrangian-Eulerian Fokker-Planck equation of the characteristic form of the stochastic Saint-Venant equations for an open-channel flow process with an uncertain roughness coefficient. A numerical method for its solution is subsequently devised. The application and validation of this methodology are provided in a companion paper, in which the statistical results computed by the proposed methodology are compared against the results obtained by the Monte Carlo approach.
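For contrast, the Monte Carlo alternative that the one-shot Fokker-Planck methodology avoids can be illustrated with the standard Manning formula for open-channel velocity, V = (1/n) R^(2/3) S^(1/2), under an uncertain roughness coefficient n. The sampled range of n and the channel geometry are illustrative, not taken from the paper:

```python
import numpy as np

def manning_velocity(n, R=1.0, S=0.001):
    """Manning's formula V = (1/n) * R**(2/3) * sqrt(S), SI units:
    n = roughness, R = hydraulic radius [m], S = channel slope [-]."""
    return (1.0 / n) * R ** (2.0 / 3.0) * np.sqrt(S)

# Brute-force ensemble over an uncertain roughness n ~ U(0.02, 0.05); the
# Fokker-Planck methodology replaces this whole loop with one deterministic solve.
rng = np.random.default_rng(11)
n_samples = rng.uniform(0.02, 0.05, size=10_000)
v = manning_velocity(n_samples)
v_mean, v_std = v.mean(), v.std()
```

Each ensemble member here is a single algebraic evaluation; for the full unsteady Saint-Venant equations each member is an expensive PDE solve, which is exactly why avoiding the ensemble matters.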

  19. Mean field analysis of a spatial stochastic model of a gene regulatory network.

    PubMed

    Sturrock, M; Murray, P J; Matzavinos, A; Chaplain, M A J

    2015-10-01

    A gene regulatory network may be defined as a collection of DNA segments which interact with each other indirectly through their RNA and protein products. Such a network is said to contain a negative feedback loop if its products inhibit gene transcription, and a positive feedback loop if a gene product promotes its own production. Negative feedback loops can create oscillations in mRNA and protein levels while positive feedback loops are primarily responsible for signal amplification. It is often the case in real biological systems that both negative and positive feedback loops operate in parameter regimes that result in low copy numbers of gene products. In this paper we investigate the spatio-temporal dynamics of a single feedback loop in a eukaryotic cell. We first develop a simplified spatial stochastic model of a canonical feedback system (either positive or negative). Using Gillespie's algorithm, we compute sample trajectories and analyse their corresponding statistics. We then derive a system of equations that describe the spatio-temporal evolution of the stochastic means. Subsequently, we examine the spatially homogeneous case and compare the results of numerical simulations with the spatially explicit case. Finally, using a combination of steady-state analysis and data clustering techniques, we explore model behaviour across a subregion of the parameter space that is difficult to access experimentally and compare the parameter landscape of our spatio-temporal and spatially homogeneous models.

  20. A stochastic spatial model of HIV dynamics with an asymmetric battle between the virus and the immune system

    NASA Astrophysics Data System (ADS)

    Lin, Hai; Shuai, J. W.

    2010-04-01

    A stochastic spatial model based on the Monte Carlo approach is developed to study the dynamics of human immunodeficiency virus (HIV) infection. We aim to propose a more detailed and realistic simulation frame by incorporating many important features of HIV dynamics, which include infections, replications and mutations of viruses, antigen recognitions, activations and proliferations of lymphocytes, and diffusions, encounters and interactions of virions and lymphocytes. Our model successfully reproduces the three-phase pattern observed in HIV infection, and the simulation results for the time distribution from infection to AIDS onset are also in good agreement with the clinical data. The interactions of viruses and the immune system in all three phases are investigated. We assess the relative importance of various immune system components in the acute phase. The dynamics of how the two important factors, namely the viral diversity and the asymmetric battle between HIV and the immune system, result in AIDS are investigated in detail with the model.

  1. Analytical approximation of a stochastic, spatial simulation model of fire and forest landscape dynamics

    Treesearch

    A.J. Tepley; E.A. Thomann

    2012-01-01

    Recent increases in computation power have prompted enormous growth in the use of simulation models in ecological research. These models are valued for their ability to account for much of the ecological complexity found in field studies, but this ability usually comes at the cost of losing transparency into how the models work. In order to foster greater understanding...

  2. A Multilevel, Hierarchical Sampling Technique for Spatially Correlated Random Fields

    DOE PAGES

    Osborn, Sarah; Vassilevski, Panayot S.; Villa, Umberto

    2017-10-26

    In this paper, we propose an alternative method to generate samples of a spatially correlated random field with applications to large-scale problems for forward propagation of uncertainty. A classical approach for generating these samples is the Karhunen-Loève (KL) decomposition. However, the KL expansion requires solving a dense eigenvalue problem and is therefore computationally infeasible for large-scale problems. Sampling methods based on stochastic partial differential equations provide a highly scalable way to sample Gaussian fields, but the resulting parametrization is mesh dependent. We propose a multilevel decomposition of the stochastic field to allow for scalable, hierarchical sampling based on solving a mixed finite element formulation of a stochastic reaction-diffusion equation with a random, white noise source function. Lastly, numerical experiments are presented to demonstrate the scalability of the sampling method as well as numerical results of multilevel Monte Carlo simulations for a subsurface porous media flow application using the proposed sampling method.
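The SPDE sampling idea, drawing a Gaussian field by solving a reaction-diffusion-type equation with a white-noise source, can be sketched in one dimension. This is a single-level, finite-difference simplification (the paper uses a multilevel mixed finite element formulation), with illustrative parameters:

```python
import numpy as np

def spde_sample(n=256, kappa=10.0, dx=1.0 / 256, rng=None):
    """Draw a 1-D Gaussian field by solving (kappa^2 - Laplacian) u = xi,
    where xi is discretized white noise, with a periodic finite-difference
    Laplacian. Larger kappa gives a shorter correlation length (~1/kappa)."""
    rng = rng or np.random.default_rng()
    lap = (-2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)) / dx**2
    lap[0, -1] = lap[-1, 0] = 1.0 / dx**2         # periodic boundary
    A = kappa**2 * np.eye(n) - lap
    xi = rng.standard_normal(n) / np.sqrt(dx)     # white noise scales as 1/sqrt(dx)
    return np.linalg.solve(A, xi)

u = spde_sample(rng=np.random.default_rng(5))
```

The resulting field is smooth at the grid scale, which shows up as strong lag-1 correlation; the scalable versions replace the dense solve with sparse or multigrid solvers.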

  3. A Multilevel, Hierarchical Sampling Technique for Spatially Correlated Random Fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, Sarah; Vassilevski, Panayot S.; Villa, Umberto

    In this paper, we propose an alternative method to generate samples of a spatially correlated random field with applications to large-scale problems for forward propagation of uncertainty. A classical approach for generating these samples is the Karhunen-Loève (KL) decomposition. However, the KL expansion requires solving a dense eigenvalue problem and is therefore computationally infeasible for large-scale problems. Sampling methods based on stochastic partial differential equations provide a highly scalable way to sample Gaussian fields, but the resulting parametrization is mesh dependent. We propose a multilevel decomposition of the stochastic field to allow for scalable, hierarchical sampling based on solving a mixed finite element formulation of a stochastic reaction-diffusion equation with a random, white noise source function. Lastly, numerical experiments are presented to demonstrate the scalability of the sampling method as well as numerical results of multilevel Monte Carlo simulations for a subsurface porous media flow application using the proposed sampling method.

  4. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of its mechanical effects on the differential settlement of large continuous structure foundations. Such analyses should be derived using an assimilation approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve this, the Es distribution of a silty clay stratum in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precision and sources of uncertainty. Individual CPT soundings were modeled as probability density curves using maximum entropy theory. A spatial prior multivariate probability density function (PDF) and the likelihood PDF at the CPT positions were built from the borehole experiments and the potential value of the prediction point; then, by numerical integration over the CPT probability density curves, the posterior probability density curve at the prediction point was calculated within the Bayesian reverse interpolation framework. The results were compared between Gaussian sequential stochastic simulation and the Bayesian method. The differences between single CPT soundings under a normal distribution assumption and the simulated probability density curves based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by accounting for CPT uncertainty at the estimation points. The calculations illustrate the significance of stochastic Es characterization in a stratum and identify limitations associated with inadequate geostatistical interpolation techniques.
These characterization results will provide a multi-precision information assimilation method for other geotechnical parameters.

  5. Unbiased Rare Event Sampling in Spatial Stochastic Systems Biology Models Using a Weighted Ensemble of Trajectories

    PubMed Central

    Donovan, Rory M.; Tapia, Jose-Juan; Sullivan, Devin P.; Faeder, James R.; Murphy, Robert F.; Dittrich, Markus; Zuckerman, Daniel M.

    2016-01-01

    The long-term goal of connecting scales in biological simulation can be facilitated by scale-agnostic methods. We demonstrate that the weighted ensemble (WE) strategy, initially developed for molecular simulations, applies effectively to spatially resolved cell-scale simulations. The WE approach runs an ensemble of parallel trajectories with assigned weights and uses a statistical resampling strategy of replicating and pruning trajectories to focus computational effort on difficult-to-sample regions. The method can also generate unbiased estimates of non-equilibrium and equilibrium observables, sometimes with significantly less aggregate computing time than would be possible using standard parallelization. Here, we use WE to orchestrate particle-based kinetic Monte Carlo simulations, which include spatial geometry (e.g., of organelles, plasma membrane) and biochemical interactions among mobile molecular species. We study a series of models exhibiting spatial, temporal and biochemical complexity and show that although WE has important limitations, it can achieve performance significantly exceeding standard parallel simulation—by orders of magnitude for some observables. PMID:26845334
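The WE resampling step, splitting high-weight trajectories and merging low-weight ones bin by bin while conserving total weight, can be sketched as follows. Binning by a single scalar progress coordinate and the resampling rule below are simplifications relative to the paper's MCell-based setup:

```python
import numpy as np

def we_resample(positions, weights, n_bins=5, target_per_bin=4, rng=None):
    """One weighted-ensemble resampling step: bin walkers by a progress
    coordinate, then resample so each occupied bin holds target_per_bin
    walkers carrying equal shares of the bin's weight. Total weight is
    exactly conserved, which keeps the ensemble's estimates unbiased."""
    rng = rng or np.random.default_rng()
    edges = np.linspace(positions.min(), positions.max() + 1e-12, n_bins + 1)
    new_pos, new_w = [], []
    for b in range(n_bins):
        idx = np.where((positions >= edges[b]) & (positions < edges[b + 1]))[0]
        if idx.size == 0:
            continue                              # empty bin: nothing to resample
        w_bin = weights[idx].sum()
        # split + merge in one move: draw walkers in proportion to weight
        chosen = rng.choice(idx, size=target_per_bin, p=weights[idx] / w_bin)
        new_pos.extend(positions[chosen])
        new_w.extend([w_bin / target_per_bin] * target_per_bin)
    return np.array(new_pos), np.array(new_w)

rng = np.random.default_rng(2)
pos = rng.random(20)                              # toy progress-coordinate values
w = np.full(20, 1.0 / 20)
pos2, w2 = we_resample(pos, w, rng=rng)
```

In a full WE run this step alternates with short bursts of unbiased dynamics for each walker; the dynamics engine itself is untouched, which is what makes WE scale-agnostic.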

  6. High-resolution stochastic generation of extreme rainfall intensity for urban drainage modelling applications

    NASA Astrophysics Data System (ADS)

    Peleg, Nadav; Blumensaat, Frank; Molnar, Peter; Fatichi, Simone; Burlando, Paolo

    2016-04-01

    Urban drainage response is highly dependent on the spatial and temporal structure of rainfall. Therefore, measuring and simulating rainfall at a high spatial and temporal resolution is a fundamental step to fully assess urban drainage system reliability and related uncertainties. This is even more relevant when considering extreme rainfall events. However, current space-time rainfall models have limitations in capturing extreme rainfall intensity statistics for short durations. Here, we use the STREAP (Space-Time Realizations of Areal Precipitation) model, which is a novel stochastic rainfall generator for simulating high-resolution rainfall fields that preserve the spatio-temporal structure of rainfall and its statistical characteristics. The model enables the generation of rain fields at 10² m and minute scales in a fast and computationally efficient way, matching the requirements for hydrological analysis of urban drainage systems. The STREAP model was applied successfully in the past to generate high-resolution extreme rainfall intensities over a small domain. A sub-catchment in the city of Luzern (Switzerland) was chosen as a case study to: (i) evaluate the ability of STREAP to disaggregate extreme rainfall intensities for urban drainage applications; (ii) assess the role of stochastic climate variability of rainfall in flow response; and (iii) evaluate the degree of non-linearity between extreme rainfall intensity and system response (i.e. flow) for a small urban catchment. The channel flow at the catchment outlet is simulated by means of a calibrated hydrodynamic sewer model.

  7. Identification of hydraulic conductivity structure in sand and gravel aquifers: Cape Cod data set

    USGS Publications Warehouse

    Eggleston, J.R.; Rojstaczer, S.A.; Peirce, J.J.

    1996-01-01

    This study evaluates commonly used geostatistical methods to assess reproduction of hydraulic conductivity (K) structure and sensitivity under limiting amounts of data. Extensive conductivity measurements from the Cape Cod sand and gravel aquifer are used to evaluate two geostatistical estimation methods, conditional mean as an estimate and ordinary kriging, and two stochastic simulation methods, simulated annealing and sequential Gaussian simulation. Our results indicate that for relatively homogeneous sand and gravel aquifers such as the Cape Cod aquifer, neither estimation methods nor stochastic simulation methods give highly accurate point predictions of hydraulic conductivity, despite the high density of collected data. Although the stochastic simulation methods yielded higher errors than the estimation methods, they yielded better reproduction of the measured ln(K) distribution and better reproduction of local contrasts in ln(K). The inability of kriging to reproduce high ln(K) values, reaffirmed by this study, provides strong motivation for choosing stochastic simulation methods to generate conductivity fields when performing fine-scale contaminant transport modeling. Results also indicate that estimation error is relatively insensitive to the number of hydraulic conductivity measurements so long as more than a threshold number of data are used to condition the realizations. This threshold occurs for the Cape Cod site when there are approximately three conductivity measurements per integral volume. The lack of improvement with additional data suggests that although fine-scale hydraulic conductivity structure is evident in the variogram, it is not accurately reproduced by geostatistical estimation methods. If the Cape Cod aquifer spatial conductivity characteristics are indicative of other sand and gravel deposits, then these results on predictive error versus data collection have significant practical consequences for site characterization. Heavily sampled sand and gravel aquifers, such as Cape Cod and Borden, may have large amounts of redundant data, while in more common real-world settings denser data collection will likely improve understanding of permeability structure.
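
    As a concrete illustration of the estimation side of this comparison, the sketch below implements ordinary kriging in one dimension. The covariance model and the ln(K) values are hypothetical placeholders, not the Cape Cod data; this is a minimal sketch, not the study's workflow.

```python
import numpy as np

def exp_cov(h, sill=1.0, length=10.0):
    """Exponential covariance model C(h) = sill * exp(-|h|/length)."""
    return sill * np.exp(-np.abs(h) / length)

def ordinary_krige(x_obs, z_obs, x0, cov=exp_cov):
    """Ordinary kriging estimate of z at x0 from observations (x_obs, z_obs).

    Solves the bordered kriging system [C 1; 1^T 0] [w; mu] = [c0; 1],
    which enforces that the weights w sum to one."""
    n = len(x_obs)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(x_obs[:, None] - x_obs[None, :])
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = cov(x_obs - x0)
    w = np.linalg.solve(A, b)[:n]
    return w @ z_obs

x_obs = np.array([0.0, 5.0, 12.0, 20.0])     # measurement locations (m)
z_obs = np.array([-2.1, -1.8, -2.5, -2.0])   # hypothetical ln(K) values
estimate = ordinary_krige(x_obs, z_obs, 7.0)
```

    Because kriging (with no nugget) is an exact interpolator, the estimate at a measured location reproduces the measured value, while estimates between data points are smoothed, which is exactly the variance-damping behavior the abstract contrasts with stochastic simulation.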

  8. Simulating biological processes: stochastic physics from whole cells to colonies.

    PubMed

    Earnest, Tyler M; Cole, John A; Luthey-Schulten, Zaida

    2018-05-01

    The last few decades have revealed the living cell to be a crowded spatially heterogeneous space teeming with biomolecules whose concentrations and activities are governed by intrinsically random forces. It is from this randomness, however, that a vast array of precisely timed and intricately coordinated biological functions emerge that give rise to the complex forms and behaviors we see in the biosphere around us. This seemingly paradoxical nature of life has drawn the interest of an increasing number of physicists, and recent years have seen stochastic modeling grow into a major subdiscipline within biological physics. Here we review some of the major advances that have shaped our understanding of stochasticity in biology. We begin with some historical context, outlining a string of important experimental results that motivated the development of stochastic modeling. We then embark upon a fairly rigorous treatment of the simulation methods that are currently available for the treatment of stochastic biological models, with an eye toward comparing and contrasting their realms of applicability, and the care that must be taken when parameterizing them. Following that, we describe how stochasticity impacts several key biological functions, including transcription, translation, ribosome biogenesis, chromosome replication, and metabolism, before considering how the functions may be coupled into a comprehensive model of a 'minimal cell'. Finally, we close with our expectation for the future of the field, focusing on how mesoscopic stochastic methods may be augmented with atomic-scale molecular modeling approaches in order to understand life across a range of length and time scales.
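
    The well-mixed stochastic simulation methods surveyed in reviews like this one are typically built on the Gillespie direct method. A minimal sketch for a birth-death process (the rate constants are illustrative, not taken from the review):

```python
import random

def gillespie_birth_death(k_on=10.0, k_off=1.0, x0=0, t_end=500.0, seed=1):
    """Gillespie direct method for the birth-death process
    0 -> X (rate k_on), X -> 0 (rate k_off * x)."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        a_birth, a_death = k_on, k_off * x
        a0 = a_birth + a_death
        t += rng.expovariate(a0)            # exponential waiting time
        if rng.random() * a0 < a_birth:     # pick a reaction proportionally
            x += 1
        else:
            x -= 1
        times.append(t)
        counts.append(x)
    return times, counts

times, counts = gillespie_birth_death()
```

    For these rates the stationary copy-number distribution is Poisson with mean k_on/k_off = 10, so a long trajectory fluctuates around that value.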

  9. Simulating biological processes: stochastic physics from whole cells to colonies

    NASA Astrophysics Data System (ADS)

    Earnest, Tyler M.; Cole, John A.; Luthey-Schulten, Zaida

    2018-05-01

    The last few decades have revealed the living cell to be a crowded spatially heterogeneous space teeming with biomolecules whose concentrations and activities are governed by intrinsically random forces. It is from this randomness, however, that a vast array of precisely timed and intricately coordinated biological functions emerge that give rise to the complex forms and behaviors we see in the biosphere around us. This seemingly paradoxical nature of life has drawn the interest of an increasing number of physicists, and recent years have seen stochastic modeling grow into a major subdiscipline within biological physics. Here we review some of the major advances that have shaped our understanding of stochasticity in biology. We begin with some historical context, outlining a string of important experimental results that motivated the development of stochastic modeling. We then embark upon a fairly rigorous treatment of the simulation methods that are currently available for the treatment of stochastic biological models, with an eye toward comparing and contrasting their realms of applicability, and the care that must be taken when parameterizing them. Following that, we describe how stochasticity impacts several key biological functions, including transcription, translation, ribosome biogenesis, chromosome replication, and metabolism, before considering how the functions may be coupled into a comprehensive model of a ‘minimal cell’. Finally, we close with our expectation for the future of the field, focusing on how mesoscopic stochastic methods may be augmented with atomic-scale molecular modeling approaches in order to understand life across a range of length and time scales.

  10. Integrating pixel- and polygon-based approaches to wildfire risk assessment: Application to a high-value watershed on the Pike and San Isabel National Forests, Colorado, USA

    Treesearch

    Matthew P. Thompson; Julie W. Gilbertson-Day; Joe H. Scott

    2015-01-01

    We develop a novel risk assessment approach that integrates complementary, yet distinct, spatial modeling approaches currently used in wildfire risk assessment. Motivation for this work stems largely from limitations of existing stochastic wildfire simulation systems, which can generate pixel-based outputs of fire behavior as well as polygon-based outputs of simulated...

  11. Stochastic weighted particle methods for population balance equations with coagulation, fragmentation and spatial inhomogeneity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Kok Foong; Patterson, Robert I.A.; Wagner, Wolfgang

    2015-12-15

    Highlights: •Problems concerning multi-compartment population balance equations are studied. •A class of fragmentation weight transfer functions is presented. •Three stochastic weighted algorithms are compared against the direct simulation algorithm. •The numerical errors of the stochastic solutions are assessed as a function of fragmentation rate. •The algorithms are applied to a multi-dimensional granulation model. Abstract: This paper introduces stochastic weighted particle algorithms for the solution of multi-compartment population balance equations. In particular, it presents a class of fragmentation weight transfer functions which are constructed such that the number of computational particles stays constant during fragmentation events. The weight transfer functions are constructed based on systems of weighted computational particles, and each of them leads to a stochastic particle algorithm for the numerical treatment of population balance equations. Besides fragmentation, the algorithms also consider physical processes such as coagulation and the exchange of mass with the surroundings. The numerical properties of the algorithms are compared to those of the direct simulation algorithm and an existing method for the fragmentation of weighted particles. It is found that the new algorithms show better numerical performance than the two existing methods, especially for systems with a significant number of large particles and high fragmentation rates.

  12. The stochastic runoff-runon process: Extending its analysis to a finite hillslope

    NASA Astrophysics Data System (ADS)

    Jones, O. D.; Lane, P. N. J.; Sheridan, G. J.

    2016-10-01

    The stochastic runoff-runon process models the volume of infiltration excess runoff from a hillslope via the overland flow path. Spatial variability is represented in the model by the spatial distribution of rainfall and infiltration, and their "correlation scale", that is, the scale at which the spatial correlation of rainfall and infiltration become negligible. Notably, the process can produce runoff even when the mean rainfall rate is less than the mean infiltration rate, and it displays a gradual increase in net runoff as the rainfall rate increases. In this paper we present a number of contributions to the analysis of the stochastic runoff-runon process. Firstly we illustrate the suitability of the process by fitting it to experimental data. Next we extend previous asymptotic analyses to include the cases where the mean rainfall rate equals or exceeds the mean infiltration rate, and then use Monte Carlo simulation to explore the range of parameters for which the asymptotic limit gives a good approximation on finite hillslopes. Finally we use this to obtain an equation for the mean net runoff, consistent with our asymptotic results but providing an excellent approximation for finite hillslopes. Our function uses a single parameter to capture spatial variability, and varying this parameter gives us a family of curves which interpolate between known upper and lower bounds for the mean net runoff.
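
    The runoff-runon recursion can be sketched as a Monte Carlo simulation over a chain of hillslope cells: each cell adds rainfall, subtracts infiltration capacity, and passes any excess downslope. The exponential distributions below are illustrative stand-ins for the paper's spatial rainfall and infiltration fields:

```python
import random

def net_runoff(n_cells=1000, mean_rain=5.0, mean_infil=8.0, seed=0):
    """Route infiltration-excess runoff down a hillslope of discrete cells.

    q is the run-on entering the current cell from upslope; any excess of
    (run-on + rainfall) over infiltration capacity continues downslope."""
    rng = random.Random(seed)
    q = 0.0
    for _ in range(n_cells):
        rain = rng.expovariate(1.0 / mean_rain)    # cell rainfall
        infil = rng.expovariate(1.0 / mean_infil)  # cell infiltration capacity
        q = max(0.0, q + rain - infil)
    return q
```

    Note that even though the mean rainfall (5) is below the mean infiltration capacity (8), a fair fraction of realizations still deliver positive runoff at the slope base, which is the behavior the abstract highlights.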

  13. A simple stochastic rainstorm generator for simulating spatially and temporally varying rainfall

    NASA Astrophysics Data System (ADS)

    Singer, M. B.; Michaelides, K.; Nichols, M.; Nearing, M. A.

    2016-12-01

    In semi-arid to arid drainage basins, rainstorms often control both water supply and flood risk to marginal communities of people. They also govern the availability of water to vegetation and other ecological communities, as well as spatial patterns of sediment, nutrient, and contaminant transport and deposition on local to basin scales. All of these landscape responses are sensitive to changes in climate that are projected to occur throughout western North America. Thus, it is important to improve characterization of rainstorms in a manner that enables statistical assessment of rainfall at spatial scales below that of existing gauging networks and the prediction of plausible manifestations of climate change. Here we present a simple, stochastic rainstorm generator that was created using data from a rich and dense network of rain gauges at the Walnut Gulch Experimental Watershed (WGEW) in SE Arizona, but which is applicable anywhere. We describe our methods for assembling pdfs of relevant rainstorm characteristics including total annual rainfall, storm area, storm center location, and storm duration. We also generate five fitted intensity-duration curves and apply a spatial rainfall gradient to generate precipitation at spatial scales below gauge spacing. The model then runs by Monte Carlo simulation in which a total annual rainfall is selected before we generate rainstorms until the annual precipitation total is reached. The procedure continues for decadal simulations. Thus, we keep track of the hydrologic impact of individual storms and the integral of precipitation over multiple decades. We first test the model using ensemble predictions until we reach statistical similarity to the input data from WGEW. 
    We then employ the model to assess decadal precipitation under simulations of climate change in which we separately vary the distribution of total annual rainfall (a trend in moisture) and the intensity-duration curves used for simulation (trends in storminess). We demonstrate the model output through spatial maps of rainfall and through statistical comparisons of relevant parameters and distributions. Finally, we discuss how the model can be used to understand basin-scale hydrology in terms of soil moisture, runoff, and erosion.
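
    The storm-generation loop described above (draw an annual total, then generate storms until that total is met) can be sketched as follows. All distributions and parameter values are illustrative placeholders, not the pdfs fitted to the WGEW gauge network:

```python
import random

def simulate_year(annual_total_mean=300.0, storm_depth_mean=10.0, seed=0):
    """Draw an annual rainfall total, then generate storms until it is met.

    Each storm gets a depth, a duration, and a storm-center location;
    all distributions here are illustrative stand-ins."""
    rng = random.Random(seed)
    annual_total = max(rng.gauss(annual_total_mean, 50.0), 0.0)  # mm
    storms, accumulated = [], 0.0
    while accumulated < annual_total:
        depth = rng.expovariate(1.0 / storm_depth_mean)    # storm total (mm)
        duration = rng.uniform(0.25, 6.0)                  # hours
        center = (rng.uniform(0, 20), rng.uniform(0, 20))  # storm center (km)
        storms.append({"depth": depth, "duration": duration, "center": center})
        accumulated += depth
    return storms

year = simulate_year(seed=42)
```

    Repeating this year loop for several decades, as in the paper, yields both the storm-by-storm record and the long-term precipitation integral.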

  14. A stochastic spatiotemporal model of a response-regulator network in the Caulobacter crescentus cell cycle

    NASA Astrophysics Data System (ADS)

    Li, Fei; Subramanian, Kartik; Chen, Minghan; Tyson, John J.; Cao, Yang

    2016-06-01

    The asymmetric cell division cycle in Caulobacter crescentus is controlled by an elaborate molecular mechanism governing the production, activation and spatial localization of a host of interacting proteins. In previous work, we proposed a deterministic mathematical model for the spatiotemporal dynamics of six major regulatory proteins. In this paper, we study a stochastic version of the model, which takes into account molecular fluctuations of these regulatory proteins in space and time during early stages of the cell cycle of wild-type Caulobacter cells. We test the stochastic model against experimental observations of increased variability of cycle time in cells depleted of the divJ gene product. The deterministic model predicts that overexpression of the divK gene blocks cell cycle progression in the stalked stage; however, stochastic simulations suggest that a small fraction of the mutant cells complete the cell cycle normally.

  15. Relativistic analysis of stochastic kinematics

    NASA Astrophysics Data System (ADS)

    Giona, Massimiliano

    2017-10-01

    The relativistic analysis of stochastic kinematics is developed in order to determine the transformation of the effective diffusivity tensor in inertial frames. Poisson-Kac stochastic processes are initially considered. For one-dimensional spatial models, the effective diffusion coefficient measured in a frame Σ moving with velocity w with respect to the rest frame of the stochastic process is inversely proportional to the third power of the Lorentz factor γ(w) = (1 − w²/c²)^(−1/2). Subsequently, higher-dimensional processes are analyzed and it is shown that the diffusivity tensor in a moving frame becomes nonisotropic: the diffusivities parallel and orthogonal to the velocity of the moving frame scale differently with respect to γ(w). The analysis of discrete space-time diffusion processes permits one to obtain a general transformation theory of the tensor diffusivity, confirmed by several different simulation experiments. Several implications of the theory are also addressed and discussed.
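
    The one-dimensional scaling stated in the abstract, together with a plausible reading of the higher-dimensional anisotropy (transverse scaling from time dilation alone; parallel scaling with an extra factor of length contraction squared; our summary, not a quotation from the paper), can be written as:

```latex
\gamma(w) = \left(1 - \frac{w^{2}}{c^{2}}\right)^{-1/2}, \qquad
D'_{\parallel}(w) = \frac{D}{\gamma^{3}(w)}, \qquad
D'_{\perp}(w) = \frac{D}{\gamma(w)}
```

    Only the parallel (one-dimensional) result and the definition of γ(w) are stated explicitly in the abstract; the transverse formula is our hedged reconstruction.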

  16. The adaptation rate of a quantitative trait in an environmental gradient

    NASA Astrophysics Data System (ADS)

    Hermsen, R.

    2016-12-01

    The spatial range of a species habitat is generally determined by the ability of the species to cope with biotic and abiotic variables that vary in space. Therefore, the species range is itself an evolvable property. Indeed, environmental gradients permit a mode of evolution in which range expansion and adaptation go hand in hand. This process can contribute to rapid evolution of drug resistant bacteria and viruses, because drug concentrations in humans and livestock treated with antibiotics are far from uniform. Here, we use a minimal stochastic model of discrete, interacting organisms evolving in continuous space to study how the rate of adaptation of a quantitative trait depends on the steepness of the gradient and various population parameters. We discuss analytical results for the mean-field limit as well as extensive stochastic simulations. These simulations were performed using an exact, event-driven simulation scheme that can deal with continuous time-, density- and coordinate-dependent reaction rates and could be used for a wide variety of stochastic systems. The results reveal two qualitative regimes. If the gradient is shallow, the rate of adaptation is limited by dispersion and increases linearly with the gradient slope. If the gradient is steep, the adaptation rate is limited by mutation. In this regime, the mean-field result is highly misleading: it predicts that the adaptation rate continues to increase with the gradient slope, whereas stochastic simulations show that it in fact decreases with the square root of the slope. This discrepancy underscores the importance of discreteness and stochasticity even at high population densities; mean-field results, including those routinely used in quantitative genetics, should be interpreted with care.

  17. The adaptation rate of a quantitative trait in an environmental gradient.

    PubMed

    Hermsen, R

    2016-11-30

    The spatial range of a species habitat is generally determined by the ability of the species to cope with biotic and abiotic variables that vary in space. Therefore, the species range is itself an evolvable property. Indeed, environmental gradients permit a mode of evolution in which range expansion and adaptation go hand in hand. This process can contribute to rapid evolution of drug resistant bacteria and viruses, because drug concentrations in humans and livestock treated with antibiotics are far from uniform. Here, we use a minimal stochastic model of discrete, interacting organisms evolving in continuous space to study how the rate of adaptation of a quantitative trait depends on the steepness of the gradient and various population parameters. We discuss analytical results for the mean-field limit as well as extensive stochastic simulations. These simulations were performed using an exact, event-driven simulation scheme that can deal with continuous time-, density- and coordinate-dependent reaction rates and could be used for a wide variety of stochastic systems. The results reveal two qualitative regimes. If the gradient is shallow, the rate of adaptation is limited by dispersion and increases linearly with the gradient slope. If the gradient is steep, the adaptation rate is limited by mutation. In this regime, the mean-field result is highly misleading: it predicts that the adaptation rate continues to increase with the gradient slope, whereas stochastic simulations show that it in fact decreases with the square root of the slope. This discrepancy underscores the importance of discreteness and stochasticity even at high population densities; mean-field results, including those routinely used in quantitative genetics, should be interpreted with care.
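
    Exact, event-driven simulation with continuously time-dependent rates, as mentioned in the abstract, is commonly realized by Lewis-Shedler thinning. The sketch below is a generic illustration of that technique, not the authors' implementation:

```python
import random

def next_event_time(rate, rate_max, t0, rng):
    """Sample the next event time of an inhomogeneous Poisson process with
    time-dependent intensity rate(t) bounded above by rate_max.

    Thinning: propose candidate times from a homogeneous process at
    rate_max, and accept each candidate with probability rate(t)/rate_max."""
    t = t0
    while True:
        t += rng.expovariate(rate_max)
        if rng.random() < rate(t) / rate_max:
            return t

rng = random.Random(3)
# a rate that decays in time, bounded above by 2.0
t_next = next_event_time(lambda s: 2.0 / (1.0 + s), 2.0, 0.0, rng)
```

    The same acceptance step works when the rate also depends on local density or coordinates, which is what makes this scheme suitable for spatially explicit individual-based models.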

  18. Use of artificial landscapes to isolate controls on burn probability

    Treesearch

    Marc-Andre Parisien; Carol Miller; Alan A. Ager; Mark A. Finney

    2010-01-01

    Techniques for modeling burn probability (BP) combine the stochastic components of fire regimes (ignitions and weather) with sophisticated fire growth algorithms to produce high-resolution spatial estimates of the relative likelihood of burning. Despite the numerous investigations of fire patterns from either observed or simulated sources, the specific influence of...

  19. Using neutral models to identify constraints on low-severity fire regimes.

    Treesearch

    Donald McKenzie; Amy E. Hessl; Lara-Karena B. Kellogg

    2006-01-01

    Climate, topography, fuel loadings, and human activities all affect spatial and temporal patterns of fire occurrence. Because fire is modeled as a stochastic process, for which each fire history is only one realization, a simulation approach is necessary to understand baseline variability, thereby identifying constraints, or forcing functions, that affect fire regimes...

  20. State-and-transition simulation models: a framework for forecasting landscape change

    USGS Publications Warehouse

    Daniel, Colin; Frid, Leonardo; Sleeter, Benjamin M.; Fortin, Marie-Josée

    2016-01-01

    Summary: A wide range of spatially explicit simulation models have been developed to forecast landscape dynamics, including models for projecting changes in both vegetation and land use. While these models have generally been developed as separate applications, each with a separate purpose and audience, they share many common features. We present a general framework, called a state-and-transition simulation model (STSM), which captures a number of these common features, accompanied by a software product, called ST-Sim, to build and run such models. The STSM method divides a landscape into a set of discrete spatial units and simulates the discrete state of each cell forward as a discrete-time inhomogeneous stochastic process. The method differs from a spatially interacting Markov chain in several important ways, including the ability to add discrete counters such as age and time-since-transition as state variables, to specify one-step transition rates as either probabilities or target areas, and to represent multiple types of transitions between pairs of states. We demonstrate the STSM method using a model of land-use/land-cover (LULC) change for the state of Hawai'i, USA. Processes represented in this example include expansion/contraction of agricultural lands, urbanization, wildfire, shrub encroachment into grassland and harvest of tree plantations; the model also projects shifts in moisture zones due to climate change. Key model output includes projections of the future spatial and temporal distribution of LULC classes and moisture zones across the landscape over the next 50 years. State-and-transition simulation models can be applied to a wide range of landscapes, including questions of both land-use change and vegetation dynamics. Because the method is inherently stochastic, it is well suited for characterizing uncertainty in model projections. When combined with the ST-Sim software, STSMs offer a simple yet powerful means for developing a wide range of models of landscape dynamics.
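
    A minimal state-and-transition simulation in the spirit of the STSM method (discrete cells, stochastic transitions, an age counter reset on transition) might look as follows; the two states and the probabilities are invented for illustration and are not from the Hawai'i application:

```python
import random

def run_stsm(n_cells=100, n_steps=50, p_burn=0.02, p_regrow=0.10, seed=0):
    """Minimal state-and-transition simulation: each cell is 'forest' or
    'grass'; fire converts forest to grass, regrowth converts back.
    Each cell carries an age counter that resets on transition."""
    rng = random.Random(seed)
    state = ["forest"] * n_cells
    age = [0] * n_cells
    for _ in range(n_steps):
        for i in range(n_cells):
            if state[i] == "forest" and rng.random() < p_burn:
                state[i], age[i] = "grass", 0
            elif state[i] == "grass" and rng.random() < p_regrow:
                state[i], age[i] = "forest", 0
            else:
                age[i] += 1
    return state, age

state, age = run_stsm()
```

    With these probabilities the long-run expected forest fraction is p_regrow / (p_burn + p_regrow) ≈ 0.83; repeated stochastic runs around that mean are what make the method useful for characterizing projection uncertainty.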

  1. A hybrid algorithm for coupling partial differential equation and compartment-based dynamics.

    PubMed

    Harrison, Jonathan U; Yates, Christian A

    2016-09-01

    Stochastic simulation methods can be applied successfully to model exact spatio-temporally resolved reaction-diffusion systems. However, in many cases, these methods can quickly become extremely computationally intensive with increasing particle numbers. An alternative description of many of these systems can be derived in the diffusive limit as a deterministic, continuum system of partial differential equations (PDEs). Although the numerical solution of such PDEs is, in general, much more efficient than the full stochastic simulation, the deterministic continuum description is generally not valid when copy numbers are low and stochastic effects dominate. Therefore, to take advantage of the benefits of both of these types of models, each of which may be appropriate in different parts of a spatial domain, we have developed an algorithm that can be used to couple these two types of model together. This hybrid coupling algorithm uses an overlap region between the two modelling regimes. By coupling fluxes at one end of the interface and using a concentration-matching condition at the other end, we ensure that mass is appropriately transferred between PDE- and compartment-based regimes. Our methodology gives notable reductions in simulation time in comparison with using a fully stochastic model, while maintaining the important stochastic features of the system and providing detail in appropriate areas of the domain. We test our hybrid methodology robustly by applying it to several biologically motivated problems including diffusion and morphogen gradient formation. Our analysis shows that the resulting error is small, unbiased and does not grow over time. © 2016 The Authors.
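
    The coupling idea can be illustrated with a deliberately simplified 1D sketch: a continuum region updated by explicit finite differences, a particle region updated by random-walk jumps, and conservative mass transfer at a single interface. This is our schematic, not the authors' overlap-region algorithm with its flux and concentration-matching conditions:

```python
import random

def hybrid_diffusion(n_pde=20, n_comp=20, d=1.0, dt=0.1, n_steps=200, seed=0):
    """Schematic 1D hybrid diffusion: a continuum (PDE) region coupled to a
    compartment (particle) region at an interface, transferring mass
    conservatively (a fractional buffer converts continuum mass into
    whole particles)."""
    rng = random.Random(seed)
    u = [5.0] * n_pde          # continuum mass per cell (dx = 1)
    comp = [0] * n_comp        # integer particle counts
    k = d * dt                 # jump/diffusion number (k <= 0.5 for stability)
    pending = 0.0              # continuum mass queued for particle conversion
    for _ in range(n_steps):
        # 1. explicit diffusion in the PDE region (zero-flux outer boundaries)
        new_u = u[:]
        for i in range(n_pde):
            left = u[i - 1] if i > 0 else u[i]
            right = u[i + 1] if i < n_pde - 1 else u[i]
            new_u[i] += k * (left - 2.0 * u[i] + right)
        u = new_u
        # 2. particle random walk in the compartment region
        new_comp = comp[:]
        to_pde = 0
        for j in range(n_comp):
            for _ in range(comp[j]):
                r = rng.random()
                if r < k:                              # jump left
                    if j == 0:
                        new_comp[0] -= 1               # crosses the interface
                        to_pde += 1
                    else:
                        new_comp[j] -= 1
                        new_comp[j - 1] += 1
                elif r < 2 * k and j < n_comp - 1:     # jump right
                    new_comp[j] -= 1
                    new_comp[j + 1] += 1
        comp = new_comp
        u[-1] += float(to_pde)         # particles become continuum mass
        # 3. continuum mass flowing into the compartment region
        flux = k * u[-1]
        u[-1] -= flux
        pending += flux
        emit = int(pending)            # emit whole particles only
        pending -= emit
        comp[0] += emit
    return u, comp, pending

u, comp, pending = hybrid_diffusion()
```

    The point of the sketch is the bookkeeping: every exchange is applied to both sides, so total mass (continuum + particles + pending buffer) is conserved across regimes, which is the property any such coupling must preserve.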

  2. A hybrid algorithm for coupling partial differential equation and compartment-based dynamics

    PubMed Central

    Yates, Christian A.

    2016-01-01

    Stochastic simulation methods can be applied successfully to model exact spatio-temporally resolved reaction–diffusion systems. However, in many cases, these methods can quickly become extremely computationally intensive with increasing particle numbers. An alternative description of many of these systems can be derived in the diffusive limit as a deterministic, continuum system of partial differential equations (PDEs). Although the numerical solution of such PDEs is, in general, much more efficient than the full stochastic simulation, the deterministic continuum description is generally not valid when copy numbers are low and stochastic effects dominate. Therefore, to take advantage of the benefits of both of these types of models, each of which may be appropriate in different parts of a spatial domain, we have developed an algorithm that can be used to couple these two types of model together. This hybrid coupling algorithm uses an overlap region between the two modelling regimes. By coupling fluxes at one end of the interface and using a concentration-matching condition at the other end, we ensure that mass is appropriately transferred between PDE- and compartment-based regimes. Our methodology gives notable reductions in simulation time in comparison with using a fully stochastic model, while maintaining the important stochastic features of the system and providing detail in appropriate areas of the domain. We test our hybrid methodology robustly by applying it to several biologically motivated problems including diffusion and morphogen gradient formation. Our analysis shows that the resulting error is small, unbiased and does not grow over time. PMID:27628171

  3. A spatial stochastic programming model for timber and core area management under risk of stand-replacing fire

    Treesearch

    Dung Tuan Nguyen

    2012-01-01

    Forest harvest scheduling has been modeled using deterministic and stochastic programming models. Past models seldom address explicit spatial forest management concerns under the influence of natural disturbances. In this research study, we employ multistage full recourse stochastic programming models to explore the challenges and advantages of building spatial...

  4. A spatial stochastic programming model for timber and core area management under risk of fires

    Treesearch

    Yu Wei; Michael Bevers; Dung Nguyen; Erin Belval

    2014-01-01

    Previous stochastic models in harvest scheduling seldom address explicit spatial management concerns under the influence of natural disturbances. We employ multistage stochastic programming models to explore the challenges and advantages of building spatial optimization models that account for the influences of random stand-replacing fires. Our exploratory test models...

  5. Effects of intrinsic stochasticity on delayed reaction-diffusion patterning systems.

    PubMed

    Woolley, Thomas E; Baker, Ruth E; Gaffney, Eamonn A; Maini, Philip K; Seirin-Lee, Sungrim

    2012-05-01

    Cellular gene expression is a complex process involving many steps, including the transcription of DNA and translation of mRNA; hence the synthesis of proteins requires a considerable amount of time, from ten minutes to several hours. Since diffusion-driven instability has been observed to be sensitive to perturbations in kinetic delays, the application of Turing patterning mechanisms to the problem of producing spatially heterogeneous differential gene expression has been questioned. In deterministic systems a small delay in the reactions can cause a large increase in the time it takes a system to pattern. Recently, it has been observed that in undelayed systems intrinsic stochasticity can cause pattern initiation to occur earlier than in the analogous deterministic simulations. Here we are interested in adding both stochasticity and delays to Turing systems in order to assess whether stochasticity can reduce the patterning time scale in delayed Turing systems. As analytical insights to this problem are difficult to attain and often limited in their use, we focus on stochastically simulating delayed systems. We consider four different Turing systems and two different forms of delay. Our results are mixed and lead to the conclusion that, although the sensitivity to delays in the Turing mechanism is not completely removed by the addition of intrinsic noise, the effects of the delays are clearly ameliorated in certain specific cases.
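
    A delayed stochastic simulation of the kind used for such systems can be sketched with a birth-death process in which each initiated birth completes only after a fixed delay tau, mimicking slow transcription/translation. This is a minimal delayed-SSA variant with illustrative parameters, not one of the paper's Turing systems:

```python
import heapq
import random

def delayed_birth_death(k_birth=5.0, k_death=0.5, tau=2.0, t_end=50.0, seed=0):
    """Stochastic birth-death where a birth initiated at time t completes at
    t + tau. Scheduled completions are kept in a min-heap; if the next
    completion precedes the next reaction, it fires first."""
    rng = random.Random(seed)
    t, x = 0.0, 0
    finishing = []                         # scheduled completion times
    while t < t_end:
        a0 = k_birth + k_death * x
        dt = rng.expovariate(a0)
        if finishing and finishing[0] <= t + dt:
            t = heapq.heappop(finishing)   # a delayed birth completes first
            x += 1
            continue                       # redraw (memoryless waiting time)
        t += dt
        if rng.random() * a0 < k_birth:
            heapq.heappush(finishing, t + tau)  # initiate a delayed birth
        else:
            x -= 1
    return x

final_count = delayed_birth_death()
```

    The fixed delay shifts the transient but leaves the stationary mean at k_birth / k_death = 10, so ensembles of runs fluctuate around that value.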

  6. Transition probability-based stochastic geological modeling using airborne geophysical data and borehole data

    NASA Astrophysics Data System (ADS)

    He, Xin; Koch, Julian; Sonnenborg, Torben O.; Jørgensen, Flemming; Schamper, Cyril; Christian Refsgaard, Jens

    2014-04-01

    Geological heterogeneity is a very important factor to consider when developing geological models for hydrological purposes. Using statistically based stochastic geological simulations, the spatial heterogeneity in such models can be accounted for. However, various types of uncertainties are associated with both the geostatistical method and the observation data. In the present study, TProGS is used as the geostatistical modeling tool to simulate structural heterogeneity for glacial deposits in a head water catchment in Denmark. The focus is on how the observation data uncertainty can be incorporated in the stochastic simulation process. The study uses two types of observation data: borehole data and airborne geophysical data. It is commonly acknowledged that the density of the borehole data is usually too sparse to characterize the horizontal heterogeneity. The use of geophysical data gives an unprecedented opportunity to obtain high-resolution information and thus to identify geostatistical properties more accurately, especially in the horizontal direction. However, since such data are not a direct measurement of the lithology, larger uncertainty of point estimates can be expected as compared to the use of borehole data. We have proposed a histogram probability matching method in order to link the information on resistivity to hydrofacies, while considering the data uncertainty at the same time. Transition probabilities and Markov chain models are established using the transformed geophysical data. It is shown that such a transformation is in fact practical; however, the cutoff value for dividing the resistivity data into facies is difficult to determine. The simulated geological realizations indicate significant differences of spatial structure depending on the type of conditioning data selected.
It is to our knowledge the first time that grid-to-grid airborne geophysical data including the data uncertainty are used in conditional geostatistical simulations in TProGS. Therefore, it provides valuable insights regarding the advantages and challenges of using such comprehensive data.
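
    Transition-probability-based simulation of the kind TProGS performs can be illustrated in one dimension with a Markov chain over facies states; the two-facies transition matrix below is invented for illustration, not derived from the Danish borehole or geophysical data:

```python
import random

def simulate_facies(trans, n_cells=1000, start="sand", seed=0):
    """1D Markov-chain facies simulation from a row-stochastic transition
    probability matrix (dict of dicts), in the spirit of
    transition-probability geostatistics."""
    rng = random.Random(seed)
    states = list(trans)
    seq = [start]
    for _ in range(n_cells - 1):
        probs = trans[seq[-1]]
        r, acc = rng.random(), 0.0
        for s in states:
            acc += probs[s]
            if r < acc:
                seq.append(s)
                break
        else:                      # guard against float round-off
            seq.append(states[-1])
    return seq

trans = {
    "sand": {"sand": 0.90, "clay": 0.10},   # mean sand run length = 10 cells
    "clay": {"sand": 0.30, "clay": 0.70},
}
seq = simulate_facies(trans)
```

    The diagonal entries control mean facies run lengths, and the stationary proportions follow from the off-diagonal rates (here sand occupies about 0.30 / (0.10 + 0.30) = 75% of cells), which is how measured transition probabilities encode both abundance and spatial structure.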

  7. Conditional Stochastic Models in Reduced Space: Towards Efficient Simulation of Tropical Cyclone Precipitation Patterns

    NASA Astrophysics Data System (ADS)

    Dodov, B.

    2017-12-01

    Stochastic simulation of realistic and statistically robust patterns of Tropical Cyclone (TC) induced precipitation is a challenging task. It is even more challenging in a catastrophe modeling context, where tens of thousands of typhoon seasons need to be simulated in order to provide a complete view of flood risk. Ultimately, one could run a coupled global climate model and regional Numerical Weather Prediction (NWP) model, but this approach is not feasible in the catastrophe modeling context and, most importantly, may not provide TC track patterns consistent with observations. Rather, we propose to leverage NWP output for the observed TC precipitation patterns (in terms of downscaled reanalysis 1979-2015) collected on a Lagrangian frame along the historical TC tracks and reduced to the leading spatial principal components of the data. The reduced data from all TCs is then grouped according to timing, storm evolution stage (developing, mature, dissipating, ETC transitioning) and central pressure and used to build a dictionary of stationary (within a group) and non-stationary (for transitions between groups) covariance models. Provided that the stochastic storm tracks with all the parameters describing the TC evolution are already simulated, a sequence of conditional samples from the covariance models chosen according to the TC characteristics at a given moment in time are concatenated, producing a continuous non-stationary precipitation pattern in a Lagrangian framework. The simulated precipitation for each event is finally distributed along the stochastic TC track and blended with a non-TC background precipitation using a data assimilation technique. The proposed framework provides a means of efficient simulation (10000 seasons simulated in a couple of days) and robust typhoon precipitation patterns consistent with observed regional climate and visually indistinguishable from high-resolution NWP output.
The framework is used to simulate a catalog of 10000 typhoon seasons implemented in a flood risk model for Japan.
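
    Simulation in a reduced space of leading principal components, as proposed above, can be sketched by factorizing a covariance model and sampling only its leading modes. This is an illustrative stand-in (NumPy, a 1D exponential covariance), not the catastrophe-model implementation:

```python
import numpy as np

def sample_in_reduced_space(cov, n_modes=5, n_samples=3, seed=0):
    """Draw random fields from a covariance model using only its leading
    eigenmodes: field = V_r * sqrt(L_r) * z with z standard normal."""
    rng = np.random.default_rng(seed)
    vals, vecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
    vals, vecs = vals[::-1], vecs[:, ::-1]     # put leading modes first
    vals = np.clip(vals[:n_modes], 0.0, None)  # guard tiny negative round-off
    z = rng.standard_normal((n_modes, n_samples))
    return vecs[:, :n_modes] @ (np.sqrt(vals)[:, None] * z)

n = 50
x = np.arange(n)
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / 10.0)  # exponential covariance
fields = sample_in_reduced_space(cov)
```

    Truncating to a handful of modes is what makes generating tens of thousands of seasons tractable: each sample costs only a small matrix-vector product instead of a full-rank factorization.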

  8. Population dynamics, information transfer, and spatial organization in a chemical reaction network under spatial confinement and crowding conditions

    NASA Astrophysics Data System (ADS)

    Bellesia, Giovanni; Bales, Benjamin B.

    2016-10-01

    We investigate, via Brownian dynamics simulations, the reaction dynamics of a generic, nonlinear chemical network under spatial confinement and crowding conditions. In detail, the Willamowski-Rossler chemical reaction system has been "extended" and considered as a prototype reaction-diffusion system. Our results are potentially relevant to a number of open problems in biophysics and biochemistry, such as the synthesis of primitive cellular units (protocells) and the definition of their role in the chemical origin of life and the characterization of vesicle-mediated drug delivery processes. More generally, the computational approach presented in this work makes the case for the use of spatial stochastic simulation methods for the study of biochemical networks in vivo, where the "well-mixed" approximation is invalid and both thermal fluctuations and the intrinsic fluctuations linked to molecular species present in low copy numbers cannot be averaged out.
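
    A minimal Brownian dynamics loop with a reflective confining boundary (a disc here, rather than a vesicle, and with no reactions) gives the flavor of such simulations; all parameter values are arbitrary:

```python
import math
import random

def brownian_confined(n_particles=50, radius=1.0, d=0.01, dt=1e-3,
                      n_steps=1000, seed=0):
    """Brownian dynamics of point particles confined to a disc of the given
    radius. Each step adds Gaussian displacements with variance 2*D*dt per
    coordinate; excursions past the boundary are reflected radially."""
    rng = random.Random(seed)
    sigma = math.sqrt(2.0 * d * dt)
    pts = [(0.0, 0.0)] * n_particles
    for _ in range(n_steps):
        new = []
        for (x, y) in pts:
            x += rng.gauss(0.0, sigma)
            y += rng.gauss(0.0, sigma)
            r = math.hypot(x, y)
            if r > radius:                    # reflect back inside the disc
                x, y = x * (2 * radius - r) / r, y * (2 * radius - r) / r
            new.append((x, y))
        pts = new
    return pts

pts = brownian_confined()
```

    Adding crowder particles with excluded-volume interactions and reaction rules on particle encounters turns this loop into the kind of confined, crowded reaction-diffusion simulation the abstract describes.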

  9. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries.

    PubMed

    Drawert, Brian; Engblom, Stefan; Hellander, Andreas

    2012-06-22

    Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be computationally efficient, employing sophisticated algorithms, yet at the same time flexible enough to meet present and future needs of increasingly complex biological modeling. We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the reaction-diffusion master equation formalism to model the processes under study. An interface to mature external geometry- and mesh-handling software (Comsol Multiphysics) provides a stable and interactive environment for model construction. The core simulation routines are logically separated from the model-building interface and written in a low-level language for computational efficiency. The connection to the geometry-handling software is realized via a Matlab interface, which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much like an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines. 
Since the overall design effectively hides the complexity of managing geometries and meshes, newly developed methods can be tested in a realistic setting at an early stage of development. In this paper we demonstrate, in a series of examples with high relevance to the molecular systems biology community, that the proposed software framework is a useful tool for both practitioners and developers of spatial stochastic simulation algorithms. Through the combined efforts of algorithm development and improved modeling accuracy, increasingly complex biological models become feasible to study through computational methods. URDME is freely available at http://www.urdme.org.
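The reaction-diffusion master equation formalism that URDME builds on can be sketched in a few lines. This is not URDME's API (that is Matlab/C); it is a minimal, hypothetical pure-diffusion example on a 1-D chain of voxels, sampled with the Gillespie direct method, where the propensity for a hop out of voxel i is d times the particle count n_i per available neighbour.

```python
import random

def simulate_rdme_diffusion(n_voxels=10, n_particles=100, d=1.0, t_end=5.0, seed=1):
    """Pure-diffusion RDME sketch: particles hop between adjacent voxels,
    with hop events sampled by the Gillespie direct method."""
    rng = random.Random(seed)
    counts = [0] * n_voxels
    counts[n_voxels // 2] = n_particles   # all particles start in the middle voxel
    t = 0.0
    while True:
        # Propensity for a hop out of voxel i: d * n_i * (number of neighbours)
        props = [d * n * (2 if 0 < i < n_voxels - 1 else 1)
                 for i, n in enumerate(counts)]
        total = sum(props)
        if total == 0.0:
            break
        t += rng.expovariate(total)       # exponential waiting time to next event
        if t > t_end:
            break
        # Choose the source voxel with probability proportional to its propensity
        r = rng.uniform(0.0, total)
        acc = 0.0
        for i, p in enumerate(props):
            acc += p
            if r <= acc:
                break
        if counts[i] == 0:                # guard against float round-off
            continue
        # Hop to a uniformly chosen available neighbour
        j = rng.choice([v for v in (i - 1, i + 1) if 0 <= v < n_voxels])
        counts[i] -= 1
        counts[j] += 1
    return counts
```

Reactions would enter the same loop as additional propensity channels per voxel; URDME's contribution is doing this efficiently on unstructured meshes with geometry-dependent jump rates.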

  10. Detecting seasonal variations of soil parameters via field measurements and stochastic simulations in the hillslope

    NASA Astrophysics Data System (ADS)

    Noh, Seong Jin; An, Hyunuk; Kim, Sanghyun

    2015-04-01

    Soil moisture, a critical factor in hydrologic systems, plays a key role in synthesizing interactions among soil, climate, hydrological response, solute transport and ecosystem dynamics. The spatial and temporal distribution of soil moisture at a hillslope scale is essential for understanding hillslope runoff generation processes. In this study, we implement Monte Carlo simulations at the hillslope scale using a three-dimensional surface-subsurface integrated model (3D model). Numerical simulations are compared with soil-moisture measurements made with TDR (Mini_TRASE) at 22 locations and two or three depths over a full year at a hillslope (area: 2100 square meters) located in the Bongsunsa Watershed, South Korea. In the Monte Carlo stochastic simulations, uncertainties in the soil parameters and input forcing are considered, and model ensembles showing good performance are selected separately for several seasonal periods. The presentation will focus on the characterization of seasonal variations of model parameters based on simulations combined with field measurements. In addition, structural limitations of the contemporary modeling method will be discussed.

  11. Assessing the expected effects of wildfire on vegetation condition on the Bridger-Teton National Forest, Wyoming, USA

    Treesearch

    J. H. Scott; D. J. Helmbrecht; M. P. Thompson

    2014-01-01

    Characterizing wildfire risk to a fire-adapted ecosystem presents particular challenges due to its broad spatial extent, inherent complexity, and the difficulty in defining wildfire-induced losses and benefits. Our approach couples stochastic wildfire simulation with a vegetation condition assessment framework to estimate the conditional and expected response of...

  12. Stochastic Analysis and Probabilistic Downscaling of Soil Moisture

    NASA Astrophysics Data System (ADS)

    Deshon, J. P.; Niemann, J. D.; Green, T. R.; Jones, A. S.

    2017-12-01

    Soil moisture is a key variable for rainfall-runoff response estimation, ecological and biogeochemical flux estimation, and biodiversity characterization, each of which is useful for watershed condition assessment. These applications require not only accurate, fine-resolution soil-moisture estimates but also confidence limits on those estimates and soil-moisture patterns that exhibit realistic statistical properties (e.g., variance and spatial correlation structure). The Equilibrium Moisture from Topography, Vegetation, and Soil (EMT+VS) model downscales coarse-resolution (9-40 km) soil moisture from satellite remote sensing or land-surface models to produce fine-resolution (10-30 m) estimates. The model was designed to produce accurate deterministic soil-moisture estimates at multiple points, but the resulting patterns do not reproduce the variance or spatial correlation of observed soil-moisture patterns. The primary objective of this research is to generalize the EMT+VS model to produce a probability density function (pdf) for soil moisture at each fine-resolution location and time. Each pdf has a mean that is equal to the deterministic soil-moisture estimate, and the pdf can be used to quantify the uncertainty in the soil-moisture estimates and to simulate soil-moisture patterns. Different versions of the generalized model are hypothesized based on how uncertainty enters the model, whether the uncertainty is additive or multiplicative, and which distributions describe the uncertainty. These versions are then tested by application to four catchments with detailed soil-moisture observations (Tarrawarra, Satellite Station, Cache la Poudre, and Nerrigundah). The performance of the generalized models is evaluated by comparing the statistical properties of the simulated soil-moisture patterns to those of the observations and the deterministic EMT+VS model. 
The versions of the generalized EMT+VS model with normally distributed stochastic components produce soil-moisture patterns with more realistic statistical properties than the deterministic model. Additionally, the results suggest that the variance and spatial correlation of the stochastic soil-moisture variations do not vary consistently with the spatial-average soil moisture.
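The core probabilistic-downscaling idea can be sketched independently of the EMT+VS model itself. This is a hedged toy example, not the authors' code: each deterministic fine-resolution estimate becomes the mean of a normal pdf with a hypothetical additive standard deviation `sigma`, from which realizations are drawn.

```python
import random, statistics

def stochastic_downscale(det_estimates, sigma=0.03, n_real=500, seed=7):
    """Wrap each deterministic fine-resolution soil-moisture estimate in a
    normal pdf centred on that estimate, then draw realizations that carry
    the estimate's uncertainty."""
    rng = random.Random(seed)
    realizations = []
    for _ in range(n_real):
        # Clamp to the physical range [0, 1] for volumetric soil moisture
        realizations.append([min(1.0, max(0.0, rng.gauss(m, sigma)))
                             for m in det_estimates])
    return realizations

dets = [0.22, 0.25, 0.31, 0.28]          # hypothetical deterministic estimates
ens = stochastic_downscale(dets)
# The ensemble mean at each location should sit near the deterministic value,
# matching the paper's requirement that each pdf has the deterministic mean
means = [statistics.mean(r[i] for r in ens) for i in range(len(dets))]
```

A multiplicative-error version, one of the alternatives the abstract mentions, would instead draw `m * rng.lognormvariate(0, s)`; the point of the paper is testing which error structure reproduces the observed variance and spatial correlation.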

  13. Spatial scaling patterns and functional redundancies in a changing boreal lake landscape

    USGS Publications Warehouse

    Angeler, David G.; Allen, Craig R.; Uden, Daniel R.; Johnson, Richard K.

    2015-01-01

    Global transformations extend beyond local habitats; therefore, larger-scale approaches are needed to assess community-level responses and resilience to unfolding environmental changes. Using long-term data (1996–2011), we evaluated spatial patterns and functional redundancies in the littoral invertebrate communities of 85 Swedish lakes, with the objective of assessing their potential resilience to environmental change at regional scales (that is, spatial resilience). Multivariate spatial modeling was used to differentiate groups of invertebrate species exhibiting spatial patterns in composition and abundance (that is, deterministic species) from those lacking spatial patterns (that is, stochastic species). We then determined the functional feeding attributes of the deterministic and stochastic invertebrate species, to infer resilience. Between one and three distinct spatial patterns in invertebrate composition and abundance were identified in approximately one-third of the species; the remainder were stochastic. We observed substantial differences in metrics between deterministic and stochastic species. Functional richness and diversity decreased over time in the deterministic group, suggesting a loss of resilience in regional invertebrate communities. However, taxon richness and redundancy increased monotonically in the stochastic group, indicating the capacity of regional invertebrate communities to adapt to change. Our results suggest that a refined picture of spatial resilience emerges if patterns of both the deterministic and stochastic species are accounted for. Spatially extensive monitoring may help increase our mechanistic understanding of community-level responses and resilience to regional environmental change, insights that are critical for developing management and conservation agendas in this current period of rapid environmental transformation.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jager, Yetta; Efroymson, Rebecca Ann; Sublette, K.

    Quantitative tools are needed to evaluate the ecological effects of increasing petroleum production. In this article, we describe two stochastic models for simulating the spatial distribution of brine spills on a landscape. One model uses general assumptions about the spatial arrangement of spills and their sizes; the second model distributes spills by siting rectangular well complexes and conditioning spill probabilities on the configuration of pipes. We present maps of landscapes with spills produced by the two methods and compare the ability of the models to reproduce a specified spill area. A strength of the models presented here is their ability to extrapolate from the existing landscape to simulate landscapes with a higher (or lower) density of oil wells.

  15. Spatial vs. individual variability with inheritance in a stochastic Lotka-Volterra system

    NASA Astrophysics Data System (ADS)

    Dobramysl, Ulrich; Tauber, Uwe C.

    2012-02-01

    We investigate a stochastic spatial Lotka-Volterra predator-prey model with randomized interaction rates that are either quenched and affixed to the lattice sites, specific to individuals in either population, or both. In the latter situation, we include rate inheritance with mutations from the particles' progenitors. Thus we arrive at a simple model for competitive evolution with environmental variability and selection pressure. We employ Monte Carlo simulations in zero and two dimensions to study the time evolution of both species' densities and their interaction rate distributions. The predator and prey concentrations in the ensuing steady states depend crucially on the environmental variability, whereas the temporal evolution of the individualized rate distributions leads to largely neutral optimization. In contrast to, e.g., linear gene expression models, this system does not experience fixation at extreme values. An approximate description of the resulting data is achieved by means of an effective master equation approach for the interaction rate distribution.
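The lattice Monte Carlo scheme underlying such models can be sketched in its simplest form. This is a hedged illustration with uniform rates, without the quenched or inherited individualized rate distributions that are the paper's actual subject; site states and rate values are illustrative.

```python
import random

def lotka_volterra_mc(L=20, steps=2000, lam=0.5, sigma=0.5, mu=0.2, seed=3):
    """Minimal 2-D lattice predator-prey Monte Carlo.
    Site states: 0 empty, 1 prey, 2 predator. Rates: lam prey reproduction,
    sigma predation (with predator offspring), mu predator death."""
    rng = random.Random(seed)
    grid = [[rng.choice([0, 1, 2]) for _ in range(L)] for _ in range(L)]

    def nbr(x, y):
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        return (x + dx) % L, (y + dy) % L          # periodic boundaries

    for _ in range(steps):
        x, y = rng.randrange(L), rng.randrange(L)  # random sequential update
        nx, ny = nbr(x, y)
        s, ns = grid[x][y], grid[nx][ny]
        if s == 1 and ns == 0 and rng.random() < lam:
            grid[nx][ny] = 1       # prey reproduces onto an empty neighbour
        elif s == 2 and ns == 1 and rng.random() < sigma:
            grid[nx][ny] = 2       # predation: prey replaced by new predator
        elif s == 2 and rng.random() < mu:
            grid[x][y] = 0         # spontaneous predator death
    flat = [v for row in grid for v in row]
    return flat.count(1) / L**2, flat.count(2) / L**2
```

The paper's variants attach a rate value to each site (quenched disorder) or to each particle, with offspring inheriting a mutated copy of the parent's rate; that requires only storing a rate alongside each occupied site in the sketch above.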

  16. 3D aquifer characterization using stochastic streamline calibration

    NASA Astrophysics Data System (ADS)

    Jang, Minchul

    2007-03-01

    In this study, a new inverse approach, stochastic streamline calibration, is proposed. Using both a streamline concept and a stochastic technique, stochastic streamline calibration optimizes an identified field to fit given observation data in an exceptionally fast and stable fashion. In stochastic streamline calibration, streamlines are adopted as basic elements not only for describing fluid flow but also for identifying the permeability distribution. Following the streamline-based inversions of Agarwal and Blunt [Agarwal B, Blunt MJ. Streamline-based method with full-physics forward simulation for history matching performance data of a North Sea field. SPE J 2003;8(2):171-80] and Wang and Kovscek [Wang Y, Kovscek AR. Streamline approach for history matching production data. SPE J 2000;5(4):353-62], permeability is modified along streamlines rather than at individual gridblocks. Permeabilities in the gridblocks through which a streamline passes are adjusted by a multiplicative factor chosen to match the flow and transport properties of that streamline. This enables the inverse process to achieve fast convergence. In addition, equipped with a stochastic module, the proposed technique calibrates the identified field in a stochastic manner while incorporating spatial information into the field. This prevents the inverse process from becoming stuck in local minima and helps the search for a globally optimized solution. Simulation results indicate that stochastic streamline calibration identifies an unknown permeability field exceptionally quickly. More notably, the identified permeability distribution reflects realistic geological features, something not achieved in the original work of Agarwal et al., which was limited by large modifications along streamlines aimed at matching production data only. The model constructed by stochastic streamline calibration forecast plume transport similar to that of a reference model. 
These results suggest that the proposed approach can be applied to the construction of aquifer models and the forecasting of aquifer performance.

  17. Particle-based simulations of polarity establishment reveal stochastic promotion of Turing pattern formation

    PubMed Central

    Ramirez, Samuel A.; Elston, Timothy C.

    2018-01-01

    Polarity establishment, the spontaneous generation of asymmetric molecular distributions, is a crucial component of many cellular functions. Saccharomyces cerevisiae (yeast) undergoes directed growth during budding and mating, and is an ideal model organism for studying polarization. In yeast and many other cell types, the Rho GTPase Cdc42 is the key molecular player in polarity establishment. During yeast polarization, multiple patches of Cdc42 initially form, then resolve into a single front. Because polarization relies on strong positive feedback, it is likely that the amplification of molecular-level fluctuations underlies the generation of multiple nascent patches. In the absence of spatial cues, these fluctuations may be key to driving polarization. Here we used particle-based simulations to investigate the role of stochastic effects in a Turing-type model of yeast polarity establishment. In the model, reactions take place either between two molecules on the membrane, or between a cytosolic and a membrane-bound molecule. Thus, we developed a computational platform that explicitly simulates molecules at and near the cell membrane, and implicitly handles molecules away from the membrane. To evaluate stochastic effects, we compared particle simulations to deterministic reaction-diffusion equation simulations. Defining macroscopic rate constants that are consistent with the microscopic parameters for this system is challenging, because diffusion occurs in two dimensions and particles exchange between the membrane and cytoplasm. We address this problem by empirically estimating macroscopic rate constants from appropriately designed particle-based simulations. Ultimately, we find that stochastic fluctuations speed polarity establishment and permit polarization in parameter regions predicted to be Turing stable. These effects can operate at Cdc42 abundances expected of yeast cells, and promote polarization on timescales consistent with experimental results. 
To our knowledge, our work represents the first particle-based simulations of a model for yeast polarization that is based on a Turing mechanism. PMID:29529021

  18. Baseline Error Analysis and Experimental Validation for Height Measurement of Formation Insar Satellite

    NASA Astrophysics Data System (ADS)

    Gao, X.; Li, T.; Zhang, X.; Geng, X.

    2018-04-01

    In this paper, we propose a stochastic model of InSAR height measurement that accounts for the interferometric geometry. The model directly describes the relationship between baseline error and height-measurement error. A simulation analysis using TanDEM-X parameters was then carried out to quantitatively evaluate the influence of baseline error on height measurement. Furthermore, a full emulation validation of the InSAR stochastic model was performed on the basis of the SRTM DEM and TanDEM-X parameters. The spatial distribution characteristics and error propagation behaviour of InSAR height measurement were fully evaluated.

  19. Influence of stochastic sea ice parametrization on climate and the role of atmosphere–sea ice–ocean interaction

    PubMed Central

    Juricke, Stephan; Jung, Thomas

    2014-01-01

    The influence of a stochastic sea ice strength parametrization on the mean climate is investigated in a coupled atmosphere–sea ice–ocean model. The results are compared with an uncoupled simulation with a prescribed atmosphere. It is found that the stochastic sea ice parametrization causes an effective weakening of the sea ice. In the uncoupled model this leads to an Arctic sea ice volume increase of about 10–20% after an accumulation period of approximately 20–30 years. In the coupled model, no such increase is found. Rather, the stochastic perturbations lead to a spatial redistribution of the Arctic sea ice thickness field. A mechanism involving a slightly negative atmospheric feedback is proposed that can explain the different responses in the coupled and uncoupled system. Changes in integrated Antarctic sea ice quantities caused by the stochastic parametrization are generally small, as memory is lost during the melting season because of an almost complete loss of sea ice. However, stochastic sea ice perturbations affect regional sea ice characteristics in the Southern Hemisphere, both in the uncoupled and coupled model. Remote impacts of the stochastic sea ice parametrization on the mean climate of non-polar regions were found to be small. PMID:24842027

  20. Integrating continuous stocks and flows into state-and-transition simulation models of landscape change

    USGS Publications Warehouse

    Daniel, Colin J.; Sleeter, Benjamin M.; Frid, Leonardo; Fortin, Marie-Josée

    2018-01-01

    State-and-transition simulation models (STSMs) provide a general framework for forecasting landscape dynamics, including projections of both vegetation and land-use/land-cover (LULC) change. The STSM method divides a landscape into spatially-referenced cells and then simulates the state of each cell forward in time, as a discrete-time stochastic process using a Monte Carlo approach, in response to any number of possible transitions. A current limitation of the STSM method, however, is that all of the state variables must be discrete. Here we present a new approach for extending a STSM, in order to account for continuous state variables, called a state-and-transition simulation model with stocks and flows (STSM-SF). The STSM-SF method allows for any number of continuous stocks to be defined for every spatial cell in the STSM, along with a suite of continuous flows specifying the rates at which stock levels change over time. The change in the level of each stock is then simulated forward in time, for each spatial cell, as a discrete-time stochastic process. The method differs from the traditional systems dynamics approach to stock-flow modelling in that the stocks and flows can be spatially-explicit, and the flows can be expressed as a function of the STSM states and transitions. We demonstrate the STSM-SF method by integrating a spatially-explicit carbon (C) budget model with a STSM of LULC change for the state of Hawai'i, USA. In this example, continuous stocks are pools of terrestrial C, while the flows are the possible fluxes of C between these pools. Importantly, several of these C fluxes are triggered by corresponding LULC transitions in the STSM. 
Model outputs include changes in the spatial and temporal distribution of C pools and fluxes across the landscape in response to projected future changes in LULC over the next 50 years. The new STSM-SF method allows both discrete and continuous state variables to be integrated into a STSM, including interactions between them. With the addition of stocks and flows, STSMs provide a conceptually simple yet powerful approach for characterizing uncertainties in projections of a wide range of questions regarding landscape change.
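The coupling of a discrete stochastic state with a continuous stock can be shown in a toy example. This is a hedged sketch of the idea, not the authors' Hawai'i model: the states, rates and carbon values are invented, but the structure (per-cell discrete-time Monte Carlo transitions, a continuous stock per cell, and a flux triggered by a transition) mirrors the STSM-SF design.

```python
import random

def stsm_sf(n_cells=100, years=50, p_defor=0.02, growth=2.0,
            loss_frac=0.8, seed=11):
    """Each cell holds a discrete LULC state and a continuous carbon stock.
    A stochastic forest->agriculture transition triggers an emission flux."""
    rng = random.Random(seed)
    states = ["forest"] * n_cells      # discrete state variable per cell
    carbon = [100.0] * n_cells         # continuous stock (tC per cell)
    for _ in range(years):             # discrete-time Monte Carlo process
        for i in range(n_cells):
            if states[i] == "forest":
                carbon[i] += growth    # annual sequestration flow
                if rng.random() < p_defor:
                    states[i] = "agriculture"
                    carbon[i] *= (1 - loss_frac)  # transition-triggered flux
    return states, carbon
```

In the full method the transition probabilities and flow rates would themselves be spatially explicit and state dependent, and many stocks and flux pathways run in parallel.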

  1. On the use of a PM2.5 exposure simulator to explain birthweight

    PubMed Central

    Berrocal, Veronica J.; Gelfand, Alan E.; Holland, David M.; Burke, Janet; Miranda, Marie Lynn

    2010-01-01

    In relating pollution to birth outcomes, maternal exposure has usually been described using monitoring data. Such characterization misrepresents exposure, as it (i) does not take into account the spatial misalignment between an individual’s residence and monitoring sites, and (ii) ignores the fact that individuals spend most of their time indoors and typically in more than one location. In this paper, we break with previous studies by using a stochastic simulator to describe personal exposure (to particulate matter) and then relate simulated exposures at the individual level to the health outcome (birthweight) rather than aggregating to a selected spatial unit. We propose a hierarchical model that, at the first stage, specifies a linear relationship between birthweight and personal exposure, adjusting for individual risk factors and introducing random spatial effects for the census tract of maternal residence. At the second stage, our hierarchical model specifies the distribution of each individual’s personal exposure using the empirical distribution yielded by the stochastic simulator as well as a model for the spatial random effects. We have applied our framework to analyze birthweight data from 14 counties in North Carolina in years 2001 and 2002. We investigate whether there are certain aspects and time windows of exposure that are more detrimental to birthweight by building different exposure metrics which we incorporate, one by one, in our hierarchical model. To assess the difference in relating ambient exposure to birthweight versus personal exposure to birthweight, we compare estimates of the effect of air pollution obtained from hierarchical models that linearly relate ambient exposure and birthweight versus those obtained from our modeling framework. Our analysis does not show a significant effect of PM2.5 on birthweight for reasons which we discuss. 
However, our modeling framework serves as a template for analyzing the relationship between personal exposure and longer term health endpoints. PMID:21691413

  2. Stochastic Geometric Models with Non-stationary Spatial Correlations in Lagrangian Fluid Flows

    NASA Astrophysics Data System (ADS)

    Gay-Balmaz, François; Holm, Darryl D.

    2018-01-01

    Inspired by spatiotemporal observations from satellites of the trajectories of objects drifting near the surface of the ocean in the National Oceanic and Atmospheric Administration's "Global Drifter Program", this paper develops data-driven stochastic models of geophysical fluid dynamics (GFD) with non-stationary spatial correlations representing the dynamical behaviour of oceanic currents. Three models are considered. Model 1 from Holm (Proc R Soc A 471:20140963, 2015) is reviewed, in which the spatial correlations are time independent. Two new models, called Model 2 and Model 3, introduce two different symmetry breaking mechanisms by which the spatial correlations may be advected by the flow. These models are derived using reduction by symmetry of stochastic variational principles, leading to stochastic Hamiltonian systems, whose momentum maps, conservation laws and Lie-Poisson bracket structures are used in developing the new stochastic Hamiltonian models of GFD.

  4. Spatially heterogeneous stochasticity and the adaptive diversification of dormancy.

    PubMed

    Rajon, E; Venner, S; Menu, F

    2009-10-01

    Diversified bet-hedging, a strategy that leads several individuals with the same genotype to express distinct phenotypes in a given generation, is now well established as a common evolutionary response to environmental stochasticity. Life-history traits defined as diversified bet-hedging (e.g. germination or diapause strategies) display marked differences between populations in spatial proximity. In order to find out whether such differences can be explained by local adaptations to spatially heterogeneous environmental stochasticity, we explored the evolution of bet-hedging dormancy strategies in a metapopulation using a two-patch model with patch differences in stochastic juvenile survival. We found that spatial differences in the level of environmental stochasticity, restricted dispersal, increased fragmentation and intermediate survival during dormancy all favour the adaptive diversification of bet-hedging dormancy strategies. Density dependency also plays a major role in the diversification of dormancy strategies because: (i) it may interact locally with environmental stochasticity and amplify its effects; however, (ii) it can also generate chaotic population dynamics that may impede diversification. Our work proposes new hypotheses to explain the spatial patterns of bet-hedging strategies that we hope will encourage new empirical studies of this topic.

  5. Deploying initial attack resources for wildfire suppression: spatial coordination, budget constraints, and capacity constraints

    Treesearch

    Yohan Lee; Jeremy S. Fried; Heidi J. Albers; Robert G. Haight

    2013-01-01

    We combine a scenario-based, standard-response optimization model with stochastic simulation to improve the efficiency of resource deployment for initial attack on wildland fires in three planning units in California. The optimization model minimizes the expected number of fires that do not receive a standard response--defined as the number of resources by type that...

  6. A hybrid continuous-discrete method for stochastic reaction-diffusion processes.

    PubMed

    Lo, Wing-Cheong; Zheng, Likun; Nie, Qing

    2016-09-01

    Stochastic fluctuations in reaction-diffusion processes often have a substantial effect on the spatial and temporal dynamics of signal transduction in complex biological systems. One popular approach for simulating these processes is to divide the system into small spatial compartments, assuming that molecules react only within the same compartment and jump between adjacent compartments driven by diffusion. While the approach is convenient in terms of its implementation, its computational cost may become prohibitive when diffusive jumps occur significantly more frequently than reactions, as in the case of rapid diffusion. Here, we present a hybrid continuous-discrete method in which diffusion is simulated using a continuous approximation while reactions are based on the Gillespie algorithm. Specifically, the diffusive jumps are approximated as continuous Gaussian random vectors with time-dependent means and covariances, allowing use of a large time step, even for rapid diffusion. By considering the correlation among diffusive jumps, the approximation is accurate for the second moment of the diffusion process. In addition, a criterion is obtained for identifying the region in which such diffusion approximation is required to enable adaptive calculations for better accuracy. Applications to a linear diffusion system and two nonlinear systems of morphogens demonstrate the effectiveness and benefits of the new hybrid method.
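The division of labour described in the abstract can be sketched for a two-compartment toy system. This is a hedged illustration in the spirit of the method, not the authors' implementation (which handles full covariances across many compartments): fast diffusive exchange is replaced by a single Gaussian net jump per macro step, while a slow degradation reaction A -> 0 is handled with exact Gillespie events.

```python
import random

def hybrid_step(nA, nB, k=0.1, d=5.0, dt=0.05, rng=None):
    """One macro step: Gaussian-approximated diffusion between two
    compartments, then exact Gillespie sampling of A -> 0 within [0, dt]."""
    rng = rng or random.Random()
    # Gaussian approximation of the net diffusive transfer B->A minus A->B:
    # mean and variance follow from the Poisson jump counts d*n*dt each way.
    mean = (nB - nA) * d * dt
    var = (nA + nB) * d * dt
    net = int(round(rng.gauss(mean, var ** 0.5)))
    net = max(-nA, min(nB, net))        # keep counts non-negative
    nA, nB = nA + net, nB - net
    # Exact Gillespie events for the degradation reaction in each compartment
    for label in ("A", "B"):
        n = nA if label == "A" else nB
        t = 0.0
        while n > 0:
            t += rng.expovariate(k * n)  # waiting time at propensity k*n
            if t > dt:
                break
            n -= 1
        if label == "A":
            nA = n
        else:
            nB = n
    return nA, nB

rng = random.Random(5)
nA, nB = 500, 100
for _ in range(100):                    # 5 time units of simulation
    nA, nB = hybrid_step(nA, nB, rng=rng)
```

The gain is that the macro step dt can be much larger than the mean time between individual diffusive jumps (here 1/(d*n)), which is exactly the regime where a compartment-by-compartment SSA becomes prohibitive.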

  7. Quasi-dynamic earthquake fault systems with rheological heterogeneity

    NASA Astrophysics Data System (ADS)

    Brietzke, G. B.; Hainzl, S.; Zoeller, G.; Holschneider, M.

    2009-12-01

    Seismic risk and hazard estimates mostly use purely empirical, stochastic models of earthquake fault systems, tuned specifically to the vulnerable areas of interest. Although such models allow for reasonable risk estimates, they cannot support physical statements about the described seismicity. In contrast to such empirical stochastic models, physics-based earthquake fault system models allow for a physical reasoning and interpretation of the produced seismicity and system dynamics. Recently, different fault-system earthquake simulators based on frictional stick-slip behavior have been used to study the effects of stress heterogeneity, rheological heterogeneity, or geometrical complexity on earthquake occurrence, spatial and temporal clustering of earthquakes, and system dynamics. Here we present a comparison of the characteristics of synthetic earthquake catalogs produced by two different formulations of quasi-dynamic fault-system earthquake simulators. Both models are based on discretized frictional faults embedded in an elastic half-space. While one (1) is governed by rate- and state-dependent friction allowing three evolutionary stages of independent fault patches, the other (2) is governed by instantaneous frictional weakening with scheduled (and therefore causal) stress transfer. We analyze the spatial and temporal clustering of events and the characteristics of system dynamics by means of the physical parameters of the two approaches.

  8. A Discrete Probability Function Method for the Equation of Radiative Transfer

    NASA Technical Reports Server (NTRS)

    Sivathanu, Y. R.; Gore, J. P.

    1993-01-01

    A discrete probability function (DPF) method for the equation of radiative transfer is derived. The DPF is defined as the integral of the probability density function (PDF) over a discrete interval. The derivation allows the evaluation of the PDF of intensities leaving desired radiation paths including turbulence-radiation interactions without the use of computer intensive stochastic methods. The DPF method has a distinct advantage over conventional PDF methods since the creation of a partial differential equation from the equation of transfer is avoided. Further, convergence of all moments of intensity is guaranteed at the basic level of simulation unlike the stochastic method where the number of realizations for convergence of higher order moments increases rapidly. The DPF method is described for a representative path with approximately integral-length scale-sized spatial discretization. The results show good agreement with measurements in a propylene/air flame except for the effects of intermittency resulting from highly correlated realizations. The method can be extended to the treatment of spatial correlations as described in the Appendix. However, information regarding spatial correlations in turbulent flames is needed prior to the execution of this extension.

  9. Surrogate modelling for the prediction of spatial fields based on simultaneous dimensionality reduction of high-dimensional input/output spaces.

    PubMed

    Crevillén-García, D

    2018-04-01

    Time-consuming numerical simulators for solving groundwater flow and dissolution models of physico-chemical processes in deep aquifers normally require some of the model inputs to be defined in high-dimensional spaces in order to return realistic results. Sometimes, the outputs of interest are spatial fields leading to high-dimensional output spaces. Although Gaussian process emulation has been satisfactorily used for computing faithful and inexpensive approximations of complex simulators, these have been mostly applied to problems defined in low-dimensional input spaces. In this paper, we propose a method for simultaneously reducing the dimensionality of very high-dimensional input and output spaces in Gaussian process emulators for stochastic partial differential equation models while retaining the qualitative features of the original models. This allows us to build a surrogate model for the prediction of spatial fields in such time-consuming simulators. We apply the methodology to a model of convection and dissolution processes occurring during carbon capture and storage.
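
    A minimal sketch of the idea in Python, assuming PCA (via SVD) as the dimensionality-reduction step and a plain RBF-kernel Gaussian process regression fitted in the reduced spaces; the toy "simulator", its dimensions, and all parameter values are illustrative, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

def pca_reduce(X, k):
    """Project rows of X onto their first k principal components."""
    mu = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return (X - mu) @ Vt[:k].T, (mu, Vt[:k])

# Toy stand-in for an expensive simulator: high-dimensional input
# (e.g. a permeability field) -> high-dimensional output (a spatial field).
n, d_in, d_out = 60, 200, 300
X = rng.normal(size=(n, d_in))
W = rng.normal(size=(d_in, d_out)) / np.sqrt(d_in)
Y = np.tanh(X @ W)

Z, _ = pca_reduce(X, 5)            # reduced inputs
T, (mu_y, Vy) = pca_reduce(Y, 5)   # reduced outputs

# GP regression with an RBF kernel, fitted in the reduced spaces.
def gp_predict(Z, T, Zs, ell=1.0, noise=1e-4):
    sq = lambda A, B: ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq(Z, Z) / (2 * ell**2)) + noise * np.eye(len(Z))
    Ks = np.exp(-sq(Zs, Z) / (2 * ell**2))
    return Ks @ np.linalg.solve(K, T)

T_pred = gp_predict(Z, T, Z)       # predict at the training points
Y_pred = T_pred @ Vy + mu_y        # map back to the spatial field
```

    The emulator is trained and evaluated entirely in the low-dimensional spaces; predictions are mapped back to the full spatial field through the output PCA basis.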

  10. Stochastic modelling of infectious diseases for heterogeneous populations.

    PubMed

    Ming, Rui-Xing; Liu, Ji-Ming; W Cheung, William K; Wan, Xiang

    2016-12-22

    Infectious diseases such as SARS and H1N1 can significantly impact people's lives and cause severe social and economic damage. Recent outbreaks have stressed the urgency of effective research on the dynamics of infectious disease spread. However, it is difficult to predict when and where outbreaks may emerge and how infectious diseases spread, because many factors affect their transmission and some of them may be unknown. One feasible means to promptly detect an outbreak and track the progress of disease spread is to implement surveillance systems in regional or national health and medical centres. The accumulated surveillance data, including temporal, spatial, clinical, and demographic information, can be exploited to better understand and model the dynamics of infectious disease spread. The aim of this work is to develop and empirically evaluate a stochastic model that allows the investigation of transmission patterns of infectious diseases in heterogeneous populations. We test the proposed model on simulated data and apply it to the surveillance data from the 2009 H1N1 pandemic in Hong Kong. In the simulation experiment, our model achieves high accuracy in parameter estimation (less than 10.0% mean absolute percentage error). In terms of the forward prediction of case incidence, the mean absolute percentage errors are 17.3% for the simulation experiment and 20.0% for the experiment on the real surveillance data. We propose a stochastic model to study the dynamics of infectious disease spread in heterogeneous populations from temporal-spatial surveillance data. The proposed model is evaluated using both simulated data and the real data from the 2009 H1N1 epidemic in Hong Kong and achieves acceptable prediction accuracy. We believe that our model can provide valuable insights for public health authorities to predict the effect of disease spread, analyse its underlying factors, and guide new control efforts.
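
    The mean absolute percentage error used above to score the forecasts can be computed as follows; the weekly case counts are hypothetical, for illustration only.

```python
import numpy as np

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

# Hypothetical weekly case counts and model forecasts (illustrative only).
observed = np.array([120, 150, 200, 260, 310])
forecast = np.array([110, 160, 190, 280, 300])
print(round(mape(observed, forecast), 1))  # prints 6.2
```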

  11. Performance evaluation of an agent-based occupancy simulation model

    DOE PAGES

    Luo, Xuan; Lam, Khee Poh; Chen, Yixing; ...

    2017-01-17

    Occupancy is an important factor driving building performance. Static and homogeneous occupant schedules, commonly used in building performance simulation, contribute to issues such as performance gaps between simulated and measured energy use in buildings. Stochastic occupancy models have been recently developed and applied to better represent spatial and temporal diversity of occupants in buildings. However, there is very limited evaluation of the usability and accuracy of these models. This study used measured occupancy data from a real office building to evaluate the performance of an agent-based occupancy simulation model: the Occupancy Simulator. The occupancy patterns of various occupant types were first derived from the measured occupant schedule data using statistical analysis. Then the performance of the simulation model was evaluated and verified based on (1) whether the distribution of observed occupancy behavior patterns follows the theoretical ones included in the Occupancy Simulator, and (2) whether the simulator can reproduce a variety of occupancy patterns accurately. Results demonstrated the feasibility of applying the Occupancy Simulator to simulate a range of occupancy presence and movement behaviors for regular types of occupants in office buildings, and to generate stochastic occupant schedules at the room and individual occupant levels for building performance simulation. For future work, model validation is recommended, which includes collecting and using detailed interval occupancy data of all spaces in an office building to validate the simulated occupant schedules from the Occupancy Simulator.
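
    A minimal sketch of the kind of stochastic occupancy model discussed, assuming a two-state Markov chain per occupant with hour-dependent arrival and departure probabilities; this is an illustration of the modelling idea, not the Occupancy Simulator's actual model, and all probabilities are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# Each occupant is a two-state (absent/present) Markov chain over
# 10-minute steps, with transition probabilities that vary by hour
# to mimic an office-day profile.
steps_per_hour = 6
hours = np.repeat(np.arange(24), steps_per_hour)
p_arrive = np.where((hours >= 8) & (hours < 10), 0.30, 0.02)
p_depart = np.where((hours >= 17) & (hours < 19), 0.30, 0.02)

def simulate_occupant():
    present, schedule = 0, []
    for a, d in zip(p_arrive, p_depart):
        if present == 0 and rng.random() < a:
            present = 1          # stochastic arrival
        elif present == 1 and rng.random() < d:
            present = 0          # stochastic departure
        schedule.append(present)
    return np.array(schedule)

# Schedules differ between occupants and between runs, preserving
# the spatial and temporal diversity that static schedules flatten.
schedules = np.stack([simulate_occupant() for _ in range(50)])
mean_profile = schedules.mean(axis=0)
```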

  12. Performance evaluation of an agent-based occupancy simulation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Xuan; Lam, Khee Poh; Chen, Yixing

    Occupancy is an important factor driving building performance. Static and homogeneous occupant schedules, commonly used in building performance simulation, contribute to issues such as performance gaps between simulated and measured energy use in buildings. Stochastic occupancy models have been recently developed and applied to better represent spatial and temporal diversity of occupants in buildings. However, there is very limited evaluation of the usability and accuracy of these models. This study used measured occupancy data from a real office building to evaluate the performance of an agent-based occupancy simulation model: the Occupancy Simulator. The occupancy patterns of various occupant types were first derived from the measured occupant schedule data using statistical analysis. Then the performance of the simulation model was evaluated and verified based on (1) whether the distribution of observed occupancy behavior patterns follows the theoretical ones included in the Occupancy Simulator, and (2) whether the simulator can reproduce a variety of occupancy patterns accurately. Results demonstrated the feasibility of applying the Occupancy Simulator to simulate a range of occupancy presence and movement behaviors for regular types of occupants in office buildings, and to generate stochastic occupant schedules at the room and individual occupant levels for building performance simulation. For future work, model validation is recommended, which includes collecting and using detailed interval occupancy data of all spaces in an office building to validate the simulated occupant schedules from the Occupancy Simulator.

  13. Pan-European stochastic flood event set

    NASA Astrophysics Data System (ADS)

    Kadlec, Martin; Pinto, Joaquim G.; He, Yi; Punčochář, Petr; Kelemen, Fanni D.; Manful, Desmond; Palán, Ladislav

    2017-04-01

    Impact Forecasting (IF), the model development centre of Aon Benfield, has been developing a large suite of probabilistic catastrophe flood models for individual countries in Europe. Such natural catastrophes do not respect national boundaries: for example, the major flood of 2016 was responsible for Europe's largest insured loss of USD 3.4bn and affected Germany, France, Belgium, Austria and parts of several other countries. Reflecting such needs, IF initiated the development of a pan-European flood event set which combines cross-country exposures with country-based loss distributions to provide more insightful data to re/insurers. Because observed discharge data are not available across the whole of Europe in the quantity and quality needed for detailed loss evaluation, a top-down approach was chosen: precipitation is simulated from a GCM/RCM model chain, followed by a calculation of discharges using rainfall-runoff modelling. IF set up this project in close collaboration with the Karlsruhe Institute of Technology (KIT) for the precipitation estimates and with the University of East Anglia (UEA) for the rainfall-runoff modelling. KIT's main objective is to provide high-resolution daily historical and stochastic time series of key meteorological variables. A purely dynamical downscaling approach with the regional climate model COSMO-CLM (CCLM) is used to generate the historical time series, using re-analysis data as boundary conditions. The resulting time series are validated against the gridded observational dataset E-OBS, and different bias-correction methods are employed. The generation of the stochastic time series requires transfer functions between large-scale atmospheric variables and regional temperature and precipitation fields. These transfer functions are developed for the historical time series using reanalysis data as predictors and bias-corrected CCLM-simulated precipitation and temperature as predictands. Finally, the transfer functions are applied to a large ensemble of GCM simulations with forcing corresponding to present-day climate conditions to generate highly resolved stochastic time series of precipitation and temperature for several thousand years. These time series form the input for the rainfall-runoff model developed by the UEA team. It is a spatially distributed model adapted from the HBV model and will be calibrated for individual basins using historical discharge data. The calibrated model will be driven by the precipitation time series generated by the KIT team to simulate discharges at a daily time step. The uncertainties in the simulated discharges will be analysed using multiple model parameter sets. A number of statistical methods will be used to assess return periods, changes in magnitudes, changes in flood characteristics such as time base and time to peak, and spatial correlations of large flood events. The pan-European stochastic flood event set will permit a better view of flood risk for market applications.
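
    The abstract does not specify which bias-correction methods were employed; one common choice for precipitation is empirical quantile mapping, sketched below on synthetic gamma-distributed daily values. All distributions and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "observed" daily precipitation and a positively biased
# model counterpart (both gamma-distributed; illustrative only).
obs = rng.gamma(shape=2.0, scale=3.0, size=5_000)
sim = rng.gamma(shape=2.0, scale=4.5, size=5_000)

def quantile_map(x, sim_ref, obs_ref, n_q=99):
    """Map model values onto the observed distribution quantile by quantile."""
    q = np.linspace(1, 99, n_q)
    return np.interp(x, np.percentile(sim_ref, q), np.percentile(obs_ref, q))

corrected = quantile_map(sim, sim, obs)
# After correction, the model mean sits close to the observed mean.
```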

  14. Goal-oriented sensitivity analysis for lattice kinetic Monte Carlo simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arampatzis, Georgios, E-mail: garab@math.uoc.gr; Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003; Katsoulakis, Markos A., E-mail: markos@math.umass.edu

    2014-03-28

    In this paper we propose a new class of coupling methods for the sensitivity analysis of high-dimensional stochastic systems and in particular for lattice kinetic Monte Carlo (KMC). Sensitivity analysis for stochastic systems is typically based on approximating continuous derivatives with respect to model parameters by the mean value of samples from a finite difference scheme. Instead of using independent samples, the proposed algorithm reduces the variance of the estimator by developing a strongly correlated, "coupled", stochastic process for both the perturbed and unperturbed stochastic processes, defined on a common state space. The novelty of our construction is that the new coupled process depends on the targeted observables, e.g., coverage, Hamiltonian, spatial correlations, surface roughness, etc.; hence we refer to the proposed method as goal-oriented sensitivity analysis. In particular, the rates of the coupled continuous-time Markov chain are obtained as solutions to a goal-oriented optimization problem, depending on the observable of interest, by minimizing the corresponding variance functional. We show that this functional can be used as a diagnostic tool for the design and evaluation of different classes of couplings. Furthermore, the resulting KMC sensitivity algorithm has an easy implementation based on the Bortz-Kalos-Lebowitz algorithm's philosophy, where events are divided into classes depending on level sets of the observable of interest. Finally, we demonstrate in several examples, including adsorption, desorption, and diffusion kinetic Monte Carlo, that for the same confidence interval and observable the proposed goal-oriented algorithm can be two orders of magnitude faster than existing coupling algorithms for spatial KMC such as the Common Random Number approach. We also provide a complete implementation of the proposed sensitivity analysis algorithms, including various spatial KMC examples, in supplementary MATLAB source code.
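
    The variance-reduction principle behind such couplings can be illustrated with the simpler Common Random Number baseline on a toy birth process: driving both the perturbed and unperturbed simulations with the same random numbers makes the finite-difference sensitivity estimator far less noisy than independent sampling. The toy process and all parameters below are illustrative, not the paper's KMC.

```python
import numpy as np

rng = np.random.default_rng(3)

def terminal_count(k, u):
    """Toy birth process over 100 steps driven by uniforms u:
    a birth occurs at step i when u[i] < k (k is the rate parameter)."""
    return np.sum(u < k)

k, dk, n = 0.30, 0.01, 2000

# Independent samples: fresh randomness for each perturbation.
ind = [(terminal_count(k + dk, rng.random(100)) -
        terminal_count(k, rng.random(100))) / dk for _ in range(n)]

# Common random numbers: the same uniforms drive both processes,
# so the difference is strongly correlated and low-variance.
crn = []
for _ in range(n):
    u = rng.random(100)
    crn.append((terminal_count(k + dk, u) - terminal_count(k, u)) / dk)

# Both estimate d(mean count)/dk = 100, but CRN has far smaller variance.
```

    The goal-oriented couplings of the paper go further by tailoring the coupled rates to the observable, but the variance comparison above captures why coupling helps at all.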

  15. Ultimate open pit stochastic optimization

    NASA Astrophysics Data System (ADS)

    Marcotte, Denis; Caron, Josiane

    2013-02-01

    Classical open pit optimization (the maximum closure problem) is performed on block estimates, without directly considering the uncertainty of block grades. We propose an alternative approach of stochastic optimization, in which the optimal pit is computed on block expected profits, rather than expected grades, obtained from a series of conditional simulations. The stochastic optimization generates, by construction, larger ore and waste tonnages than the classical optimization. Contrary to the classical approach, the stochastic optimization is conditionally unbiased for the realized profit given the predicted profit. A series of simulated deposits with different variograms is used to compare the stochastic approach, the classical approach, and the simulated approach that maximizes expected profit among simulated designs. Profits obtained with the stochastic optimization are generally larger than those of the classical or simulated pit. The main factor controlling the relative gain of the stochastic optimization over the classical approach and the simulated pit is shown to be the information level as measured by the borehole spacing/range ratio. The relative gains of the stochastic approach over the classical approach increase with the treatment costs but decrease with the mining costs. The relative gains of the stochastic approach over the simulated pit approach increase with both the treatment and mining costs. At the early stages of an open pit project, when uncertainty is large, the stochastic optimization approach appears preferable to the classical approach or the simulated pit approach for fair comparison of the values of alternative projects and for the initial design and planning of the open pit.
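
    The core distinction, profit of the expected grade versus expected profit over conditional simulations, can be sketched for a single block; because the profit function is nonlinear (convex) in grade, Jensen's inequality makes the two differ. The prices, costs, and grade distribution below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

# Profit per block: ore is processed only when revenue beats the
# treatment cost, so profit is a nonlinear (max-type) function of grade.
price, treat_cost, mine_cost = 50.0, 30.0, 10.0
def profit(grade):
    return np.maximum(price * grade - treat_cost, 0.0) - mine_cost

# Conditional simulations of one block's grade (illustrative values).
sims = rng.lognormal(mean=-0.5, sigma=0.6, size=1_000)

profit_of_mean = profit(sims.mean())   # classical: profit of expected grade
expected_profit = profit(sims).mean()  # stochastic: expected profit

# Jensen's inequality for the convex profit function guarantees
# expected_profit >= profit_of_mean, which is why the two pits differ.
```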

  16. Experimental study and simulation of space charge stimulated discharge

    NASA Astrophysics Data System (ADS)

    Noskov, M. D.; Malinovski, A. S.; Cooke, C. M.; Wright, K. A.; Schwab, A. J.

    2002-11-01

    The electrical discharge of volume distributed space charge in poly(methylmethacrylate) (PMMA) has been investigated both experimentally and by computer simulation. The experimental space charge was implanted in dielectric samples by exposure to a monoenergetic electron beam of 3 MeV. Electrical breakdown through the implanted space charge region within the sample was initiated by a local electric field enhancement applied to the sample surface. A stochastic-deterministic dynamic model for electrical discharge was developed and used in a computer simulation of these breakdowns. The model employs stochastic rules to describe the physical growth of the discharge channels, and deterministic laws to describe the electric field, the charge, and energy dynamics within the discharge channels and the dielectric. Simulated spatial-temporal and current characteristics of the expanding discharge structure during physical growth are quantitatively compared with the experimental data to confirm the discharge model. It was found that a single fixed set of physically based dielectric parameter values was adequate to simulate the complete family of experimental space charge discharges in PMMA. It is proposed that such a set of parameters also provides a useful means to quantify the breakdown properties of other dielectrics.

  17. [Stochastic characteristics of daily precipitation and its spatiotemporal difference over China based on information entropy].

    PubMed

    Li, Xin Xin; Sang, Yan Fang; Xie, Ping; Liu, Chang Ming

    2018-04-01

    Daily precipitation in China shows obvious randomness and spatiotemporal variation, and accurately understanding it is important for the control of flood and waterlogging disasters. Using daily precipitation data measured at 520 stations in China during 1961-2013, we quantified the stochastic characteristics of daily precipitation over China with an information entropy index. Results showed that the randomness of daily precipitation in the southeast region was larger than that in the northwest region. Moreover, the spatial distribution of the stochastic characteristics of precipitation differed among grades. The stochastic characteristics of P0 (precipitation of 0.1-10 mm) were large, but their spatial variation was not obvious. The stochastic characteristics of P10 (10-25 mm) and P25 (25-50 mm) were the largest, with obvious spatial differences. P50 (precipitation ≥50 mm) had the smallest stochastic characteristics and the most obvious spatial differences. Generally, the entropy values of precipitation increased markedly over the last five decades, indicating more significant stochastic characteristics of precipitation (especially an obvious increase of heavy precipitation events) in most regions of China under global climate change. Given that the spatial distribution and long-term trend of the entropy values of daily precipitation reflect the spatial distribution of the stochastic characteristics of precipitation, our results can provide a scientific basis for the control of flood and waterlogging disasters, the layout of agricultural planning, and the planning of the ecological environment.
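
    A sketch of the entropy computation, assuming Shannon entropy (in bits) over the occurrence frequencies of the four precipitation grades; the daily series is invented for illustration.

```python
import numpy as np

def shannon_entropy(counts):
    """Shannon entropy (bits) of grade occurrence frequencies;
    higher entropy indicates stronger randomness of precipitation."""
    p = np.asarray(counts, float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log2(p)).sum())

# Hypothetical daily series binned into the paper's grades:
# P0: 0.1-10 mm, P10: 10-25 mm, P25: 25-50 mm, P50: >= 50 mm.
precip = np.array([0.5, 3.2, 12.0, 0.0, 27.5, 8.1, 55.0, 1.1, 16.3, 4.4])
wet = precip[precip >= 0.1]
edges = [0.1, 10, 25, 50, 1e9]
counts, _ = np.histogram(wet, bins=edges)

print(counts.tolist(), round(shannon_entropy(counts), 3))
```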

  18. Preemptive spatial competition under a reproduction-mortality constraint.

    PubMed

    Allstadt, Andrew; Caraco, Thomas; Korniss, G

    2009-06-21

    Spatially structured ecological interactions can shape selection pressures experienced by a population's different phenotypes. We study spatial competition between phenotypes subject to antagonistic pleiotropy between reproductive effort and mortality rate. The constraint we invoke reflects a previous life-history analysis; the implied dependence indicates that although propagation and mortality rates both vary, their ratio is fixed. We develop a stochastic invasion approximation predicting that phenotypes with higher propagation rates will invade an empty environment (no biotic resistance) faster, despite their higher mortality rate. However, once population density approaches demographic equilibrium, phenotypes with lower mortality are favored, despite their lower propagation rate. We conducted a set of pairwise invasion analyses by simulating an individual-based model of preemptive competition. In each case, the phenotype with the lowest mortality rate and (via antagonistic pleiotropy) the lowest propagation rate qualified as evolutionarily stable among strategies simulated. This result, for a fixed propagation to mortality ratio, suggests that a selective response to spatial competition can extend the time scale of the population's dynamics, which in turn decelerates phenotypic evolution.

  19. Discriminative Random Field Models for Subsurface Contamination Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Arshadi, M.; Abriola, L. M.; Miller, E. L.; De Paolis Kaluza, C.

    2017-12-01

    Application of flow and transport simulators for prediction of the release, entrapment, and persistence of dense non-aqueous phase liquids (DNAPLs) and associated contaminant plumes is a computationally intensive process that requires specification of a large number of material properties and hydrologic/chemical parameters. Given its computational burden, this direct simulation approach is particularly ill-suited for quantifying both the expected performance and uncertainty associated with candidate remediation strategies under real field conditions. Prediction uncertainties primarily arise from limited information about contaminant mass distributions, as well as the spatial distribution of subsurface hydrologic properties. Application of direct simulation to quantify uncertainty would, thus, typically require simulating multiphase flow and transport for a large number of permeability and release scenarios to collect statistics associated with remedial effectiveness, a computationally prohibitive process. The primary objective of this work is to develop and demonstrate a methodology that employs measured field data to produce equi-probable stochastic representations of a subsurface source zone that capture the spatial distribution and uncertainty associated with key features that control remediation performance (i.e., permeability and contamination mass). Here we employ probabilistic models known as discriminative random fields (DRFs) to synthesize stochastic realizations of initial mass distributions consistent with known, and typically limited, site characterization data. Using a limited number of full scale simulations as training data, a statistical model is developed for predicting the distribution of contaminant mass (e.g., DNAPL saturation and aqueous concentration) across a heterogeneous domain. 
Monte-Carlo sampling methods are then employed, in conjunction with the trained statistical model, to generate realizations conditioned on measured borehole data. Performance of the statistical model is illustrated through comparisons of generated realizations with the 'true' numerical simulations. Finally, we demonstrate how these realizations can be used to determine statistically optimal locations for further interrogation of the subsurface.

  20. Stochastic simulation of karst conduit networks

    NASA Astrophysics Data System (ADS)

    Pardo-Igúzquiza, Eulogio; Dowd, Peter A.; Xu, Chaoshui; Durán-Valsero, Juan José

    2012-01-01

    Karst aquifers have very high spatial heterogeneity. Essentially, they comprise a system of pipes (i.e., the network of conduits) superimposed on rock porosity and on a network of stratigraphic surfaces and fractures. This heterogeneity strongly influences the hydraulic behavior of the karst and it must be reproduced in any realistic numerical model of the karst system that is used as input to flow and transport modeling. However, the directly observed karst conduits are only a small part of the complete karst conduit system and knowledge of the complete conduit geometry and topology remains spatially limited and uncertain. Thus, there is a special interest in the stochastic simulation of networks of conduits that can be combined with fracture and rock porosity models to provide a realistic numerical model of the karst system. Furthermore, the simulated model may be of interest per se and other uses could be envisaged. The purpose of this paper is to present an efficient method for conditional and non-conditional stochastic simulation of karst conduit networks. The method comprises two stages: generation of conduit geometry and generation of topology. The approach adopted is a combination of a resampling method for generating conduit geometries from templates and a modified diffusion-limited aggregation method for generating the network topology. The authors show that the 3D karst conduit networks generated by the proposed method are statistically similar to observed karst conduit networks or to a hypothesized network model. The statistical similarity is in the sense of reproducing the tortuosity index of conduits, the fractal dimension of the network, the rose diagram of conduit directions, the Z-histogram and Ripley's K-function of the bifurcation points (which differs from a random allocation of those bifurcation points).
The proposed method (1) is very flexible, (2) incorporates any experimental data (conditioning information) and (3) can easily be modified when implemented in a hydraulic inverse modeling procedure. Several synthetic examples are given to illustrate the methodology and real conduit network data are used to generate simulated networks that mimic real geometries and topology.
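
    The topology-generation stage can be illustrated with plain diffusion-limited aggregation on a 2D lattice (the paper uses a modified DLA in 3D combined with conduit-geometry templates); everything below is a toy sketch.

```python
import numpy as np

rng = np.random.default_rng(11)

# Minimal 2D diffusion-limited aggregation: random walkers stick to the
# growing aggregate, producing a branched, conduit-like structure.
N = 41
grid = np.zeros((N, N), dtype=bool)
grid[N // 2, N // 2] = True                  # seed cell (e.g. an outlet)

moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
for _ in range(200):                         # grow up to 200 conduit cells
    x, y = rng.integers(0, N, size=2)        # release a random walker
    while True:
        dx, dy = moves[rng.integers(4)]
        x, y = (x + dx) % N, (y + dy) % N    # periodic walk for simplicity
        # stick when the walker touches the existing aggregate
        if any(grid[(x + mx) % N, (y + my) % N] for mx, my in moves):
            grid[x, y] = True
            break

cells = int(grid.sum())   # aggregate size (seed + attached walkers)
```

    Conditioning on observed conduits would amount to pre-marking surveyed cells as part of the aggregate before growth, which is one way the method can honour field data.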

  1. Stochastic Downscaling of Digital Elevation Models

    NASA Astrophysics Data System (ADS)

    Rasera, Luiz Gustavo; Mariethoz, Gregoire; Lane, Stuart N.

    2016-04-01

    High-resolution digital elevation models (HR-DEMs) are extremely important for the understanding of small-scale geomorphic processes in Alpine environments. In the last decade, remote sensing techniques have experienced a major technological evolution, enabling fast and precise acquisition of HR-DEMs. However, sensors designed to measure elevation data still feature different spatial resolution and coverage capabilities. Terrestrial altimetry allows the acquisition of HR-DEMs with centimeter to millimeter-level precision, but only within small spatial extents and often with dead ground problems. Conversely, satellite radiometric sensors are able to gather elevation measurements over large areas but with limited spatial resolution. In the present study, we propose an algorithm to downscale low-resolution satellite-based DEMs using topographic patterns extracted from HR-DEMs derived, for example, from ground-based and airborne altimetry. The method consists of a multiple-point geostatistical simulation technique able to generate high-resolution elevation data from low-resolution digital elevation models (LR-DEMs). Initially, two collocated DEMs with different spatial resolutions serve as an input to construct a database of topographic patterns, which is also used to infer the statistical relationships between the two scales. High-resolution elevation patterns are then retrieved from the database to downscale a LR-DEM through a stochastic simulation process. The outputs of the simulations are multiple equally probable DEMs with higher spatial resolution that also depict the large-scale geomorphic structures present in the original LR-DEM. As these multiple models reflect the uncertainty related to the downscaling, they can be employed to quantify the uncertainty of phenomena that are dependent on fine topography, such as catchment hydrological processes. The proposed methodology is illustrated for a case study in the Swiss Alps.
A swissALTI3D HR-DEM (with 5 m resolution) and a SRTM-derived LR-DEM from the Western Alps are used to downscale a SRTM-based LR-DEM from the eastern part of the Alps. The results show that the method is capable of generating multiple high-resolution synthetic DEMs that reproduce the spatial structure and statistics of the original DEM.

  2. Time-ordered product expansions for computational stochastic system biology.

    PubMed

    Mjolsness, Eric

    2013-06-01

    The time-ordered product framework of quantum field theory can also be used to understand salient phenomena in stochastic biochemical networks. It is used here to derive Gillespie's stochastic simulation algorithm (SSA) for chemical reaction networks; consequently, the SSA can be interpreted in terms of Feynman diagrams. It is also used here to derive other, more general simulation and parameter-learning algorithms including simulation algorithms for networks of stochastic reaction-like processes operating on parameterized objects, and also hybrid stochastic reaction/differential equation models in which systems of ordinary differential equations evolve the parameters of objects that can also undergo stochastic reactions. Thus, the time-ordered product expansion can be used systematically to derive simulation and parameter-fitting algorithms for stochastic systems.
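
    Gillespie's SSA itself is short; a minimal Python version for a one-species production/decay network (an illustrative network, not one from the paper) is:

```python
import numpy as np

rng = np.random.default_rng(2)

def ssa(a0=50, c1=0.1, c2=2.0, t_end=100.0):
    """Gillespie SSA for the network  A -> 0 (rate c1*A),  0 -> A (rate c2)."""
    t, a = 0.0, a0
    times, counts = [t], [a]
    while t < t_end:
        rates = np.array([c1 * a, c2])       # reaction propensities
        total = rates.sum()
        t += rng.exponential(1.0 / total)    # exponential waiting time
        if rng.random() < rates[0] / total:  # pick the firing reaction
            a -= 1                           # decay consumed one A
        else:
            a += 1                           # production created one A
        times.append(t)
        counts.append(a)
    return np.array(times), np.array(counts)

times, counts = ssa()
# The trajectory relaxes from 50 toward the stationary mean c2/c1 = 20.
```

    Each iteration draws the waiting time to the next reaction and then which reaction fires, which is exactly the structure the time-ordered product expansion recovers term by term.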

  3. Stochastic characteristics of different duration annual maximum rainfall and its spatial difference in China based on information entropy

    NASA Astrophysics Data System (ADS)

    Li, X.; Sang, Y. F.

    2017-12-01

    Mountain torrents, urban floods, and other disasters caused by extreme precipitation bring great losses to the ecological environment, socio-economic development, and people's lives and property. Studying the spatial distribution of extreme precipitation is therefore of great significance for flood prevention and control. Based on annual maximum rainfall data for durations of 60 min, 6 h, and 24 h, we generate long sequences following the Pearson type III distribution and then use the information entropy index to study the spatial distribution and its differences across durations. The results show that the information entropy of annual maximum rainfall in the south region is greater than that in the north region, indicating more obvious stochastic characteristics of annual maximum rainfall in the south. However, the spatial distribution of the stochastic characteristics differs among durations. For example, the stochastic characteristics of the 60 min annual maximum rainfall in eastern Tibet are smaller than in the surrounding area, whereas those of the 6 h and 24 h annual maximum rainfall are larger. In the Haihe and Huaihe River Basins, the stochastic characteristics of the 60 min annual maximum rainfall do not differ significantly from the surrounding area, while those of the 6 h and 24 h rainfall are smaller. We conclude that the spatial distribution of the information entropy of annual maximum rainfall at different durations can reflect the spatial distribution of its stochastic characteristics; the results can thus provide an important scientific basis for flood prevention and control, agriculture, economic and social development, and urban waterlogging control.
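
    Pearson type III is a shifted gamma distribution, so long synthetic sequences can be drawn once its moments are fixed. The sketch below uses the standard mean/Cv/Cs parameterisation with illustrative values, not parameters fitted to the Chinese stations.

```python
import numpy as np

rng = np.random.default_rng(9)

def pearson3(n, mean, cv, cs):
    """Sample Pearson type III variates from the mean, coefficient of
    variation Cv and skew Cs via the shifted-gamma parameterisation:
    shape = 4/Cs^2, scale = mean*Cv*Cs/2, location = mean - shape*scale."""
    shape = 4.0 / cs**2
    scale = mean * cv * cs / 2.0
    loc = mean - shape * scale
    return loc + scale * rng.gamma(shape, 1.0, size=n)

# Illustrative 60-min annual-maximum series (mm), 10 000 synthetic years.
series = pearson3(10_000, mean=60.0, cv=0.5, cs=1.0)
# The sample mean and Cv should recover the targets (60 and 0.5).
```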

  4. Unveiling Galaxy Bias via the Halo Model, KiDS and GAMA

    NASA Astrophysics Data System (ADS)

    Dvornik, Andrej; Hoekstra, Henk; Kuijken, Konrad; Schneider, Peter; Amon, Alexandra; Nakajima, Reiko; Viola, Massimo; Choi, Ami; Erben, Thomas; Farrow, Daniel J.; Heymans, Catherine; Hildebrandt, Hendrik; Sifón, Cristóbal; Wang, Lingyu

    2018-06-01

    We measure the projected galaxy clustering and galaxy-galaxy lensing signals using the Galaxy And Mass Assembly (GAMA) survey and Kilo-Degree Survey (KiDS) to study galaxy bias. We use the concept of non-linear and stochastic galaxy biasing in the framework of halo occupation statistics to constrain the parameters of the halo occupation statistics and to unveil the origin of galaxy biasing. The bias function Γgm(rp), where rp is the projected comoving separation, is evaluated using the analytical halo model from which the scale dependence of Γgm(rp), and the origin of the non-linearity and stochasticity in halo occupation models can be inferred. Our observations unveil the physical reason for the non-linearity and stochasticity, further explored using hydrodynamical simulations, with the stochasticity mostly originating from the non-Poissonian behaviour of satellite galaxies in the dark matter haloes and their spatial distribution, which does not follow the spatial distribution of dark matter in the halo. The observed non-linearity is mostly due to the presence of the central galaxies, as was noted from previous theoretical work on the same topic. We also see that overall, more massive galaxies reveal a stronger scale dependence, and out to a larger radius. Our results show that a wealth of information about galaxy bias is hidden in halo occupation models. These models should therefore be used to determine the influence of galaxy bias in cosmological studies.

  5. Stochastic output error vibration-based damage detection and assessment in structures under earthquake excitation

    NASA Astrophysics Data System (ADS)

    Sakellariou, J. S.; Fassois, S. D.

    2006-11-01

    A stochastic output error (OE) vibration-based methodology for damage detection and assessment (localization and quantification) in structures under earthquake excitation is introduced. The methodology is intended for assessing the state of a structure following potential damage occurrence by exploiting vibration signal measurements produced by low-level earthquake excitations. It is based upon (a) stochastic OE model identification, (b) statistical hypothesis testing procedures for damage detection, and (c) a geometric method (GM) for damage assessment. The methodology's advantages include the effective use of the non-stationary and limited duration earthquake excitation, the handling of stochastic uncertainties, the tackling of the damage localization and quantification subproblems, the use of "small" size, simple and partial (in both the spatial and frequency bandwidth senses) identified OE-type models, and the use of a minimal number of measured vibration signals. Its feasibility and effectiveness are assessed via Monte Carlo experiments employing a simple simulation model of a 6 storey building. It is demonstrated that damage levels of 5% and 20% reduction in a storey's stiffness characteristics may be properly detected and assessed using noise-corrupted vibration signals.
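    The authors' OE identification and geometric method are not reproduced here; the following much-simplified sketch illustrates the underlying idea: identify an AR(2) model of a single-mode structure from its noise-driven response and read a stiffness-loss-induced frequency drop off the identified poles. All parameter values are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def ar2_coeffs(f_n, zeta, dt):
        """Discrete AR(2) coefficients of a sampled damped oscillator."""
        w = 2.0 * np.pi * f_n
        wd = w * np.sqrt(1.0 - zeta**2)
        r = np.exp(-zeta * w * dt)
        return 2.0 * r * np.cos(wd * dt), -r**2

    def simulate(a1, a2, n, rng):
        """Single-mode response to white-noise (earthquake-like) excitation."""
        x, e = np.zeros(n), rng.standard_normal(n)
        for t in range(2, n):
            x[t] = a1 * x[t-1] + a2 * x[t-2] + e[t]
        return x

    def identify_frequency(x, dt):
        """Least-squares AR(2) identification; modal frequency from the pole angle."""
        X = np.column_stack([x[1:-1], x[:-2]])
        a1, a2 = np.linalg.lstsq(X, x[2:], rcond=None)[0]
        theta = np.arccos(a1 / (2.0 * np.sqrt(-a2)))
        return theta / (2.0 * np.pi * dt)

    dt, f0 = 0.01, 5.0  # sampling step (s) and healthy natural frequency (Hz)
    healthy = simulate(*ar2_coeffs(f0, 0.05, dt), 20000, rng)
    # A 20% storey-stiffness reduction lowers the natural frequency by sqrt(0.8)
    damaged = simulate(*ar2_coeffs(f0 * np.sqrt(0.8), 0.05, dt), 20000, rng)

    f_h = identify_frequency(healthy, dt)
    f_d = identify_frequency(damaged, dt)
    print(round(f_h, 2), round(f_d, 2))  # the damaged frequency is clearly lower
    ```

    The paper's statistical hypothesis tests and geometric damage assessment build on exactly this kind of identified-model comparison, but with OE rather than AR models and with formal decision thresholds.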

  6. Stochastic reservoir simulation for the modeling of uncertainty in coal seam degasification

    PubMed Central

    Karacan, C. Özgen; Olea, Ricardo A.

    2018-01-01

    Coal seam degasification improves coal mine safety by reducing the gas content of coal seams and also by generating added value as an energy source. Coal seam reservoir simulation is one of the most effective ways to help with these two main objectives. As in all modeling and simulation studies, how the reservoir is defined and whether observed productions can be predicted are important considerations. Using geostatistical realizations as spatial maps of different coal reservoir properties is a more realistic approach than assuming uniform properties across the field. In fact, this approach can help with simultaneous history matching of multiple wellbores to enhance the confidence in spatial models of different coal properties that are pertinent to degasification. The problem that still remains is the uncertainty in geostatistical simulations originating from the partial sampling of the seam that does not properly reflect the stochastic nature of coal property realizations. Stochastic simulations and using individual realizations, rather than E-type, make evaluation of uncertainty possible. This work is an advancement over Karacan et al. (2014) in the sense of assessing uncertainty that stems from geostatistical maps. In this work, we batched 100 individual realizations of 10 coal properties that were randomly generated to create 100 bundles and used them in 100 separate coal seam reservoir simulations for simultaneous history matching. We then evaluated the history matching errors for each bundle and defined the single set of realizations that would minimize the error for all wells. We further compared the errors with those of E-type and the average realization of the best matches. Unlike in Karacan et al. 
(2014), which used E-type maps and average of quantile maps, using these 100 bundles created 100 different history match results from separate simulations, and distributions of results for in-place gas quantity, for example, from which uncertainty in coal property realizations could be evaluated. The study helped to determine the realization bundle that consisted of the spatial maps of coal properties, which resulted in minimum error. In addition, it was shown that both E-type and the average of the realizations that gave the best match for individual wells approximated the same properties reasonably well. Moreover, the determined realization bundle showed that the study field initially had 151.5 million m3 (cubic meter) of gas and 1.04 million m3 water in the coal, corresponding to Q90 of the entire range of probability for gas and close to Q75 for water. In 2013, in-place fluid amounts decreased to 138.9 million m3 and 0.997 million m3 for gas and water, respectively. PMID:29563647

  7. Rapid simulation of spatial epidemics: a spectral method.

    PubMed

    Brand, Samuel P C; Tildesley, Michael J; Keeling, Matthew J

    2015-04-07

    Spatial structure and hence the spatial position of host populations plays a vital role in the spread of infection. In the majority of situations, it is only possible to predict the spatial spread of infection using simulation models, which can be computationally demanding especially for large population sizes. Here we develop an approximation method that vastly reduces this computational burden. We assume that the transmission rates between individuals or sub-populations are determined by a spatial transmission kernel. This kernel is assumed to be isotropic, such that the transmission rate is simply a function of the distance between susceptible and infectious individuals; as such this provides the ideal mechanism for modelling localised transmission in a spatial environment. We show that the spatial force of infection acting on all susceptibles can be represented as a spatial convolution between the transmission kernel and a spatially extended 'image' of the infection state. This representation allows the rapid calculation of stochastic rates of infection using fast Fourier transform (FFT) routines, which greatly improves the computational efficiency of spatial simulations. We demonstrate the efficiency and accuracy of this fast spectral rate recalculation (FSR) method with two examples: an idealised scenario simulating an SIR-type epidemic outbreak amongst N habitats distributed across a two-dimensional plane; the spread of infection between US cattle farms, illustrating that the FSR method makes continental-scale outbreak forecasting feasible with desktop processing power. The latter model demonstrates which areas of the US are at consistently high risk for cattle infections, although predictions of epidemic size are highly dependent on assumptions about the tail of the transmission kernel. Copyright © 2015 Elsevier Ltd. All rights reserved.
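    The core identity, that the spatial force of infection is a (circular) convolution of the kernel with the infection image and can therefore be computed with FFTs, can be checked directly on a toy grid. The grid size and kernel form below are illustrative, not the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 64                                            # grid of n x n habitat cells
    infectious = (rng.random((n, n)) < 0.02).astype(float)

    # Isotropic transmission kernel: rate decays with distance (hypothetical form),
    # evaluated on wrapped (periodic) distances to match circular convolution
    dx = np.minimum(np.arange(n), n - np.arange(n))
    dist = np.hypot(dx[:, None], dx[None, :])
    kernel = np.exp(-dist / 3.0)

    # Force of infection on every cell = kernel convolved with infection image
    foi_fft = np.real(np.fft.ifft2(np.fft.fft2(infectious) * np.fft.fft2(kernel)))

    # Direct O(n^4) check on the same wrapped geometry
    foi_direct = np.zeros((n, n))
    for (i, j) in zip(*np.nonzero(infectious)):
        foi_direct += np.roll(np.roll(kernel, i, axis=0), j, axis=1)

    print(np.allclose(foi_fft, foi_direct))  # prints True
    ```

    The FFT version costs O(n² log n) per recalculation regardless of how many cells are infectious, which is what makes the continental-scale simulations in the paper tractable.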

  8. Smoldyn: particle-based simulation with rule-based modeling, improved molecular interaction and a library interface.

    PubMed

    Andrews, Steven S

    2017-03-01

    Smoldyn is a spatial and stochastic biochemical simulator. It treats each molecule of interest as an individual particle in continuous space, simulating molecular diffusion, molecule-membrane interactions and chemical reactions, all with good accuracy. This article presents several new features. Smoldyn now supports two types of rule-based modeling. These are a wildcard method, which is very convenient, and the BioNetGen package with extensions for spatial simulation, which is better for complicated models. Smoldyn also includes new algorithms for simulating the diffusion of surface-bound molecules and molecules with excluded volume. Both are exact in the limit of short time steps and reasonably good with longer steps. In addition, Smoldyn supports single-molecule tracking simulations. Finally, the Smoldyn source code can be accessed through a C/C++ language library interface. Smoldyn software, documentation, code, and examples are available at http://www.smoldyn.org. Contact: steven.s.andrews@gmail.com. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  9. MOAB: a spatially explicit, individual-based expert system for creating animal foraging models

    USGS Publications Warehouse

    Carter, J.; Finn, John T.

    1999-01-01

    We describe the development, structure, and corroboration process of a simulation model of animal behavior (MOAB). MOAB can create spatially explicit, individual-based animal foraging models. Users can create or replicate heterogeneous landscape patterns, and place resources and individual animals of a given species on that landscape to simultaneously simulate the foraging behavior of multiple species. The heuristic rules for animal behavior are maintained in a user-modifiable expert system. MOAB can be used to explore hypotheses concerning the influence of landscape pattern on animal movement and foraging behavior. A red fox (Vulpes vulpes L.) foraging and nest predation model was created to test MOAB's capabilities. Foxes were simulated for 30-day periods using both expert system and random movement rules. Home range size, territory formation and movement patterns were compared with field observations and other available simulation studies. A striped skunk (Mephitis mephitis L.) model also was developed. The expert system model proved superior to the stochastic (random movement) model with respect to territory formation, general movement patterns and home range size.

  10. A conditional stochastic weather generator for seasonal to multi-decadal simulations

    NASA Astrophysics Data System (ADS)

    Verdin, Andrew; Rajagopalan, Balaji; Kleiber, William; Podestá, Guillermo; Bert, Federico

    2018-01-01

    We present the application of a parametric stochastic weather generator within a nonstationary context, enabling simulations of weather sequences conditioned on interannual and multi-decadal trends. The generalized linear model framework of the weather generator allows any number of covariates to be included, such as large-scale climate indices, local climate information, seasonal precipitation and temperature, among others. Here we focus on the Salado A basin of the Argentine Pampas as a case study, but the methodology is portable to any region. We include domain-averaged (e.g., areal) seasonal total precipitation and mean maximum and minimum temperatures as covariates for conditional simulation. Areal covariates are motivated by a principal component analysis that indicates the seasonal spatial average is the dominant mode of variability across the domain. We find this modification to be effective in capturing the nonstationarity prevalent in interseasonal precipitation and temperature data. We further illustrate the ability of this weather generator to act as a spatiotemporal downscaler of seasonal forecasts and multidecadal projections, both of which are generally of coarse resolution.
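    A minimal sketch of the conditioning idea, assuming a logistic (GLM) occurrence model with a first-order Markov term and one standardized seasonal covariate; the coefficients below are made up, not fitted values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    sigmoid = lambda u: 1.0 / (1.0 + np.exp(-u))

    def simulate_occurrence(n_days, z, rng, b0=-1.2, b1=1.0, b2=0.8):
        """First-order Markov precipitation-occurrence chain in GLM form.

        z is a standardized seasonal covariate (e.g. areal seasonal total);
        b0/b1/b2 are hypothetical logistic coefficients.
        """
        wet = np.zeros(n_days, dtype=int)
        for t in range(1, n_days):
            p_wet = sigmoid(b0 + b1 * wet[t-1] + b2 * z)
            wet[t] = rng.random() < p_wet
        return wet

    dry_season = simulate_occurrence(10000, z=-1.0, rng=rng)
    wet_season = simulate_occurrence(10000, z=+1.0, rng=rng)
    print(dry_season.mean() < wet_season.mean())  # covariate shifts wet-day frequency
    ```

    In the paper's framework the same construction extends to precipitation amounts and temperatures, with covariates supplied by seasonal forecasts or multidecadal projections, which is what lets the generator act as a downscaler.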

  11. A hybrid continuous-discrete method for stochastic reaction–diffusion processes

    PubMed Central

    Zheng, Likun; Nie, Qing

    2016-01-01

    Stochastic fluctuations in reaction–diffusion processes often have substantial effect on spatial and temporal dynamics of signal transductions in complex biological systems. One popular approach for simulating these processes is to divide the system into small spatial compartments assuming that molecules react only within the same compartment and jump between adjacent compartments driven by the diffusion. While the approach is convenient in terms of its implementation, its computational cost may become prohibitive when diffusive jumps occur significantly more frequently than reactions, as in the case of rapid diffusion. Here, we present a hybrid continuous-discrete method in which diffusion is simulated using continuous approximation while reactions are based on the Gillespie algorithm. Specifically, the diffusive jumps are approximated as continuous Gaussian random vectors with time-dependent means and covariances, allowing use of a large time step, even for rapid diffusion. By considering the correlation among diffusive jumps, the approximation is accurate for the second moment of the diffusion process. In addition, a criterion is obtained for identifying the region in which such diffusion approximation is required to enable adaptive calculations for better accuracy. Applications to a linear diffusion system and two nonlinear systems of morphogens demonstrate the effectiveness and benefits of the new hybrid method. PMID:27703710
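    The moment-matching idea behind the diffusion approximation can be illustrated on a 1-D compartment chain: exact binomial jump counts versus Gaussian variables with the same mean and variance. This is a sketch of the principle, not the paper's algorithm, and the parameters are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_comp, p = 41, 0.2      # compartments; per-step jump probability to each side

    def step_discrete(x, rng):
        """Exact compartmental diffusion: each molecule jumps left/right/stays."""
        left = rng.binomial(x, p)
        right = rng.binomial(x - left, p / (1.0 - p))   # conditional split
        stay = x - left - right
        return stay + np.roll(left, -1) + np.roll(right, 1)

    def step_gaussian(x, rng):
        """Hybrid-style approximation: jump counts replaced by Gaussian variables
        with the same mean and variance as the binomial counts."""
        mean = x * p
        var = np.maximum(x, 0.0) * p * (1.0 - p)   # clip near-empty compartments
        left = mean + np.sqrt(var) * rng.standard_normal(n_comp)
        right = mean + np.sqrt(var) * rng.standard_normal(n_comp)
        return x - left - right + np.roll(left, -1) + np.roll(right, 1)

    x0 = np.zeros(n_comp)
    x0[n_comp // 2] = 10000.0            # point source in the middle compartment
    xd, xg = x0.copy(), x0.copy()
    for _ in range(50):
        xd = step_discrete(xd.astype(int), rng)
        xg = step_gaussian(xg, rng)

    # Both versions conserve total mass and spread it at the same rate on average
    print(round(xd.sum()), round(xg.sum()))
    ```

    The payoff of the Gaussian version in the paper's setting is that it tolerates much larger time steps when diffusive jumps vastly outnumber reactions.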

  12. Stochastic seismic inversion based on an improved local gradual deformation method

    NASA Astrophysics Data System (ADS)

    Yang, Xiuwei; Zhu, Peimin

    2017-12-01

    A new stochastic seismic inversion method based on the local gradual deformation method is proposed, which can incorporate seismic data, well data, geology and their spatial correlations into the inversion process. Geological information, such as sedimentary facies and structures, could provide significant a priori information to constrain an inversion and arrive at reasonable solutions. The local a priori conditional cumulative distributions at each node of model to be inverted are first established by indicator cokriging, which integrates well data as hard data and geological information as soft data. Probability field simulation is used to simulate different realizations consistent with the spatial correlations and local conditional cumulative distributions. The corresponding probability field is generated by the fast Fourier transform moving average method. Then, optimization is performed to match the seismic data via an improved local gradual deformation method. Two improved strategies are proposed to be suitable for seismic inversion. The first strategy is that we select and update local areas of bad fitting between synthetic seismic data and real seismic data. The second one is that we divide each seismic trace into several parts and obtain the optimal parameters for each part individually. The applications to a synthetic example and a real case study demonstrate that our approach can effectively find fine-scale acoustic impedance models and provide uncertainty estimations.
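    The probability-field generation step can be sketched with a simple FFT moving average in one dimension, assuming a Gaussian covariance model with made-up parameters (the paper's FFT-MA is the multi-dimensional analogue of this construction).

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n, corr_len = 128, 10.0

    # Gaussian covariance model on a periodic 1-D grid (hypothetical parameters)
    lag = np.minimum(np.arange(n), n - np.arange(n))
    cov = np.exp(-((lag / corr_len) ** 2))

    # FFT moving average: filter white noise so its output spectrum is S,
    # the (non-negative) discrete spectrum of the target covariance
    S = np.maximum(np.real(np.fft.fft(cov)), 0.0)
    noise = rng.standard_normal((500, n))
    fields = np.real(np.fft.ifft(np.sqrt(S) * np.fft.fft(noise, axis=1), axis=1))

    # Each row is one realization with unit variance and the prescribed
    # spatial correlation; check the lag-5 correlation empirically
    lag5_corr = np.mean(fields[:, :-5] * fields[:, 5:])
    ```

    Because only the white-noise input changes between realizations, the same filter can be reused, which is what makes FFT-MA attractive inside an iterative optimization such as gradual deformation.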

  13. Pluviometric characterization of the Coca river basin by using a stochastic rainfall model

    NASA Astrophysics Data System (ADS)

    González-Zeas, Dunia; Chávez-Jiménez, Adriadna; Coello-Rubio, Xavier; Correa, Ángel; Martínez-Codina, Ángela

    2014-05-01

    An adequate design of hydraulic infrastructure, as well as the prediction and simulation of a river basin, requires historical records of high temporal and spatial resolution. However, the lack of an extensive precipitation network, the short time span of the data and the incomplete information provided by the available rainfall stations limit the analysis and design of complex hydraulic engineering systems. It is therefore necessary to develop new quantitative tools to overcome the obstacles posed by ungauged or poorly gauged basins. In this context, a spatial-temporal rainfall model makes it possible to simulate the historical behaviour of precipitation and, at the same time, to obtain long-term synthetic series that preserve its extremal behaviour. This paper characterizes precipitation in the Coca river basin, Ecuador, using RainSim V3, a robust and well-tested stochastic rainfall model based on a spatial-temporal Neyman-Scott rectangular pulses process. A preliminary consistency analysis of the available historical rainfall data was carried out in order to identify climatic regions with similar precipitation patterns. Mean and maximum yearly and monthly precipitation fields on high-resolution grids were obtained through interpolation techniques. According to this climatological similarity, long series of daily precipitation were generated in order to evaluate the model's skill in capturing the structure of the observed daily precipitation. The results show good model performance, reproducing the gross statistics well, including the extreme values of rainfall at the daily scale. The spatial pattern of the observed and simulated precipitation fields highlights two distinct regions of differing pluviometric behaviour, with lower precipitation in the upper part of the basin and higher precipitation in the lower part.

  14. Estimating the Spatial Extent of Unsaturated Zones in Heterogeneous River-Aquifer Systems

    NASA Astrophysics Data System (ADS)

    Schilling, Oliver S.; Irvine, Dylan J.; Hendricks Franssen, Harrie-Jan; Brunner, Philip

    2017-12-01

    The presence of unsaturated zones at the river-aquifer interface has large implications for numerous hydraulic and chemical processes. However, the hydrological and geological controls that influence the development of unsaturated zones have so far only been analyzed with simplified conceptualizations of flow processes, or homogeneous conceptualizations of the hydraulic conductivity in either the aquifer or the riverbed. We systematically investigated the influence of heterogeneous structures in both the riverbed and the aquifer on the development of unsaturated zones. A stochastic 1-D criterion that takes both riverbed and aquifer heterogeneity into account was developed using a Monte Carlo sampling technique. The approach allows the reliable estimation of the upper bound of the spatial extent of unsaturated areas underneath a riverbed. Through systematic numerical modeling experiments, we furthermore show that horizontal capillary forces can reduce the spatial extent of unsaturated zones under clogged areas. This analysis shows how the spatial structure of clogging layers and aquifers influences the propensity for unsaturated zones to develop: In riverbeds where clogged areas are made up of many small, spatially disconnected patches with a diameter in the order of 1 m, unsaturated areas are less likely to develop compared to riverbeds where large clogged areas exist adjacent to unclogged areas. A combination of the stochastic 1-D criterion with an analysis of the spatial structure of the clogging layers and the potential for resaturation can help develop an appropriate conceptual model and inform the choice of a suitable numerical simulator for river-aquifer systems.

  15. Biochemical simulations: stochastic, approximate stochastic and hybrid approaches.

    PubMed

    Pahle, Jürgen

    2009-01-01

    Computer simulations have become an invaluable tool to study the sometimes counterintuitive temporal dynamics of (bio-)chemical systems. In particular, stochastic simulation methods have attracted increasing interest recently. In contrast to the well-known deterministic approach based on ordinary differential equations, they can capture effects that occur due to the underlying discreteness of the systems and random fluctuations in molecular numbers. Numerous stochastic, approximate stochastic and hybrid simulation methods have been proposed in the literature. In this article, they are systematically reviewed in order to guide the researcher and help her find the appropriate method for a specific problem.
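    As a concrete instance of the discrete stochastic approach the review contrasts with ODE models, here is Gillespie's direct method for a simple birth-death process; the rate constants are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def gillespie_birth_death(k_birth, k_death, x0, t_end, rng):
        """Gillespie direct method for 0 -> X (rate k_birth) and X -> 0 (k_death*x)."""
        t, x = 0.0, x0
        times, states = [t], [x]
        while t < t_end:
            rates = np.array([k_birth, k_death * x])
            total = rates.sum()
            t += rng.exponential(1.0 / total)        # exponential waiting time
            if rng.random() < rates[0] / total:      # pick which reaction fires
                x += 1
            else:
                x -= 1
            times.append(t)
            states.append(x)
        return np.array(times), np.array(states)

    # The deterministic ODE predicts a steady state of k_birth/k_death = 50;
    # the stochastic trajectory fluctuates around it with Poisson-like noise
    times, states = gillespie_birth_death(10.0, 0.2, 0, 500.0, rng)
    ```

    Runs like this make the discreteness and fluctuations visible that an ODE solution, which converges smoothly to 50, cannot represent.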

  16. Biochemical simulations: stochastic, approximate stochastic and hybrid approaches

    PubMed Central

    2009-01-01

    Computer simulations have become an invaluable tool to study the sometimes counterintuitive temporal dynamics of (bio-)chemical systems. In particular, stochastic simulation methods have attracted increasing interest recently. In contrast to the well-known deterministic approach based on ordinary differential equations, they can capture effects that occur due to the underlying discreteness of the systems and random fluctuations in molecular numbers. Numerous stochastic, approximate stochastic and hybrid simulation methods have been proposed in the literature. In this article, they are systematically reviewed in order to guide the researcher and help her find the appropriate method for a specific problem. PMID:19151097

  17. Bayesian methods for characterizing unknown parameters of material models

    DOE PAGES

    Emery, J. M.; Grigoriu, M. D.; Field Jr., R. V.

    2016-02-04

    A Bayesian framework is developed for characterizing the unknown parameters of probabilistic models for material properties. In this framework, the unknown parameters are viewed as random and described by their posterior distributions obtained from prior information and measurements of quantities of interest that are observable and depend on the unknown parameters. The proposed Bayesian method is applied to characterize an unknown spatial correlation of the conductivity field in the definition of a stochastic transport equation and to solve this equation by Monte Carlo simulation and stochastic reduced order models (SROMs). As a result, the Bayesian method is also employed to characterize unknown parameters of material properties for laser welds from measurements of peak forces sustained by these welds.
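    A toy version of the framework, assuming a conjugate Gaussian prior-likelihood pair for a single unknown material parameter; the analytic posterior is checked against brute-force grid evaluation, which is the fallback when no conjugacy exists. All values are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Unknown parameter: mean conductivity (hypothetical units); noisy measurements
    theta_true, sigma = 2.5, 0.4
    y = theta_true + sigma * rng.standard_normal(20)

    # Prior N(mu0, tau0^2); Gaussian likelihood => conjugate Gaussian posterior
    mu0, tau0 = 0.0, 2.0
    tau_post = 1.0 / np.sqrt(1.0 / tau0**2 + len(y) / sigma**2)
    mu_post = tau_post**2 * (mu0 / tau0**2 + y.sum() / sigma**2)

    # The same posterior by direct grid evaluation (useful when no conjugacy)
    grid = np.linspace(-5.0, 10.0, 2001)
    dx = grid[1] - grid[0]
    log_post = (-0.5 * ((grid - mu0) / tau0) ** 2
                - 0.5 * ((y[:, None] - grid) ** 2).sum(axis=0) / sigma**2)
    post = np.exp(log_post - log_post.max())
    post /= post.sum() * dx                      # normalize the density
    mu_grid = (grid * post).sum() * dx           # posterior mean from the grid

    print(round(mu_post, 3), round(mu_grid, 3))
    ```

    In the paper the "measurements" are functionals of a stochastic PDE solution rather than direct observations, so the likelihood itself requires Monte Carlo or SROM evaluation, but the posterior-update logic is the same.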

  18. Bayesian methods for characterizing unknown parameters of material models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, J. M.; Grigoriu, M. D.; Field Jr., R. V.

    A Bayesian framework is developed for characterizing the unknown parameters of probabilistic models for material properties. In this framework, the unknown parameters are viewed as random and described by their posterior distributions obtained from prior information and measurements of quantities of interest that are observable and depend on the unknown parameters. The proposed Bayesian method is applied to characterize an unknown spatial correlation of the conductivity field in the definition of a stochastic transport equation and to solve this equation by Monte Carlo simulation and stochastic reduced order models (SROMs). As a result, the Bayesian method is also employed to characterize unknown parameters of material properties for laser welds from measurements of peak forces sustained by these welds.

  19. The structure of red-infrared scattergrams of semivegetated landscapes

    NASA Technical Reports Server (NTRS)

    Jasinski, Michael F.; Eagleson, Peter S.

    1988-01-01

    A physically based linear stochastic geometric canopy soil reflectance model is presented for characterizing spatial variability of semivegetated landscapes at subpixel and regional scales. Landscapes are conceptualized as stochastic geometric surfaces, incorporating not only the variability in geometric elements, but also the variability in vegetation and soil background reflectance which can be important in some scenes. The model is used to investigate several possible mechanisms which contribute to the often observed characteristic triangular shape of red-infrared scattergrams of semivegetated landscapes. Scattergrams of simulated semivegetated scenes are analyzed with respect to the scales of the satellite pixel and subpixel components. Analysis of actual aerial radiometric data of a pecan orchard is presented in comparison with ground observations as preliminary confirmation of the theoretical results.

  20. The structure of red-infrared scattergrams of semivegetated landscapes

    NASA Technical Reports Server (NTRS)

    Jasinski, Michael F.; Eagleson, Peter S.

    1989-01-01

    A physically based linear stochastic geometric canopy soil reflectance model is presented for characterizing spatial variability of semivegetated landscapes at subpixel and regional scales. Landscapes are conceptualized as stochastic geometric surfaces, incorporating not only the variability in geometric elements, but also the variability in vegetation and soil background reflectance which can be important in some scenes. The model is used to investigate several possible mechanisms which contribute to the often observed characteristic triangular shape of red-infrared scattergrams of semivegetated landscapes. Scattergrams of simulated semivegetated scenes are analyzed with respect to the scales of the satellite pixel and subpixel components. Analysis of actual aerial radiometric data of a pecan orchard is presented in comparison with ground observations as preliminary confirmation of the theoretical results.

  1. Stochastic Simulation of Dopamine Neuromodulation for Implementation of Fluorescent Neurochemical Probes in the Striatal Extracellular Space.

    PubMed

    Beyene, Abraham G; McFarlane, Ian R; Pinals, Rebecca L; Landry, Markita P

    2017-10-18

    Imaging the dynamic behavior of neuromodulatory neurotransmitters in the extracellular space that arises from individual quantal release events would constitute a major advance in neurochemical imaging. Spatial and temporal resolution of these highly stochastic neuromodulatory events requires concurrent advances in the chemical development of optical nanosensors selective for neuromodulators in concert with advances in imaging methodologies to capture millisecond neurotransmitter release. Herein, we develop and implement a stochastic model to describe dopamine dynamics in the extracellular space (ECS) of the brain dorsal striatum to guide the design and implementation of fluorescent neurochemical probes that record neurotransmitter dynamics in the ECS. Our model is developed from first principles and simulates release, diffusion, and reuptake of dopamine in a 3D simulation volume of striatal tissue. We find that in vivo imaging of neuromodulation requires simultaneous optimization of dopamine nanosensor reversibility and sensitivity: dopamine imaging in the striatum or nucleus accumbens requires nanosensors with an optimal dopamine dissociation constant (Kd) of 1 μM, whereas Kd values above 10 μM are required for dopamine imaging in the prefrontal cortex. Furthermore, as a result of the probabilistic nature of dopamine terminal activity in the striatum, our model reveals that imaging frame rates of 20 Hz are optimal for recording temporally resolved dopamine release events. Our work provides a modeling platform to probe how complex neuromodulatory processes can be studied with fluorescent nanosensors and enables direct evaluation of nanosensor chemistry and imaging hardware parameters. Our stochastic model is generic for evaluating fluorescent neurotransmission probes, and is broadly applicable to the design of other neurotransmitter fluorophores and their optimization for implementation in vivo.
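    A drastically simplified, well-mixed caricature of the release-reuptake balance that the paper resolves in 3-D: Poisson quantal release against Michaelis-Menten reuptake. Every parameter value below is hypothetical, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    dt, t_end = 0.001, 10.0       # time step and horizon (s)
    firing_rate = 4.0             # terminal firing rate (Hz), hypothetical
    dC_release = 0.1              # ECS concentration jump per quantum (uM)
    v_max, k_m = 4.0, 0.2         # Michaelis-Menten reuptake (uM/s, uM)

    steps = int(t_end / dt)
    c = np.zeros(steps)           # well-mixed dopamine concentration (uM)
    for t in range(1, steps):
        release = dC_release * rng.poisson(firing_rate * dt)   # quantal events
        reuptake = v_max * c[t-1] / (k_m + c[t-1]) * dt        # transporter uptake
        c[t] = max(c[t-1] + release - reuptake, 0.0)

    print(round(c[steps // 2:].mean(), 3))
    ```

    Even this 0-D sketch shows the stochastic transients a nanosensor must follow; the paper's 3-D model adds diffusion, which sets the spatial footprint of each release event and hence the Kd and frame-rate requirements it reports.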

  2. Artificial Neural Network Metamodels of Stochastic Computer Simulations

    DTIC Science & Technology

    1994-08-10

    Artificial Neural Network Metamodels of Stochastic Computer Simulations, by Robert Allen Kilmer, B.S. in Education Mathematics (report AD-A285 951).

  3. SPAGETTA: a Multi-Purpose Gridded Stochastic Weather Generator

    NASA Astrophysics Data System (ADS)

    Dubrovsky, M.; Huth, R.; Rotach, M. W.; Dabhi, H.

    2017-12-01

    SPAGETTA is a new multisite/gridded multivariate parametric stochastic weather generator (WG). Site-specific precipitation occurrence and amount are modelled by a Markov chain and a Gamma distribution, the non-precipitation variables are modelled by an autoregressive (AR) model conditioned on precipitation occurrence, and the spatial coherence of all variables is modelled following the Wilks (2009) approach. SPAGETTA may be run in two modes. Mode 1: it is run as a classical WG, calibrated using weather series from multiple sites, and may then produce arbitrarily long synthetic series mimicking the spatial and temporal structure of the calibration data. To generate weather series representing the future climate, the WG parameters are modified according to a climate change scenario, typically derived from GCM or RCM simulations. Mode 2: the user provides only basic information (not necessarily realistic) on the temporal and spatial auto-correlation structure of the weather variables and their mean annual cycle; the generator itself derives the parameters of the underlying AR model, which produces the multi-site weather series. Optionally, the user may add a spatially varying trend, which is superimposed on the synthetic series. The contribution consists of the following parts: (a) the model underlying the WG; (b) validation of the WG in terms of spatial temperature and precipitation characteristics, including characteristics of spatial hot/cold/dry/wet spells; (c) results of a climate change impact experiment, in which the WG parameters representing spatial and temporal variability are modified using climate change scenarios and the effect on the above spatial validation indices is analysed. In this experiment, the WG is calibrated using the E-OBS gridded daily weather data for several European regions, and the climate change scenarios are derived from selected RCM simulations (CORDEX database). (d) The second mode of operation is demonstrated by results obtained while developing a methodology for assessing the collective significance of trends in multi-site weather series. The performance of the proposed test statistics is assessed on a large number of realisations of synthetic series produced by the WG, assuming a given statistical structure and trend of the weather series.
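    The Wilks-style spatial-coherence step can be sketched as thresholded, spatially correlated Gaussian noise; the site geometry, correlation scale, and wet-day probability below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    n_sites, n_days, p_wet = 5, 20000, 0.3

    # Exponential inter-site correlation on a line of stations (hypothetical)
    coords = np.arange(n_sites, dtype=float)
    corr = np.exp(-np.abs(coords[:, None] - coords[None, :]) / 2.0)
    L = np.linalg.cholesky(corr)

    # Correlated standard-normal noise, thresholded per site so that each
    # site is wet on a fraction p_wet of days
    z = L @ rng.standard_normal((n_sites, n_days))
    thresholds = np.quantile(z, p_wet, axis=1, keepdims=True)
    wet = (z < thresholds).astype(float)

    near = np.corrcoef(wet[0], wet[1])[0, 1]   # adjacent sites
    far = np.corrcoef(wet[0], wet[4])[0, 1]    # distant sites
    print(near > far)  # occurrence correlation decays with distance
    ```

    Driving each site's Markov occurrence chain with correlated rather than independent uniforms is exactly how this construction injects spatial coherence without changing the single-site statistics.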

  4. Accurate hybrid stochastic simulation of a system of coupled chemical or biochemical reactions.

    PubMed

    Salis, Howard; Kaznessis, Yiannis

    2005-02-01

    The dynamical solution of a well-mixed, nonlinear stochastic chemical kinetic system, described by the Master equation, may be exactly computed using the stochastic simulation algorithm. However, because the computational cost scales with the number of reaction occurrences, systems with one or more "fast" reactions become costly to simulate. This paper describes a hybrid stochastic method that partitions the system into subsets of fast and slow reactions, approximates the fast reactions as a continuous Markov process, using a chemical Langevin equation, and accurately describes the slow dynamics using the integral form of the "Next Reaction" variant of the stochastic simulation algorithm. The key innovation of this method is its mechanism of efficiently monitoring the occurrences of slow, discrete events while simultaneously simulating the dynamics of a continuous, stochastic or deterministic process. In addition, by introducing an approximation in which multiple slow reactions may occur within a time step of the numerical integration of the chemical Langevin equation, the hybrid stochastic method performs much faster with only a marginal decrease in accuracy. Multiple examples, including a biological pulse generator and a large-scale system benchmark, are simulated using the exact and proposed hybrid methods as well as, for comparison, a previous hybrid stochastic method. Probability distributions of the solutions are compared and the weak errors of the first two moments are computed. In general, these hybrid methods may be applied to the simulation of the dynamics of a system described by stochastic differential, ordinary differential, and Master equations.
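    A minimal Euler-Maruyama integration of a chemical Langevin equation for the reversible isomerization A ⇌ B, illustrating the continuous approximation used for the fast partition. This is a sketch of the CLE idea, not the authors' hybrid algorithm, and the rate constants are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(10)

    # Chemical Langevin approximation of A <-> B with k1 = k2 = 1
    k1, k2, n_total = 1.0, 1.0, 1000.0
    dt, steps = 0.001, 20000
    a = n_total                      # start with all molecules in state A
    for _ in range(steps):
        r1, r2 = k1 * a, k2 * (n_total - a)     # propensities of both reactions
        dw1, dw2 = rng.standard_normal(2) * np.sqrt(dt)
        # Drift from the mean reaction fluxes, noise scaled by sqrt(propensity)
        a += (-r1 + r2) * dt - np.sqrt(r1) * dw1 + np.sqrt(r2) * dw2
        a = min(max(a, 0.0), n_total)           # CLE can leave the physical range

    print(round(a))  # fluctuates around the equilibrium value n_total / 2
    ```

    In the hybrid scheme described above, an integration loop like this advances the fast partition while the integrated slow propensities determine when a discrete slow reaction fires.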

  5. Systems Biology Approaches for Understanding Genome Architecture.

    PubMed

    Sewitz, Sven; Lipkow, Karen

    2016-01-01

    The linear and three-dimensional arrangement and composition of chromatin in eukaryotic genomes underlies the mechanisms directing gene regulation. Understanding this organization requires the integration of many data types and experimental results. Here we describe the approach of integrating genome-wide protein-DNA binding data to determine chromatin states. To investigate spatial aspects of genome organization, we present a detailed description of how to run stochastic simulations of protein movements within a simulated nucleus in 3D. This systems level approach enables the development of novel questions aimed at understanding the basic mechanisms that regulate genome dynamics.

  6. Modelling uncertainty in incompressible flow simulation using Galerkin based generalized ANOVA

    NASA Astrophysics Data System (ADS)

    Chakraborty, Souvik; Chowdhury, Rajib

    2016-11-01

    This paper presents a new algorithm, referred to here as Galerkin based generalized analysis of variance decomposition (GG-ANOVA), for modelling input uncertainties and their propagation in incompressible fluid flow. The proposed approach utilizes ANOVA to represent the unknown stochastic response. Further, the unknown component functions of ANOVA are represented using the generalized polynomial chaos expansion (PCE). The resulting functional form obtained by coupling the ANOVA and PCE is substituted into the stochastic Navier-Stokes equation (NSE), and Galerkin projection is employed to decompose it into a set of coupled deterministic 'Navier-Stokes-like' equations. Temporal discretization of the set of coupled deterministic equations is performed by employing the Adams-Bashforth scheme for the convective term and the Crank-Nicolson scheme for the diffusion term. Spatial discretization is performed by employing a finite difference scheme. Implementation of the proposed approach is illustrated by two examples. In the first example, a stochastic ordinary differential equation is considered; this example illustrates the performance of the proposed approach as the nature of the random variable changes. Furthermore, the convergence characteristics of GG-ANOVA have also been demonstrated. The second example investigates flow through a micro channel. Two case studies, namely the stochastic Kelvin-Helmholtz instability and the stochastic vortex dipole, have been investigated. For all the problems, results obtained using GG-ANOVA are in excellent agreement with benchmark solutions.

  7. Origin of Pareto-like spatial distributions in ecosystems.

    PubMed

    Manor, Alon; Shnerb, Nadav M

    2008-12-31

    Recent studies of cluster distribution in various ecosystems revealed Pareto statistics for the size of spatial colonies. These results were supported by cellular automata simulations that yield robust criticality for endogenous pattern formation based on positive feedback. We show that these patch statistics are a manifestation of the law of proportionate effect. Mapping the stochastic model to a Markov birth-death process, the transition rates are shown to scale linearly with cluster size. This mapping provides a connection between patch statistics and the dynamics of the ecosystem; the "first passage time" for different colonies emerges as a powerful tool that discriminates between endogenous and exogenous clustering mechanisms. Imminent catastrophic shifts (such as desertification) manifest themselves in a drastic change of the stability properties of spatial colonies.
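    The law of proportionate effect invoked above can be illustrated with a toy simulation: when each colony's multiplicative growth factor is drawn independently of its size, the log-sizes perform a random walk and the size distribution becomes heavy-tailed (approximately lognormal). All parameter values here are illustrative, not taken from the paper.

```python
import math
import random

def proportionate_growth(n=5000, steps=200, sigma=0.1, seed=1):
    """Gibrat-style toy model of the law of proportionate effect: each
    colony's size is multiplied each step by a random factor whose
    distribution does not depend on the current size."""
    rng = random.Random(seed)
    sizes = [1.0] * n
    for _ in range(steps):
        sizes = [s * math.exp(sigma * rng.gauss(0.0, 1.0)) for s in sizes]
    return sizes

sizes = proportionate_growth()
logs = [math.log(s) for s in sizes]
mu = sum(logs) / len(logs)
var = sum((x - mu) ** 2 for x in logs) / len(logs)
# log-sizes are sums of 200 i.i.d. N(0, 0.01) increments, so their
# variance should come out close to 200 * 0.1**2 = 2.0
print(round(var, 2))
```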

  8. Formulating Spatially Varying Performance in the Statistical Fusion Framework

    PubMed Central

    Landman, Bennett A.

    2012-01-01

    To date, label fusion methods have primarily relied either on global (e.g. STAPLE, globally weighted vote) or voxelwise (e.g. locally weighted vote) performance models. Optimality of the statistical fusion framework hinges upon the validity of the stochastic model of how a rater errs (i.e., the labeling process model). Hitherto, approaches have tended to focus on the extremes of potential models. Herein, we propose an extension to the STAPLE approach to seamlessly account for spatially varying performance by extending the performance level parameters to account for a smooth, voxelwise performance level field that is unique to each rater. This approach, Spatial STAPLE, provides significant improvements over state-of-the-art label fusion algorithms in both simulated and empirical data sets. PMID:22438513
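    The contrast between global and voxelwise performance models can be made concrete with a toy weighted-vote fuser. The helper below uses one global weight per rater; Spatial STAPLE's contribution is, roughly, to let such performance parameters vary smoothly across voxels instead. The label data and weights are invented for illustration.

```python
def weighted_vote(labels, weights):
    """Fuse raters' label maps by weighted voting. `weights` holds one
    scalar per rater (a *global* performance model); a spatially varying
    model would replace each scalar with a per-voxel weight field."""
    fused = []
    for v in range(len(labels[0])):
        tally = {}
        for lab, wt in zip(labels, weights):
            tally[lab[v]] = tally.get(lab[v], 0.0) + wt
        fused.append(max(tally, key=tally.get))
    return fused

# three raters over six voxels; rater 2 is trusted most
raters = [[0, 1, 1, 0, 1, 0],
          [0, 1, 0, 0, 1, 1],
          [0, 1, 1, 1, 1, 0]]
print(weighted_vote(raters, [1.0, 1.0, 2.5]))  # -> [0, 1, 1, 1, 1, 0]
```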

  9. Simulating immiscible multi-phase flow and wetting with 3D stochastic rotation dynamics (SRD)

    NASA Astrophysics Data System (ADS)

    Hiller, Thomas; Sanchez de La Lama, Marta; Herminghaus, Stephan; Brinkmann, Martin

    2013-11-01

    We use a variant of the mesoscopic particle method stochastic rotation dynamics (SRD) to simulate immiscible multi-phase flow on the pore and sub-pore scale in three dimensions. As an extension to the multi-color SRD method, first proposed by Inoue et al., we present an implementation that accounts for complex wettability on heterogeneous surfaces. In order to demonstrate the versatility of this algorithm, we consider immiscible two-phase flow through a model porous medium (disordered packing of spherical beads) where the substrate exhibits different spatial wetting patterns. We show that these patterns have a significant effect on the interface dynamics. Furthermore, the implementation of angular momentum conservation into the SRD algorithm allows us to extend the applicability of SRD to micro-fluidic systems. It is now possible to study, e.g., the internal flow behaviour of a droplet as a function of the driving velocity of the surrounding bulk fluid, or the splitting of droplets by an obstacle.

  10. FERN - a Java framework for stochastic simulation and evaluation of reaction networks.

    PubMed

    Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf

    2008-08-29

    Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either a) do not provide the most efficient simulation algorithms and are difficult to extend, b) cannot be easily integrated into other applications or c) do not allow the user to monitor and intervene in the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real-time from within the Cytoscape or CellDesigner environment. FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms both for exact and approximate stochastic simulation and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava.
Second, it can be used in a straightforward way both as a stand-alone program and within new systems biology applications. Finally, complex scenarios requiring intervention during the simulation progress can be modelled easily with FERN.

  11. Stochastic Modelling, Analysis, and Simulations of the Solar Cycle Dynamic Process

    NASA Astrophysics Data System (ADS)

    Turner, Douglas C.; Ladde, Gangaram S.

    2018-03-01

    Analytical solutions, discretization schemes and simulation results are presented for the time delay deterministic differential equation model of the solar dynamo presented by Wilmot-Smith et al. In addition, this model is extended under stochastic Gaussian white noise parametric fluctuations. The introduction of stochastic fluctuations incorporates variables affecting the dynamo process in the solar interior, estimation error of parameters, and uncertainty of the α-effect mechanism. Simulation results are presented and analyzed to exhibit the effects of stochastic parametric volatility-dependent perturbations. The results generalize and extend the work of Hazra et al. In fact, some of these results exhibit the oscillatory dynamic behavior generated by the stochastic parametric additive perturbations in the absence of time delay. In addition, the simulation results of the modified stochastic models shed light on changes in the behavior of the recently developed stochastic model of Hazra et al.

  12. High-resolution stochastic downscaling of climate models: simulating wind advection, cloud cover and precipitation

    NASA Astrophysics Data System (ADS)

    Peleg, Nadav; Fatichi, Simone; Burlando, Paolo

    2015-04-01

    A new stochastic approach to generate wind advection, cloud cover and precipitation fields is presented with the aim of formulating a space-time weather generator characterized by fields with high spatial and temporal resolution (e.g., 1 km x 1 km and 5 min). Its use is suitable for stochastic downscaling of climate scenarios in the context of hydrological, ecological and geomorphological applications. The approach is based on concepts from the Advanced WEather GENerator (AWE-GEN) presented by Fatichi et al. (2011, Adv. Water Resour.), the Space-Time Realizations of Areal Precipitation model (STREAP) introduced by Paschalis et al. (2013, Water Resour. Res.), and the High-Resolution Synoptically conditioned Weather Generator (HiReS-WG) presented by Peleg and Morin (2014, Water Resour. Res.). Advection fields are generated on the basis of the 500 hPa u and v wind direction variables derived from global or regional climate models. The advection velocity and direction are parameterized using Kappa and von Mises distributions, respectively. A random Gaussian field is generated using a fast Fourier transform to preserve the spatial correlation of advection. The cloud cover area, total precipitation area and mean advection of the field are coupled using a multi-autoregressive model. The approach is relatively parsimonious in terms of computational demand and, in the context of climate change, allows generating many stochastic realizations of current and projected climate in a fast and efficient way. A preliminary test of the approach is presented with reference to a case study in complex orographic terrain in the Swiss Alps.
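    Two of the building blocks mentioned above, sampling an advection direction from a von Mises distribution and generating a spatially correlated Gaussian field with a fast Fourier transform, can be sketched as follows. The concentration parameter, correlation model and grid size are our own illustrative choices rather than the paper's fitted values, and the Kappa distribution for advection speed is omitted.

```python
import numpy as np

rng = np.random.default_rng(42)

# 1) Advection direction from a von Mises (circular) distribution;
#    mean pi/4 and concentration 4.0 are illustrative values.
theta = rng.vonmises(mu=np.pi / 4, kappa=4.0, size=10_000)

# 2) Spatially correlated Gaussian field via FFT: multiply the spectrum
#    of white noise by the square root of a (here Gaussian-shaped)
#    correlation spectrum, then transform back.
n, corr_len = 128, 10.0                       # grid size, correlation length (cells)
k = np.fft.fftfreq(n)
kx, ky = np.meshgrid(k, k)
psd = np.exp(-0.5 * (2 * np.pi * corr_len) ** 2 * (kx ** 2 + ky ** 2))
white = rng.standard_normal((n, n))
field = np.fft.ifft2(np.fft.fft2(white) * np.sqrt(psd)).real
field = (field - field.mean()) / field.std()  # standardize the marginals

print(field.shape, float(theta.min()) >= -np.pi, float(theta.max()) <= np.pi)
```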

  13. Population dynamics, information transfer, and spatial organization in a chemical reaction network under spatial confinement and crowding conditions

    DOE PAGES

    Bellesia, Giovanni; Bales, Benjamin B.

    2016-10-10

    Here, we investigate, via Brownian dynamics simulations, the reaction dynamics of a generic, nonlinear chemical network under spatial confinement and crowding conditions. In detail, the Willamowski-Rossler chemical reaction system has been "extended" and considered as a prototype reaction-diffusion system. These results are potentially relevant to a number of open problems in biophysics and biochemistry, such as the synthesis of primitive cellular units (protocells) and the definition of their role in the chemical origin of life and the characterization of vesicle-mediated drug delivery processes. More generally, the computational approach presented in this work makes the case for the use of spatial stochastic simulation methods for the study of biochemical networks in vivo, where the "well-mixed" approximation is invalid and both thermal and intrinsic fluctuations linked to the possible presence of molecular species in low copy numbers cannot be averaged out.

  14. Multi-Dimensional, Mesoscopic Monte Carlo Simulations of Inhomogeneous Reaction-Drift-Diffusion Systems on Graphics-Processing Units

    PubMed Central

    Vigelius, Matthias; Meyer, Bernd

    2012-01-01

    For many biological applications, a macroscopic (deterministic) treatment of reaction-drift-diffusion systems is insufficient. Instead, one has to properly handle the stochastic nature of the problem and generate true sample paths of the underlying probability distribution. Unfortunately, stochastic algorithms are computationally expensive and, in most cases, the large number of participating particles renders the relevant parameter regimes inaccessible. In an attempt to address this problem we present a genuinely stochastic, multi-dimensional algorithm that solves the inhomogeneous, non-linear, drift-diffusion problem on a mesoscopic level. Our method improves on existing implementations in being multi-dimensional and handling inhomogeneous drift and diffusion. The algorithm is well suited for an implementation on data-parallel hardware architectures such as general-purpose graphics processing units (GPUs). We integrate the method into an operator-splitting approach that decouples chemical reactions from the spatial evolution. We demonstrate the validity and applicability of our algorithm with a comprehensive suite of standard test problems that also serve to quantify the numerical accuracy of the method. We provide a freely available, fully functional GPU implementation. Integration of the method into Inchman, a user-friendly web service that allows researchers to perform parallel simulations of reaction-drift-diffusion systems on GPU clusters, is underway. PMID:22506001

  15. Modeling the spatial spread of infectious diseases: the GLobal Epidemic and Mobility computational model

    PubMed Central

    Balcan, Duygu; Gonçalves, Bruno; Hu, Hao; Ramasco, José J.; Colizza, Vittoria

    2010-01-01

    Here we present the Global Epidemic and Mobility (GLEaM) model that integrates sociodemographic and population mobility data in a spatially structured stochastic disease approach to simulate the spread of epidemics at the worldwide scale. We discuss the flexible structure of the model that is open to the inclusion of different disease structures and local intervention policies. This makes GLEaM suitable for the computational modeling and anticipation of the spatio-temporal patterns of global epidemic spreading, the understanding of historical epidemics, the assessment of the role of human mobility in shaping global epidemics, and the analysis of mitigation and containment scenarios. PMID:21415939
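    The core idea of a spatially structured stochastic epidemic model, local chain-binomial SIR transitions plus random travel between subpopulations, can be sketched in a toy two-patch version. This is a deliberately simplified caricature of the GLEaM approach, not its actual scheme, and all parameter values are invented.

```python
import random

def step_sir(patches, beta, gamma, travel, rng):
    """One day of a toy two-patch stochastic SIR: chain-binomial
    infection/recovery within each patch, then symmetric random travel
    between the two patches."""
    for p in patches:
        n = p["S"] + p["I"] + p["R"]
        p_inf = 0.0 if n == 0 else 1.0 - (1.0 - beta / n) ** p["I"]
        new_inf = sum(rng.random() < p_inf for _ in range(p["S"]))
        new_rec = sum(rng.random() < gamma for _ in range(p["I"]))
        p["S"] -= new_inf
        p["I"] += new_inf - new_rec
        p["R"] += new_rec
    for comp in ("S", "I", "R"):              # each person travels w.p. `travel`
        m01 = sum(rng.random() < travel for _ in range(patches[0][comp]))
        m10 = sum(rng.random() < travel for _ in range(patches[1][comp]))
        patches[0][comp] += m10 - m01
        patches[1][comp] += m01 - m10
    return patches

rng = random.Random(3)
patches = [{"S": 990, "I": 10, "R": 0}, {"S": 1000, "I": 0, "R": 0}]
for _ in range(120):                          # 120 days, R0 = beta / gamma = 2
    step_sir(patches, beta=0.4, gamma=0.2, travel=0.01, rng=rng)
print(patches)
```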

  16. A stochastic-geometric model of soil variation in Pleistocene patterned ground

    NASA Astrophysics Data System (ADS)

    Lark, Murray; Meerschman, Eef; Van Meirvenne, Marc

    2013-04-01

    In this paper we examine the spatial variability of soil in parent material with complex spatial structure which arises from complex non-linear geomorphic processes. We show that this variability can be better modelled by a stochastic-geometric model than by a standard Gaussian random field. The benefits of the new model are seen in the reproduction of features of the target variable which influence processes like water movement and pollutant dispersal. Complex non-linear processes in the soil give rise to properties with non-Gaussian distributions. Even under a transformation to approximate marginal normality, such variables may have a more complex spatial structure than the Gaussian random field model of geostatistics can accommodate. In particular the extent to which extreme values of the variable are connected in spatially coherent regions may be misrepresented. As a result, for example, geostatistical simulation generally fails to reproduce the pathways for preferential flow in an environment where coarse infill of former fluvial channels or coarse alluvium of braided streams creates pathways for rapid movement of water. Multiple point geostatistics has been developed to deal with this problem. Multiple point methods proceed by sampling from a set of training images which can be assumed to reproduce the non-Gaussian behaviour of the target variable. The challenge is to identify appropriate sources of such images. In this paper we consider a mode of soil variation in which the soil varies continuously, exhibiting short-range lateral trends induced by local effects of the factors of soil formation which vary across the region of interest in an unpredictable way. The trends in soil variation are therefore only apparent locally, and the soil variation at regional scale appears random. We propose a stochastic-geometric model for this mode of soil variation called the Continuous Local Trend (CLT) model. 
We consider a case study of soil formed in relict patterned ground with pronounced lateral textural variations arising from the presence of infilled ice-wedges of Pleistocene origin. We show how knowledge of the pedogenetic processes in this environment, along with some simple descriptive statistics, can be used to select and fit a CLT model for the apparent electrical conductivity (ECa) of the soil. We use the model to simulate realizations of the CLT process, and compare these with realizations of a fitted Gaussian random field. We show how statistics that summarize the spatial coherence of regions with small values of ECa, which are expected to have coarse texture and so larger saturated hydraulic conductivity, are better reproduced by the CLT model than by the Gaussian random field. This suggests that the CLT model could be used to generate an unlimited supply of training images to allow multiple point geostatistical simulation or prediction of this or similar variables.

  17. Developing Stochastic Models as Inputs for High-Frequency Ground Motion Simulations

    NASA Astrophysics Data System (ADS)

    Savran, William Harvey

    High-frequency (10 Hz) deterministic ground motion simulations are challenged by our understanding of the small-scale structure of the earth's crust and the rupture process during an earthquake. We will likely never obtain deterministic models that can accurately describe these processes down to the meter scale length required for broadband wave propagation. Instead, we can attempt to explain the behavior, in a statistical sense, by including stochastic models defined by correlations observed in the natural earth and through physics based simulations of the earthquake rupture process. Toward this goal, we develop stochastic models to address both of the primary considerations for deterministic ground motion simulations: namely, the description of the material properties in the crust, and broadband earthquake source descriptions. Using borehole sonic log data recorded in the Los Angeles basin, we estimate the spatial correlation structure of the small-scale fluctuations in P-wave velocities by determining the best-fitting parameters of a von Karman correlation function. We find that Hurst exponents, nu, between 0.0-0.2, vertical correlation lengths, a_z, of 15-150 m, and a standard deviation, sigma, of about 5% characterize the variability in the borehole data. Using these parameters, we generated a stochastic model of velocity and density perturbations, combined it with leading seismic velocity models, and performed a validation exercise for the 2008 Chino Hills, CA, earthquake using heterogeneous media. We find that models of velocity and density perturbations can have significant effects on the wavefield at frequencies as low as 0.3 Hz, with ensemble median values of various ground motion metrics varying up to +/-50% at certain stations, compared to those computed solely from the CVM. Finally, we develop a kinematic rupture generator based on dynamic rupture simulations on geometrically complex faults. We analyze 100 dynamic rupture simulations on strike-slip faults ranging from Mw 6.4-7.2. We find that our dynamic simulations follow empirical scaling relationships for inter-plate strike-slip events, and provide source spectra comparable with an omega^-2 model. Our rupture generator reproduces GMPE medians and intra-event standard deviations of spectral accelerations for an ensemble of 10 Hz fully-deterministic ground motion simulations, as compared to NGA West2 GMPE relationships up to 0.2 seconds.
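    A one-dimensional version of the stochastic velocity model described above can be sketched by spectrally filtering white noise with a von Karman-type power spectrum. The parameter values follow the ranges quoted in the abstract (nu, a_z, sigma), but the spectral normalization and discretization are our own illustrative choices, not the thesis's method.

```python
import numpy as np

def von_karman_profile(n=4096, dz=1.0, a_z=50.0, nu=0.1, sigma=0.05, seed=7):
    """1D random velocity-perturbation profile with a von Karman-type
    power spectrum (Hurst exponent nu, vertical correlation length a_z),
    generated by spectral filtering of white noise and rescaled to the
    target standard deviation sigma."""
    rng = np.random.default_rng(seed)
    k = 2.0 * np.pi * np.fft.rfftfreq(n, d=dz)         # angular wavenumber
    psd = a_z / (1.0 + (k * a_z) ** 2) ** (nu + 0.5)   # 1D von Karman PSD, up to a constant
    noise = np.fft.rfft(rng.standard_normal(n))
    prof = np.fft.irfft(noise * np.sqrt(psd), n=n)
    return prof * (sigma / prof.std())                 # impose sigma = 5%

dlnv = von_karman_profile()
print(dlnv.shape, float(dlnv.std()))
```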

  18. Stochastic simulation and analysis of biomolecular reaction networks

    PubMed Central

    Frazier, John M; Chushak, Yaroslav; Foy, Brent

    2009-01-01

    Background In recent years, several stochastic simulation algorithms have been developed to generate Monte Carlo trajectories that describe the time evolution of the behavior of biomolecular reaction networks. However, the effects of various stochastic simulation and data analysis conditions on the observed dynamics of complex biomolecular reaction networks have not recieved much attention. In order to investigate these issues, we employed a a software package developed in out group, called Biomolecular Network Simulator (BNS), to simulate and analyze the behavior of such systems. The behavior of a hypothetical two gene in vitro transcription-translation reaction network is investigated using the Gillespie exact stochastic algorithm to illustrate some of the factors that influence the analysis and interpretation of these data. Results Specific issues affecting the analysis and interpretation of simulation data are investigated, including: (1) the effect of time interval on data presentation and time-weighted averaging of molecule numbers, (2) effect of time averaging interval on reaction rate analysis, (3) effect of number of simulations on precision of model predictions, and (4) implications of stochastic simulations on optimization procedures. Conclusion The two main factors affecting the analysis of stochastic simulations are: (1) the selection of time intervals to compute or average state variables and (2) the number of simulations generated to evaluate the system behavior. PMID:19534796

  19. On the equivalence of dynamically orthogonal and bi-orthogonal methods: Theory and numerical simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Minseok; Sapsis, Themistoklis P.; Karniadakis, George Em, E-mail: george_karniadakis@brown.edu

    2014-08-01

    The Karhunen–Lòeve (KL) decomposition provides a low-dimensional representation for random fields as it is optimal in the mean square sense. Although for many stochastic systems of practical interest, described by stochastic partial differential equations (SPDEs), solutions possess this low-dimensional character, they also have a strongly time-dependent form and to this end a fixed-in-time basis may not describe the solution in an efficient way. Motivated by this limitation of standard KL expansion, Sapsis and Lermusiaux (2009) [26] developed the dynamically orthogonal (DO) field equations which allow for the simultaneous evolution of both the spatial basis where uncertainty ‘lives’ but also themore » stochastic characteristics of uncertainty. Recently, Cheng et al. (2013) [28] introduced an alternative approach, the bi-orthogonal (BO) method, which performs the exact same tasks, i.e. it evolves the spatial basis and the stochastic characteristics of uncertainty. In the current work we examine the relation of the two approaches and we prove theoretically and illustrate numerically their equivalence, in the sense that one method is an exact reformulation of the other. We show this by deriving a linear and invertible transformation matrix described by a matrix differential equation that connects the BO and the DO solutions. We also examine a pathology of the BO equations that occurs when two eigenvalues of the solution cross, resulting in an instantaneous, infinite-speed, internal rotation of the computed spatial basis. We demonstrate that despite the instantaneous duration of the singularity this has important implications on the numerical performance of the BO approach. On the other hand, it is observed that the BO is more stable in nonlinear problems involving a relatively large number of modes. Several examples, linear and nonlinear, are presented to illustrate the DO and BO methods as well as their equivalence.« less

  20. Stochastic simulation and decadal prediction of hydroclimate in the Western Himalayas

    NASA Astrophysics Data System (ADS)

    Robertson, A. W.; Chekroun, M. D.; Cook, E.; D'Arrigo, R.; Ghil, M.; Greene, A. M.; Holsclaw, T.; Kondrashov, D. A.; Lall, U.; Lu, M.; Smyth, P.

    2012-12-01

    Improved estimates of climate over the next 10 to 50 years are needed for long-term planning in water resource and flood management. However, the task of effectively incorporating the results of climate change research into decision-making faces a "double conflict of scales": the temporal scales of climate model projections are too long, while their usable spatial scales (global to planetary) are much larger than those needed for actual decision making (at the regional to local level). This work is designed to help tackle this "double conflict" in the context of water management over monsoonal Asia, based on dendroclimatic multi-century reconstructions of drought indices and river flows. We identify low-frequency modes of variability with time scales from interannual to interdecadal based on these series, and then generate future scenarios based on (a) empirical model decadal predictions, and (b) stochastic simulations generated with autoregressive models that reproduce the power spectrum of the data. Finally, we consider how such scenarios could be used to develop reservoir optimization models. Results will be presented based on multi-century Upper Indus river discharge reconstructions that exhibit a strong periodicity near 27 years, which is shown to yield some retrospective forecasting skill over the 1700-2000 period at a 15-yr lead time. Stochastic simulations of annual PDSI drought index values over the Upper Indus basin are constructed using Empirical Model Reduction; their power spectra are shown to be quite realistic, with spectral peaks near 5-8 years.
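    The second class of scenarios, stochastic simulations from autoregressive models that reproduce the power spectrum of the data, can be sketched as follows: fit AR coefficients by solving the Yule-Walker equations, then generate a surrogate series driven by white noise. The toy series with a 27-sample periodicity loosely echoes the Upper Indus example; the order, noise level and all other choices are illustrative.

```python
import numpy as np

def yule_walker(x, order):
    """Fit AR(order) coefficients and innovation variance from the
    sample autocovariances (Yule-Walker equations)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    r = np.array([x[: n - k] @ x[k:] / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    phi = np.linalg.solve(R, r[1: order + 1])
    noise_var = r[0] - phi @ r[1: order + 1]
    return phi, noise_var

def simulate_ar(phi, noise_var, n, seed=0):
    """Generate a surrogate series with the fitted AR spectrum
    (after a 200-sample burn-in)."""
    rng = np.random.default_rng(seed)
    p = len(phi)
    x = np.zeros(n + 200)
    for t in range(p, len(x)):
        x[t] = phi @ x[t - p: t][::-1] + rng.normal(0.0, np.sqrt(noise_var))
    return x[200:]

# toy series with a ~27-sample periodicity plus noise
t = np.arange(600)
rng = np.random.default_rng(1)
series = np.sin(2 * np.pi * t / 27) + 0.5 * rng.standard_normal(600)
phi, nv = yule_walker(series, order=4)
surrogate = simulate_ar(phi, nv, n=600)
print(phi.shape, nv > 0, surrogate.shape)
```

    Yule-Walker estimates are always stable, so the surrogate cannot blow up; its spectrum inherits the peak near the 27-sample period from the fitted autocovariances.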

  1. Fusion of Hard and Soft Information in Nonparametric Density Estimation

    DTIC Science & Technology

    2015-06-10

    and stochastic optimization models, in analysis of simulation output, and when instantiating probability models. We adopt a constrained maximum...particular, density estimation is needed for generation of input densities to simulation and stochastic optimization models, in analysis of simulation output...an essential step in simulation analysis and stochastic optimization is the generation of probability densities for input random variables; see for

  2. Selected-node stochastic simulation algorithm

    NASA Astrophysics Data System (ADS)

    Duso, Lorenzo; Zechner, Christoph

    2018-04-01

    Stochastic simulations of biochemical networks are of vital importance for understanding complex dynamics in cells and tissues. However, existing methods to perform such simulations are associated with computational difficulties, and addressing these remains a daunting challenge. Here we introduce the selected-node stochastic simulation algorithm (snSSA), which allows us to exclusively simulate an arbitrary, selected subset of molecular species of a possibly large and complex reaction network. The algorithm is based on an analytical elimination of chemical species, thereby avoiding explicit simulation of the associated chemical events. These species are instead described continuously in terms of statistical moments derived from a stochastic filtering equation, resulting in a substantial speedup when compared to Gillespie's stochastic simulation algorithm (SSA). Moreover, we show that statistics obtained via snSSA profit from a variance reduction, which can significantly lower the number of Monte Carlo samples needed to achieve a certain performance. We demonstrate the algorithm using several biological case studies for which the simulation time could be reduced by orders of magnitude.

  3. Rule-based spatial modeling with diffusing, geometrically constrained molecules.

    PubMed

    Gruenert, Gerd; Ibrahim, Bashar; Lenser, Thorsten; Lohel, Maiko; Hinze, Thomas; Dittrich, Peter

    2010-06-07

    We suggest a new type of modeling approach for the coarse grained, particle-based spatial simulation of combinatorially complex chemical reaction systems. In our approach molecules possess a location in the reactor as well as an orientation and geometry, while the reactions are carried out according to a list of implicitly specified reaction rules. Because the reaction rules can contain patterns for molecules, a combinatorially complex or even infinitely sized reaction network can be defined. For our implementation (based on LAMMPS), we have chosen an already existing formalism (BioNetGen) for the implicit specification of the reaction network. This compatibility allows existing models to be imported easily, i.e., only additional geometry data files have to be provided. Our simulations show that the obtained dynamics can be fundamentally different from those simulations that use classical reaction-diffusion approaches like Partial Differential Equations or Gillespie-type spatial stochastic simulation. We show, for example, that the combination of combinatorial complexity and geometric effects leads to the emergence of complex self-assemblies and transportation phenomena happening faster than diffusion (using a model of molecular walkers on microtubules). When the mentioned classical simulation approaches are applied, these aspects of modeled systems cannot be observed without very special treatment. Furthermore, we show that the geometric information can even change the organizational structure of the reaction system. That is, a set of chemical species that can in principle form a stationary state in a Differential Equation formalism is potentially unstable when geometry is considered, and vice versa. 
We conclude that our approach provides a new general framework filling a gap between approaches with no or rigid spatial representation, like Partial Differential Equations, and specialized coarse-grained spatial simulation systems, like those for DNA or virus capsid self-assembly.

  4. Rule-based spatial modeling with diffusing, geometrically constrained molecules

    PubMed Central

    2010-01-01

    Background We suggest a new type of modeling approach for the coarse grained, particle-based spatial simulation of combinatorially complex chemical reaction systems. In our approach molecules possess a location in the reactor as well as an orientation and geometry, while the reactions are carried out according to a list of implicitly specified reaction rules. Because the reaction rules can contain patterns for molecules, a combinatorially complex or even infinitely sized reaction network can be defined. For our implementation (based on LAMMPS), we have chosen an already existing formalism (BioNetGen) for the implicit specification of the reaction network. This compatibility allows existing models to be imported easily, i.e., only additional geometry data files have to be provided. Results Our simulations show that the obtained dynamics can be fundamentally different from those simulations that use classical reaction-diffusion approaches like Partial Differential Equations or Gillespie-type spatial stochastic simulation. We show, for example, that the combination of combinatorial complexity and geometric effects leads to the emergence of complex self-assemblies and transportation phenomena happening faster than diffusion (using a model of molecular walkers on microtubules). When the mentioned classical simulation approaches are applied, these aspects of modeled systems cannot be observed without very special treatment. Furthermore, we show that the geometric information can even change the organizational structure of the reaction system. That is, a set of chemical species that can in principle form a stationary state in a Differential Equation formalism is potentially unstable when geometry is considered, and vice versa. 
Conclusions We conclude that our approach provides a new general framework filling a gap between approaches with no or rigid spatial representation, like Partial Differential Equations, and specialized coarse-grained spatial simulation systems, like those for DNA or virus capsid self-assembly. PMID:20529264

  5. Irregular synchronous activity in stochastically-coupled networks of integrate-and-fire neurons.

    PubMed

    Lin, J K; Pawelzik, K; Ernst, U; Sejnowski, T J

    1998-08-01

    We investigate the spatial and temporal aspects of firing patterns in a network of integrate-and-fire neurons arranged in a one-dimensional ring topology. The coupling is stochastic and shaped like a Mexican hat with local excitation and lateral inhibition. With perfect precision in the couplings, the attractors of activity in the network occur at every position in the ring. Inhomogeneities in the coupling break the translational invariance of localized attractors and lead to synchronization within highly active as well as weakly active clusters. The interspike interval variability is high, consistent with recent observations of spike time distributions in visual cortex. The robustness of our results is demonstrated with more realistic simulations on a network of McGregor neurons which model conductance changes and after-hyperpolarization potassium currents.
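
    The kind of network described above can be illustrated with a minimal leaky integrate-and-fire simulation on a ring. This is a generic sketch, not the authors' model: the threshold, coupling widths, drive and noise level below are illustrative guesses, and the Mexican-hat profile is written as a difference of Gaussians.

    ```python
    import math
    import random

    def simulate_ring(n=50, steps=2000, dt=0.1, seed=3):
        """Leaky integrate-and-fire neurons on a ring with Mexican-hat coupling:
        local excitation, lateral inhibition (all parameters illustrative)."""
        rng = random.Random(seed)

        def w(d):
            # coupling as a function of ring distance: narrow excitation minus
            # broad inhibition (difference of Gaussians)
            return 0.6 * math.exp(-(d / 2.0) ** 2) - 0.3 * math.exp(-(d / 6.0) ** 2)

        v = [rng.random() for _ in range(n)]   # membrane potentials
        spikes = [0] * n                       # spike counts per neuron
        for _ in range(steps):
            fired = [i for i in range(n) if v[i] >= 1.0]   # threshold crossing
            for i in fired:
                v[i] = 0.0                     # reset after a spike
                spikes[i] += 1
            for i in range(n):
                # recurrent input from neurons that fired this step, plus
                # constant drive and additive noise
                drive = 1.2 + sum(w(min(abs(i - j), n - abs(i - j))) for j in fired)
                v[i] += dt * (-v[i] + drive) + 0.05 * rng.gauss(0.0, 1.0)
        return spikes

    rates = simulate_ring()
    ```

    With the suprathreshold drive used here, localized clusters of high activity emerge; inhomogeneity could be added by jittering the weights `w(d)` per connection.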

  6. Stochastic Parametrization for the Impact of Neglected Variability Patterns

    NASA Astrophysics Data System (ADS)

    Kaiser, Olga; Hien, Steffen; Achatz, Ulrich; Horenko, Illia

    2017-04-01

    An efficient description of gravity wave variability and the related spontaneous emission processes requires an empirical stochastic closure for the impact of neglected variability patterns (subgrid scales, or SGS). In particular, we focus on the analysis of IGW emission within a tangent linear model, which requires a stochastic SGS parameterization to take the self-interaction of the ageostrophic flow components into account. For this purpose, we identify the best SGS model in terms of exactness and simplicity by deploying a wide range of data-driven model classes, including standard stationary regression models, autoregressive models, and artificial neural networks, as well as nonstationary models such as the FEM-BV-VARX class (finite-element-based vector autoregressive time series analysis with bounded variation of the model parameters). The models are used to investigate the main characteristics of the underlying dynamics and to explore the significant spatial and temporal neighbourhood dependencies. The best SGS model in terms of exactness and simplicity is obtained for the nonstationary FEM-BV-VARX setting, which identifies only the direct spatial and temporal neighbourhood as significant and allows a drastic reduction in the amount of information required for the optimal SGS. Additionally, the models are characterized by sets of vector- and matrix-valued parameters that must be inferred from big data sets provided by simulations, making this a task that cannot be solved without deploying high-performance computing (HPC) facilities.

  7. Hybrid Optimal Design of the Eco-Hydrological Wireless Sensor Network in the Middle Reach of the Heihe River Basin, China

    PubMed Central

    Kang, Jian; Li, Xin; Jin, Rui; Ge, Yong; Wang, Jinfeng; Wang, Jianghao

    2014-01-01

    The eco-hydrological wireless sensor network (EHWSN) in the middle reaches of the Heihe River Basin in China is designed to capture the spatial and temporal variability and to estimate the ground truth for validating remote sensing products. However, no prior information about the target variable is available. To meet both requirements, a hybrid model-based sampling method without any spatial autocorrelation assumptions is developed to optimize the distribution of EHWSN nodes based on geostatistics. This hybrid model incorporates two sub-criteria: one for variogram modeling to represent the variability, the other for improving the spatial prediction used to evaluate remote sensing products. The suitability of the optimized EHWSN is validated in terms of representativeness, variogram modeling and spatial prediction accuracy using 15 types of simulation fields generated with unconditional geostatistical stochastic simulation. The sampling design shows good representativeness; variograms estimated from the samples have less than 3% mean error relative to the true variograms. Fields are then predicted at multiple scales; as the scale increases, the estimated fields show higher similarity to the simulation fields at block sizes exceeding 240 m. These validations show that the hybrid sampling method is effective for both objectives when the characteristics of the variable being optimized are unknown. PMID:25317762

  9. Deciphering the expression of climate change within the Lower Colorado River basin by stochastic simulation of convective rainfall

    NASA Astrophysics Data System (ADS)

    Bliss Singer, Michael; Michaelides, Katerina

    2017-10-01

    In drylands, convective rainstorms typically control runoff, streamflow, water supply and flood risk to human populations, and ecological water availability at multiple spatial scales. Since drainage basin water balance is sensitive to climate, it is important to improve characterization of convective rainstorms in a manner that enables statistical assessment of rainfall at high spatial and temporal resolution, and the prediction of plausible manifestations of climate change. Here we present a simple rainstorm generator, STORM, for convective storm simulation. It was created using data from a rain gauge network in one dryland drainage basin, but is applicable anywhere. We employ STORM to assess watershed rainfall under climate change simulations that reflect differences in wetness/storminess, and thus provide insight into observed or projected regional hydrologic trends. Our analysis documents historical, regional climate change manifesting as a multidecadal decline in rainfall intensity, which we suggest has negatively impacted ephemeral runoff in the Lower Colorado River basin, but has not contributed substantially to regional negative streamflow trends.

  10. Waveform inversion with source encoding for breast sound speed reconstruction in ultrasound computed tomography.

    PubMed

    Wang, Kun; Matthews, Thomas; Anis, Fatima; Li, Cuiping; Duric, Neb; Anastasio, Mark A

    2015-03-01

    Ultrasound computed tomography (USCT) holds great promise for improving the detection and management of breast cancer. Because they are based on the acoustic wave equation, waveform inversion-based reconstruction methods can produce images that possess improved spatial resolution properties over those produced by ray-based methods. However, waveform inversion methods are computationally demanding and have not been applied widely in USCT breast imaging. In this work, source encoding concepts are employed to develop an accelerated USCT reconstruction method that circumvents the large computational burden of conventional waveform inversion methods. This method, referred to as the waveform inversion with source encoding (WISE) method, encodes the measurement data using a random encoding vector and determines an estimate of the sound speed distribution by solving a stochastic optimization problem by use of a stochastic gradient descent algorithm. Both computer simulation and experimental phantom studies are conducted to demonstrate the use of the WISE method. The results suggest that the WISE method maintains the high spatial resolution of waveform inversion methods while significantly reducing the computational burden.

  11. Stochastic ecological network occupancy (SENO) models: a new tool for modeling ecological networks across spatial scales

    USGS Publications Warehouse

    Lafferty, Kevin D.; Dunne, Jennifer A.

    2010-01-01

    Stochastic ecological network occupancy (SENO) models predict the probability that species will occur in a sample of an ecological network. In this review, we introduce SENO models as a means to fill a gap in the theoretical toolkit of ecologists. As input, SENO models use a topological interaction network and rates of colonization and extinction (including consumer effects) for each species. A SENO model then simulates the ecological network over time, resulting in a series of sub-networks that can be used to identify commonly encountered community modules. The proportion of time a species is present in a patch gives its expected probability of occurrence, whose sum across species gives expected species richness. To illustrate their utility, we provide simple examples of how SENO models can be used to investigate how topological complexity, species interactions, species traits, and spatial scale affect communities in space and time. They can categorize species as biodiversity facilitators, contributors, or inhibitors, making this approach promising for ecosystem-based management of invasive, threatened, or exploited species.
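
    The simulation loop of a SENO-style model can be sketched in a few lines. This is a toy illustration, not the authors' implementation: the three-species food chain, colonization and extinction rates are invented, and the forced-extinction rule is one simple way to encode consumer dependence on prey.

    ```python
    import random

    def seno(web, col, ext, steps=5000, seed=4):
        """Minimal SENO-style occupancy simulation for one patch.
        web[s] lists the prey of consumer s (empty for basal species);
        a consumer goes locally extinct when none of its prey are present."""
        rng = random.Random(seed)
        n = len(web)
        present = [False] * n
        occupancy = [0] * n
        for _ in range(steps):
            for s in range(n):
                if present[s]:
                    if web[s] and not any(present[p] for p in web[s]):
                        present[s] = False          # forced extinction: no prey left
                    elif rng.random() < ext[s]:
                        present[s] = False          # background extinction
                else:
                    # a consumer can only colonize if at least one prey is present
                    can_colonize = not web[s] or any(present[p] for p in web[s])
                    if can_colonize and rng.random() < col[s]:
                        present[s] = True
            for s in range(n):
                occupancy[s] += present[s]
        # proportion of time present = expected probability of occurrence
        return [o / steps for o in occupancy]

    # Three-level chain: plant (0) -> herbivore (1) -> predator (2)
    probs = seno(web=[[], [0], [1]], col=[0.2, 0.2, 0.2], ext=[0.05, 0.1, 0.15])
    ```

    Summing the returned occurrence probabilities gives the expected species richness described in the abstract.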

  12. I = 1 and I = 2 π-π scattering phase shifts from Nf = 2 + 1 lattice QCD

    NASA Astrophysics Data System (ADS)

    Bulava, John; Fahy, Brendan; Hörz, Ben; Juge, Keisuke J.; Morningstar, Colin; Wong, Chik Him

    2016-09-01

    The I = 1 p-wave and I = 2 s-wave elastic π-π scattering amplitudes are calculated from a first-principles lattice QCD simulation using a single ensemble of gauge field configurations with Nf = 2 + 1 dynamical flavors of anisotropic clover-improved Wilson fermions. This ensemble has a large spatial volume V =(3.7 fm)3, pion mass mπ = 230 MeV, and spatial lattice spacing as = 0.11 fm. Calculation of the necessary temporal correlation matrices is efficiently performed using the stochastic LapH method, while the large volume enables an improved energy resolution compared to previous work. For this single ensemble we obtain mρ /mπ = 3.350 (24), gρππ = 5.99 (26), and a clear signal for the I = 2 s-wave. The success of the stochastic LapH method in this proof-of-principle large-volume calculation paves the way for quantitative study of the lattice spacing effects and quark mass dependence of scattering amplitudes using state-of-the-art ensembles.

  13. Introduction to Stochastic Simulations for Chemical and Physical Processes: Principles and Applications

    ERIC Educational Resources Information Center

    Weiss, Charles J.

    2017-01-01

    An introduction to digital stochastic simulations for modeling a variety of physical and chemical processes is presented. Despite the importance of stochastic simulations in chemistry, the prevalence of turn-key software solutions can impose a layer of abstraction between the user and the underlying approach obscuring the methodology being…

  14. Emergence of diversity in homogeneous coupled Boolean networks

    NASA Astrophysics Data System (ADS)

    Kang, Chris; Aguilar, Boris; Shmulevich, Ilya

    2018-05-01

    The origin of multicellularity in metazoa is one of the fundamental questions of evolutionary biology. We have modeled the generic behaviors of gene regulatory networks in isogenic cells as stochastic nonlinear dynamical systems—coupled Boolean networks with perturbation. Model simulations under a variety of dynamical regimes suggest that the central characteristic of multicellularity, permanent spatial differentiation (diversification), can indeed arise. Additionally, we observe that diversification is more likely to occur near the critical regime of Lyapunov stability.

  15. Hierarchical stochastic modeling of large river ecosystems and fish growth across spatio-temporal scales and climate models: the Missouri River endangered pallid sturgeon example

    USGS Publications Warehouse

    Wildhaber, Mark L.; Wikle, Christopher K.; Moran, Edward H.; Anderson, Christopher J.; Franz, Kristie J.; Dey, Rima

    2017-01-01

    We present a hierarchical series of spatially decreasing and temporally increasing models to evaluate the uncertainty in the atmosphere-ocean global climate model (AOGCM) and the regional climate model (RCM) relative to the uncertainty in the somatic growth of the endangered pallid sturgeon (Scaphirhynchus albus). For effects on fish populations of riverine ecosystems, climate output simulated by coarse-resolution AOGCMs and RCMs must be downscaled to basins, to river hydrology, and then to population response. One needs to transfer the information from these climate simulations down to the individual scale in a way that minimizes extrapolation and can account for spatio-temporal variability in the intervening stages. The goal is a framework to determine whether, given uncertainties in the climate models and the biological response, meaningful inference can still be made. The non-linear downscaling of climate information to the river scale requires that one realistically account for spatial and temporal variability across scales. Our downscaling procedure includes the use of fixed/calibrated hydrological flow and temperature models coupled with a stochastically parameterized sturgeon bioenergetics model. We show that, although there is a large amount of uncertainty associated with both the climate model output and the fish growth process, one can establish significant differences in fish growth distributions between models, and between future and current climates for a given model.

  16. Stochastic lattice model of synaptic membrane protein domains.

    PubMed

    Li, Yiwei; Kahraman, Osman; Haselwandter, Christoph A

    2017-05-01

    Neurotransmitter receptor molecules, concentrated in synaptic membrane domains along with scaffolds and other kinds of proteins, are crucial for signal transmission across chemical synapses. In common with other membrane protein domains, synaptic domains are characterized by low protein copy numbers and protein crowding, with rapid stochastic turnover of individual molecules. We study here in detail a stochastic lattice model of the receptor-scaffold reaction-diffusion dynamics at synaptic domains that was found previously to capture, at the mean-field level, the self-assembly, stability, and characteristic size of synaptic domains observed in experiments. We show that our stochastic lattice model yields quantitative agreement with mean-field models of nonlinear diffusion in crowded membranes. Through a combination of analytic and numerical solutions of the master equation governing the reaction dynamics at synaptic domains, together with kinetic Monte Carlo simulations, we find substantial discrepancies between mean-field and stochastic models for the reaction dynamics at synaptic domains. Based on the reaction and diffusion properties of synaptic receptors and scaffolds suggested by previous experiments and mean-field calculations, we show that the stochastic reaction-diffusion dynamics of synaptic receptors and scaffolds provide a simple physical mechanism for collective fluctuations in synaptic domains, the molecular turnover observed at synaptic domains, key features of the observed single-molecule trajectories, and spatial heterogeneity in the effective rates at which receptors and scaffolds are recycled at the cell membrane. Our work sheds light on the physical mechanisms and principles linking the collective properties of membrane protein domains to the stochastic dynamics that rule their molecular components.

  17. MONALISA for stochastic simulations of Petri net models of biochemical systems.

    PubMed

    Balazki, Pavel; Lindauer, Klaus; Einloft, Jens; Ackermann, Jörg; Koch, Ina

    2015-07-10

    The concept of Petri nets (PN) is widely used in systems biology and allows the modeling of complex biochemical systems such as metabolic networks, signal transduction pathways, and gene expression networks. In particular, PN allow topological analysis based on structural properties, which is important and useful when quantitative (kinetic) data are incomplete or unknown. When the kinetic parameters are known, simulating the time evolution of such models can help to study the dynamic behavior of the underlying system. If the number of involved entities (molecules) is low, a stochastic simulation should be preferred over the classical deterministic approach of solving ordinary differential equations. The Stochastic Simulation Algorithm (SSA) is a common method for such simulations. The combination of qualitative and semi-quantitative PN modeling with stochastic analysis techniques provides a valuable approach in the field of systems biology. Here, we describe the implementation of stochastic analysis in a PN environment. We extended MONALISA - open-source software for the creation, visualization and analysis of PN - by several stochastic simulation methods. The simulation module offers four simulation modes, among them a stochastic mode with constant firing rates and exact and approximate versions of Gillespie's algorithm. The simulator is operated through a user-friendly graphical interface and accepts input data such as concentrations and reaction rate constants, which are common parameters in the biological context. The key features of the simulation module are visualization of simulations, interactive plotting, export of results to a text file, mathematical expressions for describing simulation parameters, and up to 500 parallel simulations of the same parameter set. To illustrate the method, we discuss a model of insulin receptor recycling as a case study.
We present software that combines the modeling power of Petri nets with stochastic simulation of dynamic processes in a user-friendly environment supported by an intuitive graphical interface. The program offers a valuable alternative to modeling with ordinary differential equations, especially when simulating single-cell experiments with low molecule counts. The ability to use mathematical expressions provides additional flexibility in describing the simulation parameters. The open-source distribution allows further extensions by third-party developers. The software is cross-platform and licensed under the Artistic License 2.0.
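
    For readers unfamiliar with the SSA that underlies such stochastic modes, Gillespie's direct method can be sketched generically. This is not MonaLisa code; the reversible dimerization system and rate constants below are illustrative.

    ```python
    import random

    def gillespie(x, stoich, rate_fn, t_end, seed=0):
        """Gillespie direct method: sample one reaction firing at a time."""
        rng = random.Random(seed)
        t, trajectory = 0.0, [(0.0, tuple(x))]
        while t < t_end:
            props = rate_fn(x)                 # propensity of each reaction
            total = sum(props)
            if total == 0.0:                   # no reaction can fire
                break
            t += rng.expovariate(total)        # exponential time to next event
            r = rng.uniform(0.0, total)        # pick which reaction fires
            for j, p in enumerate(props):
                r -= p
                if r <= 0.0:
                    x = [xi + s for xi, s in zip(x, stoich[j])]
                    break
            trajectory.append((t, tuple(x)))
        return trajectory

    # Toy reversible dimerization: 2A -> B (k1), B -> 2A (k2)
    k1, k2 = 0.01, 0.1
    stoich = [(-2, +1), (+2, -1)]
    rates = lambda x: [k1 * x[0] * (x[0] - 1) / 2, k2 * x[1]]
    traj = gillespie([100, 0], stoich, rates, t_end=10.0)
    ```

    Both reaction channels conserve the quantity A + 2B, which is a convenient sanity check on any SSA implementation.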

  18. Stochastic simulation of human pulmonary blood flow and transit time frequency distribution based on anatomic and elasticity data.

    PubMed

    Huang, Wei; Shi, Jun; Yen, R T

    2012-12-01

    The objective of our study was to develop a program for computing the transit time frequency distributions of red blood cells in the human pulmonary circulation, based on our anatomic and elasticity data for blood vessels in the human lung. A stochastic simulation model was introduced to simulate blood flow in the human pulmonary circulation: the connectivity data of the pulmonary blood vessels were converted into a probability matrix. Based on this model, the transit time of red blood cells in the human pulmonary circulation and the output blood pressure were studied. The stochastic simulation model can also be used to predict changes in blood flow in the human pulmonary circulation, with the advantages of lower computing cost and higher flexibility. In conclusion, a stochastic simulation approach was introduced to simulate blood flow in the hierarchical structure of the pulmonary circulation, and to calculate the transit time distributions and blood pressure outputs.
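
    The probability-matrix idea can be illustrated with a toy Markov chain over vessel orders. The transition probabilities and per-order residence times below are invented for illustration and are not the paper's anatomic data.

    ```python
    import random

    def transit_times(P, seg_time, n_cells=2000, seed=5):
        """Sample red-cell transit times through a vessel network modeled as a
        Markov chain: P[i][j] is the probability of moving from vessel order i
        to order j; seg_time[i] is the time spent in order i. A row summing to
        less than 1 means the cell exits the network from that order."""
        rng = random.Random(seed)
        n = len(P)
        times = []
        for _ in range(n_cells):
            state, t = 0, 0.0
            while state < n:
                t += seg_time[state]
                u, cum, nxt = rng.random(), 0.0, n   # default: exit the network
                for j, p in enumerate(P[state]):
                    cum += p
                    if u < cum:
                        nxt = j
                        break
                state = nxt
            times.append(t)
        return times

    # Toy 3-order network: artery -> capillaries -> vein, with some recirculation
    P = [[0.0, 1.0, 0.0],   # arterial order always feeds the capillary order
         [0.0, 0.1, 0.9],   # capillaries: 10% re-enter a capillary, 90% to vein
         [0.0, 0.0, 0.0]]   # venous order: the cell exits the network
    seg_time = [0.5, 2.0, 0.5]   # residence time per order (arbitrary units)
    times = transit_times(P, seg_time)
    ```

    A histogram of `times` approximates the transit time frequency distribution; the recirculation entry produces the long right tail typical of such distributions.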

  19. Modelling the spread of American foulbrood in honeybees

    PubMed Central

    Datta, Samik; Bull, James C.; Budge, Giles E.; Keeling, Matt J.

    2013-01-01

    We investigate the spread of American foulbrood (AFB), a disease caused by the bacterium Paenibacillus larvae, that affects bees and can be extremely damaging to beehives. Our dataset comes from an inspection period carried out during an AFB epidemic of honeybee colonies on the island of Jersey during the summer of 2010. The data include the number of hives of honeybees, location and owner of honeybee apiaries across the island. We use a spatial SIR model with an underlying owner network to simulate the epidemic and characterize the epidemic using a Markov chain Monte Carlo (MCMC) scheme to determine model parameters and infection times (including undetected ‘occult’ infections). Likely methods of infection spread can be inferred from the analysis, with both distance- and owner-based transmissions being found to contribute to the spread of AFB. The results of the MCMC are corroborated by simulating the epidemic using a stochastic SIR model, resulting in aggregate levels of infection that are comparable to the data. We use this stochastic SIR model to simulate the impact of different control strategies on controlling the epidemic. It is found that earlier inspections result in smaller epidemics and a higher likelihood of AFB extinction. PMID:24026473
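
    A non-spatial stochastic SIR of the kind used above for corroboration can be sketched with Gillespie-style event sampling. The parameters below are illustrative, not the fitted AFB values, and the owner network and spatial kernel of the paper are omitted.

    ```python
    import random

    def stochastic_sir(S, I, R, beta, gamma, seed=2):
        """Event-driven stochastic SIR: returns final epidemic size and duration."""
        rng = random.Random(seed)
        N = S + I + R
        t = 0.0
        while I > 0:
            inf_rate = beta * S * I / N      # infection propensity
            rec_rate = gamma * I             # recovery/removal propensity
            total = inf_rate + rec_rate
            t += rng.expovariate(total)      # time to next event
            if rng.uniform(0.0, total) < inf_rate:
                S, I = S - 1, I + 1          # infection event
            else:
                I, R = I - 1, R + 1          # recovery event
        return R, t

    final_size, duration = stochastic_sir(S=200, I=1, R=0, beta=2.0, gamma=1.0)
    ```

    Running this many times from a single initial infective shows the bimodal outcome typical of stochastic epidemics: early extinction versus a large outbreak.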

  20. Hybrid ODE/SSA methods and the cell cycle model

    NASA Astrophysics Data System (ADS)

    Wang, S.; Chen, M.; Cao, Y.

    2017-07-01

    Stochastic effects in cellular systems are an important topic in systems biology, and stochastic modeling and simulation methods are important tools for studying them. Given the low efficiency of stochastic simulation algorithms, the hybrid method, which combines an ordinary differential equation (ODE) system with a stochastic chemically reacting system, shows unique advantages in the modeling and simulation of biochemical systems. The efficiency of the hybrid method is usually limited by reactions in the stochastic subsystem, which are modeled and simulated using Gillespie's framework and frequently interrupt the integration of the ODE subsystem. In this paper we develop an efficient implementation approach for the hybrid method coupled with traditional ODE solvers. We also compare the efficiency of hybrid methods with three widely used ODE solvers: RADAU5, DASSL, and DLSODAR. Numerical experiments with three biochemical models are presented, and the performance of the three ODE solvers is discussed in detail.

  1. A comparative study of mixed exponential and Weibull distributions in a stochastic model replicating a tropical rainfall process

    NASA Astrophysics Data System (ADS)

    Abas, Norzaida; Daud, Zalina M.; Yusof, Fadhilah

    2014-11-01

    A stochastic rainfall model is presented for the generation of hourly rainfall data in an urban area in Malaysia. In view of the high temporal and spatial variability of rainfall within the tropical rain belt, the Spatial-Temporal Neyman-Scott Rectangular Pulse model was used. The model, which is governed by the Neyman-Scott process, employs a reasonable number of parameters to represent the physical attributes of rainfall. A common approach is to attach each attribute to a mathematical distribution. With respect to rain cell intensity, this study proposes the use of a mixed exponential distribution. The performance of the proposed model was compared to a model that employs the Weibull distribution. Hourly and daily rainfall data from four stations in the Damansara River basin in Malaysia were used as input to the models, and simulations of hourly series were performed for an independent site within the basin. The performance of the models was assessed based on how closely the statistical characteristics of the simulated series resembled the statistics of the observed series. The findings obtained based on graphical representation revealed that the statistical characteristics of the simulated series for both models compared reasonably well with the observed series. However, a further assessment using the AIC, BIC and RMSE showed that the proposed model yields better results. The results of this study indicate that for tropical climates, the proposed model, using a mixed exponential distribution, is the best choice for generation of synthetic data for ungauged sites or for sites with insufficient data within the limit of the fitted region.
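
    The Neyman-Scott Rectangular Pulse mechanism can be sketched as follows. This is a simplified illustration, not the study's calibrated model: parameter values are invented, the cell-count draw is crude, and the single-exponential intensity stands in for the mixed-exponential distribution the paper proposes.

    ```python
    import random

    def nsrp_hourly(hours, storm_rate, mean_cells, cell_delay, cell_dur,
                    mean_int, seed=6):
        """Toy Neyman-Scott Rectangular Pulse series: storm origins arrive as a
        Poisson process; each storm spawns rain cells with exponential delays,
        durations and intensities (all parameters illustrative)."""
        rng = random.Random(seed)
        series = [0.0] * hours
        t = rng.expovariate(storm_rate)          # first storm origin
        while t < hours:
            # crude cell count per storm (>= 1); a Poisson count is also common
            n_cells = 1 + int(rng.expovariate(1.0 / mean_cells))
            for _ in range(n_cells):
                start = t + rng.expovariate(1.0 / cell_delay)
                end = start + rng.expovariate(1.0 / cell_dur)
                depth = rng.expovariate(1.0 / mean_int)
                # a mixed-exponential intensity would instead draw `depth`
                # from a two-component exponential mixture
                for h in range(int(start), min(int(end) + 1, hours)):
                    series[h] += depth           # rectangular pulse contribution
            t += rng.expovariate(storm_rate)     # next storm origin
        return series

    rain = nsrp_hourly(hours=24 * 30, storm_rate=0.05, mean_cells=4,
                       cell_delay=2.0, cell_dur=3.0, mean_int=1.5)
    ```

    Model assessment then compares statistics of such synthetic series (mean, variance, dry-spell proportions) with those of the observed record.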

  2. Multiscale stochastic simulations of chemical reactions with regulated scale separation

    NASA Astrophysics Data System (ADS)

    Koumoutsakos, Petros; Feigelman, Justin

    2013-07-01

    We present a coupling of multiscale frameworks with accelerated stochastic simulation algorithms for systems of chemical reactions with disparate propensities. The algorithms regulate the propensities of the fast and slow reactions of the system, using alternating micro and macro sub-steps simulated with accelerated algorithms such as τ-leaping and R-leaping. The proposed algorithms are shown to provide significant speedups in simulations of stiff systems of chemical reactions, with a trade-off in accuracy controlled by a regulating parameter. More importantly, the error of the methods exhibits a cutoff phenomenon that allows for optimal parameter choices. Numerical experiments demonstrate that hybrid algorithms involving accelerated stochastic simulations can be, in certain cases, both more accurate and faster than their stochastic simulation algorithm counterparts.
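
    A basic (unregulated) τ-leap step, on which such accelerated schemes build, can be sketched as follows; the isomerization system and step size are illustrative, and no propensity regulation or negative-count safeguard is included.

    ```python
    import math
    import random

    def tau_leap(x, stoich, rate_fn, tau, n_steps, seed=1):
        """Basic tau-leaping: per step, fire Poisson(a_j * tau) copies of each
        reaction j, with propensities a_j frozen at the start of the step."""
        rng = random.Random(seed)

        def poisson(lam):
            # Knuth's method; adequate for the small means used in this sketch
            L, k, p = math.exp(-lam), 0, 1.0
            while True:
                p *= rng.random()
                if p <= L:
                    return k
                k += 1

        for _ in range(n_steps):
            for j, a in enumerate(rate_fn(x)):   # propensities at step start
                n = poisson(a * tau)             # number of firings this step
                x = [xi + n * s for xi, s in zip(x, stoich[j])]
        return x

    # Isomerization A <-> B with k1 = k2 = 1.0, conserving A + B
    stoich = [(-1, +1), (+1, -1)]
    rates = lambda x: [1.0 * x[0], 1.0 * x[1]]
    final = tau_leap([1000, 0], stoich, rates, tau=0.01, n_steps=500)
    ```

    Production schemes add safeguards (step-size selection, negative-population checks) that this sketch omits for brevity.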

  3. Stochastic calculus of protein filament formation under spatial confinement

    NASA Astrophysics Data System (ADS)

    Michaels, Thomas C. T.; Dear, Alexander J.; Knowles, Tuomas P. J.

    2018-05-01

    The growth of filamentous aggregates from precursor proteins is a process of central importance to both normal and aberrant biology, for instance as the driver of devastating human disorders such as Alzheimer's and Parkinson's diseases. The conventional theoretical framework for describing this class of phenomena in bulk is based upon the mean-field limit of the law of mass action, which implicitly assumes deterministic dynamics. However, protein filament formation processes under spatial confinement, such as in microdroplets or in the cellular environment, show intrinsic variability due to the molecular noise associated with small-volume effects. To account for this effect, in this paper we introduce a stochastic differential equation approach for investigating protein filament formation processes under spatial confinement. Using this framework, we study the statistical properties of stochastic aggregation curves, as well as the distribution of reaction lag-times. Moreover, we establish the gradual breakdown of the correlation between lag-time and normalized growth rate under spatial confinement. Our results establish the key role of spatial confinement in determining the onset of stochasticity in protein filament formation and offer a formalism for studying protein aggregation kinetics in small volumes in terms of the kinetic parameters describing the aggregation dynamics in bulk.

  4. Time-variant Lagrangian transport formulation reduces aggregation bias of water and solute mean travel time in heterogeneous catchments

    NASA Astrophysics Data System (ADS)

    Danesh-Yazdi, Mohammad; Botter, Gianluca; Foufoula-Georgiou, Efi

    2017-05-01

    Lack of hydro-bio-chemical data at subcatchment scales necessitates adopting an aggregated system approach for estimating water and solute transport properties, such as residence and travel time distributions, at the catchment scale. In this work, we show that within-catchment spatial heterogeneity, as expressed in spatially variable discharge-storage relationships, can be appropriately encapsulated within a lumped time-varying stochastic Lagrangian formulation of transport. This time (variability) for space (heterogeneity) substitution yields mean travel times (MTTs) that are not significantly biased to the aggregation of spatial heterogeneity. Despite the significant variability of MTT at small spatial scales, there exists a characteristic scale above which the MTT is not impacted by the aggregation of spatial heterogeneity. Extensive simulations of randomly generated river networks reveal that the ratio between the characteristic scale and the mean incremental area is on average independent of river network topology and the spatial arrangement of incremental areas.

  5. Stochastic Human Exposure and Dose Simulation Model for Pesticides

    EPA Science Inventory

    SHEDS-Pesticides (Stochastic Human Exposure and Dose Simulation Model for Pesticides) is a physically-based stochastic model developed to quantify exposure and dose of humans to multimedia, multipathway pollutants. Probabilistic inputs are combined in physical/mechanistic algorit...

  6. Multiscale models and stochastic simulation methods for computing rare but key binding events in cell biology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guerrier, C.; Holcman, D., E-mail: david.holcman@ens.fr; Mathematical Institute, Oxford OX2 6GG, Newton Institute

    The main difficulty in simulating diffusion processes at the molecular level in cell microdomains is the multiplicity of scales involved, from nano- to micrometers. Few to many particles have to be simulated and simultaneously tracked while they explore a large portion of the space, searching for small binding targets such as buffers or active sites. Bridging the small and large spatial scales hinges on rare events, in which Brownian particles find small targets, characterized by long-time distributions. These rare events are the bottleneck of numerical simulations. A naive stochastic simulation requires running many Brownian particles together, which is computationally greedy and inefficient. Solving the associated partial differential equations is also difficult due to the time-dependent boundary conditions, narrow passages and mixed boundary conditions at small windows. We present here two reduced modeling approaches for fast computation of diffusing fluxes in microdomains. The first approach is based on Markov mass-action law equations coupled to a Markov chain. The second is a Gillespie method based on narrow escape theory for coarse-graining the geometry of the domain into Poissonian rates. The main application concerns diffusion in cell biology, where we compute, as an example, the distribution of arrival times of calcium ions to small hidden targets that trigger vesicular release.

  7. Applying GIS and high performance agent-based simulation for managing an Old World Screwworm fly invasion of Australia.

    PubMed

    Welch, M C; Kwan, P W; Sajeev, A S M

    2014-10-01

    Agent-based modelling has proven to be a promising approach for developing rich simulations of complex phenomena that provide decision support functions across a broad range of areas, including the biological, social and agricultural sciences. This paper demonstrates how high performance computing technologies, namely General-Purpose Computing on Graphics Processing Units (GPGPU), and commercial Geographic Information Systems (GIS) can be applied to develop a national-scale, agent-based simulation of an incursion of the Old World Screwworm fly (OWS fly) into the Australian mainland. The development of this simulation model leverages the massively data-parallel processing capabilities supported by NVidia's Compute Unified Device Architecture (CUDA) together with the advanced spatial visualisation capabilities of GIS. These technologies have enabled the implementation of an individual-based, stochastic lifecycle and dispersal algorithm for the OWS fly invasion. The simulation model draws upon a wide range of biological data as input to stochastically determine the reproduction and survival of the OWS fly through the different stages of its lifecycle and the dispersal of gravid females. Through this model, a highly efficient computational platform has been developed for studying the effectiveness of control and mitigation strategies and their associated economic impact on livestock industries. Copyright © 2014 International Atomic Energy Agency 2014. Published by Elsevier B.V. All rights reserved.

  8. STEPS: efficient simulation of stochastic reaction-diffusion models in realistic morphologies.

    PubMed

    Hepburn, Iain; Chen, Weiliang; Wils, Stefan; De Schutter, Erik

    2012-05-10

    Models of cellular molecular systems are built from components such as biochemical reactions (including interactions between ligands and membrane-bound proteins), conformational changes and active and passive transport. A discrete, stochastic description of the kinetics is often essential to capture the behavior of the system accurately. Where spatial effects play a prominent role the complex morphology of cells may have to be represented, along with aspects such as chemical localization and diffusion. This high level of detail makes efficiency a particularly important consideration for software that is designed to simulate such systems. We describe STEPS, a stochastic reaction-diffusion simulator developed with an emphasis on simulating biochemical signaling pathways accurately and efficiently. STEPS supports all the above-mentioned features, and well-validated support for SBML allows many existing biochemical models to be imported reliably. Complex boundaries can be represented accurately in externally generated 3D tetrahedral meshes imported by STEPS. The powerful Python interface facilitates model construction and simulation control. STEPS implements the composition and rejection method, a variation of the Gillespie SSA, supporting diffusion between tetrahedral elements within an efficient search and update engine. Additional support for well-mixed conditions and for deterministic model solution is implemented. Solver accuracy is confirmed with an original and extensive validation set consisting of isolated reaction, diffusion and reaction-diffusion systems. Accuracy imposes upper and lower limits on tetrahedron sizes, which are described in detail. By comparing to Smoldyn, we show how the voxel-based approach in STEPS is often faster than particle-based methods, with increasing advantage in larger systems, and by comparing to MesoRD we show the efficiency of the STEPS implementation. 
STEPS simulates models of cellular reaction-diffusion systems with complex boundaries with high accuracy and high performance in C/C++, controlled by a powerful and user-friendly Python interface. STEPS is free for use and is available at http://steps.sourceforge.net/
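
    The composition and rejection method named above can be illustrated in miniature. A sketch of composition-rejection sampling of a reaction index, with probability proportional to propensity, is given below; for clarity it rebuilds the dyadic groups on every call, whereas the real method maintains them incrementally so each draw costs O(1) expected time:

```python
import math
import random

def cr_sample(propensities, rng):
    """Composition-rejection sampling: group reactions by propensity
    into dyadic bins [2^i, 2^(i+1)), choose a bin proportionally to
    its summed propensity, then accept a uniformly drawn member with
    probability propensity / bin_upper_bound."""
    groups = {}
    for idx, a in enumerate(propensities):
        if a > 0:
            groups.setdefault(math.floor(math.log2(a)), []).append(idx)
    totals = {i: sum(propensities[j] for j in members)
              for i, members in groups.items()}
    total = sum(totals.values())
    r = rng.random() * total
    for i, t in totals.items():        # composition step: pick a bin
        if r < t:
            break
        r -= t
    upper = 2.0 ** (i + 1)
    members = groups[i]
    while True:                        # rejection step within the bin
        j = members[rng.randrange(len(members))]
        if rng.random() * upper <= propensities[j]:
            return j

rng = random.Random(0)
counts = [0, 0]
for _ in range(20000):
    counts[cr_sample([1.0, 3.0], rng)] += 1
# reaction 1 should be chosen about 75% of the time
```

This is the sampling primitive; a full SSA would recompute propensities after each firing and advance time by an exponential waiting time.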

  9. STEPS: efficient simulation of stochastic reaction–diffusion models in realistic morphologies

    PubMed Central

    2012-01-01

    Background Models of cellular molecular systems are built from components such as biochemical reactions (including interactions between ligands and membrane-bound proteins), conformational changes and active and passive transport. A discrete, stochastic description of the kinetics is often essential to capture the behavior of the system accurately. Where spatial effects play a prominent role the complex morphology of cells may have to be represented, along with aspects such as chemical localization and diffusion. This high level of detail makes efficiency a particularly important consideration for software that is designed to simulate such systems. Results We describe STEPS, a stochastic reaction–diffusion simulator developed with an emphasis on simulating biochemical signaling pathways accurately and efficiently. STEPS supports all the above-mentioned features, and well-validated support for SBML allows many existing biochemical models to be imported reliably. Complex boundaries can be represented accurately in externally generated 3D tetrahedral meshes imported by STEPS. The powerful Python interface facilitates model construction and simulation control. STEPS implements the composition and rejection method, a variation of the Gillespie SSA, supporting diffusion between tetrahedral elements within an efficient search and update engine. Additional support for well-mixed conditions and for deterministic model solution is implemented. Solver accuracy is confirmed with an original and extensive validation set consisting of isolated reaction, diffusion and reaction–diffusion systems. Accuracy imposes upper and lower limits on tetrahedron sizes, which are described in detail. By comparing to Smoldyn, we show how the voxel-based approach in STEPS is often faster than particle-based methods, with increasing advantage in larger systems, and by comparing to MesoRD we show the efficiency of the STEPS implementation. 
Conclusion STEPS simulates models of cellular reaction–diffusion systems with complex boundaries with high accuracy and high performance in C/C++, controlled by a powerful and user-friendly Python interface. STEPS is free for use and is available at http://steps.sourceforge.net/ PMID:22574658

  10. Potential and flux field landscape theory. I. Global stability and dynamics of spatially dependent non-equilibrium systems.

    PubMed

    Wu, Wei; Wang, Jin

    2013-09-28

    We established a potential and flux field landscape theory to quantify the global stability and dynamics of general spatially dependent non-equilibrium deterministic and stochastic systems. We extended our potential and flux landscape theory for spatially independent non-equilibrium stochastic systems described by Fokker-Planck equations to spatially dependent stochastic systems governed by general functional Fokker-Planck equations as well as functional Kramers-Moyal equations derived from master equations. Our general theory is applied to reaction-diffusion systems. For equilibrium spatially dependent systems with detailed balance, the potential field landscape alone, defined in terms of the steady state probability distribution functional, determines the global stability and dynamics of the system. The global stability of the system is closely related to the topography of the potential field landscape in terms of the basins of attraction and barrier heights in the field configuration state space. The effective driving force of the system is generated by the functional gradient of the potential field alone. For non-equilibrium spatially dependent systems, the curl probability flux field is indispensable in breaking detailed balance and creating non-equilibrium condition for the system. A complete characterization of the non-equilibrium dynamics of the spatially dependent system requires both the potential field and the curl probability flux field. While the non-equilibrium potential field landscape attracts the system down along the functional gradient similar to an electron moving in an electric field, the non-equilibrium flux field drives the system in a curly way similar to an electron moving in a magnetic field. 
In the small fluctuation limit, the potential field reduces to an intrinsic potential field, which is closely related to the steady state probability distribution functional and is found to be a Lyapunov functional of the deterministic spatially dependent system. Therefore, the intrinsic potential landscape can characterize the global stability of the deterministic system. The relative entropy functional of the stochastic spatially dependent non-equilibrium system is found to be the Lyapunov functional of the stochastic dynamics of the system. Therefore, the relative entropy functional quantifies the global stability of the stochastic system with finite fluctuations. Our theory offers a general alternative to other field-theoretic techniques for studying the global stability and dynamics of spatially dependent non-equilibrium field systems. It can be applied to many physical, chemical, and biological spatially dependent non-equilibrium systems.
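
    Schematically, the force decomposition this abstract describes can be written as follows. The notation here is generic rather than the authors' exact one: P_ss is the steady-state probability distribution functional, U the potential, J_ss the steady-state probability flux, and D a diffusion coefficient; signs and factors are indicative only:

```latex
U[\phi] \;=\; -\ln P_{\mathrm{ss}}[\phi],
\qquad
F[\phi] \;=\;
\underbrace{-\,D\,\frac{\delta U}{\delta \phi}}_{\text{potential (gradient) part}}
\;+\;
\underbrace{\frac{J_{\mathrm{ss}}[\phi]}{P_{\mathrm{ss}}[\phi]}}_{\text{curl flux part}},
\qquad
\nabla \cdot J_{\mathrm{ss}} = 0 .
```

Detailed balance corresponds to J_ss = 0, in which case the gradient part alone drives the dynamics; a non-zero divergence-free J_ss is what breaks detailed balance and produces the "magnetic-field-like" curly driving described above.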

  11. A multi-sensor RSS spatial sensing-based robust stochastic optimization algorithm for enhanced wireless tethering.

    PubMed

    Parasuraman, Ramviyas; Fabry, Thomas; Molinari, Luca; Kershaw, Keith; Di Castro, Mario; Masi, Alessandro; Ferre, Manuel

    2014-12-12

    The reliability of wireless communication in a network of mobile wireless robot nodes depends on the received radio signal strength (RSS). When the robot nodes are deployed in hostile environments with ionizing radiations (such as in some scientific facilities), there is a possibility that some electronic components may fail randomly (due to radiation effects), which causes problems in wireless connectivity. The objective of this paper is to maximize robot mission capabilities by maximizing the wireless network capacity and to reduce the risk of communication failure. Thus, in this paper, we consider a multi-node wireless tethering structure called the "server-relay-client" framework that uses (multiple) relay nodes in between a server and a client node. We propose a robust stochastic optimization (RSO) algorithm using a multi-sensor-based RSS sampling method at the relay nodes to efficiently improve and balance the RSS between the source and client nodes to improve the network capacity and to provide redundant networking abilities. We use pre-processing techniques, such as exponential moving averaging and spatial averaging filters on the RSS data for smoothing. We apply a receiver spatial diversity concept and employ a position controller on the relay node using a stochastic gradient ascent method for self-positioning the relay node to achieve the RSS balancing task. The effectiveness of the proposed solution is validated by extensive simulations and field experiments in CERN facilities. For the field trials, we used a youBot mobile robot platform as the relay node, and two stand-alone Raspberry Pi computers as the client and server nodes. The algorithm has been proven to be robust to noise in the radio signals and to work effectively even under non-line-of-sight conditions.
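
    The RSS-balancing loop described above can be sketched in one dimension. Everything below is a toy stand-in: the log-distance path-loss model, its parameters, and the bang-bang step toward the weaker link are illustrative, not the paper's radio model or controller:

```python
import math
import random

def rss(d, noise, rng):
    """Illustrative log-distance path-loss model for RSS in dB
    (hypothetical parameters, not the paper's radio model)."""
    return -40.0 - 20.0 * math.log10(max(d, 0.1)) + rng.gauss(0.0, noise)

def ema(prev, sample, alpha=0.3):
    """Exponential moving average filter used to smooth noisy RSS."""
    return alpha * sample + (1.0 - alpha) * prev

def balance_relay(server=0.0, client=10.0, steps=200, lr=0.05, seed=1):
    """Stochastic ascent on the weaker link: the relay compares its
    smoothed RSS to the server and to the client and steps toward
    whichever link is weaker, balancing the two near the midpoint."""
    rng = random.Random(seed)
    x = 2.0  # relay starts close to the server
    f_s = rss(x - server, 2.0, rng)
    f_c = rss(client - x, 2.0, rng)
    for _ in range(steps):
        f_s = ema(f_s, rss(x - server, 2.0, rng))
        f_c = ema(f_c, rss(client - x, 2.0, rng))
        x += lr if f_c < f_s else -lr   # move toward the weaker link
        x = min(max(x, server + 0.1), client - 0.1)
    return x

relay_x = balance_relay()  # settles near the midpoint at x = 5
```

The EMA filtering is what makes the noisy gradient sign usable; without it the relay would dither on every raw RSS sample.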

  12. A Multi-Sensor RSS Spatial Sensing-Based Robust Stochastic Optimization Algorithm for Enhanced Wireless Tethering

    PubMed Central

    Parasuraman, Ramviyas; Fabry, Thomas; Molinari, Luca; Kershaw, Keith; Di Castro, Mario; Masi, Alessandro; Ferre, Manuel

    2014-01-01

    The reliability of wireless communication in a network of mobile wireless robot nodes depends on the received radio signal strength (RSS). When the robot nodes are deployed in hostile environments with ionizing radiations (such as in some scientific facilities), there is a possibility that some electronic components may fail randomly (due to radiation effects), which causes problems in wireless connectivity. The objective of this paper is to maximize robot mission capabilities by maximizing the wireless network capacity and to reduce the risk of communication failure. Thus, in this paper, we consider a multi-node wireless tethering structure called the “server-relay-client” framework that uses (multiple) relay nodes in between a server and a client node. We propose a robust stochastic optimization (RSO) algorithm using a multi-sensor-based RSS sampling method at the relay nodes to efficiently improve and balance the RSS between the source and client nodes to improve the network capacity and to provide redundant networking abilities. We use pre-processing techniques, such as exponential moving averaging and spatial averaging filters on the RSS data for smoothing. We apply a receiver spatial diversity concept and employ a position controller on the relay node using a stochastic gradient ascent method for self-positioning the relay node to achieve the RSS balancing task. The effectiveness of the proposed solution is validated by extensive simulations and field experiments in CERN facilities. For the field trials, we used a youBot mobile robot platform as the relay node, and two stand-alone Raspberry Pi computers as the client and server nodes. The algorithm has been proven to be robust to noise in the radio signals and to work effectively even under non-line-of-sight conditions. PMID:25615734

  13. OpenMC In Situ Source Convergence Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldrich, Garrett Allen; Dutta, Soumya; Woodring, Jonathan Lee

    2016-05-07

    We designed and implemented an in situ version of particle source convergence for the OpenMC particle transport simulator. OpenMC is a Monte Carlo based-particle simulator for neutron criticality calculations. For the transport simulation to be accurate, source particles must converge on a spatial distribution. Typically, convergence is obtained by iterating the simulation by a user-settable, fixed number of steps, and it is assumed that convergence is achieved. We instead implement a method to detect convergence, using the stochastic oscillator for identifying convergence of source particles based on their accumulated Shannon Entropy. Using our in situ convergence detection, we are able to detect and begin tallying results for the full simulation once the proper source distribution has been confirmed. Our method ensures that the simulation is not started too early, by a user setting too optimistic parameters, or too late, by setting too conservative a parameter.
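
    The entropy-based detection can be sketched as follows. The window-range criterion below is a simple stand-in for the stochastic-oscillator indicator the paper uses, and the entropy trace is synthetic (an exponential relaxation plus noise, mimicking Shannon entropy of binned source sites over power iterations):

```python
import math
import random

def shannon_entropy(counts):
    """Shannon entropy (bits) of a binned fission-source distribution."""
    total = sum(counts)
    h = 0.0
    for c in counts:
        if c > 0:
            p = c / total
            h -= p * math.log2(p)
    return h

def detect_convergence(entropies, window=20, tol=0.05):
    """Flag convergence at the first iteration where the trailing
    window's entropy range (max - min) drops below tol; a simple
    stand-in for the paper's stochastic-oscillator criterion."""
    for i in range(window, len(entropies) + 1):
        w = entropies[i - window:i]
        if max(w) - min(w) < tol:
            return i - 1
    return None

# Synthetic entropy trace: relaxes to a plateau with small noise.
rng = random.Random(0)
trace = [3.0 + 2.0 * math.exp(-n / 15.0) + rng.gauss(0.0, 0.005)
         for n in range(200)]
start_iter = detect_convergence(trace)  # iteration to begin tallying
```

Tallies would then be accumulated only from `start_iter` onward, which is the in situ decision the abstract describes.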

  14. STOCHSIMGPU: parallel stochastic simulation for the Systems Biology Toolbox 2 for MATLAB.

    PubMed

    Klingbeil, Guido; Erban, Radek; Giles, Mike; Maini, Philip K

    2011-04-15

    The importance of stochasticity in biological systems is becoming increasingly recognized and the computational cost of biologically realistic stochastic simulations urgently requires development of efficient software. We present a new software tool STOCHSIMGPU that exploits graphics processing units (GPUs) for parallel stochastic simulations of biological/chemical reaction systems and show that significant gains in efficiency can be made. It is integrated into MATLAB and works with the Systems Biology Toolbox 2 (SBTOOLBOX2) for MATLAB. The GPU-based parallel implementation of the Gillespie stochastic simulation algorithm (SSA), the logarithmic direct method (LDM) and the next reaction method (NRM) is approximately 85 times faster than the sequential implementation of the NRM on a central processing unit (CPU). Using our software does not require any changes to the user's models, since it acts as a direct replacement of the stochastic simulation software of the SBTOOLBOX2. The software is open source under the GPL v3 and available at http://www.maths.ox.ac.uk/cmb/STOCHSIMGPU. The web site also contains supplementary information. klingbeil@maths.ox.ac.uk Supplementary data are available at Bioinformatics online.
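
    The Gillespie direct method that STOCHSIMGPU parallelizes can be shown on a single realization of a toy birth-death system (0 -> X at rate k_birth, X -> 0 at rate k_death*x); the GPU implementation runs many such independent realizations concurrently. The reaction system here is illustrative, not from the paper:

```python
import random

def ssa_direct(x0, k_birth, k_death, t_end, rng):
    """One realization of the Gillespie direct method for a
    birth-death process: draw an exponential waiting time from the
    total propensity, then pick a reaction proportionally to its
    propensity."""
    t, x = 0.0, x0
    while t < t_end:
        a1 = k_birth          # propensity of 0 -> X
        a2 = k_death * x      # propensity of X -> 0
        a0 = a1 + a2
        t += rng.expovariate(a0)        # time to next reaction
        if t >= t_end:
            break
        if rng.random() * a0 < a1:      # choose which reaction fires
            x += 1
        else:
            x -= 1
    return x

rng = random.Random(7)
# Stationary mean of this process is k_birth / k_death = 50.
samples = [ssa_direct(50, 10.0, 0.2, 50.0, rng) for _ in range(300)]
mean_x = sum(samples) / len(samples)
```

On a GPU, each thread would own one `rng` stream and one realization, which is why the reported ~85x speedup over a sequential CPU implementation is attainable without changing the model.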

  15. Spatial delineation, fluid-lithology characterization, and petrophysical modeling of deepwater Gulf of Mexico reservoirs through joint AVA deterministic and stochastic inversion of three-dimensional partially-stacked seismic amplitude data and well logs

    NASA Astrophysics Data System (ADS)

    Contreras, Arturo Javier

    This dissertation describes a novel Amplitude-versus-Angle (AVA) inversion methodology to quantitatively integrate pre-stack seismic data, well logs, geologic data, and geostatistical information. Deterministic and stochastic inversion algorithms are used to characterize flow units of deepwater reservoirs located in the central Gulf of Mexico. A detailed fluid/lithology sensitivity analysis was conducted to assess the nature of AVA effects in the study area. Standard AVA analysis indicates that the shale/sand interface represented by the top of the hydrocarbon-bearing turbidite deposits generate typical Class III AVA responses. Layer-dependent Biot-Gassmann analysis shows significant sensitivity of the P-wave velocity and density to fluid substitution, indicating that presence of light saturating fluids clearly affects the elastic response of sands. Accordingly, AVA deterministic and stochastic inversions, which combine the advantages of AVA analysis with those of inversion, have provided quantitative information about the lateral continuity of the turbidite reservoirs based on the interpretation of inverted acoustic properties and fluid-sensitive modulus attributes (P-Impedance, S-Impedance, density, and LambdaRho, in the case of deterministic inversion; and P-velocity, S-velocity, density, and lithotype (sand-shale) distributions, in the case of stochastic inversion). The quantitative use of rock/fluid information through AVA seismic data, coupled with the implementation of co-simulation via lithotype-dependent multidimensional joint probability distributions of acoustic/petrophysical properties, provides accurate 3D models of petrophysical properties such as porosity, permeability, and water saturation. Pre-stack stochastic inversion provides more realistic and higher-resolution results than those obtained from analogous deterministic techniques. Furthermore, 3D petrophysical models can be more accurately co-simulated from AVA stochastic inversion results. 
By combining AVA sensitivity analysis techniques with pre-stack stochastic inversion, geologic data, and awareness of inversion pitfalls, it is possible to substantially reduce the risk in exploration and development of conventional and non-conventional reservoirs. From the final integration of deterministic and stochastic inversion results with depositional models and analogous examples, the M-series reservoirs have been interpreted as stacked terminal turbidite lobes within an overall fan complex (the Miocene MCAVLU Submarine Fan System); this interpretation is consistent with previous core data interpretations and regional stratigraphic/depositional studies.

  16. Functional Wigner representation of quantum dynamics of Bose-Einstein condensate

    NASA Astrophysics Data System (ADS)

    Opanchuk, B.; Drummond, P. D.

    2013-04-01

    We develop a method of simulating the full quantum field dynamics of multi-mode multi-component Bose-Einstein condensates in a trap. We use the truncated Wigner representation to obtain a probabilistic theory that can be sampled. This method produces c-number stochastic equations which may be solved using conventional stochastic methods. The technique is valid for large mode occupation numbers. We give a detailed derivation of methods of functional Wigner representation appropriate for quantum fields. Our approach describes spatial evolution of spinor components and properly accounts for nonlinear losses. Such techniques are applicable to calculating the leading quantum corrections, including effects such as quantum squeezing, entanglement, EPR correlations, and interactions with engineered nonlinear reservoirs. By using a consistent expansion in the inverse density, we are able to explain an inconsistency in the nonlinear loss equations found by earlier authors.

  17. Stochastic Inversion of 2D Magnetotelluric Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Jinsong

    2010-07-01

    The algorithm is developed to invert 2D magnetotelluric (MT) data based on sharp boundary parametrization using a Bayesian framework. Within the algorithm, we treat the locations of the interfaces and the resistivity of the regions they form as unknowns. We use a parallel, adaptive finite-element algorithm to forward simulate frequency-domain MT responses of 2D conductivity structure. Those unknown parameters are spatially correlated and are described by a geostatistical model. The joint posterior probability distribution function is explored by Markov Chain Monte Carlo (MCMC) sampling methods. The developed stochastic model is effective for estimating the interface locations and resistivity. Most importantly, it provides detailed uncertainty information on each unknown parameter. Hardware requirements: PC, supercomputer, multi-platform, workstation. Software requirements: C and Fortran. Operating systems: Linux/Unix or Windows.
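
    The MCMC exploration can be illustrated with a deliberately tiny stand-in: one unknown resistivity, a trivial forward functional, and a random-walk Metropolis sampler. The forward model `log10(rho)` and all parameters below are hypothetical; the actual code inverts interface geometry and resistivity against finite-element MT responses:

```python
import math
import random

def metropolis(data, forward, sigma, n_iter, step, x0, rng):
    """Random-walk Metropolis with a flat prior on (0, inf) and
    Gaussian data errors; a one-parameter stand-in for the MCMC
    exploration of interface locations and resistivities."""
    def log_like(x):
        if x <= 0.0:
            return float("-inf")
        return -sum((d - forward(x)) ** 2 for d in data) / (2.0 * sigma ** 2)
    x, ll = x0, log_like(x0)
    chain = []
    for _ in range(n_iter):
        xp = x + rng.gauss(0.0, step)
        llp = log_like(xp)
        if llp > float("-inf") and math.log(rng.random()) < llp - ll:
            x, ll = xp, llp
        chain.append(x)
    return chain

rng = random.Random(3)
true_rho = 100.0                       # ohm-m, toy resistivity
forward = lambda rho: math.log10(rho)  # toy "MT response" functional
data = [forward(true_rho) + rng.gauss(0.0, 0.05) for _ in range(20)]
chain = metropolis(data, forward, 0.05, 5000, 10.0, 50.0, rng)
post_mean = sum(chain[1000:]) / len(chain[1000:])  # near true_rho
```

The spread of the post-burn-in chain is exactly the "detailed uncertainty information on each unknown parameter" that the abstract highlights as the method's main payoff.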

  18. Approximate Dynamic Programming and Aerial Refueling

    DTIC Science & Technology

    2007-06-01

    by two Army Air Corps de Havilland DH-4Bs (9). While crude by modern standards, the passing of hoses between planes is effectively the same approach...incorporating stochastic data sets...Total Cost Stochastically Trained Simulations versus Deterministically Trained Simulations...incorporating stochastic data sets. To create meaningful results when testing stochastic data, the data sets are averaged so that conclusions are not

  19. Plant toxins and trophic cascades alter fire regime and succession on a boreal forest landscape

    USGS Publications Warehouse

    Feng, Zhilan; Alfaro-Murillo, Jorge A.; DeAngelis, Donald L.; Schmidt, Jennifer; Barga, Matthew; Zheng, Yiqiang; Ahmad Tamrin, Muhammad Hanis B.; Olson, Mark; Glaser, Tim; Kielland, Knut; Chapin, F. Stuart; Bryant, John

    2012-01-01

    Two models were integrated in order to study the effect of plant toxicity and a trophic cascade on forest succession and fire patterns across a boreal landscape in central Alaska. One of the models, ALFRESCO, is a cellular automata model that stochastically simulates transitions from spruce dominated 1 km2 spatial cells to deciduous woody vegetation based on stochastic fires, and from deciduous woody vegetation to spruce based on age of the cell with some stochastic variation. The other model, the ‘toxin-dependent functional response’ model (TDFRM) simulates woody vegetation types with different levels of toxicity, an herbivore browser (moose) that can forage selectively on these types, and a carnivore (wolf) that preys on the herbivore. Here we replace the simple succession rules in each ALFRESCO cell by plant–herbivore–carnivore dynamics from TDFRM. The central hypothesis tested in the integrated model is that the herbivore, by feeding selectively on low-toxicity deciduous woody vegetation, speeds succession towards high-toxicity evergreens, like spruce. Wolves, by keeping moose populations down, can help slow the succession. Our results confirmed this hypothesis for the model calibrated to the Tanana floodplain of Alaska. We used the model to estimate the effects of different levels of wolf control. Simulations indicated that management reductions in wolf densities could reduce the mean time to transition from deciduous to spruce by more than 15 years, thereby increasing landscape flammability. The integrated model can be useful in estimating ecosystem impacts of wolf control and moose harvesting in central Alaska.
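
    The ALFRESCO-style cell transitions can be sketched as a stochastic automaton in which fire converts spruce cells to deciduous vegetation and deciduous cells succeed back to spruce after an age threshold. Everything below is a toy: the per-cell states, the 1% annual fire probability, and the 40-year succession age are illustrative, and the TDFRM plant-herbivore-carnivore dynamics that replace the age rule in the integrated model are not represented (browsing pressure would effectively lengthen or shorten `succ_age`):

```python
import random

SPRUCE, DECID = "spruce", "deciduous"

def step(grid, ages, p_fire, succ_age, rng):
    """One annual time step: fire flips spruce cells to deciduous;
    a deciduous cell succeeds to spruce once its age exceeds
    succ_age (with stochastic jitter, as in ALFRESCO's rules)."""
    for i, state in enumerate(grid):
        if state == SPRUCE and rng.random() < p_fire:
            grid[i], ages[i] = DECID, 0
        elif state == DECID:
            ages[i] += 1
            if ages[i] > succ_age + rng.randint(-5, 5):
                grid[i], ages[i] = SPRUCE, 0

rng = random.Random(11)
n = 1000
grid = [SPRUCE] * n   # landscape of 1 km^2 cells, initially spruce
ages = [0] * n
for year in range(300):
    step(grid, ages, p_fire=0.01, succ_age=40, rng=rng)
frac_spruce = grid.count(SPRUCE) / n
```

With a ~100-year mean fire return and ~40-year succession, roughly 100/140 of cells sit in spruce at equilibrium; shortening the succession time (the modeled effect of reduced moose browsing under wolf control) raises landscape flammability, which is the paper's central result.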

  20. Kinematic ground motion simulations on rough faults including effects of 3D stochastic velocity perturbations

    USGS Publications Warehouse

    Graves, Robert; Pitarka, Arben

    2016-01-01

    We describe a methodology for generating kinematic earthquake ruptures for use in 3D ground‐motion simulations over the 0–5 Hz frequency band. Our approach begins by specifying a spatially random slip distribution that has a roughly wavenumber‐squared fall‐off. Given a hypocenter, the rupture speed is specified to average about 75%–80% of the local shear wavespeed and the prescribed slip‐rate function has a Kostrov‐like shape with a fault‐averaged rise time that scales self‐similarly with the seismic moment. Both the rupture time and rise time include significant local perturbations across the fault surface specified by spatially random fields that are partially correlated with the underlying slip distribution. We represent velocity‐strengthening fault zones in the shallow (<5  km) and deep (>15  km) crust by decreasing rupture speed and increasing rise time in these regions. Additional refinements to this approach include the incorporation of geometric perturbations to the fault surface, 3D stochastic correlated perturbations to the P‐ and S‐wave velocity structure, and a damage zone surrounding the shallow fault surface characterized by a 30% reduction in seismic velocity. We demonstrate the approach using a suite of simulations for a hypothetical Mw 6.45 strike‐slip earthquake embedded in a generalized hard‐rock velocity structure. The simulation results are compared with the median predictions from the 2014 Next Generation Attenuation‐West2 Project ground‐motion prediction equations and show very good agreement over the frequency band 0.1–5 Hz for distances out to 25 km from the fault. Additionally, the newly added features act to reduce the coherency of the radiated higher frequency (f>1  Hz) ground motions, and homogenize radiation‐pattern effects in this same bandwidth, which move the simulations closer to the statistical characteristics of observed motions as illustrated by comparison with recordings from the 1979 Imperial Valley earthquake.
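
    The spatially random slip with a roughly wavenumber-squared spectral fall-off can be illustrated in one dimension by superposing cosine modes with 1/k amplitudes (so power falls as 1/k^2) and uniformly random phases. This is a toy analogue of the 2D slip fields used for rupture generation, not the authors' generator:

```python
import math
import random

def random_slip(n, rng, n_modes=64):
    """1-D spatially random slip with a roughly wavenumber-squared
    power fall-off: amplitude ~ 1/k per mode gives power ~ 1/k^2,
    and random phases make each realization distinct."""
    slip = [0.0] * n
    for k in range(1, n_modes + 1):
        amp = 1.0 / k
        phase = rng.uniform(0.0, 2.0 * math.pi)
        for j in range(n):
            slip[j] += amp * math.cos(2.0 * math.pi * k * j / n + phase)
    lo = min(slip)
    return [s - lo for s in slip]   # shift so slip is non-negative

rng = random.Random(2016)
slip = random_slip(256, rng)  # long-wavelength asperities dominate
```

The rupture-time and rise-time fields described above would then be built as additional random fields partially correlated with this slip distribution.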

  1. Kinematic Ground-Motion Simulations on Rough Faults Including Effects of 3D Stochastic Velocity Perturbations

    DOE PAGES

    Graves, Robert; Pitarka, Arben

    2016-08-23

    Here, we describe a methodology for generating kinematic earthquake ruptures for use in 3D ground–motion simulations over the 0–5 Hz frequency band. Our approach begins by specifying a spatially random slip distribution that has a roughly wavenumber–squared fall–off. Given a hypocenter, the rupture speed is specified to average about 75%–80% of the local shear wavespeed and the prescribed slip–rate function has a Kostrov–like shape with a fault–averaged rise time that scales self–similarly with the seismic moment. Both the rupture time and rise time include significant local perturbations across the fault surface specified by spatially random fields that are partially correlated with the underlying slip distribution. We represent velocity–strengthening fault zones in the shallow (<5 km) and deep (>15 km) crust by decreasing rupture speed and increasing rise time in these regions. Additional refinements to this approach include the incorporation of geometric perturbations to the fault surface, 3D stochastic correlated perturbations to the P– and S–wave velocity structure, and a damage zone surrounding the shallow fault surface characterized by a 30% reduction in seismic velocity. We demonstrate the approach using a suite of simulations for a hypothetical Mw 6.45 strike–slip earthquake embedded in a generalized hard–rock velocity structure. The simulation results are compared with the median predictions from the 2014 Next Generation Attenuation–West2 Project ground–motion prediction equations and show very good agreement over the frequency band 0.1–5 Hz for distances out to 25 km from the fault. 
Additionally, the newly added features act to reduce the coherency of the radiated higher frequency (f>1 Hz) ground motions, and homogenize radiation–pattern effects in this same bandwidth, which move the simulations closer to the statistical characteristics of observed motions as illustrated by comparison with recordings from the 1979 Imperial Valley earthquake.

  2. Improved Climate Simulations through a Stochastic Parameterization of Ocean Eddies

    NASA Astrophysics Data System (ADS)

    Williams, Paul; Howe, Nicola; Gregory, Jonathan; Smith, Robin; Joshi, Manoj

    2016-04-01

    In climate simulations, the impacts of the sub-grid scales on the resolved scales are conventionally represented using deterministic closure schemes, which assume that the impacts are uniquely determined by the resolved scales. Stochastic parameterization relaxes this assumption, by sampling the sub-grid variability in a computationally inexpensive manner. This presentation shows that the simulated climatological state of the ocean is improved in many respects by implementing a simple stochastic parameterization of ocean eddies into a coupled atmosphere-ocean general circulation model. Simulations from a high-resolution, eddy-permitting ocean model are used to calculate the eddy statistics needed to inject realistic stochastic noise into a low-resolution, non-eddy-permitting version of the same model. A suite of four stochastic experiments is then run to test the sensitivity of the simulated climate to the noise definition, by varying the noise amplitude and decorrelation time within reasonable limits. The addition of zero-mean noise to the ocean temperature tendency is found to have a non-zero effect on the mean climate. Specifically, in terms of the ocean temperature and salinity fields both at the surface and at depth, the noise reduces many of the biases in the low-resolution model and causes it to more closely resemble the high-resolution model. The variability of the strength of the global ocean thermohaline circulation is also improved. It is concluded that stochastic ocean perturbations can yield reductions in climate model error that are comparable to those obtained by refining the resolution, but without the increased computational cost. Therefore, stochastic parameterizations of ocean eddies have the potential to significantly improve climate simulations. Reference: P. D. Williams, N. J. Howe, J. M. Gregory, R. S. Smith, and M. M. Joshi (2016), Improved Climate Simulations through a Stochastic Parameterization of Ocean Eddies, Journal of Climate, under revision.
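
    The headline result, that zero-mean noise in a tendency has a non-zero effect on the mean state, follows from nonlinearity of the drift and can be demonstrated with a one-variable toy model (a cubic "temperature tendency" with stable fixed point T = 1; the equation and parameters are illustrative, not the GCM's):

```python
import random

def time_mean_T(sigma, n_steps=20000, dt=0.01, seed=5):
    """Euler-Maruyama integration of a toy nonlinear tendency
    dT = (1 - T^3) dt + sigma dW. The deterministic fixed point is
    T = 1; because the drift is nonlinear (asymmetric about the
    fixed point), zero-mean noise shifts the time-mean state."""
    rng = random.Random(seed)
    T, t_sum, count = 1.0, 0.0, 0
    for step in range(n_steps):
        T += (1.0 - T ** 3) * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        if step > 2000:          # discard spin-up
            t_sum += T
            count += 1
    return t_sum / count

mean_det = time_mean_T(0.0)   # exactly the fixed point, 1.0
mean_sto = time_mean_T(0.8)   # time-mean shifted below 1.0
```

The stochastic run's time-mean sits measurably below the deterministic fixed point, a minimal analogue of noise-induced drift in the mean climate described above.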

  3. Additive noise-induced Turing transitions in spatial systems with application to neural fields and the Swift-Hohenberg equation

    NASA Astrophysics Data System (ADS)

    Hutt, Axel; Longtin, Andre; Schimansky-Geier, Lutz

    2008-05-01

    This work studies the spatio-temporal dynamics of a generic integral-differential equation subject to additive random fluctuations. It introduces a combination of the stochastic center manifold approach for stochastic differential equations and the adiabatic elimination for Fokker-Planck equations, and studies analytically the systems’ stability near Turing bifurcations. In addition two types of fluctuation are studied, namely fluctuations uncorrelated in space and time, and global fluctuations, which are constant in space but uncorrelated in time. We show that the global fluctuations shift the Turing bifurcation threshold. This shift is proportional to the fluctuation variance. Applications to a neural field equation and the Swift-Hohenberg equation reveal the shift of the bifurcation to larger control parameters, which represents a stabilization of the system. All analytical results are confirmed by numerical simulations of the occurring mode equations and the full stochastic integral-differential equation. To gain some insight into experimental manifestations, the sum of uncorrelated and global additive fluctuations is studied numerically and the analytical results on global fluctuations are confirmed qualitatively.

  4. Efficient Stochastic Inversion Using Adjoint Models and Kernel-PCA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thimmisetty, Charanraj A.; Zhao, Wenju; Chen, Xiao

    2017-10-18

    Performing stochastic inversion on a computationally expensive forward simulation model with a high-dimensional uncertain parameter space (e.g. a spatial random field) is computationally prohibitive even when gradient information can be computed efficiently. Moreover, the ‘nonlinear’ mapping from parameters to observables generally gives rise to non-Gaussian posteriors even with Gaussian priors, thus hampering the use of efficient inversion algorithms designed for models with Gaussian assumptions. In this paper, we propose a novel Bayesian stochastic inversion methodology, which is characterized by a tight coupling between the gradient-based Langevin Markov Chain Monte Carlo (LMCMC) method and a kernel principal component analysis (KPCA). This approach addresses the ‘curse-of-dimensionality’ via KPCA to identify a low-dimensional feature space within the high-dimensional and nonlinearly correlated parameter space. In addition, non-Gaussian posterior distributions are estimated via an efficient LMCMC method on the projected low-dimensional feature space. We will demonstrate this computational framework by integrating and adapting our recent data-driven statistics-on-manifolds constructions and reduction-through-projection techniques to a linear elasticity model.

  5. Modeling delay in genetic networks: From delay birth-death processes to delay stochastic differential equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gupta, Chinmaya; López, José Manuel; Azencott, Robert

Delay is an important and ubiquitous aspect of many biochemical processes. For example, delay plays a central role in the dynamics of genetic regulatory networks as it stems from the sequential assembly of first mRNA and then protein. Genetic regulatory networks are therefore frequently modeled as stochastic birth-death processes with delay. Here, we examine the relationship between delay birth-death processes and their appropriate approximating delay chemical Langevin equations. We prove a quantitative bound on the error between the pathwise realizations of these two processes. Our results hold for both fixed delay and distributed delay. Simulations demonstrate that the delay chemical Langevin approximation is accurate even at moderate system sizes. It captures dynamical features such as the oscillatory behavior in negative feedback circuits, cross-correlations between nodes in a network, and spatial and temporal information in two commonly studied motifs of metastability in biochemical systems. Overall, these results provide a foundation for using delay stochastic differential equations to approximate the dynamics of birth-death processes with delay.
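A delay birth-death process of the kind discussed above can be simulated exactly with a Gillespie-style loop plus a queue of pending delayed births; the gene-expression-flavored rates and the fixed delay below are illustrative choices, not the paper's model:

```python
import heapq
import random

random.seed(1)

def delay_birth_death(k=10.0, gamma=0.1, tau=5.0, t_end=200.0):
    """Exact simulation of a birth-death process with delayed births:
    production is initiated at rate k but completes only after a fixed
    delay tau; degradation occurs at rate gamma * x.  All parameter
    values are illustrative, not taken from the paper."""
    t, x = 0.0, 0
    pending = []              # min-heap of completion times of delayed births
    traj = [(t, x)]
    while t < t_end:
        a = k + gamma * x     # total propensity: initiation + degradation
        dt = random.expovariate(a)
        if pending and pending[0] <= t + dt:
            # a delayed birth completes before the next reaction fires
            t = heapq.heappop(pending)
            x += 1
        else:
            t += dt
            if random.random() < k / a:
                heapq.heappush(pending, t + tau)   # initiation: finishes at t + tau
            else:
                x -= 1                             # degradation
        traj.append((t, x))
    return traj

traj = delay_birth_death()
```

Replacing the fixed `tau` by a draw from a delay distribution at each initiation gives the distributed-delay variant the abstract also covers.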

  6. Provably unbounded memory advantage in stochastic simulation using quantum mechanics

    NASA Astrophysics Data System (ADS)

Garner, Andrew J. P.; Liu, Qing; Thompson, Jayne; Vedral, Vlatko; Gu, Mile

    2017-10-01

    Simulating the stochastic evolution of real quantities on a digital computer requires a trade-off between the precision to which these quantities are approximated, and the memory required to store them. The statistical accuracy of the simulation is thus generally limited by the internal memory available to the simulator. Here, using tools from computational mechanics, we show that quantum processors with a fixed finite memory can simulate stochastic processes of real variables to arbitrarily high precision. This demonstrates a provable, unbounded memory advantage that a quantum simulator can exhibit over its best possible classical counterpart.

  7. The Validity of Quasi-Steady-State Approximations in Discrete Stochastic Simulations

    PubMed Central

    Kim, Jae Kyoung; Josić, Krešimir; Bennett, Matthew R.

    2014-01-01

    In biochemical networks, reactions often occur on disparate timescales and can be characterized as either fast or slow. The quasi-steady-state approximation (QSSA) utilizes timescale separation to project models of biochemical networks onto lower-dimensional slow manifolds. As a result, fast elementary reactions are not modeled explicitly, and their effect is captured by nonelementary reaction-rate functions (e.g., Hill functions). The accuracy of the QSSA applied to deterministic systems depends on how well timescales are separated. Recently, it has been proposed to use the nonelementary rate functions obtained via the deterministic QSSA to define propensity functions in stochastic simulations of biochemical networks. In this approach, termed the stochastic QSSA, fast reactions that are part of nonelementary reactions are not simulated, greatly reducing computation time. However, it is unclear when the stochastic QSSA provides an accurate approximation of the original stochastic simulation. We show that, unlike the deterministic QSSA, the validity of the stochastic QSSA does not follow from timescale separation alone, but also depends on the sensitivity of the nonelementary reaction rate functions to changes in the slow species. The stochastic QSSA becomes more accurate when this sensitivity is small. Different types of QSSAs result in nonelementary functions with different sensitivities, and the total QSSA results in less sensitive functions than the standard or the prefactor QSSA. We prove that, as a result, the stochastic QSSA becomes more accurate when nonelementary reaction functions are obtained using the total QSSA. Our work provides an apparently novel condition for the validity of the QSSA in stochastic simulations of biochemical reaction networks with disparate timescales. PMID:25099817
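The stochastic QSSA amounts to running an ordinary SSA with a non-elementary propensity in place of the eliminated fast reactions. A minimal sketch for a self-repressing gene, with an assumed Hill-function propensity and illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(2)

def gillespie_hill(x0=0, t_end=500.0, beta=20.0, K=40.0, n=2, gamma=0.1):
    """Direct-method SSA for a self-repressing gene reduced via the QSSA:
    the fast promoter-binding reactions are replaced by a Hill-function
    propensity.  Parameters are illustrative, not taken from the paper."""
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_end:
        a_prod = beta * K**n / (K**n + x**n)   # non-elementary (Hill) propensity
        a_deg = gamma * x                      # elementary degradation propensity
        a0 = a_prod + a_deg
        t += rng.exponential(1.0 / a0)         # time to next reaction
        if rng.random() < a_prod / a0:
            x += 1
        else:
            x -= 1
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

times, states = gillespie_hill()
```

The paper's point is precisely that the validity of this shortcut depends on how sensitive `a_prod` is to changes in the slow species, not on timescale separation alone.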

  8. The relationship between stochastic and deterministic quasi-steady state approximations.

    PubMed

    Kim, Jae Kyoung; Josić, Krešimir; Bennett, Matthew R

    2015-11-23

The quasi-steady-state approximation (QSSA) is frequently used to reduce deterministic models of biochemical networks. The resulting equations provide a simplified description of the network in terms of non-elementary reaction functions (e.g. Hill functions). Such deterministic reductions are frequently a basis for heuristic stochastic models in which non-elementary reaction functions are used to define reaction propensities. Despite their popularity, it remains unclear when such stochastic reductions are valid. It is frequently assumed that the stochastic reduction can be trusted whenever its deterministic counterpart is accurate. However, a number of recent examples show that this is not necessarily the case. Here we explain the origin of these discrepancies, and demonstrate a clear relationship between the accuracy of the deterministic and the stochastic QSSA for examples widely used in biological systems. With an analysis of a two-state promoter model, and numerical simulations for a variety of other models, we find that the stochastic QSSA is accurate whenever its deterministic counterpart provides an accurate approximation over a range of initial conditions which cover the likely fluctuations from the quasi-steady state (QSS). We conjecture that this relationship provides a simple and computationally inexpensive way to test the accuracy of reduced stochastic models using deterministic simulations. The stochastic QSSA is one of the most popular multi-scale stochastic simulation methods. While the use of the QSSA and the resulting non-elementary functions has been justified in the deterministic case, it is not clear when their stochastic counterparts are accurate. In this study, we show how the accuracy of the stochastic QSSA can be tested using its deterministic counterpart, providing a concrete method to test when non-elementary rate functions can be used in stochastic simulations.

  9. A stochastic multi-scale method for turbulent premixed combustion

    NASA Astrophysics Data System (ADS)

    Cha, Chong M.

    2002-11-01

The stochastic chemistry algorithm of Bunker et al. and Gillespie is used to perform the chemical reactions in a transported probability density function (PDF) modeling approach of turbulent combustion. Recently, Kraft & Wagner have demonstrated a 100-fold gain in computational speed (for a 100 species mechanism) using the stochastic approach over the conventional, direct integration method of solving for the chemistry. Here, the stochastic chemistry algorithm is applied to develop a new transported PDF model of turbulent premixed combustion. The methodology relies on representing the relevant spatially dependent physical processes as queuing events. The canonical problem of a one-dimensional premixed flame is used for validation. For the laminar case, molecular diffusion is described by a random walk. For the turbulent case, one of two different material transport submodels can provide the necessary closure: Taylor dispersion or Kerstein's one-dimensional turbulence approach. The former exploits "eddy diffusivity" and hence would be much more computationally tractable for practical applications. Various validation studies are performed. Results from the Monte Carlo simulations compare well to asymptotic solutions of laminar premixed flames, both with and without high activation temperatures. The correct scaling of the turbulent burning velocity is predicted in both Damköhler's small- and large-scale turbulence limits. The effect of applying the eddy diffusivity concept in the various regimes is discussed.

  10. Burstiness in Viral Bursts: How Stochasticity Affects Spatial Patterns in Virus-Microbe Dynamics

    NASA Astrophysics Data System (ADS)

    Lin, Yu-Hui; Taylor, Bradford P.; Weitz, Joshua S.

Spatial patterns emerge in living systems at the scale of microbes to metazoans. These patterns can be driven, in part, by the stochasticity inherent to the birth and death of individuals. For microbe-virus systems, infection and lysis of hosts by viruses results in both mortality of hosts and production of viral progeny. Here, we study how variation in the number of viral progeny per lysis event affects the spatial clustering of both viruses and microbes. Each viral "burst" is initially localized at a near-cellular scale. The number of progeny in a single lysis event can vary in magnitude between tens and thousands. These perturbations are not accounted for in mean-field models. Here we developed individual-based models to investigate how stochasticity affects spatial patterns in virus-microbe systems. We measured the spatial clustering of individuals using pair correlation functions. We found that increasing the burst size of viruses while maintaining the same production rate led to enhanced clustering. In this poster we also report on preliminary analysis on the evolution of the burstiness of viral bursts given a spatially distributed host community.
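Spatial clustering of the kind measured above is typically quantified with the pair correlation function g(r), which equals 1 for a spatially random pattern and exceeds 1 at separations where individuals cluster. A minimal 2-D sketch with periodic boundaries (box size and binning are arbitrary choices, not the poster's setup):

```python
import numpy as np

rng = np.random.default_rng(3)

def pair_correlation(points, box, r_max, n_bins=25):
    """Radial pair correlation function g(r) for points in a periodic
    square box; g(r) ~ 1 for a spatially random (Poisson) pattern and
    g(r) > 1 at short r indicates clustering."""
    n = len(points)
    d = points[:, None, :] - points[None, :, :]
    d -= box * np.round(d / box)                      # minimum-image convention
    r = np.sqrt((d ** 2).sum(-1))[np.triu_indices(n, k=1)]
    edges = np.linspace(0.0, r_max, n_bins + 1)
    counts, _ = np.histogram(r, bins=edges)
    shell = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)  # 2-D shell areas
    density = n / box ** 2
    # expected pair count per shell for an ideal (random) gas of the same density
    expected = 0.5 * n * density * shell
    centers = 0.5 * (edges[1:] + edges[:-1])
    return centers, counts / expected

pts = rng.uniform(0.0, 10.0, size=(1000, 2))   # Poisson (unclustered) points
r, g = pair_correlation(pts, box=10.0, r_max=2.5)
```

Applied to simulated virus and microbe positions, g(r) > 1 at small r would reveal the burst-induced clustering reported in the abstract.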

  11. Simulated Stochastic Approximation Annealing for Global Optimization with a Square-Root Cooling Schedule

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang, Faming; Cheng, Yichen; Lin, Guang

    2014-06-13

Simulated annealing has been widely used in the solution of optimization problems. As many researchers know, simulated annealing cannot be guaranteed to locate the global optima unless a logarithmic cooling schedule is used; however, the logarithmic cooling schedule is so slow as to be computationally impractical. This paper proposes a new stochastic optimization algorithm, the so-called simulated stochastic approximation annealing algorithm, which is a combination of simulated annealing and the stochastic approximation Monte Carlo algorithm. Under the framework of stochastic approximation Markov chain Monte Carlo, it is shown that the new algorithm can work with a cooling schedule in which the temperature can decrease much faster than in the logarithmic cooling schedule, e.g., a square-root cooling schedule, while guaranteeing that the global optima are reached as the temperature tends to zero. The new algorithm has been tested on a few benchmark optimization problems, including feed-forward neural network training and protein folding. The numerical results indicate that the new algorithm can significantly outperform simulated annealing and other competitors.
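The square-root cooling schedule T_k = T_0/√k can be dropped into a standard Metropolis-style annealing loop. The sketch below uses plain simulated annealing with that schedule on a toy multimodal objective; it illustrates the cooling schedule only, not the paper's stochastic approximation annealing machinery:

```python
import math
import random

random.seed(4)

def sa_sqrt_cooling(f, x0, t0=1.0, step=1.0, iters=5000):
    """Simulated annealing with a square-root cooling schedule
    T_k = t0 / sqrt(k).  A generic Metropolis sampler illustrating the
    cooling schedule; tuning constants are arbitrary choices."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for k in range(1, iters + 1):
        T = t0 / math.sqrt(k)                    # square-root cooling
        y = x + random.uniform(-step, step)      # random-walk proposal
        fy = f(y)
        # Metropolis acceptance at the current temperature
        if fy <= fx or random.random() < math.exp(-(fy - fx) / T):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

# Toy multimodal objective with global minimum 0 at x = 0.
best, fbest = sa_sqrt_cooling(lambda x: x * x + 10.0 * (1.0 - math.cos(x)), x0=8.0)
```

The paper's contribution is the stochastic approximation correction that makes such fast schedules provably convergent; plain annealing with this schedule carries no such guarantee.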

  12. Fast stochastic algorithm for simulating evolutionary population dynamics

    NASA Astrophysics Data System (ADS)

    Tsimring, Lev; Hasty, Jeff; Mather, William

    2012-02-01

    Evolution and co-evolution of ecological communities are stochastic processes often characterized by vastly different rates of reproduction and mutation and a coexistence of very large and very small sub-populations of co-evolving species. This creates serious difficulties for accurate statistical modeling of evolutionary dynamics. In this talk, we introduce a new exact algorithm for fast fully stochastic simulations of birth/death/mutation processes. It produces a significant speedup compared to the direct stochastic simulation algorithm in a typical case when the total population size is large and the mutation rates are much smaller than birth/death rates. We illustrate the performance of the algorithm on several representative examples: evolution on a smooth fitness landscape, NK model, and stochastic predator-prey system.

  13. A resilient and efficient CFD framework: Statistical learning tools for multi-fidelity and heterogeneous information fusion

    NASA Astrophysics Data System (ADS)

    Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em

    2017-09-01

Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited-accuracy but relatively costless auxiliary simulator, we can effectively fill in the missing spatial data at the required times by a statistical learning technique, multi-level Gaussian process regression, on the fly; this has been demonstrated in previous work [1]. Building on that work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, that detects computational redundancy in time and hence accelerates the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we set the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, multiple fidelity and resolution simulations (and even experiments) with a new adaptive timestep refinement technique. We present two benchmark problems (the heat equation and the Navier-Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. We consider, as such "auxiliary" models, a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier-Stokes equations. 
More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing in exascale simulations.
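The fill-in step rests on Gaussian process regression: surviving data points condition a GP, whose posterior mean reconstructs the missing values and whose posterior variance quantifies the uncertainty of the reconstruction. A single-level 1-D sketch (the multi-level, multi-fidelity machinery of the paper is omitted; kernel, length scale and noise level are assumptions):

```python
import numpy as np

def gp_posterior(x_train, y_train, x_test, ell=0.5, sigma_n=1e-2):
    """Posterior mean and variance of a GP with an RBF kernel: a minimal,
    single-level stand-in for the multi-level Gaussian process regression
    used to fill in missing spatial data."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)
    K = k(x_train, x_train) + sigma_n ** 2 * np.eye(len(x_train))
    Ks = k(x_test, x_train)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))   # K^{-1} y
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = 1.0 - np.sum(v ** 2, axis=0)   # prior variance 1 minus explained part
    return mean, var

# "Surviving" samples of a smooth spatial field, with a gap in the middle
# standing in for data lost to a processor failure.
x_obs = np.concatenate([np.linspace(0.0, 1.5, 8), np.linspace(3.5, 5.0, 8)])
y_obs = np.sin(x_obs)
x_gap = np.linspace(1.5, 3.5, 21)
mean, var = gp_posterior(x_obs, y_obs, x_gap)    # fill-in with uncertainty
```

In the paper's multi-level setting, the coarse auxiliary simulator would supply an additional low-fidelity layer that this single-level sketch omits.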

  14. From medium heterogeneity to flow and transport: A time-domain random walk approach

    NASA Astrophysics Data System (ADS)

    Hakoun, V.; Comolli, A.; Dentz, M.

    2017-12-01

The prediction of flow and transport processes in heterogeneous porous media is based on the qualitative and quantitative understanding of the interplay between 1) spatial variability of hydraulic conductivity, 2) groundwater flow and 3) solute transport. Using a stochastic modeling approach, we study this interplay through direct numerical simulations of Darcy flow and advective transport in heterogeneous media. First, we study flow in correlated hydraulic permeability fields and shed light on the relationship between the statistics of log-hydraulic conductivity, a medium attribute, and the flow statistics. Second, we determine relationships between Eulerian and Lagrangian velocity statistics, that is, between flow and transport attributes. We show how Lagrangian statistics and thus transport behaviors such as late particle arrival times are influenced by the medium heterogeneity on the one hand and the initial particle velocities on the other. We find that equidistantly sampled Lagrangian velocities can be described by a Markov process that evolves on the characteristic heterogeneity length scale. We employ a stochastic relaxation model for the equidistantly sampled particle velocities, which is parametrized by the velocity correlation length. This description results in a time-domain random walk model for the particle motion, whose spatial transitions are characterized by the velocity correlation length and temporal transitions by the particle velocities. This approach relates the statistical medium and flow properties to large scale transport, and allows for conditioning on the initial particle velocities and thus on the medium properties in the injection region. The approach is tested against direct numerical simulations.
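The time-domain random walk described above can be sketched directly: particles advance by fixed spatial increments, accrue time as dt = Δs/v, and their velocities evolve as a Markov chain over the correlation length. The Ornstein-Uhlenbeck log-velocity model and all parameter values below are assumptions for illustration, not the authors' calibrated parametrization:

```python
import numpy as np

rng = np.random.default_rng(6)

def tdrw_arrival_times(n_particles=500, n_steps=200, ds=1.0,
                       corr_len=5.0, mu_lnv=0.0, sig_lnv=1.0):
    """Time-domain random walk sketch: particles take fixed spatial steps ds,
    while their log-velocity relaxes over the correlation length corr_len
    toward a stationary lognormal distribution (an assumed Ornstein-Uhlenbeck
    model in distance)."""
    r = np.exp(-ds / corr_len)               # per-step velocity correlation
    w = rng.normal(size=n_particles)         # stationary initial log-velocity
    t = np.zeros(n_particles)
    for _ in range(n_steps):
        v = np.exp(mu_lnv + sig_lnv * w)
        t += ds / v                          # temporal transition: dt = ds / v
        # spatial-Markov (OU) update of the log-velocity
        w = r * w + np.sqrt(1.0 - r ** 2) * rng.normal(size=n_particles)
    return t                                 # arrival times at x = n_steps * ds

arrivals = tdrw_arrival_times()
```

Conditioning on the injection region amounts to replacing the stationary initial draw of `w` with the initial particle velocities there; slow initial velocities then produce the late-arrival tails discussed in the abstract.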

  15. Roles of dispersal, stochasticity, and nonlinear dynamics in the spatial structuring of seasonal natural enemy-victim populations

    Treesearch

    Patrick C. Tobin; Ottar N. Bjornstad

    2005-01-01

    Natural enemy-victim systems may exhibit a range of dynamic space-time patterns. We used a theoretical framework to study spatiotemporal structuring in a transient natural enemy-victim system subject to differential rates of dispersal, stochastic forcing, and nonlinear dynamics. Highly mobile natural enemies that attacked less mobile victims were locally spatially...

  16. Combining Individual-Based Modeling and Food Microenvironment Descriptions To Predict the Growth of Listeria monocytogenes on Smear Soft Cheese

    PubMed Central

    Ferrier, Rachel; Hezard, Bernard; Lintz, Adrienne; Stahl, Valérie

    2013-01-01

An individual-based modeling (IBM) approach was developed to describe the behavior of a few Listeria monocytogenes cells contaminating a smear soft cheese surface. The IBM approach consisted of assessing the stochastic individual behaviors of cells on cheese surfaces given the characteristics of their surrounding microenvironments. We used a microelectrode for pH measurements and micro-osmolality measurements to assess the water activity of cheese microsamples. These measurements revealed a high variability of microscale pH compared to that of macroscale pH. A model describing the increase in pH from approximately 5.0 to more than 7.0 during ripening was developed. The spatial variability of the cheese surface, characterized by an increasing pH with radius and higher pH on crests compared to hollows on the cheese rind, was also modeled. The microscale water activity ranged from approximately 0.96 to 0.98 and was stable during ripening. The spatial variability on cheese surfaces was low compared to between-cheese variability. Models describing the microscale variability of cheese characteristics were combined with the IBM approach to simulate the stochastic growth of L. monocytogenes on cheese, and these simulations were compared to bacterial counts obtained from irradiated cheeses artificially contaminated at different ripening stages. The variability of L. monocytogenes counts simulated with the IBM/microenvironmental approach was consistent with the observed one. Contrasting situations corresponding to no growth or highly contaminated foods could be deduced from these models. Moreover, the IBM approach was more effective than the traditional population/macroenvironmental approach at describing the actual variability of bacterial behavior. PMID:23872572
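The individual-based idea, each founding cell growing stochastically under its own microenvironmental conditions, can be sketched as follows; the linear cardinal-pH growth factor and all parameter values are illustrative stand-ins, not the paper's calibrated cheese model:

```python
import random

random.seed(7)

def simulate_cell(ph, t_end=24.0, mu_opt=0.4, ph_min=4.6, ph_opt=7.0):
    """Stochastic growth of the lineage founded by a single cell in its own
    microenvironment: each division is a Poisson event whose rate depends on
    the local pH through a simple linear cardinal-pH factor (an assumed
    model for illustration)."""
    gamma = max(0.0, min(1.0, (ph - ph_min) / (ph_opt - ph_min)))
    mu = mu_opt * gamma               # per-cell division rate (1/h)
    n, t = 1, 0.0
    while mu > 0.0:
        dt = random.expovariate(n * mu)   # time to the next division event
        if t + dt > t_end:
            break
        t += dt
        n += 1
    return n

# Each contaminating cell experiences a different microscale pH.
phs = [random.gauss(5.5, 0.4) for _ in range(50)]
counts = [simulate_cell(ph) for ph in phs]
```

The spread of `counts` across cells mirrors the simulated between-cell variability that the paper compares against observed counts; a macroenvironmental model would instead apply one pH to all cells and miss this spread.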

  17. Novel patch modelling method for efficient simulation and prediction uncertainty analysis of multi-scale groundwater flow and transport processes

    NASA Astrophysics Data System (ADS)

    Sreekanth, J.; Moore, Catherine

    2018-04-01

The application of global sensitivity and uncertainty analysis techniques to groundwater models of deep sedimentary basins is typically challenged by large computational burdens combined with associated numerical stability issues. The highly parameterized approaches required for exploring the predictive uncertainty associated with the heterogeneous hydraulic characteristics of multiple aquifers and aquitards in these sedimentary basins exacerbate these issues. A novel Patch Modelling Methodology is proposed for improving the computational feasibility of stochastic modelling analysis of large-scale and complex groundwater models. The method incorporates a nested groundwater modelling framework that enables efficient simulation of groundwater flow and transport across multiple spatial and temporal scales. The method also allows different processes to be simulated within different model scales. Existing nested model methodologies are extended by employing 'joining predictions' for extrapolating prediction-salient information from one model scale to the next. This establishes a feedback mechanism supporting the transfer of information from child models to parent models as well as from parent models to child models in a computationally efficient manner. This feedback mechanism is simple and flexible and ensures that, while the salient small-scale features influencing larger-scale predictions are transferred back to the larger scale, this does not require the live coupling of models. This method allows the modelling of multiple groundwater flow and transport processes using separate groundwater models that are built for the appropriate spatial and temporal scales, within a stochastic framework, while also removing the computational burden associated with live model coupling. The utility of the method is demonstrated by application to an actual large-scale aquifer injection scheme in Australia.

  18. A statistical approach to quasi-extinction forecasting.

    PubMed

    Holmes, Elizabeth Eli; Sabo, John L; Viscido, Steven Vincent; Fagan, William Fredric

    2007-12-01

Forecasting population decline to a certain critical threshold (the quasi-extinction risk) is one of the central objectives of population viability analysis (PVA), and such predictions figure prominently in the decisions of major conservation organizations. In this paper, we argue that accurate forecasting of a population's quasi-extinction risk does not necessarily require knowledge of the underlying biological mechanisms. Because of the stochastic and multiplicative nature of population growth, the ensemble behaviour of population trajectories converges to common statistical forms across a wide variety of stochastic population processes. This paper provides a theoretical basis for this argument. We show that the quasi-extinction surfaces of a variety of complex stochastic population processes (including age-structured, density-dependent and spatially structured populations) can be modelled by a simple stochastic approximation: the stochastic exponential growth process overlaid with Gaussian errors. Using simulated and real data, we show that this model can be estimated with 20-30 years of data and can provide relatively unbiased quasi-extinction risk estimates with confidence intervals considerably smaller than (0,1). This was found to be true even for simulated data derived from some of the noisiest population processes (density-dependent feedback, species interactions and strong age-structure cycling). A key advantage of statistical models is that their parameters and the uncertainty of those parameters can be estimated from time series data using standard statistical methods. In contrast, for most species of conservation concern, biologically realistic models must often be specified rather than estimated because of the limited data available for all the various parameters. 
Biologically realistic models will always have a prominent place in PVA for evaluating specific management options which affect a single segment of a population, a single demographic rate, or different geographic areas. However, for forecasting quasi-extinction risk, statistical models that are based on the convergent statistical properties of population processes offer many advantages over biologically realistic models.
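The diffusion approximation at the heart of this argument, stochastic exponential growth with Gaussian errors on the log scale, makes quasi-extinction forecasting a short Monte Carlo exercise; the parameter values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(8)

def quasi_extinction_prob(n0, threshold, mu, sigma, years, n_sims=20000):
    """Monte Carlo estimate of the probability that a population following
    stochastic exponential growth (log-abundance random walk with drift mu
    and environmental noise sigma) falls below a quasi-extinction threshold
    within the given time horizon."""
    logN = np.full(n_sims, np.log(n0), dtype=float)
    hit = np.zeros(n_sims, dtype=bool)
    for _ in range(years):
        logN += mu + sigma * rng.normal(size=n_sims)   # annual log-growth
        hit |= logN <= np.log(threshold)               # first-passage check
    return hit.mean()

# Declining population: 100 individuals, 5% mean annual decline,
# quasi-extinction threshold of 10 individuals, 30-year horizon.
p = quasi_extinction_prob(n0=100, threshold=10, mu=-0.05, sigma=0.3, years=30)
```

In practice `mu` and `sigma` would be estimated from the 20-30 years of census data the paper describes, and the uncertainty in those estimates propagated into a confidence interval on `p`.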

  19. Stochastic dynamics of penetrable rods in one dimension: occupied volume and spatial order.

    PubMed

    Craven, Galen T; Popov, Alexander V; Hernandez, Rigoberto

    2013-06-28

    The occupied volume of a penetrable hard rod (HR) system in one dimension is probed through the use of molecular dynamics simulations. In these dynamical simulations, collisions between penetrable rods are governed by a stochastic penetration algorithm (SPA), which allows for rods to either interpenetrate with a probability δ, or collide elastically otherwise. The limiting values of this parameter, δ = 0 and δ = 1, correspond to the HR and the ideal limits, respectively. At intermediate values, 0 < δ < 1, mixing of mutually exclusive and independent events is observed, making prediction of the occupied volume nontrivial. At high hard core volume fractions φ0, the occupied volume expression derived by Rikvold and Stell [J. Chem. Phys. 82, 1014 (1985)] for permeable systems does not accurately predict the occupied volume measured from the SPA simulations. Multi-body effects contribute significantly to the pair correlation function g2(r) and the simplification by Rikvold and Stell that g2(r) = δ in the penetrative region is observed to be inaccurate for the SPA model. We find that an integral over the penetrative region of g2(r) is the principal quantity that describes the particle overlap ratios corresponding to the observed penetration probabilities. Analytic formulas are developed to predict the occupied volume of mixed systems and agreement is observed between these theoretical predictions and the results measured from simulation.

  20. Computationally-efficient stochastic cluster dynamics method for modeling damage accumulation in irradiated materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Hoang, Tuan L.; Marian, Jaime

    2015-11-01

An improved version of a recently developed stochastic cluster dynamics (SCD) method (Marian and Bulatov, 2012) [6] is introduced as an alternative to rate theory (RT) methods for solving coupled ordinary differential equation (ODE) systems for irradiation damage simulations. SCD circumvents by design the curse of dimensionality of the variable space that renders traditional ODE-based RT approaches inefficient when handling complex defect population comprised of multiple (more than two) defect species. Several improvements introduced here enable efficient and accurate simulations of irradiated materials up to realistic (high) damage doses characteristic of next-generation nuclear systems. The first improvement is a procedure for efficiently updating the defect reaction-network and event selection in the context of a dynamically expanding reaction-network. Next is a novel implementation of the τ-leaping method that speeds up SCD simulations by advancing the state of the reaction network in large time increments when appropriate. Lastly, a volume rescaling procedure is introduced to control the computational complexity of the expanding reaction-network through occasional reductions of the defect population while maintaining accurate statistics. The enhanced SCD method is then applied to model defect cluster accumulation in iron thin films subjected to triple ion-beam (Fe3+, He+ and H+) irradiations, for which standard RT or spatially-resolved kinetic Monte Carlo simulations are prohibitively expensive.
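The τ-leaping ingredient mentioned above can be illustrated generically: instead of simulating one reaction at a time, each channel fires a Poisson-distributed number of times per fixed increment τ. The toy generation/annihilation network below is an assumption for illustration, not the SCD defect reaction-network:

```python
import numpy as np

rng = np.random.default_rng(9)

def tau_leap(x0, rates, stoich, tau, n_steps):
    """Poisson tau-leaping: advance the whole reaction network by a fixed
    time increment tau, firing each channel a Poisson-distributed number
    of times.  A generic sketch of the method named in the abstract, not
    the SCD implementation itself."""
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(n_steps):
        a = np.array([r(x) for r in rates])    # current propensities
        k = rng.poisson(a * tau)               # firings per channel this leap
        x = np.maximum(x + stoich.T @ k, 0.0)  # clamp to avoid negative counts
        traj.append(x.copy())
    return np.array(traj)

# Defect-like toy system: generation 0 -> A (rate g), annihilation A + A -> 0.
g, k2 = 50.0, 0.01
rates = [lambda x: g, lambda x: k2 * x[0] * (x[0] - 1) / 2]
stoich = np.array([[1.0], [-2.0]])             # rows: reactions; columns: species
traj = tau_leap([0.0], rates, stoich, tau=0.05, n_steps=400)
```

Production-grade leap methods additionally adapt τ to keep propensities nearly constant over a leap; the fixed τ here is for brevity.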

  2. Variance decomposition in stochastic simulators.

    PubMed

    Le Maître, O P; Knio, O M; Moraes, A

    2015-06-28

    This work aims at the development of a mathematical and computational approach that enables quantification of the inherent sources of stochasticity and of the corresponding sensitivities in stochastic simulations of chemical reaction networks. The approach is based on reformulating the system dynamics as being generated by independent standardized Poisson processes. This reformulation affords a straightforward identification of individual realizations for the stochastic dynamics of each reaction channel, and consequently a quantitative characterization of the inherent sources of stochasticity in the system. By relying on the Sobol-Hoeffding decomposition, the reformulation enables us to perform an orthogonal decomposition of the solution variance. Thus, by judiciously exploiting the inherent stochasticity of the system, one is able to quantify the variance-based sensitivities associated with individual reaction channels, as well as the importance of channel interactions. Implementation of the algorithms is illustrated in light of simulations of simplified systems, including the birth-death, Schlögl, and Michaelis-Menten models.
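The reformulation assigns each reaction channel its own independent stream of randomness, which makes channel-wise, pick-freeze style variance decomposition possible. A minimal sketch on a birth-death model, using per-channel random streams and a first-order Sobol-type estimator (the specific estimator and parameters are assumptions for illustration, not the paper's algorithm):

```python
import numpy as np

def birth_death_final(seed_pair, t_end=10.0, b=1.0, d=0.1, x0=0):
    """Final population of a birth-death process in which each reaction
    channel consumes its own independent random stream, mirroring the
    independent-Poisson-process reformulation described in the abstract."""
    rng_birth = np.random.default_rng(int(seed_pair[0]))
    rng_death = np.random.default_rng(int(seed_pair[1]))
    t, x = 0.0, x0
    while True:
        # candidate firing times of the two channels (competing exponentials)
        tb = t + rng_birth.exponential(1.0 / b)
        td = t + (rng_death.exponential(1.0 / (d * x)) if x > 0 else np.inf)
        t = min(tb, td)
        if t > t_end:
            return x
        x += 1 if tb < td else -1

rng = np.random.default_rng(10)
n = 300
A = rng.integers(0, 2**31, size=(n, 2))
B = rng.integers(0, 2**31, size=(n, 2))
fA = np.array([birth_death_final(s) for s in A], dtype=float)
# pick-freeze: keep the birth channel's stream, resample the death stream
fAB = np.array([birth_death_final((a[0], b2[1])) for a, b2 in zip(A, B)], dtype=float)
# first-order Sobol-type sensitivity of the output to the birth channel
S_birth = (np.mean(fA * fAB) - fA.mean() * fAB.mean()) / fA.var()
```

Repeating the freeze for the death channel, and for pairs of channels, yields the full orthogonal decomposition of the output variance that the abstract describes.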

  4. Stochastic population dynamics in spatially extended predator-prey systems

    NASA Astrophysics Data System (ADS)

    Dobramysl, Ulrich; Mobilia, Mauro; Pleimling, Michel; Täuber, Uwe C.

    2018-02-01

Spatially extended population dynamics models that incorporate demographic noise serve as case studies for the crucial role of fluctuations and correlations in biological systems. Numerical and analytic tools from non-equilibrium statistical physics capture the stochastic kinetics of these complex interacting many-particle systems beyond rate equation approximations. Including spatial structure and stochastic noise in models for predator-prey competition invalidates the neutral Lotka-Volterra population cycles. Stochastic models yield long-lived erratic oscillations stemming from a resonant amplification mechanism. Spatially extended predator-prey systems display noise-stabilized activity fronts that generate persistent correlations. Fluctuation-induced renormalizations of the oscillation parameters can be analyzed perturbatively via a Doi-Peliti field theory mapping of the master equation; related tools allow detailed characterization of extinction pathways. The critical steady-state and non-equilibrium relaxation dynamics at the predator extinction threshold are governed by the directed percolation universality class. Spatial predation rate variability results in more localized clusters, enhancing both competing species’ population densities. Affixing variable interaction rates to individual particles and allowing for trait inheritance subject to mutations induces fast evolutionary dynamics for the rate distributions. Stochastic spatial variants of three-species competition with ‘rock-paper-scissors’ interactions metaphorically describe cyclic dominance. These models illustrate intimate connections between population dynamics and evolutionary game theory, underscore the role of fluctuations in driving populations toward extinction, and demonstrate how space can support species diversity. 
Two-dimensional cyclic three-species May-Leonard models are characterized by the emergence of spiraling patterns whose properties are elucidated by a mapping onto a complex Ginzburg-Landau equation. Multiple-species extensions to general ‘food networks’ can be classified on the mean-field level, providing both fundamental understanding of ensuing cooperativity and profound insight into the rich spatio-temporal features and coarsening kinetics in the corresponding spatially extended systems. Novel space-time patterns emerge as a result of the formation of competing alliances; e.g. coarsening domains that each incorporate rock-paper-scissors competition games.
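    The kind of individual-based stochastic lattice predator-prey dynamics discussed above can be sketched in a few lines. The following is an illustrative toy model, not the authors' code: the grid size, reaction rates, and sequential random-site update rule are assumptions chosen for brevity.

```python
import random

def simulate(L=20, steps=200, pred_death=0.2, predation=0.5, prey_birth=0.3, seed=1):
    """Toy stochastic lattice Lotka-Volterra model: 0 = empty, 1 = prey, 2 = predator."""
    rng = random.Random(seed)
    grid = [[rng.choice([0, 1, 2]) for _ in range(L)] for _ in range(L)]
    history = []
    for _ in range(steps):
        for _ in range(L * L):  # one Monte Carlo step = L*L random site updates
            x, y = rng.randrange(L), rng.randrange(L)
            nx = (x + rng.choice([-1, 0, 1])) % L
            ny = (y + rng.choice([-1, 0, 1])) % L
            s, n = grid[x][y], grid[nx][ny]
            if s == 2:
                if rng.random() < pred_death:
                    grid[x][y] = 0          # predator dies
                elif n == 1 and rng.random() < predation:
                    grid[nx][ny] = 2        # predation with predator reproduction
            elif s == 1 and n == 0 and rng.random() < prey_birth:
                grid[nx][ny] = 1            # prey reproduces into an empty site
        history.append((sum(row.count(1) for row in grid),
                        sum(row.count(2) for row in grid)))
    return history

hist = simulate()
```

    Tracking the (prey, predator) counts in `hist` exhibits the erratic, noise-sustained oscillations that the rate equations miss.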

  5. Simulating and quantifying legacy topographic data uncertainty: an initial step to advancing topographic change analyses

    NASA Astrophysics Data System (ADS)

    Wasklewicz, Thad; Zhu, Zhen; Gares, Paul

    2017-12-01

    Rapid technological advances, sustained funding, and a greater recognition of the value of topographic data have helped develop an increasing archive of topographic data sources. Advances in basic and applied research related to Earth surface changes require researchers to integrate recent high-resolution topography (HRT) data with legacy datasets. Several technical challenges and data uncertainty issues persist when integrating legacy datasets with more recent HRT data. The disparate data sources required to extend the topographic record back in time are often stored in formats that are not readily compatible with more recent HRT data. Legacy data may also contain unknown or unreported error that makes accounting for data uncertainty difficult. There are also cases of known deficiencies in legacy datasets, which can significantly bias results. Finally, scientists are faced with the daunting challenge of definitively deriving the extent to which a landform or landscape has changed, or will continue to change, in response to natural and/or anthropogenic processes. Here, we examine the question: how do we evaluate and portray data uncertainty from the varied topographic legacy sources and combine this uncertainty with current spatial data collection techniques to detect meaningful topographic changes? We view topographic uncertainty as a stochastic process that takes into consideration spatial and temporal variations from a numerical simulation and a physical modeling experiment. The numerical simulation incorporates numerous topographic data sources, ranging from typical legacy data to present high-resolution data, while the physical model focuses on more recent HRT data acquisition techniques. Elevation uncertainties observed from anchor points in the digital terrain models are modeled using "states" in a stochastic estimator. 
Stochastic estimators trace the temporal evolution of the uncertainties and are natively capable of incorporating sensor measurements observed at various times in history. The geometric relationship between an anchor point and a sensor measurement can be approximated via spatial correlation even when a sensor does not directly observe the anchor point. Findings from a numerical simulation indicate the estimated error coincides with the actual error for certain sensors (kinematic GNSS, ALS, TLS, and SfM-MVS). Data from 2D imagery and static GNSS did not perform as well at the time the sensor is integrated into the estimator, largely as a result of the low density of data added from these sources. The estimator provides a history of DEM estimation as well as the uncertainties and cross correlations observed on anchor points. Our work provides preliminary evidence that our approach is valid for integrating legacy data with HRT and warrants further exploration and field validation.
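    The idea of treating anchor point elevations as "states" that are refined as sensor measurements arrive can be pictured with a minimal scalar Kalman update. All numbers below are hypothetical (a legacy elevation refined by two progressively more precise surveys); the actual estimator fuses many anchor points with spatial cross correlations.

```python
def kalman_update(z_est, var_est, z_obs, var_obs):
    """Fuse a prior elevation estimate with one noisy sensor observation."""
    gain = var_est / (var_est + var_obs)
    z_new = z_est + gain * (z_obs - z_est)
    var_new = (1.0 - gain) * var_est
    return z_new, var_new

# Hypothetical anchor point observed by surveys of decreasing noise,
# e.g. legacy DEM -> ALS -> TLS.
z, var = 100.0, 4.0            # legacy elevation (m) and its variance (m^2)
for z_obs, var_obs in [(100.8, 1.0), (100.5, 0.04)]:
    z, var = kalman_update(z, var, z_obs, var_obs)
```

    Each fusion step shrinks the variance, so the estimator carries a full history of how uncertainty decreased as each sensor was integrated.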

  6. Breast ultrasound computed tomography using waveform inversion with source encoding

    NASA Astrophysics Data System (ADS)

    Wang, Kun; Matthews, Thomas; Anis, Fatima; Li, Cuiping; Duric, Neb; Anastasio, Mark A.

    2015-03-01

    Ultrasound computed tomography (USCT) holds great promise for improving the detection and management of breast cancer. Because they are based on the acoustic wave equation, waveform inversion-based reconstruction methods can produce images that possess improved spatial resolution properties over those produced by ray-based methods. However, waveform inversion methods are computationally demanding and have not been applied widely in USCT breast imaging. In this work, source encoding concepts are employed to develop an accelerated USCT reconstruction method that circumvents the large computational burden of conventional waveform inversion methods. This method, referred to as the waveform inversion with source encoding (WISE) method, encodes the measurement data using a random encoding vector and determines an estimate of the speed-of-sound distribution by solving a stochastic optimization problem by use of a stochastic gradient descent algorithm. Computer-simulation studies are conducted to demonstrate the use of the WISE method. Using a single graphics processing unit card, each iteration can be completed within 25 seconds for a 128 × 128 mm² reconstruction region. The results suggest that the WISE method maintains the high spatial resolution of waveform inversion methods while significantly reducing the computational burden.
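    The core trick, replacing the full gradient with a cheap stochastic gradient computed from randomly encoded data, can be sketched on a toy linear least-squares problem. This is a hedged analogue, not the paper's wave-equation setting: the 3×2 system, the Rademacher (±1) encoding vector, the step size, and the iteration count are all illustrative assumptions.

```python
import random

def encoded_sgd(A, b, steps=2000, lr=0.01, seed=0):
    """Toy source-encoding analogue: each step combines all 'sources' (rows of
    A) with random +/-1 weights and descends the gradient of the single
    encoded residual; in expectation this equals the full gradient."""
    rng = random.Random(seed)
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(steps):
        w = [rng.choice((-1.0, 1.0)) for _ in range(m)]        # encoding vector
        r = sum(w[i] * (sum(A[i][j] * x[j] for j in range(n)) - b[i])
                for i in range(m))                              # encoded residual
        for j in range(n):                                      # stochastic gradient step
            x[j] -= lr * r * sum(w[i] * A[i][j] for i in range(m))
    return x

# Hypothetical consistent 3x2 system with exact solution (1, 2).
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [1.0, 2.0, 3.0]
x = encoded_sgd(A, b)
```

    Because the encoded residual costs one combined "shot" instead of one per source, each iteration is far cheaper than evaluating the full objective, which is exactly the saving WISE exploits for wave simulations.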

  7. Estimator banks: a new tool for direction-of-arrival estimation

    NASA Astrophysics Data System (ADS)

    Gershman, Alex B.; Boehme, Johann F.

    1997-10-01

    A powerful new tool for improving the threshold performance of direction-of-arrival (DOA) estimation is considered. The essence of our approach is to reduce the number of outliers in the threshold domain using a so-called estimator bank containing multiple 'parallel' underlying DOA estimators which are based on pseudorandom resampling of the MUSIC spatial spectrum for a given data batch or sample covariance matrix. To improve the threshold performance relative to conventional MUSIC, evolutionary principles are used, i.e., only 'successful' underlying estimators (having no failure in the preliminary estimated source localization sectors) are exploited in the final estimate. An efficient beamspace root implementation of the estimator bank approach is developed, combined with the array interpolation technique which enables the application to arbitrary arrays. A higher-order extension of our approach is also presented, where the cumulant-based MUSIC estimator is exploited as a basic technique for spatial spectrum resampling. Simulations and experimental data processing show that our algorithm performs well below the MUSIC threshold, namely, it has threshold performance similar to that of the stochastic ML method. At the same time, the computational cost of our algorithm is much lower than that of stochastic ML because no multidimensional optimization is involved.

  8. Deterministic ripple-spreading model for complex networks.

    PubMed

    Hu, Xiao-Bing; Wang, Ming; Leeson, Mark S; Hines, Evor L; Di Paolo, Ezequiel

    2011-04-01

    This paper proposes a deterministic complex network model, which is inspired by the natural ripple-spreading phenomenon. The motivations and main advantages of the model are the following: (i) The establishment of many real-world networks is a dynamic process, where it is often observed that the influence of a few local events spreads out through nodes, and then largely determines the final network topology. Obviously, this dynamic process involves many spatial and temporal factors. By simulating the natural ripple-spreading process, this paper reports a very natural way to set up a spatial and temporal model for such complex networks. (ii) Existing relevant network models are all stochastic models, i.e., with a given input, they cannot output a unique topology. In contrast, the proposed ripple-spreading model can uniquely determine the final network topology, and at the same time, the stochastic feature of complex networks is captured by randomly initializing ripple-spreading related parameters. (iii) The proposed model can use an easily manageable number of ripple-spreading related parameters to precisely describe a network topology, which is more memory-efficient than a traditional adjacency matrix or similar memory-expensive data structures. (iv) The ripple-spreading model has good potential for both extensions and applications.
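    The determinism-with-random-initialization idea can be sketched as follows. This is a simplified illustration, not the paper's exact model: here every node emits one ripple whose energy decays linearly with radius, and a link forms wherever a ripple still carries positive energy when it reaches another node. The node layout and ripple energies are the randomly initialized parameters; the construction itself is deterministic.

```python
import math, random

def ripple_network(points, energies, decay=1.0):
    """Deterministic ripple-spreading sketch: node i's ripple dies out at
    radius energies[i]/decay; nodes reached before that radius get linked.
    The same points and energies always yield the same topology."""
    edges = set()
    for i, (xi, yi) in enumerate(points):
        reach = energies[i] / decay
        for j, (xj, yj) in enumerate(points):
            if i != j and math.hypot(xj - xi, yj - yi) <= reach:
                edges.add((min(i, j), max(i, j)))
    return edges

# Random spatial layout and ripple energies supply the stochastic ingredient.
rng = random.Random(42)
pts = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(30)]
en = [rng.uniform(1.0, 4.0) for _ in range(30)]
net = ripple_network(pts, en)
```

    Note the memory argument: storing 30 points and 30 energies fully determines `net`, whereas an adjacency matrix would need 30 × 30 entries.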

  9. Large-scale derived flood frequency analysis based on continuous simulation

    NASA Astrophysics Data System (ADS)

    Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno

    2016-04-01

    There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km²), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the spatially inherent heterogeneity. The weather generator is a multisite and multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation and covariance between the variables. They are used as input into catchment models. A long-term simulation of this combined system makes it possible to derive very long discharge series at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained on 53 years of observation data from 528 stations covering not only the whole of Germany but also parts of France, Switzerland, the Czech Republic and Austria, with an aggregated spatial scale of 443,931 km². 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Donau and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation, this model chain is then used to estimate flood quantiles for the whole of Germany, including upstream headwater catchments in neighbouring countries. 
This continuous large-scale approach overcomes several drawbacks reported in traditional approaches to derived flood frequency analysis and is therefore recommended for large-scale flood risk case studies.
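    A multisite generator that respects both lag-1 autocorrelation and inter-site correlation can be sketched with a shared regional shock plus local noise. This is a deliberately minimal stand-in for the full multivariate weather generator; the site count, correlation parameters, and single-variable Gaussian anomalies are assumptions for illustration.

```python
import math, random

def weather_fields(n_sites, n_days, rho_t=0.7, rho_s=0.5, seed=7):
    """Minimal multisite generator: daily anomalies with lag-1 autocorrelation
    rho_t; a common regional shock gives inter-site correlation rho_s."""
    rng = random.Random(seed)
    prev = [0.0] * n_sites
    series = []
    for _ in range(n_days):
        common = rng.gauss(0.0, 1.0)                 # shared regional shock
        day = []
        for _ in range(n_sites):
            local = rng.gauss(0.0, 1.0)
            shock = math.sqrt(rho_s) * common + math.sqrt(1 - rho_s) * local
            day.append(rho_t * prev[len(day)] + math.sqrt(1 - rho_t ** 2) * shock)
        prev = day
        series.append(day)
    return series

fields = weather_fields(n_sites=5, n_days=3650)
```

    Long synthetic series like these, fed through a rainfall-runoff model, are what allow flood quantiles to be estimated by continuous simulation rather than by fitting distributions to short observed records.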

  10. Stochastic simulation of Venice land uplift by seawater injection in deep heterogeneous aquifers

    NASA Astrophysics Data System (ADS)

    Ferronato, M.; Gambolati, G.; Teatini, P.; Bau, D. A.; Putti, M.

    2010-12-01

    In recent years, several geo-mechanical modeling studies have indicated that seawater injection into deep formations underneath the city of Venice, Italy, may produce a land uplift sufficient to significantly mitigate the effects of the acqua alta, that is, the exceptional tide peaks that periodically occur in the northern Adriatic Sea. However, of major concern is the differential vertical displacement at the ground surface, which must not exceed prescribed regulatory thresholds to guarantee structural preservation of the city's historical buildings. In this work, we focus on the hydraulic conductivity, K, which - due to its inherent spatial heterogeneity - is often one of the most difficult hydrogeological parameters to characterize, and analyze the effects that spatially heterogeneous aquifer K distributions may have on the uniformity of the induced land uplift. This study relies on a series of stochastic geomechanical simulations performed using an uncoupled 3D finite-element poro-elastic model and refers to a pilot project devised to address the feasibility and sustainability of an actual full-scale injection program. The pilot project considers the recharge of about 3,100 m³/day of seawater over three years from three injection wells installed into six aquifers located between depths of 600 and 800 meters. The K field is modeled geostatistically according to an unconditional, second-order, stationary random process characterized by an exponential covariance function. The stochastic geomechanical simulations are structured into a sensitivity analysis in order to investigate the impact of the variance, σ², and the horizontal correlation scale, λ, of the K field on the spatial distributions of the ground surface uplift and its horizontal gradient ρz. 
The results indicate that, irrespective of the σ² and λ values, properly selected within the ranges 0.2-1.0 and 20-1000 m, respectively, typical of normally consolidated sedimentary basins, the predicted uplift is substantially regular with negligible differential displacements. Even under the most pessimistic scenario (σ²=1.0 and λ=1000 m), the maximum ρz is two to three times smaller than that experienced by the city during the 1960s due to groundwater pumping (10×10⁻⁵), one order of magnitude less than the maximum limit allowed for masonry buildings (50×10⁻⁵), and about 20 times smaller than the values that restricted portions of the city are currently experiencing due to surficial loads and to possible heterogeneities of the upper Holocene deposits upon which the city is founded.
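    Sampling a stationary Gaussian field with exponential covariance is commonly done by factorizing the covariance matrix. The sketch below draws log-conductivity on a hypothetical 1-D transect via a plain Cholesky factorization; the transect spacing, σ² = 1.0 and λ = 200 m are assumed values within the ranges studied, and the real simulations would use conditional 3-D fields.

```python
import math, random

def exp_cov_field(xs, variance=1.0, corr_len=200.0, seed=3):
    """Sample a zero-mean Gaussian field (e.g. log K) at 1-D locations xs with
    exponential covariance C(h) = variance * exp(-h / corr_len), via Cholesky."""
    n = len(xs)
    C = [[variance * math.exp(-abs(xs[i] - xs[j]) / corr_len) for j in range(n)]
         for i in range(n)]
    L = [[0.0] * n for _ in range(n)]        # Cholesky factor: C = L L^T
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(C[i][i] - s) if i == j else (C[i][j] - s) / L[j][j]
    rng = random.Random(seed)
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]

# Hypothetical transect: 50 points, 10 m apart.
xs = [10.0 * i for i in range(50)]
logK = exp_cov_field(xs)
```

    Repeating the draw with different seeds and running the geomechanical model on each realization is what turns a single deterministic prediction into a distribution of uplift gradients.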

  11. Spatial scale affects the relative role of stochasticity versus determinism in soil bacterial communities in wheat fields across the North China Plain.

    PubMed

    Shi, Yu; Li, Yuntao; Xiang, Xingjia; Sun, Ruibo; Yang, Teng; He, Dan; Zhang, Kaoping; Ni, Yingying; Zhu, Yong-Guan; Adams, Jonathan M; Chu, Haiyan

    2018-02-05

    The relative importance of stochasticity versus determinism in soil bacterial communities is unclear, as are the possible influences that alter the balance between these. Here, we investigated the influence of spatial scale on the relative role of stochasticity and determinism in agricultural monocultures consisting only of wheat, thereby minimizing the influence of differences in plant species cover and in cultivation/disturbance regime, extending across a wide range of soils and climates of the North China Plain (NCP). We sampled 243 sites across 1092 km and sequenced the 16S rRNA bacterial gene using MiSeq. We hypothesized that determinism would play a relatively stronger role at the broadest scales, due to the strong influence of climate and soil differences in selecting many distinct OTUs of bacteria adapted to the different environments. In order to test the more general applicability of the hypothesis, we also compared with a natural ecosystem on the Tibetan Plateau. Our results revealed that the relative importance of stochasticity vs. determinism did vary with spatial scale, in the direction predicted. On the North China Plain, stochasticity played a dominant role from 150 to 900 km (separation between pairs of sites) and determinism dominated at more than 900 km (broad scale). On the Tibetan Plateau, determinism played a dominant role from 130 to 1200 km and stochasticity dominated at less than 130 km. Among the identifiable deterministic factors, soil pH showed the strongest influence on soil bacterial community structure and diversity across the North China Plain. Together, 23.9% of variation in soil microbial community composition could be explained, with environmental factors accounting for 19.7% and spatial parameters 4.1%. 
Our findings revealed that (1) stochastic processes are relatively more important on the North China Plain, while deterministic processes are more important on the Tibetan Plateau; (2) soil pH was the major factor in shaping soil bacterial community structure of the North China Plain; and (3) most variation in soil microbial community composition could not be explained with existing environmental and spatial factors. Further studies are needed to dissect the influence of stochastic factors (e.g., mutations or extinctions) on soil microbial community distribution, which might make it easier to predictably manipulate the microbial community to produce better yield and soil sustainability outcomes.

  12. `spup' - An R Package for Analysis of Spatial Uncertainty Propagation and Application to Trace Gas Emission Simulations

    NASA Astrophysics Data System (ADS)

    Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.

    2016-12-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation analysis and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation starting from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we demonstrate that the 'spup' package is an effective and easy-to-use tool that can be applied even in a very complex case study, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse the propagation of uncertainties associated with spatial variability of the model driving forces, such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for the prediction of emissions of N₂O and CO₂ for a German low-mountainous, agriculturally developed catchment. The study tests the effect of spatial correlations on spatially aggregated model outputs, and could serve as guidance for developing best management practices and model improvement strategies.
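    The Monte Carlo workflow that 'spup' automates (specify an uncertainty model for the inputs, sample, run the model, summarize the outputs) can be sketched generically. This is not the 'spup' API; the emission model, its coefficients, and the Gaussian input uncertainties below are hypothetical stand-ins.

```python
import random, statistics

def propagate(model, sample_inputs, n=5000, seed=11):
    """Monte Carlo uncertainty propagation: draw inputs from their uncertainty
    model, run the model on each draw, summarize the output distribution."""
    rng = random.Random(seed)
    outputs = [model(*sample_inputs(rng)) for _ in range(n)]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Hypothetical emission model: output depends linearly on rainfall (mm) and
# fertilizer input (kg/ha), each with Gaussian input uncertainty.
def emission(rain, fert):
    return 0.02 * rain + 0.3 * fert

def sample_inputs(rng):
    return rng.gauss(800.0, 100.0), rng.gauss(50.0, 5.0)

mean, sd = propagate(emission, sample_inputs)
```

    For spatial inputs, the sampling step would draw correlated fields (as in the geostatistical example in record 10) rather than independent scalars, which is exactly where spatial correlation starts to matter for aggregated outputs.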

  13. Opinion strength influences the spatial dynamics of opinion formation

    PubMed Central

    Baumgaertner, Bert O.; Tyson, Rebecca T.; Krone, Stephen M.

    2016-01-01

    Opinions are rarely binary; they can be held with different degrees of conviction, and this expanded attitude spectrum can affect the influence one opinion has on others. Our goal is to understand how different aspects of influence lead to recognizable spatio-temporal patterns of opinions and their strengths. To do this, we introduce a stochastic spatial agent-based model of opinion dynamics that includes a spectrum of opinion strengths and various possible rules for how the opinion strength of one individual affects the influence that this individual has on others. Through simulations, we find that even a small amount of amplification of opinion strength through interaction with like-minded neighbors can tip the scales in favor of polarization and deadlock. PMID:28529381
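    A stochastic spatial opinion model with strengths can be sketched as follows. This is an illustrative toy, not the authors' model: opinions take values -2, -1, +1, +2 (sign = side, magnitude = strength), the amplification probability and the "strong converts weak" rule are assumptions chosen to show the reinforcement mechanism.

```python
import random

def simulate_opinions(L=30, steps=20000, amplify=0.2, seed=5):
    """Toy spatial opinion dynamics on an L x L torus. A random agent meets a
    random neighbor: like-minded pairs may strengthen the agent's opinion;
    an opposing stronger opinion overrides a weaker one."""
    rng = random.Random(seed)
    grid = [[rng.choice([-1, 1]) for _ in range(L)] for _ in range(L)]
    for _ in range(steps):
        x, y = rng.randrange(L), rng.randrange(L)
        dx, dy = rng.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
        a, b = grid[x][y], grid[(x + dx) % L][(y + dy) % L]
        if a * b > 0 and rng.random() < amplify:
            grid[x][y] = 2 if a > 0 else -2      # reinforcement by a like-minded neighbor
        elif a * b < 0 and abs(b) > abs(a):
            grid[x][y] = 1 if b > 0 else -1      # strong opposing opinion converts a weak one
    return grid

final = simulate_opinions()
```

    Raising `amplify` makes entrenched (±2) opinions accumulate along domain boundaries, which is the deadlock-favoring effect the abstract describes.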

  14. Utilizing remote sensing of Thematic Mapper data to improve our understanding of estuarine processes and their influence on the productivity of estuarine-dependent fisheries

    NASA Technical Reports Server (NTRS)

    Browder, J. A.; May, L. N., Jr.; Rosenthal, A.; Baumann, R. H.; Gosselink, J. G.

    1986-01-01

    LANDSAT thematic mapper (TM) data are being used to refine and validate a stochastic spatial computer model to be applied to coastal resource management problems in Louisiana. Two major aspects of the research are: (1) the measurement of area of land (or emergent vegetation) and water and the length of the interface between land and water in TM imagery of selected coastal wetlands (sample marshes); and (2) the comparison of spatial patterns of land and water in the sample marshes of the imagery to that in marshes simulated by a computer model. In addition to activities in these two areas, the potential use of a published autocorrelation statistic is analyzed.

  15. Stochastic Analysis of Reaction–Diffusion Processes

    PubMed Central

    Hu, Jifeng; Kang, Hye-Won

    2013-01-01

    Reaction and diffusion processes are used to model chemical and biological processes over a wide range of spatial and temporal scales. Several routes to the diffusion process at various levels of description in time and space are discussed and the master equation for spatially discretized systems involving reaction and diffusion is developed. We discuss an estimator for the appropriate compartment size for simulating reaction–diffusion systems and introduce a measure of fluctuations in a discretized system. We then describe a new computational algorithm for implementing a modified Gillespie method for compartmental systems in which reactions are aggregated into equivalence classes and computational cells are searched via an optimized tree structure. Finally, we discuss several examples that illustrate the issues that have to be addressed in general systems. PMID:23719732
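    The spatially discretized (compartment-based) master equation described above can be simulated exactly with the Gillespie algorithm. The sketch below, a simplified illustration rather than the authors' optimized tree-search implementation, treats one species on a 1-D chain of compartments with hopping and degradation; all rates are assumed values.

```python
import random

def rdme_ssa(n_comp=10, n0=100, d=1.0, k=0.1, t_end=5.0, seed=9):
    """Gillespie SSA for a 1-D reaction-diffusion master equation: molecules
    hop between adjacent compartments (rate d per molecule per direction)
    and degrade (rate k per molecule)."""
    rng = random.Random(seed)
    x = [0] * n_comp
    x[0] = n0                      # all molecules start in the first compartment
    t = 0.0
    while True:
        # Propensities per compartment: degradation / hop left / hop right.
        rates = []
        for i in range(n_comp):
            rates += [k * x[i],
                      d * x[i] if i > 0 else 0.0,
                      d * x[i] if i < n_comp - 1 else 0.0]
        total = sum(rates)
        if total == 0.0:
            break                  # all molecules degraded
        t += rng.expovariate(total)
        if t > t_end:
            break
        r = rng.random() * total   # pick the next event proportionally to its rate
        for idx, a in enumerate(rates):
            r -= a
            if r < 0.0:
                break
        i, kind = divmod(idx, 3)
        x[i] -= 1
        if kind == 1:
            x[i - 1] += 1          # hop left
        elif kind == 2:
            x[i + 1] += 1          # hop right (kind 0 = degradation)
    return x

state = rdme_ssa()
```

    The linear scan over `rates` is the O(n) step that the paper's optimized tree structure replaces with an O(log n) search.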

  16. TTLEM: Open access tool for building numerically accurate landscape evolution models in MATLAB

    NASA Astrophysics Data System (ADS)

    Campforts, Benjamin; Schwanghart, Wolfgang; Govers, Gerard

    2017-04-01

    Despite a growing interest in LEMs, accuracy assessment of the numerical methods they are based on has received little attention. Here, we present TTLEM, an open-access landscape evolution package designed to develop and test your own scenarios and hypotheses. TTLEM uses a higher-order flux-limiting finite-volume method to simulate river incision and tectonic displacement. We show that this scheme significantly influences the evolution of simulated landscapes and the spatial and temporal variability of erosion rates. Moreover, it allows the simulation of lateral tectonic displacement on a fixed grid. Through the use of a simple GUI, the software produces visible output of evolving landscapes throughout model run time. In this contribution, we illustrate numerical landscape evolution through a set of movies spanning different spatial and temporal scales. We focus on the erosional domain and use both spatially constant and variable input values for uplift, lateral tectonic shortening, erodibility and precipitation. Moreover, we illustrate the relevance of a stochastic approach for realistic hillslope response modelling. TTLEM is a fully open-source software package, written in MATLAB and based on the TopoToolbox platform (topotoolbox.wordpress.com). Installation instructions can be found on this website and in the dedicated GitHub repository.

  17. GillesPy: A Python Package for Stochastic Model Building and Simulation.

    PubMed

    Abel, John H; Drawert, Brian; Hellander, Andreas; Petzold, Linda R

    2016-09-01

    GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithms (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy to understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community.
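    GillesPy delegates simulation to StochKit2's implementations of the Gillespie SSA. To show what that underlying algorithm does, here is a from-scratch direct-method SSA for a simple birth-death process; this is an illustrative sketch of the algorithm, not the GillesPy API, and the rate constants are assumed values.

```python
import random

def ssa_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=100.0, seed=2):
    """Gillespie direct method for 0 -> X (rate k_birth) and X -> 0 (rate
    k_death * x); the stationary mean copy number is k_birth / k_death."""
    rng = random.Random(seed)
    x, t, trace = x0, 0.0, []
    while t < t_end:
        a1, a2 = k_birth, k_death * x        # reaction propensities
        total = a1 + a2
        t += rng.expovariate(total)          # exponential waiting time
        if t >= t_end:
            break
        x += 1 if rng.random() * total < a1 else -1
        trace.append((t, x))
    return x, trace

x_final, trace = ssa_birth_death()
```

    A package like GillesPy wraps exactly this loop (propensity evaluation, exponential waiting time, proportional reaction selection) behind a model-building interface.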

  18. GillesPy: A Python Package for Stochastic Model Building and Simulation

    PubMed Central

    Abel, John H.; Drawert, Brian; Hellander, Andreas; Petzold, Linda R.

    2017-01-01

    GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithms (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy to understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community. PMID:28630888

  19. Water quality modeling in the dead end sections of drinking water distribution networks.

    PubMed

    Abokifa, Ahmed A; Yang, Y Jeffrey; Lo, Cynthia S; Biswas, Pratim

    2016-02-01

    Dead-end sections of drinking water distribution networks are known to be problematic zones in terms of water quality degradation. Extended residence time due to water stagnation leads to rapid reduction of disinfectant residuals allowing the regrowth of microbial pathogens. Water quality models developed so far apply spatial aggregation and temporal averaging techniques for hydraulic parameters by assigning hourly averaged water demands to the main nodes of the network. Although this practice has generally resulted in minimal loss of accuracy for the predicted disinfectant concentrations in main water transmission lines, this is not the case for the peripheries of the distribution network. This study proposes a new approach for simulating disinfectant residuals in dead end pipes while accounting for both spatial and temporal variability in hydraulic and transport parameters. A stochastic demand generator was developed to represent residential water pulses based on a non-homogenous Poisson process. Dispersive solute transport was considered using highly dynamic dispersion rates. A genetic algorithm was used to calibrate the axial hydraulic profile of the dead-end pipe based on the different demand shares of the withdrawal nodes. A parametric sensitivity analysis was done to assess the model performance under variation of different simulation parameters. A group of Monte-Carlo ensembles was carried out to investigate the influence of spatial and temporal variations in flow demands on the simulation accuracy. A set of three correction factors were analytically derived to adjust residence time, dispersion rate and wall demand to overcome simulation error caused by spatial aggregation approximation. The current model results show better agreement with field-measured concentrations of conservative fluoride tracer and free chlorine disinfectant than the simulations of recent advection dispersion reaction models published in the literature. 
The accuracy of the simulated concentration profiles showed significantly stronger dependence on the spatial distribution of the flow demands than on their temporal variation. Copyright © 2015 Elsevier Ltd. All rights reserved.
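    Residential demand pulses from a non-homogeneous Poisson process can be sampled by thinning: generate candidate arrivals at a constant majorizing rate, then accept each with probability λ(t)/λmax. The diurnal intensity function below is a hypothetical stand-in for the paper's calibrated demand generator.

```python
import math, random

def thinning(rate_fn, rate_max, t_end, seed=4):
    """Sample arrival times of a non-homogeneous Poisson process on [0, t_end)
    by thinning a homogeneous process of rate rate_max (>= rate_fn everywhere)."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(rate_max)       # candidate arrival
        if t >= t_end:
            return arrivals
        if rng.random() < rate_fn(t) / rate_max:
            arrivals.append(t)               # accepted demand pulse

# Hypothetical diurnal demand intensity (pulses/hour) with two daily peaks.
def demand_rate(t):
    return 2.0 + 1.5 * math.sin(math.pi * (t % 24.0) / 12.0) ** 2

pulses = thinning(demand_rate, rate_max=3.5, t_end=24.0)
```

    Each accepted time would then be assigned a pulse duration and intensity to build the highly intermittent flow series that drives the dead-end hydraulics.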

  20. Functional Wigner representation of quantum dynamics of Bose-Einstein condensate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Opanchuk, B.; Drummond, P. D.

    2013-04-15

    We develop a method of simulating the full quantum field dynamics of multi-mode multi-component Bose-Einstein condensates in a trap. We use the truncated Wigner representation to obtain a probabilistic theory that can be sampled. This method produces c-number stochastic equations which may be solved using conventional stochastic methods. The technique is valid for large mode occupation numbers. We give a detailed derivation of methods of functional Wigner representation appropriate for quantum fields. Our approach describes spatial evolution of spinor components and properly accounts for nonlinear losses. Such techniques are applicable to calculating the leading quantum corrections, including effects such as quantum squeezing, entanglement, EPR correlations, and interactions with engineered nonlinear reservoirs. By using a consistent expansion in the inverse density, we are able to explain an inconsistency in the nonlinear loss equations found by earlier authors.

  1. Epidemic patch models applied to pandemic influenza: contact matrix, stochasticity, robustness of predictions.

    PubMed

    Lunelli, Antonella; Pugliese, Andrea; Rizzo, Caterina

    2009-07-01

    Due to the recent emergence of H5N1 virus, the modelling of pandemic influenza has become a relevant issue. Here we present an SEIR model formulated to simulate a possible outbreak in Italy, analysing its structure and, more generally, the effect of including specific details into a model. These details regard population heterogeneities, such as age and spatial distribution, as well as stochasticity, that regulates the epidemic dynamics when the number of infectives is low. We discuss and motivate the specific modelling choices made when building the model and investigate how the model details influence the predicted dynamics. Our analysis may help in deciding which elements of complexity are worth including in the design of a deterministic model for pandemic influenza, in a balance between, on the one hand, keeping the model computationally efficient and the number of parameters low and, on the other hand, maintaining the necessary realistic features.
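    The stochastic regime the abstract mentions, where chance dominates while the number of infectives is low, is easy to demonstrate with an event-driven SEIR simulation. This is a generic single-patch Gillespie sketch, not the paper's spatially structured Italian model; population size and rates are assumed values.

```python
import random

def seir_gillespie(N=10000, I0=10, beta=0.5, sigma=0.25, gamma=0.2, seed=8):
    """Stochastic SEIR via the Gillespie algorithm. Events: infection
    (rate beta*S*I/N), progression E->I (sigma*E), recovery I->R (gamma*I)."""
    rng = random.Random(seed)
    S, E, I, R = N - I0, 0, I0, 0
    t = 0.0
    while E + I > 0:
        a = (beta * S * I / N, sigma * E, gamma * I)
        total = sum(a)
        t += rng.expovariate(total)
        r = rng.random() * total
        if r < a[0]:
            S -= 1; E += 1          # new exposure
        elif r < a[0] + a[1]:
            E -= 1; I += 1          # becomes infectious
        else:
            I -= 1; R += 1          # recovery
    return S, R, t

S_end, R_end, t_final = seir_gillespie()
```

    With few initial infectives, repeated runs split between early extinction and a major outbreak, which is precisely the behavior a purely deterministic SEIR model cannot reproduce.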

  2. Stochastic modeling of mode interactions via linear parabolized stability equations

    NASA Astrophysics Data System (ADS)

    Ran, Wei; Zare, Armin; Hack, M. J. Philipp; Jovanovic, Mihailo

    2017-11-01

    Low-complexity approximations of the Navier-Stokes equations have been widely used in the analysis of wall-bounded shear flows. In particular, the parabolized stability equations (PSE) and Floquet theory have been employed to capture the evolution of primary and secondary instabilities in spatially-evolving flows. We augment linear PSE with Floquet analysis to formally treat modal interactions and the evolution of secondary instabilities in the transitional boundary layer via a linear progression. To this end, we leverage Floquet theory by incorporating the primary instability into the base flow and accounting for different harmonics in the flow state. A stochastic forcing is introduced into the resulting linear dynamics to model the effect of nonlinear interactions on the evolution of modes. We examine the H-type transition scenario to demonstrate how our approach can be used to model nonlinear effects and capture the growth of the fundamental and subharmonic modes observed in direct numerical simulations and experiments.

  3. Stochastic models for inferring genetic regulation from microarray gene expression data.

    PubMed

    Tian, Tianhai

    2010-03-01

    Microarray expression profiles are inherently noisy and many different sources of variation exist in microarray experiments. It is still a significant challenge to develop stochastic models that represent the noise in microarray expression profiles, which has a profound influence on the reverse engineering of genetic regulation. Using the target genes of the tumour suppressor gene p53 as the test problem, we developed stochastic differential equation models and established the relationship between the noise strength of stochastic models and the parameters of an error model for describing the distribution of the microarray measurements. Numerical results indicate that the simulated variance from stochastic models with a stochastic degradation process can be represented by a monomial in terms of the hybridization intensity, and the order of the monomial depends on the type of stochastic process. The developed stochastic models with multiple stochastic processes generated simulations whose variance is consistent with the prediction of the error model. This work also established a general method to develop stochastic models from experimental information. 2009 Elsevier Ireland Ltd. All rights reserved.
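    A stochastic differential equation of this kind is typically simulated with the Euler-Maruyama scheme. The sketch below integrates a toy expression model with constant synthesis and stochastic (square-root) degradation noise; the equation form and all rate constants are illustrative assumptions, not the paper's fitted p53-target models.

```python
import math, random

def euler_maruyama(k_syn=2.0, k_deg=0.1, x0=20.0, dt=0.01, t_end=50.0, seed=6):
    """Euler-Maruyama for dX = (k_syn - k_deg*X) dt + sqrt(k_deg*X) dW,
    a diffusion approximation of expression with stochastic degradation."""
    rng = random.Random(seed)
    x, path = x0, []
    for _ in range(int(t_end / dt)):
        noise = math.sqrt(max(x, 0.0) * k_deg) * rng.gauss(0.0, math.sqrt(dt))
        x = max(0.0, x + (k_syn - k_deg * x) * dt + noise)   # reflect at zero
        path.append(x)
    return path

path = euler_maruyama()
```

    Repeating such runs and regressing the sample variance against the mean intensity is the kind of procedure that relates the SDE's noise strength to an empirical error model.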

  4. Improved Climate Simulations through a Stochastic Parameterization of Ocean Eddies

    NASA Astrophysics Data System (ADS)

    Williams, Paul; Howe, Nicola; Gregory, Jonathan; Smith, Robin; Joshi, Manoj

    2017-04-01

    In climate simulations, the impacts of the subgrid scales on the resolved scales are conventionally represented using deterministic closure schemes, which assume that the impacts are uniquely determined by the resolved scales. Stochastic parameterization relaxes this assumption, by sampling the subgrid variability in a computationally inexpensive manner. This study shows that the simulated climatological state of the ocean is improved in many respects by implementing a simple stochastic parameterization of ocean eddies into a coupled atmosphere-ocean general circulation model. Simulations from a high-resolution, eddy-permitting ocean model are used to calculate the eddy statistics needed to inject realistic stochastic noise into a low-resolution, non-eddy-permitting version of the same model. A suite of four stochastic experiments is then run to test the sensitivity of the simulated climate to the noise definition by varying the noise amplitude and decorrelation time within reasonable limits. The addition of zero-mean noise to the ocean temperature tendency is found to have a nonzero effect on the mean climate. Specifically, in terms of the ocean temperature and salinity fields both at the surface and at depth, the noise reduces many of the biases in the low-resolution model and causes it to more closely resemble the high-resolution model. The variability of the strength of the global ocean thermohaline circulation is also improved. It is concluded that stochastic ocean perturbations can yield reductions in climate model error that are comparable to those obtained by refining the resolution, but without the increased computational cost. Therefore, stochastic parameterizations of ocean eddies have the potential to significantly improve climate simulations. Reference: Williams PD, Howe NJ, Gregory JM, Smith RS and Joshi MM (2016) Improved Climate Simulations through a Stochastic Parameterization of Ocean Eddies. Journal of Climate, 29, 8763-8781. http://dx.doi.org/10.1175/JCLI-D-15-0746.1
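The headline finding, that adding zero-mean noise changes the mean state, is a generic property of nonlinear dynamics and can be demonstrated on a one-variable toy model (no relation to the ocean model itself; the cubic drift and noise amplitude are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

def stationary_mean(sigma, n_paths=4000, dt=0.005, t_end=50.0):
    """Long-time ensemble mean of dx = (1 - x**3) dt + sigma dW.
    The deterministic fixed point is x = 1; because the drift is nonlinear,
    zero-mean noise shifts the time-mean state away from it."""
    x = np.ones(n_paths)
    for _ in range(int(t_end / dt)):
        x += (1.0 - x**3) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    return x.mean()

print(stationary_mean(0.0))   # no noise: the system sits on the fixed point
print(stationary_mean(0.5))   # zero-mean noise: the mean state moves below 1
```

The shift arises because E[f(x)] differs from f(E[x]) for nonlinear f, which is exactly why deterministic closures and stochastic parameterizations can produce different mean climates.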

  5. Oscillatory regulation of Hes1: Discrete stochastic delay modelling and simulation.

    PubMed

    Barrio, Manuel; Burrage, Kevin; Leier, André; Tian, Tianhai

    2006-09-08

    Discrete stochastic simulations are a powerful tool for understanding the dynamics of chemical kinetics when there are small-to-moderate numbers of certain molecular species. In this paper we introduce delays into the stochastic simulation algorithm, thus mimicking delays associated with transcription and translation. We then show that this process may well explain more faithfully than continuous deterministic models the observed sustained oscillations in expression levels of hes1 mRNA and Hes1 protein.
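A minimal sketch of the delayed stochastic simulation idea (delayed production of a single species with first-order decay; the parameters are arbitrary, and the actual Hes1 system adds delayed negative feedback):

```python
import heapq
import random

def delay_ssa(k=20.0, d=1.0, tau=2.0, t_end=50.0, seed=1):
    """Minimal delayed SSA: production fires at rate k but its product only
    appears tau time units later (mimicking transcription/translation delay);
    degradation X -> 0 at rate d*X is immediate. Returns X at t_end."""
    random.seed(seed)
    t, x = 0.0, 0
    pending = []                      # completion times of delayed productions
    while t < t_end:
        a = k + d * x                 # total propensity
        t_next = t + random.expovariate(a)
        # if a delayed product completes before the next reaction, release it
        # and redraw the waiting time (valid by memorylessness)
        if pending and pending[0] <= t_next:
            t = heapq.heappop(pending)
            x += 1
            continue
        t = t_next
        if random.random() < k / a:   # production: schedule delayed completion
            heapq.heappush(pending, t + tau)
        else:                         # degradation: instantaneous
            x -= 1
    return x

print(delay_ssa())
```

For this linear example the delay only shifts the dynamics (stationary mean k/d); it is the combination of delay with feedback that sustains the Hes1 oscillations described above.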

  6. Stochastic effects in a seasonally forced epidemic model

    NASA Astrophysics Data System (ADS)

    Rozhnova, G.; Nunes, A.

    2010-10-01

    The interplay of seasonality, the system's nonlinearities and intrinsic stochasticity is studied for a seasonally forced susceptible-exposed-infective-recovered stochastic model. The model is explored in the parameter region that corresponds to childhood infectious diseases such as measles. The power spectrum of the stochastic fluctuations around the attractors of the deterministic system that describes the model in the thermodynamic limit is computed analytically and validated by stochastic simulations for large system sizes. Size effects are studied through additional simulations. Other effects often mentioned in the literature as playing an important role in the dynamics of childhood infectious diseases, such as stochasticity-induced switching between coexisting attractors, are also investigated. The main conclusion is that stochastic amplification, rather than these effects, is the key ingredient for understanding the observed incidence patterns.

  7. Assessment and visualization of uncertainty for countrywide soil organic matter map of Hungary using local entropy

    NASA Astrophysics Data System (ADS)

    Szatmári, Gábor; Pásztor, László

    2016-04-01

    Uncertainty is a general term expressing our imperfect, but acknowledged, knowledge in describing an environmental process (Bárdossy and Fodor, 2004). Sampling, laboratory measurements, models and so on are all subject to uncertainty. Effective quantification and visualization of uncertainty is indispensable to stakeholders (e.g. policy makers, society). Soil-related features and their spatial models should be a particular target of uncertainty assessment, because inferences drawn from them are further used in modelling and decision making. The aim of our present study was to assess and effectively visualize the local uncertainty of the countrywide soil organic matter (SOM) spatial distribution model of Hungary using geostatistical tools and concepts. The Hungarian Soil Information and Monitoring System's SOM data (approximately 1,200 observations) and environmentally related, spatially exhaustive secondary information (i.e. digital elevation model, climatic maps, MODIS satellite images and geological map) were used to model the countrywide SOM spatial distribution by regression kriging. It would be common to use the calculated estimation (or kriging) variance as a measure of uncertainty; however, the normality and homoscedasticity hypotheses had to be rejected according to our preliminary analysis of the data. Therefore, a normal score transformation and a sequential stochastic simulation approach were introduced to model and assess the local uncertainty. Five hundred equally probable realizations (i.e. stochastic images) were generated. This number of stochastic images is sufficient to provide a model of uncertainty at each location, which is a complete description of uncertainty in geostatistics (Deutsch and Journel, 1998). Furthermore, these models can be applied, for example, to contour the probability of any event; such maps can be regarded as goal-oriented digital soil maps and are of interest for agricultural management and decision making as well.
A standardized measure of the local entropy was used to visualize uncertainty, where entropy values close to 1 correspond to high uncertainty, whilst values close to 0 correspond to low uncertainty. The advantage of using the local entropy in this context is that it combines probabilities from multiple realizations into a single number for each location of the model. In conclusion, it is straightforward to use a sequential stochastic simulation approach for the assessment of uncertainty when normality and homoscedasticity are violated. The visualization of uncertainty using the local entropy is effective and communicative to stakeholders because it represents the uncertainty through a single number on a [0, 1] scale. References: Bárdossy, Gy. & Fodor, J., 2004. Evaluation of Uncertainties and Risks in Geology. Springer-Verlag, Berlin Heidelberg. Deutsch, C.V. & Journel, A.G., 1998. GSLIB: Geostatistical Software Library and User's Guide. Oxford University Press, New York. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
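The standardized local entropy computation is simple to sketch on a stack of equally probable realizations (synthetic data below stand in for the 500 SOM realizations; the class edges are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)

def local_entropy(realizations, bins):
    """Standardized local entropy of equally probable realizations.
    realizations: (n_real, n_loc) array; bins: class edges for discretization.
    Values near 0 mean the realizations agree; values near 1 mean maximal spread."""
    classes = np.digitize(realizations, bins)   # class index 0..len(bins) per value
    k = len(bins) + 1                           # number of classes
    h = np.zeros(realizations.shape[1])
    for c in range(k):
        p = (classes == c).mean(axis=0)         # per-location class probability
        h -= p * np.log(np.where(p > 0, p, 1.0))
    return h / np.log(k)                        # standardize to the [0, 1] scale

# Synthetic stand-in for realizations at three locations:
# fully certain, moderately noisy, and close to uniform.
certain = np.full((500, 1), 2.5)
noisy = rng.normal(2.5, 0.7, size=(500, 1))
spread = rng.uniform(0.0, 4.0, size=(500, 1))
h = local_entropy(np.hstack([certain, noisy, spread]), bins=[1.0, 2.0, 3.0])
print(h)   # entropy increases from the certain to the widely spread location
```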

  8. Space time modelling of air quality for environmental-risk maps: A case study in South Portugal

    NASA Astrophysics Data System (ADS)

    Soares, Amilcar; Pereira, Maria J.

    2007-10-01

    Since the 1960s, there has been a strong industrial development in the Sines area, on the southern Atlantic coast of Portugal, including the construction of an important industrial harbour and of, mainly, petrochemical and energy-related industries. These industries are, nowadays, responsible for substantial emissions of SO2, NOx, particles, VOCs and part of the ozone polluting the atmosphere. The major industries are spatially concentrated in a restricted area, very close to populated areas and natural resources such as those protected by the European Natura 2000 network. Air quality parameters are measured at the emissions' sources and at a few monitoring stations. Although air quality parameters are measured on an hourly basis, the lack of representativeness in space of these non-homogeneous phenomena makes even their representativeness in time questionable. Hence, in this study, the regional spatial dispersion of contaminants is also evaluated, using diffusive-sampler (Radiello Passive Sampler) campaigns during given periods. Diffusive samplers cover the entire space extensively, but just for a limited period of time. In the first step of this study, a space-time model of the pollutants was built, based on stochastic simulation (direct sequential simulation) with a local spatial trend. The spatial dispersion of the contaminants for a given period of time (corresponding to the exposure time of the diffusive samplers) was computed by ordinary kriging. Direct sequential simulation was applied to produce equiprobable spatial maps for each day of that period, using the kriged map as a spatial trend and the daily measurements of pollutants from the monitoring stations as hard data. In the second step, the following environmental risk and cost maps were computed from the set of simulated realizations of pollutants: (i) maps of the contribution of each emission to the pollutant concentration at any spatial location; (ii) costs of badly located monitoring stations.

  9. Stochastic modelling of microstructure formation in solidification processes

    NASA Astrophysics Data System (ADS)

    Nastac, Laurentiu; Stefanescu, Doru M.

    1997-07-01

    To relax many of the assumptions used in continuum approaches, a general stochastic model has been developed. The stochastic model can be used not only for an accurate description of the fraction of solid evolution, and therefore accurate cooling curves, but also for simulation of microstructure formation in castings. The advantage of using the stochastic approach is to give a time- and space-dependent description of solidification processes. Time- and space-dependent processes can also be described by partial differential equations. Unlike a differential formulation which, in most cases, has to be transformed into a difference equation and solved numerically, the stochastic approach is essentially a direct numerical algorithm. The stochastic model is comprehensive, since the competition between various phases is considered. Furthermore, grain impingement is directly included through the structure of the model. In the present research, all grain morphologies are simulated with this procedure. The relevance of the stochastic approach is that the simulated microstructures can be directly compared with microstructures obtained from experiments. The computer becomes a 'dynamic metallographic microscope'. A comparison between deterministic and stochastic approaches has been performed. An important objective of this research was to answer the following general questions: (1) 'Would fully deterministic approaches continue to be useful in solidification modelling?' and (2) 'Would stochastic algorithms be capable of entirely replacing purely deterministic models?'
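The flavor of such stochastic microstructure simulation can be conveyed by a toy cellular automaton: random nucleation followed by probabilistic growth into the liquid, with growth stopping where grains impinge (a sketch only, not the authors' model; grid size, nucleus count and capture probability are arbitrary):

```python
import random

random.seed(5)

def grain_growth(size=40, n_nuclei=8, steps=100):
    """Toy CA for stochastic solidification: 0 = liquid, >0 = grain id.
    Nuclei appear at random sites; each step, a liquid cell may be captured
    by a randomly chosen neighboring grain. Impingement is automatic, since
    solid cells never change grain."""
    grid = [[0] * size for _ in range(size)]
    for gid in range(1, n_nuclei + 1):                 # random nucleation sites
        grid[random.randrange(size)][random.randrange(size)] = gid
    for _ in range(steps):
        updates = []
        for i in range(size):
            for j in range(size):
                if grid[i][j] == 0:
                    nbrs = [grid[(i + di) % size][(j + dj) % size]
                            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
                    grains = [g for g in nbrs if g > 0]
                    # stochastic capture of a liquid cell by a neighboring grain
                    if grains and random.random() < 0.5:
                        updates.append((i, j, random.choice(grains)))
        for i, j, g in updates:                        # synchronous update
            grid[i][j] = g
    return grid

grid = grain_growth()
solid = sum(cell > 0 for row in grid for cell in row)
print(solid, "of", 40 * 40, "cells solidified")
```

Rendering the grain ids as colors gives exactly the kind of simulated micrograph that can be compared side by side with an experimental one.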

  10. Stochastic and deterministic models for agricultural production networks.

    PubMed

    Bai, P; Banks, H T; Dediu, S; Govan, A Y; Last, M; Lloyd, A L; Nguyen, H K; Olufsen, M S; Rempala, G; Slenning, B D

    2007-07-01

    An approach to modeling the impact of disturbances in an agricultural production network is presented. A stochastic model and its approximate deterministic model for averages over sample paths of the stochastic system are developed. Simulations, sensitivity and generalized sensitivity analyses are given. Finally, it is shown how diseases may be introduced into the network and corresponding simulations are discussed.

  11. Coupling all-atom molecular dynamics simulations of ions in water with Brownian dynamics.

    PubMed

    Erban, Radek

    2016-02-01

    Molecular dynamics (MD) simulations of ions (K+, Na+, Ca2+ and Cl-) in aqueous solutions are investigated. Water is described using the SPC/E model. A stochastic coarse-grained description for ion behaviour is presented and parametrized using MD simulations. It is given as a system of coupled stochastic and ordinary differential equations, describing the ion position, velocity and acceleration. The stochastic coarse-grained model provides an intermediate description between all-atom MD simulations and Brownian dynamics (BD) models. It is used to develop a multiscale method which uses all-atom MD simulations in parts of the computational domain and (less detailed) BD simulations in the remainder of the domain.
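The intermediate, coarse-grained level of description can be illustrated with a plain one-dimensional Langevin equation (made-up friction and temperature, and simpler than the paper's coupled position-velocity-acceleration system):

```python
import numpy as np

rng = np.random.default_rng(0)

def langevin_velocities(gamma=1.0, kT=1.0, m=1.0, dt=0.01, n_steps=200_000):
    """Euler-Maruyama integration of underdamped Langevin dynamics,
       m dv = -gamma v dt + sqrt(2 gamma kT) dW,
    the kind of stochastic coarse-graining that sits between all-atom MD
    and overdamped Brownian dynamics. Returns the velocity time series."""
    v = 0.0
    out = np.empty(n_steps)
    noise = rng.standard_normal(n_steps)
    for i in range(n_steps):
        v += (-gamma * v * dt + np.sqrt(2 * gamma * kT * dt) * noise[i]) / m
        out[i] = v
    return out

v = langevin_velocities()
print(v.var())   # equipartition check: the velocity variance should be near kT/m
```

Fitting gamma (and, in the paper's richer model, memory terms) to all-atom MD trajectories is what "parametrized using MD simulations" amounts to in practice.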

  12. Use of LANDSAT images of vegetation cover to estimate effective hydraulic properties of soils

    NASA Technical Reports Server (NTRS)

    Eagleson, Peter S.; Jasinski, Michael F.

    1988-01-01

    This work focuses on the characterization of natural, spatially variable, semivegetated landscapes using a linear, stochastic, canopy-soil reflectance model. A first application of the model was the investigation of the effects of subpixel and regional variability of scenes on the shape and structure of red-infrared scattergrams. Additionally, the model was used to investigate the inverse problem, the estimation of subpixel vegetation cover, given only the scattergrams of simulated satellite scale multispectral scenes. The major aspects of that work, including recent field investigations, are summarized.

  13. Comparing Lagrangian and Eulerian models for CO2 transport - a step towards Bayesian inverse modeling using WRF/STILT-VPRM

    NASA Astrophysics Data System (ADS)

    Pillai, D.; Gerbig, C.; Kretschmer, R.; Beck, V.; Karstens, U.; Neininger, B.; Heimann, M.

    2012-10-01

    We present simulations of atmospheric CO2 concentrations provided by two modeling systems, run at high spatial resolution: the Eulerian-based Weather Research Forecasting (WRF) model and the Lagrangian-based Stochastic Time-Inverted Lagrangian Transport (STILT) model, both of which are coupled to a diagnostic biospheric model, the Vegetation Photosynthesis and Respiration Model (VPRM). The consistency of the simulations is assessed with special attention paid to the details of horizontal as well as vertical transport and mixing of CO2 concentrations in the atmosphere. The dependence of model mismatch (Eulerian vs. Lagrangian) on the models' spatial resolution is further investigated. A case study using airborne measurements, during which the two models showed large deviations from each other, is analyzed in detail as an extreme case. Using aircraft observations and pulse release simulations, we identified differences in the representation of details in the interaction between turbulent mixing and advection through wind shear as the main cause of discrepancies between WRF and STILT transport at spatial resolutions of 2 and 6 km. Based on observations and inter-model comparisons of atmospheric CO2 concentrations, we show that a refinement of the parameterization of turbulent velocity variance and Lagrangian time-scale in STILT is needed to achieve a better match between the Eulerian and the Lagrangian transport at such high spatial resolutions (e.g. 2 and 6 km). Nevertheless, the inter-model differences in simulated CO2 time series for a tall tower observatory at Ochsenkopf in Germany are about a factor of two smaller than the model-data mismatch and about a factor of three smaller than the mismatch between the current global model simulations and the data.

  14. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    NASA Astrophysics Data System (ADS)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk consideration presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach reduces computational complexity by computing the coefficients of a parametric representation of a stochastic dynamic strategy based on static optimization. Constraints can be handled similarly, using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and the Conditional Value-at-Risk (CVaR).
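For reference, the CVaR objective mentioned here is just the expected cost in the worst tail of the simulated cost distribution; a minimal Monte Carlo estimate (with a made-up Gaussian cost distribution, not an execution model) looks like:

```python
import numpy as np

rng = np.random.default_rng(123)

def cvar(losses, alpha=0.95):
    """Conditional Value-at-Risk: mean loss in the worst (1 - alpha) tail,
    estimated empirically from simulated losses."""
    var = np.quantile(losses, alpha)       # Value-at-Risk threshold
    return losses[losses >= var].mean()    # average of the tail beyond it

# Hypothetical execution costs from a Monte Carlo of some candidate strategy.
costs = rng.normal(100.0, 10.0, size=100_000)
print(costs.mean(), cvar(costs))
```

In the parametric approach of the abstract, such a tail average (or a penalized version of it) becomes the objective evaluated per candidate coefficient vector.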

  15. Practical Unitary Simulator for Non-Markovian Complex Processes

    NASA Astrophysics Data System (ADS)

    Binder, Felix C.; Thompson, Jayne; Gu, Mile

    2018-06-01

    Stochastic processes are as ubiquitous throughout the quantitative sciences as they are notorious for being difficult to simulate and predict. In this Letter, we propose a unitary quantum simulator for discrete-time stochastic processes which requires less internal memory than any classical analogue throughout the simulation. The simulator's internal memory requirements equal those of the best previous quantum models. However, in contrast to previous models, it only requires a (small) finite-dimensional Hilbert space. Moreover, since the simulator operates unitarily throughout, it avoids any unnecessary information loss. We provide a stepwise construction of simulators for a large class of stochastic processes, hence directly opening the possibility of experimental implementation with current platforms for quantum computation. The results are illustrated for an example process.

  16. Approximate Bayesian computation for spatial SEIR(S) epidemic models.

    PubMed

    Brown, Grant D; Porter, Aaron T; Oleson, Jacob J; Hinman, Jessica A

    2018-02-01

    Approximate Bayesian Computation (ABC) provides an attractive approach to estimation in complex Bayesian inferential problems for which evaluation of the kernel of the posterior distribution is impossible or computationally expensive. These highly parallelizable techniques have been successfully applied to many fields, particularly in cases where more traditional approaches such as Markov chain Monte Carlo (MCMC) are impractical. In this work, we demonstrate the application of approximate Bayesian inference to spatially heterogeneous Susceptible-Exposed-Infectious-Removed (SEIR) stochastic epidemic models. These models have a tractable posterior distribution, yet MCMC techniques become computationally infeasible for moderately sized problems. We discuss the practical implementation of these techniques via the open source ABSEIR package for R. The performance of ABC relative to traditional MCMC methods in a small problem is explored under simulation, as well as in the spatially heterogeneous context of the 2014 epidemic of Chikungunya in the Americas. Copyright © 2017 Elsevier Ltd. All rights reserved.
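The core ABC rejection loop is short enough to sketch end to end. Below, a Reed-Frost chain-binomial epidemic stands in for the ABSEIR spatial SEIR models (all parameters, the synthetic "observed" final size, and the tolerance are made up for illustration):

```python
import random

random.seed(7)

def final_size(p, n=100, i0=5):
    """Final size of a Reed-Frost chain-binomial epidemic: each infective
    independently infects each susceptible with probability p per generation."""
    s, i, total = n - i0, i0, i0
    while i > 0:
        p_inf = 1.0 - (1.0 - p) ** i                 # P(a susceptible is infected)
        new_i = sum(random.random() < p_inf for _ in range(s))
        s, i, total = s - new_i, new_i, total + new_i
    return total

observed = 80                                        # synthetic observed final size
accepted = []
while len(accepted) < 200:
    p = random.uniform(0.0, 0.05)                    # draw from a flat prior
    if abs(final_size(p) - observed) <= 5:           # keep draws matching the data
        accepted.append(p)
print(sum(accepted) / len(accepted))                 # approximate posterior mean
```

The accepted draws approximate the posterior of the transmission probability; the same recipe scales to richer simulators precisely because only forward simulation, never likelihood evaluation, is required.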

  17. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    DOE PAGES

    Hagos, Samson; Feng, Zhe; Plant, Robert S.; ...

    2018-02-20

    A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach to constructing a master equation for representing the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as the STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces the observed behavior of convective cell populations and CPM simulated cloud-base mass flux variability under diurnally varying forcing. Finally, in addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.
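The master-equation backbone of such a framework can be illustrated with a minimal Gillespie simulation of a cell count with constant spawning and per-cell decay (arbitrary rates, not STOMP's size-resolved growth/decay probabilities):

```python
import random

random.seed(3)

def gillespie_cell_count(birth=5.0, death=0.5, t_end=2000.0):
    """Gillespie simulation of a birth-death master equation for a cell
    count n: new cells spawn at constant rate `birth`, each existing cell
    decays at rate `death`. Returns the time-averaged count; the stationary
    law of this process is Poisson with mean birth/death."""
    t, n = 0.0, 0
    t_acc, n_acc = 0.0, 0.0
    while t < t_end:
        rate = birth + death * n          # total event rate in state n
        dt = random.expovariate(rate)
        t_acc += dt
        n_acc += n * dt                   # accumulate the time average of n
        t += dt
        if random.random() < birth / rate:
            n += 1                        # a new convective cell appears
        else:
            n -= 1                        # an existing cell decays
    return n_acc / t_acc

avg = gillespie_cell_count()
print(avg)   # time-averaged cell count, near birth/death = 10
```

STOMP replaces these constant rates with state-dependent growth/decay probabilities and attaches a mass flux to each cell size, which is where the recharge-discharge behavior comes from.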

  18. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    NASA Astrophysics Data System (ADS)

    Hagos, Samson; Feng, Zhe; Plant, Robert S.; Houze, Robert A.; Xiao, Heng

    2018-02-01

    A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach to constructing a master equation for representing the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as the STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces the observed behavior of convective cell populations and CPM simulated cloud-base mass flux variability under diurnally varying forcing. In addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.

  19. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagos, Samson; Feng, Zhe; Plant, Robert S.

    A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The approach used follows the non-equilibrium statistical mechanical approach through a master equation. The aim is to represent the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as the STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and mass flux is a non-linear function of convective cell area, mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces the observed behavior of convective cell populations and CPM simulated mass flux variability under diurnally varying forcing. Besides its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to be capable of providing alternative non-equilibrium closure formulations for spectral mass flux parameterizations.

  20. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagos, Samson; Feng, Zhe; Plant, Robert S.

    A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach to constructing a master equation for representing the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as the STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces the observed behavior of convective cell populations and CPM simulated cloud-base mass flux variability under diurnally varying forcing. Finally, in addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.

  1. Probabilistic models and uncertainty quantification for the ionization reaction rate of atomic Nitrogen

    NASA Astrophysics Data System (ADS)

    Miki, K.; Panesi, M.; Prudencio, E. E.; Prudhomme, S.

    2012-05-01

    The objective in this paper is to analyze some stochastic models for estimating the ionization reaction rate constant of atomic Nitrogen (N + e- → N+ + 2e-). Parameters of the models are identified by means of Bayesian inference using spatially resolved absolute radiance data obtained from the Electric Arc Shock Tube (EAST) wind-tunnel. The proposed methodology accounts for uncertainties in the model parameters as well as physical model inadequacies, providing estimates of the rate constant that reflect both types of uncertainties. We present four different probabilistic models by varying the error structure (either additive or multiplicative) and by choosing different descriptions of the statistical correlation among data points. In order to assess the validity of our methodology, we first present some calibration results obtained with manufactured data and then proceed by using experimental data collected at the EAST experimental facility. In order to simulate the radiative signature emitted in the shock-heated air plasma, we use a one-dimensional flow solver with Park's two-temperature model that simulates non-equilibrium effects. We also discuss the implications of the choice of the stochastic model on the estimation of the reaction rate and its uncertainties. Our analysis shows that the stochastic models based on correlated multiplicative errors are the most plausible models among the four models proposed in this study. The rate of the atomic Nitrogen ionization is found to be (6.2 ± 3.3) × 10^11 cm^3 mol^-1 s^-1 at 10,000 K.

  2. Oscillatory Regulation of Hes1: Discrete Stochastic Delay Modelling and Simulation

    PubMed Central

    Barrio, Manuel; Burrage, Kevin; Leier, André; Tian, Tianhai

    2006-01-01

    Discrete stochastic simulations are a powerful tool for understanding the dynamics of chemical kinetics when there are small-to-moderate numbers of certain molecular species. In this paper we introduce delays into the stochastic simulation algorithm, thus mimicking delays associated with transcription and translation. We then show that this process may well explain more faithfully than continuous deterministic models the observed sustained oscillations in expression levels of hes1 mRNA and Hes1 protein. PMID:16965175

  3. Demographic inference under the coalescent in a spatial continuum.

    PubMed

    Guindon, Stéphane; Guo, Hongbin; Welch, David

    2016-10-01

    Understanding population dynamics from the analysis of molecular and spatial data requires sound statistical modeling. Current approaches assume that populations are naturally partitioned into discrete demes, thereby failing to be relevant in cases where individuals are scattered on a spatial continuum. Other models predict the formation of increasingly tight clusters of individuals in space, which, again, conflicts with biological evidence. Building on recent theoretical work, we introduce a new genealogy-based inference framework that alleviates these issues. This approach effectively implements a stochastic model in which the distribution of individuals is homogeneous and stationary, thereby providing a relevant null model for the fluctuation of genetic diversity in time and space. Importantly, the spatial density of individuals in a population and their range of dispersal during the course of evolution are two parameters that can be inferred separately with this method. The validity of the new inference framework is confirmed with extensive simulations and the analysis of influenza sequences collected over five seasons in the USA. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Variance decomposition in stochastic simulators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le Maître, O. P., E-mail: olm@limsi.fr; Knio, O. M., E-mail: knio@duke.edu; Moraes, A., E-mail: alvaro.moraesgutierrez@kaust.edu.sa

    This work aims at the development of a mathematical and computational approach that enables quantification of the inherent sources of stochasticity and of the corresponding sensitivities in stochastic simulations of chemical reaction networks. The approach is based on reformulating the system dynamics as being generated by independent standardized Poisson processes. This reformulation affords a straightforward identification of individual realizations for the stochastic dynamics of each reaction channel, and consequently a quantitative characterization of the inherent sources of stochasticity in the system. By relying on the Sobol-Hoeffding decomposition, the reformulation enables us to perform an orthogonal decomposition of the solution variance. Thus, by judiciously exploiting the inherent stochasticity of the system, one is able to quantify the variance-based sensitivities associated with individual reaction channels, as well as the importance of channel interactions. Implementation of the algorithms is illustrated in light of simulations of simplified systems, including the birth-death, Schlögl, and Michaelis-Menten models.

  5. Fast Quantum Algorithm for Predicting Descriptive Statistics of Stochastic Processes

    NASA Technical Reports Server (NTRS)

    Williams, Colin P.

    1999-01-01

    Stochastic processes are used as a modeling tool in several sub-fields of physics, biology, and finance. Analytic understanding of the long term behavior of such processes is only tractable for very simple types of stochastic processes such as Markovian processes. However, in real world applications more complex stochastic processes often arise. In physics, the complicating factor might be nonlinearities; in biology it might be memory effects; and in finance is might be the non-random intentional behavior of participants in a market. In the absence of analytic insight, one is forced to understand these more complex stochastic processes via numerical simulation techniques. In this paper we present a quantum algorithm for performing such simulations. In particular, we show how a quantum algorithm can predict arbitrary descriptive statistics (moments) of N-step stochastic processes in just O(square root of N) time. That is, the quantum complexity is the square root of the classical complexity for performing such simulations. This is a significant speedup in comparison to the current state of the art.

  6. Multiscale Hy3S: hybrid stochastic simulation for supercomputers.

    PubMed

    Salis, Howard; Sotiropoulos, Vassilios; Kaznessis, Yiannis N

    2006-02-24

    Stochastic simulation has become a useful tool to both study natural biological systems and design new synthetic ones. By capturing the intrinsic molecular fluctuations of "small" systems, these simulations produce a more accurate picture of single cell dynamics, including interesting phenomena missed by deterministic methods, such as noise-induced oscillations and transitions between stable states. However, the computational cost of the original stochastic simulation algorithm can be high, motivating the use of hybrid stochastic methods. Hybrid stochastic methods partition the system into multiple subsets and describe each subset as a different representation, such as a jump Markov, Poisson, continuous Markov, or deterministic process. By applying valid approximations and self-consistently merging disparate descriptions, a method can be considerably faster, while retaining accuracy. In this paper, we describe Hy3S, a collection of multiscale simulation programs. Building on our previous work on developing novel hybrid stochastic algorithms, we have created the Hy3S software package to enable scientists and engineers to both study and design extremely large well-mixed biological systems with many thousands of reactions and chemical species. We have added adaptive stochastic numerical integrators to permit the robust simulation of dynamically stiff biological systems. In addition, Hy3S has many useful features, including embarrassingly parallelized simulations with MPI; special discrete events, such as transcriptional and translation elongation and cell division; mid-simulation perturbations in both the number of molecules of species and reaction kinetic parameters; combinatorial variation of both initial conditions and kinetic parameters to enable sensitivity analysis; use of NetCDF optimized binary format to quickly read and write large datasets; and a simple graphical user interface, written in Matlab, to help users create biological systems and analyze data. 
We demonstrate the accuracy and efficiency of Hy3S with examples, including a large-scale system benchmark and a complex bistable biochemical network with positive feedback. The software itself is open-sourced under the GPL license and is modular, allowing users to modify it for their own purposes. Hy3S is a powerful suite of simulation programs for simulating the stochastic dynamics of networks of biochemical reactions. Its first public version enables computational biologists to more efficiently investigate the dynamics of realistic biological systems.

  7. Limiting similarity and functional diversity along environmental gradients

    USGS Publications Warehouse

    Schwilk, D.W.; Ackerly, D.D.

    2005-01-01

    Recent developments in community models emphasize the importance of incorporating stochastic processes (e.g. ecological drift) in models of niche-structured community assembly. We constructed a finite, spatially explicit, lottery model to simulate the distribution of species in a one-dimensional landscape with an underlying gradient in environmental conditions. Our framework combines the potential for ecological drift with environmentally-mediated competition for space in a heterogeneous environment. We examined the influence of niche breadth, dispersal distances, community size (total number of individuals) and the breadth of the environmental gradient on levels of species and functional trait diversity (i.e. differences in niche optima). Three novel results emerge from this model: (1) niche differences between adjacent species (e.g. limiting similarity) increase in smaller communities, because of the interaction of competitive effects and finite population sizes; (2) immigration from a regional species pool, stochasticity and niche-assembly generate a bimodal distribution of species residence times ('transient' and 'resident') under a heterogeneous environment; and (3) the magnitude of environmental heterogeneity has a U-shaped effect on diversity, because of shifts in species richness of resident vs. transient species. These predictions illustrate the potential importance of stochastic (although not necessarily neutral) processes in community assembly. ??2005 Blackwell Publishing Ltd/CNRS.

  8. Coupled Stochastic Time-Inverted Lagrangian Transport/Weather Forecast and Research/Vegetation Photosynthesis and Respiration Model. Part II; Simulations of Tower-Based and Airborne CO2 Measurements

    NASA Technical Reports Server (NTRS)

    Eluszkiewicz, Janusz; Nehrkorn, Thomas; Wofsy, Steven C.; Matross, Daniel; Gerbig, Christoph; Lin, John C.; Freitas, Saulo; Longo, Marcos; Andrews, Arlyn E.; Peters, Wouter

    2007-01-01

    This paper evaluates simulations of atmospheric CO2 measured in 2004 at continental surface and airborne receptors, intended to test the capability to use data with high temporal and spatial resolution for analyses of carbon sources and sinks at regional and continental scales. The simulations were performed using the Stochastic Time-Inverted Lagrangian Transport (STILT) model driven by the Weather Forecast and Research (WRF) model, and linked to surface fluxes from the satellite-driven Vegetation Photosynthesis and Respiration Model (VPRM). The simulations provide detailed representations of hourly CO2 tower data and reproduce the shapes of airborne vertical profiles with high fidelity. WRF meteorology gives superior model performance compared with standard meteorological products, and the impact of including WRF convective mass fluxes in the STILT trajectory calculations is significant in individual cases. Important biases in the simulation are associated with the nighttime CO2 build-up and subsequent morning transition to convective conditions, and with errors in the advected lateral boundary condition. Comparison of STILT simulations driven by the WRF model against those driven by the Brazilian variant of the Regional Atmospheric Modeling System (BRAMS) shows that model-to-model differences are smaller than between an individual transport model and observations, pointing to systematic errors in the simulated transport. Future developments in the WRF model s data assimilation capabilities, basic research into the fundamental aspects of trajectory calculations, and intercomparison studies involving other transport models, are possible venues for reducing these errors. Overall, the STILT/WRF/VPRM offers a powerful tool for continental and regional scale carbon flux estimates.

  9. Dynamics Under Location Uncertainty: Model Derivation, Modified Transport and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Resseguier, V.; Memin, E.; Chapron, B.; Fox-Kemper, B.

    2017-12-01

    In order to better observe and predict geophysical flows, ensemble-based data assimilation methods are of high importance. In such methods, an ensemble of random realizations represents the variety of the simulated flow's likely behaviors. For this purpose, randomness needs to be introduced in a suitable way and physically-based stochastic subgrid parametrizations are promising paths. This talk will propose a new kind of such a parametrization referred to as modeling under location uncertainty. The fluid velocity is decomposed into a resolved large-scale component and an aliased small-scale one. The first component is possibly random but time-correlated whereas the second is white-in-time but spatially-correlated and possibly inhomogeneous and anisotropic. With such a velocity, the material derivative of any - possibly active - tracer is modified. Three new terms appear: a correction of the large-scale advection, a multiplicative noise and a possibly heterogeneous and anisotropic diffusion. This parameterization naturally ensures attractive properties such as energy conservation for each realization. Additionally, this stochastic material derivative and the associated Reynolds' transport theorem offer a systematic method to derive stochastic models. In particular, we will discuss the consequences of the Quasi-Geostrophic assumptions in our framework. Depending on the turbulence amount, different models with different physical behaviors are obtained. Under strong turbulence assumptions, a simplified diagnosis of frontolysis and frontogenesis at the surface of the ocean is possible in this framework. A Surface Quasi-Geostrophic (SQG) model with a weaker noise influence has also been simulated. A single realization better represents small scales than a deterministic SQG model at the same resolution. Moreover, an ensemble accurately predicts extreme events, bifurcations as well as the amplitudes and the positions of the simulation errors. 
Figure 1 highlights this last result and compares it to the strong error underestimation of an ensemble simulated from the deterministic dynamic with random initial conditions.

  10. Hybrid stochastic simulation of reaction-diffusion systems with slow and fast dynamics.

    PubMed

    Strehl, Robert; Ilie, Silvana

    2015-12-21

    In this paper, we present a novel hybrid method to simulate discrete stochastic reaction-diffusion models arising in biochemical signaling pathways. We study moderately stiff systems, for which we can partition each reaction or diffusion channel into either a slow or fast subset, based on its propensity. Numerical approaches missing this distinction are often limited with respect to computational run time or approximation quality. We design an approximate scheme that remedies these pitfalls by using a new blending strategy of the well-established inhomogeneous stochastic simulation algorithm and the tau-leaping simulation method. The advantages of our hybrid simulation algorithm are demonstrated on three benchmarking systems, with special focus on approximation accuracy and efficiency.

  11. Monte-Carlo Simulations of Drug Delivery on Biofilms

    NASA Astrophysics Data System (ADS)

    Buldum, Alper; Simpson, Andrew

    2013-03-01

    The focus of this work is on biofilms that grow in the lungs of cystic fibrosis (CF) patients. A discrete model which describes the nutrient and biomass as discrete particles is created. Diffusion of the nutrient, consumption of the nutrient by microbial particles, and growth and decay of microbial particles are simulated using stochastic processes. Our model extends the complexity of the biofilm system by including the conversion and reversion of living bacteria into a hibernated state, known as persister bacteria. Another new contribution is the inclusion of antimicrobial in two forms: an aqueous solution and encapsulated in biodegradable nanoparticles. The bacteria population growth and spatial variation of drugs and their effectiveness are investigated in this work. Supported by NIH

  12. Calculating Higher-Order Moments of Phylogenetic Stochastic Mapping Summaries in Linear Time.

    PubMed

    Dhar, Amrit; Minin, Vladimir N

    2017-05-01

    Stochastic mapping is a simulation-based method for probabilistically mapping substitution histories onto phylogenies according to continuous-time Markov models of evolution. This technique can be used to infer properties of the evolutionary process on the phylogeny and, unlike parsimony-based mapping, conditions on the observed data to randomly draw substitution mappings that do not necessarily require the minimum number of events on a tree. Most stochastic mapping applications simulate substitution mappings only to estimate the mean and/or variance of two commonly used mapping summaries: the number of particular types of substitutions (labeled substitution counts) and the time spent in a particular group of states (labeled dwelling times) on the tree. Fast, simulation-free algorithms for calculating the mean of stochastic mapping summaries exist. Importantly, these algorithms scale linearly in the number of tips/leaves of the phylogenetic tree. However, to our knowledge, no such algorithm exists for calculating higher-order moments of stochastic mapping summaries. We present one such simulation-free dynamic programming algorithm that calculates prior and posterior mapping variances and scales linearly in the number of phylogeny tips. Our procedure suggests a general framework that can be used to efficiently compute higher-order moments of stochastic mapping summaries without simulations. We demonstrate the usefulness of our algorithm by extending previously developed statistical tests for rate variation across sites and for detecting evolutionarily conserved regions in genomic sequences.

  13. Calculating Higher-Order Moments of Phylogenetic Stochastic Mapping Summaries in Linear Time

    PubMed Central

    Dhar, Amrit

    2017-01-01

    Abstract Stochastic mapping is a simulation-based method for probabilistically mapping substitution histories onto phylogenies according to continuous-time Markov models of evolution. This technique can be used to infer properties of the evolutionary process on the phylogeny and, unlike parsimony-based mapping, conditions on the observed data to randomly draw substitution mappings that do not necessarily require the minimum number of events on a tree. Most stochastic mapping applications simulate substitution mappings only to estimate the mean and/or variance of two commonly used mapping summaries: the number of particular types of substitutions (labeled substitution counts) and the time spent in a particular group of states (labeled dwelling times) on the tree. Fast, simulation-free algorithms for calculating the mean of stochastic mapping summaries exist. Importantly, these algorithms scale linearly in the number of tips/leaves of the phylogenetic tree. However, to our knowledge, no such algorithm exists for calculating higher-order moments of stochastic mapping summaries. We present one such simulation-free dynamic programming algorithm that calculates prior and posterior mapping variances and scales linearly in the number of phylogeny tips. Our procedure suggests a general framework that can be used to efficiently compute higher-order moments of stochastic mapping summaries without simulations. We demonstrate the usefulness of our algorithm by extending previously developed statistical tests for rate variation across sites and for detecting evolutionarily conserved regions in genomic sequences. PMID:28177780

  14. Spatial Structures of the Environment and of Dispersal Impact Species Distribution in Competitive Metacommunities

    PubMed Central

    Ai, Dexiecuo; Gravel, Dominique; Chu, Chengjin; Wang, Gang

    2013-01-01

    The correspondence between species distribution and the environment depends on species’ ability to track favorable environmental conditions (via dispersal) and to maintain competitive hierarchy against the constant influx of migrants (mass effect) and demographic stochasticity (ecological drift). Here we report a simulation study of the influence of landscape structure on species distribution. We consider lottery competition for space in a spatially heterogeneous environment, where the landscape is represented as a network of localities connected by dispersal. We quantified the contribution of neutrality and species sorting to their spatial distribution. We found that neutrality increases and the strength of species-sorting decreases with the centrality of a community in the landscape when the average dispersal among communities is low, whereas the opposite was found at elevated dispersal. We also found that the strength of species-sorting increases with environmental heterogeneity. Our results illustrate that spatial structure of the environment and of dispersal must be taken into account for understanding species distribution. We stress the importance of spatial geographic structure on the relative importance of niche vs. neutral processes in controlling community dynamics. PMID:23874815

  15. Spatial structures of the environment and of dispersal impact species distribution in competitive metacommunities.

    PubMed

    Ai, Dexiecuo; Gravel, Dominique; Chu, Chengjin; Wang, Gang

    2013-01-01

    The correspondence between species distribution and the environment depends on species' ability to track favorable environmental conditions (via dispersal) and to maintain competitive hierarchy against the constant influx of migrants (mass effect) and demographic stochasticity (ecological drift). Here we report a simulation study of the influence of landscape structure on species distribution. We consider lottery competition for space in a spatially heterogeneous environment, where the landscape is represented as a network of localities connected by dispersal. We quantified the contribution of neutrality and species sorting to their spatial distribution. We found that neutrality increases and the strength of species-sorting decreases with the centrality of a community in the landscape when the average dispersal among communities is low, whereas the opposite was found at elevated dispersal. We also found that the strength of species-sorting increases with environmental heterogeneity. Our results illustrate that spatial structure of the environment and of dispersal must be taken into account for understanding species distribution. We stress the importance of spatial geographic structure on the relative importance of niche vs. neutral processes in controlling community dynamics.

  16. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiu, Dongbin

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.

  17. Symmetries and stochastic symmetry breaking in multifractal geophysics: analysis and simulation with the help of the Lévy-Clifford algebra of cascade generators..

    NASA Astrophysics Data System (ADS)

    Schertzer, D. J. M.; Tchiguirinskaia, I.

    2016-12-01

    Multifractal fields, whose definition is rather independent of their domain dimension, have opened a new approach of geophysics enabling to explore its spatial extension that is of prime importance as underlined by the expression "spatial chaos". However multifractals have been until recently restricted to be scalar valued, i.e. to one-dimensional codomains. This has prevented to deal with the key question of complex component interactions and their non trivial symmetries. We first emphasize that the Lie algebra of stochastic generators of cascade processes enables us to generalize multifractals to arbitrarily large codomains, e.g. flows of vector fields on large dimensional manifolds. In particular, we have recently investigated the neat example of stable Levy generators on Clifford algebra that have a number of seductive properties, e.g. universal statistical and robust algebra properties, both defining the basic symmetries of the corresponding fields (Schertzer and Tchiguirinskaia, 2015). These properties provide a convenient multifractal framework to study both the symmetries of the fields and how they stochastically break the symmetries of the underlying equations due to boundary conditions, large scale rotations and forcings. These developments should help us to answer to challenging questions such as the climatology of (exo-) planets based on first principles (Pierrehumbert, 2013), to fully address the question of the limitations of quasi- geostrophic turbulence (Schertzer et al., 2012) and to explore the peculiar phenomenology of turbulent dynamics of the atmosphere or oceans that is neither two- or three-dimensional. Pierrehumbert, R.T., 2013. Strange news from other stars. Nature Geoscience, 6(2), pp.8183. Schertzer, D. et al., 2012. Quasi-geostrophic turbulence and generalized scale invariance, a theoretical reply. Atmos. Chem. Phys., 12, pp.327336. Schertzer, D. & Tchiguirinskaia, I., 2015. Multifractal vector fields and stochastic Clifford algebra. 
Chaos: An Interdisciplinary Journal of Nonlinear Science, 25(12), p.123127

  18. Employing Eigenvalue Ratios to Generate Prior Fracture-like Features for Stochastic Hydrogeophysical Characterization of a Fractured Aquifer System

    NASA Astrophysics Data System (ADS)

    Brewster, J.; Oware, E. K.

    2017-12-01

    Groundwater hosted in fractured rocks constitutes almost 65% of the principal aquifers in the US. The exploitation and contaminant management of fractured aquifers require fracture flow and transport modeling, which in turn requires a detailed understanding of the structure of the aquifer. The widely used equivalent porous medium approach to modeling fractured aquifer systems is inadequate to accurately predict fracture transport processes due to the averaging of the sharp lithological contrast between the matrix and the fractures. The potential of geophysical imaging (GI) to estimate spatially continuous subsurface profiles in a minimally invasive fashion is well proven. Conventional deterministic GI strategies, however, produce geologically unrealistic, smoothed-out results due to commonly enforced smoothing constraints. Stochastic GI of fractured aquifers is becoming increasing appealing due to its ability to recover realistic fracture features while providing multiple likely realizations that enable uncertainty assessment. Generating prior spatial features consistent with the expected target structures is crucial in stochastic imaging. We propose to utilize eigenvalue ratios to resolve the elongated fracture features expected in a fractured aquifer system. Eigenvalues capture the major and minor directions of variability in a region, which can be employed to evaluate shape descriptors, such as eccentricity (elongation) and orientation of features in the region. Eccentricity ranges from zero to one, representing a circularly sharped to a line feature, respectively. Here, we apply eigenvalue ratios to define a joint objective parameter consisting of eccentricity (shape) and direction terms to guide the generation of prior fracture-like features in some predefined principal directions for stochastic GI. Preliminary unconditional, synthetic experiments reveal the potential of the algorithm to simulate prior fracture-like features. 
We illustrate the strategy with a 2D, cross-borehole electrical resistivity tomography (ERT) in a fractured aquifer at the UB Environmental Geophysics Imaging Site, with tomograms validated with gamma and caliper logs obtained from the two ERT wells.

  19. Elucidating the Functional Roles of Spatial Organization in Cross-Membrane Signal Transduction by a Hybrid Simulation Method.

    PubMed

    Chen, Jiawen; Xie, Zhong-Ru; Wu, Yinghao

    2016-07-01

    The ligand-binding of membrane receptors on cell surfaces initiates the dynamic process of cross-membrane signal transduction. It is an indispensable part of the signaling network for cells to communicate with external environments. Recent experiments revealed that molecular components in signal transduction are not randomly mixed, but spatially organized into distinctive patterns. These patterns, such as receptor clustering and ligand oligomerization, lead to very different gene expression profiles. However, little is understood about the molecular mechanisms and functional impacts of this spatial-temporal regulation in cross-membrane signal transduction. In order to tackle this problem, we developed a hybrid computational method that decomposes a model of signaling network into two simulation modules. The physical process of binding between receptors and ligands on cell surfaces are simulated by a diffusion-reaction algorithm, while the downstream biochemical reactions are modeled by stochastic simulation of Gillespie algorithm. These two processes are coupled together by a synchronization framework. Using this method, we tested the dynamics of a simple signaling network in which the ligand binding of cell surface receptors triggers the phosphorylation of protein kinases, and in turn regulates the expression of target genes. We found that spatial aggregation of membrane receptors at cellular interfaces is able to either amplify or inhibit downstream signaling outputs, depending on the details of clustering mechanism. Moreover, by providing higher binding avidity, the co-localization of ligands into multi-valence complex modulates signaling in very different ways that are closely related to the binding affinity between ligand and receptor. We also found that the temporal oscillation of the signaling pathway that is derived from genetic feedback loops can be modified by the spatial clustering of membrane receptors. 
In summary, our method demonstrates the functional importance of spatial organization in cross-membrane signal transduction. The method can be applied to any specific signaling pathway in cells.

  20. High resolution modelling of soil moisture patterns with TerrSysMP: A comparison with sensor network data

    NASA Astrophysics Data System (ADS)

    Gebler, S.; Hendricks Franssen, H.-J.; Kollet, S. J.; Qu, W.; Vereecken, H.

    2017-04-01

    The prediction of the spatial and temporal variability of land surface states and fluxes with land surface models at high spatial resolution is still a challenge. This study compares simulation results using TerrSysMP including a 3D variably saturated groundwater flow model (ParFlow) coupled to the Community Land Model (CLM) of a 38 ha managed grassland head-water catchment in the Eifel (Germany), with soil water content (SWC) measurements from a wireless sensor network, actual evapotranspiration recorded by lysimeters and eddy covariance stations and discharge observations. TerrSysMP was discretized with a 10 × 10 m lateral resolution, variable vertical resolution (0.025-0.575 m), and the following parameterization strategies of the subsurface soil hydraulic parameters: (i) completely homogeneous, (ii) homogeneous parameters for different soil horizons, (iii) different parameters for each soil unit and soil horizon and (iv) heterogeneous stochastic realizations. Hydraulic conductivity and Mualem-Van Genuchten parameters in these simulations were sampled from probability density functions, constructed from either (i) soil texture measurements and Rosetta pedotransfer functions (ROS), or (ii) estimated soil hydraulic parameters by 1D inverse modelling using shuffle complex evolution (SCE). The results indicate that the spatial variability of SWC at the scale of a small headwater catchment is dominated by topography and spatially heterogeneous soil hydraulic parameters. The spatial variability of the soil water content thereby increases as a function of heterogeneity of soil hydraulic parameters. For lower levels of complexity, spatial variability of the SWC was underrepresented in particular for the ROS-simulations. 
Whereas all model simulations were able to reproduce the seasonal evapotranspiration variability, the poor discharge simulations with high model bias are likely related to short-term ET dynamics and the lack of information about bedrock characteristics and an on-site drainage system in the uncalibrated model. In general, simulation performance was better for the SCE setups. The SCE-simulations had a higher inverse air entry parameter resulting in SWC dynamics in better correspondence with data than the ROS simulations during dry periods. This illustrates that small scale measurements of soil hydraulic parameters cannot be transferred to the larger scale and that interpolated 1D inverse parameter estimates result in an acceptable performance for the catchment.

  1. Scalable domain decomposition solvers for stochastic PDEs in high performance computing

    DOE PAGES

    Desai, Ajit; Khalil, Mohammad; Pettit, Chris; ...

    2017-09-21

    Stochastic spectral finite element models of practical engineering systems may involve solutions of linear systems or linearized systems for non-linear problems with billions of unknowns. For stochastic modeling, it is therefore essential to design robust, parallel and scalable algorithms that can efficiently utilize high-performance computing to tackle such large-scale systems. Domain decomposition based iterative solvers can handle such systems. And though these algorithms exhibit excellent scalabilities, significant algorithmic and implementational challenges exist to extend them to solve extreme-scale stochastic systems using emerging computing platforms. Intrusive polynomial chaos expansion based domain decomposition algorithms are extended here to concurrently handle high resolutionmore » in both spatial and stochastic domains using an in-house implementation. Sparse iterative solvers with efficient preconditioners are employed to solve the resulting global and subdomain level local systems through multi-level iterative solvers. We also use parallel sparse matrix–vector operations to reduce the floating-point operations and memory requirements. Numerical and parallel scalabilities of these algorithms are presented for the diffusion equation having spatially varying diffusion coefficient modeled by a non-Gaussian stochastic process. Scalability of the solvers with respect to the number of random variables is also investigated.« less

  2. Scalable domain decomposition solvers for stochastic PDEs in high performance computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Desai, Ajit; Khalil, Mohammad; Pettit, Chris

    Stochastic spectral finite element models of practical engineering systems may involve solutions of linear systems or linearized systems for non-linear problems with billions of unknowns. For stochastic modeling, it is therefore essential to design robust, parallel and scalable algorithms that can efficiently utilize high-performance computing to tackle such large-scale systems. Domain decomposition based iterative solvers can handle such systems. And though these algorithms exhibit excellent scalabilities, significant algorithmic and implementational challenges exist to extend them to solve extreme-scale stochastic systems using emerging computing platforms. Intrusive polynomial chaos expansion based domain decomposition algorithms are extended here to concurrently handle high resolutionmore » in both spatial and stochastic domains using an in-house implementation. Sparse iterative solvers with efficient preconditioners are employed to solve the resulting global and subdomain level local systems through multi-level iterative solvers. We also use parallel sparse matrix–vector operations to reduce the floating-point operations and memory requirements. Numerical and parallel scalabilities of these algorithms are presented for the diffusion equation having spatially varying diffusion coefficient modeled by a non-Gaussian stochastic process. Scalability of the solvers with respect to the number of random variables is also investigated.« less

  3. An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process

    NASA Technical Reports Server (NTRS)

    Carter, M. C.; Madison, M. W.

    1973-01-01

    The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The analysis relies on two tools: computer simulation and statistical estimation. Computer simulation is used to generate stationary Gaussian stochastic processes with selected autocorrelation functions. Analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, given the autocorrelation function and the mean and variance of the number of overshoots, a frequency distribution for overshoots can be estimated.
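The simulation half of this approach can be illustrated with a short sketch (not the authors' code): generate a stationary Gaussian AR(1) process, whose autocorrelation function is rho(k) = phi**k, and count overshoots of a level as upcrossings. The process parameters and the 2-sigma level are illustrative choices.

```python
import numpy as np

def simulate_ar1(n, phi, sigma, rng):
    """Stationary Gaussian AR(1): x[t] = phi*x[t-1] + e[t],
    with autocorrelation rho(k) = phi**k."""
    x = np.empty(n)
    # Draw the initial state from the stationary distribution.
    x[0] = rng.normal(0.0, sigma / np.sqrt(1.0 - phi**2))
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    return x

def count_overshoots(x, level):
    """Number of upcrossings of `level` (start of each overshoot)."""
    above = x > level
    return int(np.sum(~above[:-1] & above[1:]))

rng = np.random.default_rng(0)
x = simulate_ar1(100_000, phi=0.9, sigma=1.0, rng=rng)
level = x.mean() + 2.0 * x.std()   # overshoots of the 2-sigma level
count = count_overshoots(x, level)
print(count)
```

The count, together with the sample mean and variance, is the raw material for the empirical overshoot distribution described in the abstract.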

  4. Point-source stochastic-method simulations of ground motions for the PEER NGA-East Project

    USGS Publications Warehouse

    Boore, David

    2015-01-01

    Ground motions for the PEER NGA-East project were simulated using a point-source stochastic method. The simulated motions are provided for distances between 0 and 1200 km, M from 4 to 8, and 25 ground-motion intensity measures: peak ground velocity (PGV), peak ground acceleration (PGA), and 5%-damped pseudo-absolute response spectral acceleration (PSA) for 23 periods ranging from 0.01 s to 10.0 s. Tables of motions are provided for each of six attenuation models. The attenuation-model-dependent stress parameters used in the stochastic-method simulations were derived from inversion of PSA data from eight earthquakes in eastern North America.

  5. Mode Analyses of Gyrokinetic Simulations of Plasma Microturbulence

    NASA Astrophysics Data System (ADS)

    Hatch, David R.

    This thesis presents an analysis of the excitation and role of damped modes in gyrokinetic simulations of plasma microturbulence. To address this question, mode decompositions are used to analyze gyrokinetic simulation data. A mode decomposition can be constructed by projecting a nonlinearly evolved gyrokinetic distribution function onto a set of linear eigenmodes, or alternatively by constructing a proper orthogonal decomposition (POD) of the distribution function. POD decompositions are used to examine the role of damped modes in saturating ion temperature gradient (ITG) driven turbulence. In order to identify the contribution of different modes to the energy sources and sinks, numerical diagnostics for a gyrokinetic energy quantity were developed for the GENE code. The use of these energy diagnostics in conjunction with POD mode decompositions demonstrates that ITG turbulence saturates largely through dissipation by damped modes at the same perpendicular spatial scales as those of the driving instabilities. This defines a picture of turbulent saturation that is very different from both traditional hydrodynamic scenarios and many common theories for the saturation of plasma turbulence. POD mode decompositions are also used to examine the role of subdominant modes in causing magnetic stochasticity in electromagnetic gyrokinetic simulations. It is shown that the magnetic stochasticity, which appears to be ubiquitous in electromagnetic microturbulence, is caused largely by subdominant modes with tearing parity. The application of higher-order singular value decomposition (HOSVD) to the full distribution function from gyrokinetic simulations is also presented, an effort to demonstrate the ability to characterize and extract insight from a very large, complex, and high-dimensional data set: the 5-D (plus time) gyrokinetic distribution function.

  6. Addressing spatial scales and new mechanisms in climate impact ecosystem modeling

    NASA Astrophysics Data System (ADS)

    Poulter, B.; Joetzjer, E.; Renwick, K.; Ogunkoya, G.; Emmett, K.

    2015-12-01

    Climate change impacts on vegetation distributions are typically addressed using either an empirical approach, such as a species distribution model (SDM), or with process-based methods, for example, dynamic global vegetation models (DGVMs). Each approach has its own benefits and disadvantages. For example, an SDM is constrained by data and few parameters, but does not include adaptation or acclimation processes or other ecosystem feedbacks that may act to mitigate or enhance climate effects. Alternatively, a DGVM includes many mechanisms relating plant growth and disturbance to climate, but simulations are costly to perform at high spatial resolution, and large uncertainty remains in a variety of fundamental physical processes. To address these issues, we present two DGVM-based case studies in which i) high-resolution (1 km) simulations are performed for vegetation in the Greater Yellowstone Ecosystem using a biogeochemical forest gap model, LPJ-GUESS, and ii) new mechanisms for simulating tropical tree mortality are introduced. High-resolution DGVM simulations require not only additional computing resources and code reorganization but also consideration of how spatial scale affects vegetation dynamics, stochasticity, disturbance, and migration. New mechanisms for simulating forest mortality must consider hydraulic limitations and carbon reserves, their interactions in source-sink dynamics, and their role in controlling water potentials. Improving DGVM approaches by addressing spatial-scale challenges and integrating new approaches for estimating forest mortality will provide new insights more relevant for land management, and may reduce uncertainty by representing physical processes in a way more directly comparable to experimental and observational evidence.

  7. A Two-Step Method to Select Major Surge-Producing Extratropical Cyclones from a 10,000-Year Stochastic Catalog

    NASA Astrophysics Data System (ADS)

    Keshtpoor, M.; Carnacina, I.; Yablonsky, R. M.

    2016-12-01

    Extratropical cyclones (ETCs) are the primary driver of storm surge events along the UK and northwest mainland Europe coastlines. In an effort to evaluate the storm surge risk in coastal communities in this region, a stochastic catalog is developed by perturbing the historical storm seeds of European ETCs to account for 10,000 years of possible ETCs. Numerical simulation of the storm surge generated by the full 10,000-year stochastic catalog, however, is computationally expensive and may take several months to complete with available computational resources. A new statistical regression model is developed to select the major surge-generating events from the stochastic ETC catalog. This regression model is based on the maximum storm surge, obtained via numerical simulations using a calibrated version of the Delft3D-FM hydrodynamic model with a relatively coarse mesh, of 1750 historical ETC events that occurred over the past 38 years in Europe. These numerically simulated surge values were regressed against the local sea level pressure and the U and V components of the wind field at the locations of 196 tide gauge stations near the UK and northwest mainland Europe coastal areas. The regression model suggests that storm surge values in the area of interest are highly correlated with the U and V components of the wind, as well as the sea level pressure. Based on these correlations, the regression model was then used to select surge-generating storms from the 10,000-year stochastic catalog. Results suggest that roughly 105,000 events out of 480,000 stochastic storms are surge-generating events and need to be considered for numerical simulation using a hydrodynamic model. The selected stochastic storms were then simulated in Delft3D-FM, and the final refinement of the storm population was performed based on return period analysis of the 1750 historical event simulations at each of the 196 tide gauges, in preparation for Delft3D-FM fine mesh simulations.
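The regression step can be sketched on synthetic data (every variable name and coefficient value below is hypothetical, for illustration only): ordinary least squares of the surge maximum on the U and V wind components and sea level pressure at one station.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins for the predictors at one tide gauge:
# u, v wind components (m/s) and a sea-level pressure anomaly (hPa).
n = 500
u = rng.normal(0, 10, n)
v = rng.normal(0, 10, n)
slp = rng.normal(0, 8, n)

# Hypothetical "true" surge relation plus observation noise.
surge = 0.04 * u + 0.03 * v - 0.02 * slp + rng.normal(0, 0.05, n)

# Ordinary least squares: surge ~ intercept + u + v + slp.
X = np.column_stack([np.ones(n), u, v, slp])
coef, *_ = np.linalg.lstsq(X, surge, rcond=None)
print(coef)  # approximately recovers [0, 0.04, 0.03, -0.02]
```

A fitted model of this form is cheap to evaluate, which is what makes screening 480,000 stochastic storms before full hydrodynamic simulation feasible.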

  8. Hybrid stochastic simulations of intracellular reaction-diffusion systems.

    PubMed

    Kalantzis, Georgios

    2009-06-01

    With the growing recognition that stochasticity is important in biological systems, stochastic chemical kinetics has begun to receive wider interest. While Monte Carlo discrete-event simulations most accurately capture the variability of molecular species, they become computationally costly for complex reaction-diffusion systems with large populations of molecules. On the other hand, continuous-time models are computationally efficient but fail to capture any variability in the molecular species. In this study a hybrid stochastic approach is introduced for simulating reaction-diffusion systems. We developed an adaptive partitioning strategy in which processes with high frequency are simulated with deterministic rate-based equations, and those with low frequency using the exact stochastic algorithm of Gillespie. The stochastic behavior of cellular pathways is therefore preserved while the method remains applicable to large populations of molecules. We describe our method and demonstrate its accuracy and efficiency compared with the Gillespie algorithm for two different systems: first, a model of intracellular viral kinetics with two steady states, and second, a compartmental model of the postsynaptic spine head for studying the dynamics of Ca(2+) and NMDA receptors.
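The exact stochastic algorithm of Gillespie referred to here is the direct method. A minimal self-contained version for mass-action systems, applied to an illustrative birth-death process (not one of the paper's two test systems), might look like:

```python
import numpy as np

def gillespie_direct(x0, stoich, rates, t_max, rng):
    """Gillespie's direct SSA.

    x0     : initial copy numbers, shape (n_species,)
    stoich : state-change vectors, shape (n_reactions, n_species)
    rates  : function state -> propensities, shape (n_reactions,)
    """
    t, x = 0.0, np.array(x0, dtype=float)
    times, states = [t], [x.copy()]
    while t < t_max:
        a = rates(x)
        a0 = a.sum()
        if a0 == 0:                        # no reaction can fire
            break
        t += rng.exponential(1.0 / a0)     # time to the next event
        j = rng.choice(len(a), p=a / a0)   # which reaction fires
        x += stoich[j]
        times.append(t)
        states.append(x.copy())
    return np.array(times), np.array(states)

# Example: birth-death process, 0 -> X at rate k, X -> 0 at rate g*X.
k, g = 10.0, 0.1                   # stationary mean = k/g = 100
stoich = np.array([[1], [-1]])
rates = lambda x: np.array([k, g * x[0]])
rng = np.random.default_rng(1)
t, x = gillespie_direct([0], stoich, rates, t_max=200.0, rng=rng)
print(x[-1, 0])   # fluctuates around 100 at stationarity
```

A hybrid scheme like the one in this record would reserve this exact treatment for the low-frequency channels and integrate the high-frequency ones with rate equations.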

  9. Estimating rare events in biochemical systems using conditional sampling.

    PubMed

    Sundar, V S

    2017-01-28

    The paper focuses on the development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining this probability using brute-force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
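The core idea, expressing a rare event probability as a product of more frequent conditional probabilities estimated by MCMC, can be sketched on a one-dimensional toy problem, estimating P(Z > 3.5) for standard normal Z. Real applications use a high-dimensional limit-state function; the sample sizes and random-walk proposal here are illustrative choices.

```python
import numpy as np
from math import erf, sqrt

def subset_simulation(b, n=3000, p0=0.1, seed=2):
    """Estimate p = P(Z > b), Z ~ N(0,1), as a product of conditional
    probabilities P(Z > b_k | Z > b_{k-1}), each roughly p0."""
    rng = np.random.default_rng(seed)
    z = rng.normal(size=n)                 # level 0: crude Monte Carlo
    p = 1.0
    for _ in range(50):                    # safety cap on levels
        z_desc = np.sort(z)[::-1]
        n_keep = int(p0 * n)
        b_k = z_desc[n_keep - 1]           # (1 - p0) sample quantile
        if b_k >= b:                       # final level reached
            return p * np.mean(z > b)
        p *= p0
        cur = z_desc[:n_keep].copy()       # seeds of the MCMC chains
        samples = cur.tolist()
        while len(samples) < n:            # Metropolis moves conditioned
            prop = cur + rng.normal(size=n_keep)   # on {Z > b_k}
            ratio = np.exp(0.5 * (cur**2 - prop**2))
            accept = (rng.random(n_keep) < ratio) & (prop > b_k)
            cur = np.where(accept, prop, cur)
            samples.extend(cur.tolist())
        z = np.array(samples[:n])
    return p

p_hat = subset_simulation(b=3.5)
p_true = 0.5 * (1.0 - erf(3.5 / sqrt(2.0)))   # exact tail probability
print(p_hat, p_true)
```

In the paper's setting, each "sample" is an SSA trajectory mapped into standard normal space, but the product-of-conditionals bookkeeping is the same.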

  10. Stochastic Effects in Computational Biology of Space Radiation Cancer Risk

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Pluth, Janis; Harper, Jane; O'Neill, Peter

    2007-01-01

    Estimating risk from space radiation poses important questions on the radiobiology of protons and heavy ions. We are considering systems biology models to study radiation induced repair foci (RIRF) at low doses, in which less than one track on average traverses the cell, and the subsequent DNA damage processing and signal transduction events. Computational approaches for describing protein regulatory networks coupled to DNA and oxidative damage sites include systems of differential equations, stochastic equations, and Monte Carlo simulations. We review recent developments in the mathematical description of protein regulatory networks and possible approaches to radiation effects simulation. These include robustness, which states that regulatory networks maintain their functions against external and internal perturbations due to compensating properties of redundancy and molecular feedback controls, and modularity, which leads to general theorems for considering molecules that interact through a regulatory mechanism without exchange of matter leading to a block diagonal reduction of the connecting pathways. Identifying rate-limiting steps, robustness, and modularity in pathways perturbed by radiation damage are shown to be valid techniques for reducing large molecular systems to realistic computer simulations. Other techniques studied are the use of steady-state analysis, and the introduction of composite molecules or rate-constants to represent small collections of reactants. Applications of these techniques to describe spatial and temporal distributions of RIRF and cell populations following low dose irradiation are described.

  11. Tsunami evacuation plans for future megathrust earthquakes in Padang, Indonesia, considering stochastic earthquake scenarios

    NASA Astrophysics Data System (ADS)

    Muhammad, Ario; Goda, Katsuichiro; Alexander, Nicholas A.; Kongko, Widjo; Muhari, Abdul

    2017-12-01

    This study develops tsunami evacuation plans in Padang, Indonesia, using a stochastic tsunami simulation method. The stochastic results are based on multiple earthquake scenarios for different magnitudes (Mw 8.5, 8.75, and 9.0) that reflect asperity characteristics of the 1797 historical event in the same region. The generation of the earthquake scenarios involves probabilistic models of earthquake source parameters and stochastic synthesis of earthquake slip distributions. In total, 300 source models are generated to produce comprehensive tsunami evacuation plans in Padang. The tsunami hazard assessment results show that Padang may face significant tsunamis causing the maximum tsunami inundation height and depth of 15 and 10 m, respectively. A comprehensive tsunami evacuation plan - including horizontal evacuation area maps, assessment of temporary shelters considering the impact due to ground shaking and tsunami, and integrated horizontal-vertical evacuation time maps - has been developed based on the stochastic tsunami simulation results. The developed evacuation plans highlight that comprehensive mitigation policies can be produced from the stochastic tsunami simulation for future tsunamigenic events.

  12. The ISI distribution of the stochastic Hodgkin-Huxley neuron.

    PubMed

    Rowat, Peter F; Greenwood, Priscilla E

    2014-01-01

    The simulation of ion-channel noise has an important role in computational neuroscience. In recent years several approximate methods of carrying out this simulation have been published, based on stochastic differential equations, and all giving slightly different results. The obvious, and essential, question is: which method is the most accurate and which is most computationally efficient? Here we make a contribution to the answer. We compare interspike interval histograms from simulated data using four different approximate stochastic differential equation (SDE) models of the stochastic Hodgkin-Huxley neuron, as well as the exact Markov chain model simulated by the Gillespie algorithm. One of the recent SDE models is the same as the Kurtz approximation first published in 1978. All the models considered give similar ISI histograms over a wide range of deterministic and stochastic input. Three features of these histograms are an initial peak, followed by one or more bumps, and then an exponential tail. We explore how these features depend on deterministic input and on level of channel noise, and explain the results using the stochastic dynamics of the model. We conclude with a rough ranking of the four SDE models with respect to the similarity of their ISI histograms to the histogram of the exact Markov chain model.
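As one concrete example of an approximate SDE treatment of channel noise (a generic Langevin-style gate with a channel-number diffusion term, not necessarily one of the four models compared in this record), a two-state gating variable can be integrated with the Euler-Maruyama scheme:

```python
import numpy as np

def euler_maruyama_gate(alpha, beta, n_ch, T, dt, rng):
    """Langevin approximation of a two-state ion-channel gate:
    dn = (alpha(1-n) - beta n) dt + sqrt((alpha(1-n) + beta n)/N) dW,
    with n clipped to [0, 1]; the noise scales with channel count N."""
    steps = int(T / dt)
    n = alpha / (alpha + beta)      # start at the deterministic fixed point
    path = np.empty(steps + 1)
    path[0] = n
    for i in range(steps):
        drift = alpha * (1.0 - n) - beta * n
        diff = np.sqrt(max(alpha * (1.0 - n) + beta * n, 0.0) / n_ch)
        n += drift * dt + diff * np.sqrt(dt) * rng.normal()
        n = min(max(n, 0.0), 1.0)   # keep the open fraction in [0, 1]
        path[i + 1] = n
    return path

rng = np.random.default_rng(6)
path = euler_maruyama_gate(alpha=0.5, beta=0.5, n_ch=1000,
                           T=100.0, dt=0.01, rng=rng)
print(path.mean())   # fluctuates around alpha/(alpha+beta) = 0.5
```

The exact benchmark in the record replaces this diffusion approximation with a Markov chain over discrete channel states, simulated by the Gillespie algorithm.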

  13. Peculiar orientational disorder in 4-bromo-4'-nitrobiphenyl (BNBP) and 4-bromo-4'-cyanobiphenyl (BCNBP) leading to bipolar crystals.

    PubMed

    Burgener, Matthias; Aboulfadl, Hanane; Labat, Gaël Charles; Bonin, Michel; Sommer, Martin; Sankolli, Ravish; Wübbenhorst, Michael; Hulliger, Jürg

    2016-05-01

    180° orientational disorder of molecular building blocks can lead to a peculiar spatial distribution of polar properties in molecular crystals. Here we present two examples [4-bromo-4'-nitrobiphenyl (BNBP) and 4-bromo-4'-cyanobiphenyl (BCNBP)] which develop into a bipolar final growth state. This means orientational disorder taking place at the crystal/nutrient interface produces domains of opposite average polarity for as-grown crystals. The spatially inhomogeneous distribution of polarity was investigated by scanning pyroelectric microscopy (SPEM), phase-sensitive second harmonic microscopy (PS-SHM) and selected volume X-ray diffraction (SVXD). As a result, the acceptor groups (NO2 or CN) are predominantly present at crystal surfaces. However, the stochastic process of polarity formation can be influenced by adding a symmetrical biphenyl to a growing system. For this case, Monte Carlo simulations predict an inverted net polarity compared with the growth of pure BNBP and BCNBP. SPEM results clearly demonstrate that 4,4'-dibromobiphenyl (DBBP) can invert the polarity for both crystals. The phenomena reported in this paper are among the most striking processes seen for molecular crystals: a stochastic process giving rise to symmetry breaking. These are further examples supporting the general thesis that, for fundamental reasons, monodomain polar molecular crystals cannot exist.

  14. Hybrid stochastic simulation of reaction-diffusion systems with slow and fast dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strehl, Robert; Ilie, Silvana, E-mail: silvana@ryerson.ca

    2015-12-21

    In this paper, we present a novel hybrid method to simulate discrete stochastic reaction-diffusion models arising in biochemical signaling pathways. We study moderately stiff systems, for which we can partition each reaction or diffusion channel into either a slow or fast subset, based on its propensity. Numerical approaches missing this distinction are often limited with respect to computational run time or approximation quality. We design an approximate scheme that remedies these pitfalls by using a new blending strategy of the well-established inhomogeneous stochastic simulation algorithm and the tau-leaping simulation method. The advantages of our hybrid simulation algorithm are demonstrated on three benchmarking systems, with special focus on approximation accuracy and efficiency.
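The tau-leaping half of such a blend can be sketched as an explicit Poisson leap method, shown here with a fixed leap size for simplicity (the paper's method additionally handles the slow/fast partitioning and adaptive step selection):

```python
import numpy as np

def tau_leap(x0, stoich, rates, tau, n_steps, rng):
    """Explicit Poisson tau-leaping: each channel fires a Poisson number
    of times per fixed leap tau (no step-size control, for illustration)."""
    x = np.array(x0, dtype=float)
    path = [x.copy()]
    for _ in range(n_steps):
        a = rates(x)
        k = rng.poisson(a * tau)              # events per channel in [t, t+tau)
        x = np.maximum(x + k @ stoich, 0.0)   # clamp to avoid negative counts
        path.append(x.copy())
    return np.array(path)

# Birth-death: 0 -> X at rate c, X -> 0 at rate g*X; mean -> c/g = 50.
c, g = 5.0, 0.1
stoich = np.array([[1.0], [-1.0]])
rates = lambda x: np.array([c, g * x[0]])
rng = np.random.default_rng(3)
path = tau_leap([0.0], stoich, rates, tau=0.1, n_steps=2000, rng=rng)
print(path[-1, 0])
```

Leaping many events per step is what buys the speedup over the exact SSA; the hybrid scheme falls back to exact simulation only for the slow channels.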

  15. Selection of polynomial chaos bases via Bayesian model uncertainty methods with applications to sparse approximation of PDEs with stochastic inputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karagiannis, Georgios, E-mail: georgios.karagiannis@pnnl.gov; Lin, Guang, E-mail: guang.lin@pnnl.gov

    2014-02-15

    Generalized polynomial chaos (gPC) expansions allow us to represent the solution of a stochastic system using a series of polynomial chaos basis functions. The number of gPC terms increases dramatically as the dimension of the random input variables increases. When the number of the gPC terms is larger than that of the available samples, a scenario that often occurs when the corresponding deterministic solver is computationally expensive, evaluation of the gPC expansion can be inaccurate due to over-fitting. We propose a fully Bayesian approach that allows for global recovery of the stochastic solutions, in both spatial and random domains, by coupling Bayesian model uncertainty and regularization regression methods. It allows the evaluation of the PC coefficients on a grid of spatial points, via (1) the Bayesian model average (BMA) or (2) the median probability model, and their construction as spatial functions on the spatial domain via spline interpolation. The former accounts for the model uncertainty and provides Bayes-optimal predictions; while the latter provides a sparse representation of the stochastic solutions by evaluating the expansion on a subset of dominating gPC bases. Moreover, the proposed methods quantify the importance of the gPC bases in the probabilistic sense through inclusion probabilities. We design a Markov chain Monte Carlo (MCMC) sampler that evaluates all the unknown quantities without the need of ad-hoc techniques. The proposed methods are suitable for, but not restricted to, problems whose stochastic solutions are sparse in the stochastic space with respect to the gPC bases while the deterministic solver involved is expensive. We demonstrate the accuracy and performance of the proposed methods and make comparisons with other approaches on solving elliptic SPDEs with 1-, 14- and 40-random dimensions.

  16. Stochastic model simulation using Kronecker product analysis and Zassenhaus formula approximation.

    PubMed

    Caglar, Mehmet Umut; Pal, Ranadip

    2013-01-01

    Probabilistic Models are regularly applied in Genetic Regulatory Network modeling to capture the stochastic behavior observed in the generation of biological entities such as mRNA or proteins. Several approaches including Stochastic Master Equations and Probabilistic Boolean Networks have been proposed to model the stochastic behavior in genetic regulatory networks. It is generally accepted that Stochastic Master Equation is a fundamental model that can describe the system being investigated in fine detail, but the application of this model is computationally enormously expensive. On the other hand, Probabilistic Boolean Network captures only the coarse-scale stochastic properties of the system without modeling the detailed interactions. We propose a new approximation of the stochastic master equation model that is able to capture the finer details of the modeled system including bistabilities and oscillatory behavior, and yet has a significantly lower computational complexity. In this new method, we represent the system using tensors and derive an identity to exploit the sparse connectivity of regulatory targets for complexity reduction. The algorithm involves an approximation based on Zassenhaus formula to represent the exponential of a sum of matrices as product of matrices. We derive upper bounds on the expected error of the proposed model distribution as compared to the stochastic master equation model distribution. Simulation results of the application of the model to four different biological benchmark systems illustrate performance comparable to detailed stochastic master equation models but with considerably lower computational complexity. The results also demonstrate the reduced complexity of the new approach as compared to commonly used Stochastic Simulation Algorithm for equivalent accuracy.
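The Zassenhaus approximation referred to here truncates the expansion exp(A+B) = exp(A) exp(B) exp(-[A,B]/2) ... after the first commutator factor. A small numerical check (illustrative random matrices, with a Taylor-series matrix exponential to keep the sketch dependency-free) shows the commutator correction shrinking the splitting error:

```python
import numpy as np

def expm_taylor(M, terms=30):
    """Matrix exponential by truncated Taylor series; adequate for the
    small-norm matrices used here, not a general-purpose expm."""
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

rng = np.random.default_rng(4)
A = 0.1 * rng.normal(size=(4, 4))
B = 0.1 * rng.normal(size=(4, 4))
comm = A @ B - B @ A                      # the commutator [A, B]

exact = expm_taylor(A + B)
split1 = expm_taylor(A) @ expm_taylor(B)              # plain splitting
split2 = split1 @ expm_taylor(-0.5 * comm)            # Zassenhaus, 2nd order

err1 = np.linalg.norm(exact - split1)
err2 = np.linalg.norm(exact - split2)
print(err1, err2)   # the commutator factor reduces the error
```

This is the mechanism the abstract exploits: representing the exponential of a sum of (sparse) matrices as a product of simpler exponentials with controllable error.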

  17. Hybrid approaches for multiple-species stochastic reaction–diffusion models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spill, Fabian, E-mail: fspill@bu.edu; Department of Mechanical Engineering, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, MA 02139; Guerrero, Pilar

    2015-10-15

    Reaction–diffusion models are used to describe systems in fields as diverse as physics, chemistry, ecology and biology. The fundamental quantities in such models are individual entities such as atoms and molecules, bacteria, cells or animals, which move and/or react in a stochastic manner. If the number of entities is large, accounting for each individual is inefficient, and often partial differential equation (PDE) models are used in which the stochastic behaviour of individuals is replaced by a description of the averaged, or mean behaviour of the system. In some situations the number of individuals is large in certain regions and small in others. In such cases, a stochastic model may be inefficient in one region, and a PDE model inaccurate in another. To overcome this problem, we develop a scheme which couples a stochastic reaction–diffusion system in one part of the domain with its mean field analogue, i.e. a discretised PDE model, in the other part of the domain. The interface in between the two domains occupies exactly one lattice site and is chosen such that the mean field description is still accurate there. In this way errors due to the flux between the domains are small. Our scheme can account for multiple dynamic interfaces separating multiple stochastic and deterministic domains, and the coupling between the domains conserves the total number of particles. The method preserves stochastic features such as extinction not observable in the mean field description, and is significantly faster to simulate on a computer than the pure stochastic model. Highlights: • A novel hybrid stochastic/deterministic reaction–diffusion simulation method is given. • Can massively speed up stochastic simulations while preserving stochastic effects. • Can handle multiple reacting species. • Can handle moving boundaries.

  18. Stochastic Evolution of Augmented Born-Infeld Equations

    NASA Astrophysics Data System (ADS)

    Holm, Darryl D.

    2018-06-01

    This paper compares the results of applying a recently developed method of stochastic uncertainty quantification designed for fluid dynamics to the Born-Infeld model of nonlinear electromagnetism. The similarities in the results are striking. Namely, the introduction of Stratonovich cylindrical noise into each of their Hamiltonian formulations introduces stochastic Lie transport into their dynamics in the same form for both theories. Moreover, the resulting stochastic partial differential equations retain their unperturbed form, except for an additional term representing induced Lie transport by the set of divergence-free vector fields associated with the spatial correlations of the cylindrical noise. The explanation for this remarkable similarity lies in the method of construction of the Hamiltonian for the Stratonovich stochastic contribution to the motion in both cases, which is done via pairing spatial correlation eigenvectors for cylindrical noise with the momentum map for the deterministic motion. This momentum map is responsible for the well-known analogy between hydrodynamics and electromagnetism. The momentum map for the Maxwell and Born-Infeld theories of electromagnetism treated here is the 1-form density known as the Poynting vector. Two appendices treat the Hamiltonian structures underlying these results.

  19. Pair and triplet approximation of a spatial lattice population model with multiscale dispersal using Markov chains for estimating spatial autocorrelation.

    PubMed

    Hiebeler, David E; Millett, Nicholas E

    2011-06-21

    We investigate a spatial lattice model of a population employing dispersal to nearest and second-nearest neighbors, as well as long-distance dispersal across the landscape. The model is studied via stochastic spatial simulations, ordinary pair approximation, and triplet approximation. The latter method, which uses the probabilities of state configurations of contiguous blocks of three sites as its state variables, is demonstrated to be greatly superior to pair approximations for estimating spatial correlation information at various scales. Correlations between pairs of sites separated by arbitrary distances are estimated by constructing spatial Markov processes using the information from both approximations. These correlations demonstrate why pair approximation misses basic qualitative features of the model, such as decreasing population density as a large proportion of offspring are dropped on second-nearest neighbors, and why triplet approximation is able to include them. Analytical and numerical results show that, excluding long-distance dispersal, the initial growth rate of an invading population is maximized and the equilibrium population density is also roughly maximized when the population spreads its offspring evenly over nearest and second-nearest neighboring sites. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. GARCH modelling of covariance in dynamical estimation of inverse solutions

    NASA Astrophysics Data System (ADS)

    Galka, Andreas; Yamashita, Okito; Ozaki, Tohru

    2004-12-01

    The problem of estimating unobserved states of spatially extended dynamical systems poses an inverse problem, which can be solved approximately by a recently developed variant of Kalman filtering. In order to provide the model of the dynamics with more flexibility with respect to space and time, we suggest combining the concept of GARCH modelling of covariance, well known in econometrics, with Kalman filtering. We formulate this algorithm for spatiotemporal systems governed by stochastic diffusion equations and demonstrate its feasibility by presenting a numerical simulation designed to imitate the generation of electroencephalographic recordings by the human cortex.

  1. Stochastic simulations on a model of circadian rhythm generation.

    PubMed

    Miura, Shigehiro; Shimokawa, Tetsuya; Nomura, Taishin

    2008-01-01

    Biological phenomena are often modeled by differential equations, where states of a model system are described by continuous real values. When we consider concentrations of molecules as dynamical variables for a set of biochemical reactions, we implicitly assume that the numbers of molecules are large enough that their changes can be regarded as continuous and described deterministically. However, for a system with small numbers of molecules, changes in their numbers are distinctly discrete and molecular noise becomes significant. In such cases, models with deterministic differential equations may be inappropriate, and the reactions must be described by stochastic equations. In this study, we focus on clock gene expression in circadian rhythm generation, which is known to involve small numbers of molecules. It is therefore appropriate for the system to be modeled by stochastic equations and analyzed with stochastic simulation methodologies. The interlocked feedback model proposed by Ueda et al. as a set of deterministic ordinary differential equations provides the basis of our analyses. We apply two stochastic simulation methods, namely Gillespie's direct method and the stochastic differential equation method, also due to Gillespie, to the interlocked feedback model. To this end, we first reformulated the original differential equations back into elementary chemical reactions. With those reactions, we simulate and analyze the dynamics of the model using the two methods, in order to compare them with the dynamics obtained from the original deterministic model and to characterize how the dynamics depend on the simulation methodology.

  2. Model-assisted probability of detection of flaws in aluminum blocks using polynomial chaos expansions

    NASA Astrophysics Data System (ADS)

    Du, Xiaosong; Leifsson, Leifur; Grandin, Robert; Meeker, William; Roberts, Ronald; Song, Jiming

    2018-04-01

    Probability of detection (POD) is widely used for measuring the reliability of nondestructive testing (NDT) systems. Typically, POD is determined experimentally, but it can be enhanced by utilizing physics-based computational models in combination with model-assisted POD (MAPOD) methods. With the development of advanced physics-based methods, such as ultrasonic NDT, the empirical information needed for POD methods can be reduced. However, performing accurate numerical simulations can be prohibitively time-consuming, especially as part of a stochastic analysis. In this work, stochastic surrogate models for computational physics-based measurement simulations are developed for cost savings of MAPOD methods while simultaneously ensuring sufficient accuracy. The stochastic surrogate is used to propagate the random input variables through the physics-based simulation model to obtain the joint probability distribution of the output. The POD curves are then generated based on those results. Here, the stochastic surrogates are constructed using non-intrusive polynomial chaos (NIPC) expansions. In particular, the NIPC methods used are the quadrature, ordinary least-squares (OLS), and least-angle regression sparse (LARS) techniques. The proposed approach is demonstrated on the ultrasonic testing simulation of a flat-bottom-hole flaw in an aluminum block. The results show that the stochastic surrogates converge on the statistics at least two orders of magnitude faster than direct Monte Carlo sampling (MCS). Moreover, evaluating the stochastic surrogate models is over three orders of magnitude faster than the underlying simulation model for this case, the UTSim2 model.
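The ordinary least-squares variant of NIPC can be sketched in one dimension: sample the random input, evaluate a model (here a known, noiseless polynomial, purely for illustration), and regress on probabilists' Hermite polynomials to recover the PC coefficients.

```python
import numpy as np

def hermite_design(xi, order):
    """Probabilists' Hermite polynomials He_0..He_order, built with the
    recurrence He_{k+1}(x) = x He_k(x) - k He_{k-1}(x)."""
    H = np.zeros((len(xi), order + 1))
    H[:, 0] = 1.0
    if order >= 1:
        H[:, 1] = xi
    for k in range(1, order):
        H[:, k + 1] = xi * H[:, k] - k * H[:, k - 1]
    return H

rng = np.random.default_rng(5)
xi = rng.normal(size=200)                    # samples of the random input
y = 1.0 + 2.0 * xi + 0.5 * (xi**2 - 1.0)     # "model output" = 1 + 2 He1 + 0.5 He2

H = hermite_design(xi, order=3)
coef, *_ = np.linalg.lstsq(H, y, rcond=None)  # OLS estimate of PC coefficients
print(np.round(coef, 6))   # approximately [1, 2, 0.5, 0]
```

Once the coefficients are known, output statistics follow cheaply from orthogonality (the mean is coef[0], and the variance is the weighted sum of the squared higher coefficients), which is what makes the surrogate orders of magnitude faster than re-running the simulator.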

  3. Combining cellular automata and Lattice Boltzmann method to model multiscale avascular tumor growth coupled with nutrient diffusion and immune competition.

    PubMed

    Alemani, Davide; Pappalardo, Francesco; Pennisi, Marzio; Motta, Santo; Brusic, Vladimir

    2012-02-28

    In recent decades the Lattice Boltzmann method (LB) has been successfully used to simulate a variety of processes. The LB model describes the microscopic processes occurring at the cellular level and the macroscopic processes occurring at the continuum level with a single function, the probability distribution function. Recently, attempts have been made to couple deterministic approaches with probabilistic cellular automata (probabilistic CA) methods in order to model the temporal evolution of tumor growth and its three-dimensional spatial evolution, yielding hybrid methodologies. Despite the good results attained by CA-PDE methods, one important issue has not been completely solved: the intrinsic stochastic nature of the interactions at the interface between the cellular (microscopic) and continuum (macroscopic) levels. CA methods are able to cope with stochastic phenomena because of their probabilistic nature, while PDE methods are fully deterministic. Even if the coupling is mathematically correct, there could be important statistical effects that the PDE approach misses. For this reason, to develop and manage a model that takes into account all three levels of complexity (cellular, molecular, and continuum), we believe that the PDE should be replaced with a statistical, stochastic model based on the numerical discretization of the Boltzmann equation: the Lattice Boltzmann (LB) method. In this work we introduce a new hybrid method to simulate tumor growth and the immune system by applying a Cellular Automata-Lattice Boltzmann (CA-LB) approach. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Fast and Precise Emulation of Stochastic Biochemical Reaction Networks With Amplified Thermal Noise in Silicon Chips.

    PubMed

    Kim, Jaewook; Woo, Sung Sik; Sarpeshkar, Rahul

    2018-04-01

    The analysis and simulation of complex interacting biochemical reaction pathways in cells is important in all of systems biology and medicine. Yet, the dynamics of even a modest number of noisy or stochastic coupled biochemical reactions is extremely time consuming to simulate. In large part, this is because of the expensive cost of random number and Poisson process generation and the presence of stiff, coupled, nonlinear differential equations. Here, we demonstrate that we can amplify inherent thermal noise in chips to emulate randomness physically, thus alleviating these costs significantly. Concurrently, molecular flux in thermodynamic biochemical reactions maps to thermodynamic electronic current in a transistor such that stiff nonlinear biochemical differential equations are emulated exactly in compact, digitally programmable, highly parallel analog "cytomorphic" transistor circuits. For even small-scale systems involving just 80 stochastic reactions, our 0.35-μm BiCMOS chips yield a 311× speedup in the simulation time of Gillespie's stochastic algorithm over COPASI, a fast biochemical-reaction software simulator that is widely used in computational biology; they yield a 15 500× speedup over equivalent MATLAB stochastic simulations. The chip emulation results are consistent with these software simulations over a large range of signal-to-noise ratios. Most importantly, our physical emulation of Poisson chemical dynamics does not involve any inherently sequential processes and updates such that, unlike prior exact simulation approaches, they are parallelizable, asynchronous, and enable even more speedup for larger-size networks.

  5. Gryphon: A Hybrid Agent-Based Modeling and Simulation Platform for Infectious Diseases

    NASA Astrophysics Data System (ADS)

    Yu, Bin; Wang, Jijun; McGowan, Michael; Vaidyanathan, Ganesh; Younger, Kristofer

    In this paper we present Gryphon, a hybrid agent-based stochastic modeling and simulation platform developed for characterizing the geographic spread of infectious diseases and the effects of interventions. We study both local and non-local transmission dynamics of stochastic simulations based on the published parameters and data for SARS. The results suggest that the expected numbers of infections and the timeline of control strategies predicted by our stochastic model are in reasonably good agreement with previous studies. These preliminary results indicate that Gryphon is able to characterize other future infectious diseases and identify endangered regions in advance.

  6. Effect of monthly areal rainfall uncertainty on streamflow simulation

    NASA Astrophysics Data System (ADS)

    Ndiritu, J. G.; Mkhize, N.

    2017-08-01

    Areal rainfall is mostly obtained from point rainfall measurements that are sparsely located, and several studies have shown that this results in large areal rainfall uncertainties at the daily time step. However, water resources assessment is often carried out at a monthly time step, and streamflow simulation is usually an essential component of this assessment. This study set out to quantify monthly areal rainfall uncertainties and assess their effect on streamflow simulation. This was achieved by: (i) quantifying areal rainfall uncertainties and using these to generate stochastic monthly areal rainfalls, and (ii) finding out how the quality of monthly streamflow simulation and streamflow variability change if stochastic areal rainfalls are used instead of historic areal rainfalls. Tests on monthly rainfall uncertainty were carried out using data from two South African catchments, while streamflow simulation was confined to one of them. A non-parametric model that had been applied at a daily time step was used for stochastic areal rainfall generation, and the Pitman catchment model, calibrated using the SCE-UA optimizer, was used for streamflow simulation. One hundred randomly initialised calibration-validation runs using 100 stochastic areal rainfalls were compared with 100 runs obtained using the single historic areal rainfall series. By using 4 rain gauges alternately to obtain areal rainfall, the resulting differences in areal rainfall averaged 20% of the mean monthly areal rainfall; rainfall uncertainty was therefore highly significant. Pitman model simulations obtained coefficients of efficiency averaging 0.66 and 0.64 in calibration and validation using historic rainfalls, while the respective values using stochastic areal rainfalls were 0.59 and 0.57. Average bias was less than 5% in all cases. The streamflow ranges using historic rainfalls averaged 29% of the mean naturalised flow in calibration and validation, while the respective average ranges using stochastic monthly rainfalls were 86% and 90% of the mean naturalised streamflow. In calibration, 33% of the naturalised flows were located within the simulated streamflow ranges using historic rainfalls, and using stochastic rainfalls increased this to 66%; in validation the respective percentages were 32% and 72%. The analysis reveals that monthly areal rainfall uncertainty is significant, and incorporating it into streamflow simulation would add validity to the results.
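The coefficient of efficiency used to score the Pitman model calibrations above is the Nash-Sutcliffe measure; a minimal sketch follows, with made-up flow values for illustration.

```python
import numpy as np

def nash_sutcliffe(sim, obs):
    """Coefficient of efficiency (Nash-Sutcliffe): 1.0 is a perfect fit and
    0.0 means the simulation is no better than predicting the observed
    mean."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Made-up monthly flows, for illustration only
obs = np.array([3.0, 5.0, 9.0, 4.0, 6.0])
perfect = nash_sutcliffe(obs, obs)                      # exact fit
baseline = nash_sutcliffe(np.full(5, obs.mean()), obs)  # mean-only predictor
```

Values of roughly 0.6 and above, as reported for the calibrations in the abstract, are conventionally read as acceptable monthly-flow fits.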

  7. Conceptual Issues in Quantifying Unusualness and Conceiving Stochastic Experiments: Insights from Students' Experiences in Designing Sampling Simulations

    ERIC Educational Resources Information Center

    Saldanha, Luis

    2016-01-01

    This article reports on a classroom teaching experiment that engaged a group of high school students in designing sampling simulations within a computer microworld. The simulation-design activities aimed to foster students' abilities to conceive of contextual situations as stochastic experiments, and to engage them with the logic of hypothesis…

  8. Modeling of stochastic motion of bacteria propelled spherical microbeads

    NASA Astrophysics Data System (ADS)

    Arabagi, Veaceslav; Behkam, Bahareh; Cheung, Eugene; Sitti, Metin

    2011-06-01

    This work proposes a stochastic dynamic model of bacteria-propelled spherical microbeads as potential swimming microrobotic bodies. Small numbers of S. marcescens bacteria are attached by their bodies to the surfaces of spherical microbeads. The average-behavior models normally adopted when studying such biological systems are generally not effective for cases in which a small number of agents interact in a complex manner; hence, a stochastic model is proposed to simulate the behavior of 8-41 bacteria assembled on a curved surface. The flexibility of the flagellar hook is studied by comparing simulated and experimental results for scenarios of increasing bead size and increasing numbers of attached bacteria. Although more experimental data are required to determine an exact flagellar hook stiffness value, the results examined favor a stiffer flagellum. The stochastic model is intended to be used as a design and simulation tool for future targeted drug delivery and disease diagnosis applications of bacteria-propelled microrobots.

  9. Stochasticity in numerical solutions of the nonlinear Schroedinger equation

    NASA Technical Reports Server (NTRS)

    Shen, Mei-Mei; Nicholson, D. R.

    1987-01-01

    The cubically nonlinear Schroedinger equation is an important model of nonlinear phenomena in fluids and plasmas. Numerical solutions in a spatially periodic system commonly involve truncation to a finite number of Fourier modes. These solutions are found to be stochastic in the sense that the largest Liapunov exponent is positive. As the number of modes is increased, the size of this exponent appears to converge to zero, in agreement with the recent demonstration of the integrability of the spatially periodic case.
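The positive-largest-Lyapunov-exponent criterion for stochasticity invoked above can be illustrated with a generic sketch; for brevity this uses the logistic map, whose exponent is known analytically, rather than a truncated nonlinear Schroedinger mode system.

```python
import math

def lyapunov_logistic(r, x0=0.3, n_transient=1000, n_iter=50000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of log|f'(x)| = log|r*(1-2x)|."""
    x = x0
    for _ in range(n_transient):       # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1.0 - x)
        total += math.log(abs(r * (1.0 - 2.0 * x)))
    return total / n_iter

# At r = 4 the exponent is known analytically to equal ln 2 ~ 0.693; a
# positive value signals chaos in the same sense used in this abstract.
lam = lyapunov_logistic(4.0)
```

For flows and mode-truncated PDEs the same idea applies, but the exponent is usually estimated from the divergence of nearby trajectories rather than from a closed-form derivative.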

  10. Performance Evaluation of 18F Radioluminescence Microscopy Using Computational Simulation

    PubMed Central

    Wang, Qian; Sengupta, Debanti; Kim, Tae Jin; Pratx, Guillem

    2017-01-01

    Purpose Radioluminescence microscopy can visualize the distribution of beta-emitting radiotracers in live single cells with high resolution. Here, we perform a computational simulation of 18F positron imaging using this modality to better understand how radioluminescence signals are formed and to assist in optimizing the experimental setup and image processing. Methods First, the transport of charged particles through the cell and scintillator and the resulting scintillation is modeled using the GEANT4 Monte-Carlo simulation. Then, the propagation of the scintillation light through the microscope is modeled by a convolution with a depth-dependent point-spread function, which models the microscope response. Finally, the physical measurement of the scintillation light using an electron-multiplying charge-coupled device (EMCCD) camera is modeled using a stochastic numerical photosensor model, which accounts for various sources of noise. The simulated output of the EMCCD camera is further processed using our ORBIT image reconstruction methodology to evaluate the endpoint images. Results The EMCCD camera model was validated against experimentally acquired images and the simulated noise, as measured by the standard deviation of a blank image, was found to be accurate within 2% of the actual detection. Furthermore, point-source simulations found that a reconstructed spatial resolution of 18.5 μm can be achieved near the scintillator. As the source is moved away from the scintillator, spatial resolution degrades at a rate of 3.5 μm per μm distance. These results agree well with the experimentally measured spatial resolution of 30–40 μm (live cells). The simulation also shows that the system sensitivity is 26.5%, which is also consistent with our previous experiments. Finally, an image of a simulated sparse set of single cells is visually similar to the measured cell image. 
Conclusions Our simulation methodology agrees with experimental measurements taken with radioluminescence microscopy. This in silico approach can be used to guide further instrumentation developments and to provide a framework for improving image reconstruction. PMID:28273348

  11. Bayesian parameter estimation for stochastic models of biological cell migration

    NASA Astrophysics Data System (ADS)

    Dieterich, Peter; Preuss, Roland

    2013-08-01

    Cell migration plays an essential role under many physiological and patho-physiological conditions. It is of major importance during embryonic development and wound healing. In contrast, it also generates negative effects during inflammation processes, the transmigration of tumors, and the formation of metastases. Thus, a reliable quantification and characterization of cell paths could give insight into the dynamics of these processes. Typically, stochastic models are applied, with parameters extracted by fitting the models to the so-called mean square displacement of the observed cell group. We show that this approach has several disadvantages and problems. Therefore, we propose a simple procedure relying directly on the positions of the cell's trajectory and the covariance matrix of the positions. It is shown that the covariance is identical to the spatial aging correlation function for the supposed linear Gaussian models of Brownian motion with drift and fractional Brownian motion. The technique is applied and illustrated with simulated data, showing reliable parameter estimation from single cell paths.
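A simplified sketch of estimating drift and diffusion directly from trajectory positions, in the spirit of working with the positions themselves rather than the mean square displacement; the increment-based estimators here are a textbook simplification, not the paper's full Bayesian covariance procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, n, v_true, D_true = 0.1, 5000, 1.5, 0.4

# Simulate 1-D Brownian motion with drift: dx = v*dt + sqrt(2*D*dt)*N(0,1)
steps = v_true * dt + np.sqrt(2.0 * D_true * dt) * rng.standard_normal(n)
x = np.concatenate(([0.0], np.cumsum(steps)))

# Estimate v and D from the increments of a single path
dx = np.diff(x)
v_hat = dx.mean() / dt                # drift: mean increment per unit time
D_hat = dx.var(ddof=1) / (2.0 * dt)   # diffusion: increment variance / 2*dt
```

For fractional Brownian motion the increments are correlated, which is exactly why the paper works with the full covariance matrix instead of these independent-increment formulas.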

  12. Fourier-interpolation superresolution optical fluctuation imaging (fSOFi) (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Enderlein, Joerg; Stein, Simon C.; Huss, Anja; Hähnel, Dirk; Gregor, Ingo

    2016-02-01

    Stochastic Optical Fluctuation Imaging (SOFI) is a superresolution fluorescence microscopy technique that enhances the spatial resolution of an image by evaluating the temporal fluctuations of blinking fluorescent emitters. SOFI is not based on the identification and localization of single molecules, as in the widely used Photoactivation Localization Microscopy (PALM) or Stochastic Optical Reconstruction Microscopy (STORM), but computes a superresolved image via temporal cumulants from a recorded movie. A technical challenge is that, when the SOFI algorithm is applied directly to a movie of raw images, the pixel size of the final SOFI image is the same as that of the original images, which becomes problematic when the final SOFI resolution is much smaller than this value. In the past, sophisticated cross-correlation schemes have been used to tackle this problem. Here, we present an alternative, exact, straightforward, and simple solution using an interpolation scheme based on Fourier transforms. We exemplify the method on simulated and experimental data.
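The Fourier-interpolation idea can be sketched as zero-padding the image spectrum; this sketch upsamples a single image, whereas fSOFI applies the interpolation within the cumulant computation on a whole movie.

```python
import numpy as np

def fourier_interpolate(img, factor):
    """Upsample a 2-D image by zero-padding its centered Fourier spectrum,
    i.e. exact trigonometric (sinc) interpolation for band-limited data."""
    ny, nx = img.shape
    spec = np.fft.fftshift(np.fft.fft2(img))
    py, px = (factor - 1) * ny // 2, (factor - 1) * nx // 2
    padded = np.pad(spec, ((py, py), (px, px)))
    # Renormalize so intensities are preserved at the original pixels
    return np.fft.ifft2(np.fft.ifftshift(padded)).real * factor ** 2

# Band-limited test image: the original pixel values are reproduced exactly
# on the coarse sub-grid of the upsampled image
y, x = np.mgrid[0:16, 0:16]
img = np.cos(2 * np.pi * x / 16) + np.sin(2 * np.pi * y / 16)
up = fourier_interpolate(img, 2)
```

Because a microscope image is band-limited by the optical transfer function, this interpolation adds no information but places the cumulant computation on a grid fine enough for the improved SOFI resolution.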

  13. Neutral Community Dynamics and the Evolution of Species Interactions.

    PubMed

    Coelho, Marco Túlio P; Rangel, Thiago F

    2018-04-01

    A contemporary goal in ecology is to determine the ecological and evolutionary processes that generate recurring structural patterns in mutualistic networks. One of the great challenges is testing the capacity of neutral processes to replicate observed patterns in ecological networks, since the original formulation of the neutral theory lacks trophic interactions. Here, we develop a stochastic-simulation neutral model that adds trophic interactions to the neutral theory of biodiversity. Without invoking ecological differences among individuals of different species, and assuming that ecological interactions emerge randomly, we demonstrate that a spatially explicit multitrophic neutral model is able to capture the recurrent structural patterns of mutualistic networks (i.e., degree distribution, connectance, nestedness, and phylogenetic signal of species interactions). Nonrandom species distributions, caused by probabilistic events of migration and speciation, create nonrandom network patterns. These findings have broad implications for the interpretation of niche-based processes as drivers of ecological networks, as well as for the integration of network structures with demographic stochasticity.

  14. Metallic-thin-film instability with spatially correlated thermal noise.

    PubMed

    Diez, Javier A; González, Alejandro G; Fernández, Roberto

    2016-01-01

    We study the effects of stochastic thermal fluctuations on the instability of the free surface of a flat liquid metallic film on a solid substrate. These fluctuations are represented by a stochastic noise term added to the deterministic equation for the film thickness within the long-wave approximation. Unlike the case of polymeric films, we find that this noise, while remaining white in time, must be colored in space, at least in some regimes. The corresponding noise term is characterized by a nonzero correlation length, ℓc, which, combined with the size of the system, leads to a dimensionless parameter β that accounts for the relative importance of the spatial correlation (β ∼ ℓc⁻¹). We perform the linear stability analysis (LSA) of the film both with and without the noise term and find that for ℓc larger than some critical value (depending on the system size), the wavelength of the peak of the spectrum is larger than that corresponding to the deterministic case, while for smaller ℓc this peak corresponds to smaller wavelength than the latter. Interestingly, whatever the value of ℓc, the peak always approaches the deterministic one for larger times. We compare LSA results with the numerical simulations of the complete nonlinear problem and find a good agreement in the power spectra for early times at different values of β. For late times, we find that the stochastic LSA predicts well the position of the dominant wavelength, showing that nonlinear interactions do not modify the trends of the early linear stages. Finally, we fit the theoretical spectra to experimental data from a nanometric laser-melted copper film and find that at later times, the adjustment requires smaller values of β (larger space correlations).
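One common way to generate a spatially correlated Gaussian noise term with a prescribed correlation length is to filter white noise in Fourier space; the Gaussian filter shape below is an illustrative choice, not necessarily the correlation model used in the paper.

```python
import numpy as np

def correlated_noise(n, dx, lc, rng):
    """Zero-mean Gaussian noise on a periodic 1-D grid of spacing dx,
    colored in space by a Gaussian spectral filter with correlation length
    lc (white noise is recovered as lc -> 0)."""
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
    filt = np.exp(-(k * lc) ** 2 / 4.0)       # Gaussian low-pass in k
    white = rng.standard_normal(n)
    return np.fft.ifft(np.fft.fft(white) * filt).real

# One spatial noise sample; in a time-dependent simulation a fresh,
# independent sample would be drawn at every time step (white in time)
eta = correlated_noise(4096, dx=1.0, lc=8.0, rng=np.random.default_rng(3))
```

Neighboring grid points are strongly correlated while points separated by many correlation lengths are essentially independent, which is the property the dimensionless parameter β controls relative to the system size.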

  15. Metallic-thin-film instability with spatially correlated thermal noise

    NASA Astrophysics Data System (ADS)

    Diez, Javier A.; González, Alejandro G.; Fernández, Roberto

    2016-01-01

    We study the effects of stochastic thermal fluctuations on the instability of the free surface of a flat liquid metallic film on a solid substrate. These fluctuations are represented by a stochastic noise term added to the deterministic equation for the film thickness within the long-wave approximation. Unlike the case of polymeric films, we find that this noise, while remaining white in time, must be colored in space, at least in some regimes. The corresponding noise term is characterized by a nonzero correlation length, ℓc, which, combined with the size of the system, leads to a dimensionless parameter β that accounts for the relative importance of the spatial correlation (β ∼ ℓc⁻¹). We perform the linear stability analysis (LSA) of the film both with and without the noise term and find that for ℓc larger than some critical value (depending on the system size), the wavelength of the peak of the spectrum is larger than that corresponding to the deterministic case, while for smaller ℓc this peak corresponds to smaller wavelength than the latter. Interestingly, whatever the value of ℓc, the peak always approaches the deterministic one for larger times. We compare LSA results with the numerical simulations of the complete nonlinear problem and find a good agreement in the power spectra for early times at different values of β. For late times, we find that the stochastic LSA predicts well the position of the dominant wavelength, showing that nonlinear interactions do not modify the trends of the early linear stages. Finally, we fit the theoretical spectra to experimental data from a nanometric laser-melted copper film and find that at later times, the adjustment requires smaller values of β (larger space correlations).

  16. Stochastic estimation of human arm impedance under nonlinear friction in robot joints: a model study.

    PubMed

    Chang, Pyung Hun; Kang, Sang Hoon

    2010-05-30

    The basic assumption of stochastic human arm impedance estimation methods is that the human arm and robot behave linearly for small perturbations. In the present work, we have identified the degree of influence of nonlinear friction in robot joints on stochastic human arm impedance estimation. Internal model based impedance control (IMBIC) is then proposed as a means to make the estimation accurate by compensating for the nonlinear friction. From simulations with a nonlinear LuGre friction model, it is observed that the reliability and accuracy of the estimation are severely degraded by nonlinear friction: below 2 Hz, multiple and partial coherence functions are far less than unity; estimated magnitudes and phases deviate severely from those of a real human arm throughout the frequency range of interest; and the accuracy is not enhanced by increasing the magnitude of the force perturbations. In contrast, the combined use of stochastic estimation and IMBIC provides accurate estimation results even with large friction: the multiple coherence functions are larger than 0.9 throughout the frequency range of interest, and the estimated magnitudes and phases match those of a real human arm well. Furthermore, the performance of the suggested method is independent of human arm and robot posture, and of human arm impedance. Therefore, IMBIC will be useful for measuring human arm impedance with a conventional robot, as well as for designing a spatial impedance measuring robot, which requires gearing. (c) 2010 Elsevier B.V. All rights reserved.

  17. Discrete stochastic simulation methods for chemically reacting systems.

    PubMed

    Cao, Yang; Samuels, David C

    2009-01-01

    Discrete stochastic chemical kinetics describe the time evolution of a chemically reacting system by taking into account the fact that, in reality, chemical species are present in integer populations and exhibit some degree of randomness in their dynamical behavior. In recent years, with the development of new techniques to study biochemical dynamics in single cells, an increasing number of studies have applied this approach to chemical kinetics in cellular systems, where the small copy numbers of some reactant species may lead to deviations from the predictions of the deterministic differential equations of classical chemical kinetics. This chapter reviews the fundamental theory of stochastic chemical kinetics and several simulation methods based on that theory. We focus on nonstiff biochemical systems and the two most important discrete stochastic simulation methods: Gillespie's stochastic simulation algorithm (SSA) and the tau-leaping method. Different implementation strategies for these two methods are discussed. We then recommend a relatively simple and efficient strategy that combines the strengths of the two methods: the hybrid SSA/tau-leaping method. The implementation details of the hybrid strategy are given here, and a related software package is introduced. Finally, the hybrid method is applied to simple biochemical systems as a demonstration of its application.
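To complement the SSA description, a minimal fixed-step tau-leaping sketch follows; production implementations select tau adaptively and handle potential negative populations more carefully than the crude clamp used here, as the chapter discusses.

```python
import numpy as np

def tau_leap(x0, stoich, propensity, t_end, tau, rng):
    """Fixed-step tau-leaping: in each step of length tau, every reaction
    fires a Poisson(a_j * tau) number of times."""
    x = np.array(x0, dtype=float)
    t = 0.0
    while t < t_end:
        counts = rng.poisson(propensity(x) * tau)  # firings per reaction
        x = np.maximum(x + counts @ stoich, 0.0)   # crude negativity guard
        t += tau
    return x

# Birth-death test system: 0 -> X at rate 10, X -> 0 at rate 0.1*X,
# stationary mean ~100; average 50 end states as a sanity check
stoich = np.array([[1.0], [-1.0]])
prop = lambda x: np.array([10.0, 0.1 * x[0]])
rng = np.random.default_rng(4)
finals = [tau_leap([0.0], stoich, prop, 100.0, 0.05, rng)[0]
          for _ in range(50)]
```

The leap condition requires that propensities change little over tau; when that fails for some reactions, the hybrid SSA/tau-leaping strategy recommended in the chapter falls back to exact SSA steps for them.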

  18. Agent-based model of angiogenesis simulates capillary sprout initiation in multicellular networks

    PubMed Central

    Walpole, J.; Chappell, J.C.; Cluceru, J.G.; Mac Gabhann, F.; Bautch, V.L.; Peirce, S. M.

    2015-01-01

    Many biological processes are controlled by both deterministic and stochastic influences. However, efforts to model these systems often rely on either purely stochastic or purely rule-based methods. To better understand the balance between stochasticity and determinism in biological processes, a computational approach that incorporates both influences may afford additional insight into underlying biological mechanisms that give rise to emergent system properties. We apply a combined approach to the simulation and study of angiogenesis, the growth of new blood vessels from existing networks. This complex multicellular process begins with selection of an initiating endothelial cell, or tip cell, which sprouts from the parent vessels in response to stimulation by exogenous cues. We have constructed an agent-based model of sprouting angiogenesis to evaluate endothelial cell sprout initiation frequency and location, and we have experimentally validated it using high-resolution time-lapse confocal microscopy. ABM simulations were then compared to a Monte Carlo model, revealing that purely stochastic simulations could not generate sprout locations as accurately as the rule-informed agent-based model. These findings support the use of rule-based approaches for modeling the complex mechanisms underlying sprouting angiogenesis over purely stochastic methods. PMID:26158406

  19. Agent-based model of angiogenesis simulates capillary sprout initiation in multicellular networks.

    PubMed

    Walpole, J; Chappell, J C; Cluceru, J G; Mac Gabhann, F; Bautch, V L; Peirce, S M

    2015-09-01

    Many biological processes are controlled by both deterministic and stochastic influences. However, efforts to model these systems often rely on either purely stochastic or purely rule-based methods. To better understand the balance between stochasticity and determinism in biological processes, a computational approach that incorporates both influences may afford additional insight into underlying biological mechanisms that give rise to emergent system properties. We apply a combined approach to the simulation and study of angiogenesis, the growth of new blood vessels from existing networks. This complex multicellular process begins with selection of an initiating endothelial cell, or tip cell, which sprouts from the parent vessels in response to stimulation by exogenous cues. We have constructed an agent-based model of sprouting angiogenesis to evaluate endothelial cell sprout initiation frequency and location, and we have experimentally validated it using high-resolution time-lapse confocal microscopy. ABM simulations were then compared to a Monte Carlo model, revealing that purely stochastic simulations could not generate sprout locations as accurately as the rule-informed agent-based model. These findings support the use of rule-based approaches for modeling the complex mechanisms underlying sprouting angiogenesis over purely stochastic methods.

  20. Adaptive hybrid simulations for multiscale stochastic reaction networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hepp, Benjamin; Gupta, Ankit; Khammash, Mustafa

    2015-01-21

    The probability distribution describing the state of a Stochastic Reaction Network (SRN) evolves according to the Chemical Master Equation (CME). It is common to estimate its solution using Monte Carlo methods such as the Stochastic Simulation Algorithm (SSA). In many cases, these simulations can take an impractical amount of computational time. Therefore, many methods have been developed that approximate sample paths of the underlying stochastic process and estimate the solution of the CME. A prominent class of these methods includes hybrid methods that partition the set of species and the set of reactions into discrete and continuous subsets. Such a partition separates the dynamics into a discrete and a continuous part. Simulating such a stochastic process can be computationally much easier than simulating the exact discrete stochastic process with SSA. Moreover, the quasi-stationary assumption to approximate the dynamics of fast subnetworks can be applied for certain classes of networks. However, as the dynamics of an SRN evolves, these partitions may have to be adapted during the simulation. We develop a hybrid method that approximates the solution of a CME by automatically partitioning the reactions and species sets into discrete and continuous components and applying the quasi-stationary assumption on identifiable fast subnetworks. Our method does not require any user intervention and it adapts to exploit the changing timescale separation between reactions and/or changing magnitudes of copy-numbers of constituent species. We demonstrate the efficiency of the proposed method by considering examples from systems biology and showing that very good approximations to the exact probability distributions can be achieved in significantly less computational time. This is especially the case for systems with oscillatory dynamics, where the system dynamics change considerably throughout the time-period of interest.

  1. Adaptive hybrid simulations for multiscale stochastic reaction networks.

    PubMed

    Hepp, Benjamin; Gupta, Ankit; Khammash, Mustafa

    2015-01-21

    The probability distribution describing the state of a Stochastic Reaction Network (SRN) evolves according to the Chemical Master Equation (CME). It is common to estimate its solution using Monte Carlo methods such as the Stochastic Simulation Algorithm (SSA). In many cases, these simulations can take an impractical amount of computational time. Therefore, many methods have been developed that approximate sample paths of the underlying stochastic process and estimate the solution of the CME. A prominent class of these methods includes hybrid methods that partition the set of species and the set of reactions into discrete and continuous subsets. Such a partition separates the dynamics into a discrete and a continuous part. Simulating such a stochastic process can be computationally much easier than simulating the exact discrete stochastic process with SSA. Moreover, the quasi-stationary assumption to approximate the dynamics of fast subnetworks can be applied for certain classes of networks. However, as the dynamics of an SRN evolves, these partitions may have to be adapted during the simulation. We develop a hybrid method that approximates the solution of a CME by automatically partitioning the reactions and species sets into discrete and continuous components and applying the quasi-stationary assumption on identifiable fast subnetworks. Our method does not require any user intervention and it adapts to exploit the changing timescale separation between reactions and/or changing magnitudes of copy-numbers of constituent species. We demonstrate the efficiency of the proposed method by considering examples from systems biology and showing that very good approximations to the exact probability distributions can be achieved in significantly less computational time. This is especially the case for systems with oscillatory dynamics, where the system dynamics change considerably throughout the time-period of interest.
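A toy illustration of the kind of propensity/copy-number partitioning rule hybrid methods rely on; the function name, thresholds, and the rule itself are illustrative placeholders, not the adaptive criterion developed in the paper.

```python
import numpy as np

def partition_reactions(propensities, copy_numbers, stoich,
                        a_crit=10.0, n_crit=50):
    """Toy partitioning rule: treat a reaction as 'continuous' (suitable
    for an ODE/Langevin approximation) only if its propensity is large AND
    every species it changes is abundant; otherwise keep it 'discrete'
    (SSA-like). Thresholds a_crit and n_crit are illustrative."""
    fast = propensities >= a_crit
    abundant = np.array([
        all(copy_numbers[s] >= n_crit for s in np.nonzero(stoich[j])[0])
        for j in range(stoich.shape[0])
    ])
    continuous = fast & abundant
    return np.nonzero(continuous)[0], np.nonzero(~continuous)[0]

# Two reactions: fast dimerization of an abundant species (continuous
# candidate) vs slow production of a rare species (stays discrete)
stoich = np.array([[-2, 0],    # 2 A -> 0
                   [0, 1]])    # 0 -> M
cont, disc = partition_reactions(np.array([500.0, 0.5]),
                                 np.array([1000, 3]), stoich)
```

Because propensities and copy numbers drift during a run, such a classification must be re-evaluated as the simulation proceeds, which is the adaptivity the paper automates.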

  2. Machine learning from computer simulations with applications in rail vehicle dynamics

    NASA Astrophysics Data System (ADS)

    Taheri, Mehdi; Ahmadian, Mehdi

    2016-05-01

    The application of stochastic modelling to learning the behaviour of multibody dynamics (MBD) models is investigated. Post-processed data from a simulation run are used to train a stochastic model that estimates the relationship between model inputs (suspension relative displacement and velocity) and the output (sum of suspension forces). The stochastic model can be used to reduce the computational burden of the MBD model by replacing a computationally expensive subsystem (the suspension subsystem). With minor changes, the stochastic modelling technique is able to learn the behaviour of a physical system and integrate that behaviour within MBD models. The technique is highly advantageous for MBD models where real-time simulations are necessary, or for models that have a large number of repeated substructures, e.g. modelling a train with a large number of railcars. Because the training data are acquired before the stochastic model is developed, conventional sampling-plan strategies such as Latin hypercube sampling, in which simulations are performed using the inputs dictated by the sampling plan, cannot be used. Since the sampling plan greatly influences the overall accuracy and efficiency of the stochastic predictions, a sampling plan suitable for this process is developed, in which the most space-filling subset of the acquired data with ? number of sample points that best describes the dynamic behaviour of the system under study is selected as the training data.

  3. Stochastic Investigation of Natural Frequency for Functionally Graded Plates

    NASA Astrophysics Data System (ADS)

    Karsh, P. K.; Mukhopadhyay, T.; Dey, S.

    2018-03-01

    This paper presents the stochastic natural frequency analysis of functionally graded material (FGM) plates by applying an artificial neural network (ANN) approach. Latin hypercube sampling is utilised to train the ANN model. The proposed algorithm for stochastic natural frequency analysis of FGM plates is validated and verified against the original finite element method and Monte Carlo simulation (MCS). The combined stochastic variation of input parameters such as elastic modulus, shear modulus, Poisson's ratio, and mass density is considered. A power law is applied to distribute the material properties across the thickness. The present ANN model reduces the required sample size and is found to be computationally efficient compared to conventional Monte Carlo simulation.
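
    Latin hypercube sampling of the kind used to train such a surrogate can be sketched as follows; the four dimensions stand in for elastic modulus, shear modulus, Poisson's ratio and mass density rescaled to [0, 1], and the sketch is generic rather than the authors' implementation:

```python
import random

def latin_hypercube(n_samples, n_dims, rng):
    """Place exactly one random point in each of n_samples equal-probability
    strata per dimension, with strata paired by a random permutation."""
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        perm = list(range(n_samples))
        rng.shuffle(perm)
        for i in range(n_samples):
            samples[i][d] = (perm[i] + rng.random()) / n_samples
    return samples

rng = random.Random(0)
X = latin_hypercube(10, 4, rng)   # 10 training points in 4 stochastic input dimensions
```

    Because every one-dimensional stratum is hit exactly once, far fewer samples are needed to cover the joint input variation than with plain Monte Carlo sampling, which is the source of the reported efficiency gain.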

  4. An efficient hybrid method for stochastic reaction-diffusion biochemical systems with delay

    NASA Astrophysics Data System (ADS)

    Sayyidmousavi, Alireza; Ilie, Silvana

    2017-12-01

    Many chemical reactions, such as gene transcription and translation in living cells, need a certain time to finish once they are initiated. Simulating stochastic models of reaction-diffusion systems with delay can be computationally expensive. In the present paper, a novel hybrid algorithm is proposed to accelerate the stochastic simulation of delayed reaction-diffusion systems. The delayed reactions may be of consuming or non-consuming delay type. The algorithm is designed for moderately stiff systems in which the events can be partitioned into slow and fast subsets according to their propensities. The proposed algorithm is applied to three benchmark problems and the results are compared with those of the delayed Inhomogeneous Stochastic Simulation Algorithm. The numerical results show that the new hybrid algorithm achieves considerable speed-up in the run time and very good accuracy.
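
    A delayed SSA of the kind such hybrid methods accelerate can be sketched for a single consuming-delay reaction A -> B, where A is consumed at initiation and B is produced a fixed delay later (rate constant, delay and population are illustrative; this is not the authors' algorithm):

```python
import heapq
import random

def delayed_ssa(k, tau, a0, t_end, rng):
    """SSA for the consuming-delay reaction A -> B: A is removed when the
    reaction initiates, and B appears tau time units later."""
    t, A, B = 0.0, a0, 0
    pending = []                          # min-heap of scheduled completion times
    while True:
        prop = k * A                      # initiation propensity
        dt = rng.expovariate(prop) if prop > 0 else float("inf")
        if pending and pending[0] <= t + dt:
            t = heapq.heappop(pending)    # a delayed product is released first
            if t > t_end:
                break
            B += 1
        else:
            if t + dt > t_end:
                break
            t += dt
            A -= 1                        # consuming delay: remove A at initiation
            heapq.heappush(pending, t + tau)
    return A, B

A, B = delayed_ssa(k=0.5, tau=1.0, a0=20, t_end=100.0, rng=random.Random(2))
```

    The priority queue of pending completions is the standard device for delayed reactions; the memoryless property of the exponential distribution makes it valid to redraw the waiting time after each completion event.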

  5. Emergence of chaos in a spatially confined reactive system

    NASA Astrophysics Data System (ADS)

    Voorsluijs, Valérie; De Decker, Yannick

    2016-11-01

    In spatially restricted media, interactions between particles and local fluctuations of density can lead to important deviations of the dynamics from the unconfined, deterministic picture. In this context, we investigated how molecular crowding can affect the emergence of chaos in small reactive systems. To this end, we developed an amended version of the Willamowski-Rössler model in which we account for the impenetrability of the reactive species. We analyzed the deterministic kinetics of this model and studied it with spatially extended stochastic simulations in which the mobility of particles is included explicitly. We show that homogeneous fluctuations can lead to a destruction of chaos through a fluctuation-induced collision between chaotic trajectories and absorbing states. However, an interplay between the size of the system and the mobility of particles can counterbalance this effect, so that chaos can indeed be found when particles diffuse slowly. This unexpected effect can be traced back to the emergence of spatial correlations, which strongly affect the dynamics. The mobility of particles effectively acts as a new bifurcation parameter, enabling the system to switch from stationary states to absorbing states, oscillations or chaos.

  6. Stochastic Simulation Using @Risk for Dairy Business Investment Decisions

    USDA-ARS?s Scientific Manuscript database

    A dynamic, stochastic, mechanistic simulation model of a dairy business was developed to evaluate the cost and benefit streams coinciding with technology investments. The model was constructed to embody the biological and economical complexities of a dairy farm system within a partial budgeting fram...

  7. ASSESSING RESIDENTIAL EXPOSURE USING THE STOCHASTIC HUMAN EXPOSURE AND DOSE SIMULATION (SHEDS) MODEL

    EPA Science Inventory

    As part of a workshop sponsored by the Environmental Protection Agency's Office of Research and Development and Office of Pesticide Programs, the Aggregate Stochastic Human Exposure and Dose Simulation (SHEDS) Model was used to assess potential aggregate residential pesticide e...

  8. Stochastic Human Exposure and Dose Simulation for Air Toxics

    EPA Science Inventory

    The Stochastic Human Exposure and Dose Simulation model for Air Toxics (SHEDS-AirToxics) is a multimedia, multipathway population-based exposure and dose model for air toxics developed by the US EPA's National Exposure Research Laboratory (NERL). SHEDS-AirToxics uses a probabili...

  9. Nonlocal birth-death competitive dynamics with volume exclusion

    NASA Astrophysics Data System (ADS)

    Khalil, Nagi; López, Cristóbal; Hernández-García, Emilio

    2017-06-01

    A stochastic birth-death competition model for particles with excluded volume is proposed. The particles move, reproduce, and die on a regular lattice. While the death rate is constant, the birth rate is spatially nonlocal and implements inter-particle competition by a dependence on the number of particles within a finite distance. The finite volume of particles is accounted for by fixing an upper value to the number of particles that can occupy a lattice node, compromising births and movements. We derive closed macroscopic equations for the density of particles and spatial correlation at two adjacent sites. Under different conditions, the description is further reduced to a single equation for the particle density that contains three terms: diffusion, a linear death, and a highly nonlinear and nonlocal birth term. Steady-state homogeneous solutions, their stability which reveals spatial pattern formation, and the dynamics of time-dependent homogeneous solutions are discussed and compared, in the one-dimensional case, with numerical simulations of the particle system.
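
    A discrete-time caricature of such a lattice model, with a constant death rate, nonlocal competition entering the birth rate, and a per-node occupancy cap implementing volume exclusion, might look like this (all parameters are illustrative, and the model in the paper is a continuous-time process):

```python
import random

def step(lattice, birth0, death, comp_range, nmax, rng):
    """One sweep of a 1-D birth-death process with volume exclusion:
    constant death rate, birth suppressed by the number of particles within
    comp_range sites, and at most nmax particles per node."""
    L = len(lattice)
    new = lattice[:]
    for i, n in enumerate(lattice):
        for _ in range(n):
            if rng.random() < death:
                new[i] -= 1          # constant death rate
                continue
            # nonlocal competition: crowding within comp_range sites
            crowd = sum(lattice[(i + d) % L] for d in range(-comp_range, comp_range + 1))
            p_birth = birth0 * max(0.0, 1.0 - crowd / (nmax * (2 * comp_range + 1)))
            if rng.random() < p_birth and new[i] < nmax:
                new[i] += 1          # volume exclusion caps the occupancy
    return new

rng = random.Random(3)
lat = [1] * 50
for _ in range(100):
    lat = step(lat, birth0=0.6, death=0.1, comp_range=3, nmax=4, rng=rng)
```

    The competition kernel here is a flat window, the simplest choice; the paper's macroscopic equations correspond to the density and pair-correlation statistics of exactly this kind of particle configuration.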

  10. Groundwater management under uncertainty using a stochastic multi-cell model

    NASA Astrophysics Data System (ADS)

    Joodavi, Ata; Zare, Mohammad; Ziaei, Ali Naghi; Ferré, Ty P. A.

    2017-08-01

    The optimization of spatially complex groundwater management models over long time horizons requires the use of computationally efficient groundwater flow models. This paper presents a new stochastic multi-cell lumped-parameter aquifer model that explicitly considers uncertainty in groundwater recharge. To achieve this, the multi-cell model is combined with the constrained-state formulation method. In this method, the lower and upper bounds of groundwater heads are incorporated into the mass balance equation using indicator functions. This provides expressions for the means, variances and covariances of the groundwater heads, which can be included in the constraint set in an optimization model. This method was used to formulate two separate stochastic models: (i) groundwater flow in a two-cell aquifer model with normal and non-normal distributions of groundwater recharge; and (ii) groundwater management in a multiple cell aquifer in which the differences between groundwater abstractions and water demands are minimized. The comparison between the results obtained from the proposed modeling technique with those from Monte Carlo simulation demonstrates the capability of the proposed models to approximate the means, variances and covariances. Significantly, considering covariances between the heads of adjacent cells allows a more accurate estimate of the variances of the groundwater heads. Moreover, this modeling technique requires no discretization of state variables, thus offering an efficient alternative to computationally demanding methods.

  11. A moment-convergence method for stochastic analysis of biochemical reaction networks.

    PubMed

    Zhang, Jiajun; Nie, Qing; Zhou, Tianshou

    2016-05-21

    Traditional moment-closure methods need to assume that high-order cumulants of a probability distribution are approximately zero. However, this strong assumption is not satisfied for many biochemical reaction networks. Here, we introduce convergent moments (defined in mathematics as the coefficients in the Taylor expansion of the probability-generating function at some point) to overcome this drawback of the moment-closure methods. As such, we develop a new analysis method for stochastic chemical kinetics. This method provides an accurate approximation for the master probability equation (MPE). In particular, the connection between low-order convergent moments and rate constants can be more easily derived in terms of explicit and analytical forms, allowing insights that would be difficult to obtain through direct simulation or manipulation of the MPE. In addition, it provides an accurate and efficient way to compute steady-state or transient probability distributions, avoiding the algorithmic difficulty associated with stiffness of the MPE due to large differences in the sizes of rate constants. Applications of the method to several systems reveal nontrivial stochastic mechanisms of gene expression dynamics, e.g., intrinsic fluctuations can induce transient bimodality and amplify transient signals, and slow switching between promoter states can increase fluctuations in spatially heterogeneous signals. The overall approach has broad applications in modeling, analysis, and computation of complex biochemical networks with intrinsic noise.

  12. Evaluation of the risk of classical swine fever (CSF) spread from backyard pigs to other domestic pigs by using the spatial stochastic disease spread model Be-FAST: the example of Bulgaria.

    PubMed

    Martínez-López, Beatriz; Ivorra, Benjamin; Ramos, Angel Manuel; Fernández-Carrión, Eduardo; Alexandrov, Tsviatko; Sánchez-Vizcaíno, José Manuel

    2013-07-26

    The study presented here is one of the very first aimed at exploring the potential spread of classical swine fever (CSF) from backyard pigs to other domestic pigs. Specifically, we used a spatial stochastic spread model, called Be-FAST, to evaluate the potential spread of CSF virus (CSFV) in Bulgaria, which holds a large number of backyards (96% of the total number of pig farms) and is one of the very few countries for which backyard pigs and farm counts are available. The model revealed that, despite backyard pigs being very likely to become infected, infections from backyard pigs to other domestic pigs were rare. In general, the magnitude and duration of the CSF simulated epidemics were small, with a median [95% PI] number of infected farms per epidemic of 1 [1,4] and a median [95% PI] duration of the epidemic of 44 [17,101] days. CSFV transmission occurs primarily (81.16%) due to indirect contacts (i.e. vehicles, people and local spread) whereas detection of infected premises was mainly (69%) associated with the observation of clinical signs on farm rather than with implementation of tracing or zoning. Methods and results of this study may support the implementation of risk-based strategies more cost-effectively to prevent, control and, ultimately, eradicate CSF from Bulgaria. The model may also be easily adapted to other countries in which the backyard system is predominant. It can also be used to simulate other similar diseases such as African swine fever. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. Stochastic Simulation Service: Bridging the Gap between the Computational Expert and the Biologist

    PubMed Central

    Banerjee, Debjani; Bellesia, Giovanni; Daigle, Bernie J.; Douglas, Geoffrey; Gu, Mengyuan; Gupta, Anand; Hellander, Stefan; Horuk, Chris; Nath, Dibyendu; Takkar, Aviral; Lötstedt, Per; Petzold, Linda R.

    2016-01-01

    We present StochSS: Stochastic Simulation as a Service, an integrated development environment for modeling and simulation of both deterministic and discrete stochastic biochemical systems in up to three dimensions. An easy to use graphical user interface enables researchers to quickly develop and simulate a biological model on a desktop or laptop, which can then be expanded to incorporate increasing levels of complexity. StochSS features state-of-the-art simulation engines. As the demand for computational power increases, StochSS can seamlessly scale computing resources in the cloud. In addition, StochSS can be deployed as a multi-user software environment where collaborators share computational resources and exchange models via a public model repository. We demonstrate the capabilities and ease of use of StochSS with an example of model development and simulation at increasing levels of complexity. PMID:27930676

  14. Stochastic Simulation Service: Bridging the Gap between the Computational Expert and the Biologist

    DOE PAGES

    Drawert, Brian; Hellander, Andreas; Bales, Ben; ...

    2016-12-08

    We present StochSS: Stochastic Simulation as a Service, an integrated development environment for modeling and simulation of both deterministic and discrete stochastic biochemical systems in up to three dimensions. An easy to use graphical user interface enables researchers to quickly develop and simulate a biological model on a desktop or laptop, which can then be expanded to incorporate increasing levels of complexity. StochSS features state-of-the-art simulation engines. As the demand for computational power increases, StochSS can seamlessly scale computing resources in the cloud. In addition, StochSS can be deployed as a multi-user software environment where collaborators share computational resources and exchange models via a public model repository. We also demonstrate the capabilities and ease of use of StochSS with an example of model development and simulation at increasing levels of complexity.

  15. Simulation of proliferation and differentiation of cells in a stem-cell niche

    NASA Astrophysics Data System (ADS)

    Zhdanov, Vladimir P.

    2008-10-01

    Stem-cell niches represent microscopic compartments formed of environmental cells that nurture stem cells and enable them to maintain tissue homeostasis. The spatio-temporal kinetics of proliferation and differentiation of cells in such niches depend on the specifics of the niche structure and on adhesion and communication between cells, and may also be influenced by spatial constraints on cell division. We propose a generic lattice model, taking all these factors into account, and systematically illustrate their role. The model is motivated by the experimental data available for the niches located in the subventricular zone of the adult mammalian brain. The general conclusions drawn from our Monte Carlo simulations are applicable to other niches as well. One of our main findings is that the kinetics under consideration are highly stochastic due to the relatively small number of cells proliferating and differentiating in a niche and the autocatalytic character of symmetric cell division. In particular, the kinetics exhibit huge stochastic bursts, especially if the adhesion between cells is taken into account. In addition, the results obtained show that despite the small number of cells present in stem-cell niches, their arrangement can be predetermined to an appreciable extent, provided that the adhesion of different cells differs so that they tend to segregate.

  16. Errors and uncertainties in regional climate simulations of rainfall variability over Tunisia: a multi-model and multi-member approach

    NASA Astrophysics Data System (ADS)

    Fathalli, Bilel; Pohl, Benjamin; Castel, Thierry; Safi, Mohamed Jomâa

    2018-02-01

    Temporal and spatial variability of rainfall over Tunisia (at 12 km spatial resolution) is analyzed in a multi-year (1992-2011) ten-member ensemble simulation performed using the WRF model, and a sample of regional climate hindcast simulations from Euro-CORDEX. RCM errors and skills are evaluated against a dense network of local rain gauges. Uncertainties arising, on the one hand, from the different model configurations and, on the other hand, from internal variability are furthermore quantified and ranked at different timescales using simple spread metrics. Overall, the WRF simulation shows good skill for simulating spatial patterns of rainfall amounts over Tunisia, marked by strong altitudinal and latitudinal gradients, as well as the rainfall interannual variability, in spite of systematic errors. Mean rainfall biases are wet in both DJF and JJA seasons for the WRF ensemble, while they are dry in winter and wet in summer for most of the used Euro-CORDEX models. The sign of mean annual rainfall biases over Tunisia can also change from one member of the WRF ensemble to another. Skills in regionalizing precipitation over Tunisia are season dependent, with better correlations and weaker biases in winter. Larger inter-member spreads are observed in summer, likely because of (1) an attenuated large-scale control on Mediterranean and Tunisian climate, and (2) a larger contribution of local convective rainfall to the seasonal amounts. Inter-model uncertainties are globally stronger than those attributed to model's internal variability. However, inter-member spreads can be of the same magnitude in summer, emphasizing the important stochastic nature of the summertime rainfall variability over Tunisia.

  17. Bringing consistency to simulation of population models--Poisson simulation as a bridge between micro and macro simulation.

    PubMed

    Gustafsson, Leif; Sternad, Mikael

    2007-10-01

    Population models concern collections of discrete entities such as atoms, cells, humans, animals, etc., where the focus is on the number of entities in a population. Because of the complexity of such models, simulation is usually needed to reproduce their complete dynamic and stochastic behaviour. Two main types of simulation models are used for different purposes, namely micro-simulation models, where each individual is described with its particular attributes and behaviour, and macro-simulation models based on stochastic differential equations, where the population is described in aggregated terms by the number of individuals in different states. Consistency between micro- and macro-models is a crucial but often neglected aspect. This paper demonstrates how the Poisson Simulation technique can be used to produce a population macro-model consistent with the corresponding micro-model. This is accomplished by defining Poisson Simulation in strictly mathematical terms as a series of Poisson processes that generate sequences of Poisson distributions with dynamically varying parameters. The method can be applied to any population model. It provides the unique stochastic and dynamic macro-model consistent with a correct micro-model. The paper also presents a general macro form for stochastic and dynamic population models. In an appendix, Poisson Simulation is compared with Markov Simulation, showing a number of advantages. In particular, aggregation into state variables and aggregation of many events per time step make Poisson Simulation orders of magnitude faster than Markov Simulation. Furthermore, much larger and more complicated models can be built and executed with Poisson Simulation than is possible with the Markov approach.
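
    The core of Poisson Simulation is to advance each aggregated state variable by Poisson-distributed event counts whose parameters vary dynamically with the state. A minimal sketch for a birth-death population (Knuth's sampler is included only because the standard library has no Poisson generator; all parameters are illustrative):

```python
import math
import random

def poisson(lam, rng):
    """Knuth's algorithm for a Poisson random number (adequate for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def poisson_sim(beta, mu, x0, dt, n_steps, rng):
    """Poisson Simulation of a birth-death population: each step draws the
    numbers of births and deaths from Poisson distributions whose parameters
    (beta*dt and mu*x*dt) vary dynamically with the current state x."""
    x, path = x0, [x0]
    for _ in range(n_steps):
        births = poisson(beta * dt, rng)
        deaths = poisson(mu * x * dt, rng)
        x = max(0, x + births - deaths)
        path.append(x)
    return path

path = poisson_sim(beta=5.0, mu=0.1, x0=0, dt=0.1, n_steps=500, rng=random.Random(6))
```

    Aggregating many individual events into one Poisson draw per time step is precisely what makes this macro-level approach orders of magnitude faster than simulating each individual transition.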

  18. An overview of the utility of population simulation software in molecular ecology.

    PubMed

    Hoban, Sean

    2014-05-01

    Stochastic simulation software that simultaneously models genetic, population and environmental processes can inform many topics in molecular ecology. These include forecasting species and community response to environmental change, inferring dispersal ecology, revealing cryptic mating, quantifying past population dynamics, assessing in situ management options and monitoring neutral and adaptive biodiversity change. Advances in population demographic-genetic simulation software, especially with respect to individual life history, landscapes and genetic processes, are transforming and expanding the ways that molecular data can be used. The aim of this review is to explain the roles that such software can play in molecular ecology studies (whether as a principal component or a supporting function) so that researchers can decide whether, when and precisely how simulations can be incorporated into their work. First, I use seven case studies to demonstrate how simulations are employed, their specific advantage/necessity and what alternative or complementary (nonsimulation) approaches are available. I also explain how simulations can be integrated with existing spatial, environmental, historical and genetic data sets. I next describe simulation features that may be of interest to molecular ecologists, such as spatial and behavioural considerations and species' interactions, to provide guidance on how particular simulation capabilities can serve particular needs. Lastly, I discuss the prospect of simulation software in emerging challenges (climate change, biodiversity monitoring, population exploitation) and opportunities (genomics, ancient DNA), in order to emphasize that the scope of simulation-based work is expanding. I also suggest practical considerations, priorities and elements of best practice. This should accelerate the uptake of simulation approaches and firmly embed them as a versatile tool in the molecular ecologist's toolbox. © 2014 John Wiley & Sons Ltd.

  19. Development of an in silico stochastic 4D model of tumor growth with angiogenesis.

    PubMed

    Forster, Jake C; Douglass, Michael J J; Harriss-Phillips, Wendy M; Bezak, Eva

    2017-04-01

    A stochastic computer model of tumour growth with spatial and temporal components that includes tumour angiogenesis was developed. In the current work it was used to simulate head and neck tumour growth. The model also provides the foundation for a 4D cellular radiotherapy simulation tool. The model, developed in Matlab, contains cell positions randomised in 3D space without overlap. Blood vessels are represented by strings of blood vessel units which branch outwards to achieve the desired tumour relative vascular volume. Hypoxic cells have an increased cell cycle time and become quiescent at oxygen tensions less than 1 mmHg. Necrotic cells are resorbed. A hierarchy of stem cells, transit cells and differentiated cells is considered along with differentiated cell loss. Model parameters include the relative vascular volume (2-10%), blood oxygenation (20-100 mmHg), distance from vessels to the onset of necrosis (80-300 μm) and probability for stem cells to undergo symmetric division (2%). Simulations were performed to observe the effects of hypoxia on tumour growth rate for head and neck cancers. Simulations were run on a supercomputer with eligible parts running in parallel on 12 cores. Using biologically plausible model parameters for head and neck cancers, the tumour volume doubling time varied from 45 ± 5 days (n = 3) for well oxygenated tumours to 87 ± 5 days (n = 3) for severely hypoxic tumours. The main achievements of the current model were randomised cell positions and the connected vasculature structure between the cells. These developments will also be beneficial when irradiating the simulated tumours using Monte Carlo track structure methods. © 2017 American Association of Physicists in Medicine.

  20. Parallel Stochastic discrete event simulation of calcium dynamics in neuron.

    PubMed

    Ishlam Patoary, Mohammad Nazrul; Tropper, Carl; McDougal, Robert A; Zhongwei, Lin; Lytton, William W

    2017-09-26

    The intracellular calcium signaling pathways of a neuron depend on both biochemical reactions and diffusion. Some quasi-isolated compartments (e.g. spines) are so small, and calcium concentrations so low, that one extra molecule diffusing in by chance can make a nontrivial difference in its concentration (percentage-wise). These rare events can affect dynamics discretely in such a way that they cannot be evaluated by a deterministic simulation. Stochastic models of such a system provide a more detailed understanding of these systems than existing deterministic models because they capture their behavior at a molecular level. Our research focuses on the development of a high performance parallel discrete event simulation environment, Neuron Time Warp (NTW), which is intended for use in the parallel simulation of stochastic reaction-diffusion systems such as intracellular calcium signaling. NTW is integrated with NEURON, a simulator which is widely used within the neuroscience community. We simulate two models, a calcium buffer and a calcium wave model. The calcium buffer model is employed in order to verify the correctness and performance of NTW by comparing it to a serial deterministic simulation in NEURON. We also derived a discrete event calcium wave model from a deterministic model using the stochastic IP3R structure.

  1. Simulation of quantum dynamics based on the quantum stochastic differential equation.

    PubMed

    Li, Ming

    2013-01-01

    The quantum stochastic differential equation derived from the Lindblad-form quantum master equation is investigated. The general formulation in terms of environment operators representing quantum state diffusion is given. A numerical simulation algorithm for the stochastic process of direct photodetection of a driven two-level system is proposed for predicting its dynamical behavior. The effectiveness and superiority of the algorithm are verified by analysis of its accuracy and computational cost in comparison with the classical Runge-Kutta algorithm.

  2. Further studies using matched filter theory and stochastic simulation for gust loads prediction

    NASA Technical Reports Server (NTRS)

    Scott, Robert C.; Pototzky, Anthony S.; Perry, Boyd III

    1993-01-01

    This paper describes two analysis methods -- one deterministic, the other stochastic -- for computing maximized and time-correlated gust loads for aircraft with nonlinear control systems. The first method is based on matched filter theory; the second is based on stochastic simulation. The paper summarizes the methods, discusses the selection of gust intensity for each method and presents numerical results. A strong similarity between the results from the two methods is seen to exist for both linear and nonlinear configurations.

  3. Stochastic genome-nuclear lamina interactions: modulating roles of Lamin A and BAF.

    PubMed

    Kind, Jop; van Steensel, Bas

    2014-01-01

    The nuclear lamina (NL) is thought to aid in the spatial organization of interphase chromosomes by providing an anchoring platform for hundreds of large genomic regions named lamina associated domains (LADs). Recently, a new live-cell imaging approach demonstrated directly that LAD-NL interactions are dynamic and in part stochastic. Here we discuss implications of these new findings and introduce Lamin A and BAF as potential modulators of stochastic LAD positioning.

  4. Density behavior of spatial birth-and-death stochastic evolution of mutating genotypes under selection rates

    NASA Astrophysics Data System (ADS)

    Finkelshtein, D.; Kondratiev, Yu.; Kutoviy, O.; Molchanov, S.; Zhizhina, E.

    2014-10-01

    We consider birth-and-death stochastic evolution of genotypes with different lengths. The genotypes might mutate, which provides a stochastic changing of lengths by a free diffusion law. The birth and death rates are length dependent, which corresponds to a selection effect. We study an asymptotic behavior of a density for an infinite collection of genotypes. The cases of space homogeneous and space heterogeneous densities are considered.

  5. Comparing Lagrangian and Eulerian models for CO2 transport - a step towards Bayesian inverse modeling using WRF/STILT-VPRM

    NASA Astrophysics Data System (ADS)

    Pillai, D.; Gerbig, C.; Kretschmer, R.; Beck, V.; Karstens, U.; Neininger, B.; Heimann, M.

    2012-01-01

    We present simulations of atmospheric CO2 concentrations provided by two modeling systems, run at high spatial resolution: the Eulerian-based Weather Research Forecasting (WRF) model and the Lagrangian-based Stochastic Time-Inverted Lagrangian Transport (STILT) model, both of which are coupled to a diagnostic biospheric model, the Vegetation Photosynthesis and Respiration Model (VPRM). The consistency of the simulations is assessed with special attention paid to the details of horizontal as well as vertical transport and mixing of CO2 concentrations in the atmosphere. The dependence of model mismatch (Eulerian vs. Lagrangian) on the models' spatial resolution is further investigated. A case study using airborne measurements during which both models showed large deviations from each other is analyzed in detail as an extreme case. Using aircraft observations and pulse release simulations, we identified differences in the representation of details in the interaction between turbulent mixing and advection through wind shear as the main cause of discrepancies between WRF and STILT transport at a spatial resolution such as 2 and 6 km. Based on observations and inter-model comparisons of atmospheric CO2 concentrations, we show that a refinement of the parameterization of turbulent velocity variance and Lagrangian time-scale in STILT is needed to achieve a better match between the Eulerian and the Lagrangian transport at such a high spatial resolution (e.g. 2 and 6 km). Nevertheless, the inter-model differences in simulated CO2 time series for a tall tower observatory at Ochsenkopf in Germany are about a factor of two smaller than the model-data mismatch and about a factor of three smaller than the mismatch between the current global model simulations and the data. This suggests that it is reasonable to use STILT as an adjoint model of WRF atmospheric transport.

  6. Parallel stochastic simulation of macroscopic calcium currents.

    PubMed

    González-Vélez, Virginia; González-Vélez, Horacio

    2007-06-01

    This work introduces MACACO, a macroscopic calcium currents simulator. It provides a parameter-sweep framework which computes macroscopic Ca(2+) currents from the individual aggregation of unitary currents, using a stochastic model for L-type Ca(2+) channels. MACACO uses a simplified 3-state Markov model to simulate the response of each Ca(2+) channel to different voltage inputs to the cell. In order to provide an accurate systematic view for the stochastic nature of the calcium channels, MACACO is composed of an experiment generator, a central simulation engine and a post-processing script component. Due to the computational complexity of the problem and the dimensions of the parameter space, the MACACO simulation engine employs a grid-enabled task farm. Having been designed as a computational biology tool, MACACO heavily borrows from the way cell physiologists conduct and report their experimental work.
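
    The aggregation of unitary currents from independent Markov channels can be sketched as follows; the 3-state transition matrix and unitary current below are hypothetical placeholders for a fixed test voltage, not MACACO's calibrated values:

```python
import random

def simulate_channels(n_channels, P, i_unit, n_steps, rng):
    """Aggregate unitary currents from independent 3-state Markov channels
    (0 = closed, 1 = open, 2 = inactivated). P[s] is the row of transition
    probabilities out of state s; only open channels conduct."""
    states = [0] * n_channels
    current = []
    for _ in range(n_steps):
        states = [rng.choices((0, 1, 2), weights=P[s])[0] for s in states]
        current.append(i_unit * states.count(1))    # macroscopic current this step
    return current

# hypothetical per-step transition matrix at one test voltage (rows sum to 1)
P = [[0.90, 0.10, 0.00],   # closed      -> closed / open
     [0.05, 0.85, 0.10],   # open        -> closed / open / inactivated
     [0.00, 0.02, 0.98]]   # inactivated -> mostly stays inactivated
rng = random.Random(4)
I = simulate_channels(200, P, i_unit=-0.3, n_steps=100, rng=rng)
```

    Sweeping the transition matrix over a grid of voltages, as a parameter-sweep framework does, is what makes the problem embarrassingly parallel and well suited to a task farm.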

  7. A multiscale computational model of spatially resolved calcium cycling in cardiac myocytes: from detailed cleft dynamics to the whole cell concentration profiles

    PubMed Central

    Vierheller, Janine; Neubert, Wilhelm; Falcke, Martin; Gilbert, Stephen H.; Chamakuri, Nagaiah

    2015-01-01

    Mathematical modeling of excitation-contraction coupling (ECC) in ventricular cardiac myocytes is a multiscale problem, and it is therefore difficult to develop spatially detailed simulation tools. ECC involves gradients on the length scale of 100 nm in dyadic spaces and concentration profiles along the 100 μm of the whole cell, as well as the sub-millisecond time scale of local concentration changes and the change of lumenal Ca2+ content within tens of seconds. Our concept for a multiscale mathematical model of Ca2+ -induced Ca2+ release (CICR) and whole cardiomyocyte electrophysiology incorporates stochastic simulation of individual LC- and RyR-channels, spatially detailed concentration dynamics in dyadic clefts, rabbit membrane potential dynamics, and a system of partial differential equations for myoplasmic and lumenal free Ca2+ and Ca2+-binding molecules in the bulk of the cell. We developed a novel computational approach to resolve the concentration gradients from dyadic space to cell level by using a quasistatic approximation within the dyad and finite element methods for integrating the partial differential equations. We show whole cell Ca2+-concentration profiles using three previously published RyR-channel Markov schemes. PMID:26441674

  8. Water quality modeling in the dead end sections of drinking water (Supplement)

    EPA Pesticide Factsheets

    Dead-end sections of drinking water distribution networks are known to be problematic zones in terms of water quality degradation. Extended residence time due to water stagnation leads to rapid reduction of disinfectant residuals allowing the regrowth of microbial pathogens. Water quality models developed so far apply spatial aggregation and temporal averaging techniques for hydraulic parameters by assigning hourly averaged water demands to the main nodes of the network. Although this practice has generally resulted in minimal loss of accuracy for the predicted disinfectant concentrations in main water transmission lines, this is not the case for the peripheries of the distribution network. This study proposes a new approach for simulating disinfectant residuals in dead end pipes while accounting for both spatial and temporal variability in hydraulic and transport parameters. A stochastic demand generator was developed to represent residential water pulses based on a non-homogeneous Poisson process. Dispersive solute transport was considered using highly dynamic dispersion rates. A genetic algorithm was used to calibrate the axial hydraulic profile of the dead-end pipe based on the different demand shares of the withdrawal nodes. A parametric sensitivity analysis was done to assess the model performance under variation of different simulation parameters. A group of Monte-Carlo ensembles was carried out to investigate the influence of spatial and temporal variation
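
    The stochastic demand generator rests on a standard construction: sampling event times from a non-homogeneous Poisson process. A common way to do this, shown below as a sketch rather than the paper's implementation, is Lewis-Shedler thinning of a homogeneous process; the diurnal intensity function here is invented for illustration.

```python
import math
import random

def thinning(rate_fn, rate_max, t_end, seed=0):
    """Sample event times from a non-homogeneous Poisson process on [0, t_end]
    by thinning a homogeneous process of intensity rate_max: each candidate
    time is accepted with probability rate_fn(t) / rate_max."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_max)  # candidate from the bounding process
        if t > t_end:
            return events
        if rng.random() < rate_fn(t) / rate_max:
            events.append(t)

# Illustrative diurnal demand intensity (pulses per hour); rate_max must
# bound the intensity everywhere on the interval.
rate = lambda t: 2.0 + 1.5 * math.sin(2 * math.pi * (t - 7) / 24)
pulses = thinning(rate, rate_max=3.5, t_end=24.0)
```

    Each accepted time would then be dressed with a pulse duration and flow rate to form a residential withdrawal event.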

  9. Water Quality Modeling in the Dead End Sections of Drinking ...

    EPA Pesticide Factsheets

    Dead-end sections of drinking water distribution networks are known to be problematic zones in terms of water quality degradation. Extended residence time due to water stagnation leads to rapid reduction of disinfectant residuals allowing the regrowth of microbial pathogens. Water quality models developed so far apply spatial aggregation and temporal averaging techniques for hydraulic parameters by assigning hourly averaged water demands to the main nodes of the network. Although this practice has generally resulted in minimal loss of accuracy for the predicted disinfectant concentrations in main water transmission lines, this is not the case for the peripheries of a distribution network. This study proposes a new approach for simulating disinfectant residuals in dead end pipes while accounting for both spatial and temporal variability in hydraulic and transport parameters. A stochastic demand generator was developed to represent residential water pulses based on a non-homogeneous Poisson process. Dispersive solute transport was considered using highly dynamic dispersion rates. A genetic algorithm was used to calibrate the axial hydraulic profile of the dead-end pipe based on the different demand shares of the withdrawal nodes. A parametric sensitivity analysis was done to assess the model performance under variation of different simulation parameters. A group of Monte-Carlo ensembles was carried out to investigate the influence of spatial and temporal variations

  10. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.
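
    The Monte Carlo approach described above, scatter applied to design inputs and traced through to response outputs, can be sketched in a few lines. The model, nominal values, and tolerances below are toy stand-ins, not the Shuttle panel analysis, and the correlation ranking is only a crude proxy for the cause-and-effect reporting of the commercial tool.

```python
import random

def pearson(xs, ys):
    """Plain Pearson correlation, used here to rank input influence."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sum((a - mx) ** 2 for a in xs) ** 0.5
    sy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (sx * sy)

def monte_carlo_scatter(model, nominal, tolerances, n=2000, seed=0):
    """Draw design inputs around their nominal values, evaluate the model,
    and score each input by its correlation with the response."""
    rng = random.Random(seed)
    samples = {k: [] for k in nominal}
    responses = []
    for _ in range(n):
        x = {k: rng.gauss(v, tolerances[k]) for k, v in nominal.items()}
        for k, v in x.items():
            samples[k].append(v)
        responses.append(model(x))
    return {k: pearson(samples[k], responses) for k in nominal}

# Toy stand-in for a finite element response: deflection grows with load
# and shrinks with thickness cubed (not the Shuttle panel model).
deflection = lambda x: x["load"] / x["thickness"] ** 3
influence = monte_carlo_scatter(
    deflection,
    nominal={"load": 100.0, "thickness": 2.0},
    tolerances={"load": 5.0, "thickness": 0.1},
)
```

    Inputs whose scatter barely moves the response score near zero and can be dropped from further design iterations.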

  11. Enhanced simulator software for image validation and interpretation for multimodal localization super-resolution fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Erdélyi, Miklós; Sinkó, József; Gajdos, Tamás; Novák, Tibor

    2017-02-01

    Optical super-resolution techniques such as single molecule localization have become one of the most dynamically developing areas in optical microscopy. These techniques routinely provide images of fixed cells or tissues with sub-diffraction spatial resolution, and can even be applied for live cell imaging under appropriate circumstances. Localization techniques are based on the precise fitting of the point spread function (PSF) to the measured images of stochastically excited, identical fluorescent molecules. They require controlling the transition rates between the on, off and bleached states, keeping the number of active fluorescent molecules at an optimum value so that their diffraction limited images can be detected separately both spatially and temporally. Because of the numerous (and sometimes unknown) parameters, the imaging system can only be handled stochastically. For example, the rotation of the dye molecules obscures the polarization dependent PSF shape, and only an averaged distribution - typically estimated by a Gaussian function - is observed. TestSTORM software was developed to generate image stacks for traditional localization microscopes, where localization means the precise determination of the spatial position of the molecules. However, additional optical properties (polarization, spectrum, etc.) of the emitted photons can be used to further monitor the chemical and physical properties (viscosity, pH, etc.) of the local environment. The image stack generating program was therefore upgraded with several new features, such as multicolour imaging, a polarization dependent PSF, built-in 3D visualization, and structured background. These features make the program an ideal tool for optimizing the imaging and sample preparation conditions.

  12. Numerical simulation of runaway electrons: 3-D effects on synchrotron radiation and impurity-based runaway current dissipation

    NASA Astrophysics Data System (ADS)

    del-Castillo-Negrete, D.; Carbajal, L.; Spong, D.; Izzo, V.

    2018-05-01

    Numerical simulations of runaway electrons (REs) with a particular emphasis on orbit dependent effects in 3-D magnetic fields are presented. The simulations were performed using the recently developed Kinetic Orbit Runaway electron Code (KORC) that computes the full-orbit relativistic dynamics in prescribed electric and magnetic fields including radiation damping and collisions. The two main problems of interest are synchrotron radiation and impurity-based RE dissipation. Synchrotron radiation is studied in axisymmetric fields and in 3-D magnetic configurations exhibiting magnetic islands and stochasticity. For passing particles in axisymmetric fields, neglecting orbit effects might underestimate or overestimate the total radiation power depending on the direction of the radial shift of the drift orbits. For trapped particles, the spatial distribution of synchrotron radiation exhibits localized "hot" spots at the tips of the banana orbits. In general, the radiation power per particle for trapped particles is higher than the power emitted by passing particles. The spatial distribution of synchrotron radiation in stochastic magnetic fields, obtained using the MHD code NIMROD, is strongly influenced by the presence of magnetic islands. 3-D magnetic fields also introduce a toroidal dependence on the synchrotron spectra, and neglecting orbit effects underestimates the total radiation power. In the presence of magnetic islands, the radiation damping of trapped particles is larger than the radiation damping of passing particles. Results modeling synchrotron emission by RE in DIII-D quiescent plasmas are also presented. The computation uses EFIT reconstructed magnetic fields and RE energy distributions fitted to the experimental measurements. Qualitative agreement is observed between the numerical simulations and the experiments for simplified RE pitch angle distributions. 
However, it is noted that to achieve quantitative agreement, it is necessary to use pitch angle distributions that depart from simplified 2-D Fokker-Planck equilibria. Finally, using the guiding center orbit model (KORC-GC), a preliminary study of pellet mitigated discharges in DIII-D is presented. The dependence of RE energy decay and current dissipation on initial energy and ionization levels of neon impurities is studied. The computed decay rates are within the range of experimental observations.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Wei; Wang, Jin, E-mail: jin.wang.1@stonybrook.edu; State Key Laboratory of Electroanalytical Chemistry, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, 130022 Changchun, China and College of Physics, Jilin University, 130021 Changchun

    We have established a general non-equilibrium thermodynamic formalism consistently applicable to both spatially homogeneous and, more importantly, spatially inhomogeneous systems, governed by the Langevin and Fokker-Planck stochastic dynamics with multiple state transition mechanisms, using the potential-flux landscape framework as a bridge connecting stochastic dynamics with non-equilibrium thermodynamics. A set of non-equilibrium thermodynamic equations, quantifying the relations of the non-equilibrium entropy, entropy flow, entropy production, and other thermodynamic quantities, together with their specific expressions, is constructed from a set of dynamical decomposition equations associated with the potential-flux landscape framework. The flux velocity plays a pivotal role on both the dynamic and thermodynamic levels. On the dynamic level, it represents a dynamic force breaking detailed balance, entailing the dynamical decomposition equations. On the thermodynamic level, it represents a thermodynamic force generating entropy production, manifested in the non-equilibrium thermodynamic equations. The Ornstein-Uhlenbeck process and more specific examples, the spatial stochastic neuronal model, in particular, are studied to test and illustrate the general theory. This theoretical framework is particularly suitable to study the non-equilibrium (thermo)dynamics of spatially inhomogeneous systems abundant in nature. This paper is the second of a series.

  14. Universal Stochastic Multiscale Image Fusion: An Example Application for Shale Rock.

    PubMed

    Gerke, Kirill M; Karsanina, Marina V; Mallants, Dirk

    2015-11-02

    Spatial data captured with sensors of different resolution would provide a maximum degree of information if the data were to be merged into a single image representing all scales. We develop a general solution for merging multiscale categorical spatial data into a single dataset using stochastic reconstructions with rescaled correlation functions. The versatility of the method is demonstrated by merging three images of shale rock representing macro, micro and nanoscale spatial information on mineral, organic matter and porosity distribution. Merging multiscale images of shale rock is pivotal to quantify more reliably petrophysical properties needed for production optimization and environmental impacts minimization. Images obtained by X-ray microtomography and scanning electron microscopy were fused into a single image with predefined resolution. The methodology is sufficiently generic for implementation of other stochastic reconstruction techniques, any number of scales, any number of material phases, and any number of images for a given scale. The methodology can be further used to assess effective properties of fused porous media images or to compress voluminous spatial datasets for efficient data storage. Practical applications are not limited to petroleum engineering or more broadly geosciences, but will also find their way in material sciences, climatology, and remote sensing.
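
    The rescaled correlation functions at the heart of the method are built from standard descriptors such as the two-point probability function. A minimal sketch for a binary image follows; the 4x6 "image" is invented for the example, and a real implementation would average over directions and use FFTs for speed.

```python
def two_point_probability(img, max_lag):
    """Directional two-point probability S2(r) of phase 1 along the x-axis:
    the chance that two pixels separated by lag r both lie in phase 1."""
    rows, cols = len(img), len(img[0])
    s2 = []
    for r in range(max_lag + 1):
        hits = total = 0
        for i in range(rows):
            for j in range(cols - r):
                total += 1
                if img[i][j] == 1 and img[i][j + r] == 1:
                    hits += 1
        s2.append(hits / total)
    return s2

# Toy 4x6 binary "image" (1 = pore, 0 = solid), purely illustrative.
img = [
    [1, 1, 0, 0, 1, 0],
    [0, 1, 1, 0, 0, 0],
    [1, 0, 0, 1, 1, 1],
    [0, 0, 1, 1, 0, 0],
]
s2 = two_point_probability(img, max_lag=3)
```

    S2(0) equals the phase fraction (here the porosity), which is the anchor point that the rescaling across resolutions must preserve.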

  15. Universal Stochastic Multiscale Image Fusion: An Example Application for Shale Rock

    PubMed Central

    Gerke, Kirill M.; Karsanina, Marina V.; Mallants, Dirk

    2015-01-01

    Spatial data captured with sensors of different resolution would provide a maximum degree of information if the data were to be merged into a single image representing all scales. We develop a general solution for merging multiscale categorical spatial data into a single dataset using stochastic reconstructions with rescaled correlation functions. The versatility of the method is demonstrated by merging three images of shale rock representing macro, micro and nanoscale spatial information on mineral, organic matter and porosity distribution. Merging multiscale images of shale rock is pivotal to quantify more reliably petrophysical properties needed for production optimization and environmental impacts minimization. Images obtained by X-ray microtomography and scanning electron microscopy were fused into a single image with predefined resolution. The methodology is sufficiently generic for implementation of other stochastic reconstruction techniques, any number of scales, any number of material phases, and any number of images for a given scale. The methodology can be further used to assess effective properties of fused porous media images or to compress voluminous spatial datasets for efficient data storage. Practical applications are not limited to petroleum engineering or more broadly geosciences, but will also find their way in material sciences, climatology, and remote sensing. PMID:26522938

  16. Multivariate Non-Symmetric Stochastic Models for Spatial Dependence Models

    NASA Astrophysics Data System (ADS)

    Haslauer, C. P.; Bárdossy, A.

    2017-12-01

    A copula-based multivariate framework allows more flexibility to describe different kinds of dependence than models relying on the confining assumption of symmetric Gaussian dependence: different quantiles can be modelled with different degrees of dependence, and it will be demonstrated how this can be expected given process understanding. Maximum-likelihood-based multivariate parameter estimation yields stable and reliable results; not only are improved cross-validation-based measures of uncertainty obtained, but also a more realistic spatial structure of uncertainty compared to second-order models of dependence. As much information as is available is included in the parameter estimation: incorporating censored measurements (e.g., below the detection limit, or above the sensitive range of the measurement device) yields more realistic spatial models; the proportion of true zeros can be jointly estimated with, and distinguished from, censored measurements, which allows estimates of the age of a contaminant in the system; and secondary information (categorical and on the rational scale) has been used to improve the estimation of the primary variable. These copula-based multivariate statistical techniques are demonstrated on hydraulic conductivity observations at the Borden (Canada) site, the MADE site (USA), and a large regional groundwater-quality dataset in south-west Germany. Fields of spatially distributed K were simulated with identical marginal distributions and identical second-order spatial moments, yet substantially differing solute-transport characteristics when numerical tracer tests were performed. A statistical methodology is shown that allows the delineation of a boundary layer separating homogeneous parts of a spatial dataset. The effects of this boundary layer (macro structure) and of the spatial dependence of K (micro structure) on solute-transport behaviour are shown.

  17. Emergent Archetype Hydrological-Biogeochemical Response Patterns in Heterogeneous Catchments

    NASA Astrophysics Data System (ADS)

    Jawitz, J. W.; Gall, H. E.; Rao, P.

    2013-12-01

    What can spatiotemporally integrated patterns observed in stream hydrologic and biogeochemical signals generated in response to transient hydro-climatic and anthropogenic forcing tell us about the interactions between spatially heterogeneous soil-mediated hydrological and biogeochemical processes? We seek to understand how the spatial structure of solute sources coupled with hydrologic responses affect observed concentration-discharge (C-Q) patterns. These patterns are expressions of the spatiotemporal structure of solute loads exported from managed catchments, and their likely ecological consequences manifested in receiving water bodies (e.g., wetlands, rivers, lakes, and coastal waters). We investigated the following broad questions: (1) How does the correlation between flow-generating areas and biogeochemical source areas across a catchment evolve under stochastic hydro-climatic forcing? (2) What are the feasible hydrologic and biogeochemical responses that lead to the emergence of the observed archetype C-Q patterns? and; (3) What implications do these coupled dynamics have for catchment monitoring and implementation of management practices? We categorize the observed temporal signals into three archetypical C-Q patterns: dilution; accretion, and constant concentration. We introduce a parsimonious stochastic model of heterogeneous catchments, which act as hydrologic and biogeochemical filters, to examine the relationship between spatial heterogeneity and temporal history of solute export signals. The core concept of the modeling framework is considering the types and degree of spatial correlation between solute source zones and flow generating zones, and activation of different portions of the catchments during rainfall events. Our overarching hypothesis is that each of the archetype C-Q patterns can be generated by explicitly linking landscape-scale hydrologic responses and spatial distributions of solute source properties within a catchment. 
The model simulations reproduce the three major C-Q patterns observed in published data, offering valuable insight into coupled catchment processes. The findings have important implications for effective catchment management for water quality improvement, and stream monitoring strategies.
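
    A common diagnostic for the three archetype C-Q patterns is the slope b of log C against log Q under a power-law model C = aQ^b: b < 0 reads as dilution, b > 0 as accretion, and b near 0 as constant (chemostatic) behavior. The sketch below uses this textbook classifier as a stand-in; it is not the paper's stochastic catchment model, and the tolerance threshold is arbitrary.

```python
import math

def cq_archetype(concentrations, discharges, tol=0.1):
    """Classify a C-Q record by the least-squares slope b of log C vs log Q:
    b < -tol -> dilution, b > tol -> accretion, else constant concentration."""
    xs = [math.log(q) for q in discharges]
    ys = [math.log(c) for c in concentrations]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return "dilution" if b < -tol else "accretion" if b > tol else "constant"
```

    Applying this to synthetic records with b = -0.5, 0, and 0.4 recovers the dilution, constant, and accretion labels respectively.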

  18. AESS: Accelerated Exact Stochastic Simulation

    NASA Astrophysics Data System (ADS)

    Jenkins, David D.; Peterson, Gregory D.

    2011-12-01

    The Stochastic Simulation Algorithm (SSA) developed by Gillespie provides a powerful mechanism for exploring the behavior of chemical systems with small species populations or with important noise contributions. Gene circuit simulations for systems biology commonly employ the SSA method, as do ecological applications. This algorithm tends to be computationally expensive, so researchers seek an efficient implementation of SSA. In this program package, the Accelerated Exact Stochastic Simulation Algorithm (AESS) contains optimized implementations of Gillespie's SSA that improve the performance of individual simulation runs or ensembles of simulations used for sweeping parameters or to provide statistically significant results. Program summary: Program title: AESS. Catalogue identifier: AEJW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: University of Tennessee copyright agreement. No. of lines in distributed program, including test data, etc.: 10 861. No. of bytes in distributed program, including test data, etc.: 394 631. Distribution format: tar.gz. Programming language: C for processors, CUDA for NVIDIA GPUs. Computer: developed and tested on various x86 computers and NVIDIA C1060 Tesla and GTX 480 Fermi GPUs; targets x86 workstations, optionally with multicore processors or NVIDIA GPUs as accelerators. Operating system: tested under Ubuntu Linux and CentOS 5.5 Linux. Classification: 3, 16.12. Nature of problem: simulation of chemical systems, particularly with low species populations, can be accurately performed using Gillespie's method of stochastic simulation. Numerous variations on the original stochastic simulation algorithm have been developed, including approaches that produce results with statistics that exactly match the chemical master equation (CME) as well as other approaches that approximate the CME. 
Solution method: The Accelerated Exact Stochastic Simulation (AESS) tool provides implementations of a wide variety of popular variations on the Gillespie method. Users can select the specific algorithm considered most appropriate. Comparisons between the methods and with other available implementations indicate that AESS provides the fastest known implementation of Gillespie's method for a variety of test models. Users may wish to execute ensembles of simulations to sweep parameters or to obtain better statistical results, so AESS supports acceleration of ensembles of simulation using parallel processing with MPI, SSE vector units on x86 processors, and/or using NVIDIA GPUs with CUDA.
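
    For reference, the core of Gillespie's direct method that AESS accelerates fits in a short routine: draw the waiting time from an exponential whose rate is the total propensity, then choose which reaction fires in proportion to its propensity. The reversible isomerization below is a toy example, not one of the AESS test models.

```python
import random

def gillespie_direct(x, reactions, t_end, seed=0):
    """Gillespie's direct method. `reactions` is a list of
    (propensity_fn, update_fn) pairs acting on the state dict x."""
    rng = random.Random(seed)
    t, traj = 0.0, [(0.0, dict(x))]
    while t < t_end:
        props = [rate(x) for rate, _ in reactions]
        a0 = sum(props)
        if a0 == 0.0:
            break                      # no reaction can fire; system is stuck
        t += rng.expovariate(a0)       # exponential waiting time with rate a0
        u, cum = rng.random() * a0, 0.0
        for p, (_, update) in zip(props, reactions):
            cum += p
            if u < cum:
                update(x)              # fire the selected reaction
                break
        traj.append((t, dict(x)))
    return traj

# Toy reversible isomerization A <-> B with mass-action propensities.
reactions = [
    (lambda x: 1.0 * x["A"], lambda x: x.update(A=x["A"] - 1, B=x["B"] + 1)),
    (lambda x: 0.5 * x["B"], lambda x: x.update(A=x["A"] + 1, B=x["B"] - 1)),
]
traj = gillespie_direct({"A": 100, "B": 0}, reactions, t_end=5.0)
```

    The accelerated variants in AESS reorganize exactly this inner loop (propensity updates and reaction selection) rather than changing its statistics.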

  19. Recent topographic evolution and erosion of the deglaciated Washington Cascades inferred from a stochastic landscape evolution model

    NASA Astrophysics Data System (ADS)

    Moon, S.; Shelef, E.; Hilley, G. E.

    2013-12-01

    The Washington Cascades is currently in topographic and erosional disequilibrium after deglaciation occurred around 11-17 ka ago. The topography still shows the features inherited from prior alpine glacial processes (e.g., cirques, steep side-valleys, and flat valley bottoms), though postglacial processes are currently denuding this landscape. Our previous study in this area calculated the thousand-year-timescale denudation rates using cosmogenic 10Be concentration (CRN-denudation rates), and showed that they were ~four times higher than million-year-timescale uplift rates. In addition, the spatial distribution of denudation rates showed a good correlation with a factor-of-ten variation in precipitation. We interpreted this correlation as reflecting the sensitivity of landslide triggering in over-steepened deglaciated topography to precipitation, which produced high denudation rates in wet areas that experienced frequent landsliding. We explored this interpretation using a model of postglacial surface processes that predicts the evolution of the topography and denudation rates within the deglaciated Washington Cascades. Specifically, we used the model to understand the controls on and timescales of landscape response to changes in the surface process regime after deglaciation. The postglacial adjustment of this landscape is modeled using a geomorphic-transport-law-based numerical model that includes processes of river incision, hillslope diffusion, and stochastic landslides. The surface lowering due to landslides is parameterized using a physically-based slope stability model coupled to a stochastic model of the generation of landslides. The model parameters of river incision and stochastic landslides are calibrated based on the rates and distribution of thousand-year-timescale denudation rates measured from cosmogenic 10Be isotopes. 
The probability distribution of model parameters required to fit the observed denudation rates shows comparable ranges from previous studies in similar rock types and climatic conditions. The calibrated parameters suggest that the dominant sediment source of river sediments originates from stochastic landslides. The magnitude of landslide denudation rates is determined by failure density (similar to landslide frequency), while their spatial distribution is largely controlled by precipitation and slope angles. Simulation results show that denudation rates decay over time and take approximately 130-180 ka to reach steady-state rates. This response timescale is longer than glacial/interglacial cycles, suggesting that frequent climatic perturbations during the Quaternary may prevent these types of landscapes from reaching a dynamic equilibrium with postglacial processes.

  20. Stochastic representation of fire behavior in a wildland fire protection planning model for California.

    Treesearch

    J. Keith Gilless; Jeremy S. Fried

    1998-01-01

    A fire behavior module was developed for the California Fire Economics Simulator version 2 (CFES2), a stochastic simulation model of initial attack on wildland fire used by the California Department of Forestry and Fire Protection. Fire rate of spread (ROS) and fire dispatch level (FDL) for simulated fires "occurring" on the same day are determined by making...

  1. Modeling animal movements using stochastic differential equations

    Treesearch

    Haiganoush K. Preisler; Alan A. Ager; Bruce K. Johnson; John G. Kie

    2004-01-01

    We describe the use of bivariate stochastic differential equations (SDE) for modeling movements of 216 radiocollared female Rocky Mountain elk at the Starkey Experimental Forest and Range in northeastern Oregon. Spatially and temporally explicit vector fields were estimated using approximating difference equations and nonparametric regression techniques. Estimated...
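
    A bivariate SDE movement model of the kind described is typically integrated with the Euler-Maruyama scheme. The sketch below assumes a simple linear attraction field purely for illustration; the study itself estimates its drift fields nonparametrically from the elk telemetry data.

```python
import math
import random

def euler_maruyama(drift, sigma, xy0, dt, n_steps, seed=0):
    """Integrate dX = mu(X) dt + sigma dW for a planar position with the
    Euler-Maruyama scheme; drift(x, y) returns the vector (mu_x, mu_y)."""
    rng = random.Random(seed)
    x, y = xy0
    path = [(x, y)]
    s = sigma * math.sqrt(dt)  # per-step noise amplitude
    for _ in range(n_steps):
        mx, my = drift(x, y)
        x += mx * dt + s * rng.gauss(0.0, 1.0)
        y += my * dt + s * rng.gauss(0.0, 1.0)
        path.append((x, y))
    return path

# Illustrative attraction toward a point (e.g., a preferred foraging site
# at the origin); a hypothetical stand-in for an estimated vector field.
home = lambda x, y: (-0.5 * x, -0.5 * y)
path = euler_maruyama(home, sigma=0.2, xy0=(5.0, -3.0), dt=0.1, n_steps=500)
```

    Replacing `home` with a drift interpolated from observed displacements gives the data-driven version of the same integration.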

  2. Breaking the theoretical scaling limit for predicting quasiparticle energies: the stochastic GW approach.

    PubMed

    Neuhauser, Daniel; Gao, Yi; Arntsen, Christopher; Karshenas, Cyrus; Rabani, Eran; Baer, Roi

    2014-08-15

    We develop a formalism to calculate the quasiparticle energy within the GW many-body perturbation correction to the density functional theory. The occupied and virtual orbitals of the Kohn-Sham Hamiltonian are replaced by stochastic orbitals used to evaluate the Green function G, the polarization potential W, and, thereby, the GW self-energy. The stochastic GW (sGW) formalism relies on novel theoretical concepts such as stochastic time-dependent Hartree propagation, stochastic matrix compression, and spatial or temporal stochastic decoupling techniques. Beyond the theoretical interest, the formalism enables linear scaling GW calculations breaking the theoretical scaling limit for GW as well as circumventing the need for energy cutoff approximations. We illustrate the method for silicon nanocrystals of varying sizes with N_{e}>3000 electrons.

  3. Impact of number of realizations on the suitability of simulated weather data for hydrologic and environmental applications

    USDA-ARS?s Scientific Manuscript database

    Stochastic weather generators are widely used in hydrological, environmental, and agricultural applications to simulate and forecast weather time series. However, such stochastic processes usually produce random outputs hence the question on how representative the generated data are if obtained fro...

  4. Gene regulatory networks: a coarse-grained, equation-free approach to multiscale computation.

    PubMed

    Erban, Radek; Kevrekidis, Ioannis G; Adalsteinsson, David; Elston, Timothy C

    2006-02-28

    We present computer-assisted methods for analyzing stochastic models of gene regulatory networks. The main idea that underlies this equation-free analysis is the design and execution of appropriately initialized short bursts of stochastic simulations; the results of these are processed to estimate coarse-grained quantities of interest, such as mesoscopic transport coefficients. In particular, using a simple model of a genetic toggle switch, we illustrate the computation of an effective free energy Phi and of a state-dependent effective diffusion coefficient D that characterize an unavailable effective Fokker-Planck equation. Additionally we illustrate the linking of equation-free techniques with continuation methods for performing a form of stochastic "bifurcation analysis"; estimation of mean switching times in the case of a bistable switch is also implemented in this equation-free context. The accuracy of our methods is tested by direct comparison with long-time stochastic simulations. This type of equation-free analysis appears to be a promising approach to computing features of the long-time, coarse-grained behavior of certain classes of complex stochastic models of gene regulatory networks, circumventing the need for long Monte Carlo simulations.

  5. Digital hardware implementation of a stochastic two-dimensional neuron model.

    PubMed

    Grassia, F; Kohno, T; Levi, T

    2016-11-01

    This study explores the feasibility of stochastic neuron simulation in digital systems (FPGAs), realizing an implementation of a two-dimensional neuron model. The stochasticity is added by a source of current noise in the silicon neuron using an Ornstein-Uhlenbeck process. This approach uses digital computation to emulate individual neuron behavior using fixed point arithmetic operations. The neuron model's computations are performed in arithmetic pipelines. It was designed in VHDL and simulated prior to mapping onto the FPGA. The experimental results confirmed the validity of the developed stochastic FPGA implementation, which makes the silicon neuron more biologically plausible for future hybrid experiments.
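
    The Ornstein-Uhlenbeck noise current at the heart of this approach has an exact fixed-step update rule, which is what makes it attractive for digital hardware. A software sketch follows; the parameter values are illustrative, and a VHDL realization would use fixed-point arithmetic as the paper describes.

```python
import math
import random

def ou_noise(theta, mu, sigma, dt, n_steps, x0=0.0, seed=0):
    """Exact discretization of the Ornstein-Uhlenbeck process
    dI = theta*(mu - I) dt + sigma dW: the update
    I_{n+1} = mu + (I_n - mu)*exp(-theta*dt) + b*xi is exact for any dt."""
    rng = random.Random(seed)
    a = math.exp(-theta * dt)
    b = sigma * math.sqrt((1.0 - a * a) / (2.0 * theta))
    x, out = x0, [x0]
    for _ in range(n_steps):
        x = mu + (x - mu) * a + b * rng.gauss(0.0, 1.0)
        out.append(x)
    return out

# Illustrative noise-current trace (arbitrary units, assumed parameters).
noise = ou_noise(theta=5.0, mu=0.0, sigma=1.0, dt=1e-3, n_steps=10000)
```

    Because the update is exact, the hardware time step can match the pipeline clock without the discretization bias an Euler step would introduce.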

  6. Multicellular automaticity of cardiac cell monolayers: effects of density and spatial distribution of pacemaker cells

    NASA Astrophysics Data System (ADS)

    Elber Duverger, James; Boudreau-Béland, Jonathan; Le, Minh Duc; Comtois, Philippe

    2014-11-01

    Self-organization of pacemaker (PM) activity of interconnected elements is important to the general theory of reaction-diffusion systems as well as for applications such as PM activity in cardiac tissue to initiate beating of the heart. Monolayer cultures of neonatal rat ventricular myocytes (NRVMs) are often used as experimental models in studies on cardiac electrophysiology. These monolayers exhibit automaticity (spontaneous activation) of their electrical activity. At low plated density, cells usually show a heterogeneous population consisting of PM and quiescent excitable cells (QECs). It is therefore highly probable that monolayers of NRVMs consist of a heterogeneous network of the two cell types. However, the effects of density and spatial distribution of the PM cells on spontaneous activity of monolayers remain unknown. Thus, a simple stochastic pattern formation algorithm was implemented to distribute PM and QECs in a binary-like 2D network. A FitzHugh-Nagumo excitable medium was used to simulate spontaneous and propagating electrical activity. Simulations showed a clear nonlinear dependence of spontaneous activity (occurrence and amplitude of spontaneous period) on the spatial patterns of PM cells. In most simulations, the first initiation sites were found to be located near the substrate boundaries. Comparison with experimental data obtained from cardiomyocyte monolayers shows important similarities in the position of initiation site activity. However, limitations in the model that do not reflect the complex beat-to-beat variation found in experiments indicate the need for a more realistic cardiomyocyte representation.

  7. Multithreaded Stochastic PDES for Reactions and Diffusions in Neurons.

    PubMed

    Lin, Zhongwei; Tropper, Carl; Mcdougal, Robert A; Patoary, Mohammad Nazrul Ishlam; Lytton, William W; Yao, Yiping; Hines, Michael L

    2017-07-01

    Cells exhibit stochastic behavior when the number of molecules is small. Hence a stochastic reaction-diffusion simulator capable of working at scale can provide a more accurate view of molecular dynamics within the cell. This paper describes a parallel discrete event simulator, Neuron Time Warp-Multi Thread (NTW-MT), developed for the simulation of reaction-diffusion models of neurons. To the best of our knowledge, this is the first parallel discrete event simulator oriented towards stochastic simulation of chemical reactions in a neuron. The simulator was developed as part of the NEURON project. NTW-MT is optimistic and thread-based, and attempts to capitalize on the multi-core architectures used in high-performance machines. It makes use of a multi-level queue for the pending event set and a single roll-back message in place of individual anti-messages, to disperse contention and decrease the overhead of processing rollbacks. Global Virtual Time is computed asynchronously both within and among processes to eliminate thread-synchronization overhead. Memory usage is managed so as to avoid locking and unlocking during allocation and de-allocation and to maximize cache locality. We verified our simulator on a calcium buffer model. We examined its performance on a calcium wave model, comparing it to a process-based optimistic simulator and to a threaded simulator that uses a single priority queue per thread; our multi-threaded simulator achieves superior performance to both. Finally, we demonstrated the scalability of our simulator on a larger CICR model and a more detailed CICR model.

  8. The subtle business of model reduction for stochastic chemical kinetics

    NASA Astrophysics Data System (ADS)

    Gillespie, Dan T.; Cao, Yang; Sanft, Kevin R.; Petzold, Linda R.

    2009-02-01

    This paper addresses the problem of simplifying chemical reaction networks by adroitly reducing the number of reaction channels and chemical species. The analysis adopts a discrete-stochastic point of view and focuses on the model reaction set S1⇌S2→S3, whose simplicity allows all the mathematics to be done exactly. The advantages and disadvantages of replacing this reaction set with a single S3-producing reaction are analyzed quantitatively using novel criteria for measuring simulation accuracy and simulation efficiency. It is shown that in all cases in which such a model reduction can be accomplished accurately and with a significant gain in simulation efficiency, a procedure called the slow-scale stochastic simulation algorithm provides a robust and theoretically transparent way of implementing the reduction.

  9. The subtle business of model reduction for stochastic chemical kinetics.

    PubMed

    Gillespie, Dan T; Cao, Yang; Sanft, Kevin R; Petzold, Linda R

    2009-02-14

    This paper addresses the problem of simplifying chemical reaction networks by adroitly reducing the number of reaction channels and chemical species. The analysis adopts a discrete-stochastic point of view and focuses on the model reaction set S1⇌S2→S3, whose simplicity allows all the mathematics to be done exactly. The advantages and disadvantages of replacing this reaction set with a single S3-producing reaction are analyzed quantitatively using novel criteria for measuring simulation accuracy and simulation efficiency. It is shown that in all cases in which such a model reduction can be accomplished accurately and with a significant gain in simulation efficiency, a procedure called the slow-scale stochastic simulation algorithm provides a robust and theoretically transparent way of implementing the reduction.
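The model reaction set S1⇌S2→S3 is small enough to simulate exactly with Gillespie's direct method. A minimal sketch (rate constants are arbitrary placeholders):

```python
import numpy as np

def ssa(x, c, t_end, seed=0):
    """Gillespie direct-method SSA for S1 <-> S2 -> S3.
    x = [n1, n2, n3] initial copy numbers; c = (c1, c2, c3) are the
    stochastic rate constants of S1->S2, S2->S1 and S2->S3."""
    rng = np.random.default_rng(seed)
    x = list(x)
    t = 0.0
    nu = [(-1, 1, 0), (1, -1, 0), (0, -1, 1)]      # stoichiometry per channel
    while t < t_end:
        a = [c[0] * x[0], c[1] * x[1], c[2] * x[1]]  # propensities
        a0 = sum(a)
        if a0 == 0.0:
            break                                    # everything absorbed into S3
        t += rng.exponential(1.0 / a0)               # time to next reaction
        j = rng.choice(3, p=[ai / a0 for ai in a])   # which channel fires
        x = [xi + ni for xi, ni in zip(x, nu[j])]
    return x
```

The slow-scale SSA studied in the paper would instead fire only the slow S3-producing channel, with the fast S1⇌S2 pair treated as equilibrated.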

  10. Biochemical Network Stochastic Simulator (BioNetS): software for stochastic modeling of biochemical networks.

    PubMed

    Adalsteinsson, David; McMillen, David; Elston, Timothy C

    2004-03-08

    Intrinsic fluctuations due to the stochastic nature of biochemical reactions can have large effects on the response of biochemical networks. This is particularly true for pathways that involve transcriptional regulation, where generally there are two copies of each gene and the number of messenger RNA (mRNA) molecules can be small. Therefore, there is a need for computational tools for developing and investigating stochastic models of biochemical networks. We have developed the software package Biochemical Network Stochastic Simulator (BioNetS) for efficiently and accurately simulating stochastic models of biochemical networks. BioNetS has a graphical user interface that allows models to be entered in a straightforward manner, and allows the user to specify the type of random variable (discrete or continuous) for each chemical species in the network. The discrete variables are simulated using an efficient implementation of the Gillespie algorithm. For the continuous random variables, BioNetS constructs and numerically solves the appropriate chemical Langevin equations. The software package has been developed to scale efficiently with network size, thereby allowing large systems to be studied. BioNetS runs as a BioSpice agent and can be downloaded from http://www.biospice.org. BioNetS can also be run as a stand-alone package. All the required files are accessible from http://x.amath.unc.edu/BioNetS. We have developed BioNetS to be a reliable tool for studying the stochastic dynamics of large biochemical networks. Important features of BioNetS are its ability to handle hybrid models that consist of both continuous and discrete random variables and its ability to model cell growth and division. We have verified the accuracy and efficiency of the numerical methods by considering several test systems.
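The chemical Langevin equations that BioNetS constructs for continuous species can be illustrated on the simplest network. A hedged sketch of a CLE integrator for a birth-death process (this is a generic example, not BioNetS's actual implementation):

```python
import numpy as np

def cle_birth_death(k=50.0, gamma=1.0, x0=50.0, dt=0.005, t_end=200.0, seed=0):
    """Chemical Langevin equation for the birth-death network
    0 -> S (rate k), S -> 0 (rate gamma*X), Euler-Maruyama step:
    dX = (k - gamma*X) dt + sqrt(k) dW1 - sqrt(gamma*X) dW2.
    Stationary mean and variance are both k/gamma (Poisson-like)."""
    rng = np.random.default_rng(seed)
    n = int(t_end / dt)
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        xi = max(x[i], 0.0)                  # guard against negative copy numbers
        drift = (k - gamma * xi) * dt
        noise = (np.sqrt(k * dt) * rng.standard_normal()
                 - np.sqrt(gamma * xi * dt) * rng.standard_normal())
        x[i + 1] = x[i] + drift + noise
    return x
```

For species with large copy numbers the CLE trajectory closely tracks the discrete Gillespie trajectory, which is the rationale for hybrid discrete/continuous models.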

  11. Stochastic Convection Parameterizations

    NASA Technical Reports Server (NTRS)

    Teixeira, Joao; Reynolds, Carolyn; Suselj, Kay; Matheou, Georgios

    2012-01-01

    Keywords: computational fluid dynamics, radiation, clouds, turbulence, convection, gravity waves, surface interaction, radiation interaction, cloud and aerosol microphysics, complexity (vegetation, biogeochemistry), radiation versus turbulence/convection, stochastic approach, non-linearities, Monte Carlo, high resolutions, large-eddy simulations, cloud structure, plumes, saturation in the tropics, forecasting, parameterizations, stochastic, radiation-cloud interaction, hurricane forecasts

  12. Toward Development of a Stochastic Wake Model: Validation Using LES and Turbine Loads

    DOE PAGES

    Moon, Jae; Manuel, Lance; Churchfield, Matthew; ...

    2017-12-28

    Wind turbines within an array do not experience free-stream, undisturbed flow fields. Rather, the flow fields at internal turbines are influenced by wakes generated by upwind units and exhibit different dynamic characteristics relative to the free stream. The International Electrotechnical Commission (IEC) standard 61400-1 for the design of wind turbines considers only a deterministic wake model for the design of a wind plant. This study is focused on the development of a stochastic model for waked wind fields. First, high-fidelity physics-based waked wind velocity fields are generated using Large-Eddy Simulation (LES). Stochastic characteristics of these LES waked wind velocity fields, including mean and turbulence components, are analyzed. Wake-related mean and turbulence field parameters are then estimated for use with a stochastic model, using Multivariate Multiple Linear Regression (MMLR) with the LES data. To validate the wind fields simulated with the stochastic model, wind turbine tower and blade loads are generated using aeroelastic simulation for utility-scale wind turbine models and compared with those based directly on the LES inflow. The study's overall objective is to offer efficient, validated stochastic approaches that are computationally tractable for assessing the performance and loads of turbines operating in wakes.

  13. Toward Development of a Stochastic Wake Model: Validation Using LES and Turbine Loads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moon, Jae; Manuel, Lance; Churchfield, Matthew

    Wind turbines within an array do not experience free-stream, undisturbed flow fields. Rather, the flow fields at internal turbines are influenced by wakes generated by upwind units and exhibit different dynamic characteristics relative to the free stream. The International Electrotechnical Commission (IEC) standard 61400-1 for the design of wind turbines considers only a deterministic wake model for the design of a wind plant. This study is focused on the development of a stochastic model for waked wind fields. First, high-fidelity physics-based waked wind velocity fields are generated using Large-Eddy Simulation (LES). Stochastic characteristics of these LES waked wind velocity fields, including mean and turbulence components, are analyzed. Wake-related mean and turbulence field parameters are then estimated for use with a stochastic model, using Multivariate Multiple Linear Regression (MMLR) with the LES data. To validate the wind fields simulated with the stochastic model, wind turbine tower and blade loads are generated using aeroelastic simulation for utility-scale wind turbine models and compared with those based directly on the LES inflow. The study's overall objective is to offer efficient, validated stochastic approaches that are computationally tractable for assessing the performance and loads of turbines operating in wakes.
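The MMLR step maps operating conditions to wake parameters. A generic least-squares sketch (the predictor and response matrices here are hypothetical placeholders for the LES-derived quantities):

```python
import numpy as np

def mmlr_fit(X, Y):
    """Multivariate multiple linear regression: fit B (with intercept row)
    so that Y ≈ [1, X] @ B, by ordinary least squares.  X is (n, p)
    operating conditions, Y is (n, q) wake-field parameters."""
    A = np.hstack([np.ones((X.shape[0], 1)), X])
    B, *_ = np.linalg.lstsq(A, Y, rcond=None)
    return B

def mmlr_predict(B, X):
    """Predict all q responses at once from new conditions X."""
    A = np.hstack([np.ones((X.shape[0], 1)), X])
    return A @ B
```

Fitting all responses jointly against a shared design matrix is what distinguishes MMLR from running q separate univariate regressions, although the coefficient estimates coincide under OLS.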

  14. Spontaneous action potentials and neural coding in unmyelinated axons.

    PubMed

    O'Donnell, Cian; van Rossum, Mark C W

    2015-04-01

    The voltage-gated Na and K channels in neurons are responsible for action potential generation. Because ion channels open and close stochastically, spontaneous (ectopic) action potentials can occur even in the absence of stimulation. While spontaneous action potentials have been studied in detail in single-compartment models, studies of spatially extended processes have been limited. The simulations and analysis presented here show that the spontaneous rate in unmyelinated axons depends nonmonotonically on axon length, that the spontaneous activity has sub-Poisson statistics, and that spontaneous spikes can hamper neural coding by reducing the probability of transmitting the first spike in a train.
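Sub-Poisson statistics are conventionally quantified by the Fano factor of spike counts; a minimal sketch:

```python
import numpy as np

def fano_factor(counts):
    """Fano factor of spike counts: variance/mean.  Equals 1 for a Poisson
    process; values below 1 indicate sub-Poisson (more regular) firing,
    as reported for the spontaneous spikes above."""
    counts = np.asarray(counts, dtype=float)
    return counts.var() / counts.mean()
```

For example, counts drawn from a Poisson distribution give a Fano factor near 1, while perfectly regular counts give 0.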

  15. Stochastic flux analysis of chemical reaction networks

    PubMed Central

    2013-01-01

    Background Chemical reaction networks provide an abstraction scheme for a broad range of models in biology and ecology. The two common means for simulating these networks are the deterministic and the stochastic approaches. The traditional deterministic approach, based on differential equations, enjoys a rich set of analysis techniques, including a treatment of reaction fluxes. However, the discrete stochastic simulations, which provide advantages in some cases, lack a quantitative treatment of network fluxes. Results We describe a method for flux analysis of chemical reaction networks, where flux is given by the flow of species between reactions in stochastic simulations of the network. Extending discrete event simulation algorithms, our method constructs several data structures, and thereby reveals a variety of statistics about resource creation and consumption during the simulation. We use these structures to quantify the causal interdependence and relative importance of the reactions at arbitrary time intervals with respect to the network fluxes. This allows us to construct reduced networks that have the same flux behavior, and to compare these networks, also with respect to their time series. We demonstrate our approach on an extended example based on a published ODE model of Rho GTP-binding proteins, and on other models from biology and ecology. Conclusions We provide a fully stochastic treatment of flux analysis. As in deterministic analysis, our method delivers the network behavior in terms of species transformations. Moreover, our stochastic analysis can be applied, not only at steady state, but at arbitrary time intervals, and used to identify the flow of specific species between specific reactions. Our case study of Rho GTP-binding proteins reveals the role played by the cyclic reverse fluxes in tuning the behavior of this network. PMID:24314153

  16. Stochastic flux analysis of chemical reaction networks.

    PubMed

    Kahramanoğulları, Ozan; Lynch, James F

    2013-12-07

    Chemical reaction networks provide an abstraction scheme for a broad range of models in biology and ecology. The two common means for simulating these networks are the deterministic and the stochastic approaches. The traditional deterministic approach, based on differential equations, enjoys a rich set of analysis techniques, including a treatment of reaction fluxes. However, the discrete stochastic simulations, which provide advantages in some cases, lack a quantitative treatment of network fluxes. We describe a method for flux analysis of chemical reaction networks, where flux is given by the flow of species between reactions in stochastic simulations of the network. Extending discrete event simulation algorithms, our method constructs several data structures, and thereby reveals a variety of statistics about resource creation and consumption during the simulation. We use these structures to quantify the causal interdependence and relative importance of the reactions at arbitrary time intervals with respect to the network fluxes. This allows us to construct reduced networks that have the same flux behavior, and to compare these networks, also with respect to their time series. We demonstrate our approach on an extended example based on a published ODE model of Rho GTP-binding proteins, and on other models from biology and ecology. We provide a fully stochastic treatment of flux analysis. As in deterministic analysis, our method delivers the network behavior in terms of species transformations. Moreover, our stochastic analysis can be applied, not only at steady state, but at arbitrary time intervals, and used to identify the flow of specific species between specific reactions. Our case study of Rho GTP-binding proteins reveals the role played by the cyclic reverse fluxes in tuning the behavior of this network.
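At its coarsest, a stochastic flux estimate divides each channel's firing count by the window length and contracts with the stoichiometry; the paper's provenance-tracking data structures refine this to flows of specific species between specific reactions. A minimal sketch of the coarse version:

```python
import numpy as np

def reaction_fluxes(firings, nu, t_window):
    """Time-averaged reaction fluxes from SSA output.  firings[j] is how
    many times channel j fired in a window of length t_window; nu[j] is its
    stoichiometry vector.  Returns per-channel firing rates and the net
    species production rates over the window."""
    rates = np.asarray(firings, dtype=float) / t_window
    net = rates @ np.asarray(nu, dtype=float)
    return rates, net
```

For a reversible pair, comparing the two channel rates (rather than only their difference in `net`) exposes the cyclic reverse fluxes that the abstract highlights.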

  17. Joint space-time geostatistical model for air quality surveillance

    NASA Astrophysics Data System (ADS)

    Russo, A.; Soares, A.; Pereira, M. J.

    2009-04-01

    Air pollution, and the public's widespread concern about air quality, are nowadays considered a global problem. Although the introduction of strict air pollution regulations has reduced pollution from industry and power stations, the growing number of cars on the road poses a new pollution problem. Given the characteristics of atmospheric circulation and the residence times of certain pollutants in the atmosphere, generalized and growing interest in air quality issues has led to intensified research and to the publication of several articles of quite different levels of scientific depth. Like most natural phenomena, air quality can be seen as a space-time process, where space-time relationships usually have quite different characteristics and levels of uncertainty. As a result, the simultaneous integration of space and time is not an easy task, and the problem has been addressed by a variety of methodologies. The use of stochastic models and neural networks to characterize the space-time dispersion of air quality is becoming common practice. The main objective of this work is to produce an air quality model that allows forecasting critical concentration episodes of a certain pollutant by means of a hybrid approach, based on the combined use of neural network models and stochastic simulations. A stochastic simulation of the spatial component with a space-time trend model is proposed to characterize critical situations, taking into account data from the past and a space-time trend from the recent past. To identify near-future critical episodes, predicted values from neural networks are used at each monitoring station. In this paper, we describe the design of a hybrid forecasting tool for ambient NO2 concentrations in Lisbon, Portugal.

  18. Exact and approximate stochastic simulation of intracellular calcium dynamics.

    PubMed

    Wieder, Nicolas; Fink, Rainer H A; Wegner, Frederic von

    2011-01-01

    In simulations of chemical systems, the main task is to find an exact or approximate solution of the chemical master equation (CME) that satisfies certain constraints with respect to computation time and accuracy. While Brownian motion simulations of single molecules are often too time-consuming to represent the mesoscopic level, the classical Gillespie algorithm is a stochastically exact algorithm that provides satisfying results in the representation of calcium microdomains. Gillespie's algorithm can be approximated via the tau-leap method and the chemical Langevin equation (CLE). Both methods lead to a substantial acceleration in computation time at a relatively small cost in accuracy. Eliminating the noise terms leads to the classical, deterministic reaction rate equations (RRE). For complex multiscale systems, hybrid simulations are increasingly proposed to combine the advantages of stochastic and deterministic algorithms. Striated muscle cells (e.g., cardiac and skeletal muscle cells) are often used as exemplary cell types in this context: their properties are well described and they express many common calcium-dependent signaling pathways. The purpose of the present paper is to provide an overview of the aforementioned simulation approaches and their mutual relationships, in the spectrum ranging from stochastic to deterministic algorithms.
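The exact SSA and its tau-leap approximation differ only in how events are batched: instead of firing one reaction per step, tau-leaping fires a Poisson-distributed number of events per fixed leap. A minimal sketch for a single decay reaction (parameters arbitrary):

```python
import numpy as np

def tau_leap_decay(n0=1000, k=1.0, tau=0.01, t_end=5.0, seed=0):
    """Poisson tau-leaping for the decay reaction S -> 0 with propensity
    k*n: in each leap of length tau, fire Poisson(k*n*tau) events,
    capped at n so the population cannot go negative."""
    rng = np.random.default_rng(seed)
    n, t, traj = n0, 0.0, [n0]
    while t < t_end:
        fired = min(n, rng.poisson(k * n * tau))
        n -= fired
        t += tau
        traj.append(n)
    return traj
```

Replacing the Poisson draws by Gaussian increments yields the CLE, and dropping the noise entirely recovers the deterministic RRE dn/dt = -k n, mirroring the hierarchy the review describes.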

  19. Stochastic wave-function simulation of irreversible emission processes for open quantum systems in a non-Markovian environment

    NASA Astrophysics Data System (ADS)

    Polyakov, Evgeny A.; Rubtsov, Alexey N.

    2018-02-01

    When conducting numerical simulations of quantum transport, the main obstacle is the rapid growth of the dimension of the entangled Hilbert subspace. Quantum Monte Carlo simulation techniques, while capable of treating problems of high dimension, are hindered by the so-called "sign problem". In quantum transport there is a fundamental asymmetry between the processes of emission and absorption of environment excitations: the emitted excitations are rapidly and irreversibly scattered away, and only a small part of them is absorbed back by the open subsystem, thus exerting the non-Markovian self-action of the subsystem onto itself. We devise a method for the exact simulation of the dominant quantum emission processes, while taking the small backaction effects into account in an approximate, self-consistent way. This approach allows us to efficiently simulate the real-time dynamics of small quantum subsystems immersed in a non-Markovian bath for long times, reaching the quasistationary regime. As an example we calculate the spatial quench dynamics of the Kondo cloud for a bosonized Kondo impurity model.

  20. Monte Carlo simulation of reflection spectra of random multilayer media strongly scattering and absorbing light

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meglinskii, I V

    2001-12-31

    The reflection spectra of a multilayer random medium - the human skin - strongly scattering and absorbing light are numerically simulated. The propagation of light in the medium and the absorption spectra are simulated by the stochastic Monte Carlo method, which combines the calculation of real photon trajectories with the statistical-weight method. The model takes into account the inhomogeneous spatial distribution of blood vessels, water, and melanin, the degree of blood oxygenation, and the hematocrit index. The attenuation of the incident radiation caused by reflection and refraction at Fresnel boundaries of layers inside the medium is also considered. The simulated reflection spectra are compared with the experimental reflection spectra of the human skin. It is shown that the set of parameters used to describe the optical properties of the skin layers and their possible variations, despite being far from complete, is nevertheless sufficient for the simulation of the reflection spectra of the human skin and their quantitative analysis.
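The combination of explicit photon trajectories with statistical weights can be illustrated in a heavily simplified setting: a 1D semi-infinite medium with isotropic up/down scattering (the skin model itself is multilayered and 3D, with Fresnel boundaries). A minimal sketch:

```python
import numpy as np

def mc_reflectance(mu_a, mu_s, n_photons=20000, seed=0):
    """1D statistical-weight Monte Carlo for a semi-infinite slab: photons
    take exponential free paths, deposit the absorbed fraction
    mu_a/(mu_a+mu_s) of their weight at each interaction, and scatter up
    or down with equal probability.  Weight escaping through the z < 0
    surface is scored as diffuse reflectance."""
    rng = np.random.default_rng(seed)
    mu_t = mu_a + mu_s
    albedo = mu_s / mu_t
    refl = 0.0
    for _ in range(n_photons):
        z, direction, w = 0.0, 1.0, 1.0          # start at surface, heading down
        while w > 1e-4:                          # terminate when weight is negligible
            z += direction * rng.exponential(1.0 / mu_t)
            if z < 0.0:
                refl += w                        # escaped: score remaining weight
                break
            w *= albedo                          # absorb (1 - albedo) of the weight
            direction = 1.0 if rng.random() < 0.5 else -1.0
    return refl / n_photons
```

Scanning mu_a over wavelength-dependent chromophore absorption (blood, water, melanin) is what turns this kind of kernel into a simulated reflection spectrum.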

  1. Stochastic modeling and simulation of reaction-diffusion system with Hill function dynamics.

    PubMed

    Chen, Minghan; Li, Fei; Wang, Shuo; Cao, Yang

    2017-03-14

    Stochastic simulation of reaction-diffusion systems presents great challenges for spatiotemporal biological modeling and simulation. One widely used framework for stochastic simulation of reaction-diffusion systems is the reaction-diffusion master equation (RDME). Previous studies have found that for the RDME, as the discretization size approaches zero, the reaction time for bimolecular reactions in high-dimensional domains tends to infinity. In this paper, we demonstrate that in the 1D domain, highly nonlinear reaction dynamics given by the Hill function may also change dramatically when the discretization size falls below a critical value. Moreover, we discuss methods to avoid this problem: smoothing over space, fixed-length smoothing over space, and a hybrid method. Our analysis reveals that the switch-like Hill dynamics reduces to a linear function of discretization size when the discretization size is small enough. The three proposed methods can correctly (to a given precision) simulate Hill function dynamics in the microscopic RDME system.
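The Hill propensity at the center of this analysis has the standard form x^n / (K^n + x^n). A minimal sketch, with a note on why fine discretization is problematic:

```python
import numpy as np

def hill(x, k, n):
    """Hill function x^n / (k^n + k-free x^n terms): a sigmoidal, switch-like
    response with half-maximum at x = k and steepness set by n.  For x << k
    the response is strongly suppressed (~ (x/k)^n), so in compartments so
    small that copy numbers stay far below the threshold, the switch-like
    character of the dynamics is lost."""
    x = np.asarray(x, dtype=float)
    return x**n / (k**n + x**n)
```

In an RDME setting, each compartment evaluates such a propensity on its local copy number, which shrinks with the discretization size h; this is the regime the paper's smoothing and hybrid corrections address.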

  2. Theoretical study of a molecular turbine.

    PubMed

    Perez-Carrasco, R; Sancho, J M

    2013-10-01

    We present an analytic and stochastic simulation study of a molecular engine working as a turbine driven by a flux of particles. We focus on the physical observables of velocity, flux, power, and efficiency. The control parameters are the external conservative force and the particle densities. We revise a simpler previous study by using a more realistic model containing multiple equidistant vanes, complemented by stochastic simulations of the particles and the turbine. We show that the effects of thermal fluctuations on the flux and the efficiency of these nanometric devices are relevant at the working scale of the system. The stochastic simulations of the Brownian motion of the particles and the turbine support the simplified analytical calculations.

  3. The viability of ADVANTG deterministic method for synthetic radiography generation

    NASA Astrophysics Data System (ADS)

    Bingham, Andrew; Lee, Hyoung K.

    2018-07-01

    Fast simulation techniques for generating synthetic radiographic images at high resolution are helpful when new radiation imaging systems are designed. However, the standard stochastic approach requires lengthy run times, with poorer statistics at higher resolution. The viability of a deterministic approach to synthetic radiography image generation was explored, with the aim of quantifying the computational speedup over the stochastic method. ADVANTG was compared to MCNP in multiple scenarios, including a small radiography system prototype, to simulate high-resolution radiography images. By using the ADVANTG deterministic code to simulate radiography images, the computational time was found to decrease by a factor of 10 to 13 relative to the MCNP stochastic approach while retaining image quality.

  4. Intrinsic noise analyzer: a software package for the exploration of stochastic biochemical kinetics using the system size expansion.

    PubMed

    Thomas, Philipp; Matuschek, Hannes; Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with circadian rhythms. The software iNA is freely available as executable binaries for Linux, MacOSX and Microsoft Windows, as well as the full source code under an open source license.

  5. Intrinsic Noise Analyzer: A Software Package for the Exploration of Stochastic Biochemical Kinetics Using the System Size Expansion

    PubMed Central

    Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with circadian rhythms. The software iNA is freely available as executable binaries for Linux, MacOSX and Microsoft Windows, as well as the full source code under an open source license. PMID:22723865
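The Linear Noise Approximation that iNA automates can be written out by hand for the simplest network. A sketch for a one-species birth-death process (a textbook example, not iNA's code):

```python
def lna_birth_death(k, gamma):
    """Linear Noise Approximation for the birth-death network
    0 -> S (rate k), S -> 0 (rate gamma*x).  The macroscopic fixed point
    is phi = k/gamma; the stationary LNA variance solves the (scalar)
    Lyapunov equation 2*J*var + D = 0 with drift Jacobian J = -gamma and
    diffusion D = k + gamma*phi evaluated at the fixed point."""
    phi = k / gamma
    J = -gamma
    D = k + gamma * phi          # = 2k at the fixed point
    var = -D / (2.0 * J)         # = k/gamma: Poisson statistics, as expected
    return phi, var
```

For multi-species networks, J becomes the Jacobian matrix and the Lyapunov equation J C + C J^T + D = 0 is solved for the covariance matrix C, which is essentially what the system size expansion machinery automates.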

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bellesia, Giovanni; Bales, Benjamin B.

    Here, we investigate, via Brownian dynamics simulations, the reaction dynamics of a generic, nonlinear chemical network under spatial confinement and crowding conditions. Specifically, the Willamowski-Rossler chemical reaction system has been "extended" and considered as a prototype reaction-diffusion system. These results are potentially relevant to a number of open problems in biophysics and biochemistry, such as the synthesis of primitive cellular units (protocells) and the definition of their role in the chemical origin of life, and the characterization of vesicle-mediated drug delivery processes. More generally, the computational approach presented in this work makes the case for the use of spatial stochastic simulation methods for the study of biochemical networks in vivo, where the "well-mixed" approximation is invalid and both thermal and intrinsic fluctuations, linked to the possible presence of molecular species in low copy numbers, cannot be averaged out.

  7. Time series analysis of personal exposure to ambient air pollution and mortality using an exposure simulator.

    PubMed

    Chang, Howard H; Fuentes, Montserrat; Frey, H Christopher

    2012-09-01

    This paper describes a modeling framework for estimating the acute effects of personal exposure to ambient air pollution in a time series design. First, a spatial hierarchical model is used to relate Census tract-level daily ambient concentrations and simulated exposures for a subset of the study period. The complete exposure time series is then imputed for risk estimation. Modeling exposure via a statistical model considerably reduces the computational burden associated with simulating personal exposures. This allows us to consider personal exposures at a finer spatial resolution, to improve exposure assessment, and for a longer study period. The proposed approach is applied to an analysis of fine particulate matter of <2.5 μm in aerodynamic diameter (PM2.5) and daily mortality in the New York City metropolitan area during the period 2001-2005. Personal PM2.5 exposures were simulated from the Stochastic Human Exposure and Dose Simulation. Accounting for exposure uncertainty, the authors estimated a 2.32% (95% posterior interval: 0.68, 3.94) increase in mortality per 10 μg/m³ increase in personal exposure to PM2.5 from outdoor sources on the previous day. The corresponding estimate per 10 μg/m³ increase in PM2.5 ambient concentration was 1.13% (95% confidence interval: 0.27, 2.00). The risks of mortality associated with PM2.5 were also higher during the summer months.

  8. The High-Throughput Stochastic Human Exposure and Dose Simulation Model (SHEDS-HT) & The Chemical and Products Database (CPDat)

    EPA Science Inventory

    The Stochastic Human Exposure and Dose Simulation Model – High-Throughput (SHEDS-HT) is a U.S. Environmental Protection Agency research tool for predicting screening-level (low-tier) exposures to chemicals in consumer products. This course will present an overview of this m...

  9. Comparison of holstein and jersey milk production with a new stochastic animal reproduction model

    USDA-ARS?s Scientific Manuscript database

    Holsteins and Jerseys are the most popular breeds in the US dairy industry. We built a stochastic, Monte Carlo life events simulation model in Python to test if Jersey cattle’s higher conception rate offsets their lower milk production. The model simulates individual cows and their life events such ...

  10. Development and Evaluation of a New Air Exchange Rate Algorithm for the Stochastic Human Exposure and Dose Simulation Model

    EPA Science Inventory

    between-home and between-city variability in residential pollutant infiltration. This is likely a result of differences in home ventilation, or air exchange rates (AER). The Stochastic Human Exposure and Dose Simulation (SHEDS) model is a population exposure model that uses a pro...

  11. Parameter-based stochastic simulation of selection and breeding for multiple traits

    Treesearch

    Jennifer Myszewski; Thomas Byram; Floyd Bridgwater

    2006-01-01

    To increase the adaptability and economic value of plantations, tree improvement professionals often manage multiple traits in their breeding programs. When these traits are unfavorably correlated, breeders must weigh the economic importance of each trait and select for a desirable aggregate phenotype. Stochastic simulation allows breeders to test the effects of...

  12. Reflected stochastic differential equation models for constrained animal movement

    USGS Publications Warehouse

    Hanks, Ephraim M.; Johnson, Devin S.; Hooten, Mevin B.

    2017-01-01

    Movement for many animal species is constrained in space by barriers such as rivers, shorelines, or impassable cliffs. We develop an approach for modeling animal movement constrained in space by considering a class of constrained stochastic processes, reflected stochastic differential equations. Our approach generalizes existing methods for modeling unconstrained animal movement. We present methods for simulation and inference based on augmenting the constrained movement path with a latent unconstrained path and illustrate this augmentation with a simulation example and an analysis of telemetry data from a Steller sea lion (Eumatopias jubatus) in southeast Alaska.
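The reflection idea behind such constrained movement models can be conveyed with a minimal 1-D sketch (illustrative names and parameters, not the authors' latent augmented-path method): an Euler-Maruyama step is proposed, and any excursion past a barrier is folded back by mirror reflection.

```python
import numpy as np

def reflect(x, lo, hi):
    """Fold a proposed position back into [lo, hi] by mirror reflection."""
    width = hi - lo
    y = (x - lo) % (2.0 * width)
    return lo + (2.0 * width - y if y > width else y)

def simulate_reflected_bm(x0=0.5, lo=0.0, hi=1.0, sigma=0.3,
                          dt=0.01, n_steps=1000, seed=42):
    """Euler-Maruyama for Brownian motion reflected at lo and hi."""
    rng = np.random.default_rng(seed)
    path = np.empty(n_steps + 1)
    path[0] = x0
    for t in range(n_steps):
        step = sigma * np.sqrt(dt) * rng.standard_normal()
        path[t + 1] = reflect(path[t] + step, lo, hi)
    return path

path = simulate_reflected_bm()
```

The unconstrained proposal plays the role of the latent path; the reflected value is the constrained position actually recorded.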

  13. State-dependent biasing method for importance sampling in the weighted stochastic simulation algorithm.

    PubMed

    Roh, Min K; Gillespie, Dan T; Petzold, Linda R

    2010-11-07

    The weighted stochastic simulation algorithm (wSSA) was developed by Kuwahara and Mura [J. Chem. Phys. 129, 165101 (2008)] to efficiently estimate the probabilities of rare events in discrete stochastic systems. The wSSA uses importance sampling to enhance the statistical accuracy in the estimation of the probability of the rare event. The original algorithm biases the reaction selection step with a fixed importance sampling parameter. In this paper, we introduce a novel method where the biasing parameter is state-dependent. The new method features improved accuracy, efficiency, and robustness.
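As a toy illustration of the wSSA machinery (this sketch uses a fixed biasing parameter, i.e., the original scheme the paper improves upon, with illustrative rates and names), a rare-event probability for a birth-death process can be estimated by biasing the reaction-selection step and carrying a likelihood-ratio weight:

```python
import numpy as np

def wssa_rare_event(k_prod=1.0, k_deg=0.1, x0=0, threshold=15,
                    t_max=10.0, bias=1.5, n_runs=2000, seed=1):
    """Weighted-SSA estimate of P(X reaches `threshold` by `t_max`)
    for the birth-death process 0 -> X (k_prod), X -> 0 (k_deg * X).
    Reaction selection is biased towards production; each run carries
    a likelihood-ratio weight so the estimator remains unbiased."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_runs):
        x, t, w = x0, 0.0, 1.0
        while x < threshold:
            a = np.array([k_prod, k_deg * x])   # true propensities
            a0 = a.sum()
            t += rng.exponential(1.0 / a0)      # time uses true propensity
            if t > t_max:
                break
            b = np.array([bias * a[0], a[1]])   # biased propensities
            b0 = b.sum()
            j = rng.choice(2, p=b / b0)         # biased reaction selection
            w *= (a[j] / a0) / (b[j] / b0)      # likelihood-ratio update
            x += 1 if j == 0 else -1
        if x >= threshold:
            total += w
    return total / n_runs

p_hat = wssa_rare_event()
```

The state-dependent method of the paper replaces the constant `bias` with a biasing parameter that varies with the current state `x`.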

  14. Spatial constraints govern competition of mutant clones in human epidermis.

    PubMed

    Lynch, M D; Lynch, C N S; Craythorne, E; Liakath-Ali, K; Mallipeddi, R; Barker, J N; Watt, F M

    2017-10-24

    Deep sequencing can detect somatic DNA mutations in tissues permitting inference of clonal relationships. This has been applied to human epidermis, where sun exposure leads to the accumulation of mutations and an increased risk of skin cancer. However, previous studies have yielded conflicting conclusions about the relative importance of positive selection and neutral drift in clonal evolution. Here, we sequenced larger areas of skin than previously, focusing on cancer-prone skin spanning five decades of life. The mutant clones identified were too large to be accounted for solely by neutral drift. Rather, using mathematical modelling and computational lattice-based simulations, we show that observed clone size distributions can be explained by a combination of neutral drift and stochastic nucleation of mutations at the boundary of expanding mutant clones that have a competitive advantage. These findings demonstrate that spatial context and cell competition cooperate to determine the fate of a mutant stem cell.
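The flavour of such lattice-based simulations can be conveyed by a minimal neutral-drift sketch (a voter-model update on a periodic grid; purely illustrative, not the authors' model, which additionally includes boundary nucleation and clonal advantage):

```python
import numpy as np

rng = np.random.default_rng(2)

# Voter-model update on a periodic lattice: a random cell adopts the
# clone label of a random von Neumann neighbour (pure neutral drift).
L = 30
NEIGH = [(-1, 0), (1, 0), (0, -1), (0, 1)]
grid = np.arange(L * L).reshape(L, L)   # every cell starts as its own clone

for _ in range(200_000):
    i, j = rng.integers(L, size=2)
    di, dj = NEIGH[rng.integers(4)]
    grid[i, j] = grid[(i + di) % L, (j + dj) % L]

n_clones = np.unique(grid).size   # drift coarsens the clone mosaic
```

Under neutral drift alone the clone-size distribution stays narrow; the paper's point is that the observed large clones require an added competitive advantage at clone boundaries.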

  15. Output Feedback Stabilization for a Class of Multi-Variable Bilinear Stochastic Systems with Stochastic Coupling Attenuation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Qichun; Zhou, Jinglin; Wang, Hong

In this paper, stochastic coupling attenuation is investigated for a class of multi-variable bilinear stochastic systems and a novel output feedback m-block backstepping controller with linear estimator is designed, where gradient descent optimization is used to tune the design parameters of the controller. It is shown that the trajectories of the closed-loop stochastic systems are bounded in probability and that the stochastic coupling of the system outputs can be effectively attenuated by the proposed control algorithm. Moreover, the stability of the stochastic systems is analyzed, and the effectiveness of the proposed method is demonstrated using a simulated example.

  16. Recent topographic evolution and erosion of the deglaciated Washington Cascades inferred from a stochastic landscape evolution model

    NASA Astrophysics Data System (ADS)

    Moon, Seulgi; Shelef, Eitan; Hilley, George E.

    2015-05-01

In this study, we model postglacial surface processes and examine the evolution of the topography and denudation rates within the deglaciated Washington Cascades to understand the controls on and time scales of landscape response to changes in the surface process regime after deglaciation. The postglacial adjustment of this landscape is modeled using a geomorphic-transport-law-based numerical model that includes processes of river incision, hillslope diffusion, and stochastic landslides. The surface lowering due to landslides is parameterized using a physically based slope stability model coupled to a stochastic model of the generation of landslides. The model parameters of river incision and stochastic landslides are calibrated based on the rates and distribution of thousand-year-time scale denudation rates measured from cosmogenic 10Be isotopes. The probability distributions of these model parameters, calculated with a Bayesian inversion scheme, show ranges comparable to those reported in previous studies for similar rock types and climatic conditions. The magnitude of landslide denudation rates is determined by failure density (similar to landslide frequency), whereas precipitation and slopes affect the spatial variation in landslide denudation rates. Simulation results show that postglacial denudation rates decay over time and take longer than 100 kyr to reach time-invariant rates. Over time, the landslides in the model consume the steep slopes characteristic of deglaciated landscapes. This response time scale is on the order of or longer than glacial/interglacial cycles, suggesting that frequent climatic perturbations during the Quaternary may produce a significant and prolonged impact on denudation and topography.

  17. Using Equation-Free Computation to Accelerate Network-Free Stochastic Simulation of Chemical Kinetics.

    PubMed

    Lin, Yen Ting; Chylek, Lily A; Lemons, Nathan W; Hlavacek, William S

    2018-06-21

    The chemical kinetics of many complex systems can be concisely represented by reaction rules, which can be used to generate reaction events via a kinetic Monte Carlo method that has been termed network-free simulation. Here, we demonstrate accelerated network-free simulation through a novel approach to equation-free computation. In this process, variables are introduced that approximately capture system state. Derivatives of these variables are estimated using short bursts of exact stochastic simulation and finite differencing. The variables are then projected forward in time via a numerical integration scheme, after which a new exact stochastic simulation is initialized and the whole process repeats. The projection step increases efficiency by bypassing the firing of numerous individual reaction events. As we show, the projected variables may be defined as populations of building blocks of chemical species. The maximal number of connected molecules included in these building blocks determines the degree of approximation. Equation-free acceleration of network-free simulation is found to be both accurate and efficient.
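The equation-free loop described above can be sketched with a birth-death process standing in for the fine-scale model (all names and parameters are illustrative): short exact-SSA bursts estimate the drift of a coarse variable (the ensemble mean) by finite differencing, the variable is projected forward, and a fresh ensemble is initialized ("lifted") from the projection.

```python
import numpy as np

def ssa_burst(x, k_prod, k_deg, t_burst, rng):
    """Exact SSA burst for the birth-death process
    0 -> X (rate k_prod), X -> 0 (rate k_deg * X)."""
    t = 0.0
    while True:
        a0 = k_prod + k_deg * x
        t += rng.exponential(1.0 / a0)
        if t > t_burst:
            return x
        x += 1 if rng.random() < k_prod / a0 else -1

def projective_integration(x0=0, k_prod=1.0, k_deg=0.1, n_ens=200,
                           t_burst=0.5, t_project=2.0, n_cycles=20, seed=0):
    """Coarse projective integration: short SSA bursts estimate the
    drift of the ensemble mean, which is projected forward in time;
    the ensemble is then re-initialized ("lifted") from the projection."""
    rng = np.random.default_rng(seed)
    ens = np.full(n_ens, x0)
    mean = float(x0)
    for _ in range(n_cycles):
        m0 = ens.mean()
        ens = np.array([ssa_burst(x, k_prod, k_deg, t_burst, rng)
                        for x in ens])
        m1 = ens.mean()
        slope = (m1 - m0) / t_burst               # finite-difference drift
        mean = m1 + slope * t_project             # projection step
        ens = rng.poisson(max(mean, 0.0), n_ens)  # lift to a new ensemble
    return mean

mean_est = projective_integration()  # stationary mean is k_prod/k_deg = 10
```

The projection step skips the firing of individual reaction events over `t_project`, which is where the acceleration comes from.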

  18. Climate SPHINX: evaluating the impact of resolution and stochastic physics parameterisations in the EC-Earth global climate model

    NASA Astrophysics Data System (ADS)

    Davini, Paolo; von Hardenberg, Jost; Corti, Susanna; Christensen, Hannah M.; Juricke, Stephan; Subramanian, Aneesh; Watson, Peter A. G.; Weisheimer, Antje; Palmer, Tim N.

    2017-03-01

    The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth system model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 up to 16 km). The project includes more than 120 simulations in both a historical scenario (1979-2008) and a climate change projection (2039-2068), together with coupled transient runs (1850-2100). A total of 20.4 million core hours have been used, made available from a single year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data have been produced on SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of post-processed data are stored on the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT data pilot project. This paper presents the technical and scientific set-up of the experiments, including the details on the forcing used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given. An improvement in the simulation of Euro-Atlantic atmospheric blocking following resolution increase is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate - specifically the Madden-Julian Oscillation and the tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on the large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).

  19. Preferential sampling and Bayesian geostatistics: Statistical modeling and examples.

    PubMed

    Cecconi, Lorenzo; Grisotto, Laura; Catelan, Dolores; Lagazio, Corrado; Berrocal, Veronica; Biggeri, Annibale

    2016-08-01

Preferential sampling refers to any situation in which the spatial process and the sampling locations are not stochastically independent. In this paper, we present two examples of geostatistical analysis in which the usual assumption of stochastic independence between the point process and the measurement process is violated. To account for preferential sampling, we specify a flexible and general Bayesian geostatistical model that includes a shared spatial random component. We apply the proposed model to two different case studies that allow us to highlight three different modeling and inferential aspects of geostatistical modeling under preferential sampling: (1) continuous or finite spatial sampling frame; (2) underlying causal model and relevant covariates; and (3) inferential goals related to mean prediction surface or prediction uncertainty.

  20. Compressing random microstructures via stochastic Wang tilings.

    PubMed

    Novák, Jan; Kučerová, Anna; Zeman, Jan

    2012-10-01

This Rapid Communication presents a stochastic Wang tiling-based technique to compress or reconstruct disordered microstructures on the basis of given spatial statistics. Unlike the existing approaches based on a single unit cell, it utilizes a finite set of tiles assembled by a stochastic tiling algorithm, thereby allowing it to accurately reproduce long-range orientation orders in a computationally efficient manner. Although the basic features of the method are demonstrated for a two-dimensional particulate suspension, the present framework is fully extensible to generic multidimensional media.

  1. Scattering theory of stochastic electromagnetic light waves.

    PubMed

    Wang, Tao; Zhao, Daomu

    2010-07-15

    We generalize scattering theory to stochastic electromagnetic light waves. It is shown that when a stochastic electromagnetic light wave is scattered from a medium, the properties of the scattered field can be characterized by a 3 x 3 cross-spectral density matrix. An example of scattering of a spatially coherent electromagnetic light wave from a deterministic medium is discussed. Some interesting phenomena emerge, including the changes of the spectral degree of coherence and of the spectral degree of polarization of the scattered field.

  2. Physically-based parameterization of spatially variable soil and vegetation using satellite multispectral data

    NASA Technical Reports Server (NTRS)

    Jasinski, Michael F.; Eagleson, Peter S.

    1989-01-01

    A stochastic-geometric landsurface reflectance model is formulated and tested for the parameterization of spatially variable vegetation and soil at subpixel scales using satellite multispectral images without ground truth. Landscapes are conceptualized as 3-D Lambertian reflecting surfaces consisting of plant canopies, represented by solid geometric figures, superposed on a flat soil background. A computer simulation program is developed to investigate image characteristics at various spatial aggregations representative of satellite observational scales, or pixels. The evolution of the shape and structure of the red-infrared space, or scattergram, of typical semivegetated scenes is investigated by sequentially introducing model variables into the simulation. The analytical moments of the total pixel reflectance, including the mean, variance, spatial covariance, and cross-spectral covariance, are derived in terms of the moments of the individual fractional cover and reflectance components. The moments are applied to the solution of the inverse problem: The estimation of subpixel landscape properties on a pixel-by-pixel basis, given only one multispectral image and limited assumptions on the structure of the landscape. The landsurface reflectance model and inversion technique are tested using actual aerial radiometric data collected over regularly spaced pecan trees, and using both aerial and LANDSAT Thematic Mapper data obtained over discontinuous, randomly spaced conifer canopies in a natural forested watershed. Different amounts of solar backscattered diffuse radiation are assumed and the sensitivity of the estimated landsurface parameters to those amounts is examined.

  3. Contributions of the stochastic shape wake model to predictions of aerodynamic loads and power under single wake conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doubrawa, P.; Barthelmie, R. J.; Wang, H.

The contribution of wake meandering and shape asymmetry to load and power estimates is quantified by comparing aeroelastic simulations initialized with different inflow conditions: an axisymmetric base wake, an unsteady stochastic shape wake, and a large-eddy simulation with rotating actuator-line turbine representation. Time series of blade-root and tower base bending moments are analyzed. We find that meandering has a large contribution to the fluctuation of the loads. Moreover, considering the wake edge intermittency via the stochastic shape model improves the simulation of load and power fluctuations and of the fatigue damage equivalent loads. Furthermore, these results indicate that the stochastic shape wake simulator is a valuable addition to simplified wake models when seeking to obtain higher-fidelity, computationally inexpensive predictions of loads and power.

  4. Contributions of the stochastic shape wake model to predictions of aerodynamic loads and power under single wake conditions

    DOE PAGES

    Doubrawa, P.; Barthelmie, R. J.; Wang, H.; ...

    2016-10-03

The contribution of wake meandering and shape asymmetry to load and power estimates is quantified by comparing aeroelastic simulations initialized with different inflow conditions: an axisymmetric base wake, an unsteady stochastic shape wake, and a large-eddy simulation with rotating actuator-line turbine representation. Time series of blade-root and tower base bending moments are analyzed. We find that meandering has a large contribution to the fluctuation of the loads. Moreover, considering the wake edge intermittency via the stochastic shape model improves the simulation of load and power fluctuations and of the fatigue damage equivalent loads. Furthermore, these results indicate that the stochastic shape wake simulator is a valuable addition to simplified wake models when seeking to obtain higher-fidelity, computationally inexpensive predictions of loads and power.

  5. On the deterministic and stochastic use of hydrologic models

    USGS Publications Warehouse

    Farmer, William H.; Vogel, Richard M.

    2016-01-01

    Environmental simulation models, such as precipitation-runoff watershed models, are increasingly used in a deterministic manner for environmental and water resources design, planning, and management. In operational hydrology, simulated responses are now routinely used to plan, design, and manage a very wide class of water resource systems. However, all such models are calibrated to existing data sets and retain some residual error. This residual, typically unknown in practice, is often ignored, implicitly trusting simulated responses as if they are deterministic quantities. In general, ignoring the residuals will result in simulated responses with distributional properties that do not mimic those of the observed responses. This discrepancy has major implications for the operational use of environmental simulation models as is shown here. Both a simple linear model and a distributed-parameter precipitation-runoff model are used to document the expected bias in the distributional properties of simulated responses when the residuals are ignored. The systematic reintroduction of residuals into simulated responses in a manner that produces stochastic output is shown to improve the distributional properties of the simulated responses. Every effort should be made to understand the distributional behavior of simulation residuals and to use environmental simulation models in a stochastic manner.
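The core point, that ignoring residuals understates the variability of simulated responses and that reintroducing resampled residuals restores it, can be sketched with synthetic data and a simple linear model (all data and names here are illustrative, not the paper's watershed models):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "observed" responses driven by a forcing plus noise
precip = rng.gamma(2.0, 5.0, size=500)
observed = 0.6 * precip + rng.normal(0.0, 3.0, size=500)

# Calibrate a simple linear model; its output is the deterministic simulation
slope, intercept = np.polyfit(precip, observed, 1)
deterministic = slope * precip + intercept
residuals = observed - deterministic

# Ignoring the residuals understates the variance of the observations ...
var_det, var_obs = deterministic.var(), observed.var()

# ... while reintroducing resampled residuals (stochastic use) restores it
stochastic = deterministic + rng.choice(residuals, size=residuals.size)
var_sto = stochastic.var()
```

Simple bootstrap resampling assumes independent residuals; for serially correlated hydrologic residuals a time-series error model would be needed instead.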

  6. Geographic variation in density-dependent dynamics impacts the synchronizing effect of dispersal and regional stochasticity

    Treesearch

Andrew M. Liebhold; Derek M. Johnson; Ottar N. Bjørnstad

    2006-01-01

    Explanations for the ubiquitous presence of spatially synchronous population dynamics have assumed that density-dependent processes governing the dynamics of local populations are identical among disjunct populations, and low levels of dispersal or small amounts of regionalized stochasticity ("Moran effect") can act to synchronize populations. In this study...

  7. Seed availability constrains plant species sorting along a soil fertility gradient

    Treesearch

    Bryan L. Foster; Erin J. Questad; Cathy D. Collins; Cheryl A. Murphy; Timothy L. Dickson; Val H. Smith

    2011-01-01

    1. Spatial variation in species composition within and among communities may be caused by deterministic, niche-based species sorting in response to underlying environmental heterogeneity as well as by stochastic factors such as dispersal limitation and variable species pools. An important goal in ecology is to reconcile deterministic and stochastic perspectives of...

  8. Stochastic Nature in Cellular Processes

    NASA Astrophysics Data System (ADS)

    Liu, Bo; Liu, Sheng-Jun; Wang, Qi; Yan, Shi-Wei; Geng, Yi-Zhao; Sakata, Fumihiko; Gao, Xing-Fa

    2011-11-01

    The importance of stochasticity in cellular processes is increasingly recognized in both theoretical and experimental studies. General features of stochasticity in gene regulation and expression are briefly reviewed in this article, which include the main experimental phenomena, classification, quantization and regulation of noises. The correlation and transmission of noise in cascade networks are analyzed further and the stochastic simulation methods that can capture effects of intrinsic and extrinsic noise are described.

  9. Introducing Stochastic Simulation of Chemical Reactions Using the Gillespie Algorithm and MATLAB: Revisited and Augmented

    ERIC Educational Resources Information Center

    Argoti, A.; Fan, L. T.; Cruz, J.; Chou, S. T.

    2008-01-01

    The stochastic simulation of chemical reactions, specifically, a simple reversible chemical reaction obeying the first-order, i.e., linear, rate law, has been presented by Martinez-Urreaga and his collaborators in this journal. The current contribution is intended to complement and augment their work in two aspects. First, the simple reversible…
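For reference, a reversible first-order reaction A <-> B of the kind discussed can be simulated with Gillespie's direct method in a few lines (a generic sketch with illustrative rate constants, independent of the MATLAB treatment in the article):

```python
import numpy as np

def gillespie_reversible(n_a=100, n_b=0, k1=1.0, k2=0.5,
                         t_max=20.0, seed=3):
    """Gillespie direct method for the reversible isomerization
    A -> B (propensity k1 * nA) and B -> A (propensity k2 * nB)."""
    rng = np.random.default_rng(seed)
    t, times, a_counts = 0.0, [0.0], [n_a]
    while True:
        a_fwd, a_rev = k1 * n_a, k2 * n_b
        a0 = a_fwd + a_rev
        t += rng.exponential(1.0 / a0)   # time to next reaction
        if t > t_max:
            break
        if rng.random() < a_fwd / a0:    # choose which reaction fires
            n_a, n_b = n_a - 1, n_b + 1
        else:
            n_a, n_b = n_a + 1, n_b - 1
        times.append(t)
        a_counts.append(n_a)
    return np.array(times), np.array(a_counts)

times, a_traj = gillespie_reversible()
# At equilibrium, E[nA] = (k2 / (k1 + k2)) * (nA + nB) = 100 / 3
```

Individual trajectories fluctuate around the deterministic equilibrium, which is the stochastic effect the article uses the reaction to illustrate.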

  10. Analysing initial attack on wildland fires using stochastic simulation.

    Treesearch

    Jeremy S. Fried; J. Keith Gilless; James Spero

    2006-01-01

    Stochastic simulation models of initial attack on wildland fire can be designed to reflect the complexity of the environmental, administrative, and institutional context in which wildland fire protection agencies operate, but such complexity may come at the cost of a considerable investment in data acquisition and management. This cost may be well justified when it...

  11. Integrating Ecosystem Carbon Dynamics into State-and-Transition Simulation Models of Land Use/Land Cover Change

    NASA Astrophysics Data System (ADS)

    Sleeter, B. M.; Daniel, C.; Frid, L.; Fortin, M. J.

    2016-12-01

    State-and-transition simulation models (STSMs) provide a general approach for incorporating uncertainty into forecasts of landscape change. Using a Monte Carlo approach, STSMs generate spatially-explicit projections of the state of a landscape based upon probabilistic transitions defined between states. While STSMs are based on the basic principles of Markov chains, they have additional properties that make them applicable to a wide range of questions and types of landscapes. A current limitation of STSMs is that they are only able to track the fate of discrete state variables, such as land use/land cover (LULC) classes. There are some landscape modelling questions, however, for which continuous state variables - for example carbon biomass - are also required. Here we present a new approach for integrating continuous state variables into spatially-explicit STSMs. Specifically we allow any number of continuous state variables to be defined for each spatial cell in our simulations; the value of each continuous variable is then simulated forward in discrete time as a stochastic process based upon defined rates of change between variables. These rates can be defined as a function of the realized states and transitions of each cell in the STSM, thus providing a connection between the continuous variables and the dynamics of the landscape. We demonstrate this new approach by (1) developing a simple IPCC Tier 3 compliant model of ecosystem carbon biomass, where the continuous state variables are defined as terrestrial carbon biomass pools and the rates of change as carbon fluxes between pools, and (2) integrating this carbon model with an existing LULC change model for the state of Hawaii, USA.
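The coupling of discrete state transitions to continuous per-cell variables can be sketched as follows (a hypothetical two-state landscape with an illustrative carbon pool; all probabilities and rates are invented for illustration, and this is not the Hawaii model):

```python
import numpy as np

rng = np.random.default_rng(11)

FOREST, GRASS = 0, 1
p_disturb = 0.02        # P(forest -> grass) per step (illustrative)
p_recover = 0.05        # P(grass -> forest) per step (illustrative)
growth_forest, growth_grass = 2.0, 0.5   # carbon gain per step by state
loss_on_disturb = 0.8   # fraction of the carbon pool lost at disturbance

n_cells, n_steps = 1000, 50
state = np.full(n_cells, FOREST)
carbon = np.full(n_cells, 100.0)   # continuous state variable per cell

for _ in range(n_steps):
    u = rng.random(n_cells)
    to_grass = (state == FOREST) & (u < p_disturb)
    to_forest = (state == GRASS) & (u < p_recover)
    carbon[to_grass] *= 1.0 - loss_on_disturb   # flux tied to the transition
    state[to_grass] = GRASS
    state[to_forest] = FOREST
    carbon += np.where(state == FOREST, growth_forest, growth_grass)
```

The discrete Markov-chain dynamics drive the continuous pool: carbon fluxes depend on each cell's realized state and transitions, mirroring the connection the abstract describes.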

  12. ENVIRONMENT: a computational platform to stochastically simulate reacting and self-reproducing lipid compartments

    NASA Astrophysics Data System (ADS)

    Mavelli, Fabio; Ruiz-Mirazo, Kepa

    2010-09-01

'ENVIRONMENT' is a computational platform developed over the last few years to stochastically simulate the dynamics and stability of chemically reacting protocellular systems. Here we present and describe some of its main features, showing how the stochastic kinetics approach can be applied to study the time evolution of reaction networks in heterogeneous conditions, particularly when supramolecular lipid structures (micelles, vesicles, etc) coexist with aqueous domains. These conditions are of special relevance for understanding the origins of cellular, self-reproducing compartments in the context of prebiotic chemistry and evolution. We contrast our simulation results with real laboratory experiments, aiming to bring together theoretical and experimental research on protocell and minimal artificial cell systems.

  13. StochKit2: software for discrete stochastic simulation of biochemical systems with events.

    PubMed

    Sanft, Kevin R; Wu, Sheng; Roh, Min; Fu, Jin; Lim, Rone Kwei; Petzold, Linda R

    2011-09-01

StochKit2 is the first major upgrade of the popular StochKit stochastic simulation software package. StochKit2 provides highly efficient implementations of several variants of Gillespie's stochastic simulation algorithm (SSA), and tau-leaping with automatic step size selection. StochKit2 features include automatic selection of the optimal SSA method based on model properties, event handling, and automatic parallelism on multicore architectures. The underlying structure of the code has been completely updated to provide a flexible framework for extending its functionality. StochKit2 runs on Linux/Unix, Mac OS X and Windows. It is freely available under GPL version 3 and can be downloaded from http://sourceforge.net/projects/stochkit/.
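The tau-leaping idea mentioned above can be illustrated for pure first-order decay, where drawing the number of firings from a binomial keeps the population non-negative (a generic sketch, not StochKit2 code; for this pure-decay case the binomial leap happens to be exact in distribution):

```python
import numpy as np

def binomial_tau_leap(n_a=1000, k=0.3, tau=0.1, t_max=10.0, seed=5):
    """Binomial tau-leap for first-order decay A -> 0 (rate k * nA).
    Each molecule independently fires within a leap of length tau with
    probability 1 - exp(-k * tau), so the draw can never make the
    population negative, unlike a plain Poisson leap."""
    rng = np.random.default_rng(seed)
    t = 0.0
    while t < t_max and n_a > 0:
        p_fire = 1.0 - np.exp(-k * tau)
        n_a -= rng.binomial(n_a, p_fire)
        t += tau
    return n_a

n_left = binomial_tau_leap()   # E[n_left] = 1000 * exp(-3), about 50
```

Each leap fires many reaction events at once, which is what makes tau-leaping faster than the exact SSA when propensities change slowly over a step.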

  14. Spatial stochastic modelling of the Hes1 gene regulatory network: intrinsic noise can explain heterogeneity in embryonic stem cell differentiation.

    PubMed

    Sturrock, Marc; Hellander, Andreas; Matzavinos, Anastasios; Chaplain, Mark A J

    2013-03-06

    Individual mouse embryonic stem cells have been found to exhibit highly variable differentiation responses under the same environmental conditions. The noisy cyclic expression of Hes1 and its downstream genes are known to be responsible for this, but the mechanism underlying this variability in expression is not well understood. In this paper, we show that the observed experimental data and diverse differentiation responses can be explained by a spatial stochastic model of the Hes1 gene regulatory network. We also propose experiments to control the precise differentiation response using drug treatment.

  15. A multiple-point geostatistical method for characterizing uncertainty of subsurface alluvial units and its effects on flow and transport

    USGS Publications Warehouse

    Cronkite-Ratcliff, C.; Phelps, G.A.; Boucher, A.

    2012-01-01

    This report provides a proof-of-concept to demonstrate the potential application of multiple-point geostatistics for characterizing geologic heterogeneity and its effect on flow and transport simulation. The study presented in this report is the result of collaboration between the U.S. Geological Survey (USGS) and Stanford University. This collaboration focused on improving the characterization of alluvial deposits by incorporating prior knowledge of geologic structure and estimating the uncertainty of the modeled geologic units. In this study, geologic heterogeneity of alluvial units is characterized as a set of stochastic realizations, and uncertainty is indicated by variability in the results of flow and transport simulations for this set of realizations. This approach is tested on a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. Yucca Flat was chosen as a data source for this test case because it includes both complex geologic and hydrologic characteristics and also contains a substantial amount of both surface and subsurface geologic data. Multiple-point geostatistics is used to model geologic heterogeneity in the subsurface. A three-dimensional (3D) model of spatial variability is developed by integrating alluvial units mapped at the surface with vertical drill-hole data. The SNESIM (Single Normal Equation Simulation) algorithm is used to represent geologic heterogeneity stochastically by generating 20 realizations, each of which represents an equally probable geologic scenario. A 3D numerical model is used to simulate groundwater flow and contaminant transport for each realization, producing a distribution of flow and transport responses to the geologic heterogeneity. From this distribution of flow and transport responses, the frequency of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary.

  16. Backward-stochastic-differential-equation approach to modeling of gene expression

    NASA Astrophysics Data System (ADS)

    Shamarova, Evelina; Chertovskih, Roman; Ramos, Alexandre F.; Aguiar, Paulo

    2017-03-01

    In this article, we introduce a backward method to model stochastic gene expression and protein-level dynamics. The protein amount is regarded as a diffusion process and is described by a backward stochastic differential equation (BSDE). Unlike many other SDE techniques proposed in the literature, the BSDE method is backward in time; that is, instead of initial conditions it requires the specification of end-point ("final") conditions, in addition to the model parametrization. To validate our approach we employ Gillespie's stochastic simulation algorithm (SSA) to generate (forward) benchmark data, according to predefined gene network models. Numerical simulations show that the BSDE method is able to correctly infer the protein-level distributions that preceded a known final condition, obtained originally from the forward SSA. This makes the BSDE method a powerful systems biology tool for time-reversed simulations, allowing, for example, the assessment of the biological conditions (e.g., protein concentrations) that preceded an experimentally measured event of interest (e.g., mitosis, apoptosis, etc.).

  17. Backward-stochastic-differential-equation approach to modeling of gene expression.

    PubMed

    Shamarova, Evelina; Chertovskih, Roman; Ramos, Alexandre F; Aguiar, Paulo

    2017-03-01

    In this article, we introduce a backward method to model stochastic gene expression and protein-level dynamics. The protein amount is regarded as a diffusion process and is described by a backward stochastic differential equation (BSDE). Unlike many other SDE techniques proposed in the literature, the BSDE method is backward in time; that is, instead of initial conditions it requires the specification of end-point ("final") conditions, in addition to the model parametrization. To validate our approach we employ Gillespie's stochastic simulation algorithm (SSA) to generate (forward) benchmark data, according to predefined gene network models. Numerical simulations show that the BSDE method is able to correctly infer the protein-level distributions that preceded a known final condition, obtained originally from the forward SSA. This makes the BSDE method a powerful systems biology tool for time-reversed simulations, allowing, for example, the assessment of the biological conditions (e.g., protein concentrations) that preceded an experimentally measured event of interest (e.g., mitosis, apoptosis, etc.).

  18. Perpendicular and Parallel Ion Stochastic Heating by Kinetic Alfvén Wave Turbulence in the Solar Wind

    NASA Astrophysics Data System (ADS)

    Hoppock, I. W.; Chandran, B. D. G.

    2017-12-01

    The dissipation of turbulence is a prime candidate to explain the heating of collisionless plasmas like the solar wind. We consider the heating of protons and alpha particles using test particle simulations with a broad spectrum of randomly phased kinetic Alfvén waves (KAWs). Previous research extensively simulated and analytically considered stochastic heating at low plasma beta for conditions similar to coronal holes and the near-sun solar wind. We verify the analytical models of proton and alpha-particle heating rates, and extend these simulations to plasmas with beta of order unity, as in the solar wind at 1 au. Furthermore, we consider cases with very large beta, of order 100, relevant to other astrophysical plasmas. We explore the parameter dependence of the critical KAW amplitude that breaks the gyro-center approximation and leads to stochastic gyro-orbits of the particles. Our results suggest that stochastic heating by KAW turbulence is an efficient heating mechanism for moderate- to high-beta plasmas.

  19. Numerical modeling of interface displacement in heterogeneously wetting porous media

    NASA Astrophysics Data System (ADS)

    Hiller, T.; Brinkmann, M.; Herminghaus, S.

    2013-12-01

    We use the mesoscopic particle method stochastic rotation dynamics (SRD) to simulate immiscible multi-phase flow on the pore and sub-pore scale in three dimensions. As an extension to the standard SRD method, we present an approach to implementing complex wettability on heterogeneous surfaces. We use 3D SRD to simulate immiscible two-phase flow through a model porous medium (a disordered packing of spherical beads) where the substrate exhibits different spatial wetting patterns. The simulations are designed to resemble experimental measurements of capillary pressure-saturation relationships. We show that the correlation length of the wetting patterns influences the temporal evolution of the interface and thus percolation, residual saturation, and the work dissipated during the fluid displacement. Our numerical results are in good qualitative agreement with the experimental data. Besides modeling flow in porous media, our SRD implementation allows us to address various questions of interfacial dynamics, e.g., the formation of capillary bridges between spherical beads, or droplets in microfluidic applications.

  20. CERENA: ChEmical REaction Network Analyzer--A Toolbox for the Simulation and Analysis of Stochastic Chemical Kinetics.

    PubMed

    Kazeroonian, Atefeh; Fröhlich, Fabian; Raue, Andreas; Theis, Fabian J; Hasenauer, Jan

    2016-01-01

    Gene expression, signal transduction and many other cellular processes are subject to stochastic fluctuations. The analysis of these stochastic chemical kinetics is important for understanding cell-to-cell variability and its functional implications, but it is also challenging. A multitude of exact and approximate descriptions of stochastic chemical kinetics have been developed; however, tools to automatically generate the descriptions and compare their accuracy and computational efficiency are missing. In this manuscript we introduce CERENA, a toolbox for the analysis of stochastic chemical kinetics using approximations of the Chemical Master Equation solution statistics. CERENA implements stochastic simulation algorithms and the finite state projection for microscopic descriptions of processes, the system size expansion and moment equations for meso- and macroscopic descriptions, as well as the novel conditional moment equations for a hybrid description. This unique collection of descriptions in a single toolbox facilitates the selection of appropriate modeling approaches. Unlike other software packages, the implementation of CERENA is completely general and allows, e.g., for time-dependent propensities and non-mass-action kinetics. By providing SBML import, symbolic model generation and simulation using MEX-files, CERENA is user-friendly and computationally efficient. The availability of forward and adjoint sensitivity analyses allows for further studies such as parameter estimation and uncertainty analysis. The MATLAB code implementing CERENA is freely available from http://cerenadevelopers.github.io/CERENA/.

  1. CERENA: ChEmical REaction Network Analyzer—A Toolbox for the Simulation and Analysis of Stochastic Chemical Kinetics

    PubMed Central

    Kazeroonian, Atefeh; Fröhlich, Fabian; Raue, Andreas; Theis, Fabian J.; Hasenauer, Jan

    2016-01-01

    Gene expression, signal transduction and many other cellular processes are subject to stochastic fluctuations. The analysis of these stochastic chemical kinetics is important for understanding cell-to-cell variability and its functional implications, but it is also challenging. A multitude of exact and approximate descriptions of stochastic chemical kinetics have been developed; however, tools to automatically generate the descriptions and compare their accuracy and computational efficiency are missing. In this manuscript we introduce CERENA, a toolbox for the analysis of stochastic chemical kinetics using approximations of the Chemical Master Equation solution statistics. CERENA implements stochastic simulation algorithms and the finite state projection for microscopic descriptions of processes, the system size expansion and moment equations for meso- and macroscopic descriptions, as well as the novel conditional moment equations for a hybrid description. This unique collection of descriptions in a single toolbox facilitates the selection of appropriate modeling approaches. Unlike other software packages, the implementation of CERENA is completely general and allows, e.g., for time-dependent propensities and non-mass-action kinetics. By providing SBML import, symbolic model generation and simulation using MEX-files, CERENA is user-friendly and computationally efficient. The availability of forward and adjoint sensitivity analyses allows for further studies such as parameter estimation and uncertainty analysis. The MATLAB code implementing CERENA is freely available from http://cerenadevelopers.github.io/CERENA/. PMID:26807911

  2. Getting the biggest birch for the bang: restoring and expanding upland birchwoods in the Scottish Highlands by managing red deer

    PubMed Central

    Tanentzap, Andrew J; Zou, James; Coomes, David A

    2013-01-01

    High deer populations threaten the conservation value of woodlands and grasslands, but predicting the success of deer culling, in terms of allowing vegetation to recover, is difficult. Numerical simulation modeling is one approach to gain insight into the outcomes of management scenarios. We develop a spatially explicit model to predict the responses of Betula spp. to red deer (Cervus elaphus) and land management in the Scottish Highlands. Our model integrates a Bayesian stochastic stage-based matrix model within the framework of a widely used individual-based forest simulation model, using data collected along spatial and temporal gradients in deer browsing. By initializing our model with the historical spatial locations of trees, we find that densities of juvenile trees (<3 m tall) predicted after 9–13 years closely match counts observed in the field. This is among the first tests of the accuracy of a dynamical simulation model for predicting the responses of tree regeneration to herbivores. We then test the relative importance of deer browsing, ground cover vegetation, and seed availability in facilitating landscape-level birch regeneration using simulations in which we varied these three variables. We find that deer primarily control transitions of birch to taller (>3 m) height tiers over 30 years, but regeneration also requires suitable ground cover for seedling establishment. Densities of adult seed sources did not influence regeneration, nor did an active management scenario where we altered the spatial configuration of adults by creating “woodland islets”. Our results show that managers interested in maximizing tree regeneration cannot simply reduce deer densities but must also improve ground cover for seedling establishment, and the model we develop now enables managers to quantify explicitly how much both these factors need to be altered. 
More broadly, our findings emphasize the need for land managers to consider the impacts of large herbivores rather than their densities. PMID:23919137

  3. Spreading speeds for plant populations in landscapes with low environmental variation.

    PubMed

    Gilbert, Mark A; Gaffney, Eamonn A; Bullock, James M; White, Steven M

    2014-12-21

    Characterising the spread of biological populations is crucial in responding to both biological invasions and the shifting of habitat under climate change. Spreading speeds can be studied through mathematical models such as the discrete-time integro-difference equation (IDE) framework. The usual approach in implementing IDE models has been to ignore spatial variation in the demographic and dispersal parameters and to assume that these are spatially homogeneous. On the other hand, real landscapes are rarely spatially uniform, with environmental variation being very important in determining biological spread. This raises the question of under what circumstances spatial structure need not be modelled explicitly. Recent work has shown that spatial variation can be ignored for the specific case where the scale of landscape variation is much smaller than the spreading population's dispersal scale. We consider more general types of landscape, where the spatial scales of environmental variation are arbitrarily large, but the maximum change in environmental parameters is relatively small. We find that the difference between the wave-speeds of populations spreading in a spatially structured periodic landscape and its homogenisation is, in general, proportional to ϵ², where ϵ governs the degree of environmental variation. For stochastically generated landscapes we numerically demonstrate that the error decays faster than ϵ. In both cases, this means that for sufficiently small ϵ, the homogeneous approximation is better than might be expected. Hence, in many situations, the precise details of the landscape can be ignored in favour of spatially homogeneous parameters. This means that field ecologists can use the homogeneous IDE as a relatively simple modelling tool, in terms of both measuring parameter values and doing the modelling itself. However, as ϵ increases, this homogeneous approximation loses its accuracy. 
The change in wave-speed due to the extrinsic (landscape) variation can be positive or negative, which is in contrast to the reduction in wave-speed caused by intrinsic stochasticity. To deal with the loss of accuracy as ϵ increases, we formulate a second-order approximation to the wave-speed for periodic landscapes and compare both approximations against the results of numerical simulation and show that they are both accurate for the range of landscapes considered. Copyright © 2014 Elsevier Ltd. All rights reserved.
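    For a linear, spatially homogeneous IDE with net reproductive rate R and a dispersal kernel with moment generating function M(s), the classical spreading speed is c* = min over s > 0 of (1/s) ln(R M(s)). The sketch below computes this homogeneous baseline numerically and checks it against the closed form for a Gaussian kernel; the parameter values are illustrative and not taken from the paper:

```python
import math

def ide_wave_speed(R, mgf, s_grid):
    """Spreading speed of a linear, homogeneous IDE:
    c* = min over s > 0 of (1/s) * ln(R * M(s))."""
    return min(math.log(R * mgf(s)) / s for s in s_grid)

R = 2.0                  # net reproductive rate (illustrative)
sigma = 1.0              # std dev of a Gaussian dispersal kernel
mgf = lambda s: math.exp(0.5 * sigma ** 2 * s ** 2)
s_grid = [0.01 * i for i in range(1, 500)]
c = ide_wave_speed(R, mgf, s_grid)
# analytic speed for a Gaussian kernel: sigma * sqrt(2 ln R)
c_exact = sigma * math.sqrt(2.0 * math.log(R))
```

    For non-Gaussian kernels the same one-line minimisation applies, provided the kernel's moment generating function exists.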

  4. New “Tau-Leap” Strategy for Accelerated Stochastic Simulation

    PubMed Central

    2015-01-01

    The “Tau-Leap” strategy for stochastic simulations of chemical reaction systems due to Gillespie and co-workers has had considerable impact on various applications. This strategy is reexamined with Chebyshev’s inequality for random variables, as it provides a rigorous probabilistic basis for a measured τ-leap, thus adding significantly to simulation efficiency. It is also shown that existing strategies for simulation times have no probabilistic assurance that they satisfy the τ-leap criterion, while the use of Chebyshev’s inequality leads to a specified degree of certainty with which the τ-leap criterion is satisfied. This reduces the loss of sample paths which do not comply with the τ-leap criterion. The performance of the present algorithm is assessed with respect to one discussed by Cao et al. (J. Chem. Phys. 2006, 124, 044109), a second pertaining to the binomial leap (Tian and Burrage, J. Chem. Phys. 2004, 121, 10356; Chatterjee et al., J. Chem. Phys. 2005, 122, 024112; Peng et al., J. Chem. Phys. 2007, 126, 224109), and a third regarding the midpoint Poisson leap (Peng et al., 2007; Gillespie, J. Chem. Phys. 2001, 115, 1716). The performance assessment is made by estimating the error in the histogram measured against that obtained with the so-called stochastic simulation algorithm. It is shown that the current algorithm displays notably less histogram error than its predecessor for a fixed computation time and, conversely, less computation time for a fixed accuracy. This computational advantage is an asset in repetitive calculations essential for modeling stochastic systems. The importance of stochastic simulations derives from diverse areas of application in the physical and biological sciences, process systems, and economics. Computational improvements such as those reported herein are therefore of considerable significance. PMID:25620846

  5. New "Tau-Leap" Strategy for Accelerated Stochastic Simulation.

    PubMed

    Ramkrishna, Doraiswami; Shu, Che-Chi; Tran, Vu

    2014-12-10

    The "Tau-Leap" strategy for stochastic simulations of chemical reaction systems due to Gillespie and co-workers has had considerable impact on various applications. This strategy is reexamined with Chebyshev's inequality for random variables, as it provides a rigorous probabilistic basis for a measured τ-leap, thus adding significantly to simulation efficiency. It is also shown that existing strategies for simulation times have no probabilistic assurance that they satisfy the τ-leap criterion, while the use of Chebyshev's inequality leads to a specified degree of certainty with which the τ-leap criterion is satisfied. This reduces the loss of sample paths which do not comply with the τ-leap criterion. The performance of the present algorithm is assessed with respect to one discussed by Cao et al. (J. Chem. Phys. 2006, 124, 044109), a second pertaining to the binomial leap (Tian and Burrage, J. Chem. Phys. 2004, 121, 10356; Chatterjee et al., J. Chem. Phys. 2005, 122, 024112; Peng et al., J. Chem. Phys. 2007, 126, 224109), and a third regarding the midpoint Poisson leap (Peng et al., 2007; Gillespie, J. Chem. Phys. 2001, 115, 1716). The performance assessment is made by estimating the error in the histogram measured against that obtained with the so-called stochastic simulation algorithm. It is shown that the current algorithm displays notably less histogram error than its predecessor for a fixed computation time and, conversely, less computation time for a fixed accuracy. This computational advantage is an asset in repetitive calculations essential for modeling stochastic systems. The importance of stochastic simulations derives from diverse areas of application in the physical and biological sciences, process systems, and economics. Computational improvements such as those reported herein are therefore of considerable significance.
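    The basic Poisson tau-leap that these papers build on advances the system by a fixed time step, firing each reaction channel a Poisson-distributed number of times. Below is a minimal sketch for a hypothetical birth-death system; this is the generic leap, not the Chebyshev-based τ selection proposed above, and all rate values are illustrative:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's Poisson sampler; adequate for the small means used here."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def tau_leap_birth_death(k_prod, k_deg, x0, t_end, tau, rng):
    """Generic Poisson tau-leap for 0 -> X (rate k_prod) and
    X -> 0 (rate k_deg * x): each channel fires Poisson(a * tau) times."""
    x, t = x0, 0.0
    while t < t_end:
        n_prod = poisson(k_prod * tau, rng)
        n_deg = poisson(k_deg * x * tau, rng)
        x = max(x + n_prod - n_deg, 0)  # crude guard against negative counts
        t += tau
    return x

rng = random.Random(0)
# stationary mean of this birth-death process is k_prod / k_deg = 50
samples = [tau_leap_birth_death(10.0, 0.2, 0, 50.0, 0.1, rng) for _ in range(200)]
mean = sum(samples) / len(samples)
```

    The leap condition requires τ small enough that propensities change little within a step; the papers above are about choosing τ with probabilistic guarantees rather than by this fixed-step rule of thumb.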

  6. Stochastic locality and master-field simulations of very large lattices

    NASA Astrophysics Data System (ADS)

    Lüscher, Martin

    2018-03-01

    In lattice QCD and other field theories with a mass gap, the field variables in distant regions of a physically large lattice are only weakly correlated. Accurate stochastic estimates of the expectation values of local observables may therefore be obtained from a single representative field. Such master-field simulations potentially allow very large lattices to be simulated, but require various conceptual and technical issues to be addressed. In this talk, an introduction to the subject is provided and some encouraging results of master-field simulations of the SU(3) gauge theory are reported.

  7. A moment-convergence method for stochastic analysis of biochemical reaction networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jiajun; Nie, Qing; Zhou, Tianshou, E-mail: mcszhtsh@mail.sysu.edu.cn

    Traditional moment-closure methods need to assume that high-order cumulants of a probability distribution are approximately zero. However, this strong assumption is not satisfied for many biochemical reaction networks. Here, we introduce convergent moments (defined mathematically as the coefficients in the Taylor expansion of the probability-generating function at some point) to overcome this drawback of the moment-closure methods. As such, we develop a new analysis method for stochastic chemical kinetics. This method provides an accurate approximation of the master probability equation (MPE). In particular, the connection between low-order convergent moments and rate constants can be more easily derived in terms of explicit and analytical forms, allowing insights that would be difficult to obtain through direct simulation or manipulation of the MPE. In addition, it provides an accurate and efficient way to compute steady-state or transient probability distributions, avoiding the algorithmic difficulty associated with stiffness of the MPE due to large differences in the sizes of rate constants. Applications of the method to several systems reveal nontrivial stochastic mechanisms of gene expression dynamics, e.g., intrinsic fluctuations can induce transient bimodality and amplify transient signals, and slow switching between promoter states can increase fluctuations in spatially heterogeneous signals. The overall approach has broad applications in the modeling, analysis, and computation of complex biochemical networks with intrinsic noise.
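    The definition of convergent moments above can be made concrete with a toy example. For a Poisson(λ) distribution the probability-generating function is G(z) = exp(λ(z − 1)), and its k-th Taylor coefficient at z = 1 equals λ^k / k! (the k-th factorial moment divided by k!). The sketch below, which is illustrative only and unrelated to the authors' code, estimates that coefficient from Monte Carlo samples:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's Poisson sampler; adequate for small means."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def pgf_taylor_coeff(samples, k):
    """k-th Taylor coefficient of the empirical PGF at z = 1:
    E[X (X-1) ... (X-k+1)] / k!  (k-th factorial moment over k!)."""
    total = 0
    for x in samples:
        prod = 1
        for j in range(k):
            prod *= (x - j)
        total += prod
    return total / (len(samples) * math.factorial(k))

rng = random.Random(1)
lam = 4.0
samples = [poisson(lam, rng) for _ in range(20000)]
# for Poisson(lam) the exact coefficient is lam**k / k!; k = 2 gives 8.0
coef2 = pgf_taylor_coeff(samples, 2)
```

    Unlike raw moments, these coefficients stay bounded for many heavy-shouldered distributions, which is the property the method above exploits.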

  8. A Stochastic Polygons Model for Glandular Structures in Colon Histology Images.

    PubMed

    Sirinukunwattana, Korsuk; Snead, David R J; Rajpoot, Nasir M

    2015-11-01

    In this paper, we present a stochastic model for glandular structures in histology images of tissue slides stained with Hematoxylin and Eosin, choosing colon tissue as an example. The proposed Random Polygons Model (RPM) treats each glandular structure in an image as a polygon made of a random number of vertices, where the vertices represent approximate locations of epithelial nuclei. We formulate the RPM as a Bayesian inference problem by defining a prior for the spatial connectivity and arrangement of neighboring epithelial nuclei and a likelihood for the presence of a glandular structure. The inference is made via a Reversible-Jump Markov chain Monte Carlo simulation. To the best of our knowledge, all existing published algorithms for gland segmentation are designed mainly to work on healthy samples, adenomas, and low-grade adenocarcinomas; at best, one of them has been demonstrated to work on intermediate-grade adenocarcinomas. Our experimental results show that the RPM yields favorable results, both quantitatively and qualitatively, for extraction of glandular structures in histology images of normal human colon tissues as well as benign and cancerous tissues, excluding undifferentiated carcinomas.

  9. Final Technical Report: Mathematical Foundations for Uncertainty Quantification in Materials Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plechac, Petr; Vlachos, Dionisios G.

    We developed path-wise information-theory-based and goal-oriented sensitivity analysis and parameter identification methods for complex high-dimensional dynamics, in particular for non-equilibrium extended molecular systems. The combination of these novel methodologies provided the first methods in the literature capable of handling UQ questions for stochastic complex systems with some or all of the following features: (a) multi-scale stochastic models such as (bio)chemical reaction networks with a very large number of parameters, (b) spatially distributed systems such as Kinetic Monte Carlo or Langevin Dynamics, (c) non-equilibrium processes typically associated with coupled physico-chemical mechanisms, driven boundary conditions, hybrid micro-macro systems, etc. A particular computational challenge arises in simulations of multi-scale reaction networks and molecular systems. Mathematical techniques were applied to in silico prediction of novel materials, with emphasis on the effect of microstructure on model uncertainty quantification (UQ). We outline acceleration methods to make calculations of real chemistry feasible, followed by two complementary tasks on structure optimization and microstructure-induced UQ.

  10. Spiking computation and stochastic amplification in a neuron-like semiconductor microstructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samardak, A. S.; Laboratory of Thin Film Technologies, Far Eastern Federal University, Vladivostok 690950; Nogaret, A.

    2011-05-15

    We have demonstrated the proof of principle of a semiconductor neuron, which has dendrites, an axon, and a soma and computes information encoded in electrical pulses in the same way as biological neurons. Electrical impulses applied to the dendrites diffuse along microwires to the soma. The soma is the active part of the neuron, which regenerates input pulses above a voltage threshold and transmits them into the axon. Our concept of the neuron is a major step forward because its spatial structure controls the timing of the pulses that arrive at the soma. The dendrites and axon act as transmission delay lines, which modify the information coded in the timing of pulses. Finally, we have shown that noise enhances the detection sensitivity of the neuron by helping the transmission of weak periodic signals. A maximum enhancement of signal transmission was observed at an optimum noise level, a phenomenon known as stochastic resonance. The experimental results are in excellent agreement with simulations of the FitzHugh-Nagumo model. Our neuron is therefore extremely well suited to providing feedback on the various mathematical approximations of neurons and to building functional networks.

  11. Macromolecular Crowding Regulates the Gene Expression Profile by Limiting Diffusion

    DOE PAGES

    Golkaram, Mahdi; Hellander, Stefan; Drawert, Brian; ...

    2016-11-28

    We seek to elucidate the role of macromolecular crowding in transcription and translation. It is well known that stochasticity in gene expression can lead to differential gene expression and heterogeneity in a cell population. Recent experimental observations by Tan et al. have improved our understanding of the functional role of macromolecular crowding. It can be inferred from their observations that macromolecular crowding can lead to robustness in gene expression, resulting in a more homogeneous cell population. We introduce a spatial stochastic model to provide insight into this process. Our results show that macromolecular crowding reduces noise (as measured by the kurtosis of the mRNA distribution) in a cell population by limiting the diffusion of transcription factors (i.e., removing the unstable intermediate states), and that crowding by large molecules reduces noise more efficiently than crowding by small molecules. Finally, our simulation results provide evidence that the local variation in chromatin density, as well as the total volume exclusion of the chromatin in the nucleus, can induce a homogeneous cell population.

  12. STOCHASTICITY AND EFFICIENCY IN SIMPLIFIED MODELS OF CORE-COLLAPSE SUPERNOVA EXPLOSIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardall, Christian Y.; Budiardja, Reuben D., E-mail: cardallcy@ornl.gov, E-mail: reubendb@utk.edu

    2015-11-01

    We present an initial report on 160 simulations of a highly simplified model of the post-bounce core-collapse supernova environment in three spatial dimensions (3D). We set different values of a parameter characterizing the impact of nuclear dissociation at the stalled shock in order to regulate the post-shock fluid velocity, thereby determining the relative importance of convection and the stationary accretion shock instability (SASI). While our convection-dominated runs comport with the paradigmatic notion of a “critical neutrino luminosity” for explosion at a given mass accretion rate (albeit with a nontrivial spread in explosion times just above threshold), the outcomes of our SASI-dominated runs are much more stochastic: a sharp threshold critical luminosity is “smeared out” into a rising probability of explosion over a ∼20% range of luminosity. We also find that the SASI-dominated models are able to explode with 3–4 times less efficient neutrino heating, indicating that progenitor properties, and fluid and neutrino microphysics, conducive to the SASI would make the neutrino-driven explosion mechanism more robust.

  13. A stochastic modeling of isotope exchange reactions in glutamine synthetase

    NASA Astrophysics Data System (ADS)

    Kazmiruk, N. V.; Boronovskiy, S. E.; Nartsissov, Ya R.

    2017-11-01

    The model presented in this work allows simulation of isotope exchange reactions at chemical equilibrium catalyzed by glutamine synthetase. To simulate the functioning of the enzyme, an algorithm based on a stochastic approach was applied. The dependence of the exchange rates for 14C and 32P on metabolite concentration was estimated. The simulation results confirmed the hypothesis that a preferred-order random binding mechanism is valid. Corresponding values of K0.5 were also obtained.

  14. Stochastic search in structural optimization - Genetic algorithms and simulated annealing

    NASA Technical Reports Server (NTRS)

    Hajela, Prabhat

    1993-01-01

    An account is given of illustrative applications of genetic algorithms and simulated annealing methods in structural optimization. The advantages of such stochastic search methods over traditional mathematical programming strategies are emphasized; it is noted that these methods offer a significantly higher probability of locating the global optimum in a multimodal design space. Both genetic search and simulated annealing can be effectively used in problems with a mix of continuous, discrete, and integer design variables.
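    The essence of simulated annealing is a random walk that always accepts improving moves and accepts worsening moves with probability exp(−Δ/T), with the temperature T lowered gradually so that the search can escape local optima early and settle late. A minimal sketch on a multimodal one-dimensional test function (not from the paper; all parameter values are illustrative):

```python
import math
import random

def simulated_annealing(f, x0, step, t0, cooling, n_iter, rng):
    """Accept improving moves always; accept worsening moves with
    probability exp(-delta / T), lowering T geometrically."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    T = t0
    for _ in range(n_iter):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        T *= cooling
    return best, fbest

rng = random.Random(2)
# multimodal test function; global minimum at x = 0 with f(0) = 0
f = lambda x: x * x + 10.0 * (1.0 - math.cos(x))
best, fbest = simulated_annealing(f, x0=8.0, step=1.0, t0=10.0,
                                  cooling=0.9995, n_iter=20000, rng=rng)
```

    A pure downhill search started at x0 = 8 would be at risk of stalling in the local basin near x ≈ 4.9; the thermal acceptance rule lets the walk cross that barrier while T is still moderate.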

  15. Climate SPHINX: High-resolution present-day and future climate simulations with an improved representation of small-scale variability

    NASA Astrophysics Data System (ADS)

    Davini, Paolo; von Hardenberg, Jost; Corti, Susanna; Subramanian, Aneesh; Weisheimer, Antje; Christensen, Hannah; Juricke, Stephan; Palmer, Tim

    2016-04-01

    The PRACE Climate SPHINX project investigates the sensitivity of climate simulations to model resolution and stochastic parameterization. The EC-Earth Earth-System Model is used to explore the impact of stochastic physics in 30-year climate integrations as a function of model resolution (from 80 km up to 16 km for the atmosphere). The experiments include more than 70 simulations in both a historical scenario (1979-2008) and a climate change projection (2039-2068), using RCP8.5 CMIP5 forcing. A total of 20 million core hours will have been used by the end of the project (March 2016), and about 150 TB of post-processed data will be available to the climate community. Preliminary results show a clear improvement in the representation of climate variability over the Euro-Atlantic region as resolution increases. More specifically, the well-known negative bias in atmospheric blocking over Europe is definitely resolved. High-resolution runs also show improved fidelity in the representation of tropical variability - such as the MJO and its propagation - over the low-resolution simulations. It is shown that including stochastic parameterization in the low-resolution runs helps to improve some aspects of the MJO propagation further. These findings show the importance of representing the impact of small-scale processes on large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).

  16. A multi-analysis approach for space-time and economic evaluation of risks related with livestock diseases: the example of FMD in Peru.

    PubMed

    Martínez-López, B; Ivorra, B; Fernández-Carrión, E; Perez, A M; Medel-Herrero, A; Sánchez-Vizcaíno, F; Gortázar, C; Ramos, A M; Sánchez-Vizcaíno, J M

    2014-04-01

    This study presents a multi-disciplinary decision-support tool, which integrates geo-statistics, social network analysis (SNA), a spatial-stochastic spread model, economic analysis, and mapping/visualization capabilities for the evaluation of the sanitary and socio-economic impact of livestock diseases under diverse epidemiologic scenarios. We illustrate the applicability of this tool using foot-and-mouth disease (FMD) in Peru as an example. The approach consisted of a flexible, multistep process that may be easily adapted based on data availability. The first module (mI) uses a geo-statistical approach for the estimation (if needed) of the distribution and abundance of the susceptible population (in the example here: cattle, swine, sheep, goats, and camelids) at farm level in the region or country of interest (Peru). The second module (mII) applies SNA to evaluate the farm-to-farm contact patterns and to explore the structure and frequency of between-farm animal movements as a proxy for potential disease introduction or spread. The third module (mIII) integrates the mI-II outputs into a spatial-stochastic model that simulates within- and between-farm FMD transmission. The economic module (mIV) connects outputs from mI-III to provide an estimate of the associated direct and indirect costs. A visualization module (mV) is also implemented to graph and map the outputs of modules I-IV. After 1000 simulated epidemics, the mean (95% probability interval) number of outbreaks, number of infected animals, epidemic duration, and direct costs were 37 (1, 1164), 2152 (1, 13,250), 63 days (0, 442), and US$ 1.2 million (1072, 9.5 million), respectively. Spread of disease was primarily local (<4.5 km), but the geolocation and type of the index farm strongly influenced the extent and spatial patterns of an epidemic. 
The approach is intended to support decisions in the last phase of the FMD eradication program in Peru, in particular to inform and support the implementation of risk-based surveillance and livestock insurance systems that may help to prevent and control potential FMD virus incursions into Peru. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Bed Capacity Planning Using Stochastic Simulation Approach in Cardiac-surgery Department of Teaching Hospitals, Tehran, Iran

    PubMed Central

    TORABIPOUR, Amin; ZERAATI, Hojjat; ARAB, Mohammad; RASHIDIAN, Arash; AKBARI SARI, Ali; SARZAIEM, Mahmuod Reza

    2016-01-01

    Background: To determine the hospital beds required, using a stochastic simulation approach, in cardiac surgery departments. Methods: This study was performed from Mar 2011 to Jul 2012 in three phases: first, collection of data from 649 patients in the cardiac surgery departments of two large teaching hospitals (in Tehran, Iran); second, statistical analysis and formulation of a multivariate linear regression model to determine factors that affect patients' length of stay; third, development of a stochastic simulation system (from admission to discharge) based on key parameters to estimate the required bed capacity. Results: The current cardiac surgery department with 33 beds can admit patients on only 90.7% of days (4535 d) and will require more than 33 beds on only 9.3% of days (efficient cut-off point). According to the simulation method, the studied cardiac surgery department will require 41–52 beds to admit all patients over the next 12 years. Finally, a one-day reduction in length of stay leads to a decreased need of two hospital beds annually. Conclusion: Variation in length of stay and its affecting factors can affect the required beds. Statistical and stochastic simulation models are useful methods to estimate and manage hospital beds based on key hospital parameters. PMID:27957466
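    The stochastic simulation idea used here can be illustrated with a toy bed-census model: Poisson admissions per day, each patient holding a bed for a random length of stay; by Little's law the mean census is approximately (admission rate) × (mean length of stay). This is a generic sketch with made-up parameters, not the authors' model:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's Poisson sampler; adequate for small means."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_census(arrival_rate, mean_los, n_days, rng):
    """Daily bed census: Poisson(arrival_rate) admissions per day,
    each patient holding a bed for an exponential length of stay
    rounded up to whole days."""
    census = [0] * n_days
    for day in range(n_days):
        for _ in range(poisson(arrival_rate, rng)):
            los = max(1, math.ceil(rng.expovariate(1.0 / mean_los)))
            for d in range(day, min(day + los, n_days)):
                census[d] += 1
    return census

rng = random.Random(3)
census = simulate_census(arrival_rate=4.0, mean_los=5.0, n_days=400, rng=rng)
steady = census[50:]                      # discard warm-up period
mean_occ = sum(steady) / len(steady)      # Little's law: roughly 4 * 5.5
# beds needed to cover 90% of days (the "efficient cut-off point" idea)
beds_90 = sorted(steady)[int(0.9 * len(steady))]
```

    Reducing mean_los by one day and re-running shows directly how a shorter stay translates into fewer required beds, mirroring the abstract's conclusion.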

  18. Stochastic simulation of enzyme-catalyzed reactions with disparate timescales.

    PubMed

    Barik, Debashis; Paul, Mark R; Baumann, William T; Cao, Yang; Tyson, John J

    2008-10-01

    Many physiological characteristics of living cells are regulated by protein interaction networks. Because the total numbers of these protein species can be small, molecular noise can have significant effects on the dynamical properties of a regulatory network. Computing these stochastic effects is made difficult by the large timescale separations typical of protein interactions (e.g., complex formation may occur in fractions of a second, whereas catalytic conversions may take minutes). Exact stochastic simulation may be very inefficient under these circumstances, and methods for speeding up the simulation without sacrificing accuracy have been widely studied. We show that the "total quasi-steady-state approximation" for enzyme-catalyzed reactions provides a useful framework for efficient and accurate stochastic simulations. The method is applied to three examples: a simple enzyme-catalyzed reaction where enzyme and substrate have comparable abundances, a Goldbeter-Koshland switch, where a kinase and phosphatase regulate the phosphorylation state of a common substrate, and coupled Goldbeter-Koshland switches that exhibit bistability. Simulations based on the total quasi-steady-state approximation accurately capture the steady-state probability distributions of all components of these reaction networks. In many respects, the approximation also faithfully reproduces time-dependent aspects of the fluctuations. The method is accurate even under conditions of poor timescale separation.
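For contrast with the approximate method discussed above, the exact (and potentially slow) baseline is Gillespie's direct method. A minimal sketch for the first example, E + S <-> ES -> E + P; the rate constants and copy numbers are illustrative, not those of the paper.

```python
import math
import random

def gillespie_enzyme(e0=100, s0=100, k1=0.01, k2=1.0, k3=0.1,
                     t_end=50.0, seed=2):
    """Exact stochastic simulation (Gillespie's direct method) of
    E + S <-> ES -> E + P. All rate constants are illustrative."""
    rng = random.Random(seed)
    e, s, es, p = e0, s0, 0, 0
    t = 0.0
    while t < t_end:
        a1 = k1 * e * s      # binding propensity
        a2 = k2 * es         # unbinding propensity
        a3 = k3 * es         # catalysis propensity
        a0 = a1 + a2 + a3
        if a0 == 0.0:
            break            # nothing left to react
        t += -math.log(rng.random()) / a0   # exponential waiting time
        r = rng.random() * a0               # pick one reaction channel
        if r < a1:
            e, s, es = e - 1, s - 1, es + 1
        elif r < a1 + a2:
            e, s, es = e + 1, s + 1, es - 1
        else:
            e, es, p = e + 1, es - 1, p + 1
    return e, s, es, p

e, s, es, p = gillespie_enzyme()
```

When k2 and k1*e*s are much larger than k3*es, almost all simulated events are fast binding/unbinding, which is exactly the inefficiency the total quasi-steady-state approximation removes.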

  19. From AWE-GEN to AWE-GEN-2d: a high spatial and temporal resolution weather generator

    NASA Astrophysics Data System (ADS)

    Peleg, Nadav; Fatichi, Simone; Paschalis, Athanasios; Molnar, Peter; Burlando, Paolo

    2016-04-01

    A new weather generator, AWE-GEN-2d (Advanced WEather GENerator for 2-Dimensional grid), is developed following the philosophy of combining physical and stochastic approaches to simulate meteorological variables at high spatial and temporal resolution (e.g. 2 km x 2 km and 5 min for precipitation and cloud cover, and 100 m x 100 m and 1 h for the other variables: temperature, solar radiation, vapor pressure, atmospheric pressure and near-surface wind). The model is suitable for investigating the impacts of climate variability and of the temporal and spatial resolution of forcing in hydrological, ecological, agricultural and geomorphological studies. Using appropriate parameterization, the model can also be used in the context of climate change. Here we present the technical structure of AWE-GEN-2d, which is a substantial evolution of four preceding models: (i) the hourly point-scale Advanced WEather GENerator (AWE-GEN) presented by Fatichi et al. (2011, Adv. Water Resour.); (ii) the Space-Time Realizations of Areal Precipitation (STREAP) model introduced by Paschalis et al. (2013, Water Resour. Res.); (iii) the High-Resolution Synoptically conditioned Weather Generator developed by Peleg and Morin (2014, Water Resour. Res.); and (iv) the Wind-field Interpolation by Non Divergent Schemes presented by Burlando et al. (2007, Boundary-Layer Meteorol.). AWE-GEN-2d is relatively parsimonious in terms of computational demand and allows generating many stochastic realizations of current and projected climates in an efficient way. An example of model application and testing is presented with reference to a case study in the Wallis region, an area of complex orographic terrain in the Swiss Alps.

  20. 3D replicon distributions arise from stochastic initiation and domino-like DNA replication progression.

    PubMed

    Löb, D; Lengert, N; Chagin, V O; Reinhart, M; Casas-Delucchi, C S; Cardoso, M C; Drossel, B

    2016-04-07

    DNA replication dynamics in cells from higher eukaryotes follows very complex but highly efficient mechanisms. However, the principles behind initiation of potential replication origins and emergence of typical patterns of nuclear replication sites remain unclear. Here, we propose a comprehensive model of DNA replication in human cells that is based on stochastic, proximity-induced replication initiation. Critical model features are: spontaneous stochastic firing of individual origins in euchromatin and facultative heterochromatin, inhibition of firing at distances below the size of chromatin loops and a domino-like effect by which replication forks induce firing of nearby origins. The model reproduces the empirical temporal and chromatin-related properties of DNA replication in human cells. We advance the one-dimensional DNA replication model to a spatial model by taking into account chromatin folding in the nucleus, and we are able to reproduce the spatial and temporal characteristics of the replication foci distribution throughout S-phase.
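The firing rules described above can be caricatured in one dimension: origins fire spontaneously with a small probability, but with a much larger one once a replication fork has come close (the domino effect), and forks then advance outward. The 1D geometry and every parameter below are illustrative simplifications, not the paper's 3D chromatin-folding model.

```python
import random

def replicate(n_sites=1000, n_origins=50, p_spont=0.001,
              p_induced=0.05, induce_dist=20, seed=3):
    """1D sketch of stochastic, proximity-induced replication.

    Origins fire spontaneously with probability p_spont per step, or
    p_induced when a fork is within induce_dist sites (the 'domino'
    effect). Forks advance one site per step in both directions and
    die when they meet replicated DNA. Parameters are illustrative."""
    rng = random.Random(seed)
    replicated = [False] * n_sites
    origins = sorted(rng.sample(range(n_sites), n_origins))
    forks = []   # list of (position, direction) pairs
    steps = 0
    while not all(replicated):
        steps += 1
        # origin firing, spontaneous or fork-induced
        for o in origins:
            if replicated[o]:
                continue
            near_fork = any(abs(o - f) <= induce_dist for f, _ in forks)
            if rng.random() < (p_induced if near_fork else p_spont):
                replicated[o] = True
                forks += [(o, -1), (o, +1)]
        # fork progression; forks terminate at replicated DNA or ends
        new_forks = []
        for pos, d in forks:
            nxt = pos + d
            if 0 <= nxt < n_sites and not replicated[nxt]:
                replicated[nxt] = True
                new_forks.append((nxt, d))
        forks = new_forks
    return steps

steps = replicate()   # steps until the whole 'genome' is replicated
```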

  1. The Sharma-Parthasarathy stochastic two-body problem

    NASA Astrophysics Data System (ADS)

    Cresson, J.; Pierret, F.; Puig, B.

    2015-03-01

    We study the Sharma-Parthasarathy stochastic two-body problem introduced by Sharma and Parthasarathy in ["Dynamics of a stochastically perturbed two-body problem," Proc. R. Soc. A 463, 979-1003 (2007)]. In particular, we focus on the preservation in the stochastic case of some fundamental features of the classical two-body problem, such as the Hamiltonian structure and first integrals. Numerical simulations are performed which illustrate the dynamical behaviour of the osculating elements, such as the semi-major axis, the eccentricity, and the pericenter. We also derive a stochastic version of Gauss's equations in the planar case.

  2. Stochastic downscaling of numerically simulated spatial rain and cloud fields using a transient multifractal approach

    NASA Astrophysics Data System (ADS)

    Nogueira, M.; Barros, A. P.; Miranda, P. M.

    2012-04-01

    Atmospheric fields can be extremely variable over wide ranges of spatial scales, with a scale ratio of 10^9-10^10 between the largest (planetary) and smallest (viscous dissipation) scales. Furthermore, atmospheric fields with strong variability over wide ranges of scale most likely should not be artificially split into large and small scales, as in reality there is no scale separation between resolved and unresolved motions. Usually the effects of the unresolved scales are modeled by a deterministic bulk formula representing an ensemble of incoherent subgrid processes acting on the resolved flow. This is a pragmatic approach to the problem, not a complete solution to it. These models are expected to underrepresent the small-scale spatial variability of both dynamical and scalar fields, due to implicit and explicit numerical diffusion as well as physically based subgrid-scale turbulent mixing, resulting in smoother and less intermittent fields than observations. Thus, a fundamental change in the way we formulate our models is required. Stochastic approaches equipped with a possible realization of subgrid processes, potentially coupled to the resolved scales over the range of significant scale interactions, provide one alternative to address the problem. Stochastic multifractal models, based on the cascade phenomenology of the atmosphere and its governing equations in particular, are the focus of this research. Previous results have shown that rain and cloud fields resulting from both idealized and realistic numerical simulations display multifractal behavior in the resolved scales. This result is observed even in the absence of scaling in the initial conditions or terrain forcing, suggesting that multiscaling is a general property of the nonlinear solutions of the Navier-Stokes equations governing atmospheric dynamics.
Our results also show that the corresponding multiscaling parameters for rain and cloud fields exhibit complex nonlinear behavior, depending on large-scale parameters such as terrain forcing and the mean atmospheric conditions at each location, particularly mean wind speed and moist stability. A particularly robust behavior is the transition of the multiscaling parameters between stable and unstable cases, which has a clear physical correspondence to the transition from a stratiform to an organized (banded) convective regime. Thus, multifractal diagnostics of moist processes are fundamentally transient and should provide a physically robust basis for the downscaling and subgrid-scale parameterization of moist processes. Here, we investigate the possibility of using a simplified, computationally efficient multifractal downscaling methodology based on turbulent cascades to produce statistically consistent fields at scales finer than those resolved by the model. Specifically, we are interested in producing rainfall and cloud fields at the spatial resolutions necessary for effective flash-flood and earth-flow forecasting. The results are examined by comparing the downscaled fields against observations, and tendency error budgets are used to diagnose the evolution of transient errors in the numerical model prediction which can be attributed to aliasing.
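A minimal version of the cascade idea is a discrete multiplicative cascade: each coarse cell is split into 2x2 sub-cells whose values are the parent value times random lognormal weights, normalized so the parent-cell mean is preserved. This is a generic sketch, not the authors' methodology; `sigma` is an illustrative intermittency parameter.

```python
import random

def cascade_downscale(coarse, levels=3, sigma=0.4, seed=4):
    """Downscale a 2D field by a multiplicative random cascade.

    Each refinement splits every cell into 2x2 sub-cells multiplied
    by i.i.d. lognormal weights, normalized to preserve the parent
    cell's mean. All settings are illustrative."""
    rng = random.Random(seed)
    field = [row[:] for row in coarse]
    for _ in range(levels):
        n = len(field)
        fine = [[0.0] * (2 * n) for _ in range(2 * n)]
        for i in range(n):
            for j in range(n):
                w = [rng.lognormvariate(0.0, sigma) for _ in range(4)]
                mean_w = sum(w) / 4.0
                vals = [field[i][j] * wk / mean_w for wk in w]
                fine[2*i][2*j],     fine[2*i][2*j + 1] = vals[0], vals[1]
                fine[2*i + 1][2*j], fine[2*i + 1][2*j + 1] = vals[2], vals[3]
        field = fine
    return field

coarse = [[1.0, 2.0], [3.0, 4.0]]      # hypothetical coarse rain field
fine = cascade_downscale(coarse)       # 3 levels: 2x2 -> 16x16
```

Normalizing the four weights by their mean makes the downscaling exactly mass-conserving at every level, so the large-scale field supplied by the numerical model is left untouched.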

  3. Modelling the cancer growth process by Stochastic Differential Equations with the effect of Chondroitin Sulfate (CS) as anticancer therapeutics

    NASA Astrophysics Data System (ADS)

    Syahidatul Ayuni Mazlan, Mazma; Rosli, Norhayati; Jauhari Arief Ichwan, Solachuddin; Suhaity Azmi, Nina

    2017-09-01

    A stochastic model is introduced to describe the growth of cancer affected by the anti-cancer therapeutic Chondroitin Sulfate (CS). The parameter values of the stochastic model are estimated via the maximum likelihood method. The Euler-Maruyama numerical method is employed to solve the model. The efficiency of the stochastic model is measured by comparing the simulated results with the experimental data.
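The Euler-Maruyama scheme mentioned above can be sketched on a generic stochastic logistic growth model, dX = rX(1 - X/K)dt + sigma*X dW. Both the drift/diffusion form and all parameter values below are illustrative stand-ins, not the study's fitted CS model.

```python
import math
import random

def euler_maruyama(x0=10.0, r=0.3, K=1000.0, sigma=0.05,
                   t_end=30.0, n_steps=3000, seed=5):
    """Euler-Maruyama path for dX = r*X*(1 - X/K) dt + sigma*X dW,
    a generic stand-in for a tumour-growth SDE. At each step the
    Wiener increment dW is drawn as N(0, dt)."""
    rng = random.Random(seed)
    dt = t_end / n_steps
    sqrt_dt = math.sqrt(dt)
    x, path = x0, [x0]
    for _ in range(n_steps):
        dW = rng.gauss(0.0, sqrt_dt)            # Wiener increment
        x = x + r * x * (1.0 - x / K) * dt + sigma * x * dW
        path.append(x)
    return path

path = euler_maruyama()   # one realization of the growth trajectory
```

Averaging many such realizations (different seeds) recovers the mean growth curve that would be compared against experimental data.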

  4. MOSES: A Matlab-based open-source stochastic epidemic simulator.

    PubMed

    Varol, Huseyin Atakan

    2016-08-01

    This paper presents an open-source stochastic epidemic simulator. The simulator, based on a discrete-time Markov chain, is implemented in Matlab. Capable of simulating the SEQIJR (susceptible, exposed, quarantined, infected, isolated and recovered) model, it can be reduced to simpler models by setting some of the parameters (transition probabilities) to zero, and can similarly be extended to more complicated models by editing the source code. It is designed to be used for testing different control algorithms to contain epidemics. The simulator is also designed to be compatible with a network-based epidemic simulator and can be used in the network-based scheme for the simulation of a node. Simulations show the capability of reproducing different epidemic model behaviors successfully in a computationally efficient manner.
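A reduced instance of such a discrete-time Markov chain simulator fits in a few lines: with the quarantine and isolation probabilities set to zero, SEQIJR collapses to a chain-binomial SEIR model. The sketch below is in Python rather than Matlab, and all parameter values are illustrative.

```python
import random

def dtmc_seir(n=1000, i0=5, beta=0.3, sigma=0.2, gamma=0.1,
              steps=200, seed=6):
    """Chain-binomial discrete-time Markov chain for SEIR, i.e. the
    SEQIJR structure with quarantine and isolation probabilities set
    to zero. Parameters are illustrative."""
    rng = random.Random(seed)

    def binom(n_trials, p):
        # number of successes in n_trials Bernoulli(p) draws
        return sum(1 for _ in range(n_trials) if rng.random() < p)

    s, e, i, r = n - i0, 0, i0, 0
    history = [(s, e, i, r)]
    for _ in range(steps):
        p_inf = 1.0 - (1.0 - beta / n) ** i   # per-susceptible infection prob.
        new_e = binom(s, p_inf)               # S -> E
        new_i = binom(e, sigma)               # E -> I
        new_r = binom(i, gamma)               # I -> R
        s, e, i, r = s - new_e, e + new_e - new_i, i + new_i - new_r, r + new_r
        history.append((s, e, i, r))
    return history

history = dtmc_seir()
```

Reinstating the zeroed transition probabilities (E to Q, I to J) would restore the full six-compartment chain without changing this update structure.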

  5. Boundary effects on population dynamics in stochastic lattice Lotka-Volterra models

    NASA Astrophysics Data System (ADS)

    Heiba, Bassel; Chen, Sheng; Täuber, Uwe C.

    2018-02-01

    We investigate spatially inhomogeneous versions of the stochastic Lotka-Volterra model for predator-prey competition and coexistence by means of Monte Carlo simulations on a two-dimensional lattice with periodic boundary conditions. To study boundary effects for this paradigmatic population dynamics system, we employ a simulation domain split into two patches: Upon setting the predation rates at two distinct values, one half of the system resides in an absorbing state where only the prey survives, while the other half attains a stable coexistence state wherein both species remain active. At the domain boundary, we observe a marked enhancement of the predator population density. The predator correlation length displays a minimum at the boundary, before reaching its asymptotic constant value deep in the active region. The frequency of the population oscillations appears only very weakly affected by the existence of two distinct domains, in contrast to their attenuation rate, which assumes its largest value there. We also observe that boundary effects become less prominent as the system is successively divided into subdomains in a checkerboard pattern, with two different reaction rates assigned to neighboring patches. When the domain size becomes reduced to the scale of the correlation length, the mean population densities attain values that are very similar to those in a disordered system with randomly assigned reaction rates drawn from a bimodal distribution.
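The two-patch lattice simulation described above can be caricatured with a minimal Monte Carlo sketch: site states are empty, prey, or predator; the left half of the domain gets a low predation rate and the right half a high one. The update rules and all rates here are illustrative, not the authors' exact scheme.

```python
import random

def lattice_lv(L=32, sweeps=100, sigma=0.4, mu=0.2,
               lam=(0.2, 0.8), seed=8):
    """Two-patch stochastic lattice Lotka-Volterra sketch.
    States: 0 empty, 1 prey, 2 predator. lam[0]/lam[1] are the
    predation rates in the left/right halves. Rates illustrative."""
    rng = random.Random(seed)
    grid = [[rng.choice([0, 1, 2]) for _ in range(L)] for _ in range(L)]

    def neighbor(i, j):
        di, dj = rng.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
        return (i + di) % L, (j + dj) % L    # periodic boundaries

    for _ in range(sweeps):
        for _ in range(L * L):               # one Monte Carlo sweep
            i, j = rng.randrange(L), rng.randrange(L)
            ni, nj = neighbor(i, j)
            s, t = grid[i][j], grid[ni][nj]
            lam_here = lam[0] if j < L // 2 else lam[1]
            if s == 1 and t == 0 and rng.random() < sigma:
                grid[ni][nj] = 1             # prey reproduction
            elif s == 2 and t == 1 and rng.random() < lam_here:
                grid[ni][nj] = 2             # predation + predator birth
            elif s == 2 and rng.random() < mu:
                grid[i][j] = 0               # predator death
    prey = sum(row.count(1) for row in grid)
    predators = sum(row.count(2) for row in grid)
    return prey, predators

prey, predators = lattice_lv()
```

Recording species densities column by column in such a sketch is one way to probe the boundary enhancement of the predator density reported above.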

  6. Modeling the effects of trophy selection and environmental disturbance on a simulated population of African lions.

    PubMed

    Whitman, Karyl L; Starfield, Anthony M; Quadling, Henley; Packer, Craig

    2007-06-01

    Tanzania is a premier destination for trophy hunting of African lions (Panthera leo) and is home to the most extensive long-term study of unhunted lions. Thus, it provides a unique opportunity to apply data from a long-term field study to a conservation dilemma: How can a trophy-hunted species whose reproductive success is closely tied to social stability be harvested sustainably? We used an individually based, spatially explicit, stochastic model, parameterized with nearly 40 years of behavioral and demographic data on lions in the Serengeti, to examine the separate effects of trophy selection and environmental disturbance on the viability of a simulated lion population in response to annual harvesting. Female population size was sensitive to the harvesting of young males (> or = 3 years), whereas hunting represented a relatively trivial threat to population viability when the harvest was restricted to mature males (> or = 6 years). Overall model performance was robust to environmental disturbance and to errors in age assessment based on nose coloration as an index used to age potential trophies. Introducing an environmental disturbance did not eliminate the capacity to maintain a viable breeding population when harvesting only older males, and initially depleted populations recovered within 15-25 years after the disturbance to levels comparable to hunted populations that did not experience a catastrophic event. These results are consistent with empirical observations of lion resilience to environmental stochasticity.

  7. Stochastic reaction-diffusion algorithms for macromolecular crowding

    NASA Astrophysics Data System (ADS)

    Sturrock, Marc

    2016-06-01

    Compartment-based (lattice-based) reaction-diffusion algorithms are often used for studying complex stochastic spatio-temporal processes inside cells. In this paper the influence of macromolecular crowding on stochastic reaction-diffusion simulations is investigated. Reaction-diffusion processes are considered on two different kinds of compartmental lattice, a cubic lattice and a hexagonal close packed lattice, and solved using two different algorithms, the stochastic simulation algorithm and the spatiocyte algorithm (Arjunan and Tomita 2010 Syst. Synth. Biol. 4, 35-53). Obstacles (modelling macromolecular crowding) are shown to have substantial effects on the mean squared displacement and average number of molecules in the domain but the nature of these effects is dependent on the choice of lattice, with the cubic lattice being more susceptible to the effects of the obstacles. Finally, improvements for both algorithms are presented.

  8. A noisy chaotic neural network for solving combinatorial optimization problems: stochastic chaotic simulated annealing.

    PubMed

    Wang, Lipo; Li, Sa; Tian, Fuyu; Fu, Xiuju

    2004-10-01

    Recently Chen and Aihara have demonstrated both experimentally and mathematically that their chaotic simulated annealing (CSA) has better search ability for solving combinatorial optimization problems compared to both the Hopfield-Tank approach and stochastic simulated annealing (SSA). However, CSA may not find a globally optimal solution no matter how slowly annealing is carried out, because the chaotic dynamics are completely deterministic. In contrast, SSA tends to settle down to a global optimum if the temperature is reduced sufficiently slowly. Here we combine the best features of both SSA and CSA, thereby proposing a new approach for solving optimization problems, i.e., stochastic chaotic simulated annealing, by using a noisy chaotic neural network. We show the effectiveness of this new approach with two difficult combinatorial optimization problems, i.e., a traveling salesman problem and a channel assignment problem for cellular mobile communications.
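The SSA baseline referred to above is ordinary stochastic simulated annealing. A minimal sketch on a small TSP instance, using 2-opt reversal moves and a geometric cooling schedule (all schedule settings illustrative):

```python
import math
import random

def sa_tsp(cities, t0=10.0, cooling=0.999, n_iter=20000, seed=7):
    """Plain stochastic simulated annealing (SSA) for the TSP with
    2-opt reversal moves. Schedule settings are illustrative."""
    rng = random.Random(seed)
    n = len(cities)

    def tour_length(tour):
        return sum(math.dist(cities[tour[k]], cities[tour[(k + 1) % n]])
                   for k in range(n))

    tour = list(range(n))
    rng.shuffle(tour)                        # random starting tour
    cur_len = tour_length(tour)
    best, best_len = tour[:], cur_len
    temp = t0
    for _ in range(n_iter):
        a, b = sorted(rng.sample(range(n), 2))
        cand = tour[:a] + tour[a:b + 1][::-1] + tour[b + 1:]   # 2-opt move
        cand_len = tour_length(cand)
        # accept improvements always; accept uphill moves with
        # Boltzmann probability exp(-delta / T)
        if cand_len < cur_len or rng.random() < math.exp((cur_len - cand_len) / temp):
            tour, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = tour[:], cur_len
        temp *= cooling
    return best, best_len

# 12 cities on a unit circle: the optimal tour follows the circle
cities = [(math.cos(2 * math.pi * k / 12), math.sin(2 * math.pi * k / 12))
          for k in range(12)]
best, best_len = sa_tsp(cities)
```

The stochastic uphill acceptance is what lets SSA escape local minima as the temperature falls; CSA replaces this noise with deterministic chaotic dynamics, and the hybrid above combines both.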

  9. Adiabatic coarse-graining and simulations of stochastic biochemical networks

    PubMed Central

    Sinitsyn, N. A.; Hengartner, Nicolas; Nemenman, Ilya

    2009-01-01

    We propose a universal approach for analysis and fast simulations of stiff stochastic biochemical networks, which rests on elimination of fast chemical species without a loss of information about mesoscopic, non-Poissonian fluctuations of the slow ones. Our approach is similar to the Born–Oppenheimer approximation in quantum mechanics and follows from the stochastic path integral representation of the cumulant generating function of reaction events. In applications with a small number of chemical reactions, it produces analytical expressions for cumulants of chemical fluxes between the slow variables. This allows for a low-dimensional, interpretable representation and can be used for high-accuracy, low-complexity coarse-grained numerical simulations. As an example, we derive the coarse-grained description for a chain of biochemical reactions and show that the coarse-grained and the microscopic simulations agree, but the former is 3 orders of magnitude faster. PMID:19525397

  10. Coupled stochastic soil moisture simulation-optimization model of deficit irrigation

    NASA Astrophysics Data System (ADS)

    Alizadeh, Hosein; Mousavi, S. Jamshid

    2013-07-01

    This study presents an explicit stochastic optimization-simulation model of short-term deficit irrigation management for large-scale irrigation districts. The model, a nonlinear nonconvex program with an economic objective function, is built on an agrohydrological simulation component. The simulation component integrates (1) an explicit stochastic model of the soil moisture dynamics of the crop-root zone, considering the interaction of stochastic rainfall and irrigation with shallow water table effects, (2) a conceptual root-zone salt balance model, and (3) the FAO crop yield model. A Particle Swarm Optimization algorithm, linked to the simulation component, solves the resulting nonconvex program with significantly better computational performance than a Monte Carlo-based implicit stochastic optimization model. The model was tested first by applying it to single-crop irrigation problems, through which the effects of the severity of water deficit on the objective function (net benefit), root-zone water balance, and irrigation water needs were assessed. The model was then applied in the Dasht-e-Abbas and Ein-khosh Fakkeh Irrigation Districts (DAID and EFID) of the Karkheh Basin in southwestern Iran. While the maximum net benefit was obtained for a stress-avoidance (SA) irrigation policy, the highest water profitability resulted when only about 60% of the water used in the SA policy was applied. The DAID, with 33% of the total cultivated area and 37% of the total applied water, produced only 14% of the total net benefit due to low-valued crops and adverse soil and shallow water table conditions.
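The optimizer linked to the simulation component is Particle Swarm Optimization. A generic minimal sketch follows, with standard inertia/acceleration coefficients at illustrative values and a toy sphere objective standing in for the (far more expensive) irrigation simulation:

```python
import random

def pso_minimize(f, dim=2, n_particles=20, n_iter=200,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=10):
    """Minimal Particle Swarm Optimization sketch. Coefficients
    (inertia w, cognitive c1, social c2) are illustrative."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # personal bests
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda k: pbest_val[k])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
    for _ in range(n_iter):
        for k in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[k][d] = (w * vel[k][d]
                             + c1 * r1 * (pbest[k][d] - pos[k][d])
                             + c2 * r2 * (gbest[d] - pos[k][d]))
                pos[k][d] = min(hi, max(lo, pos[k][d] + vel[k][d]))
            val = f(pos[k])
            if val < pbest_val[k]:
                pbest[k], pbest_val[k] = pos[k][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[k][:], val
    return gbest, gbest_val

# usage: minimize the sphere function as a stand-in objective
best, best_val = pso_minimize(lambda x: sum(xi * xi for xi in x))
```

In the study's setting, `f` would evaluate the negative net benefit returned by the agrohydrological simulation for a candidate irrigation schedule; PSO only needs such black-box evaluations, which is why it suits nonconvex simulation-optimization.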

  11. Stochastic modeling of a lava-flow aquifer system

    USGS Publications Warehouse

    Cronkite-Ratcliff, Collin; Phelps, Geoffrey A.

    2014-01-01

    This report describes preliminary three-dimensional geostatistical modeling of a lava-flow aquifer system using a multiple-point geostatistical model. The purpose of this study is to provide a proof-of-concept for this modeling approach. An example of the method is demonstrated using a subset of borehole geologic data and aquifer test data from a portion of the Calico Hills Formation, a lava-flow aquifer system that partially underlies Pahute Mesa, Nevada. Groundwater movement in this aquifer system is assumed to be controlled by the spatial distribution of two geologic units—rhyolite lava flows and zeolitized tuffs. The configuration of subsurface lava flows and tuffs is largely unknown because of limited data. The spatial configuration of the lava flows and tuffs is modeled by using a multiple-point geostatistical simulation algorithm that generates a large number of alternative realizations, each honoring the available geologic data and drawn from a geologic conceptual model of the lava-flow aquifer system as represented by a training image. In order to demonstrate how results from the geostatistical model could be analyzed in terms of available hydrologic data, a numerical simulation of part of an aquifer test was applied to the realizations of the geostatistical model.

  12. Examining the influence of heterogeneous porosity fields on conservative solute transport

    USGS Publications Warehouse

    Hu, B.X.; Meerschaert, M.M.; Barrash, W.; Hyndman, D.W.; He, C.; Li, X.; Guo, Laodong

    2009-01-01

    It is widely recognized that groundwater flow and solute transport in natural media are largely controlled by heterogeneities. In the last three decades, many studies have examined the effects of heterogeneous hydraulic conductivity fields on flow and transport processes, but there has been much less attention to the influence of heterogeneous porosity fields. In this study, we use porosity and particle size measurements from boreholes at the Boise Hydrogeophysical Research Site (BHRS) to evaluate the importance of characterizing the spatial structure of porosity and grain size data for solute transport modeling. Then we develop synthetic hydraulic conductivity fields based on relatively simple measurements of porosity from borehole logs and grain size distributions from core samples to examine and compare the characteristics of tracer transport through these fields with and without inclusion of porosity heterogeneity. In particular, we develop horizontal 2D realizations based on data from one of the less heterogeneous units at the BHRS to examine effects where spatial variations in hydraulic parameters are not large. The results indicate that the distributions of porosity and the derived hydraulic conductivity in the study unit resemble fractal normal and lognormal fields respectively. We numerically simulate solute transport in stochastic fields and find that spatial variations in porosity have significant effects on the spread of an injected tracer plume including a significant delay in simulated tracer concentration histories.

  13. Analysis of a novel stochastic SIRS epidemic model with two different saturated incidence rates

    NASA Astrophysics Data System (ADS)

    Chang, Zhengbo; Meng, Xinzhu; Lu, Xiao

    2017-04-01

    This paper presents a stochastic SIRS epidemic model with two different nonlinear incidence rates and a double-epidemic asymmetrical hypothesis, and we develop a mathematical method to obtain the threshold of the stochastic epidemic model. We first investigate the boundedness and extinction of the stochastic system. Furthermore, we use Itô's formula, the comparison theorem, and some new inequality techniques for stochastic differential systems to discuss the persistence in mean of the two diseases in three cases. The results indicate that stochastic fluctuations can suppress the disease outbreak. Finally, numerical simulations for different noise disturbance coefficients are carried out to illustrate the obtained theoretical results.

  14. Using stochastic models to incorporate spatial and temporal variability [Exercise 14

    Treesearch

    Carolyn Hull Sieg; Rudy M. King; Fred Van Dyke

    2003-01-01

    To this point, our analysis of population processes and viability in the western prairie fringed orchid has used only deterministic models. In this exercise, we conduct a similar analysis, using a stochastic model instead. This distinction is of great importance to population biology in general and to conservation biology in particular. In deterministic models,...

  15. Simulations of transient membrane behavior in cells subjected to a high-intensity ultrashort electric pulse.

    PubMed

    Hu, Q; Viswanadham, S; Joshi, R P; Schoenbach, K H; Beebe, S J; Blackmore, P F

    2005-03-01

    A molecular dynamics (MD) scheme is combined with a distributed circuit model for a self-consistent analysis of the transient membrane response for cells subjected to an ultrashort (nanosecond) high-intensity (approximately 0.01-V/nm spatially averaged field) voltage pulse. The dynamical, stochastic, many-body aspects are treated at the molecular level by resorting to a coarse-grained representation of the membrane lipid molecules. Coupling the Smoluchowski equation to the distributed electrical model for current flow provides the time-dependent transmembrane fields for the MD simulations. A good match between the simulation results and available experimental data is obtained. Predictions include pore formation times of about 5-6 ns. It is also shown that the pore formation process would tend to begin from the anodic side of an electrically stressed membrane. Furthermore, the present simulations demonstrate that ions could facilitate pore formation. This could be of practical importance and have direct relevance to the recent observations of calcium release from the endoplasmic reticulum in cells subjected to such ultrashort, high-intensity pulses.

  16. A parallel reaction-transport model applied to cement hydration and microstructure development

    NASA Astrophysics Data System (ADS)

    Bullard, Jeffrey W.; Enjolras, Edith; George, William L.; Satterfield, Steven G.; Terrill, Judith E.

    2010-03-01

    A recently described stochastic reaction-transport model on three-dimensional lattices is parallelized and is used to simulate the time-dependent structural and chemical evolution in multicomponent reactive systems. The model, called HydratiCA, uses probabilistic rules to simulate the kinetics of diffusion, homogeneous reactions and heterogeneous phenomena such as solid nucleation, growth and dissolution in complex three-dimensional systems. The algorithms require information only from each lattice site and its immediate neighbors, and this localization enables the parallelized model to exhibit near-linear scaling up to several hundred processors. Although applicable to a wide range of material systems, including sedimentary rock beds, reacting colloids and biochemical systems, validation is performed here on two minerals that are commonly found in Portland cement paste, calcium hydroxide and ettringite, by comparing their simulated dissolution or precipitation rates far from equilibrium to standard rate equations, and also by comparing simulated equilibrium states to thermodynamic calculations, as a function of temperature and pH. Finally, we demonstrate how HydratiCA can be used to investigate microstructure characteristics, such as spatial correlations between different condensed phases, in more complex microstructures.

  17. Particle Trapping Mechanisms Are Different in Spatially Ordered and Disordered Interacting Gels.

    PubMed

    Hansing, Johann; Netz, Roland R

    2018-06-05

    Using stochastic simulations, we study the influence of spatial disorder on the diffusion of a single particle through a gel that consists of rigid, straight fibers. The interaction between the particle and the gel fibers consists of an invariant short-range repulsion, the steric part, and an interaction part that can be attractive or repulsive and of varying range. The effect that spatial disorder of the gel structure has on the particle diffusivity depends crucially on the presence of nonsteric interactions. For attractive interactions, disorder slows down diffusion, because in disordered gels, the particle becomes strongly trapped in regions of locally increased fiber density. For repulsive interactions, the diffusivity is minimal for intermediate disorder strength, because highly disordered lattices exhibit abundant passageways of locally low fiber density. The comparison with experimental data on protein and fluorophore diffusion through various hydrogels is favorable. Our findings shed light on particle-diffusion mechanisms in biogels and thus on biological barrier properties, which can be helpful for the optimal design of synthetic diffusors as well as synthetic mucus constructs. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  18. A PROBABILISTIC EXPOSURE ASSESSMENT FOR CHILDREN WHO CONTACT CCA-TREATED PLAYSETS AND DECKS USING THE STOCHASTIC HUMAN EXPOSURE AND DOSE SIMULATION (SHEDS) MODEL FOR THE WOOD PRESERVATIVE EXPOSURE SCENARIO

    EPA Science Inventory

    The U.S. Environmental Protection Agency has conducted a probabilistic exposure and dose assessment on the arsenic (As) and chromium (Cr) components of Chromated Copper Arsenate (CCA) using the Stochastic Human Exposure and Dose Simulation model for wood preservatives (SHEDS-Wood...

  19. Stochastic and deterministic multiscale models for systems biology: an auxin-transport case study.

    PubMed

    Twycross, Jamie; Band, Leah R; Bennett, Malcolm J; King, John R; Krasnogor, Natalio

    2010-03-26

    Stochastic and asymptotic methods are powerful tools in developing multiscale systems biology models; however, little has been done in this context to compare the efficacy of these methods. The majority of current systems biology modelling research, including that of auxin transport, uses numerical simulations to study the behaviour of large systems of deterministic ordinary differential equations, with little consideration of alternative modelling frameworks. In this case study, we solve an auxin-transport model using analytical methods, deterministic numerical simulations and stochastic numerical simulations. Although the three approaches in general predict the same behaviour, the approaches provide different information that we use to gain distinct insights into the modelled biological system. We show in particular that the analytical approach readily provides straightforward mathematical expressions for the concentrations and transport speeds, while the stochastic simulations naturally provide information on the variability of the system. Our study provides a constructive comparison which highlights the advantages and disadvantages of each of the considered modelling approaches. This will prove helpful to researchers when weighing up which modelling approach to select. In addition, the paper goes some way to bridging the gap between these approaches, which in the future we hope will lead to integrative hybrid models.

  20. Stochastic simulation by image quilting of process-based geological models

    NASA Astrophysics Data System (ADS)

    Hoffimann, Júlio; Scheidt, Céline; Barfod, Adrian; Caers, Jef

    2017-09-01

    Process-based modeling offers a way to represent realistic geological heterogeneity in subsurface models. The main limitation lies in conditioning such models to data. Multiple-point geostatistics can use these process-based models as training images and address the data conditioning problem. In this work, we further develop image quilting as a method for 3D stochastic simulation capable of mimicking the realism of process-based geological models with minimal modeling effort (i.e. parameter tuning) and at the same time conditioning them to a variety of data. In particular, we develop a new probabilistic data aggregation method for image quilting that bypasses traditional ad hoc weighting of auxiliary variables. In addition, we propose a novel criterion for template design in image quilting that generalizes the entropy plot for continuous training images. The criterion is based on the new concept of voxel reuse, a stochastic and quilting-aware function of the training image. We compare our proposed method with other established simulation methods on a set of process-based training images of varying complexity, including a real-case example of stochastic simulation of the buried-valley groundwater system in Denmark.

  1. Simulation-based planning for theater air warfare

    NASA Astrophysics Data System (ADS)

    Popken, Douglas A.; Cox, Louis A., Jr.

    2004-08-01

    Planning for Theatre Air Warfare can be represented as a hierarchy of decisions. At the top level, surviving airframes must be assigned to roles (e.g., Air Defense, Counter Air, Close Air Support, and AAF Suppression) in each time period in response to changing enemy air defense capabilities, remaining targets, and roles of opposing aircraft. At the middle level, aircraft are allocated to specific targets to support their assigned roles. At the lowest level, routing and engagement decisions are made for individual missions. The decisions at each level form a set of time-sequenced Courses of Action taken by opposing forces. This paper introduces a set of simulation-based optimization heuristics operating within this planning hierarchy to optimize allocations of aircraft. The algorithms estimate distributions for stochastic outcomes of the pairs of Red/Blue decisions. Rather than using traditional stochastic dynamic programming to determine optimal strategies, we use an innovative combination of heuristics, simulation-optimization, and mathematical programming. Blue decisions are guided by a stochastic hill-climbing search algorithm while Red decisions are found by optimizing over a continuous representation of the decision space. Stochastic outcomes are then provided by fast, Lanchester-type attrition simulations. This paper summarizes preliminary results from top and middle level models.
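
    The top-level search described above can be sketched as a stochastic hill-climb over a noisy simulated payoff. The payoff function below is a hypothetical stand-in for the Lanchester-type attrition simulations, not the authors' model:

```python
import random

def simulate_payoff(alloc, noise=0.05, rng=random):
    """Toy stand-in for a stochastic attrition simulation: payoff of splitting
    the fleet between two roles, plus simulation noise. (Hypothetical
    objective; its true payoff peaks at an even split.)"""
    defense, strike = alloc
    return 4.0 * defense * strike + rng.gauss(0.0, noise)

def stochastic_hill_climb(steps=2000, step_size=0.05, reps=10, seed=2):
    """Accept a perturbed allocation only when its payoff, averaged over
    several simulation replications, beats the best average seen so far."""
    rng = random.Random(seed)
    alloc = (0.1, 0.9)
    best = sum(simulate_payoff(alloc, rng=rng) for _ in range(reps)) / reps
    for _ in range(steps):
        d = min(max(alloc[0] + rng.uniform(-step_size, step_size), 0.0), 1.0)
        cand = (d, 1.0 - d)  # fixed fleet size: role fractions sum to one
        score = sum(simulate_payoff(cand, rng=rng) for _ in range(reps)) / reps
        if score > best:
            alloc, best = cand, score
    return alloc

alloc = stochastic_hill_climb()  # ends near the (0.5, 0.5) optimum
```

    Averaging over replications before accepting a move is the standard way to keep simulation noise from driving the search, at the cost of more simulation runs per step.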

  2. Assessing flow paths in a karst aquifer based on multiple dye tracing tests using stochastic simulation and the MODFLOW-CFP code

    NASA Astrophysics Data System (ADS)

    Assari, Amin; Mohammadi, Zargham

    2017-09-01

    Karst systems show high spatial variability of hydraulic parameters over small distances, and this makes their modeling a difficult task with several uncertainties. Interconnections of fractures have a major role in the transport of groundwater, but many of the stochastic methods in use do not have the capability to reproduce these complex structures. A methodology is presented for the quantification of tortuosity using the single normal equation simulation (SNESIM) algorithm and a groundwater flow model. A training image was produced based on the statistical parameters of fractures and then used in the simulation process. The SNESIM algorithm was used to generate 75 realizations of the four classes of fractures in a karst aquifer in Iran. The results from six dye tracing tests were used to assign hydraulic conductivity values to each class of fractures. In the next step, the MODFLOW-CFP and MODPATH codes were consecutively implemented to compute the groundwater flow paths. The 9,000 flow paths obtained from the MODPATH code were further analyzed to calculate the tortuosity factor. Finally, the hydraulic conductivity values calculated from the dye tracing experiments were refined using the actual flow paths of groundwater. The key outcomes of this research are: (1) a methodology for the quantification of tortuosity; (2) hydraulic conductivities that are underestimated (biased low) by empirical equations that assume Darcian (laminar) flow with parallel rather than tortuous streamlines; and (3) an understanding of the scale-dependence and non-normal distributions of tortuosity.
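
    The tortuosity factor referred to above is conventionally the along-path length of a flow path divided by the straight-line distance between its endpoints. A minimal computation over a polyline particle track:

```python
import math

def tortuosity(path):
    """Tortuosity factor of a flow path: along-path length divided by the
    straight-line distance between its endpoints (always >= 1)."""
    length = sum(math.dist(p, q) for p, q in zip(path, path[1:]))
    return length / math.dist(path[0], path[-1])

# A particle track that detours through an intermediate point:
# two 5-unit legs spanning a 6-unit straight-line distance
tau = tortuosity([(0.0, 0.0), (3.0, 4.0), (6.0, 0.0)])  # 10 / 6
```

    Applied to each MODPATH track, such a factor is what allows Darcian conductivity estimates to be corrected for the non-parallel streamlines noted in outcome (2).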

  3. A guide to differences between stochastic point-source and stochastic finite-fault simulations

    USGS Publications Warehouse

    Atkinson, G.M.; Assatourians, K.; Boore, D.M.; Campbell, K.; Motazedian, D.

    2009-01-01

    Why do stochastic point-source and finite-fault simulation models not agree on the predicted ground motions for moderate earthquakes at large distances? This question was posed by Ken Campbell, who attempted to reproduce the Atkinson and Boore (2006) ground-motion prediction equations for eastern North America using the stochastic point-source program SMSIM (Boore, 2005) in place of the finite-source stochastic program EXSIM (Motazedian and Atkinson, 2005) that was used by Atkinson and Boore (2006) in their model. His comparisons suggested that a higher stress drop is needed in the context of SMSIM to produce an average match, at larger distances, with the model predictions of Atkinson and Boore (2006) based on EXSIM; this is so even for moderate magnitudes, which should be well-represented by a point-source model. Why? The answer to this question is rooted in significant differences between point-source and finite-source stochastic simulation methodologies, specifically as implemented in SMSIM (Boore, 2005) and EXSIM (Motazedian and Atkinson, 2005) to date. Point-source and finite-fault methodologies differ in general in several important ways: (1) the geometry of the source; (2) the definition and application of duration; and (3) the normalization of finite-source subsource summations. Furthermore, the specific implementation of the methods may differ in their details. The purpose of this article is to provide a brief overview of these differences, their origins, and implications. This sets the stage for a more detailed companion article, "Comparing Stochastic Point-Source and Finite-Source Ground-Motion Simulations: SMSIM and EXSIM," in which Boore (2009) provides modifications and improvements in the implementations of both programs that narrow the gap and result in closer agreement. 
These issues are important because both SMSIM and EXSIM have been widely used in the development of ground-motion prediction equations and in modeling the parameters that control observed ground motions.

  4. Modelling remediation scenarios in historical mining catchments.

    PubMed

    Gamarra, Javier G P; Brewer, Paul A; Macklin, Mark G; Martin, Katherine

    2014-01-01

    Local remediation measures, particularly those undertaken in historical mining areas, can often be ineffective or even deleterious because erosion and sedimentation processes operate at spatial scales beyond those typically used in point-source remediation. Based on realistic simulations of a hybrid landscape evolution model combined with stochastic rainfall generation, we demonstrate that similar remediation strategies may result in differing effects across three contrasting European catchments depending on their topographic and hydrologic regimes. Based on these results, we propose a conceptual model of catchment-scale remediation effectiveness based on three basic catchment characteristics: the degree of contaminant source coupling, the ratio of contaminated to non-contaminated sediment delivery, and the frequency of sediment transport events.

  5. Filters for Improvement of Multiscale Data from Atomistic Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, David J.; Reynolds, Daniel R.

    Multiscale computational models strive to produce accurate and efficient numerical simulations of systems involving interactions across multiple spatial and temporal scales that typically differ by several orders of magnitude. Some such models utilize a hybrid continuum-atomistic approach combining continuum approximations with first-principles-based atomistic models to capture multiscale behavior. By following the heterogeneous multiscale method framework for developing multiscale computational models, unknown continuum scale data can be computed from an atomistic model. Concurrently coupling the two models requires performing numerous atomistic simulations which can dominate the computational cost of the method. Furthermore, when the resulting continuum data is noisy due to sampling error, stochasticity in the model, or randomness in the initial conditions, filtering can result in significant accuracy gains in the computed multiscale data without increasing the size or duration of the atomistic simulations. In this work, we demonstrate the effectiveness of spectral filtering for increasing the accuracy of noisy multiscale data obtained from atomistic simulations. Moreover, we present a robust and automatic method for closely approximating the optimum level of filtering in the case of additive white noise. Improving the accuracy of the filtered simulation data leads to dramatic computational savings by allowing shorter and smaller atomistic simulations to achieve the same desired multiscale simulation precision.

  6. Filters for Improvement of Multiscale Data from Atomistic Simulations

    DOE PAGES

    Gardner, David J.; Reynolds, Daniel R.

    2017-01-05

    Multiscale computational models strive to produce accurate and efficient numerical simulations of systems involving interactions across multiple spatial and temporal scales that typically differ by several orders of magnitude. Some such models utilize a hybrid continuum-atomistic approach combining continuum approximations with first-principles-based atomistic models to capture multiscale behavior. By following the heterogeneous multiscale method framework for developing multiscale computational models, unknown continuum scale data can be computed from an atomistic model. Concurrently coupling the two models requires performing numerous atomistic simulations which can dominate the computational cost of the method. Furthermore, when the resulting continuum data is noisy due to sampling error, stochasticity in the model, or randomness in the initial conditions, filtering can result in significant accuracy gains in the computed multiscale data without increasing the size or duration of the atomistic simulations. In this work, we demonstrate the effectiveness of spectral filtering for increasing the accuracy of noisy multiscale data obtained from atomistic simulations. Moreover, we present a robust and automatic method for closely approximating the optimum level of filtering in the case of additive white noise. Improving the accuracy of the filtered simulation data leads to dramatic computational savings by allowing shorter and smaller atomistic simulations to achieve the same desired multiscale simulation precision.
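
    A minimal illustration of the spectral-filtering idea, assuming additive white noise on a smooth signal. The DFT here is a naive O(n^2) implementation chosen only for self-containedness, and the cutoff is fixed by hand rather than chosen automatically as in the paper:

```python
import cmath
import math
import random

def dft(x):
    """Naive discrete Fourier transform (O(n^2), for self-containedness)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    """Inverse DFT, returning the real part of each sample."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n
            for t in range(n)]

def spectral_lowpass(x, keep):
    """Zero every Fourier mode whose frequency index exceeds `keep`."""
    X = dft(x)
    n = len(X)
    return idft([X[k] if min(k, n - k) <= keep else 0.0 for k in range(n)])

random.seed(0)
n = 64
clean = [math.sin(2.0 * math.pi * t / n) for t in range(n)]
noisy = [s + random.gauss(0.0, 0.3) for s in clean]
filtered = spectral_lowpass(noisy, keep=2)

def rms_error(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)) / len(u))

err_noisy = rms_error(noisy, clean)
err_filtered = rms_error(filtered, clean)  # filtering shrinks the error
```

    Because white noise spreads its power evenly across all modes while the smooth signal lives in a few low ones, discarding the high modes removes most of the noise at little cost to the signal.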

  7. Stochastic-field cavitation model

    NASA Astrophysics Data System (ADS)

    Dumond, J.; Magagnato, F.; Class, A.

    2013-07-01

    Nonlinear phenomena can often be well described using probability density functions (pdf) and pdf transport models. Traditionally, the simulation of pdf transport requires Monte-Carlo codes based on Lagrangian "particles" or prescribed pdf assumptions including binning techniques. Recently, in the field of combustion, a novel formulation called the stochastic-field method solving pdf transport based on Eulerian fields has been proposed which eliminates the necessity to mix Eulerian and Lagrangian techniques or prescribed pdf assumptions. In the present work, for the first time the stochastic-field method is applied to multi-phase flow and, in particular, to cavitating flow. To validate the proposed stochastic-field cavitation model, two applications are considered. First, sheet cavitation is simulated in a Venturi-type nozzle. The second application is an innovative fluidic diode which exhibits coolant flashing. Agreement with experimental results is obtained for both applications with a fixed set of model constants. The stochastic-field cavitation model captures the wide range of pdf shapes present at different locations.

  8. A cavitation model based on Eulerian stochastic fields

    NASA Astrophysics Data System (ADS)

    Magagnato, F.; Dumond, J.

    2013-12-01

    Non-linear phenomena can often be described using probability density functions (pdf) and pdf transport models. Traditionally the simulation of pdf transport requires Monte-Carlo codes based on Lagrangian "particles" or prescribed pdf assumptions including binning techniques. Recently, in the field of combustion, a novel formulation called the stochastic-field method solving pdf transport based on Eulerian fields has been proposed which eliminates the necessity to mix Eulerian and Lagrangian techniques or prescribed pdf assumptions. In the present work, for the first time the stochastic-field method is applied to multi-phase flow and in particular to cavitating flow. To validate the proposed stochastic-field cavitation model, two applications are considered. Firstly, sheet cavitation is simulated in a Venturi-type nozzle. The second application is an innovative fluidic diode which exhibits coolant flashing. Agreement with experimental results is obtained for both applications with a fixed set of model constants. The stochastic-field cavitation model captures the wide range of pdf shapes present at different locations.

  9. On generic obstructions to recovering correct statistics from climate simulations: Homogenization for deterministic maps and multiplicative noise

    NASA Astrophysics Data System (ADS)

    Gottwald, Georg; Melbourne, Ian

    2013-04-01

    Whereas diffusion limits of stochastic multi-scale systems have a long and successful history, the case of constructing stochastic parametrizations of chaotic deterministic systems has been much less studied. We present rigorous results of convergence of a chaotic slow-fast system to a stochastic differential equation with multiplicative noise. Furthermore, we present rigorous results for chaotic slow-fast maps, occurring as numerical discretizations of continuous time systems. This raises the issue of how to interpret certain stochastic integrals; surprisingly, the resulting integrals of the stochastic limit system are generically neither of Stratonovich nor of Ito type in the case of maps. It is shown that the limit system of a numerical discretisation is different from the associated continuous time system. This has important consequences when interpreting the statistics of long time simulations of multi-scale systems; they may be very different from those of the original continuous time system which we set out to study.

  10. Stochastic-field cavitation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumond, J., E-mail: julien.dumond@areva.com; AREVA GmbH, Erlangen, Paul-Gossen-Strasse 100, D-91052 Erlangen; Magagnato, F.

    2013-07-15

    Nonlinear phenomena can often be well described using probability density functions (pdf) and pdf transport models. Traditionally, the simulation of pdf transport requires Monte-Carlo codes based on Lagrangian “particles” or prescribed pdf assumptions including binning techniques. Recently, in the field of combustion, a novel formulation called the stochastic-field method solving pdf transport based on Eulerian fields has been proposed which eliminates the necessity to mix Eulerian and Lagrangian techniques or prescribed pdf assumptions. In the present work, for the first time the stochastic-field method is applied to multi-phase flow and, in particular, to cavitating flow. To validate the proposed stochastic-field cavitation model, two applications are considered. First, sheet cavitation is simulated in a Venturi-type nozzle. The second application is an innovative fluidic diode which exhibits coolant flashing. Agreement with experimental results is obtained for both applications with a fixed set of model constants. The stochastic-field cavitation model captures the wide range of pdf shapes present at different locations.

  11. Optimal Groundwater Extraction under Uncertainty and a Spatial Stock Externality

    EPA Science Inventory

    We introduce a model that incorporates two important elements to estimating welfare gains from groundwater management: stochasticity and a spatial stock externality. We estimate welfare gains resulting from optimal management under uncertainty as well as a gradual stock externali...

  12. Scalable hierarchical PDE sampler for generating spatially correlated random fields using nonmatching meshes: Scalable hierarchical PDE sampler using nonmatching meshes

    DOE PAGES

    Osborn, Sarah; Zulian, Patrick; Benson, Thomas; ...

    2018-01-30

    This work describes a domain embedding technique between two nonmatching meshes used for generating realizations of spatially correlated random fields with applications to large-scale sampling-based uncertainty quantification. The goal is to apply the multilevel Monte Carlo (MLMC) method for the quantification of output uncertainties of PDEs with random input coefficients on general and unstructured computational domains. We propose a highly scalable, hierarchical sampling method to generate realizations of a Gaussian random field on a given unstructured mesh by solving a reaction–diffusion PDE with a stochastic right-hand side. The stochastic PDE is discretized using the mixed finite element method on an embedded domain with a structured mesh, and then, the solution is projected onto the unstructured mesh. This work describes implementation details on how to efficiently transfer data from the structured and unstructured meshes at coarse levels, assuming that this can be done efficiently on the finest level. We investigate the efficiency and parallel scalability of the technique for the scalable generation of Gaussian random fields in three dimensions. An application of the MLMC method is presented for quantifying uncertainties of subsurface flow problems. Here, we demonstrate the scalability of the sampling method with nonmatching mesh embedding, coupled with a parallel forward model problem solver, for large-scale 3D MLMC simulations with up to 1.9·10⁹ unknowns.
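
    The PDE-sampler idea can be sketched in one dimension: solving a reaction-diffusion system with a white-noise right-hand side yields a spatially correlated Gaussian field. The grid size, correlation parameter, and zero-flux boundary treatment below are illustrative choices, not those of the paper:

```python
import random

def sample_grf_1d(n=200, corr=10.0, seed=3):
    """Draw one realization of a spatially correlated field on a 1D grid by
    solving (I - corr^2 * Laplacian) u = white noise, a small-scale analogue
    of the PDE sampler (illustrative parameters, not those of the paper).
    The tridiagonal system is solved with the Thomas algorithm."""
    rng = random.Random(seed)
    w = [rng.gauss(0.0, 1.0) for _ in range(n)]
    k2 = corr * corr
    a = [-k2] * n             # sub-diagonal
    b = [1.0 + 2.0 * k2] * n  # main diagonal
    c = [-k2] * n             # super-diagonal
    b[0] = b[-1] = 1.0 + k2   # zero-flux boundary rows
    for i in range(1, n):     # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        w[i] -= m * w[i - 1]
    u = [0.0] * n             # back substitution
    u[-1] = w[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (w[i] - c[i] * u[i + 1]) / b[i]
    return u

field = sample_grf_1d()  # neighbouring values are strongly correlated
```

    The inverse elliptic operator smooths the white noise over a length scale set by `corr`, which is how the SPDE approach turns an uncorrelated source into a correlated Gaussian field.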

  13. Scalable hierarchical PDE sampler for generating spatially correlated random fields using nonmatching meshes: Scalable hierarchical PDE sampler using nonmatching meshes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, Sarah; Zulian, Patrick; Benson, Thomas

    This work describes a domain embedding technique between two nonmatching meshes used for generating realizations of spatially correlated random fields with applications to large-scale sampling-based uncertainty quantification. The goal is to apply the multilevel Monte Carlo (MLMC) method for the quantification of output uncertainties of PDEs with random input coefficients on general and unstructured computational domains. We propose a highly scalable, hierarchical sampling method to generate realizations of a Gaussian random field on a given unstructured mesh by solving a reaction–diffusion PDE with a stochastic right-hand side. The stochastic PDE is discretized using the mixed finite element method on an embedded domain with a structured mesh, and then, the solution is projected onto the unstructured mesh. This work describes implementation details on how to efficiently transfer data from the structured and unstructured meshes at coarse levels, assuming that this can be done efficiently on the finest level. We investigate the efficiency and parallel scalability of the technique for the scalable generation of Gaussian random fields in three dimensions. An application of the MLMC method is presented for quantifying uncertainties of subsurface flow problems. Here, we demonstrate the scalability of the sampling method with nonmatching mesh embedding, coupled with a parallel forward model problem solver, for large-scale 3D MLMC simulations with up to 1.9·10⁹ unknowns.

  14. Analysis of fluid flow and solute transport through a single fracture with variable apertures intersecting a canister: Comparison between fractal and Gaussian fractures

    NASA Astrophysics Data System (ADS)

    Liu, L.; Neretnieks, I.

    Canisters with spent nuclear fuel will be deposited in fractured crystalline rock in the Swedish concept for a final repository. The fractures intersect the canister holes at different angles and have variable apertures, and therefore locally varying flowrates. Our previous model, with fractures of constant aperture and a 90° intersection angle, is now extended to arbitrary intersection angles and stochastically variable apertures. It is shown that the previous basic model can be simply amended to account for these effects. More importantly, it has been found that the distributions of the volumetric and the equivalent flow rates are all close to normal for both fractal and Gaussian fractures, with the mean of the distribution of the volumetric flow rate being determined solely by the hydraulic aperture, and that of the equivalent flow rate being determined by the mechanical aperture. Moreover, the standard deviation of the volumetric flow rates of the many realizations increases with increasing roughness and spatial correlation length of the aperture field, and so does that of the equivalent flow rates. Thus, two simple statistical relations can be developed to describe the stochastic properties of fluid flow and solute transport through a single fracture with spatially variable apertures. This obviates the need to simulate each fracture that intersects a canister in great detail, and allows the use of complex fractures also in very large fracture network models used in performance assessment.

  15. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed when optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general purpose framework for the optimization of terminating discrete-event simulation models. The methodology combines a chance constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
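
    A chance-constraint check of the kind used in such formulations can be sketched by replicating a terminating simulation and comparing the empirical exceedance probability against the allowed risk level. The three-activity duration model below is hypothetical:

```python
import random

def simulate_duration(rng):
    """Toy terminating simulation: total duration of three sequential
    stochastic activities with mean times 2, 3 and 4 (hypothetical model)."""
    return sum(rng.expovariate(1.0 / mean) for mean in (2.0, 3.0, 4.0))

def chance_constraint_ok(limit, alpha, reps=5000, seed=4):
    """Estimate P(duration > limit) by replication and check the chance
    constraint P(duration > limit) <= alpha."""
    rng = random.Random(seed)
    exceed = sum(simulate_duration(rng) > limit for _ in range(reps))
    return exceed / reps <= alpha

loose = chance_constraint_ok(limit=25.0, alpha=0.05)  # rarely exceeded
tight = chance_constraint_ok(limit=10.0, alpha=0.05)  # exceeded far too often
```

    In an optimization loop, candidate resource levels that fail this check are infeasible, which is how randomness in the simulated measures enters the problem formulation explicitly.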

  16. A Hybrid of the Chemical Master Equation and the Gillespie Algorithm for Efficient Stochastic Simulations of Sub-Networks.

    PubMed

    Albert, Jaroslav

    2016-01-01

    Modeling stochastic behavior of chemical reaction networks is an important endeavor in many aspects of chemistry and systems biology. The chemical master equation (CME) and the Gillespie algorithm (GA) are the two most fundamental approaches to such modeling; however, each of them has its own limitations: the GA may require long computing times, while the CME may demand unrealistic memory storage capacity. We propose a method that combines the CME and the GA that allows one to simulate stochastically a part of a reaction network. First, a reaction network is divided into two parts. The first part is simulated via the GA, while the solution of the CME for the second part is fed into the GA in order to update its propensities. The advantage of this method is that it avoids the need to solve the CME or stochastically simulate the entire network, which makes it highly efficient. One of its drawbacks, however, is that most of the information about the second part of the network is lost in the process. Therefore, this method is most useful when only partial information about a reaction network is needed. We tested this method against the GA on two systems of interest in biology--the gene switch and the Griffith model of a genetic oscillator--and have shown it to be highly accurate. Comparing this method to four different stochastic algorithms revealed it to be at least an order of magnitude faster than the fastest among them.
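
    The CME half of such a hybrid requires integrating the master equation on a truncated state space. A forward-Euler sketch for a birth-death process (illustrative rates, not the gene-switch or Griffith models):

```python
def cme_step(p, k_in, k_out, dt):
    """One forward-Euler step of the chemical master equation for a
    birth-death process, truncated at n_max = len(p) - 1 (births are
    switched off in the top state so probability is conserved)."""
    n_max = len(p) - 1
    q = p[:]
    for n in range(n_max + 1):
        outflow = (k_in if n < n_max else 0.0) + k_out * n
        inflow = (k_in * p[n - 1] if n > 0 else 0.0) \
               + (k_out * (n + 1) * p[n + 1] if n < n_max else 0.0)
        q[n] += dt * (inflow - outflow * p[n])
    return q

# Illustrative rates: the steady state is Poisson with mean k_in / k_out = 5
k_in, k_out = 5.0, 1.0
p = [1.0] + [0.0] * 30          # start with zero molecules
for _ in range(20000):          # integrate to t = 20, well past relaxation
    p = cme_step(p, k_in, k_out, 1e-3)
mean_n = sum(n * pn for n, pn in enumerate(p))  # approaches 5
```

    In the hybrid scheme described above, a distribution like `p` for the slow subnetwork is what would be fed back into the Gillespie propensities of the simulated part.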

  17. Spatial stochastic modelling of the Hes1 gene regulatory network: intrinsic noise can explain heterogeneity in embryonic stem cell differentiation

    PubMed Central

    Sturrock, Marc; Hellander, Andreas; Matzavinos, Anastasios; Chaplain, Mark A. J.

    2013-01-01

    Individual mouse embryonic stem cells have been found to exhibit highly variable differentiation responses under the same environmental conditions. The noisy cyclic expression of Hes1 and its downstream genes are known to be responsible for this, but the mechanism underlying this variability in expression is not well understood. In this paper, we show that the observed experimental data and diverse differentiation responses can be explained by a spatial stochastic model of the Hes1 gene regulatory network. We also propose experiments to control the precise differentiation response using drug treatment. PMID:23325756

  18. Spatial dynamics of invasion: the geometry of introduced species.

    PubMed

    Korniss, Gyorgy; Caraco, Thomas

    2005-03-07

    Many exotic species combine low probability of establishment at each introduction with rapid population growth once introduction does succeed. To analyse this phenomenon, we note that invaders often cluster spatially when rare, and consequently an introduced exotic's population dynamics should depend on locally structured interactions. Ecological theory for spatially structured invasion relies on deterministic approximations, and determinism does not address the observed uncertainty of the exotic-introduction process. We take a new approach to the population dynamics of invasion and, by extension, to the general question of invasibility in any spatial ecology. We apply the physical theory for nucleation of spatial systems to a lattice-based model of competition between plant species, a resident and an invader, and the analysis reaches conclusions that differ qualitatively from the standard ecological theories. Nucleation theory distinguishes between dynamics of single- and multi-cluster invasion. Low introduction rates and small system size produce single-cluster dynamics, where success or failure of introduction is inherently stochastic. Single-cluster invasion occurs only if the cluster reaches a critical size, typically preceded by a number of failed attempts. For this case, we identify the functional form of the probability distribution of time elapsing until invasion succeeds. Although multi-cluster invasion for sufficiently large systems exhibits spatial averaging and almost-deterministic dynamics of the global densities, an analytical approximation from nucleation theory, known as Avrami's law, describes our simulation results far better than standard ecological approximations.
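
    Avrami's law mentioned above gives the transformed (here, invaded) fraction as 1 - exp(-k t^n). A quick evaluation, with k and the Avrami exponent n chosen purely for illustration:

```python
import math

def avrami(t, k, n):
    """Avrami's law: fraction of the system transformed (here, invaded)
    by time t under multi-cluster nucleation and growth."""
    return 1.0 - math.exp(-k * t ** n)

# With k = 1 and exponent n = 3, half the system is invaded at
# t_half = (ln 2)**(1/3)
t_half = math.log(2.0) ** (1.0 / 3.0)
frac = avrami(t_half, 1.0, 3.0)  # 0.5
```

    The sigmoidal shape of this curve is what the lattice simulations in the multi-cluster regime are reported to follow more closely than the standard ecological approximations.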

  19. Modeling and Properties of Nonlinear Stochastic Dynamical System of Continuous Culture

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Feng, Enmin; Ye, Jianxiong; Xiu, Zhilong

    The stochastic counterpart to the deterministic description of continuous fermentation with ordinary differential equations is investigated for the process of glycerol bio-dissimilation to 1,3-propanediol by Klebsiella pneumoniae. We briefly discuss the continuous fermentation process driven by three-dimensional Brownian motion and Lipschitz coefficients, which is suitable for the actual fermentation. Subsequently, we study the existence and uniqueness of solutions for the stochastic system, as well as the boundedness of the second-order moment and the Markov property of the solution. Finally, stochastic simulation is carried out using the Euler-Maruyama method.
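
    The Euler-Maruyama scheme used for the final simulations can be sketched for a scalar SDE. The Ornstein-Uhlenbeck drift and diffusion below are a standard test case, not the fermentation model:

```python
import math
import random

def euler_maruyama(mu, sigma, x0, t_end, dt, seed=0):
    """Integrate the scalar SDE dX = mu(X) dt + sigma(X) dW with the
    Euler-Maruyama scheme and return X(t_end)."""
    rng = random.Random(seed)
    x, t = x0, 0.0
    while t < t_end:
        x += mu(x) * dt + sigma(x) * rng.gauss(0.0, math.sqrt(dt))
        t += dt
    return x

# Ornstein-Uhlenbeck test case dX = -X dt + 0.5 dW: the process forgets its
# initial condition, so the ensemble mean decays toward zero
endpoints = [euler_maruyama(lambda x: -x, lambda x: 0.5, 1.0, 8.0, 0.01, seed=s)
             for s in range(200)]
ensemble_mean = sum(endpoints) / len(endpoints)  # close to 0
```

    Each Brownian increment is drawn as a Gaussian with variance dt; a vector-valued version of the same update handles the three-dimensional Brownian motion of the fermentation model.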

  20. The Sharma-Parthasarathy stochastic two-body problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cresson, J.; SYRTE/Observatoire de Paris, 75014 Paris; Pierret, F.

    2015-03-15

    We study the Sharma-Parthasarathy stochastic two-body problem introduced by Sharma and Parthasarathy in [“Dynamics of a stochastically perturbed two-body problem,” Proc. R. Soc. A 463, 979-1003 (2007)]. In particular, we focus on the preservation of some fundamental features of the classical two-body problem, such as the Hamiltonian structure and first integrals, in the stochastic case. Numerical simulations are performed which illustrate the dynamical behaviour of the osculating elements, such as the semi-major axis, the eccentricity, and the pericenter. We also derive a stochastic version of Gauss’s equations in the planar case.
