Sample records for multiple simulation models

  1. Improvements in simulation of multiple scattering effects in ATLAS fast simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Basalaev, A. E., E-mail: artem.basalaev@cern.ch

    The Fast ATLAS Tracking Simulation (Fatras) package was verified on a single-layer geometry against full simulation with GEANT4. Fatras hadronic interaction and multiple scattering simulation were studied in comparison with GEANT4, and disagreement was found in the multiple scattering distributions of primary charged particles (μ, π, e). A new model for multiple scattering simulation, based on R. Frühwirth’s mixture models, was implemented in Fatras. The new model was tested on the single-layer geometry, and good agreement with GEANT4 was achieved. A comparison of reconstructed track parameters was also performed for the Inner Detector geometry, and Fatras with the new multiple scattering model showed better agreement with GEANT4. The new multiple scattering model was added to the Fatras package in the development release of the ATLAS software, ATHENA.
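
    As a rough illustration of the mixture-model idea, the hedged sketch below draws projected scattering angles from a two-component Gaussian mixture (narrow core plus wider tail) in the spirit of Frühwirth's models. The Highland formula for the core width and the tail weight/scale values are illustrative assumptions, not Fatras parameters.

    ```python
    import numpy as np

    def highland_sigma(p_mev, beta, x_over_x0):
        """Highland formula: core width (radians) of the projected
        multiple-scattering angle for momentum p (MeV/c) over a
        thickness of x/X0 radiation lengths."""
        return (13.6 / (beta * p_mev)) * np.sqrt(x_over_x0) * (
            1.0 + 0.038 * np.log(x_over_x0))

    def sample_scattering_angles(n, p_mev=1000.0, beta=1.0, x_over_x0=0.02,
                                 w_tail=0.1, tail_scale=3.0, rng=None):
        """Draw projected angles from a two-component Gaussian mixture:
        a narrow core plus a wider tail (weights are illustrative)."""
        if rng is None:
            rng = np.random.default_rng(0)
        sigma_core = highland_sigma(p_mev, beta, x_over_x0)
        is_tail = rng.random(n) < w_tail
        sigma = np.where(is_tail, tail_scale * sigma_core, sigma_core)
        return rng.normal(0.0, sigma)

    angles = sample_scattering_angles(100000)
    print(f"core sigma ~ {highland_sigma(1000.0, 1.0, 0.02):.2e} rad, "
          f"sample std = {angles.std():.2e} rad")
    ```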

  2. Theory, modeling, and simulation of structural and functional materials: Micromechanics, microstructures, and properties

    NASA Astrophysics Data System (ADS)

    Jin, Yongmei

    In recent years, theoretical modeling and computational simulation of microstructure evolution and materials properties have been attracting much attention. While significant advances have been made, two major challenges remain. One is the integration of multiple physical phenomena for simulation of complex materials behavior; the other is the bridging of multiple length and time scales in materials modeling and simulation. The research presented in this Thesis focuses mainly on the first major challenge. In this Thesis, a unified Phase Field Microelasticity (PFM) approach is developed: an advanced version of the phase field method that takes into account the exact elasticity of arbitrarily anisotropic, elastically and structurally inhomogeneous systems. The proposed theory and models are applicable to infinite solids, elastic half-spaces, and finite bodies with arbitrarily shaped free surfaces, which may undergo various concomitant physical processes. The Phase Field Microelasticity approach is employed to formulate theories and models of martensitic transformation, dislocation dynamics, and crack evolution in single-crystal and polycrystalline solids. It is also used to study strain relaxation in heteroepitaxial thin films through misfit dislocations and surface roughening, and magnetic domain evolution in nanocrystalline thin films is investigated as well. Numerous simulation studies are performed, and comparisons with analytical predictions and experimental observations are presented. The agreement verifies the theory and models as realistic simulation tools for computational materials science and engineering. The common Phase Field Microelasticity formalism of the individual models of different physical phenomena makes it easy to integrate multiple physical processes into one unified simulation model, in which the multiple phenomena are treated as relaxation modes that together act as one cooperative phenomenon. The model does not impose a priori constraints on possible microstructure evolution paths, which gives it predictive power: the material system itself "chooses" the optimal path for the multiple processes. The advances made in this Thesis represent a significant step toward overcoming the first challenge, mesoscale multi-physics modeling and simulation of materials. At the end of this Thesis, a way to tackle the second challenge, bridging multiple length and time scales in materials modeling and simulation, is discussed, based on connecting mesoscale Phase Field Microelasticity modeling with microscopic atomistic calculations and macroscopic continuum theory.

  3. Using multi-criteria analysis of simulation models to understand complex biological systems

    Treesearch

    Maureen C. Kennedy; E. David Ford

    2011-01-01

    Scientists frequently use computer-simulation models to help solve complex biological problems. Typically, such models are highly integrated, they produce multiple outputs, and standard methods of model analysis are ill suited for evaluating them. We show how multi-criteria optimization with Pareto optimality allows for model outputs to be compared to multiple system...
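
    The key tool in this record is Pareto optimality over multiple output criteria. Below is a minimal, hedged sketch of a non-domination filter; the candidate error values are hypothetical.

    ```python
    import numpy as np

    def pareto_optimal(errors):
        """Return a boolean mask of non-dominated rows. errors is an
        (n_candidates, n_criteria) array, lower is better on every
        criterion; a candidate is dominated if another is <= on all
        criteria and strictly < on at least one."""
        errors = np.asarray(errors, dtype=float)
        n = errors.shape[0]
        keep = np.ones(n, dtype=bool)
        for i in range(n):
            for j in range(n):
                if i != j and np.all(errors[j] <= errors[i]) \
                        and np.any(errors[j] < errors[i]):
                    keep[i] = False
                    break
        return keep

    # Hypothetical errors of 5 parameterizations on 3 model outputs.
    errs = np.array([[0.2, 0.9, 0.4],
                     [0.3, 0.3, 0.5],
                     [0.1, 1.0, 0.6],
                     [0.3, 0.4, 0.5],   # dominated by row 1
                     [0.9, 0.2, 0.1]])
    print(np.flatnonzero(pareto_optimal(errs)))  # -> [0 1 2 4]
    ```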

  4. Theoretical Models of Protostellar Binary and Multiple Systems with AMR Simulations

    NASA Astrophysics Data System (ADS)

    Matsumoto, Tomoaki; Tokuda, Kazuki; Onishi, Toshikazu; Inutsuka, Shu-ichiro; Saigo, Kazuya; Takakuwa, Shigehisa

    2017-05-01

    We present theoretical models for protostellar binary and multiple systems based on high-resolution numerical simulations with an adaptive mesh refinement (AMR) code, SFUMATO. Recent ALMA observations have revealed the early phases of binary and multiple star formation at high spatial resolution, and these observations should be compared with theoretical models of comparable resolution. We present two theoretical models: (1) a high-density molecular cloud core, MC27/L1521F, and (2) a protobinary system, L1551 NE. For the MC27 model, we performed numerical simulations of the gravitational collapse of a turbulent cloud core. The cloud core fragments during the collapse, and dynamical interaction between the fragments produces an arc-like structure, one of the prominent structures observed by ALMA. For the L1551 NE model, we performed numerical simulations of gas accretion onto the protobinary. The simulations exhibit asymmetry of the circumbinary disk; such asymmetry has also been observed by ALMA in the circumbinary disk of L1551 NE.

  5. Simulation Platform: a cloud-based online simulation environment.

    PubMed

    Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro

    2011-09-01

    For multi-scale and multi-modal neural modeling, it is necessary to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have been designed solely to archive model files, but a database should also give users a chance to validate models before downloading them. In this paper, we report our ongoing project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On each virtual machine, various software packages are pre-installed, including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without needing to install any software other than a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Reprint of: Simulation Platform: a cloud-based online simulation environment.

    PubMed

    Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro

    2011-11-01

    For multi-scale and multi-modal neural modeling, it is necessary to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have been designed solely to archive model files, but a database should also give users a chance to validate models before downloading them. In this paper, we report our ongoing project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On each virtual machine, various software packages are pre-installed, including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without needing to install any software other than a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. Specifying and Refining a Measurement Model for a Simulation-Based Assessment. CSE Report 619.

    ERIC Educational Resources Information Center

    Levy, Roy; Mislevy, Robert J.

    2004-01-01

    The challenges of modeling students' performance in simulation-based assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance in a complex assessment. This paper describes a Bayesian approach to modeling and estimating…

  8. Stochastic molecular model of enzymatic hydrolysis of cellulose for ethanol production

    PubMed Central

    2013-01-01

    Background During cellulosic ethanol production, cellulose hydrolysis is achieved by the synergistic action of a cellulase enzyme complex consisting of multiple enzymes with different modes of action. Enzymatic hydrolysis of cellulose is one of the bottlenecks in the commercialization of the process due to low hydrolysis rates and the high cost of enzymes. A robust hydrolysis model that can predict hydrolysis profiles under various scenarios can act as an important forecasting tool for improving the hydrolysis process. However, the multiple factors affecting hydrolysis, namely cellulose structure and complex enzyme-substrate interactions during hydrolysis, make it difficult to develop mathematical kinetic models that can simulate hydrolysis in the presence of multiple enzymes with high fidelity. In this study, a comprehensive hydrolysis model based on a stochastic molecular modeling approach, in which each hydrolysis event is translated into a discrete event, is presented. The model captures the structural features of cellulose, enzyme properties (modes of action, synergism, inhibition), and, most importantly, the dynamic morphological changes in the substrate that directly affect enzyme-substrate interactions during hydrolysis. Results Cellulose was modeled as a group of microfibrils consisting of bundles of elementary fibrils, where each elementary fibril was represented as a three-dimensional matrix of glucose molecules. Hydrolysis of cellulose was simulated using a Monte Carlo simulation technique. Cellulose hydrolysis results predicted by model simulations agree well with experimental data from the literature. Coefficients of determination for model predictions against experimental values were in the range of 0.75 to 0.96 for Avicel hydrolysis by CBH I action. The model was able to simulate the synergistic action of multiple enzymes during hydrolysis, and the simulations captured the important experimental observations: the effects of structural properties, enzyme inhibition and enzyme loadings on hydrolysis, and the degree of synergism among enzymes. Conclusions The model was effective in capturing the dynamic behavior of cellulose hydrolysis during the action of individual as well as multiple cellulases. Simulations were in qualitative and quantitative agreement with experimental data. Several experimentally observed phenomena were simulated without the need for any additional assumptions or parameter changes, confirming the validity of the stochastic molecular modeling approach for describing cellulose hydrolysis quantitatively and qualitatively. PMID:23638989
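
    A hedged, one-dimensional caricature of the event-based Monte Carlo idea is sketched below: chains are reduced by random endoglucanase-like cuts, and fragments small enough to be soluble are tallied. The chain counts, event count, and solubility cutoff are illustrative assumptions; the paper's model tracks a full 3D fibril structure and multiple enzyme classes.

    ```python
    import random

    def simulate_hydrolysis(n_chains=200, dp0=500, n_events=30000, seed=1):
        """Toy stochastic hydrolysis: each event picks a random chain and
        cuts it at a random internal bond (endoglucanase-like action).
        Fragments of DP <= 2 (glucose/cellobiose) count as soluble."""
        random.seed(seed)
        chains = [dp0] * n_chains          # degrees of polymerization
        soluble_glucose = 0
        for _ in range(n_events):
            i = random.randrange(len(chains))
            dp = chains[i]
            if dp < 2:
                continue
            cut = random.randrange(1, dp)  # bond position
            chains.pop(i)
            for frag in (cut, dp - cut):
                if frag <= 2:              # soluble product
                    soluble_glucose += frag
                else:
                    chains.append(frag)
        return soluble_glucose / (n_chains * dp0)

    print(f"fraction hydrolyzed to soluble sugars: {simulate_hydrolysis():.2%}")
    ```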

  9. Simulation's Ensemble is Better Than Ensemble Simulation

    NASA Astrophysics Data System (ADS)

    Yan, X.

    2017-12-01

    Yan Xiaodong, State Key Laboratory of Earth Surface Processes and Resource Ecology (ESPRE), Beijing Normal University, 19 Xinjiekouwai Street, Haidian District, Beijing 100875, China. Email: yxd@bnu.edu.cn. A dynamical system is simulated from an initial state. However, initial state data carry great uncertainty, which leads to uncertainty in the simulation. Simulation from multiple possible initial states has therefore been widely used in atmospheric science and has indeed been shown to lower this uncertainty; it is called a simulation's ensemble because multiple simulation results are fused. In the ecological field, individual-based model simulation (forest gap models, for example) can be regarded as a simulation's ensemble when compared with community-based simulation (most ecosystem models). In this talk, we address the advantages of individual-based simulation and its ensembles.

  10. A multiple-point geostatistical approach to quantifying uncertainty for flow and transport simulation in geologically complex environments

    NASA Astrophysics Data System (ADS)

    Cronkite-Ratcliff, C.; Phelps, G. A.; Boucher, A.

    2011-12-01

    In many geologic settings, the pathways of groundwater flow are controlled by geologic heterogeneities which have complex geometries. Models of these geologic heterogeneities, and consequently, their effects on the simulated pathways of groundwater flow, are characterized by uncertainty. Multiple-point geostatistics, which uses a training image to represent complex geometric descriptions of geologic heterogeneity, provides a stochastic approach to the analysis of geologic uncertainty. Incorporating multiple-point geostatistics into numerical models provides a way to extend this analysis to the effects of geologic uncertainty on the results of flow simulations. We present two case studies to demonstrate the application of multiple-point geostatistics to numerical flow simulation in complex geologic settings with both static and dynamic conditioning data. Both cases involve the development of a training image from a complex geometric description of the geologic environment. Geologic heterogeneity is modeled stochastically by generating multiple equally-probable realizations, all consistent with the training image. Numerical flow simulation for each stochastic realization provides the basis for analyzing the effects of geologic uncertainty on simulated hydraulic response. The first case study is a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. The SNESIM algorithm is used to stochastically model geologic heterogeneity conditioned to the mapped surface geology as well as vertical drill-hole data. Numerical simulation of groundwater flow and contaminant transport through geologic models produces a distribution of hydraulic responses and contaminant concentration results. From this distribution of results, the probability of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary. The second case study considers a characteristic lava-flow aquifer system in Pahute Mesa, Nevada. A 3D training image is developed by using object-based simulation of parametric shapes to represent the key morphologic features of rhyolite lava flows embedded within ash-flow tuffs. In addition to vertical drill-hole data, transient pressure head data from aquifer tests can be used to constrain the stochastic model outcomes. The use of both static and dynamic conditioning data allows the identification of potential geologic structures that control hydraulic response. These case studies demonstrate the flexibility of the multiple-point geostatistics approach for considering multiple types of data and for developing sophisticated models of geologic heterogeneities that can be incorporated into numerical flow simulations.
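
    The exceedance-probability analysis described above reduces to a simple computation once per-realization simulation results are available. The sketch below uses synthetic concentration fields as stand-ins for SNESIM-plus-flow-model output; the threshold is illustrative.

    ```python
    import numpy as np

    # Hypothetical stack of simulated concentration fields: one 2D grid
    # per equally probable geologic realization.
    rng = np.random.default_rng(42)
    n_real, ny, nx = 50, 20, 30
    conc = rng.lognormal(mean=-1.0, sigma=1.0, size=(n_real, ny, nx))

    threshold = 0.5  # illustrative concentration limit

    # Probability of exceedance at each cell = fraction of realizations
    # above the threshold; contours of this map express uncertainty in
    # the location of the plume boundary.
    p_exceed = (conc > threshold).mean(axis=0)
    print(p_exceed.shape, float(p_exceed.min()), float(p_exceed.max()))
    ```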

  11. Multi-objective optimization for generating a weighted multi-model ensemble

    NASA Astrophysics Data System (ADS)

    Lee, H.

    2017-12-01

    Many studies have demonstrated that multi-model ensembles generally show better skill than each ensemble member. When generating weighted multi-model ensembles, the first step is measuring the performance of individual model simulations using observations. There is a consensus on the assignment of weighting factors based on a single evaluation metric. When considering only one evaluation metric, the weighting factor for each model is proportional to a performance score or inversely proportional to an error for the model. While this conventional approach can provide appropriate combinations of multiple models, the approach confronts a big challenge when there are multiple metrics under consideration. When considering multiple evaluation metrics, it is obvious that a simple averaging of multiple performance scores or model ranks does not address the trade-off problem between conflicting metrics. So far, there seems to be no best method to generate weighted multi-model ensembles based on multiple performance metrics. The current study applies the multi-objective optimization, a mathematical process that provides a set of optimal trade-off solutions based on a range of evaluation metrics, to combining multiple performance metrics for the global climate models and their dynamically downscaled regional climate simulations over North America and generating a weighted multi-model ensemble. NASA satellite data and the Regional Climate Model Evaluation System (RCMES) software toolkit are used for assessment of the climate simulations. Overall, the performance of each model differs markedly with strong seasonal dependence. Because of the considerable variability across the climate simulations, it is important to evaluate models systematically and make future projections by assigning optimized weighting factors to the models with relatively good performance. Our results indicate that the optimally weighted multi-model ensemble always shows better performance than an arithmetic ensemble mean and may provide reliable future projections.
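
    As a hedged sketch of the final weighting step (not the RCMES implementation, and with a toy scalarization standing in for the paper's multi-objective optimization), the snippet below normalizes per-metric errors, forms inverse-error weights, and compares a weighted ensemble with the arithmetic mean. All numbers are hypothetical.

    ```python
    import numpy as np

    # Hypothetical errors of 4 models on 3 evaluation metrics (lower = better).
    errors = np.array([[1.2, 0.8, 2.0],
                       [0.9, 1.1, 1.5],
                       [2.0, 2.5, 3.0],
                       [1.0, 0.9, 1.8]])

    # Normalize each metric so no single metric dominates, then use the
    # inverse mean normalized error as a stand-in for optimized weights.
    norm = errors / errors.max(axis=0)
    weights = 1.0 / norm.mean(axis=1)
    weights /= weights.sum()

    # Weighted multi-model ensemble of hypothetical projections.
    projections = np.array([2.1, 1.8, 3.5, 2.0])   # e.g. warming (K) per model
    print("weights:", np.round(weights, 3))
    print("weighted ensemble:", float(weights @ projections))
    print("arithmetic mean:  ", float(projections.mean()))
    ```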

  12. Predicting the oral pharmacokinetic profiles of multiple-unit (pellet) dosage forms using a modeling and simulation approach coupled with biorelevant dissolution testing: case example diclofenac sodium.

    PubMed

    Kambayashi, Atsushi; Blume, Henning; Dressman, Jennifer B

    2014-07-01

    The objective of this research was to characterize the dissolution profile of a poorly soluble drug, diclofenac, from a commercially available multiple-unit enteric-coated dosage form, Diclo-Puren® capsules, and to develop a predictive model for its oral pharmacokinetic profile. The paddle method was used to obtain the dissolution profiles of this dosage form in biorelevant media, with the exposure to simulated gastric conditions varied in order to simulate the gastric emptying behavior of pellets. A modified Noyes-Whitney theory was subsequently fitted to the dissolution data. A physiologically based pharmacokinetic (PBPK) model for multiple-unit dosage forms was designed using STELLA® software and coupled with the biorelevant dissolution profiles in order to simulate the plasma concentration profiles of diclofenac from Diclo-Puren® capsules in both the fasted and fed state in humans. Gastric emptying kinetics relevant to multiple-unit pellets were incorporated into the PBPK model by setting up a virtual patient population to account for physiological variations in emptying kinetics. Using in vitro biorelevant dissolution coupled with in silico PBPK modeling and simulation, it was possible to predict the plasma profile of this multiple-unit formulation of diclofenac after oral administration in both the fasted and fed state. This approach might be useful for predicting variability in the plasma profiles of other drugs housed in multiple-unit dosage forms. Copyright © 2014 Elsevier B.V. All rights reserved.
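
    A minimal sketch of a basic Noyes-Whitney dissolution law is given below; the rate constant, solubility, dose, and volume are illustrative assumptions, not the fitted parameters or the modified form used in the paper.

    ```python
    import numpy as np

    def noyes_whitney(k, c_s, dose, v, t_end=8.0, dt=0.01):
        """Integrate a basic Noyes-Whitney law by forward Euler:
        d(dissolved)/dt = k * remaining * (Cs - C), with C the bulk
        concentration in volume v. Returns time and dissolved fraction."""
        t = np.arange(0.0, t_end, dt)
        dissolved = np.zeros_like(t)
        for i in range(1, len(t)):
            remaining = dose - dissolved[i - 1]
            c = dissolved[i - 1] / v
            rate = k * remaining * max(c_s - c, 0.0)
            dissolved[i] = dissolved[i - 1] + rate * dt
        return t, dissolved / dose

    t, frac = noyes_whitney(k=0.05, c_s=0.8, dose=50.0, v=900.0)
    print(f"fraction dissolved after {t[-1]:.1f} h: {frac[-1]:.2f}")
    ```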

  13. SDG and qualitative trend based model multiple scale validation

    NASA Astrophysics Data System (ADS)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness: validation is carried out at a single scale and depends on human experience. A multiple-scale validation method based on the SDG (Signed Directed Graph) and qualitative trends is proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the method is demonstrated by validating a reactor model.

  14. Multiple-Shrinkage Multinomial Probit Models with Applications to Simulating Geographies in Public Use Data.

    PubMed

    Burgette, Lane F; Reiter, Jerome P

    2013-06-01

    Multinomial outcomes with many levels can be challenging to model. Information typically accrues slowly with increasing sample size, yet the parameter space expands rapidly with additional covariates. Shrinking all regression parameters towards zero, as often done in models of continuous or binary response variables, is unsatisfactory, since setting parameters equal to zero in multinomial models does not necessarily imply "no effect." We propose an approach to modeling multinomial outcomes with many levels based on a Bayesian multinomial probit (MNP) model and a multiple shrinkage prior distribution for the regression parameters. The prior distribution encourages the MNP regression parameters to shrink toward a number of learned locations, thereby substantially reducing the dimension of the parameter space. Using simulated data, we compare the predictive performance of this model against two other recently proposed methods for big multinomial models. The results suggest that the fully Bayesian, multiple shrinkage approach can outperform these other methods. We apply the multiple shrinkage MNP to simulating replacement values for areal identifiers, e.g., census tract indicators, in order to protect data confidentiality in public use datasets.

  15. MODFLOW-2005, The U.S. Geological Survey Modular Ground-Water Model - Documentation of the Multiple-Refined-Areas Capability of Local Grid Refinement (LGR) and the Boundary Flow and Head (BFH) Package

    USGS Publications Warehouse

    Mehl, Steffen W.; Hill, Mary C.

    2007-01-01

    This report documents the addition of the multiple-refined-areas capability to the shared-node Local Grid Refinement (LGR) and the Boundary Flow and Head (BFH) Package of MODFLOW-2005, the U.S. Geological Survey modular, three-dimensional, finite-difference ground-water flow model. LGR now provides the capability to simulate ground-water flow by using one or more block-shaped, higher resolution local grids (child models) within a coarser grid (parent model). LGR accomplishes this by iteratively coupling separate MODFLOW-2005 models such that heads and fluxes are balanced across the shared interfacing boundaries. The ability to have multiple, nonoverlapping areas of refinement is important in situations where there is more than one area of concern within a regional model. In this circumstance, LGR can be used to simulate these distinct areas with higher resolution grids. LGR can be used in two- and three-dimensional, steady-state and transient simulations and for simulations of confined and unconfined ground-water systems. The BFH Package can be used to simulate these situations by using either the parent or child models independently.

  16. Equivalent circuit model of Ge/Si separate absorption charge multiplication avalanche photodiode

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Chen, Ting; Yan, Linshu; Bao, Xiaoyuan; Xu, Yuanyuan; Wang, Guang; Wang, Guanyu; Yuan, Jun; Li, Junfeng

    2018-03-01

    An equivalent circuit model of the Ge/Si Separate Absorption Charge Multiplication Avalanche Photodiode (SACM-APD) is proposed. Starting from the carrier rate equations in the different regions of the device, and considering the influences of the non-uniform electric field, noise, parasitic effects and other factors, an equivalent circuit model of the SACM-APD device is established in which the steady-state and transient current-voltage characteristics can be described exactly. In addition, the proposed Ge/Si SACM-APD equivalent circuit model is embedded in the PSpice simulator. Important characteristics of the Ge/Si SACM-APD, such as dark current, frequency response and shot noise, are simulated, and the results obtained with the proposed model are in good agreement with experimental results.
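
    Circuit-level APD models commonly include an empirical avalanche gain element; the sketch below uses Miller's formula as an illustrative stand-in, with a hypothetical breakdown voltage and exponent rather than fitted Ge/Si SACM-APD values.

    ```python
    def miller_gain(v_bias, v_breakdown=25.0, n=3.0):
        """Miller's empirical avalanche multiplication factor,
        M = 1 / (1 - (V/VB)^n). VB and n are illustrative values,
        not fitted Ge/Si SACM-APD parameters; valid for V < VB."""
        return 1.0 / (1.0 - (v_bias / v_breakdown) ** n)

    # Gain rises sharply as the bias approaches breakdown.
    for v in (5.0, 15.0, 22.0, 24.0):
        print(f"V = {v:4.1f} V  ->  M = {miller_gain(v):6.1f}")
    ```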

  17. Ascertaining Validity in the Abstract Realm of PMESII Simulation Models: An Analysis of the Peace Support Operations Model (PSOM)

    DTIC Science & Technology

    2009-06-01

    simulation is the campaign-level Peace Support Operations Model (PSOM). This thesis provides a quantitative analysis of PSOM. The results are based ... multiple potential outcomes; further development and analysis is required before the model is used for large-scale analysis.

  18. Partial Validation of Multibody Program to Optimize Simulated Trajectories II (POST II) Parachute Simulation With Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Ben; Queen, Eric M.

    2002-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed. This capability uses the Program to Optimize Simulated Trajectories II (POST II). Previously, POST II had the ability to simulate multiple bodies without interacting forces. The current implementation is used for the simulation of parachute trajectories, in which the parachute and suspended bodies can be treated as rigid bodies. An arbitrary set of connecting lines can be included in the model; these are treated as massless spring-dampers. This paper discusses details of the connection-line modeling and results of several test cases used to validate the capability.
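
    The connection-line model described above (massless spring-dampers that can pull but not push) can be sketched as below; the stiffness, damping, and geometry are illustrative values, not POST II inputs.

    ```python
    import numpy as np

    def line_force(p1, v1, p2, v2, k, c, l0):
        """Tension force on body 1 from a massless spring-damper line
        between attachment points p1 and p2 (velocities v1, v2). A real
        line cannot push, so the force is zero when slack."""
        d = p2 - p1
        length = np.linalg.norm(d)
        if length < 1e-12 or length <= l0:
            return np.zeros(3)             # slack line carries no load
        u = d / length                     # unit vector along the line
        stretch_rate = np.dot(v2 - v1, u)  # rate of change of line length
        tension = k * (length - l0) + c * stretch_rate
        return max(tension, 0.0) * u       # damping cannot make it push

    # Parachute riser example (illustrative numbers, not POST II inputs).
    f = line_force(p1=np.zeros(3), v1=np.zeros(3),
                   p2=np.array([0.0, 0.0, 12.0]),
                   v2=np.array([0.0, 0.0, 2.0]),
                   k=5.0e4, c=2.0e3, l0=10.0)
    print("force on suspended body (N):", f)
    ```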

  19. Simulation of Attitude and Trajectory Dynamics and Control of Multiple Spacecraft

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric T.

    2009-01-01

    Agora software is a simulation of spacecraft attitude and orbit dynamics. It supports spacecraft models composed of multiple rigid bodies or flexible structural models. Agora simulates multiple spacecraft simultaneously, supporting rendezvous, proximity operations, and precision formation flying studies. The Agora environment includes ephemerides for all planets and major moons in the solar system, supporting design studies for deep space as well as geocentric missions. The environment also contains standard models for gravity, atmospheric density, and magnetic fields. Disturbance force and torque models include aerodynamic, gravity-gradient, solar radiation pressure, and third-body gravitation. In addition to the dynamic and environmental models, Agora supports geometrical visualization through an OpenGL interface. Prototype models are provided for common sensors, actuators, and control laws. A clean interface accommodates linking in actual flight code in place of the prototype control laws. The same simulation may be used for rapid feasibility studies, and then used for flight software validation as the design matures. Agora is open-source and portable across computing platforms, making it customizable and extensible. It is written to support the entire GNC (guidance, navigation, and control) design cycle, from rapid prototyping and design analysis, to high-fidelity flight code verification. As a top-down design, Agora is intended to accommodate a large range of missions, anywhere in the solar system. Both two-body and three-body flight regimes are supported, as well as seamless transition between them. Multiple spacecraft may be simultaneously simulated, enabling simulation of rendezvous scenarios, as well as formation flying. Built-in reference frames and orbit perturbation dynamics provide accurate modeling of precision formation control.

  20. Anatomy and Physiology of Multiscale Modeling and Simulation in Systems Medicine.

    PubMed

    Mizeranschi, Alexandru; Groen, Derek; Borgdorff, Joris; Hoekstra, Alfons G; Chopard, Bastien; Dubitzky, Werner

    2016-01-01

    Systems medicine is the application of systems biology concepts, methods, and tools to medical research and practice. It aims to integrate data and knowledge from different disciplines into biomedical models and simulations for the understanding, prevention, cure, and management of complex diseases. Complex diseases arise from the interactions among disease-influencing factors across multiple levels of biological organization from the environment to molecules. To tackle the enormous challenges posed by complex diseases, we need a modeling and simulation framework capable of capturing and integrating information originating from multiple spatiotemporal and organizational scales. Multiscale modeling and simulation in systems medicine is an emerging methodology and discipline that has already demonstrated its potential in becoming this framework. The aim of this chapter is to present some of the main concepts, requirements, and challenges of multiscale modeling and simulation in systems medicine.

  1. Multiple-source multiple-harmonic active vibration control of variable section cylindrical structures: A numerical study

    NASA Astrophysics Data System (ADS)

    Liu, Jinxin; Chen, Xuefeng; Gao, Jiawei; Zhang, Xingwu

    2016-12-01

    Air vehicles, space vehicles and underwater vehicles, the cabins of which can be viewed as variable section cylindrical structures, have multiple rotational vibration sources (e.g., engines, propellers, compressors and motors), making the spectrum of the noise multiple-harmonic. The suppression of such noise has been a focus of interest in the field of active vibration control (AVC). In this paper, a multiple-source multiple-harmonic (MSMH) active vibration suppression algorithm with a feed-forward structure is proposed based on reference amplitude rectification and the conjugate gradient method (CGM). An AVC simulation scheme called finite element model in-loop simulation (FEMILS) is also proposed for rapid algorithm verification. Numerical studies of AVC are conducted on a variable section cylindrical structure based on the proposed MSMH algorithm and FEMILS scheme. The numerical studies show that: (1) the proposed MSMH algorithm can individually suppress each component of the multiple-harmonic noise with a unified and improved convergence rate; (2) the FEMILS scheme is convenient and straightforward for multiple-source simulations with an acceptable loop time. Moreover, the simulations follow a procedure similar to real-life control and can easily be extended to a physical model platform.

  2. Predictive performance models and multiple task performance

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher D.; Larish, Inge; Contorer, Aaron

    1989-01-01

    Five models that predict how performance of multiple tasks will interact in complex task scenarios are discussed. The models are presented in terms of the assumptions they make about human operator divided attention. The different assumptions about attention are then empirically validated in a multitask helicopter flight simulation. It is concluded from this simulation that the most important assumption relates to the coding of demand level of the different component tasks.

  3. The Answering Process for Multiple-Choice Questions in Collaborative Learning: A Mathematical Learning Model Analysis

    ERIC Educational Resources Information Center

    Nakamura, Yasuyuki; Nishi, Shinnosuke; Muramatsu, Yuta; Yasutake, Koichi; Yamakawa, Osamu; Tagawa, Takahiro

    2014-01-01

    In this paper, we introduce a mathematical model for collaborative learning and the answering process for multiple-choice questions. The collaborative learning model is inspired by the Ising spin model and the model for answering multiple-choice questions is based on their difficulty level. An intensive simulation study predicts the possibility of…

  4. Entropic multiple-relaxation-time multirange pseudopotential lattice Boltzmann model for two-phase flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qin, Feifei; Mazloomi Moqaddam, Ali; Kang, Qinjun

    Here, an entropic multiple-relaxation-time lattice Boltzmann approach is coupled to a multirange Shan-Chen pseudopotential model to study the two-phase flow. Compared with previous multiple-relaxation-time multiphase models, this model is stable and accurate for the simulation of a two-phase flow in a much wider range of viscosity and surface tension at a high liquid-vapor density ratio. A stationary droplet surrounded by equilibrium vapor is first simulated to validate this model using the coexistence curve and Laplace’s law. Then, two series of droplet impact behavior, on a liquid film and a flat surface, are simulated in comparison with theoretical or experimental results. Droplet impact on a liquid film is simulated for different Reynolds numbers at high Weber numbers. With the increase of the Sommerfeld parameter, onset of splashing is observed and multiple secondary droplets occur. The droplet spreading ratio agrees well with the square root of time law and is found to be independent of Reynolds number. Moreover, shapes of simulated droplets impacting hydrophilic and superhydrophobic flat surfaces show good agreement with experimental observations through the entire dynamic process. The maximum spreading ratio of a droplet impacting the superhydrophobic flat surface is studied for a large range of Weber numbers. Results show that the rescaled maximum spreading ratios are in good agreement with a universal scaling law. This series of simulations demonstrates that the proposed model accurately captures the complex fluid-fluid and fluid-solid interfacial physical processes for a wide range of Reynolds and Weber numbers at high density ratios.

  5. Entropic multiple-relaxation-time multirange pseudopotential lattice Boltzmann model for two-phase flow

    NASA Astrophysics Data System (ADS)

    Qin, Feifei; Mazloomi Moqaddam, Ali; Kang, Qinjun; Derome, Dominique; Carmeliet, Jan

    2018-03-01

    An entropic multiple-relaxation-time lattice Boltzmann approach is coupled to a multirange Shan-Chen pseudopotential model to study the two-phase flow. Compared with previous multiple-relaxation-time multiphase models, this model is stable and accurate for the simulation of a two-phase flow in a much wider range of viscosity and surface tension at a high liquid-vapor density ratio. A stationary droplet surrounded by equilibrium vapor is first simulated to validate this model using the coexistence curve and Laplace's law. Then, two series of droplet impact behavior, on a liquid film and a flat surface, are simulated in comparison with theoretical or experimental results. Droplet impact on a liquid film is simulated for different Reynolds numbers at high Weber numbers. With the increase of the Sommerfeld parameter, onset of splashing is observed and multiple secondary droplets occur. The droplet spreading ratio agrees well with the square root of time law and is found to be independent of Reynolds number. Moreover, shapes of simulated droplets impacting hydrophilic and superhydrophobic flat surfaces show good agreement with experimental observations through the entire dynamic process. The maximum spreading ratio of a droplet impacting the superhydrophobic flat surface is studied for a large range of Weber numbers. Results show that the rescaled maximum spreading ratios are in good agreement with a universal scaling law. This series of simulations demonstrates that the proposed model accurately captures the complex fluid-fluid and fluid-solid interfacial physical processes for a wide range of Reynolds and Weber numbers at high density ratios.
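
    The stationary-droplet validation mentioned in these abstracts follows Laplace's law; in 3D, Δp = 2σ/R, so pressure jumps plotted against inverse radius should fall on a line of slope 2σ. The sketch below runs that check on synthetic data standing in for lattice Boltzmann measurements.

    ```python
    import numpy as np

    # Synthetic (pressure jump, radius) pairs standing in for measurements
    # from stationary-droplet lattice Boltzmann runs at several radii.
    sigma_true = 0.012                      # surface tension, lattice units
    radii = np.array([12.0, 16.0, 20.0, 28.0, 36.0])
    dp = 2.0 * sigma_true / radii \
        + np.random.default_rng(7).normal(0.0, 1e-5, radii.size)

    # Laplace's law in 3D: dp = 2*sigma/R, i.e. dp vs 1/R is a line
    # through the origin with slope 2*sigma.
    slope = np.polyfit(1.0 / radii, dp, 1)[0]
    print(f"recovered surface tension: {slope / 2.0:.4f} (true {sigma_true})")
    ```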

  6. Entropic multiple-relaxation-time multirange pseudopotential lattice Boltzmann model for two-phase flow

    DOE PAGES

    Qin, Feifei; Mazloomi Moqaddam, Ali; Kang, Qinjun; ...

    2018-03-22

    Here, an entropic multiple-relaxation-time lattice Boltzmann approach is coupled to a multirange Shan-Chen pseudopotential model to study the two-phase flow. Compared with previous multiple-relaxation-time multiphase models, this model is stable and accurate for the simulation of a two-phase flow in a much wider range of viscosity and surface tension at a high liquid-vapor density ratio. A stationary droplet surrounded by equilibrium vapor is first simulated to validate this model using the coexistence curve and Laplace’s law. Then, two series of droplet impact behavior, on a liquid film and a flat surface, are simulated in comparison with theoretical or experimental results. Droplet impact on a liquid film is simulated for different Reynolds numbers at high Weber numbers. With the increase of the Sommerfeld parameter, onset of splashing is observed and multiple secondary droplets occur. The droplet spreading ratio agrees well with the square root of time law and is found to be independent of Reynolds number. Moreover, shapes of simulated droplets impacting hydrophilic and superhydrophobic flat surfaces show good agreement with experimental observations through the entire dynamic process. The maximum spreading ratio of a droplet impacting the superhydrophobic flat surface is studied for a large range of Weber numbers. Results show that the rescaled maximum spreading ratios are in good agreement with a universal scaling law. This series of simulations demonstrates that the proposed model accurately captures the complex fluid-fluid and fluid-solid interfacial physical processes for a wide range of Reynolds and Weber numbers at high density ratios.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arion is a library and tool set that enables researchers to holistically define test system models. Defining a complex system for testing an algorithm or control requires expertise across multiple domains, and simulating a complex system requires the integration of multiple simulators and test hardware, each with their own specification languages and concepts. This demands an extensive set of knowledge and capabilities, and Arion was developed to alleviate this challenge. Arion is a set of Java libraries that abstracts the concepts of the supported simulators into a cohesive model language, allowing users to build models at their needed level of fidelity and expertise. Arion is also a software tool that translates the user's model back into the specification languages of the simulators and test hardware needed for execution.

  8. Simulations of buoyancy-generated horizontal roll vortices over multiple heating lines

    Treesearch

    W.E. Heilman

    1994-01-01

    A two-dimensional nonhydrostatic atmospheric model is used to simulate the boundary-layer circulations that develop from multiple lines of extremely high surface temperatures. Numerical simulations are carried out to investigate the role of buoyancy and ambient crossflow effects in generating horizontal roll vortices in the vicinity of adjacent wildland fire perimeters...

  9. Algorithms for radiative transfer simulations for aerosol retrieval

    NASA Astrophysics Data System (ADS)

    Mukai, Sonoyo; Sano, Itaru; Nakata, Makiko

    2012-11-01

    Aerosol retrieval from satellite data, i.e., aerosol remote sensing, is divided into three parts: satellite data analysis, aerosol modeling, and the calculation of multiple light scattering in the atmosphere model, which is called radiative transfer simulation. The aerosol model is compiled from more than ten years of accumulated measurements provided by the worldwide aerosol monitoring network (AERONET). The radiative transfer simulations take into account Rayleigh scattering by molecules and Mie scattering by aerosols in the atmosphere, as well as reflection by the Earth's surface. Aerosol properties are thus estimated by comparing satellite measurements with the numerical values of the radiation simulations in the Earth-atmosphere-surface model. Precise simulation of the multiple light-scattering processes is necessary and requires long computational times, especially in an optically thick atmosphere model. Efficient algorithms for radiative transfer problems are therefore indispensable for retrieving aerosols from space.

  10. Interacting Multiple Model (IMM) Fifth-Degree Spherical Simplex-Radial Cubature Kalman Filter for Maneuvering Target Tracking

    PubMed Central

    Liu, Hua; Wu, Wen

    2017-01-01

    To improve the tracking accuracy and model switching speed of maneuvering target tracking in nonlinear systems, a new algorithm named the interacting multiple model fifth-degree spherical simplex-radial cubature Kalman filter (IMM5thSSRCKF) is proposed in this paper. The new algorithm is a combination of the interacting multiple model (IMM) filter and the fifth-degree spherical simplex-radial cubature Kalman filter (5thSSRCKF). The proposed algorithm uses a Markov process to describe the switching probability among the models, and uses the 5thSSRCKF for the state estimation of each model. The 5thSSRCKF is an improved filter algorithm that utilizes the fifth-degree spherical simplex-radial rule to improve the filtering accuracy. Finally, the tracking performance of the IMM5thSSRCKF is evaluated by simulation in a typical maneuvering target tracking scenario. Simulation results show that the proposed algorithm has better tracking performance and quicker model switching when dealing with maneuver models compared with the interacting multiple model unscented Kalman filter (IMMUKF), the interacting multiple model cubature Kalman filter (IMMCKF) and the interacting multiple model fifth-degree cubature Kalman filter (IMM5thCKF). PMID:28608843

  11. Interacting Multiple Model (IMM) Fifth-Degree Spherical Simplex-Radial Cubature Kalman Filter for Maneuvering Target Tracking.

    PubMed

    Liu, Hua; Wu, Wen

    2017-06-13

    To improve the tracking accuracy and model switching speed of maneuvering target tracking in nonlinear systems, a new algorithm named the interacting multiple model fifth-degree spherical simplex-radial cubature Kalman filter (IMM5thSSRCKF) is proposed in this paper. The new algorithm is a combination of the interacting multiple model (IMM) filter and the fifth-degree spherical simplex-radial cubature Kalman filter (5thSSRCKF). The proposed algorithm uses a Markov process to describe the switching probability among the models, and uses the 5thSSRCKF for the state estimation of each model. The 5thSSRCKF is an improved filter algorithm that utilizes the fifth-degree spherical simplex-radial rule to improve the filtering accuracy. Finally, the tracking performance of the IMM5thSSRCKF is evaluated by simulation in a typical maneuvering target tracking scenario. Simulation results show that the proposed algorithm has better tracking performance and quicker model switching when dealing with maneuver models compared with the interacting multiple model unscented Kalman filter (IMMUKF), the interacting multiple model cubature Kalman filter (IMMCKF) and the interacting multiple model fifth-degree cubature Kalman filter (IMM5thCKF).
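
    The IMM bookkeeping described above is compact enough to sketch: a Markov transition matrix mixes the model probabilities, each model's filter supplies a measurement likelihood, and the probabilities are re-weighted. The transition matrix and likelihood values below are hypothetical stand-ins for 5thSSRCKF outputs.

    ```python
    import numpy as np

    def imm_probability_update(mu, trans, likelihoods):
        """One IMM cycle for the model probabilities.
        mu:          prior model probabilities, shape (r,)
        trans:       Markov matrix, trans[i, j] = P(model j | model i)
        likelihoods: measurement likelihood from each model's filter
        Returns mixing weights and updated model probabilities."""
        c_j = trans.T @ mu                             # predicted probabilities
        mix = (trans * mu[:, None]) / c_j[None, :]     # mixing weights w[i, j]
        mu_new = likelihoods * c_j                     # Bayes re-weighting
        return mix, mu_new / mu_new.sum()

    trans = np.array([[0.95, 0.05],                    # CV model <-> turn model
                      [0.10, 0.90]])
    mu = np.array([0.8, 0.2])
    lik = np.array([0.02, 0.30])                       # turn model fits better
    mix, mu = imm_probability_update(mu, trans, lik)
    print("updated model probabilities:", np.round(mu, 3))
    ```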

  12. Middle Rio Grande Cooperative Water Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tidwell, Vince; Passell, Howard

    2005-11-01

    This is a computer simulation model built in a commercial modeling product called Studio Expert, developed by Powersim, Inc. The simulation model is built in a system dynamics environment, allowing the simulation of interactions among multiple systems that all change over time. The model focuses on the hydrology, ecology, demography, and economy of the Middle Rio Grande, with water as the unifying feature.

  13. Multiple imputation of covariates by fully conditional specification: Accommodating the substantive model

    PubMed Central

    Seaman, Shaun R; White, Ian R; Carpenter, James R

    2015-01-01

    Missing covariate data commonly occur in epidemiological and clinical research, and are often dealt with using multiple imputation. Imputation of partially observed covariates is complicated if the substantive model is non-linear (e.g. Cox proportional hazards model), or contains non-linear (e.g. squared) or interaction terms, and standard software implementations of multiple imputation may impute covariates from models that are incompatible with such substantive models. We show how imputation by fully conditional specification, a popular approach for performing multiple imputation, can be modified so that covariates are imputed from models which are compatible with the substantive model. We investigate through simulation the performance of this proposal, and compare it with existing approaches. Simulation results suggest our proposal gives consistent estimates for a range of common substantive models, including models which contain non-linear covariate effects or interactions, provided data are missing at random and the assumed imputation models are correctly specified and mutually compatible. Stata software implementing the approach is freely available. PMID:24525487

  14. Software Framework for Advanced Power Plant Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Widmann; Sorin Munteanu; Aseem Jain

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.

  15. Analysis of dense-medium light scattering with applications to corneal tissue: experiments and Monte Carlo simulations.

    PubMed

    Kim, K B; Shanyfelt, L M; Hahn, D W

    2006-01-01

    Dense-medium scattering is explored in the context of providing a quantitative measurement of turbidity, with specific application to corneal haze. A multiple-wavelength scattering technique is proposed that makes use of two-color scattering response ratios, thereby providing a means for data normalization. A combination of measurements and simulations is reported to assess this technique, including light-scattering experiments for a range of polystyrene suspensions. Monte Carlo (MC) simulations were performed using a multiple-scattering algorithm based on full Mie scattering theory. The simulations were in excellent agreement with the polystyrene suspension experiments, thereby validating the MC model. The MC model was then used to simulate multiwavelength scattering in a corneal tissue model. Overall, the proposed multiwavelength scattering technique appears to be a feasible approach to quantifying dense-medium scattering such as the manifestation of corneal haze, although more complex modeling of keratocyte scattering, as well as animal studies, remains necessary.

  16. Incremental dynamical downscaling for probabilistic analysis based on multiple GCM projections

    NASA Astrophysics Data System (ADS)

    Wakazuki, Y.

    2015-12-01

    A dynamical downscaling method for probabilistic regional-scale climate change projections was developed to cover the uncertainty of multiple general circulation model (GCM) climate simulations. The climatological increments (future minus present climate states) estimated from the GCM simulation results were statistically analyzed using singular value decomposition. Both positive and negative perturbations from the ensemble mean, with magnitudes equal to their standard deviations, were extracted and added to the ensemble mean of the climatological increments. The resulting multiple modal increments were used to create multiple modal lateral boundary conditions for future-climate regional climate model (RCM) simulations by adding them to an objective analysis dataset. This data handling can be regarded as an advanced version of the pseudo-global-warming (PGW) method previously developed by Kimura and Kitoh (2007). The incremental handling of the GCM simulations yields approximate probabilistic climate change projections with a smaller number of RCM simulations. Three values of a climatological variable simulated by the RCMs for each mode were used to estimate the response to that mode's perturbation. For the probabilistic analysis, climatological variables of the RCMs were assumed to respond linearly to the multiple modal perturbations, although nonlinearity was seen for local-scale rainfall. The probability distribution of temperature could be estimated with two-mode perturbation simulations, for which the number of future-climate RCM simulations is five. Local-scale rainfall, on the other hand, needed four-mode simulations, for which the number of RCM simulations is nine. The probabilistic method is expected to be used for regional-scale climate change impact assessment in the future.
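
    A hedged sketch of the increment decomposition is given below: stack the GCM increments, take the singular value decomposition of the anomalies, and form ensemble-mean plus/minus one-standard-deviation perturbations along the leading modes. The fields are synthetic; with two modes this yields five boundary-condition sets, matching the five future-climate RCM simulations mentioned in the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_gcm, n_grid = 10, 500            # 10 GCMs, flattened increment fields
    increments = rng.normal(1.5, 0.5, (n_gcm, n_grid))   # hypothetical K

    mean_inc = increments.mean(axis=0)
    anom = increments - mean_inc

    # Leading modes of inter-GCM spread via SVD of the anomaly matrix.
    u, s, vt = np.linalg.svd(anom, full_matrices=False)

    boundary_conditions = [mean_inc]   # ensemble-mean increment first
    for m in range(2):                 # two leading modes
        sd = s[m] / np.sqrt(n_gcm - 1)     # ensemble std. dev. along mode m
        boundary_conditions.append(mean_inc + sd * vt[m])
        boundary_conditions.append(mean_inc - sd * vt[m])
        print(f"mode {m}: {100 * s[m]**2 / (s**2).sum():.1f}% of variance")

    # One RCM run per increment: 1 mean + 2 modes x (+/-) = 5 runs.
    print("number of RCM boundary conditions:", len(boundary_conditions))
    ```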

  17. Computer simulation of a multiple-aperture coherent laser radar

    NASA Astrophysics Data System (ADS)

    Gamble, Kevin J.; Weeks, Arthur R.

    1996-06-01

    This paper presents the construction of a 2D multiple-aperture coherent laser radar simulation that is capable of including the effects of the time evolution of speckle on the laser radar output. Every portion of a laser radar system is modeled in software, including quarter- and half-wave plates, beamsplitters (polarizing and non-polarizing), the detector, the laser source, and all necessary lenses. Free-space propagation is implemented using the Rayleigh-Sommerfeld integral for both orthogonal polarizations. Atmospheric turbulence is also included in the simulation and is modeled using time-correlated Kolmogorov phase screens. The simulation itself can be configured to simulate both monostatic and bistatic systems. The simulation allows the user to specify component-level parameters such as extinction ratios for polarizing beam splitters, detector sizes and shapes, orientation of the slow axis for quarter/half wave plates, and other properties of components used in the system. This is useful from the standpoint of being a tool in the design of a multiple-aperture laser radar system.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daleu, C. L.; Plant, R. S.; Woolnough, S. J.

    Here, as part of an international intercomparison project, a set of single-column models (SCMs) and cloud-resolving models (CRMs) are run under the weak-temperature gradient (WTG) method and the damped gravity wave (DGW) method. For each model, the implementation of the WTG or DGW method involves a simulated column which is coupled to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. The simulated column has the same surface conditions as the reference state and is initialized with profiles from the reference state. We performed systematic comparison of the behavior of different models under a consistent implementation of the WTG method and the DGW method and systematic comparison of the WTG and DGW methods in models with different physics and numerics. CRMs and SCMs produce a variety of behaviors under both WTG and DGW methods. Some of the models reproduce the reference state while others sustain a large-scale circulation which results in either substantially lower or higher precipitation compared to the value of the reference state. CRMs show a fairly linear relationship between precipitation and circulation strength. SCMs display a wider range of behaviors than CRMs. Some SCMs under the WTG method produce zero precipitation. Within an individual SCM, a DGW simulation and a corresponding WTG simulation can produce different signed circulation. When initialized with a dry troposphere, DGW simulations always result in a precipitating equilibrium state. The greatest sensitivities to the initial moisture conditions occur for multiple stable equilibria in some WTG simulations, corresponding to either a dry equilibrium state when initialized as dry or a precipitating equilibrium state when initialized as moist. Multiple equilibria are seen in more WTG simulations for higher SST. In some models, the existence of multiple equilibria is sensitive to some parameters in the WTG calculations.

  19. SIMULATING SUB-DECADAL CHANNEL MORPHOLOGIC CHANGE IN EPHEMERAL STREAM NETWORKS

    EPA Science Inventory

    A distributed watershed model was modified to simulate cumulative channel morphologic change from multiple runoff events in ephemeral stream networks. The model incorporates the general design of the event-based Kinematic Runoff and Erosion Model (KINEROS), which describes t...

  20. Multiple point statistical simulation using uncertain (soft) conditional data

    NASA Astrophysics Data System (ADS)

    Hansen, Thomas Mejer; Vu, Le Thanh; Mosegaard, Klaus; Cordua, Knud Skou

    2018-05-01

    Geostatistical simulation methods have been used to quantify the spatial variability of reservoir models since the 1980s. In the last two decades, state-of-the-art simulation methods have changed from being based on covariance-based two-point statistics to multiple-point statistics (MPS), which allow simulation of more realistic Earth structures. In addition, increasing amounts of geo-information (geophysical, geological, etc.) from multiple sources are being collected. This poses the problem of integrating these different sources of information, such that decisions related to reservoir models can be taken on as informed a basis as possible. In principle, though difficult in practice, this can be achieved using computationally expensive Monte Carlo methods. Here we investigate the use of sequential-simulation-based MPS methods conditional to uncertain (soft) data as a computationally efficient alternative. First, it is demonstrated that current implementations of sequential simulation based on MPS (e.g. SNESIM, ENESIM and Direct Sampling) do not account properly for uncertain conditional information, due to a combination of using only co-located information and a random simulation path. Then, we suggest two approaches that better account for the available uncertain information. The first makes use of a preferential simulation path, where more informed model parameters are visited preferentially to less informed ones. The second approach involves using non-co-located uncertain information. For different types of available data, these approaches are demonstrated to produce simulation results similar to those obtained by the general Monte Carlo based approach. These methods allow MPS simulation to condition properly to uncertain (soft) data, and hence provide a computationally attractive approach for integrating information about a reservoir model.
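
    For the first proposed approach, one plausible measure of how "informed" a node is (an assumption here, not necessarily the paper's exact criterion) is the entropy of its soft probabilities; the sketch below orders nodes so that low-entropy (well-informed) nodes are simulated first.

    ```python
    import numpy as np

    def preferential_path(soft_probs, rng=None):
        """Order grid nodes so that the most informed (lowest-entropy
        soft probabilities) are simulated first; ties broken randomly.
        soft_probs: (n_nodes, n_categories) array of local prior
        probabilities, e.g. derived from geophysical data."""
        if rng is None:
            rng = np.random.default_rng(0)
        p = np.clip(soft_probs, 1e-12, 1.0)
        entropy = -(p * np.log(p)).sum(axis=1)
        jitter = rng.random(len(entropy)) * 1e-9   # random tie-breaking
        return np.argsort(entropy + jitter)

    # Three nodes: near-certain, uninformative, moderately informed.
    soft = np.array([[0.98, 0.02],
                     [0.50, 0.50],
                     [0.75, 0.25]])
    print(preferential_path(soft))   # -> [0 2 1]
    ```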

  1. NASCAP simulation of laboratory charging tests using multiple electron guns

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Katz, I.; Parks, D. E.

    1981-01-01

    NASCAP calculations have been performed simulating exposure of a spacecraft-like model to multiple electron guns. The results agree well with experiment. It is found that magnetic field effects are fairly small, but substantial differential charging can result from electron gun placement. Conditions for surface flashover are readily achieved.

  2. Developing an approach to effectively use super ensemble experiments for the projection of hydrological extremes under climate change

    NASA Astrophysics Data System (ADS)

    Watanabe, S.; Kim, H.; Utsumi, N.

    2017-12-01

    This study aims to develop a new approach for projecting hydrology under climate change using super ensemble experiments. The use of multiple ensembles is essential for the estimation of extremes, which is a major issue in the impact assessment of climate change; hence, super ensemble experiments have recently been conducted by several research programs. While multiple ensembles are necessary, running a hydrological simulation for each output of the ensemble experiments incurs considerable computational cost. To use the super ensemble experiments effectively, we adopt a strategy of using the runoff projected by climate models directly. The general approach to hydrological projection is to run hydrological model simulations, including land-surface and river routing processes, with the atmospheric boundary conditions projected by climate models as inputs. This study, in contrast, runs only a river routing model driven by the runoff projected by climate models. In general, climate model output is systematically biased, so a preprocessing step that corrects such bias is necessary for impact assessments. Various bias correction methods have been proposed but, to the best of our knowledge, none for variables other than surface meteorology. Here, we propose a new method for utilizing the projected future runoff directly. The developed method estimates and corrects the bias based on a pseudo-observation, namely the result of a retrospective offline simulation. We show an application of this approach to the super ensemble experiments conducted under the program Half a degree Additional warming, Prognosis and Projected Impacts (HAPPI), for which more than 400 ensemble experiments from multiple climate models are available. Validation against historical simulations by HAPPI indicates that the output of this approach can effectively reproduce retrospective runoff variability. Likewise, the bias of runoff from super ensemble climate projections is corrected, and the impact of climate change on hydrologic extremes is assessed in a cost-efficient way.
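
    The correction against a pseudo-observation can be pictured as a standard quantile mapping in which the retrospective offline simulation plays the role of the observations. The abstract does not specify the paper's actual correction method, so the sketch below, which assumes empirical quantile mapping with synthetic data, is illustrative only.

        # Quantile-mapping sketch: map each future runoff value through the
        # historical model CDF onto the pseudo-observation CDF.
        import numpy as np

        def quantile_map(model_hist, pseudo_obs, model_future):
            ranks = np.searchsorted(np.sort(model_hist), model_future) / len(model_hist)
            ranks = np.clip(ranks, 0.0, 1.0)
            return np.quantile(pseudo_obs, ranks)

        rng = np.random.default_rng(0)
        pseudo = rng.gamma(2.0, 5.0, 5000)     # retrospective offline runoff
        hist = pseudo * 1.4 + 2.0              # systematically biased model runoff
        future = rng.gamma(2.2, 5.0, 1000) * 1.4 + 2.0
        corrected = quantile_map(hist, pseudo, future)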

  3. Hybrid-Lambda: simulation of multiple merger and Kingman gene genealogies in species networks and species trees.

    PubMed

    Zhu, Sha; Degnan, James H; Goldstien, Sharyn J; Eldon, Bjarki

    2015-09-15

    There has been increasing interest in coalescent models which admit multiple mergers of ancestral lineages, and in modeling hybridization and coalescence simultaneously. Hybrid-Lambda is a software package that simulates gene genealogies under multiple merger and Kingman's coalescent processes within species networks or species trees. Hybrid-Lambda allows different coalescent processes to be specified for different populations, and allows time to be converted between generations and coalescent units by specifying a population size for each population. In addition, Hybrid-Lambda can generate simulated datasets, assuming the infinitely-many-sites mutation model, and compute the FST statistic. As an illustration, we apply Hybrid-Lambda to infer the time of subdivision of certain marine invertebrates under different coalescent processes. Hybrid-Lambda makes it possible to investigate biogeographic concordance among high fecundity species exhibiting skewed offspring distribution.
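
    For orientation, the multiple-merger ("Lambda") coalescents that such software simulates generalize Kingman's pairwise-merger process. A commonly used family is the Beta(2 - a, a) coalescent, whose merger rates follow a textbook formula; the snippet below computes those rates and is not Hybrid-Lambda code.

        # Rates of the Beta(2-a, a) Lambda-coalescent: the rate at which a given
        # group of k out of b lineages merges is
        #   lambda_{b,k} = B(k - a, b - k + a) / B(2 - a, a),  1 < a < 2.
        from scipy.special import beta as B

        def beta_coalescent_rate(b, k, a):
            return B(k - a, b - k + a) / B(2 - a, a)

        for k in range(2, 6):                      # mergers of 2..5 lineages
            print(k, beta_coalescent_rate(5, k, a=1.5))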

  4. Acceleration of discrete stochastic biochemical simulation using GPGPU.

    PubMed

    Sumiyoshi, Kei; Hirata, Kazuki; Hiroi, Noriko; Funahashi, Akira

    2015-01-01

    For systems made up of a small number of molecules, such as a biochemical network in a single cell, a simulation requires a stochastic approach, instead of a deterministic approach. The stochastic simulation algorithm (SSA) simulates the stochastic behavior of a spatially homogeneous system. Since stochastic approaches produce different results each time they are used, multiple runs are required in order to obtain statistical results; this results in a large computational cost. We have implemented a parallel method for using SSA to simulate a stochastic model; the method uses a graphics processing unit (GPU), which enables multiple realizations at the same time, and thus reduces the computational time and cost. During the simulation, for the purpose of analysis, each time course is recorded at each time step. A straightforward implementation of this method on a GPU is about 16 times faster than a sequential simulation on a CPU with hybrid parallelization; each of the multiple simulations is run simultaneously, and the computational tasks within each simulation are parallelized. We also implemented an improvement to the memory access and reduced the memory footprint, in order to optimize the computations on the GPU. We also implemented an asynchronous data transfer scheme to accelerate the time course recording function. To analyze the acceleration of our implementation on various sizes of model, we performed SSA simulations on different model sizes and compared these computation times to those for sequential simulations with a CPU. When used with the improved time course recording function, our method was shown to accelerate the SSA simulation by a factor of up to 130.
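
    The parallelization strategy, one realization per GPU thread, can be mimicked on a CPU by vectorizing over the realization axis. The following sketch runs many SSA realizations of a simple birth-death process simultaneously with NumPy; it illustrates the idea only and is unrelated to the authors' GPU implementation.

        # Vectorized SSA: all realizations advance in lock-step, each drawing
        # its own waiting time and reaction. Model: birth (k1) / death (k2*x).
        import numpy as np

        def ssa_birth_death(n_real=2000, k1=10.0, k2=0.1, t_end=50.0, seed=1):
            rng = np.random.default_rng(seed)
            x = np.zeros(n_real)                  # molecule count per realization
            t = np.zeros(n_real)
            active = np.ones(n_real, bool)
            while active.any():
                a1 = np.full(n_real, k1)          # birth propensity
                a2 = k2 * x                       # death propensity
                a0 = a1 + a2
                t = np.where(active, t + rng.exponential(1.0 / a0), t)
                death = rng.random(n_real) * a0 < a2
                x = np.where(active, x + np.where(death, -1.0, 1.0), x)
                active &= t < t_end
            return x

        print(ssa_birth_death().mean())   # approaches k1/k2 = 100 for large t_end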

  5. Acceleration of discrete stochastic biochemical simulation using GPGPU

    PubMed Central

    Sumiyoshi, Kei; Hirata, Kazuki; Hiroi, Noriko; Funahashi, Akira

    2015-01-01

    For systems made up of a small number of molecules, such as a biochemical network in a single cell, a simulation requires a stochastic approach, instead of a deterministic approach. The stochastic simulation algorithm (SSA) simulates the stochastic behavior of a spatially homogeneous system. Since stochastic approaches produce different results each time they are used, multiple runs are required in order to obtain statistical results; this results in a large computational cost. We have implemented a parallel method for using SSA to simulate a stochastic model; the method uses a graphics processing unit (GPU), which enables multiple realizations at the same time, and thus reduces the computational time and cost. During the simulation, for the purpose of analysis, each time course is recorded at each time step. A straightforward implementation of this method on a GPU is about 16 times faster than a sequential simulation on a CPU with hybrid parallelization; each of the multiple simulations is run simultaneously, and the computational tasks within each simulation are parallelized. We also implemented an improvement to the memory access and reduced the memory footprint, in order to optimize the computations on the GPU. We also implemented an asynchronous data transfer scheme to accelerate the time course recording function. To analyze the acceleration of our implementation on various sizes of model, we performed SSA simulations on different model sizes and compared these computation times to those for sequential simulations with a CPU. When used with the improved time course recording function, our method was shown to accelerate the SSA simulation by a factor of up to 130. PMID:25762936

  6. PyNN: A Common Interface for Neuronal Network Simulators.

    PubMed

    Davison, Andrew P; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre

    2008-01-01

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN.
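
    A minimal PyNN script looks roughly like the following (based on the documentation of later PyNN releases; the API shown differs in detail from the 2008 version described in the paper, so treat it as indicative):

        import pyNN.nest as sim    # swap for pyNN.neuron etc.; script unchanged

        sim.setup(timestep=0.1)                                  # ms
        pop = sim.Population(100, sim.IF_cond_exp(), label="excitatory")
        noise = sim.Population(100, sim.SpikeSourcePoisson(rate=20.0))
        prj = sim.Projection(noise, pop, sim.OneToOneConnector(),
                             sim.StaticSynapse(weight=0.01, delay=1.0))
        pop.record("spikes")
        sim.run(1000.0)                                          # ms
        data = pop.get_data()                                    # Neo data block
        sim.end()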

  7. PyNN: A Common Interface for Neuronal Network Simulators

    PubMed Central

    Davison, Andrew P.; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre

    2008-01-01

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN. PMID:19194529

  8. Implementation and Evaluation of Multiple Adaptive Control Technologies for a Generic Transport Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Campbell, Stefan F.; Kaneshige, John T.; Nguyen, Nhan T.; Krishakumar, Kalmanje S.

    2010-01-01

    Presented here is the evaluation of multiple adaptive control technologies for a generic transport aircraft simulation. For this study, seven model reference adaptive control (MRAC) based technologies were considered. Each technology was integrated into an identical dynamic-inversion control architecture and tuned using a methodology based on metrics and specific design requirements. Simulation tests were then performed to evaluate each technology's sensitivity to time-delay, flight condition, model uncertainty, and artificially induced cross-coupling. The resulting robustness and performance characteristics were used to identify potential strengths, weaknesses, and integration challenges of the individual adaptive control technologies.
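
    As a reminder of what an MRAC law does, the toy scalar example below implements a generic textbook model-reference adaptive controller with sigma-modification; it is not one of the seven technologies evaluated in the study, and all constants are invented.

        # Plant: xdot = a*x + b*u with a unknown; reference: xmdot = -am*xm + am*r.
        # Gradient adaptive laws drive x to track xm; sigma-term adds robustness.
        am, b, a_true = 2.0, 1.0, 1.5
        gamma, sigma = 10.0, 0.01
        dt, T = 0.001, 10.0
        x = xm = 0.0
        kx = kr = 0.0                                 # adaptive gains
        for i in range(int(T / dt)):
            r = 1.0 if (i * dt) % 4 < 2 else -1.0     # square-wave command
            u = kx * x + kr * r
            e = x - xm                                # tracking error
            kx += dt * (-gamma * e * x - sigma * kx)
            kr += dt * (-gamma * e * r - sigma * kr)
            x += dt * (a_true * x + b * u)
            xm += dt * (-am * xm + am * r)
        print(abs(x - xm))    # small after adaptation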

  9. Joint Seasonal ARMA Approach for Modeling of Load Forecast Errors in Planning Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafen, Ryan P.; Samaan, Nader A.; Makarov, Yuri V.

    2014-04-14

    To make informed and robust decisions in the probabilistic power system operation and planning process, it is critical to conduct multiple simulations of the generated combinations of wind and load parameters and their forecast errors to handle the variability and uncertainty of these time series. In order for the simulation results to be trustworthy, the simulated series must preserve the salient statistical characteristics of the real series. In this paper, we analyze day-ahead load forecast error data from multiple balancing authority (BA) locations and characterize statistical properties such as mean, standard deviation, autocorrelation, correlation between series, time-of-day bias, and time-of-day autocorrelation. We then construct and validate a seasonal autoregressive moving average (ARMA) model to model these characteristics, and use the model to jointly simulate day-ahead load forecast error series for all BAs.
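
    A sketch of the fit-then-simulate workflow using statsmodels is given below; the ARMA orders, seasonal period and synthetic input series are placeholders rather than the paper's fitted model.

        # Fit a seasonal ARMA model to a forecast-error series, then simulate
        # a synthetic series that preserves its autocorrelation structure.
        import numpy as np
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(0)
        err = rng.standard_normal(24 * 60)   # stand-in for hourly forecast errors

        # ARMA(1,1) plus a 24-hour seasonal AR term for time-of-day structure
        model = SARIMAX(err, order=(1, 0, 1), seasonal_order=(1, 0, 0, 24))
        res = model.fit(disp=False)
        simulated = res.simulate(nsimulations=24 * 60)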

  10. Integrating neuroinformatics tools in TheVirtualBrain.

    PubMed

    Woodman, M Marmaduke; Pezard, Laurent; Domide, Lia; Knock, Stuart A; Sanz-Leon, Paula; Mersmann, Jochen; McIntosh, Anthony R; Jirsa, Viktor

    2014-01-01

    TheVirtualBrain (TVB) is a neuroinformatics Python package representing the convergence of clinical, systems, and theoretical neuroscience in the analysis, visualization and modeling of neural and neuroimaging dynamics. TVB is composed of a flexible simulator for neural dynamics measured across scales from local populations to large-scale dynamics measured by electroencephalography (EEG), magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI), and core analytic and visualization functions, all accessible through a web browser user interface. A datatype system modeling neuroscientific data ties together these pieces with persistent data storage, based on a combination of SQL and HDF5. These datatypes combine with adapters allowing TVB to integrate other algorithms or computational systems. TVB provides infrastructure for multiple projects and multiple users, possibly participating under multiple roles. For example, a clinician might import patient data to identify several potential lesion points in the patient's connectome. A modeler, working on the same project, tests these points for viability through whole brain simulation, based on the patient's connectome, and subsequent analysis of dynamical features. TVB also drives research forward: the simulator itself represents the culmination of several simulation frameworks in the modeling literature. The availability of the numerical methods, set of neural mass models and forward solutions allows for the construction of a wide range of brain-scale simulation scenarios. This paper briefly outlines the history and motivation for TVB, describing the framework and simulator, giving usage examples in the web UI and Python scripting.

  11. Integrating neuroinformatics tools in TheVirtualBrain

    PubMed Central

    Woodman, M. Marmaduke; Pezard, Laurent; Domide, Lia; Knock, Stuart A.; Sanz-Leon, Paula; Mersmann, Jochen; McIntosh, Anthony R.; Jirsa, Viktor

    2014-01-01

    TheVirtualBrain (TVB) is a neuroinformatics Python package representing the convergence of clinical, systems, and theoretical neuroscience in the analysis, visualization and modeling of neural and neuroimaging dynamics. TVB is composed of a flexible simulator for neural dynamics measured across scales from local populations to large-scale dynamics measured by electroencephalography (EEG), magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI), and core analytic and visualization functions, all accessible through a web browser user interface. A datatype system modeling neuroscientific data ties together these pieces with persistent data storage, based on a combination of SQL and HDF5. These datatypes combine with adapters allowing TVB to integrate other algorithms or computational systems. TVB provides infrastructure for multiple projects and multiple users, possibly participating under multiple roles. For example, a clinician might import patient data to identify several potential lesion points in the patient's connectome. A modeler, working on the same project, tests these points for viability through whole brain simulation, based on the patient's connectome, and subsequent analysis of dynamical features. TVB also drives research forward: the simulator itself represents the culmination of several simulation frameworks in the modeling literature. The availability of the numerical methods, set of neural mass models and forward solutions allows for the construction of a wide range of brain-scale simulation scenarios. This paper briefly outlines the history and motivation for TVB, describing the framework and simulator, giving usage examples in the web UI and Python scripting. PMID:24795617

  12. A multiple pointing-mount control strategy for space platforms

    NASA Technical Reports Server (NTRS)

    Johnson, C. D.

    1992-01-01

    A new disturbance-adaptive control strategy for multiple pointing-mount space platforms is proposed and illustrated by consideration of a simplified 3-link dynamic model of a multiple pointing-mount space platform. Simulation results demonstrate the effectiveness of the new platform control strategy. The simulation results also reveal a system 'destabilization phenomenon' that can occur if the set of individual platform-mounted experiment controllers are 'too responsive.'

  13. A multiple hypotheses uncertainty analysis in hydrological modelling: about model structure, landscape parameterization, and numerical integration

    NASA Astrophysics Data System (ADS)

    Pilz, Tobias; Francke, Till; Bronstert, Axel

    2016-04-01

    To date, a large number of competing computer models have been developed to understand hydrological processes and to simulate and predict the streamflow dynamics of rivers. This is primarily the result of the lack of a unified theory in catchment hydrology, due to insufficient process understanding and to uncertainties related to model development and application. The goal of this study is therefore to analyze the uncertainty structure of a process-based hydrological catchment model employing a multiple hypotheses approach. The study focuses on three major problems that have received little attention in previous investigations. First, we estimate the impact of model structural uncertainty by employing several alternative representations for each simulated process. Second, we explore the influence of landscape discretization and parameterization from multiple datasets and user decisions. Third, we employ several numerical solvers for the integration of the governing ordinary differential equations to study their effect on simulation results. The generated ensemble of model hypotheses is then analyzed and the three sources of uncertainty are compared against each other. To ensure consistency and comparability, all model structures and numerical solvers are implemented within a single simulation environment. First results suggest that the selection of a sophisticated numerical solver for the differential equations positively affects simulation outcomes. However, some simple and easy-to-implement explicit methods already perform surprisingly well and need less computational effort than more advanced but time-consuming implicit techniques. There is general evidence that ambiguous and subjective user decisions form a major source of uncertainty and can greatly influence model development and application at all stages.
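
    The third experiment, swapping numerical solvers for the same governing equations, can be illustrated with a single linear reservoir: the sketch below compares an explicit and an implicit SciPy integrator on dS/dt = P - S/k. The reservoir equation is a stand-in chosen for brevity, not the catchment model used in the study.

        # Same ODE, two solver families: explicit Runge-Kutta vs implicit BDF.
        from scipy.integrate import solve_ivp

        k, P = 5.0, 2.0                       # residence time [d], recharge [mm/d]
        rhs = lambda t, S: P - S / k

        sol_explicit = solve_ivp(rhs, (0, 50), [0.0], method="RK45")
        sol_implicit = solve_ivp(rhs, (0, 50), [0.0], method="BDF")
        print(sol_explicit.y[0, -1], sol_implicit.y[0, -1])   # both -> P*k = 10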

  14. Multi-ray medical ultrasound simulation without explicit speckle modelling.

    PubMed

    Tuzer, Mert; Yazıcı, Abdulkadir; Türkay, Rüştü; Boyman, Michael; Acar, Burak

    2018-05-04

    To develop a medical ultrasound (US) simulation method using T1-weighted magnetic resonance images (MRI) as the input that offers a compromise between low-cost ray-based and high-cost realistic wave-based simulations. The proposed method uses a novel multi-ray image formation approach with a virtual phased array transducer probe. A domain model is built from input MR images. Multiple virtual acoustic rays emerge from each element of the linear transducer array. Reflected and transmitted acoustic energy at discrete points along each ray is computed independently. Simulated US images are computed by fusing the reflected energy along multiple rays from multiple transducers, while phase delays due to differences in distances to transducers are taken into account. A preliminary implementation using GPUs is presented. Preliminary results show that the multi-ray approach is capable of automatically generating viewpoint-dependent, realistic US images with an inherent Rician-distributed speckle pattern. The proposed simulator can reproduce shadowing artefacts and demonstrates frequency dependence apt for practical training purposes. We also present preliminary results towards the utilization of the method for real-time simulations. The proposed method offers a low-cost, near-real-time, wave-like simulation of realistic US images from input MR data. It can further be improved to cover pathological findings using an improved domain model, without any algorithmic updates. Such a domain model would require lesion segmentation or manual embedding of virtual pathologies for training purposes.
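
    The per-ray energy bookkeeping the abstract describes reduces, at normal incidence, to tracking reflection and transmission coefficients from acoustic impedance contrasts. The sketch below is a simplified single-ray version with made-up impedance values, not the authors' multi-ray GPU implementation.

        # At each interface along a ray, the reflected fraction is
        # R = ((z2 - z1) / (z2 + z1))**2; the rest is transmitted onward.
        import numpy as np

        def ray_echoes(impedances, attenuation=0.05):
            """impedances: acoustic impedance at discrete points along one ray.
            Returns reflected energy arriving back from each interface."""
            energy, echoes = 1.0, []
            for z1, z2 in zip(impedances[:-1], impedances[1:]):
                R = ((z2 - z1) / (z2 + z1)) ** 2
                echoes.append(energy * R * np.exp(-attenuation))
                energy *= (1.0 - R) * np.exp(-attenuation)
            return np.array(echoes)

        print(ray_echoes([1.5, 1.6, 7.8, 1.6]))   # bone-like layer -> strong echo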

  15. Assessing climate change impacts, benefits of mitigation, and uncertainties on major global forest regions under multiple socioeconomic and emissions scenarios

    Treesearch

    John B Kim; Erwan Monier; Brent Sohngen; G Stephen Pitts; Ray Drapek; James McFarland; Sara Ohrel; Jefferson Cole

    2016-01-01

    We analyze a set of simulations to assess the impact of climate change on global forests where MC2 dynamic global vegetation model (DGVM) was run with climate simulations from the MIT Integrated Global System Model-Community Atmosphere Model (IGSM-CAM) modeling framework. The core study relies on an ensemble of climate simulations under two emissions scenarios: a...

  16. A Generic Multibody Parachute Simulation Model

    NASA Technical Reports Server (NTRS)

    Neuhaus, Jason Richard; Kenney, Patrick Sean

    2006-01-01

    Flight simulation of dynamic atmospheric vehicles with parachute systems is a complex task that is not easily modeled in many simulation frameworks. In the past, the performance of vehicles with parachutes was analyzed by simulations dedicated to parachute operations and were generally not used for any other portion of the vehicle flight trajectory. This approach required multiple simulation resources to completely analyze the performance of the vehicle. Recently, improved software engineering practices and increased computational power have allowed a single simulation to model the entire flight profile of a vehicle employing a parachute.

  17. Can discrete event simulation be of use in modelling major depression?

    PubMed Central

    Le Lay, Agathe; Despiegel, Nicolas; François, Clément; Duru, Gérard

    2006-01-01

    Background: Depression is among the major contributors to worldwide disease burden and adequate modelling requires a framework designed to depict real world disease progression as well as its economic implications as closely as possible. Objectives: In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. Methods: We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. Results: The major drawback to Markov models is that they may not be suitable for tracking patients' disease history properly, unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model. To do so would also require defining multiple health states, which would render the analysis entirely too complex. We show that DES resolves these weaknesses and that its flexibility allows patients with differing attributes to move from one event to another in sequential order while simultaneously taking into account important risk factors such as age, gender, disease history and patients' attitude towards treatment, together with any disease-related events (adverse events, suicide attempt, etc.). Conclusion: DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful compared with Markov processes. PMID:17147790
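
    The event-scheduling flexibility the authors highlight is easy to see in code: a priority queue orders patient events in time, and each patient's history can modulate the rate of future events, something a fixed-state Markov cohort cannot do without defining many extra states. The sketch below is a deliberately minimal illustration with invented rates.

        # Minimal DES of recurrent depressive episodes: each patient carries a
        # history, and relapse risk grows (waiting times shrink) with episode count.
        import heapq, random

        random.seed(42)
        events = []                                   # (time, patient_id, event)
        history = {pid: 0 for pid in range(100)}      # episodes so far per patient
        for pid in history:
            heapq.heappush(events, (random.expovariate(1 / 24.0), pid, "episode"))

        while events:
            t, pid, ev = heapq.heappop(events)
            if t > 120:                               # 10-year horizon (months)
                continue
            history[pid] += 1
            rate = 1 / max(24.0 - 4.0 * history[pid], 6.0)
            heapq.heappush(events, (t + random.expovariate(rate), pid, "episode"))

        print(sum(history.values()) / len(history), "episodes per patient on average")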

  18. Can discrete event simulation be of use in modelling major depression?

    PubMed

    Le Lay, Agathe; Despiegel, Nicolas; François, Clément; Duru, Gérard

    2006-12-05

    Depression is among the major contributors to worldwide disease burden and adequate modelling requires a framework designed to depict real world disease progression as well as its economic implications as closely as possible. In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. The major drawback to Markov models is that they may not be suitable for tracking patients' disease history properly, unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model. To do so would also require defining multiple health states, which would render the analysis entirely too complex. We show that DES resolves these weaknesses and that its flexibility allows patients with differing attributes to move from one event to another in sequential order while simultaneously taking into account important risk factors such as age, gender, disease history and patients' attitude towards treatment, together with any disease-related events (adverse events, suicide attempt, etc.). DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful compared with Markov processes.

  19. Near real-time traffic routing

    NASA Technical Reports Server (NTRS)

    Yang, Chaowei (Inventor); Xie, Jibo (Inventor); Zhou, Bin (Inventor); Cao, Ying (Inventor)

    2012-01-01

    A near real-time physical transportation network routing system comprising: a traffic simulation computing grid and a dynamic traffic routing service computing grid. The traffic simulator produces traffic network travel time predictions for a physical transportation network using a traffic simulation model and common input data. The physical transportation network is divided into multiple sections. Each section has a primary zone and a buffer zone. The traffic simulation computing grid includes multiple traffic simulation computing nodes. The common input data includes static network characteristics, an origin-destination data table, dynamic traffic information data and historical traffic data. The dynamic traffic routing service computing grid includes multiple dynamic traffic routing computing nodes and generates traffic route(s) using the traffic network travel time predictions.

  20. USING THE ECLPSS SOFTWARE ENVIRONMENT TO BUILD A SPATIALLY EXPLICIT COMPONENT-BASED MODEL OF OZONE EFFECTS ON FOREST ECOSYSTEMS. (R827958)

    EPA Science Inventory

    We have developed a modeling framework to support grid-based simulation of ecosystems at multiple spatial scales, the Ecological Component Library for Parallel Spatial Simulation (ECLPSS). ECLPSS helps ecologists to build robust spatially explicit simulations of ...

  1. Kalman approach to accuracy management for interoperable heterogeneous model abstraction within an HLA-compliant simulation

    NASA Astrophysics Data System (ADS)

    Leskiw, Donald M.; Zhau, Junmei

    2000-06-01

    This paper reports on results from an ongoing project to develop methodologies for representing and managing multiple, concurrent levels of detail and enabling high performance computing using parallel arrays within distributed object-based simulation frameworks. At this time we present the methodology for representing and managing multiple, concurrent levels of detail and modeling accuracy by using a representation based on the Kalman approach for estimation. The Kalman System Model equations are used to represent model accuracy, Kalman Measurement Model equations provide transformations between heterogeneous levels of detail, and interoperability among disparate abstractions is provided using a form of the Kalman Update equations.
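
    For reference, the system model, measurement model and update equations referred to above are, in their standard discrete-time Kalman form (the paper's specific adaptation to levels of detail is not reproduced here):

        \begin{aligned}
        x_k &= F_k\,x_{k-1} + w_k, \quad w_k \sim \mathcal{N}(0, Q_k) && \text{(system model)}\\
        z_k &= H_k\,x_k + v_k, \quad v_k \sim \mathcal{N}(0, R_k) && \text{(measurement model)}\\
        K_k &= P_k^{-} H_k^{\top}\bigl(H_k P_k^{-} H_k^{\top} + R_k\bigr)^{-1} && \text{(gain)}\\
        \hat{x}_k^{+} &= \hat{x}_k^{-} + K_k\bigl(z_k - H_k\,\hat{x}_k^{-}\bigr), \qquad P_k^{+} = (I - K_k H_k)\,P_k^{-} && \text{(update)}
        \end{aligned}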

  2. A View on Future Building System Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  3. The North American Regional Climate Change Assessment Program (NARCCAP): Status and results

    NASA Astrophysics Data System (ADS)

    Gutowski, W. J.

    2009-12-01

    NARCCAP is a multi-institutional program that is systematically investigating the uncertainties in regional scale simulations of contemporary climate and projections of future climate. NARCCAP is supported by multiple federal agencies. NARCCAP is producing an ensemble of high-resolution climate-change scenarios by nesting multiple RCMs in reanalyses and multiple atmosphere-ocean GCM simulations of contemporary and future-scenario climates. The RCM domains cover the contiguous U.S., northern Mexico, and most of Canada. The simulation suite also includes time-slice, high resolution GCMs that use sea-surface temperatures from parent atmosphere-ocean GCMs. The baseline resolution of the RCMs and time-slice GCMs is 50 km. Simulations use three sources of boundary conditions: National Centers for Environmental Prediction (NCEP)/Department of Energy (DOE) AMIP-II Reanalysis, GCMs simulating contemporary climate and GCMs using the A2 SRES emission scenario for the twenty-first century. Simulations cover 1979-2004 and 2038-2060, with the first 3 years discarded for spin-up. The resulting RCM and time-slice simulations offer opportunity for extensive analysis of RCM simulations as well as a basis for multiple high-resolution climate scenarios for climate change impacts assessments. Geophysical statisticians are developing measures of uncertainty from the ensemble. To enable very high-resolution simulations of specific regions, both RCM and high-resolution time-slice simulations are saving output needed for further downscaling. All output is publicly available to the climate analysis and the climate impacts assessment community, through an archiving and data-distribution plan. Some initial results show that the models closely reproduce ENSO-related precipitation variations in coastal California, where the correlation between the simulated and observed monthly time series exceeds 0.94 for all models. The strong El Nino events of 1982-83 and 1997-98 are well reproduced for the Pacific coastal region of the U.S. in all models. ENSO signals are less well reproduced in other regions. The models also reproduce extreme monthly precipitation well in coastal California and the Upper Midwest. Model performance tends to deteriorate from west to east across the domain, or roughly from the inflow boundary toward the outflow boundary. This deterioration with distance from the inflow boundary is ameliorated to some extent in models formulated such that large-scale information is included in the model solution, whether implemented by spectral nudging or by use of a perturbation form of the governing equations.

  4. Applications of active adaptive noise control to jet engines

    NASA Technical Reports Server (NTRS)

    Shoureshi, Rahmat; Brackney, Larry

    1993-01-01

    During phase 2 research on the application of active noise control to jet engines, the development of multiple-input/multiple-output (MIMO) active adaptive noise control algorithms and acoustic/controls models for turbofan engines was considered. Specific goals for this research phase included: (1) implementation of a MIMO adaptive minimum variance active noise controller; and (2) turbofan engine model development. A minimum variance control law for adaptive active noise control has been developed, simulated, and implemented for single-input/single-output (SISO) systems. Since acoustic systems tend to be distributed, multiple sensors and actuators are more appropriate. As such, the SISO minimum variance controller was extended to the MIMO case. Simulation and experimental results are presented. A state-space model of a simplified gas turbine engine is developed using the bond graph technique. The model retains important system behavior, yet is of low enough order to be useful for controller design. Expansion of the model to include multiple stages and spools is also discussed.

  5. Statistical methods for incomplete data: Some results on model misspecification.

    PubMed

    McIsaac, Michael; Cook, R J

    2017-02-01

    Inverse probability weighted estimating equations and multiple imputation are two of the most studied frameworks for dealing with incomplete data in clinical and epidemiological research. We examine the limiting behaviour of estimators arising from inverse probability weighted estimating equations, augmented inverse probability weighted estimating equations and multiple imputation when the requisite auxiliary models are misspecified. We compute limiting values for settings involving binary responses and covariates and illustrate the effects of model misspecification using simulations based on data from a breast cancer clinical trial. We demonstrate that, even when both auxiliary models are misspecified, the asymptotic biases of double-robust augmented inverse probability weighted estimators are often smaller than the asymptotic biases of estimators arising from complete-case analyses, inverse probability weighting or multiple imputation. We further demonstrate that use of inverse probability weighting or multiple imputation with slightly misspecified auxiliary models can actually result in greater asymptotic bias than the use of naïve, complete case analyses. These asymptotic results are shown to be consistent with empirical results from simulation studies.
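
    The flavor of these comparisons can be reproduced with a toy simulation: under covariate-dependent (MAR) missingness, a complete-case mean is biased, correctly specified inverse probability weights remove the bias, and a badly misspecified weight model gives the complete-case answer back. The numbers below are synthetic and purely illustrative of the mechanism, not the paper's results.

        # Complete-case vs IPW estimation of E[y] = 0.5 under MAR missingness.
        import numpy as np

        rng = np.random.default_rng(7)
        n = 200_000
        x = rng.binomial(1, 0.5, n)                    # binary covariate
        y = rng.binomial(1, 0.3 + 0.4 * x)             # binary response, E[y] = 0.5
        p_obs = 0.9 - 0.6 * x                          # observation depends on x
        r = rng.random(n) < p_obs                      # observation indicator

        complete_case = y[r].mean()                    # biased: over-weights x = 0
        w_true = 1.0 / p_obs[r]
        ipw_correct = np.sum(w_true * y[r]) / np.sum(w_true)     # ~0.5
        w_bad = np.ones(r.sum())                       # misspecified weight model
        ipw_misspec = np.sum(w_bad * y[r]) / np.sum(w_bad)       # = complete case
        print(complete_case, ipw_correct, ipw_misspec)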

  6. Modeling Cometary Coma with a Three Dimensional, Anisotropic Multiple Scattering Distributed Processing Code

    NASA Technical Reports Server (NTRS)

    Luchini, Chris B.

    1997-01-01

    Development of camera and instrument simulations for space exploration requires the development of scientifically accurate models of the objects to be studied. Several planned cometary missions have prompted the development of a three dimensional, multi-spectral, anisotropic multiple scattering model of cometary coma.

  7. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies.

    PubMed

    Barnes, Samuel R; Ng, Thomas S C; Santa-Maria, Naomi; Montagne, Axel; Zlokovic, Berislav V; Jacobs, Russell E

    2015-06-16

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software package for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented using the MATLAB programming language. Robustness of the software in providing reliable fits using multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP for both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. A DCE-MRI software suite was implemented and tested using simulations. Its applicability to both preclinical and clinical datasets is shown. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP.
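
    As an illustration of the kind of kinetic-model fit such a tool automates, the sketch below fits the standard Tofts model to a synthetic tissue curve with SciPy. The arterial input function and parameter values are invented, and this is Python for illustration only, not ROCKETSHIP code (which is MATLAB).

        # Standard Tofts model: Ct(t) = Ktrans * conv(cp, exp(-kep * t)).
        import numpy as np
        from scipy.optimize import curve_fit

        t = np.linspace(0, 5, 150)                    # minutes
        cp = 5.0 * t * np.exp(-t / 0.6)               # toy arterial input function

        def tofts(t, ktrans, kep):
            dt = t[1] - t[0]
            return ktrans * np.convolve(cp, np.exp(-kep * t))[: len(t)] * dt

        ct = tofts(t, 0.25, 0.6) + np.random.default_rng(0).normal(0, 0.01, len(t))
        (ktrans, kep), _ = curve_fit(tofts, t, ct, p0=(0.1, 0.5))
        print(ktrans, kep)    # should recover ~0.25 and ~0.6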

  8. Comparison of the didactic lecture with the simulation/model approach for the teaching of a novel perioperative ultrasound curriculum to anesthesiology residents.

    PubMed

    Ramsingh, Davinder; Alexander, Brenton; Le, Khanhvan; Williams, Wendell; Canales, Cecilia; Cannesson, Maxime

    2014-09-01

    To expose residents to two methods of education for point-of-care ultrasound, a traditional didactic lecture and a model/simulation-based lecture, which focus on concepts of cardiopulmonary function, volume status, and evaluation of severe thoracic/abdominal injuries; and to assess which method is more effective. Single-center, prospective, blinded trial. University hospital. Anesthesiology residents who were assigned to an educational day during the two-month research study period. Residents were allocated to two groups to receive either a 90-minute, one-on-one didactic lecture or a 90-minute lecture in a simulation center, during which they practiced on a human model and simulation mannequin (normal pathology). Data points included a pre-lecture multiple-choice test, post-lecture multiple-choice test, and post-lecture, human model-based examination. Post-lecture tests were performed within three weeks of the lecture. An experienced sonographer who was blinded to the education modality graded the model-based skill assessment examinations. Participants completed a follow-up survey to assess the perceptions of the quality of their instruction between the two groups. 20 residents completed the study. No differences were noted between the two groups in pre-lecture test scores (P = 0.97), but significantly higher scores for the model/simulation group occurred on both the post-lecture multiple choice (P = 0.038) and post-lecture model (P = 0.041) examinations. Follow-up resident surveys showed significantly higher scores in the model/simulation group regarding overall interest in perioperative ultrasound (P = 0.047) as well as understanding of the physiologic concepts (P = 0.021). A model/simulation-based lecture series may be more effective in teaching the skills needed to perform a point-of-care ultrasound examination to anesthesiology residents. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models.

    PubMed

    Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A

    2014-01-01

    Multiple software programs are available for designing and running large-scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which can increase model development time, IT costs and so on. Therefore, it is desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time, the full model specifics are preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database has been implemented to store and manage all aspects of the system, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.

  10. Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models

    PubMed Central

    Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A.

    2014-01-01

    Multiple software programs are available for designing and running large-scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which can increase model development time, IT costs and so on. Therefore, it is desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time, the full model specifics are preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database has been implemented to store and manage all aspects of the system, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients. PMID:25374542

  11. Variety Preserved Instance Weighting and Prototype Selection for Probabilistic Multiple Scope Simulations

    DTIC Science & Technology

    2017-05-30

    including analysis, control and management of the systems across their multiple scopes. These difficulties will become more significant in the near future... behaviors of the systems, it tends to cover their many scopes. Accordingly, we may obtain better models for the simulations in a data-driven manner... to capture the variety of the instance distribution in a given data set for covering multiple scopes of our objective system in a seamless manner. (2

  12. SimilarityExplorer: A visual inter-comparison tool for multifaceted climate data

    Treesearch

    J. Poco; A. Dasgupta; Y. Wei; W. Hargrove; C. Schwalm; R. Cook; E. Bertini; C. Silva

    2014-01-01

    Inter-comparison and similarity analysis to gauge consensus among multiple simulation models is a critical visualization problem for understanding climate change patterns. Climate models, specifically, Terrestrial Biosphere Models (TBM) represent time and space variable ecosystem processes, for example, simulations of photosynthesis and respiration, using algorithms...

  13. Explaining the heterogeneity of functional connectivity findings in multiple sclerosis: An empirically informed modeling study.

    PubMed

    Tewarie, Prejaas; Steenwijk, Martijn D; Brookes, Matthew J; Uitdehaag, Bernard M J; Geurts, Jeroen J G; Stam, Cornelis J; Schoonheim, Menno M

    2018-06-01

    To understand the heterogeneity of functional connectivity results reported in the literature, we analyzed the separate effects of grey and white matter damage on functional connectivity and networks in multiple sclerosis. For this, we employed a biophysical thalamo-cortical model consisting of interconnected cortical and thalamic neuronal populations, informed and amended by empirical diffusion MRI tractography data, to simulate functional data that mimic neurophysiological signals. Grey matter degeneration was simulated by decreasing within-population connections and white matter degeneration by lowering between-population connections, based on lesion predilection sites in multiple sclerosis. For all simulations, functional connectivity and functional network organization are quantified by phase synchronization and network integration, respectively. Modeling results showed that both cortical and thalamic grey matter damage induced a global increase in functional connectivity, whereas white matter damage induced an initially increased connectivity followed by a global decrease. Both white and especially grey matter damage, however, induced a decrease in network integration. These empirically informed simulations show that the specific topology and timing of structural damage are nontrivial aspects in explaining functional abnormalities in MS. Insufficient attention to these aspects likely explains contradictory findings in multiple sclerosis functional imaging studies so far. © 2018 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.

  14. Validation of Multibody Program to Optimize Simulated Trajectories II Parachute Simulation with Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Queen, Eric M.; Hotchko, Nathaniel J.

    2009-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed, tested and validated. This capability uses the Program to Optimize Simulated Trajectories II (POST 2). The standard version of POST 2 allows trajectory simulation of multiple bodies without force interaction. In the current implementation, the force interaction between the parachute and the suspended bodies has been modeled using flexible lines, allowing accurate trajectory simulation of the individual bodies in flight. The POST 2 multibody capability is intended to be general purpose and applicable to any parachute entry trajectory simulation. This research paper explains the motivation for multibody parachute simulation, discusses implementation methods, and presents validation of this capability.

  15. Intercomparison of methods of coupling between convection and large-scale circulation. 1. Comparison over uniform surface conditions

    DOE PAGES

    Daleu, C. L.; Plant, R. S.; Woolnough, S. J.; ...

    2015-10-24

    Here, as part of an international intercomparison project, a set of single-column models (SCMs) and cloud-resolving models (CRMs) are run under the weak-temperature gradient (WTG) method and the damped gravity wave (DGW) method. For each model, the implementation of the WTG or DGW method involves a simulated column which is coupled to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. The simulated column has the same surface conditions as the reference state and is initialized with profiles from the reference state. We performed systematic comparison of the behavior of different models under a consistent implementation of the WTG method and the DGW method and systematic comparison of the WTG and DGW methods in models with different physics and numerics. CRMs and SCMs produce a variety of behaviors under both WTG and DGW methods. Some of the models reproduce the reference state while others sustain a large-scale circulation which results in either substantially lower or higher precipitation compared to the value of the reference state. CRMs show a fairly linear relationship between precipitation and circulation strength. SCMs display a wider range of behaviors than CRMs. Some SCMs under the WTG method produce zero precipitation. Within an individual SCM, a DGW simulation and a corresponding WTG simulation can produce different signed circulation. When initialized with a dry troposphere, DGW simulations always result in a precipitating equilibrium state. The greatest sensitivities to the initial moisture conditions occur for multiple stable equilibria in some WTG simulations, corresponding to either a dry equilibrium state when initialized as dry or a precipitating equilibrium state when initialized as moist. Multiple equilibria are seen in more WTG simulations for higher SST. In some models, the existence of multiple equilibria is sensitive to some parameters in the WTG calculations.

  16. Predicting Upwelling Radiance on the West Florida Shelf

    DTIC Science & Technology

    2006-03-31

    National Science Foundation. The chemical and biological model includes the ability to simulate multiple groups of phytoplankton, multiple limiting nutrients, spectral light harvesting by phytoplankton, multiple particulate and dissolved degradational pools of organic material, and non-stoichiometric carbon, nitrogen, phosphorus, silica, and iron dynamics. It also includes a complete spectral light model for the prediction of Inherent Optical Properties (IOPs). The coupling of the predicted IOP model (Ecosim 2.0) with robust radiative transfer model (Ecolight

  17. Collapse Causes Analysis and Numerical Simulation for a Rigid Frame Multiple Arch Bridge

    NASA Astrophysics Data System (ADS)

    Zuo, XinDai

    2018-03-01

    Following the collapse of the Baihe Bridge, the author first built a planar model of the whole bridge and analyzed the carrying capacity of the structure under a 170-ton lorry load. The author then built a spatial finite element model that can accurately simulate the bridge collapse process. The collapse was simulated and the accident scene reproduced. Spatial analysis showed that the rotational stiffness of the pier bottom had a large influence on the collapse form of the superstructure. The conclusion was that the 170-ton lorry load and the multiple arch bridge design were the important factors leading to the collapse.

  18. SIMPAVE : evaluation of virtual environments for pavement construction simulations

    DOT National Transportation Integrated Search

    2007-05-01

    In the last couple of years, the authors have been developing virtual simulations for modeling the construction of asphalt pavements. The simulations are graphically rich, interactive, three-dimensional, with realistic physics, and allow multiple peo...

  19. Analysis and prediction of flow from local source in a river basin using a Neuro-fuzzy modeling tool.

    PubMed

    Aqil, Muhammad; Kita, Ichiro; Yano, Akira; Nishiyama, Soichi

    2007-10-01

    Traditionally, the multiple linear regression technique has been one of the most widely used models for simulating hydrological time series. However, when the nonlinear phenomenon is significant, multiple linear regression fails to develop an appropriate predictive model. Recently, neuro-fuzzy systems have gained much popularity for calibrating nonlinear relationships. This study evaluated the potential of a neuro-fuzzy system as an alternative to the traditional statistical regression technique for the purpose of predicting flow from a local source in a river basin. The effectiveness of the proposed identification technique was demonstrated through a simulation study of the river flow time series of the Citarum River in Indonesia. Furthermore, in order to quantify the uncertainty associated with the estimation of river flow, a Monte Carlo simulation was performed. As a comparison, a multiple linear regression analysis that was being used by the Citarum River Authority was also examined using various statistical indices. The simulation results using 95% confidence intervals indicated that the neuro-fuzzy model consistently underestimated the magnitude of high flow while the low and medium flow magnitudes were estimated closer to the observed data. The comparison of the prediction accuracy of the neuro-fuzzy and linear regression methods indicated that the neuro-fuzzy approach was more accurate in predicting river flow dynamics. The neuro-fuzzy model was able to improve the root mean square error (RMSE) and mean absolute percentage error (MAPE) values of the multiple linear regression forecasts by about 13.52% and 10.73%, respectively. Considering its simplicity and efficiency, the neuro-fuzzy model is recommended as an alternative tool for modeling of flow dynamics in the study area.

  20. Hybrid network modeling and the effect of image resolution on digitally-obtained petrophysical and two-phase flow properties

    NASA Astrophysics Data System (ADS)

    Aghaei, A.

    2017-12-01

    Digital imaging and modeling of rocks and subsequent simulation of physical phenomena in digitally-constructed rock models are becoming an integral part of core analysis workflows. One of the inherent limitations of image-based analysis, at any given scale, is image resolution. This limitation becomes more evident when the rock has multiple scales of porosity, such as in carbonates and tight sandstones. Multi-scale imaging and the construction of hybrid models that encompass images acquired at multiple scales and resolutions are proposed as a solution to this problem. In this study, we investigate the effect of image resolution and unresolved porosity on petrophysical and two-phase flow properties calculated based on images. A helical X-ray micro-CT scanner with a high cone-angle is used to acquire digital rock images that are free of geometric distortion. To remove subjectivity from the analyses, a semi-automated image processing technique is used to process and segment the acquired data into multiple phases. Direct and pore network based models are used to simulate physical phenomena and obtain absolute permeability, formation factor and two-phase flow properties such as relative permeability and capillary pressure. The effect of image resolution on each property is investigated. Finally, a hybrid network model incorporating images at multiple resolutions is built and used for simulations. The results from the hybrid model are compared against results from the model built at the highest resolution and those from laboratory tests.

  1. Discrimination of biological and chemical threat simulants in residue mixtures on multiple substrates.

    PubMed

    Gottfried, Jennifer L

    2011-07-01

    The potential of laser-induced breakdown spectroscopy (LIBS) to discriminate biological and chemical threat simulant residues prepared on multiple substrates and in the presence of interferents has been explored. The simulant samples tested include Bacillus atrophaeus spores, Escherichia coli, MS-2 bacteriophage, α-hemolysin from Staphylococcus aureus, 2-chloroethyl ethyl sulfide, and dimethyl methylphosphonate. The residue samples were prepared on polycarbonate, stainless steel and aluminum foil substrates by Battelle Eastern Science and Technology Center. LIBS spectra were collected by Battelle on a portable LIBS instrument developed by A3 Technologies. This paper presents the chemometric analysis of the LIBS spectra using partial least-squares discriminant analysis (PLS-DA). The performance of PLS-DA models based on the full LIBS spectra and of models based on selected emission intensities and ratios has been compared. The full-spectra models generally provided better classification results owing to the inclusion of substrate emission features; however, the intensity/ratio models were able to correctly identify more types of simulant residues in the presence of interferents. The fusion of the two types of PLS-DA models resulted in a significant improvement in classification performance for models built using multiple substrates. In addition to identifying the major components of residue mixtures, minor components such as growth media and solvents can be identified with an appropriately designed PLS-DA model.
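
    PLS-DA as used above is commonly implemented as PLS regression against a one-hot class matrix. The sketch below illustrates that pattern with scikit-learn on synthetic stand-in "spectra"; the data, class count, and component count are all invented, not the Battelle data set.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 500))   # 60 spectra x 500 wavelength channels
      y = rng.integers(0, 3, size=60)  # 3 hypothetical residue classes

      Y = np.eye(3)[y]                 # one-hot dummy matrix makes PLS a classifier
      model = PLSRegression(n_components=5).fit(X, Y)
      pred = model.predict(X).argmax(axis=1)  # class = column with largest response
      print("training accuracy:", (pred == y).mean())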

  2. A comparison of multiple behavior models in a simulation of the aftermath of an improvised nuclear detonation.

    PubMed

    Parikh, Nidhi; Hayatnagarkar, Harshal G; Beckman, Richard J; Marathe, Madhav V; Swarup, Samarth

    2016-11-01

    We describe a large-scale simulation of the aftermath of a hypothetical 10 kT improvised nuclear detonation at ground level, near the White House in Washington, DC. We take a synthetic information approach, in which multiple data sets are combined to construct a synthesized representation of the population of the region with accurate demographics, as well as four infrastructures: transportation, healthcare, communication, and power. In this article, we focus on the model of agents and their behavior, which is represented using the options framework. Six behavioral options are modeled: household reconstitution, evacuation, healthcare-seeking, worry, shelter-seeking, and aiding & assisting others. Agent decision-making takes into account their health status, information about family members, information about the event, and their local environment. We combine these behavioral options into five behavior models of increasing complexity and perform a number of simulations to compare the models.

  4. The General Ensemble Biogeochemical Modeling System (GEMS) and its applications to agricultural systems in the United States: Chapter 18

    USGS Publications Warehouse

    Liu, Shuguang; Tan, Zhengxi; Chen, Mingshi; Liu, Jinxun; Wein, Anne; Li, Zhengpeng; Huang, Shengli; Oeding, Jennifer; Young, Claudia; Verma, Shashi B.; Suyker, Andrew E.; Faulkner, Stephen P.

    2012-01-01

    The General Ensemble Biogeochemical Modeling System (GEMS) has two distinguishing features. First, to account for uncertainties in individual models, it uses multiple site-scale biogeochemical models to perform model simulations. Second, it adopts Monte Carlo ensemble simulations of each simulation unit (one site/pixel or group of sites/pixels with similar biophysical conditions) to incorporate uncertainties and variability (as measured by variances and covariance) of input variables into model simulations. In this chapter, we illustrate the applications of GEMS at the site and regional scales with an emphasis on incorporating agricultural practices. Challenges in modeling soil carbon dynamics and greenhouse gas emissions are also discussed.

  5. Potent New Small-Molecule Inhibitor of Botulinum Neurotoxin Serotype A Endopeptidase Developed by Synthesis-Based Computer-Aided Molecular Design

    DTIC Science & Technology

    2009-11-01

    dynamics of the complex predicted by multiple molecular dynamics simulations, and discuss further structural optimization to achieve better in vivo efficacy... complex with BoNTAe and the dynamics of the complex predicted by multiple molecular dynamics simulations (MMDSs). On the basis of the 3D model, we discuss... is unlimited whereas AHP exhibited 54% inhibition under the same conditions (Table 1). Computer Simulation: Twenty different molecular dynamics

  6. ASSESSING POPULATION EXPOSURES TO MULTIPLE AIR POLLUTANTS USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    The Modeling Environment for Total Risks studies (MENTOR) system, combined with an extension of the SHEDS (Stochastic Human Exposure and Dose Simulation) methodology, provides a mechanistically consistent framework for conducting source-to-dose exposure assessments of multiple pol...

  7. Multi-Site λ-dynamics for simulated Structure-Activity Relationship studies

    PubMed Central

    Knight, Jennifer L.; Brooks, Charles L.

    2011-01-01

    Multi-Site λ-dynamics (MSλD) is a new free energy simulation method based on λ-dynamics. It has been developed to enable multiple substituents at multiple sites on a common ligand core to be modeled simultaneously and their free energies assessed. The efficacy of MSλD for estimating relative hydration free energies and relative binding affinities is demonstrated using three test systems. Model compounds representing multiple identical benzene, dihydroxybenzene and dimethoxybenzene molecules show that total combined MSλD trajectory lengths of ~1.5 ns are sufficient to reliably achieve relative hydration free energy estimates within 0.2 kcal/mol, and that the estimates are less sensitive to the number of trajectories used for hybrid ligands containing up to ten substituents modeled at a single site or five substituents modeled at each of two sites. Relative hydration free energies among six benzene derivatives calculated from MSλD simulations are in very good agreement with those from alchemical free energy simulations (average unsigned differences of 0.23 kcal/mol and R2=0.991) and experiment (average unsigned errors of 1.8 kcal/mol and R2=0.959). Estimates of the relative binding affinities among 14 inhibitors of HIV-1 reverse transcriptase obtained from MSλD simulations are in reasonable agreement with those from traditional free energy simulations and experiment (average unsigned errors of 0.9 kcal/mol and R2=0.402). For the same level of accuracy and precision, MSλD simulations run ~20–50 times faster than traditional free energy simulations and thus, with reliable force field parameters, can be used effectively to screen tens to hundreds of compounds in structure-based drug design applications. PMID:22125476

  8. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com; Khamehchi, Ehsan

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  9. Impulsive response of an automatic transmission system with multiple clearances: Formulation, simulation and experiment

    NASA Astrophysics Data System (ADS)

    Crowther, Ashley R.; Singh, Rajendra; Zhang, Nong; Chapman, Chris

    2007-10-01

    Impulsive responses in geared systems with multiple clearances are studied when the mean torque excitation and system load change abruptly, with application to a vehicle driveline with an automatic transmission. First, torsional lumped-mass models of the planetary and differential gear sets are formulated using matrix elements. The model is then reduced to address tractable nonlinear problems while retaining the main modes of interest. Second, numerical simulations of the nonlinear model are performed for transient conditions, and a typical driving situation that induces impulsive behaviour is simulated; initial conditions and excitation and load profiles have to be carefully defined before the model can be numerically solved. It is shown that impacts within the planetary or differential gears may occur under combinations of engine, braking and vehicle load transients. Our analysis shows that the shaping of the engine transient by the torque converter before it reaches the clearance locations is the more critical factor. Third, a free vibration experiment is developed for an analogous driveline with multiple clearances, and three experiments that excite different response regimes have been carried out. Good correlations validate the proposed methodology.

  10. A coupling of homology modeling with multiple molecular dynamics simulation for identifying representative conformation of GPCR structures: a case study on human bombesin receptor subtype-3.

    PubMed

    Nowroozi, Amin; Shahlaei, Mohsen

    2017-02-01

    In this study, a computational pipeline was devised to overcome homology modeling (HM) bottlenecks. Coupling HM with molecular dynamics (MD) simulation is useful in that it tackles the sampling deficiency of dynamics simulations by providing good-quality initial guesses for the native structure. HM also relaxes the severe demands placed on force fields to explore the huge conformational space of protein structures. In this study, the interaction between the human bombesin receptor subtype-3 and MK-5046 was investigated by integrating HM, molecular docking, and MD simulations. To improve conformational sampling in typical MD simulations of GPCRs, as in other biomolecules, multiple trajectories with different initial conditions can be employed rather than a single long trajectory. Multiple MD simulations of human bombesin receptor subtype-3 with different initial atomic velocities were applied to sample conformations in the vicinity of the structure generated by HM. The backbone-atom conformational space distribution of the replicates was analyzed using principal components analysis. As a result, the averages of structural and dynamic properties over the twenty-one trajectories differ significantly from those obtained from individual trajectories.

  11. An improved car-following model with multiple preceding cars' velocity fluctuation feedback

    NASA Astrophysics Data System (ADS)

    Guo, Lantian; Zhao, Xiangmo; Yu, Shaowei; Li, Xiuhai; Shi, Zhongke

    2017-04-01

    In order to explore and evaluate the effects of the velocity variation trend of multiple preceding cars, as used in the Cooperative Adaptive Cruise Control (CACC) strategy, on the dynamic characteristics, fuel economy and emissions of the corresponding traffic flow, we conduct the following study. Firstly, with real-time car-following (CF) data, the close relationship between multiple preceding cars' velocity fluctuation feedback and the host car's behavior is explored; the evaluation results clearly show that multiple preceding cars' velocity fluctuations over different time window-widths are highly correlated with the host car's acceleration/deceleration. Then, a microscopic traffic flow model is proposed to evaluate the effects of multiple preceding cars' velocity fluctuation feedback in the CACC strategy on the traffic flow evolution process. Finally, numerical simulations of the fuel economy and exhaust emissions of the traffic flow are implemented using the VT-micro model. Simulation results prove that considering multiple preceding cars' velocity fluctuation feedback in the control strategy of the CACC system can improve roadway traffic mobility, fuel economy and exhaust emission performance.

  12. Virtual Plant Tissue: Building Blocks for Next-Generation Plant Growth Simulation

    PubMed Central

    De Vos, Dirk; Dzhurakhalov, Abdiravuf; Stijven, Sean; Klosiewicz, Przemyslaw; Beemster, Gerrit T. S.; Broeckhove, Jan

    2017-01-01

    Motivation: Computational modeling of plant developmental processes is becoming increasingly important. Cellular-resolution plant tissue simulators have been developed, yet they typically describe physiological processes in an isolated way, strongly delimited in space and time. Results: With plant systems biology moving toward an integrative perspective on development, we have built the Virtual Plant Tissue (VPTissue) package to couple functional modules or models in the same framework and across different frameworks. Multiple levels of model integration and coordination enable combining existing and new models from different sources, with diverse options in terms of input/output. Besides the core simulator, the toolset also comprises a tissue editor for manipulating tissue geometry and cell, wall, and node attributes in an interactive manner. A parameter exploration tool is available to study the parameter dependence of simulation results by distributing calculations over multiple systems. Availability: Virtual Plant Tissue is available as open source (EUPL license) on Bitbucket (https://bitbucket.org/vptissue/vptissue). The project has a website at https://vptissue.bitbucket.io. PMID:28523006

  13. Increasing atmospheric CO2 overrides the historical legacy of multiple stable biome states in Africa.

    PubMed

    Moncrieff, Glenn R; Scheiter, Simon; Bond, William J; Higgins, Steven I

    2014-02-01

    The dominant vegetation over much of the global land surface is not predetermined by contemporary climate, but is also influenced by past environmental conditions. This confounds attempts to predict current and future biome distributions, because even a perfect model would project multiple possible biomes without knowledge of the historical vegetation state. Here we compare the distribution of tree- and grass-dominated biomes across Africa simulated using a dynamic global vegetation model (DGVM). We explicitly evaluate where and under what conditions multiple stable biome states are possible for current and projected future climates. Our simulation results show that multiple stable biome states are possible for vast areas of tropical and subtropical Africa under current conditions. Widespread loss of the potential for multiple stable biome states is projected in the 21st century, driven by increasing atmospheric CO2. Many sites where currently both tree-dominated and grass-dominated biomes are possible become deterministically tree-dominated. Regions with multiple stable biome states are widespread and require consideration when attempting to predict future vegetation changes. Testing for behaviour characteristic of systems with multiple stable equilibria, such as hysteresis and dependence on historical conditions, and quantifying the resulting uncertainty in simulated vegetation, will lead to improved projections of global change impacts. © 2013 The Authors. New Phytologist © 2013 New Phytologist Trust.

  14. Step wise, multiple objective calibration of a hydrologic model for a snowmelt dominated basin

    USGS Publications Warehouse

    Hay, L.E.; Leavesley, G.H.; Clark, M.P.; Markstrom, S.L.; Viger, R.J.; Umemoto, M.

    2006-01-01

    The ability to apply a hydrologic model to large numbers of basins for forecasting purposes requires a quick and effective calibration strategy. This paper presents a stepwise, multiple-objective, automated procedure for hydrologic model calibration. The procedure includes the sequential calibration of a model's simulation of solar radiation (SR), potential evapotranspiration (PET), water balance, and daily runoff. It uses the Shuffled Complex Evolution global search algorithm to calibrate the U.S. Geological Survey's Precipitation Runoff Modeling System in the Yampa River basin of Colorado. This process ensures that intermediate states of the model (SR and PET on a monthly mean basis), as well as the water balance and components of the daily hydrograph, are simulated consistently with measured values.
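
    The sequential idea above can be illustrated with a toy two-step calibration. The sketch below is only schematic: an invented sine-curve "model" stands in for the hydrologic model, and SciPy's differential evolution stands in for the Shuffled Complex Evolution search; all parameter names, bounds, and data are made up.

      import numpy as np
      from scipy.optimize import differential_evolution

      rng = np.random.default_rng(1)
      t = np.arange(365.0)

      def sr_model(a):            # step 1: solar radiation ~ seasonal cycle
          return a * (1 + np.sin(2 * np.pi * t / 365))

      def runoff_model(a, b, c):  # step 2: runoff built on the calibrated SR
          return b * sr_model(a) + c

      true = (3.0, 0.4, 1.5)      # synthetic "observations" from known values
      obs_sr = sr_model(true[0]) + rng.normal(0, 0.1, t.size)
      obs_q = runoff_model(*true) + rng.normal(0, 0.1, t.size)

      def rmse(sim, obs):
          return np.sqrt(np.mean((sim - obs) ** 2))

      # Step 1: calibrate the SR parameter alone.
      res1 = differential_evolution(lambda p: rmse(sr_model(p[0]), obs_sr),
                                    bounds=[(0, 10)], seed=1)
      a_hat = res1.x[0]

      # Step 2: calibrate runoff parameters with the SR parameter held fixed.
      res2 = differential_evolution(lambda p: rmse(runoff_model(a_hat, *p), obs_q),
                                    bounds=[(0, 2), (0, 5)], seed=1)
      print(a_hat, res2.x)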

  15. TOPMODEL simulations of streamflow and depth to water table in Fishing Brook Watershed, New York, 2007-09

    USGS Publications Warehouse

    Nystrom, Elizabeth A.; Burns, Douglas A.

    2011-01-01

    TOPMODEL uses a topographic wetness index computed from surface-elevation data to simulate streamflow and subsurface-saturation state, represented by the saturation deficit. Depth to water table was computed from simulated saturation-deficit values using computed soil properties. In the Fishing Brook Watershed, TOPMODEL was calibrated to the natural logarithm of streamflow at the study area outlet and depth to water table at Sixmile Wetland using a combined multiple-objective function. Runoff and depth to water table responded differently to some of the model parameters, and the combined multiple-objective function balanced the goodness-of-fit of the model realizations with respect to these parameters. Results show that TOPMODEL reasonably simulated runoff and depth to water table during the study period. The simulated runoff had a Nash-Sutcliffe efficiency of 0.738, but the model underpredicted total runoff by 14 percent. Depth to water table computed from simulated saturation-deficit values matched observed water-table depth moderately well; the root mean squared error of absolute depth to water table was 91 millimeters (mm), compared to the mean observed depth to water table of 205 mm. The correlation coefficient for temporal depth-to-water-table fluctuations was 0.624. The variability of the TOPMODEL simulations was assessed using prediction intervals grouped using the combined multiple-objective function. The calibrated TOPMODEL results for the entire study area were applied to several subwatersheds within the study area using computed hydrogeomorphic properties of the subwatersheds.
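
    Two quantities central to this record, TOPMODEL's topographic wetness index ln(a/tan β) and the Nash-Sutcliffe efficiency used to score the runoff fit, are small enough to show directly; all values below are placeholders, not the Fishing Brook data.

      import numpy as np

      def topographic_wetness_index(upslope_area, slope_rad):
          # TOPMODEL wetness index ln(a / tan(beta)) per grid cell
          return np.log(upslope_area / np.tan(slope_rad))

      def nash_sutcliffe(obs, sim):
          # Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the mean
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      a = np.array([120.0, 800.0, 45.0])     # specific upslope area per cell (m)
      beta = np.radians([3.0, 1.5, 10.0])    # local slope per cell
      print(topographic_wetness_index(a, beta).round(2))

      obs_q = np.array([1.2, 3.4, 2.2, 5.6, 2.8])  # placeholder runoff series
      sim_q = np.array([1.0, 3.0, 2.5, 5.0, 3.0])
      print(f"NSE = {nash_sutcliffe(obs_q, sim_q):.3f}")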

  16. Study of hadron bundles observed in Chacaltaya two-story emulsion chamber

    NASA Technical Reports Server (NTRS)

    Aoki, H.

    1985-01-01

    The existence of hadron-rich families associated with few gamma rays, named the Centauro and Mini-Centauro phenomena, was reported. It was investigated whether these are produced by a special type of interaction different from ordinary multiple pion production. The experimental results are compared with simulation calculations based on an ordinary multiple pion production model. The hadron multiplicity distributions obtained from the present observation and from the simulation calculation are almost the same, which suggests that hadron bundles of such smaller multiplicities originate from successive interactions of the surviving nucleon, with the nature of multiple production, during passage through the atmosphere.

  17. Optimum Vehicle Component Integration with InVeST (Integrated Vehicle Simulation Testbed)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, W; Paddack, E; Aceves, S

    2001-12-27

    We have developed an Integrated Vehicle Simulation Testbed (InVeST). InVeST is based on the concept of Co-simulation, and it allows the development of virtual vehicles that can be analyzed and optimized as an overall integrated system. The virtual vehicle is defined by selecting different vehicle components from a component library. Vehicle component models can be written in multiple programming languages running on different computer platforms. At the same time, InVeST provides full protection for proprietary models. Co-simulation is a cost-effective alternative to competing methodologies, such as developing a translator or selecting a single programming language for all vehicle components. InVeST has been recently demonstrated using a transmission model and a transmission controller model. The transmission model was written in SABER and ran on a Sun/Solaris workstation, while the transmission controller was written in MATRIXx and ran on a PC running Windows NT. The demonstration was successfully performed. Future plans include the applicability of Co-simulation and InVeST to analysis and optimization of multiple complex systems, including those of Intelligent Transportation Systems.

  18. Block Oriented Simulation System (BOSS)

    NASA Technical Reports Server (NTRS)

    Ratcliffe, Jaimie

    1988-01-01

    Computer simulation is assuming greater importance as a flexible and expedient approach to modeling system and subsystem behavior. Simulation has played a key role in the growth of complex, multiple access space communications such as those used by the space shuttle and the TRW-built Tracking and Data Relay Satellites (TDRS). A powerful new simulator for use in designing and modeling the communication system of NASA's planned Space Station is being developed. Progress to date on the Block (Diagram) Oriented Simulation System (BOSS) is described.

  19. A Bayesian Poisson-lognormal Model for Count Data for Multiple-Trait Multiple-Environment Genomic-Enabled Prediction

    PubMed Central

    Montesinos-López, Osval A.; Montesinos-López, Abelardo; Crossa, José; Toledo, Fernando H.; Montesinos-López, José C.; Singh, Pawan; Juliana, Philomin; Salinas-Ruiz, Josafhat

    2017-01-01

    When a plant scientist wishes to make genomic-enabled predictions of multiple traits measured in multiple individuals in multiple environments, the most common strategy is to analyze a single trait at a time taking into account genotype × environment interaction (G × E), because there is a lack of comprehensive models that simultaneously account for correlated count traits and G × E. For this reason, in this study we propose a multiple-trait and multiple-environment model for count data. The proposed model was developed under the Bayesian paradigm, for which we developed a Markov Chain Monte Carlo (MCMC) scheme with noninformative priors. This allows obtaining all required full conditional distributions of the parameters, leading to an exact Gibbs sampler for the posterior distribution. Our model was tested with simulated data and a real data set. Results show that the proposed multi-trait, multi-environment model is an attractive alternative for modeling multiple count traits measured in multiple environments. PMID:28364037
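
    The Poisson-lognormal construction at the heart of the model is easy to sketch: a normal random effect on the log scale inflates the count variance relative to a plain Poisson. The parameters below are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(42)
      n, mu, sigma = 1000, 1.2, 0.5   # invented log-mean and random-effect SD

      # Poisson-lognormal: a normal random effect on the log scale adds
      # overdispersion relative to a plain Poisson model.
      eta = rng.normal(mu, sigma, size=n)   # latent log-rates
      counts = rng.poisson(np.exp(eta))     # observed counts

      print(counts.mean(), counts.var())    # variance exceeds the mean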

  20. Stepwise Analysis of Differential Item Functioning Based on Multiple-Group Partial Credit Model.

    ERIC Educational Resources Information Center

    Muraki, Eiji

    1999-01-01

    Extended an Item Response Theory (IRT) method for detection of differential item functioning to the partial credit model and applied the method to simulated data using a stepwise procedure. Then applied the stepwise DIF analysis based on the multiple-group partial credit model to writing trend data from the National Assessment of Educational…

  1. Relevance of the hadronic interaction model in the interpretation of multiple muon data as detected with the MACRO experiment

    NASA Astrophysics Data System (ADS)

    Ambrosio, M.; Antolini, R.; Aramo, C.; Auriemma, G.; Baldini, A.; Barbarino, G. C.; Barish, B. C.; Battistoni, G.; Bellotti, R.; Bemporad, C.; Bernardini, P.; Bilokon, H.; Bisi, V.; Bloise, C.; Bower, C.; Bussino, S.; Cafagna, F.; Calicchio, M.; Campana, D.; Carboni, M.; Castellano, M.; Cecchini, S.; Cei, F.; Chiarella, V.; Coutu, S.; de Benedictis, L.; de Cataldo, G.; Dekhissi, H.; de Marzo, C.; de Mitri, I.; de Vincenzi, M.; di Credico, A.; Erriquez, O.; Favuzzi, C.; Forti, C.; Fusco, P.; Giacomelli, G.; Giannini, G.; Giglietto, N.; Grassi, M.; Gray, L.; Grillo, A.; Guarino, F.; Guarnaccia, P.; Gustavino, C.; Habig, A.; Hanson, K.; Hawthorne, A.; Heinz, R.; Iarocci, E.; Katsavounidis, E.; Kearns, E.; Kyriazopoulou, S.; Lamanna, E.; Lane, C.; Levin, D. S.; Lipari, P.; Longley, N. P.; Longo, M. J.; Maaroufi, F.; Mancarella, G.; Mandrioli, G.; Manzoor, S.; Margiotta Neri, A.; Marini, A.; Martello, D.; Marzari-Chiesa, A.; Mazziotta, M. N.; Mazzotta, C.; Michael, D. G.; Mikheyev, S.; Miller, L.; Monacelli, P.; Montaruli, T.; Monteno, M.; Mufson, S.; Musser, J.; Nicoló, D.; Nolty, R.; Okada, C.; Orth, C.; Osteria, G.; Palamara, O.; Patera, V.; Patrizii, L.; Pazzi, R.; Peck, C. W.; Petrera, S.; Pistilli, P.; Popa, V.; Rainó, A.; Rastelli, A.; Reynoldson, J.; Ronga, F.; Rubizzo, U.; Sanzgiri, A.; Satriano, C.; Satta, L.; Scapparone, E.; Scholberg, K.; Sciubba, A.; Serra-Lugaresi, P.; Severi, M.; Sioli, M.; Sitta, M.; Spinelli, P.; Spinetti, M.; Spurio, M.; Steinberg, R.; Stone, J. L.; Sulak, L. R.; Surdo, A.; Tarlé, G.; Togo, V.; Walter, C. W.; Webb, R.

    1999-03-01

    With the aim of discussing the effect of the possible sources of systematic uncertainties in simulation models, the analysis of multiple muon events from the MACRO experiment at Gran Sasso is reviewed. In particular, the predictions from different currently available hadronic interaction models are compared.

  2. A Point Kinetics Model for Estimating Neutron Multiplication of Bare Uranium Metal in Tagged Neutron Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tweardy, Matthew C.; McConchie, Seth; Hayward, Jason P.

    An extension of the point kinetics model is developed in this paper to describe the neutron multiplicity response of a bare uranium object under interrogation by an associated particle imaging deuterium-tritium (D-T) measurement system. This extended model is used to estimate the total neutron multiplication of the uranium. Both MCNPX-PoliMi simulations and data from active interrogation measurements of highly enriched and depleted uranium geometries are used to evaluate the potential of this method and to identify the sources of systematic error. The detection efficiency correction for measured coincidence response is identified as a large source of systematic error. If the detection process is not considered, results suggest that the method can estimate total multiplication to within 13% of the simulated value. Values for multiplicity constants in the point kinetics equations are sensitive to enrichment due to (n, xn) interactions by D-T neutrons and can introduce another significant source of systematic bias. This can theoretically be corrected if isotopic composition is known a priori. Finally, the spatial dependence of multiplication is also suspected of introducing further systematic bias for high multiplication uranium objects.
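
    For orientation, a common point-model relation (not the paper's extended model, which also treats multiplicity moments and detection effects) estimates total multiplication from an effective multiplication factor k_eff; a toy sketch with invented values:

      import numpy as np

      def total_multiplication(k_eff):
          # expected total neutrons per source neutron in a subcritical chain:
          # 1 + k + k^2 + ... = 1 / (1 - k_eff) for k_eff < 1
          return 1.0 / (1.0 - np.asarray(k_eff, float))

      # Hypothetical subcritical objects interrogated by D-T neutrons.
      for k in (0.3, 0.6, 0.9):
          print(f"k_eff = {k:.1f} -> M = {total_multiplication(k):.2f}")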

  4. Cost-effective solutions to maintaining smart grid reliability

    NASA Astrophysics Data System (ADS)

    Qin, Qiu

    As aging power systems increasingly operate closer to their capacity and thermal limits, maintaining sufficient reliability has been of great concern to government agencies, utility companies and users. This dissertation focuses on improving the reliability of transmission and distribution systems. Based on wide area measurements, multiple model algorithms are developed to diagnose transmission line three-phase short-to-ground faults in the presence of protection misoperations. The multiple model algorithms utilize the electric network dynamics to provide prompt and reliable diagnosis outcomes. Computational complexity of the diagnosis algorithm is reduced by using a two-step heuristic. The multiple model algorithm is incorporated into a hybrid simulation framework, which consists of both continuous state simulation and discrete event simulation, to study the operation of transmission systems. With hybrid simulation, a line switching strategy for enhancing the tolerance to protection misoperations is studied based on the concept of a security index, which involves the faulted mode probability and stability coverage. Local measurements are used to track the generator state, and faulty mode probabilities are calculated in the multiple model algorithms. FACTS devices are considered as controllers for the transmission system. The placement of FACTS devices into power systems is investigated with a criterion of maintaining a prescribed level of control reconfigurability. Control reconfigurability measures the small-signal combined controllability and observability of a power system, with an additional requirement on fault tolerance. For the distribution systems, a hierarchical framework, including a high-level recloser allocation scheme and a low-level recloser placement scheme, is presented. The impacts of recloser placement on the reliability indices are analyzed. Evaluation of reliability indices in the placement process is carried out via discrete event simulation. The reliability requirements are described with probabilities and evaluated from the empirical distributions of reliability indices.

  5. The Effects of Towfish Motion on Sidescan Sonar Images: Extension to a Multiple-Beam Device

    DTIC Science & Technology

    1994-02-01

    simulation, the raw simulated sidescan image is formed from pixels G, which are the sum of energies E assigned to the nearest range-bin k as noted in... for stable motion at constant velocity V0, are applied to (divided into) the G, and the simulated sidescan image is ready to display. Maximal energy... limitation is likely to apply to all multiple-beam sonars of similar construction. The yaw correction was incorporated in the MBEAM model by an

  6. Measurement of antiproton annihilation on Cu, Ag and Au with emulsion films

    NASA Astrophysics Data System (ADS)

    Aghion, S.; Amsler, C.; Ariga, A.; Ariga, T.; Bonomi, G.; Bräunig, P.; Brusa, R. S.; Cabaret, L.; Caccia, M.; Caravita, R.; Castelli, F.; Cerchiari, G.; Comparat, D.; Consolati, G.; Demetrio, A.; Di Noto, L.; Doser, M.; Ereditato, A.; Evans, C.; Ferragut, R.; Fesel, J.; Fontana, A.; Gerber, S.; Giammarchi, M.; Gligorova, A.; Guatieri, F.; Haider, S.; Hinterberger, A.; Holmestad, H.; Huse, T.; Kawada, J.; Kellerbauer, A.; Kimura, M.; Krasnický, D.; Lagomarsino, V.; Lansonneur, P.; Lebrun, P.; Malbrunot, C.; Mariazzi, S.; Matveev, V.; Mazzotta, Z.; Müller, S. R.; Nebbia, G.; Nedelec, P.; Oberthaler, M.; Pacifico, N.; Pagano, D.; Penasa, L.; Petracek, V.; Pistillo, C.; Prelz, F.; Prevedelli, M.; Ravelli, L.; Rienaecker, B.; RØhne, O. M.; Rotondi, A.; Sacerdoti, M.; Sandaker, H.; Santoro, R.; Scampoli, P.; Simon, M.; Smestad, L.; Sorrentino, F.; Testera, G.; Tietje, I. C.; Vamosi, S.; Vladymyrov, M.; Widmann, E.; Yzombard, P.; Zimmer, C.; Zmeskal, J.; Zurlo, N.

    2017-04-01

    The characteristics of low energy antiproton annihilations on nuclei (e.g. hadronization and product multiplicities) are not well known, and Monte Carlo simulation packages that use different models provide different descriptions of the annihilation events. In this study, we measured the particle multiplicities resulting from antiproton annihilations on nuclei. The results were compared with predictions obtained using different models in the simulation tools GEANT4 and FLUKA. For this study, we exposed thin targets (Cu, Ag and Au) to a very low energy antiproton beam from CERN's Antiproton Decelerator, exploiting the secondary beamline available in the AEgIS experimental zone. The antiproton annihilation products were detected using emulsion films developed at the Laboratory of High Energy Physics in Bern, where they were analysed at the automatic microscope facility. The fragment multiplicity measured in this study is in good agreement with results obtained with FLUKA simulations for both minimally and heavily ionizing particles.

  7. Polarized BRDF for coatings based on three-component assumption

    NASA Astrophysics Data System (ADS)

    Liu, Hong; Zhu, Jingping; Wang, Kai; Xu, Rong

    2017-02-01

    A pBRDF (polarized bidirectional reflection distribution function) model for coatings is presented, based on a three-component reflection assumption, in order to improve polarized scattering simulation capability for space objects. In this model, the specular reflection is described using microfacet theory, while the multiple reflection and volume scattering are given separately according to experimental results. The polarization of specular reflection is derived from Fresnel's law, and both multiple reflection and volume scattering are assumed to be depolarized. Simulation and measurement results for two satellite coating samples, SR107 and S781, validate that the pBRDF modeling accuracy can be significantly improved by the three-component model given in this paper.

  8. Contributions to HiLiftPW-3 Using Structured, Overset Grid Methods

    NASA Technical Reports Server (NTRS)

    Coder, James G.; Pulliam, Thomas H.; Jensen, James C.

    2018-01-01

    The High-Lift Common Research Model (HL-CRM) and the JAXA Standard Model (JSM) were analyzed computationally using both the OVERFLOW and LAVA codes for the third AIAA High-Lift Prediction Workshop. Geometry descriptions and the test cases simulated are described. With the HL-CRM, the effects of surface smoothness during grid projection and the effect of partially sealing a flap gap were studied. Grid refinement studies were performed at two angles of attack using both codes. For the JSM, simulations were performed with and without the nacelle/pylon. Without the nacelle/pylon, evidence of multiple solutions was observed when a quadratic constitutive relation was used in the turbulence modeling; however, using time-accurate simulation seemed to alleviate this issue. With the nacelle/pylon, no evidence of multiple solutions was observed. Laminar-turbulent transition modeling was applied to both JSM configurations and had an overall favorable impact on the lift predictions.

  9. An Integrated Crustal Dynamics Simulator

    NASA Astrophysics Data System (ADS)

    Xing, H. L.; Mora, P.

    2007-12-01

    Numerical modelling offers an outstanding opportunity to gain an understanding of crustal dynamics and complex crustal system behaviour. This presentation describes our long-term and ongoing effort in finite-element-based computational model and software development to simulate interacting fault systems for earthquake forecasting. An R-minimum strategy based finite-element computational model and software tool, PANDAS, for modelling 3-dimensional nonlinear frictional contact behaviour between multiple deformable bodies with an arbitrarily-shaped contact element strategy has been developed by the authors. It builds up a virtual laboratory to simulate interacting fault systems, including crustal boundary conditions and various nonlinearities (e.g. from frictional contact, materials, geometry and thermal coupling). It has been successfully applied to large-scale computing of complex nonlinear phenomena in non-continuum media involving nonlinear frictional instability, multiple material properties and complex geometries on supercomputers, such as the South Australia (SA) interacting fault system, the Southern California fault model and the Sumatra subduction model. It has also been extended to simulate the hot fractured rock (HFR) geothermal reservoir system, in collaboration with Geodynamics Ltd, which is constructing the first geothermal reservoir system in Australia, and to model tsunami generation induced by earthquakes. Both are supported by the Australian Research Council.

  10. Modeling of Explorative Procedures for Remote Object Identification

    DTIC Science & Technology

    1991-09-01

    haptic sensory system and the simulated foveal component of the visual system. Eventually it will allow multiple applications in remote sensing and... superposition of sensory channels. The use of a force reflecting telemanipulator and computer simulated visual foveal component are the tools which... representation of human search models is achieved by using the proprioceptive component of the haptic sensory system and the simulated foveal component of the

  11. A Crack Growth Evaluation Method for Interacting Multiple Cracks

    NASA Astrophysics Data System (ADS)

    Kamaya, Masayuki

    When stress corrosion cracking or corrosion fatigue occurs, multiple cracks are frequently initiated in the same area. According to Section XI of the ASME Boiler and Pressure Vessel Code, multiple cracks are considered as a single combined crack in crack growth analysis if the specified conditions are satisfied. In crack growth processes, however, no prescription for the interference between multiple cracks is given in this code. The JSME Post-Construction Code, issued in May 2000, prescribes the conditions of crack coalescence in the crack growth process. This study aimed to extend this prescription to more general cases. A simulation model was applied to simulate the crack growth process, taking into account the interference between two cracks. This model made it possible to analyze multiple crack growth behaviors for many cases (e.g., different relative positions and lengths) that could not be studied by experiment alone. Based on these analyses, a new crack growth analysis method is suggested that takes into account the interference between multiple cracks.

  12. Multisite stochastic simulation of daily precipitation from copula modeling with a gamma marginal distribution

    NASA Astrophysics Data System (ADS)

    Lee, Taesam

    2018-05-01

    Multisite stochastic simulations of daily precipitation have been widely employed in hydrologic analyses for climate change assessment and as agricultural model inputs. Recently, a copula model with a gamma marginal distribution has become one of the common approaches for simulating precipitation at multiple sites. Here, we tested the correlation structure of the copula modeling. The results indicate that there is a significant underestimation of the correlation in the simulated data compared to the observed data. Therefore, we proposed an indirect method for estimating the cross-correlations when simulating precipitation at multiple stations, using the full relationship between the correlation of the observed data and that of the normally transformed data. Although this indirect method offers certain improvements in preserving the cross-correlations between sites in the original domain, the method was not reliable in application. Therefore, we further improved a simulation-based method (SBM) that was developed to model multisite precipitation occurrence. The SBM preserved the cross-correlations of the original domain well, reproducing cross-correlations around 0.2 closer to the observations than the direct method and around 0.1 closer than the indirect method. The three models were applied to stations in the Nakdong River basin, and the SBM was the best alternative for reproducing the historical cross-correlation. The direct method significantly underestimates the correlations among the observed data, and the indirect method appeared to be unreliable.
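
    The gamma-marginal Gaussian copula step described above, and the cross-correlation underestimation it produces in the original domain, can be reproduced in a few lines; the correlation matrix and gamma parameters below are invented, not the Nakdong River values.

      import numpy as np
      from scipy.stats import norm, gamma

      rng = np.random.default_rng(7)

      # Invented setup: 3 stations, target correlation in the Gaussian
      # (copula) domain, and gamma marginals for daily amounts (mm).
      R = np.array([[1.0, 0.6, 0.4],
                    [0.6, 1.0, 0.5],
                    [0.4, 0.5, 1.0]])
      shape, scale = 0.8, 12.0

      z = rng.multivariate_normal(np.zeros(3), R, size=5000)  # correlated normals
      u = norm.cdf(z)                                         # uniform marginals
      x = gamma.ppf(u, a=shape, scale=scale)                  # gamma precipitation

      # Correlations in the original domain fall below R: the
      # underestimation the record describes.
      print(np.corrcoef(x, rowvar=False).round(2))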

  13. Analyzing Multiple Outcomes in Clinical Research Using Multivariate Multilevel Models

    PubMed Central

    Baldwin, Scott A.; Imel, Zac E.; Braithwaite, Scott R.; Atkins, David C.

    2014-01-01

    Objective Multilevel models have become a standard data analysis approach in intervention research. Although the vast majority of intervention studies involve multiple outcome measures, few studies use multivariate analysis methods. The authors discuss multivariate extensions to the multilevel model that can be used by psychotherapy researchers. Method and Results Using simulated longitudinal treatment data, the authors show how multivariate models extend common univariate growth models and how the multivariate model can be used to examine multivariate hypotheses involving fixed effects (e.g., does the size of the treatment effect differ across outcomes?) and random effects (e.g., is change in one outcome related to change in the other?). An online supplemental appendix provides annotated computer code and simulated example data for implementing a multivariate model. Conclusions Multivariate multilevel models are flexible, powerful models that can enhance clinical research. PMID:24491071

  14. XCAT/DRASIM: a realistic CT/human-model simulation package

    NASA Astrophysics Data System (ADS)

    Fung, George S. K.; Stierstorfer, Karl; Segars, W. Paul; Taguchi, Katsuyuki; Flohr, Thomas G.; Tsui, Benjamin M. W.

    2011-03-01

    The aim of this research is to develop a complete CT/human-model simulation package by integrating the 4D eXtended CArdiac-Torso (XCAT) phantom, a computer-generated NURBS-surface-based phantom that provides a realistic model of human anatomy and of respiratory and cardiac motions, with the DRASIM (Siemens Healthcare) CT-data simulation program. Unlike other CT simulation tools, which are based on simple mathematical primitives or voxelized phantoms, this new simulation package has the advantage of utilizing a realistic model of human anatomy and physiological motions without voxelization, together with accurate modeling of the characteristics of clinical Siemens CT systems. First, we incorporated the 4D XCAT anatomy and motion models into DRASIM by implementing a new library which consists of functions to read in the NURBS surfaces of anatomical objects and their overlapping order and material properties in the XCAT phantom. Second, we incorporated an efficient ray-tracing algorithm for line-integral calculation in DRASIM by computing the intersection points of the rays cast from the x-ray source to the detector elements through the NURBS surfaces of the multiple XCAT anatomical objects along the ray paths. Third, we evaluated the integrated simulation package by performing a number of sample simulations of multiple x-ray projections from different views followed by image reconstruction. The initial simulation results were found to be promising by qualitative evaluation. In conclusion, we have developed a unique CT/human-model simulation package which has great potential as a tool in the design and optimization of CT scanners, and in the development of scanning protocols and image reconstruction methods for improving CT image quality and reducing radiation dose.

  15. Conceptualization of Karstic Aquifer with Multiple Outlets Using a Dual Porosity Model.

    PubMed

    Hosseini, Seiyed Mossa; Ataie-Ashtiani, Behzad

    2017-07-01

    In this study, two conceptual models, the classic reservoir (CR) model and an exchange-reservoirs model based on a dual porosity approach (DPR), are developed for simulating the functioning of karst aquifers drained by multiple outlets. The performance of the two models is demonstrated at a less developed karstic aquifer with three spring outlets located in the Zagros Mountains in the south-west of Iran, using 22 years of daily data. During surface recharge, a production function based on water mass balance is implemented for computing the time series of surface recharge to the karst formations. The efficiency of both models has been assessed for simulating daily spring discharge during recession as well as surface recharge periods. Results indicate that both the CR and DPR models are capable of simulating the ordinates of spring hydrographs that drain a less developed karstic aquifer. However, the goodness-of-fit criteria indicate that the DPR model outperforms the CR model for simulation of total hydrograph ordinates. In addition, the DPR model is capable of quantifying the hydraulic properties of two hydrologically connected overlapping continua, the conduit network and the fissured matrix, which lays important foundations for mining operations and water resource management, whereas homogeneous model representations of the karstic subsurface (e.g., the CR) do not work accurately in the karstic environment. © 2017, National Ground Water Association.

  16. Opportunities and Challenges in Supply-Side Simulation: Physician-Based Models

    PubMed Central

    Gresenz, Carole Roan; Auerbach, David I; Duarte, Fabian

    2013-01-01

    Objective To provide a conceptual framework and to assess the availability of empirical data for supply-side microsimulation modeling in the context of health care. Data Sources Multiple secondary data sources, including the American Community Survey, Health Tracking Physician Survey, and SK&A physician database. Study Design We apply our conceptual framework to one entity in the health care market—physicians—and identify, assess, and compare data available for physician-based simulation models. Principal Findings Our conceptual framework describes three broad types of data required for supply-side microsimulation modeling. Our assessment of available data for modeling physician behavior suggests broad comparability across various sources on several dimensions and highlights the need for significant integration of data across multiple sources to provide a platform adequate for modeling. A growing literature provides potential estimates for use as behavioral parameters that could serve as the models' engines. Sources of data for simulation modeling that account for the complex organizational and financial relationships among physicians and other supply-side entities are limited. Conclusions A key challenge for supply-side microsimulation modeling is optimally combining available data to harness their collective power. Several possibilities also exist for novel data collection. These have the potential to serve as catalysts for the next generation of supply-side-focused simulation models to inform health policy. PMID:23347041

  17. T-COMP—A suite of programs for extracting transmissivity from MODFLOW models

    USGS Publications Warehouse

    Halford, Keith J.

    2016-02-12

    Simulated transmissivities are constrained poorly by assigning permissible ranges of hydraulic conductivities from aquifer-test results to hydrogeologic units in groundwater-flow models. These wide ranges are derived from interpretations of many aquifer tests that are categorized by hydrogeologic unit. Uncertainty is added where contributing thicknesses differ between field estimates and numerical models. Wide ranges of hydraulic conductivities and discordant thicknesses result in simulated transmissivities that frequently are much greater than aquifer-test results. Differences of multiple orders of magnitude frequently occur between simulated and observed transmissivities where observed transmissivities are less than 1,000 feet squared per day. Transmissivity observations from individual aquifer tests can constrain model calibration as head and flow observations do. This approach is superior to diluting aquifer-test results into generalized ranges of hydraulic conductivities. Observed and simulated transmissivities can be compared directly with T-COMP, a suite of three FORTRAN programs. Transmissivity observations require that simulated hydraulic conductivities and thicknesses in the volume investigated by an aquifer test be extracted and integrated into a simulated transmissivity. Transmissivities of MODFLOW model cells are sampled within the volume affected by an aquifer test, as defined by a well-specific, radial-flow model of each aquifer test. Sampled transmissivities of model cells are averaged within a layer and summed across layers. Accuracy of the approach was tested with hypothetical, multiple-aquifer models where specified transmissivities ranged between 250 and 20,000 feet squared per day. More than 90 percent of simulated transmissivities were within a factor of 2 of the specified transmissivities.
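
    The average-within-layer, sum-across-layers rule in the last paragraph reduces to a short calculation. The sketch below uses invented conductivities and thicknesses for cells sampled around a hypothetical aquifer test; it illustrates the rule, it is not T-COMP's FORTRAN code.

      import numpy as np

      # Invented conductivities K (ft/d) and thicknesses b (ft) for the model
      # cells sampled inside the volume a hypothetical aquifer test investigated.
      K = [np.array([5.0, 8.0, 6.0]),     # layer 1 cells
           np.array([0.5, 0.7]),          # layer 2 cells
           np.array([20.0, 25.0, 22.0])]  # layer 3 cells
      b = [np.array([30.0, 30.0, 30.0]),
           np.array([10.0, 10.0]),
           np.array([50.0, 50.0, 50.0])]

      # Average cell transmissivity (K*b) within each layer, then sum layers.
      T = sum(float(np.mean(k * t)) for k, t in zip(K, b))
      print(f"simulated transmissivity ~ {T:.0f} ft^2/d")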

  18. Mars Exploration Rover Terminal Descent Mission Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Queen, Eric M.

    2004-01-01

    Because of NASA's increased reliance on simulation for successful interplanetary missions, the MER mission developed a detailed entry, descent, and landing (EDL) trajectory modeling and simulation capability. This paper summarizes how the MER EDL sequence of events is modeled, the verification of the methods used, and the inputs. The simulation is built upon a multibody parachute trajectory simulation tool, developed in POST II, that accurately simulates the trajectory of multiple vehicles in flight with interacting forces. In this model the parachute and the suspended bodies are treated as six-degree-of-freedom (6 DOF) bodies. The terminal descent phase of the mission consists of several EDL events, such as parachute deployment, heatshield separation, deployment of the lander from the backshell, deployment of the airbags, RAD firings, TIRS firings, etc. For an accurate, reliable simulation these events need to be modeled seamlessly and robustly so that the simulations remain numerically stable during Monte Carlo runs. This paper also summarizes how the events have been modeled, the numerical issues, and the modeling challenges.

  19. A new OLED SPICE model for pixel circuit simulation in OLED-on-silicon microdisplay design

    NASA Astrophysics Data System (ADS)

    Bohua, Zhao; Ran, Huang; Jianhui, Bu; Yinxue, Lü; Yiqi, Wang; Fei, Ma; Guohua, Xie; Zhensong, Zhang; Huan, Du; Jiajun, Luo; Zhengsheng, Han; Yi, Zhao

    2012-07-01

    A new equivalent circuit model of the organic light-emitting diode (OLED) is proposed. As the single-diode model is able to approximate OLED behavior as well as the multiple-diode model, the new model is built on it. To ensure that the experimental and simulated data are in good agreement, the constant resistor is exchanged for an exponential resistor in the new model. Compared with the measured data and the results of the other two OLED SPICE models, the simulated I-V characteristics of the new model match the measured data much better. This new model can be directly incorporated into a SPICE circuit simulator and presents good accuracy over the whole operating voltage range.

  20. An automation simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.; Mutammara, Atheel

    1988-01-01

    The work being done in porting ROBOSIM (a graphical simulation system developed jointly by NASA-MSFC and Vanderbilt University) to the HP350SRX graphics workstation is described. New ROBOSIM features, such as collision detection and new kinematics simulation methods, are also discussed. Based on the experience of the work on ROBOSIM, a new graphics structural modeling environment is suggested, intended to be part of a new knowledge-based multiple-aspect modeling testbed. The knowledge-based modeling methodologies and tools already available are described. Three case studies in the area of Space Station automation are also reported. First, a geometrical structural model of the station, developed using the ROBOSIM package, is presented. Next, the possible application areas of an integrated modeling environment in the testing of different Space Station operations are discussed. One of these possible application areas is the modeling of the Environmental Control and Life Support System (ECLSS), one of the most complex subsystems of the station. Using the multiple-aspect modeling methodology, a fault propagation model of this system is being built and is described.

  1. Design, Customization and Implementation of Energy Simulation with 5E Model in Elementary Classroom

    ERIC Educational Resources Information Center

    Lye, Sze Yee; Wee, Loo Kang; Kwek, Yao Chie; Abas, Suriati; Tay, Lee Yong

    2014-01-01

    Science simulations are popular among educators, as such simulations afford multiple visual representations and interactivity. Despite their popularity and abundance on the internet, our literature review suggested that little research has been conducted on the use of simulation in elementary school. Thus, an exploratory pilot case study was conducted…

  2. Multiple imputation for handling missing outcome data when estimating the relative risk.

    PubMed

    Sullivan, Thomas R; Lee, Katherine J; Ryan, Philip; Salter, Amy B

    2017-09-06

    Multiple imputation is a popular approach to handling missing data in medical research, yet little is known about its applicability for estimating the relative risk. Standard methods for imputing incomplete binary outcomes involve logistic regression or an assumption of multivariate normality, whereas relative risks are typically estimated using log binomial models. It is unclear whether misspecification of the imputation model in this setting could lead to biased parameter estimates. Using simulated data, we evaluated the performance of multiple imputation for handling missing data prior to estimating adjusted relative risks from a correctly specified multivariable log binomial model. We considered an arbitrary pattern of missing data in both outcome and exposure variables, with missing data induced under missing-at-random mechanisms. Focusing on standard model-based methods of multiple imputation, missing data were imputed using multivariate normal imputation or fully conditional specification with a logistic imputation model for the outcome. Multivariate normal imputation performed poorly in the simulation study, consistently producing estimates of the relative risk that were biased towards the null. Despite outperforming multivariate normal imputation, fully conditional specification also produced somewhat biased estimates, with greater bias observed for higher outcome prevalences and larger relative risks. Deleting imputed outcomes from the analysis datasets did not improve the performance of fully conditional specification. Both multivariate normal imputation and fully conditional specification produced biased estimates of the relative risk, presumably because both use a misspecified imputation model. Based on the simulation results, we recommend that researchers use fully conditional specification rather than multivariate normal imputation and retain imputed outcomes in the analysis when estimating relative risks. However, fully conditional specification is not without shortcomings, and further research is needed to identify optimal approaches for relative risk estimation within the multiple imputation framework.
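
    After the per-imputation log binomial fits, a standard way to combine the estimates (not spelled out in the abstract) is Rubin's rules applied on the log relative risk scale. The sketch below uses invented per-imputation estimates and a normal approximation for the interval rather than the t reference distribution.

      import numpy as np

      # Invented per-imputation results: log relative risk and its variance
      # from m = 5 completed-data log binomial fits.
      log_rr = np.array([0.42, 0.38, 0.45, 0.40, 0.44])
      var_in = np.array([0.012, 0.011, 0.013, 0.012, 0.012])

      m = len(log_rr)
      q_bar = log_rr.mean()             # pooled point estimate
      u_bar = var_in.mean()             # within-imputation variance
      b = log_rr.var(ddof=1)            # between-imputation variance
      t_var = u_bar + (1 + 1 / m) * b   # Rubin's total variance

      rr = np.exp(q_bar)
      ci = np.exp(q_bar + np.array([-1.0, 1.0]) * 1.96 * np.sqrt(t_var))
      print(f"pooled RR = {rr:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")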

  3. Transition model for ricin-aptamer interactions with multiple pathways and energy barriers

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Xu, Bingqian

    2014-02-01

    We develop a transition model to interpret single-molecule ricin-aptamer interactions with multiple unbinding pathways and energy barriers measured by atomic force microscopy dynamic force spectroscopy. Molecular simulations establish the relationship between binding conformations and the corresponding unbinding pathways. Each unbinding pathway follows a Bell-Evans multiple-barrier model. Markov-type transition matrices are developed to analyze the redistribution of unbinding events among the pathways under different loading rates. Our study provides detailed information about complex behaviors in ricin-aptamer unbinding events.
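
    As a toy illustration of the transition-matrix bookkeeping described above (the numbers are invented, not fit to the AFM data): a row-stochastic matrix redistributes the fractions of unbinding events among three pathways as the loading rate changes.

```python
# Toy sketch (illustrative numbers): redistribute unbinding-event fractions among
# pathways with a row-stochastic Markov-type transition matrix, as one might do
# when the loading rate changes.
import numpy as np

p_low = np.array([0.6, 0.3, 0.1])        # pathway occupancies at a low loading rate
T = np.array([[0.80, 0.15, 0.05],        # T[i, j]: fraction moving from pathway i to j
              [0.10, 0.85, 0.05],
              [0.05, 0.10, 0.85]])
assert np.allclose(T.sum(axis=1), 1.0)   # rows must be probability distributions

p_high = p_low @ T                       # occupancies after one loading-rate step
print(p_high, p_high.sum())              # still sums to 1
```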

  4. Simulation of multi-photon emission isotopes using time-resolved SimSET multiple photon history generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiang, Chih-Chieh; Lin, Hsin-Hon; Lin, Chang-Shiun

    Multiple-photon emitters, such as In-111 or Se-75, have enormous potential in the field of nuclear medicine imaging. For example, Se-75 can be used to investigate bile acid malabsorption and to measure bile acid pool loss. The simulation system for emission tomography (SimSET) is a well-known Monte Carlo simulation (MCS) code in nuclear medicine, valued for its high computational efficiency. However, the current SimSET cannot simulate these isotopes because it models neither complex decay schemes nor the time-dependent decay process. To extend the versatility of SimSET to such multi-photon emission isotopes, a time-resolved multiple photon history generator based on the SimSET code was developed in the present study. To equip the time-resolved SimSET (trSimSET) with a radionuclide decay process, the new MCS model introduces new features, including decay time information and photon time-of-flight information. The half-lives of energy states were tabulated from the Evaluated Nuclear Structure Data File (ENSDF) database. The MCS results indicate that the overall percent difference is less than 8.5% for all simulation trials as compared to GATE. In summary, we demonstrated that the time-resolved SimSET multiple photon history generator achieves accuracy comparable to GATE while retaining better computational efficiency. The new MCS code is very useful for studying multi-photon imaging with novel isotopes that requires simulation of state lifetimes and time-of-flight measurements. (authors)
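
    One ingredient such a time-resolved generator needs is a way to sample the emission delay of a photon from an intermediate nuclear state. A minimal sketch, assuming simple exponential decay with a made-up half-life (real values would come from the ENSDF tabulation mentioned above):

```python
# Sketch of the time-resolved ingredient described above: sample the emission delay
# of a photon from an intermediate state of half-life T_half via inverse-CDF
# sampling of the exponential distribution. The half-life here is illustrative,
# not an ENSDF value.
import numpy as np

rng = np.random.default_rng(10)

def sample_emission_delay(t_half_s, size):
    """t = -T_half / ln(2) * ln(U), U ~ Uniform(0, 1)."""
    return -t_half_s / np.log(2.0) * np.log(rng.random(size))

t_half = 85e-9                                      # illustrative 85 ns state
delays = sample_emission_delay(t_half, 1_000_000)
print("mean delay [ns]:", delays.mean() * 1e9)      # ~ T_half/ln 2 ~= 122.6 ns
```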

  5. Dynamic electrical impedance imaging with the interacting multiple model scheme.

    PubMed

    Kim, Kyung Youn; Kim, Bong Seok; Kim, Min Chan; Kim, Sin; Isaacson, David; Newell, Jonathan C

    2005-04-01

    In this paper, an effective dynamical EIT imaging scheme is presented for on-line monitoring of abruptly changing resistivity distributions inside an object, based on the interacting multiple model (IMM) algorithm. The inverse problem is treated as a stochastic nonlinear state estimation problem, with the time-varying resistivity (state) estimated on-line with the aid of the IMM algorithm. In the design of the IMM algorithm, multiple models with different process noise covariances are incorporated to reduce the modeling uncertainty. Simulations and phantom experiments are provided to illustrate the proposed algorithm.
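
    For orientation, here is a generic IMM cycle on a scalar toy system, not the authors' EIT implementation: two Kalman filters that differ only in process-noise covariance are mixed through a Markov transition matrix, and their mode probabilities are updated from the measurement likelihoods, which is exactly the mechanism the abstract describes for tracking abrupt resistivity changes.

```python
# Generic IMM cycle on a toy linear system (not the authors' EIT code): two Kalman
# filters that differ only in process-noise covariance Q, combined via mixing and
# likelihood-weighted mode probabilities.
import numpy as np

rng = np.random.default_rng(1)
F, H = np.array([[1.0]]), np.array([[1.0]])
Qs = [np.array([[1e-4]]), np.array([[1e-1]])]   # "quiet" vs "abrupt-change" models
R = np.array([[1e-2]])
PI = np.array([[0.95, 0.05], [0.05, 0.95]])     # Markov mode-transition matrix

mu = np.array([0.5, 0.5])                       # mode probabilities
xs = [np.zeros(1), np.zeros(1)]
Ps = [np.eye(1), np.eye(1)]

def kf_step(x, P, z, Q):
    x, P = F @ x, F @ P @ F.T + Q               # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    r = z - H @ x
    lik = np.exp(-0.5 * r @ np.linalg.solve(S, r)) / np.sqrt(np.linalg.det(2 * np.pi * S))
    return x + K @ r, (np.eye(1) - K @ H) @ P, float(lik)

truth = np.concatenate([np.zeros(50), np.ones(50)])          # abrupt jump at k = 50
for z_k in truth + 0.1 * rng.normal(size=100):
    c = PI.T @ mu                                            # normalizers
    w = PI * mu[:, None] / c[None, :]                        # mixing probabilities
    xm = [sum(w[i, j] * xs[i] for i in range(2)) for j in range(2)]
    Pm = [sum(w[i, j] * (Ps[i] + np.outer(xs[i] - xm[j], xs[i] - xm[j]))
              for i in range(2)) for j in range(2)]
    out = [kf_step(xm[j], Pm[j], np.array([z_k]), Qs[j]) for j in range(2)]
    xs, Ps = [o[0] for o in out], [o[1] for o in out]
    mu = np.array([o[2] for o in out]) * c                   # mode-probability update
    mu /= mu.sum()

x_comb = sum(mu[j] * xs[j] for j in range(2))                # combined estimate
print("final mode probabilities:", mu, "estimate:", x_comb)
```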

  6. Mercury and methylmercury stream concentrations in a Coastal Plain watershed: A multi-scale simulation analysis

    USGS Publications Warehouse

    Knightes, Christopher D.; Golden, Heather E.; Journey, Celeste A.; Davis, Gary M.; Conrads, Paul; Marvin-DiPasquale, Mark; Brigham, Mark E.; Bradley, Paul M.

    2014-01-01

    Mercury is a ubiquitous global environmental toxicant responsible for most US fish advisories. Processes governing mercury concentrations in rivers and streams are not well understood, particularly at multiple spatial scales. We investigate how insights gained from reach-scale mercury data and model simulations can be applied at broader watershed scales using a spatially and temporally explicit watershed hydrology and biogeochemical cycling model, VELMA. We simulate fate and transport using reach-scale (0.1 km²) study data and evaluate applications to multiple watershed scales. Reach-scale VELMA parameterization was applied to two nested sub-watersheds (28 km² and 25 km²) and the encompassing watershed (79 km²). Results demonstrate that simulated flow and total mercury concentrations compare reasonably to observations at different scales, but simulated methylmercury concentrations are out-of-phase with observations. These findings suggest that intricacies of methylmercury biogeochemical cycling and transport are under-represented in VELMA and underscore the complexity of simulating mercury fate and transport.

  7. Multiple-Relaxation-Time Lattice Boltzmann Models in 3D

    NASA Technical Reports Server (NTRS)

    d'Humieres, Dominique; Ginzburg, Irina; Krafczyk, Manfred; Lallemand, Pierre; Luo, Li-Shi; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    This article provides a concise exposition of the multiple-relaxation-time lattice Boltzmann equation, with examples of fifteen-velocity and nineteen-velocity models in three dimensions. Simulation of a diagonally lid-driven cavity flow in three dimensions at Re=500 and 2000 is performed. The results clearly demonstrate the superior numerical stability of the multiple-relaxation-time lattice Boltzmann equation over the popular lattice Bhatnagar-Gross-Krook equation.
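
    The MRT collision step itself is compact; a generic sketch follows, with the moment-transform matrix M and the equilibrium moments left as placeholder inputs because the paper's fifteen- and nineteen-velocity matrices are lengthy tabulations. The sanity check at the end confirms that a single shared relaxation rate collapses MRT to the BGK operator, which is the relationship the abstract's stability comparison exploits.

```python
# Generic multiple-relaxation-time (MRT) collision step: relax each moment toward
# its equilibrium at its own rate, then map back to distribution space. M and
# m_eq_fn are placeholders; the paper's D3Q15/D3Q19 matrices are not reproduced.
import numpy as np

def mrt_collide(f, M, M_inv, s, m_eq_fn):
    """f: (Q,) distributions at one node; s: (Q,) relaxation rates (diag of S)."""
    m = M @ f                        # distributions -> moments
    m_eq = m_eq_fn(m)                # equilibrium moments from conserved moments
    m_post = m - s * (m - m_eq)      # relax each moment at its own rate
    return M_inv @ m_post            # moments -> distributions

# Sanity check: with a single rate s_i = 1/tau for every moment (and a trivial
# transform), MRT reduces to the lattice BGK operator.
Q, tau = 9, 0.8
M = np.eye(Q); M_inv = np.eye(Q)
s = np.full(Q, 1.0 / tau)
f = np.random.default_rng(2).random(Q)
f_eq = np.full(Q, f.mean())          # stand-in equilibrium for the check only
bgk = f - (f - f_eq) / tau
mrt = mrt_collide(f, M, M_inv, s, lambda m: f_eq)
print(np.allclose(bgk, mrt))         # True
```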

  8. Scatter characterization and correction for simultaneous multiple small-animal PET imaging.

    PubMed

    Prasad, Rameshwar; Zaidi, Habib

    2014-04-01

    The rapid growth and usage of small-animal positron emission tomography (PET) in molecular imaging research has led to increased demand on PET scanner time. One potential solution to increase throughput is to scan multiple rodents simultaneously. However, this is achieved at the expense of deterioration of image quality and loss of quantitative accuracy owing to enhanced effects of photon attenuation and Compton scattering. The purpose of this work is, first, to characterize the magnitude and spatial distribution of the scatter component in small-animal PET imaging when scanning single and multiple rodents simultaneously and, second, to assess the relevance and evaluate the performance of scatter correction under similar conditions. The LabPET™-8 scanner was modelled as realistically as possible using the Geant4 Application for Tomographic Emission (GATE) Monte Carlo simulation platform. Monte Carlo simulations allow the separation of unscattered and scattered coincidences and as such enable detailed assessment of the scatter component and its origin. Simple shape-based and more realistic voxel-based phantoms were used to simulate single and multiple PET imaging studies. The modelled scatter component using the single-scatter simulation technique was compared to Monte Carlo simulation results. PET images were also corrected for attenuation, and the combined effect of attenuation and scatter on single and multiple small-animal PET imaging was evaluated in terms of image quality and quantitative accuracy. A good agreement was observed between calculated and Monte Carlo simulated scatter profiles for single- and multiple-subject imaging. In the LabPET™-8 scanner, the detector covering material (kovar) contributed the maximum amount of scatter events while the scatter contribution due to lead shielding is negligible. The out-of-field-of-view (FOV) scatter fraction (SF) is 1.70, 0.76, and 0.11% for lower energy thresholds of 250, 350, and 400 keV, respectively. The increase in SF ranged between 25 and 64% when imaging multiple subjects (three to five) of different size simultaneously in comparison to imaging a single subject. The spill-over ratio (SOR) increases with increasing the number of subjects in the FOV. Scatter correction improved the SOR for both water and air cold compartments of single and multiple imaging studies. The recovery coefficients for different body parts of the mouse whole-body and rat whole-body anatomical models were improved for multiple imaging studies following scatter correction. The magnitude and spatial distribution of the scatter component in small-animal PET imaging of single and multiple subjects simultaneously were characterized, and its impact was evaluated in different situations. Scatter correction improves PET image quality and quantitative accuracy for single rat and simultaneous multiple mice and rat imaging studies, whereas its impact is insignificant in single mouse imaging.

  9. Visualization of the Eastern Renewable Generation Integration Study: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny; Novacheck, Joshua; Bloom, Aaron

    The Eastern Renewable Generation Integration Study (ERGIS) explores the operational impacts of the widespread adoption of wind and solar photovoltaic (PV) resources in the U.S. Eastern Interconnection and Quebec Interconnection (collectively, EI). In order to understand some of the economic and reliability challenges of managing hundreds of gigawatts of wind and PV generation, we developed state-of-the-art tools, data, and models for simulating power system operations using hourly unit commitment and 5-minute economic dispatch over an entire year. Using NREL's high-performance computing capabilities and new methodologies to model operations, we found that the EI, as simulated with evolutionary change in 2026, could balance the variability and uncertainty of wind and PV at a 5-minute level under a variety of conditions. A large-scale display and a combination of multiple coordinated views and small multiples were used to visually analyze the four large, highly multivariate scenarios with high spatial and temporal resolutions.

  10. An iterative fullwave simulation approach to multiple scattering in media with randomly distributed microbubbles

    NASA Astrophysics Data System (ADS)

    Joshi, Aditya; Lindsey, Brooks D.; Dayton, Paul A.; Pinton, Gianmarco; Muller, Marie

    2017-05-01

    Ultrasound contrast agents (UCA), such as microbubbles, enhance the scattering properties of blood, which is otherwise hypoechoic. The multiple scattering interactions of the acoustic field with UCA are poorly understood due to the complexity of the multiple scattering theories and the nonlinear microbubble response. The majority of bubble models describe the behavior of UCA as single, isolated microbubbles suspended in infinite medium. Multiple scattering models such as the independent scattering approximation can approximate phase velocity and attenuation for low scatterer volume fractions. However, all current models and simulation approaches only describe multiple scattering and nonlinear bubble dynamics separately. Here we present an approach that combines two existing models: (1) a full-wave model that describes nonlinear propagation and scattering interactions in a heterogeneous attenuating medium and (2) a Paul-Sarkar model that describes the nonlinear interactions between an acoustic field and microbubbles. These two models were solved numerically and combined with an iterative approach. The convergence of this combined model was explored in silico for 0.5 × 10⁶ microbubbles ml⁻¹, and 1% and 2% bubble concentrations by volume. The backscattering predicted by our modeling approach was verified experimentally with water tank measurements performed with a 128-element linear array transducer. An excellent agreement in terms of the fundamental and harmonic acoustic fields is shown. Additionally, our model correctly predicts the phase velocity and attenuation measured using through-transmission and predicted by the independent scattering approximation.
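
    The iterative coupling can be pictured as a fixed-point loop that alternates a field solve (given the current bubble sources) with a bubble-dynamics solve (given the current field) until the field stops changing. The sketch below uses hypothetical placeholder solvers, not the authors' full-wave or Paul-Sarkar implementations:

```python
# Schematic of the iterative coupling (solvers are hypothetical placeholders):
# alternate field and bubble solves until the acoustic field converges.
import numpy as np

def solve_field(bubble_sources):
    """Placeholder for the nonlinear full-wave propagation/scattering solve."""
    return 1.0 + 0.5 * bubble_sources          # toy contraction for illustration

def solve_bubbles(field):
    """Placeholder for the nonlinear bubble-dynamics (e.g. Paul-Sarkar) solve."""
    return 0.4 * field

field = solve_field(np.zeros(1))               # iteration 0: no bubble scattering
for it in range(50):
    sources = solve_bubbles(field)             # bubbles driven by the current field
    new_field = solve_field(sources)           # field re-solved with bubble sources
    if np.max(np.abs(new_field - field)) < 1e-8:
        break
    field = new_field
print(f"converged after {it} iterations, field = {field}")
```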

  11. Road simulation for four-wheel vehicle whole input power spectral density

    NASA Astrophysics Data System (ADS)

    Wang, Jiangbo; Qiang, Baomin

    2017-05-01

    The vibration of a running vehicle comes mainly from the road and influences vehicle ride performance, so simulating the road roughness power spectral density is of great significance for analyzing automobile suspension vibration system parameters and evaluating ride comfort. Firstly, based on the mathematical model of road roughness power spectral density, this paper establishes the integrated-white-noise method for generating random road profiles. Then, in the MATLAB/Simulink environment, following the usual progression of suspension research from a simple two-degree-of-freedom single-wheel vehicle model to a complex multiple-degree-of-freedom vehicle model, a simple single-excitation input simulation model is built. Finally, a spectrum matrix is used to build a whole-vehicle excitation input simulation model. This simulation method rests on reliable and accurate mathematical theory and can be applied to random road simulation for any specified spectrum, providing a pavement excitation model and a foundation for vehicle ride performance research and vibration simulation.
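
    A minimal sketch of the integrated-/filtered-white-noise road model this abstract builds on, commonly written as q̇(t) = -2πf₀v·q(t) + 2πn₀√(G_q(n₀)v)·w(t); all parameter values below are illustrative, and a whole-vehicle model would add time-delayed copies for the rear wheels and correlated left/right tracks.

```python
# Filtered-white-noise road profile, q'(t) = -2*pi*f0*v*q + 2*pi*n0*sqrt(Gq*v)*w(t),
# discretized with Euler-Maruyama. Parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(3)
v = 20.0                 # vehicle speed, m/s
f0 = 0.1                 # lower cutoff frequency, Hz (illustrative)
n0 = 0.1                 # reference spatial frequency, 1/m
Gq = 256e-6              # roughness coefficient Gq(n0), m^3 (roughly a class C road)
dt = 1e-3
N = 20000

q = np.zeros(N)          # road elevation input under one wheel
for k in range(N - 1):
    w = rng.normal() / np.sqrt(dt)          # unit-intensity white-noise sample
    dq = -2 * np.pi * f0 * v * q[k] + 2 * np.pi * n0 * np.sqrt(Gq * v) * w
    q[k + 1] = q[k] + dq * dt

print("road input std [m]:", q.std())
```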

  12. A High-Rate, Single-Crystal Model for Cyclotrimethylene Trinitramine including Phase Transformations and Plastic Slip

    DOE PAGES

    Addessio, Francis L.; Luscher, Darby Jon; Cawkwell, Marc Jon; ...

    2017-05-14

    A continuum model for the high-rate, thermo-mechanical deformation of single-crystal cyclotrimethylene trinitramine (RDX) is developed. The model includes the effects of anisotropy, large deformations, nonlinear thermo-elasticity, phase transformations, and plastic slip. A multiplicative decomposition of the deformation gradient is used. The volumetric elastic component of the deformation is accounted for through a free-energy based equation of state for the low- (α) and high-pressure (γ) polymorphs of RDX. Crystal plasticity is addressed using a phenomenological thermal activation model. The deformation gradient for the phase transformation is based on an approach that has been applied to martensitic transformations. Simulations were conducted and compared to high-rate, impact loading of oriented RDX single crystals. The simulations considered multiple orientations of the crystal relative to the direction of shock loading and multiple sample thicknesses. Thirteen slip systems, which were inferred from indentation and x-ray topography, were used to model the α-polymorph. It is shown that by increasing the number of slip systems from the previously considered number of six (6) to thirteen (13) in the α-polymorph, better comparisons with data may be obtained. Simulations of impact conditions in the vicinity of the α- to γ-polymorph transformation (3.8 GPa) are considered. Eleven of the simulations, which were at pressures below the transformation value (3.0 GPa), were compared to experimental data. Comparison of the model was also made with available data for one experiment above the transformation pressure (4.4 GPa). Also, simulations are provided for a nominal pressure of 7.5 GPa to demonstrate the effect of the transformation kinetics on the deformation of a high-rate plate impact problem.
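
    Schematically, the kinematics named in the abstract can be written as follows (notation assumed; factor ordering and symbols vary by author, and the paper's exact forms are not reproduced here):

```latex
% Schematic kinematics for a crystal-plasticity model with phase transformation:
% the deformation gradient splits multiplicatively into elastic, transformation,
% and plastic parts, and the plastic velocity gradient sums shear rates over the
% 13 slip systems.
\begin{equation}
  \mathbf{F} = \mathbf{F}^{e}\,\mathbf{F}^{t}\,\mathbf{F}^{p},
  \qquad
  \dot{\mathbf{F}}^{p}\,(\mathbf{F}^{p})^{-1}
    = \sum_{\alpha=1}^{13} \dot{\gamma}^{\alpha}\,
      \mathbf{s}^{\alpha}\otimes\mathbf{m}^{\alpha},
\end{equation}
% where s^alpha and m^alpha are the slip direction and slip-plane normal of
% system alpha, and the shear rates follow the thermal-activation law mentioned
% in the abstract.
```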

  13. A High-Rate, Single-Crystal Model for Cyclotrimethylene Trinitramine including Phase Transformations and Plastic Slip

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Addessio, Francis L.; Luscher, Darby Jon; Cawkwell, Marc Jon

    A continuum model for the high-rate, thermo-mechanical deformation of single-crystal cyclotrimethylene trinitramine (RDX) is developed. The model includes the effects of anisotropy, large deformations, nonlinear thermo-elasticity, phase transformations, and plastic slip. A multiplicative decomposition of the deformation gradient is used. The volumetric elastic component of the deformation is accounted for through a free-energy based equation of state for the low- (α) and high-pressure (γ) polymorphs of RDX. Crystal plasticity is addressed using a phenomenological thermal activation model. The deformation gradient for the phase transformation is based on an approach that has been applied to martensitic transformations. Simulations were conducted and compared to high-rate, impact loading of oriented RDX single crystals. The simulations considered multiple orientations of the crystal relative to the direction of shock loading and multiple sample thicknesses. Thirteen slip systems, which were inferred from indentation and x-ray topography, were used to model the α-polymorph. It is shown that by increasing the number of slip systems from the previously considered number of six (6) to thirteen (13) in the α-polymorph, better comparisons with data may be obtained. Simulations of impact conditions in the vicinity of the α- to γ-polymorph transformation (3.8 GPa) are considered. Eleven of the simulations, which were at pressures below the transformation value (3.0 GPa), were compared to experimental data. Comparison of the model was also made with available data for one experiment above the transformation pressure (4.4 GPa). Also, simulations are provided for a nominal pressure of 7.5 GPa to demonstrate the effect of the transformation kinetics on the deformation of a high-rate plate impact problem.

  14. An Improved Interacting Multiple Model Filtering Algorithm Based on the Cubature Kalman Filter for Maneuvering Target Tracking.

    PubMed

    Zhu, Wei; Wang, Wei; Yuan, Gannan

    2016-06-01

    In order to improve the tracking accuracy, model estimation accuracy, and response speed of multiple-model maneuvering target tracking, the interacting multiple model five-degree cubature Kalman filter (IMM5CKF) is proposed in this paper. In the proposed algorithm, the interacting multiple model (IMM) algorithm processes all the models through a Markov chain to simultaneously enhance the model tracking accuracy of target tracking. A five-degree cubature Kalman filter (5CKF) then evaluates the surface integral with a higher-degree but deterministic odd-ordered spherical cubature rule to improve the tracking accuracy and the model-switch sensitivity of the IMM algorithm. Finally, the simulation results demonstrate that the proposed algorithm exhibits quick and smooth switching when handling different maneuver models, and that it also performs better than the interacting multiple model cubature Kalman filter (IMMCKF), the interacting multiple model unscented Kalman filter (IMMUKF), the 5CKF, and the optimal mode transition matrix IMM (OMTM-IMM).
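
    For orientation, the spherical-radial cubature idea underlying the CKF family: the standard third-degree rule uses 2n equally weighted points at ±√n along the principal axes of the covariance. The paper's 5CKF uses a higher, fifth-degree rule with more points, which is not shown here.

```python
# Third-degree spherical-radial cubature points (the basis of the standard CKF;
# the paper's 5CKF replaces this with a fifth-degree rule).
import numpy as np

def cubature_points(x, P):
    """2n equally weighted points for mean x (n,) and covariance P (n, n)."""
    n = x.size
    L = np.linalg.cholesky(P)
    xi = np.sqrt(n) * np.concatenate([np.eye(n), -np.eye(n)])   # (2n, n)
    return x + xi @ L.T                                         # each row a point

x = np.array([1.0, 2.0])
P = np.array([[2.0, 0.3], [0.3, 1.0]])
pts = cubature_points(x, P)
print(pts.mean(axis=0))                  # recovers the mean exactly
print(np.cov(pts.T, bias=True))          # recovers P (third-degree exactness)
```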

  15. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.

  16. Spacecraft Multiple Array Communication System Performance Analysis

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Desilva, Kanishka; Sham, Catherine C.

    2010-01-01

    The Communication Systems Simulation Laboratory (CSSL) at the NASA Johnson Space Center is tasked to perform spacecraft and ground network communication system simulations, design validation, and performance verification. The CSSL has developed simulation tools that model spacecraft communication systems and the space and ground environment in which the tools operate. In this paper, a spacecraft communication system with multiple arrays is simulated. A multiple-array combining technique is used to increase the radio-frequency coverage and data rate performance. The technique achieves phase coherence among the phased arrays so that the signals combine constructively at the targeted receiver. There are many technical challenges in integrating a high-transmit-power communication system on a spacecraft. The array combining technique can improve the communication system's data rate and coverage performance without increasing its transmit power requirements. Example simulation results indicate that significant performance improvement can be achieved with phase-coherence implementation.
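
    A toy numeric illustration of why coherent combining helps (not NASA's simulation): phase-aligning K array signals before summation grows the signal amplitude by K while uncorrelated noise power grows only by K, for an SNR gain of roughly 10·log₁₀(K) dB.

```python
# Coherent combining of K array signals: align the per-array phase offsets, sum,
# and compare SNR against a single array. Expected gain ~ 10*log10(K) dB.
import numpy as np

rng = np.random.default_rng(4)
K, N = 4, 100_000
t = np.arange(N)
carrier = np.exp(1j * 2 * np.pi * 0.01 * t)       # known reference waveform

phases = rng.uniform(0, 2 * np.pi, K)             # unknown per-array phase offsets
rx = [carrier * np.exp(1j * ph)
      + 0.5 * (rng.normal(size=N) + 1j * rng.normal(size=N))
      for ph in phases]

def snr_db(sig):
    p_total = np.mean(np.abs(sig) ** 2)
    p_sig = np.abs(np.mean(sig * np.conj(carrier))) ** 2   # matched to the carrier
    return 10 * np.log10(p_sig / (p_total - p_sig))

# Estimate each array's phase against the reference and align before summing.
aligned = sum(r * np.exp(-1j * np.angle(np.mean(r * np.conj(carrier)))) for r in rx)
print("single array SNR [dB]:", snr_db(rx[0]))
print("combined SNR    [dB]:", snr_db(aligned))   # ~ 10*log10(K) dB higher
```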

  17. Real-time inextensible surgical thread simulation.

    PubMed

    Xu, Lang; Liu, Qian

    2018-03-27

    This paper discusses a real-time simulation method for inextensible surgical thread based on the Cosserat rod theory using position-based dynamics (PBD). The method realizes stable twining and knotting of surgical thread while including inextensibility, bending, twisting and coupling effects. The Cosserat rod theory is used to model the nonlinear elastic behavior of surgical thread. The surgical thread model is solved with PBD to achieve a real-time, extremely stable simulation. Owing to the one-dimensional linear structure of surgical thread, a direct solution of the distance constraints based on the tridiagonal matrix algorithm is used to enhance stretching resistance in every constraint-projection iteration. In addition, continuous collision detection and collision response guarantee a large time step and high performance. Furthermore, friction is integrated into the constraint projection process to stabilize the twining of multiple threads and complex contact situations. Through comparisons with existing methods, the surgical thread maintains constant length under large deformation after applying the direct distance constraint in our method. The twining and knotting of multiple threads correspond to stable solutions of contact and friction forces. A surgical suture scene is also modeled to demonstrate the practicality and simplicity of our method. Our method achieves stable and fast simulation of inextensible surgical thread. Benefiting from the unified particle framework, rigid bodies, elastic rods, and soft bodies can be simulated simultaneously. The method is appropriate for applications in virtual surgery that require multiple dynamic bodies.
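
    For context, the distance constraint in question is the standard PBD one; below is the usual Gauss-Seidel projection for a particle chain. The paper replaces exactly this iterative loop with a direct tridiagonal (Thomas-algorithm) solve, which is what buys the stretch resistance, but the constraint being enforced is the same.

```python
# Standard iterative PBD distance-constraint projection for a particle chain (the
# paper's direct tridiagonal solve enforces the same constraint in one pass).
import numpy as np

def project_distance(p, inv_mass, rest_len, iters=20):
    """p: (n, 3) positions; inv_mass: (n,); rest_len: target segment length."""
    for _ in range(iters):
        for i in range(len(p) - 1):
            d = p[i + 1] - p[i]
            dist = np.linalg.norm(d)
            if dist < 1e-12:
                continue
            c = dist - rest_len                      # constraint violation
            wsum = inv_mass[i] + inv_mass[i + 1]
            if wsum == 0.0:
                continue
            corr = (c / wsum) * (d / dist)
            p[i]     += inv_mass[i]     * corr       # move endpoints toward
            p[i + 1] -= inv_mass[i + 1] * corr       # satisfying |d| = rest_len
    return p

n = 10
p = np.cumsum(np.full((n, 3), [1.3, 0.0, 0.0]), axis=0)   # over-stretched chain
inv_mass = np.ones(n); inv_mass[0] = 0.0                  # pin the first particle
p = project_distance(p, inv_mass, rest_len=1.0)
print(np.linalg.norm(np.diff(p, axis=0), axis=1))         # segment lengths -> ~1.0
```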

  18. State estimation of stochastic non-linear hybrid dynamic system using an interacting multiple model algorithm.

    PubMed

    Elenchezhiyan, M; Prakash, J

    2015-09-01

    In this work, state estimation schemes for non-linear hybrid dynamic systems subjected to stochastic state disturbances and random errors in measurements are formulated using interacting multiple-model (IMM) algorithms. In order to compute both the discrete modes and the continuous state estimates of a hybrid dynamic system, either an IMM extended Kalman filter (IMM-EKF) or an IMM-based derivative-free Kalman filter is proposed in this study. The efficacy of the proposed IMM-based state estimation schemes is demonstrated by conducting Monte-Carlo simulation studies on a two-tank hybrid system and a switched non-isothermal continuous stirred tank reactor system. Extensive simulation studies reveal that the proposed IMM-based state estimation schemes are able to generate fairly accurate continuous state estimates and discrete modes. In both the presence and absence of sensor bias, the simulation studies reveal that the proposed IMM unscented Kalman filter (IMM-UKF) based simultaneous state and parameter estimation scheme outperforms the multiple-model UKF (MM-UKF) based simultaneous state and parameter estimation scheme. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  19. Coastal aquifer management under parameter uncertainty: Ensemble surrogate modeling based simulation-optimization

    NASA Astrophysics Data System (ADS)

    Janardhanan, S.; Datta, B.

    2011-12-01

    Surrogate models are widely used to develop computationally efficient simulation-optimization models to solve complex groundwater management problems. Artificial intelligence based models are most often used for this purpose, trained using predictor-predictand data obtained from a numerical simulation model. Most often this is implemented with the assumption that the parameters and boundary conditions used in the numerical simulation model are perfectly known. However, in most practical situations these values are uncertain, which limits the application of such approximation surrogates. In our study we develop a surrogate-model based coupled simulation-optimization methodology for determining optimal pumping strategies for coastal aquifers considering parameter uncertainty. An ensemble surrogate modeling approach is used along with multiple-realization optimization. The methodology is used to solve a multi-objective coastal aquifer management problem considering two conflicting objectives. Hydraulic conductivity and aquifer recharge are treated as uncertain values. The three-dimensional coupled flow and transport simulation model FEMWATER is used to simulate the aquifer responses for a number of scenarios corresponding to Latin hypercube samples of pumping and uncertain parameters, generating input-output patterns for training the surrogate models. Non-parametric bootstrap sampling of this original data set is used to generate multiple data sets which belong to different regions in the multi-dimensional decision and parameter space. These data sets are used to train and test multiple surrogate models based on genetic programming. The ensemble of surrogate models is then linked to a multi-objective genetic algorithm to solve the pumping optimization problem. Two conflicting objectives are considered: maximizing total pumping from beneficial wells and minimizing total pumping from barrier wells used for hydraulic control of saltwater intrusion. The salinity levels resulting at strategic locations due to this pumping are predicted using the ensemble surrogates and are constrained to be within pre-specified levels. Different realizations of the concentration values are obtained from the ensemble predictions corresponding to each candidate pumping solution. A reliability concept is incorporated as the percentage of the surrogate models that satisfy the imposed constraints. The methodology was applied to a realistic coastal aquifer system in the Burdekin delta area in Australia. It was found that all optimal solutions corresponding to a reliability level of 0.99 satisfy all the constraints, and that as the reliability level is reduced, constraint violations increase. Thus ensemble surrogate model based simulation-optimization was found to be useful in deriving multi-objective optimal pumping strategies for coastal aquifers under parameter uncertainty.
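
    The reliability measure described above reduces to a simple count; here is a sketch with toy stand-in surrogates (the real ones are genetic-programming models trained on bootstrap resamples of FEMWATER runs):

```python
# Sketch of the ensemble-reliability measure: for a candidate pumping solution,
# reliability = fraction of the surrogate ensemble whose predicted salinity
# satisfies the imposed limit. The surrogates below are toy stand-ins.
import numpy as np

rng = np.random.default_rng(5)
n_surrogates = 50
# Each "surrogate" maps pumping -> salinity with its own bias, standing in for a
# genetic-programming model trained on a bootstrap resample.
biases = rng.normal(0.0, 0.05, n_surrogates)
ensemble = [lambda q, b=b: 0.8 * q + b for b in biases]

salinity_limit = 0.9

def reliability(q):
    preds = np.array([s(q) for s in ensemble])
    return np.mean(preds <= salinity_limit)

for q in (1.0, 1.05, 1.1):
    print(f"pumping {q}: reliability {reliability(q):.2f}")  # drops as q grows
```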

  20. Pairwise Force SPH Model for Real-Time Multi-Interaction Applications.

    PubMed

    Yang, Tao; Martin, Ralph R; Lin, Ming C; Chang, Jian; Hu, Shi-Min

    2017-10-01

    In this paper, we present a novel pairwise-force smoothed particle hydrodynamics (PF-SPH) model to enable simulation of various interactions at interfaces in real time. Realistic capture of interactions at interfaces is a challenging problem for SPH-based simulations, especially for scenarios involving multiple interactions at different interfaces. Our PF-SPH model can readily handle multiple types of interactions simultaneously in a single simulation; its basis is to use a larger support radius than that used in standard SPH. We adopt a novel anisotropic filtering term to further improve the performance of interaction forces. The proposed model is stable; furthermore, it avoids the particle clustering problem which commonly occurs at the free surface. We show how our model can be used to capture various interactions. We also consider the close connection between droplets and bubbles, and show how to animate bubbles rising in liquid as well as bubbles in air. Our method is versatile, physically plausible and easy-to-implement. Examples are provided to demonstrate the capabilities and effectiveness of our approach.
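
    For intuition, a common pairwise interparticle force in PF-SPH-type models is a single cosine profile, repulsive at short range and attractive at mid range, evaluated over a support radius larger than the standard SPH kernel's. The strength and radius below are illustrative, and the paper's anisotropic filtering term is not shown.

```python
# Cosine-form pairwise force often used in pairwise-force SPH:
# f(r) = s * cos(3*pi*r / (2*h)) for r < h, zero otherwise.
import numpy as np

def pairwise_force(r, s=1.0, h=1.0):
    """Scalar pair-force magnitude along the pair axis (positive = repulsive)."""
    return np.where(r < h, s * np.cos(3.0 * np.pi * r / (2.0 * h)), 0.0)

for r in (0.1, 0.3, 0.6, 0.9, 1.1):
    print(f"r/h = {r:.1f}: f = {pairwise_force(np.array(r)):+.3f}")
# Repulsive core (f > 0) below r = h/3, attractive tail (f < 0) out to r = h,
# zero beyond; tuning the strength s per fluid pair is what lets one simulation
# carry several different interface behaviours at once.
```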

  1. The SCEC Unified Community Velocity Model (UCVM) Software Framework for Distributing and Querying Seismic Velocity Models

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.

    2017-12-01

    Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications. In this poster, we summarize the key components of the UCVM framework and describe the impact it has had in various computational geoscientific applications.

  2. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - AMS Testbed Detailed Requirements

    DOT National Transportation Integrated Search

    2016-04-20

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  3. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - Chicago testbed analysis plan.

    DOT National Transportation Integrated Search

    2016-10-01

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  4. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - evaluation plan : draft report.

    DOT National Transportation Integrated Search

    2016-07-13

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  5. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — evaluation summary for DMA program.

    DOT National Transportation Integrated Search

    2017-07-04

    The primary objective of this project is to develop multiple simulation testbeds/transportation models to evaluate the impacts of Dynamic Mobility Application (DMA) connected vehicle applications and Active Transportation and Demand management (ATDM)...

  6. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - San Diego calibration report.

    DOT National Transportation Integrated Search

    2016-10-01

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  7. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs : summary report for the Chicago testbed.

    DOT National Transportation Integrated Search

    2017-04-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  8. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs : Evaluation Report for the Chicago Testbed

    DOT National Transportation Integrated Search

    2017-04-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  9. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — Chicago calibration report.

    DOT National Transportation Integrated Search

    2016-10-01

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  10. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - Pasadena testbed analysis plan : final report.

    DOT National Transportation Integrated Search

    2016-06-30

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  11. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — evaluation report for ATDM program.

    DOT National Transportation Integrated Search

    2017-07-16

    The primary objective of this project is to develop multiple simulation testbeds/transportation models to evaluate the impacts of Dynamic Mobility Applications (DMA) and the Active Transportation and Demand Management (ATDM) strategies. Specifically,...

  12. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - San Diego testbed analysis plan.

    DOT National Transportation Integrated Search

    2016-10-01

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  13. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — evaluation summary for ATDM program.

    DOT National Transportation Integrated Search

    2017-07-04

    The primary objective of this project is to develop multiple simulation testbeds/transportation models to evaluate the impacts of Dynamic Mobility Application (DMA) connected vehicle applications and Active Transportation and Dynamic management (ATDM...

  14. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - AMS Testbed Selection Criteria

    DOT National Transportation Integrated Search

    2016-06-16

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  15. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs : Dallas testbed analysis plan.

    DOT National Transportation Integrated Search

    2016-06-16

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (mo...

  16. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - Pasadena calibration report : draft report.

    DOT National Transportation Integrated Search

    2017-03-01

    The primary objective of this project is to develop multiple simulation testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active transportation and demand management (ATDM) strategies. The primary pu...

  17. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — evaluation report for DMA program.

    DOT National Transportation Integrated Search

    2017-02-02

    The primary objective of this project is to develop multiple simulation testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  18. A Bayesian Poisson-lognormal Model for Count Data for Multiple-Trait Multiple-Environment Genomic-Enabled Prediction.

    PubMed

    Montesinos-López, Osval A; Montesinos-López, Abelardo; Crossa, José; Toledo, Fernando H; Montesinos-López, José C; Singh, Pawan; Juliana, Philomin; Salinas-Ruiz, Josafhat

    2017-05-05

    When a plant scientist wishes to make genomic-enabled predictions of multiple traits measured in multiple individuals in multiple environments, the most common strategy is to analyze a single trait at a time, taking into account genotype × environment interaction (G × E), because comprehensive models that simultaneously accommodate correlated count traits and G × E have been lacking. For this reason, in this study we propose a multiple-trait and multiple-environment model for count data. The proposed model was developed under the Bayesian paradigm, for which we developed a Markov chain Monte Carlo (MCMC) sampler with noninformative priors. This allows obtaining all required full conditional distributions of the parameters, leading to an exact Gibbs sampler for the posterior distribution. Our model was tested with simulated data and a real data set. Results show that the proposed multi-trait, multi-environment model is an attractive alternative for modeling multiple count traits measured in multiple environments. Copyright © 2017 Montesinos-López et al.
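
    A generative sketch of the Poisson-lognormal structure (the paper's full Bayesian model and Gibbs sampler are not reproduced; dimensions and variances below are invented): counts are Poisson given a log-scale linear predictor containing genotype, environment, G × E, and a normal residual, which induces the overdispersion the model is designed to capture.

```python
# Generative Poisson-lognormal counts with trait-by-environment structure:
# y ~ Poisson(exp(eta)), eta = mu + genotype + environment + GxE + normal noise.
import numpy as np

rng = np.random.default_rng(6)
n_geno, n_env, n_traits = 100, 3, 2
mu = np.array([1.0, 0.5])                               # per-trait intercepts
g = rng.normal(0, 0.4, (n_geno, n_traits))              # genotype effects
e = rng.normal(0, 0.3, (n_env, n_traits))               # environment effects
ge = rng.normal(0, 0.2, (n_geno, n_env, n_traits))      # G x E interaction
eps = rng.normal(0, 0.3, (n_geno, n_env, n_traits))     # lognormal residual

eta = mu + g[:, None, :] + e[None, :, :] + ge + eps
y = rng.poisson(np.exp(eta))                            # counts (geno, env, trait)

print("observed mean:", y.mean())
print("observed var :", y.var())    # var > mean: overdispersed relative to Poisson
```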

  19. Installation effects on performance of multiple model V/STOL lift fans

    NASA Technical Reports Server (NTRS)

    Diedrich, J. H.; Clough, N.; Lieblein, S.

    1972-01-01

    An experimental program was performed in which the individual performance of multiple VTOL model lift fans was measured. The model tested consisted of three 5.5 in. diameter tip-turbine driven model VTOL lift fans mounted chordwise in a two-dimensional wing to simulate a pod-type array. The performance data provided significant insight into possible thrust variations and losses caused by the presence of cover doors, adjacent fuselage panels, and adjacent fans. The effect of a partial loss of drive air supply (simulated gas generator failure) on fan performance was also investigated. The results of the tests demonstrated that lift fan installation variables and hardware can have a significant effect on the thrust of the individual fans.

  20. Hybrid approaches for multiple-species stochastic reaction–diffusion models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spill, Fabian, E-mail: fspill@bu.edu; Department of Mechanical Engineering, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, MA 02139; Guerrero, Pilar

    2015-10-15

    Reaction–diffusion models are used to describe systems in fields as diverse as physics, chemistry, ecology and biology. The fundamental quantities in such models are individual entities such as atoms and molecules, bacteria, cells or animals, which move and/or react in a stochastic manner. If the number of entities is large, accounting for each individual is inefficient, and often partial differential equation (PDE) models are used in which the stochastic behaviour of individuals is replaced by a description of the averaged, or mean, behaviour of the system. In some situations the number of individuals is large in certain regions and small in others. In such cases, a stochastic model may be inefficient in one region, and a PDE model inaccurate in another. To overcome this problem, we develop a scheme which couples a stochastic reaction–diffusion system in one part of the domain with its mean-field analogue, i.e. a discretised PDE model, in the other part of the domain. The interface in between the two domains occupies exactly one lattice site and is chosen such that the mean-field description is still accurate there. In this way errors due to the flux between the domains are small. Our scheme can account for multiple dynamic interfaces separating multiple stochastic and deterministic domains, and the coupling between the domains conserves the total number of particles. The method preserves stochastic features such as extinction not observable in the mean-field description, and is significantly faster to simulate on a computer than the pure stochastic model. Highlights: a novel hybrid stochastic/deterministic reaction–diffusion simulation method is given; it can massively speed up stochastic simulations while preserving stochastic effects; it can handle multiple reacting species; and it can handle moving boundaries.
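
    A 1D toy of the coupling idea for pure diffusion (the paper's scheme additionally handles reactions, multiple species, and moving interfaces, none of which are shown): integer particles hop on the left sub-domain, an explicit finite-difference PDE evolves the right sub-domain, and the interface exchanges whole particles so that total mass is conserved.

```python
# Hybrid stochastic/deterministic diffusion on a 1D lattice, with a conservative
# particle exchange across the interface. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(7)
L, I = 40, 20                  # lattice sites; interface between sites I-1 and I
D = 0.1                        # per-step hop probability / diffusion number
counts = np.zeros(L)
counts[:I] = rng.poisson(50, I)          # stochastic sub-domain (integer particles)
dens = np.zeros(L)
dens[I:] = 10.0                          # mean-field sub-domain (continuous density)

for _ in range(5000):
    # Stochastic sub-domain: each particle hops right or left with probability D.
    n = counts[:I].astype(int)
    right = rng.binomial(n, D)
    left = rng.binomial(n - right, D / (1.0 - D))    # conditional split
    counts[:I] -= right + left
    counts[1:I] += right[:-1]            # right hops staying inside the region
    counts[:I-1] += left[1:]             # left hops staying inside the region
    counts[0] += left[0]                 # reflecting wall on the far left
    dens[I] += right[-1]                 # stochastic -> PDE across the interface

    # Mean-field sub-domain: explicit diffusion with zero-flux ends (conservative).
    lap = np.zeros(L)
    lap[I] = dens[I+1] - dens[I]
    lap[I+1:-1] = dens[I+2:] - 2.0 * dens[I+1:-1] + dens[I:-2]
    lap[-1] = dens[-2] - dens[-1]
    dens += D * lap

    # PDE -> stochastic: convert the expected interface flux into whole particles.
    k = min(rng.poisson(D * dens[I]), int(dens[I]))
    dens[I] -= k
    counts[I-1] += k

print("total mass (conserved up to round-off):", counts.sum() + dens.sum())
```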

  1. Pumping Optimization Model for Pump and Treat Systems - 15091

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, S.; Ivarson, Kristine A.; Karanovic, M.

    2015-01-15

    Pump and Treat systems are being utilized to remediate contaminated groundwater in the Hanford 100 Areas adjacent to the Columbia River in Eastern Washington. Design of the systems was supported by a three-dimensional (3D) fate and transport model. This model provided sophisticated simulation capabilities but requires many hours to calculate results for each simulation considered. Many simulations are required to optimize system performance, so a two-dimensional (2D) model was created to reduce run time. The 2D model was developed as an equivalent-property version of the 3D model that derives boundary conditions and aquifer properties from the 3D model. It produces predictions that are very close to the 3D model predictions, allowing it to be used for comparative remedy analyses. Any potential system modifications identified using the 2D version are verified by running the 3D model to confirm performance. The 2D model was incorporated into a comprehensive analysis system (the Pumping Optimization Model, POM) to simplify analysis of multiple simulations. It allows rapid turnaround through a graphical user interface that (1) allows operators to create hypothetical scenarios for system operation, (2) feeds the input to the 2D fate and transport model, and (3) displays the scenario results for evaluating performance improvement. All of the above is accomplished within the user interface. Complex analyses can be completed within a few hours and multiple simulations can be compared side-by-side. The POM utilizes standard office computing equipment and established groundwater modeling software.

  2. Uncertainty Propagation of Non-Parametric-Derived Precipitation Estimates into Multi-Hydrologic Model Simulations

    NASA Astrophysics Data System (ADS)

    Bhuiyan, M. A. E.; Nikolopoulos, E. I.; Anagnostou, E. N.

    2017-12-01

    Quantifying the uncertainty of global precipitation datasets is beneficial when using these precipitation products in hydrological applications, because precipitation uncertainty propagating through hydrologic modeling can significantly affect the accuracy of the simulated hydrologic variables. In this research the Iberian Peninsula is used as the study area, with a study period spanning eleven years (2000-2010). This study evaluates the performance of multiple hydrologic models forced with combined global rainfall estimates derived using a Quantile Regression Forests (QRF) technique. The QRF technique draws on three satellite precipitation products (CMORPH, PERSIANN, and 3B42 (V7)); an atmospheric reanalysis precipitation and air temperature dataset; satellite-derived near-surface daily soil moisture data; and a terrain elevation dataset. A high-resolution, ground-based observation-driven precipitation dataset (SAFRAN), available at 5 km/1 h resolution, is used as reference. Through the QRF blending framework, the stochastic error model produces error-adjusted ensemble precipitation realizations, which are used to force four global hydrological models (JULES (Joint UK Land Environment Simulator), WaterGAP3 (Water-Global Assessment and Prognosis), ORCHIDEE (Organizing Carbon and Hydrology in Dynamic Ecosystems), and SURFEX (Surface Externalisée)) to simulate three hydrologic variables (surface runoff, subsurface runoff, and evapotranspiration). The models are also forced with the reference precipitation to generate reference-based hydrologic simulations. This study presents a comparative analysis of multiple hydrologic model simulations for different hydrologic variables and the impact of the blending algorithm on the simulated hydrologic variables. Results show how precipitation uncertainty propagates through the different hydrologic model structures and how the blending reduces error in the simulated hydrologic variables.
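
    A rough sketch of forest-based quantile estimation (true quantile regression forests weight training responses by leaf membership; the cruder per-tree spread is used here for brevity). The predictors are random stand-ins for the satellite, reanalysis, soil-moisture, and terrain inputs listed above.

```python
# Approximate predictive quantiles from the spread of per-tree predictions of a
# random forest (a simplification of true quantile regression forests).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(8)
n = 5000
X = rng.normal(size=(n, 4))                 # stand-ins for the merged predictors
y = X @ np.array([0.5, 0.3, 0.1, 0.05]) + 0.3 * rng.normal(size=n)  # "reference rain"

rf = RandomForestRegressor(n_estimators=200, min_samples_leaf=20).fit(X, y)

x_new = rng.normal(size=(1, 4))
per_tree = np.array([t.predict(x_new)[0] for t in rf.estimators_])
q = np.percentile(per_tree, [5, 25, 50, 75, 95])
print("approximate predictive quantiles:", q)
# Sampling from such predictive distributions yields the error-adjusted ensemble
# precipitation realizations used to force the hydrologic models.
```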

  3. Using a Large-scale Neural Model of Cortical Object Processing to Investigate the Neural Substrate for Managing Multiple Items in Short-term Memory.

    PubMed

    Liu, Qin; Ulloa, Antonio; Horwitz, Barry

    2017-11-01

    Many cognitive and computational models have been proposed to help understand working memory. In this article, we present a simulation study of cortical processing of visual objects during several working memory tasks using an extended version of a previously constructed large-scale neural model [Tagamets, M. A., & Horwitz, B. Integrating electrophysiological and anatomical experimental data to create a large-scale model that simulates a delayed match-to-sample human brain imaging study. Cerebral Cortex, 8, 310-320, 1998]. The original model consisted of arrays of Wilson-Cowan type of neuronal populations representing primary and secondary visual cortices, inferotemporal (IT) cortex, and pFC. We added a module representing entorhinal cortex, which functions as a gating module. We successfully implemented multiple working memory tasks using the same model and produced neuronal patterns in visual cortex, IT cortex, and pFC that match experimental findings. These working memory tasks can include distractor stimuli or can require that multiple items be retained in mind during a delay period (Sternberg's task). Besides electrophysiology data and behavioral data, we also generated fMRI BOLD time series from our simulation. Our results support the involvement of IT cortex in working memory maintenance and suggest the cortical architecture underlying the neural mechanisms mediating particular working memory tasks. Furthermore, we noticed that, during simulations of memorizing a list of objects, the first and last items in the sequence were recalled best, which may implicate the neural mechanism behind this important psychological effect (i.e., the primacy and recency effect).

  4. Simulation of quantity and quality of storm runoff for urban catchments in Fresno, California

    USGS Publications Warehouse

    Guay, J.R.; Smith, P.E.

    1988-01-01

    Rainfall-runoff models were developed for a multiple-dwelling residential catchment (two applications), a single-dwelling residential catchment, and a commercial catchment in Fresno, California, using the U.S. Geological Survey Distributed Routing Rainfall-Runoff Model (DR3M-II). A runoff-quality model was also developed for the commercial catchment using the Survey's Multiple-Event Urban Runoff Quality model (DR3M-qual). The purpose of this study was: (1) to demonstrate the capabilities of the two models for use in designing storm drains, estimating the frequency of storm-runoff loads, and evaluating the effectiveness of street sweeping on an urban drainage catchment; and (2) to determine the simulation accuracies of these models. Simulation errors of the two models were summarized as the median absolute deviation in percent (mad) between measured and simulated values. Calibration and verification mad errors for runoff volumes and peak discharges ranged from 14 to 20%. The estimated annual storm-runoff loads, in pounds per acre of effective impervious area, that could occur once every hundred years at the commercial catchment were 95 for dissolved solids, 1.6 for dissolved nitrite plus nitrate, 0.31 for total recoverable lead, and 120 for suspended sediment. Calibration and verification mad errors for the above constituents ranged from 11 to 54%. (USGS)

  5. Multiple-basin energy landscapes for large-amplitude conformational motions of proteins: Structure-based molecular dynamics simulations

    PubMed Central

    Okazaki, Kei-ichi; Koga, Nobuyasu; Takada, Shoji; Onuchic, Jose N.; Wolynes, Peter G.

    2006-01-01

    Biomolecules often undergo large-amplitude motions when they bind or release other molecules. Unlike macroscopic machines, these biomolecular machines can partially disassemble (unfold) and then reassemble (fold) during such transitions. Here we put forward a minimal structure-based model, the “multiple-basin model,” that can directly be used for molecular dynamics simulation of even very large biomolecular systems so long as the endpoints of the conformational change are known. We investigate the model by simulating large-scale motions of four proteins: glutamine-binding protein, S100A6, dihydrofolate reductase, and HIV-1 protease. The mechanisms of conformational transition depend on the protein basin topologies and change with temperature near the folding transition. The conformational transition rate varies linearly with driving force over a fairly large range. This linearity appears to be a consequence of partial unfolding during the conformational transition. PMID:16877541
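
    The construction typically used for such multiple-basin potentials is an exponential mixing of single-basin energies; here is a 1D illustration with invented parameters (β controls the barrier/smoothness between basins and ΔV their relative stability):

```python
# Multiple-basin potential via exponential mixing of two single-basin energies:
# V_mb(x) = -(1/beta) * ln(exp(-beta*V1) + exp(-beta*(V2 + dV))).
import numpy as np

x = np.linspace(-3, 3, 601)
V1 = 2.0 * (x + 1.0) ** 2                 # basin around the "open" structure
V2 = 2.0 * (x - 1.0) ** 2                 # basin around the "closed" structure
beta, dV = 2.0, 0.5                       # mixing sharpness and basin offset

V_mb = -np.logaddexp(-beta * V1, -beta * (V2 + dV)) / beta

for xi in (-1.0, 0.0, 1.0):
    i = np.argmin(np.abs(x - xi))
    print(f"x = {xi:+.1f}: V1 = {V1[i]:5.2f}, "
          f"V2+dV = {V2[i] + dV:5.2f}, V_mb = {V_mb[i]:5.2f}")
# V_mb tracks the lower of the two basins away from the crossing and rounds the
# barrier near x = 0; raising beta sharpens the barrier toward min(V1, V2 + dV).
```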

  6. 78 FR 20666 - Food and Drug Administration/National Institutes of Health/National Science Foundation Public...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-05

    ... (CDRH) believes that computer modeling and simulation (M&S) has the potential to substantially augment... simulate multiple use conditions and to visualize and display complex processes and data can revolutionize...

  7. Comparisons between GRNTRN simulations and beam measurements of proton lateral broadening distributions

    NASA Astrophysics Data System (ADS)

    Mertens, Christopher; Moyers, Michael; Walker, Steven; Tweed, John

    Recent developments in NASA's High Charge and Energy Transport (HZETRN) code have included lateral broadening of primary ion beams due to small-angle multiple Coulomb scattering, and coupling of the ion-nuclear scattering interactions with energy loss and straggling. The new version of HZETRN based on Green function methods, GRNTRN, is suitable for modeling transport with both space environment and laboratory boundary conditions. Multiple scattering processes are a necessary extension to GRNTRN in order to accurately model ion beam experiments, to simulate the physical and biological-effective radiation dose, and to develop new methods and strategies for light ion radiation therapy. In this paper we compare GRNTRN simulations of proton lateral scattering distributions with beam measurements taken at Loma Linda Medical University. The simulated and measured lateral proton distributions will be compared for a 250 MeV proton beam on aluminum, polyethylene, polystyrene, bone, iron, and lead target materials.

  8. Integration of Multiple Data Sources to Simulate the Dynamics of Land Systems

    PubMed Central

    Deng, Xiangzheng; Su, Hongbo; Zhan, Jinyan

    2008-01-01

    In this paper we present and develop a new model, which we have called Dynamics of Land Systems (DLS). The DLS model is capable of integrating multiple data sources to simulate the dynamics of a land system. Three main modules are incorporated in DLS: a spatial regression module, to explore the relationship between land uses and influencing factors; a scenario analysis module, to project the land uses of a region during the simulation period; and a spatial disaggregation module, to allocate land use changes from a regional level to disaggregated grid cells. A case study on Taips County in North China is incorporated in this paper to test the functionality of DLS. The simulation results under the baseline, economic priority and environmental scenarios help to understand the land system dynamics and project near future land-use trajectories of a region, in order to focus management decisions on land uses and land use planning. PMID:27879726

  9. Climate change and watershed mercury export: a multiple projection and model analysis

    USGS Publications Warehouse

    Golden, Heather E.; Knightes, Christopher D.; Conrads, Paul; Feaster, Toby D.; Davis, Gary M.; Benedict, Stephen T.; Bradley, Paul M.

    2013-01-01

    Future shifts in climatic conditions may impact watershed mercury (Hg) dynamics and transport. An ensemble of watershed models was applied in the present study to simulate and evaluate the responses of hydrological and total Hg (THg) fluxes from the landscape to the watershed outlet and in-stream THg concentrations to contrasting climate change projections for a watershed in the southeastern coastal plain of the United States. Simulations were conducted under stationary atmospheric deposition and land cover conditions to explicitly evaluate the effect of projected precipitation and temperature on watershed Hg export (i.e., the flux of Hg at the watershed outlet). Based on downscaled inputs from 2 global circulation models that capture extremes of projected wet (Community Climate System Model, Ver 3 [CCSM3]) and dry (ECHAM4/HOPE-G [ECHO]) conditions for this region, watershed model simulation results suggest a decrease of approximately 19% in ensemble-averaged mean annual watershed THg fluxes using the ECHO climate-change model and an increase of approximately 5% in THg fluxes with the CCSM3 model. Ensemble-averaged mean annual ECHO in-stream THg concentrations increased 20%, while those of CCSM3 decreased by 9% between the baseline and projected simulation periods. Watershed model simulation results using both climate change models suggest that monthly watershed THg fluxes increase during the summer, when projected flow is higher than baseline conditions. The present study's multiple watershed model approach underscores the uncertainty associated with climate change response projections and their use in climate change management decisions. Thus, single-model predictions can be misleading, particularly in developmental stages of watershed Hg modeling.

  10. Structural disconnection is responsible for increased functional connectivity in multiple sclerosis.

    PubMed

    Patel, Kevin R; Tobyne, Sean; Porter, Daria; Bireley, John Daniel; Smith, Victoria; Klawiter, Eric

    2018-06-01

    Increased synchrony within neuroanatomical networks is often observed in neurophysiologic studies of human brain disease. Most often, this phenomenon is ascribed to a compensatory process in the face of injury, though evidence supporting such accounts is limited. Given the known dependence of resting-state functional connectivity (rsFC) on underlying structural connectivity (SC), we examine an alternative hypothesis: that topographical changes in SC, specifically particular patterns of disconnection, contribute to increased network rsFC. We obtain measures of rsFC using fMRI and SC using probabilistic tractography in 50 healthy and 28 multiple sclerosis subjects. Using a computational model of neuronal dynamics, we simulate BOLD using healthy subject SC to couple regions. We find that altering the model by introducing structural disconnection patterns observed in those multiple sclerosis subjects with high network rsFC generates simulations with high rsFC as well, suggesting that disconnection itself plays a role in producing high network functional connectivity. We then examine SC data in individuals. In multiple sclerosis subjects with high network rsFC, we find a preferential disconnection between the relevant network and wider system. We examine the significance of such network isolation by introducing random disconnection into the model. As observed empirically, simulated network rsFC increases with removal of connections bridging a community with the remainder of the brain. We thus show that structural disconnection known to occur in multiple sclerosis contributes to network rsFC changes in multiple sclerosis and further that community isolation is responsible for elevated network functional connectivity.

  11. Two graphical user interfaces for managing and analyzing MODFLOW groundwater-model scenarios

    USGS Publications Warehouse

    Banta, Edward R.

    2014-01-01

    Scenario Manager and Scenario Analyzer are graphical user interfaces that facilitate the use of calibrated, MODFLOW-based groundwater models for investigating possible responses to proposed stresses on a groundwater system. Scenario Manager allows a user, starting with a calibrated model, to design and run model scenarios by adding or modifying stresses simulated by the model. Scenario Analyzer facilitates the process of extracting data from model output and preparing such display elements as maps, charts, and tables. Both programs are designed for users who are familiar with the science on which groundwater modeling is based but who may not have a groundwater modeler’s expertise in building and calibrating a groundwater model from start to finish. With Scenario Manager, the user can manipulate model input to simulate withdrawal or injection wells, time-variant specified hydraulic heads, recharge, and such surface-water features as rivers and canals. Input for stresses to be simulated comes from user-provided geographic information system files and time-series data files. A Scenario Manager project can contain multiple scenarios and is self-documenting. Scenario Analyzer can be used to analyze output from any MODFLOW-based model; it is not limited to use with scenarios generated by Scenario Manager. Model-simulated values of hydraulic head, drawdown, solute concentration, and cell-by-cell flow rates can be presented in display elements. Map data can be represented as lines of equal value (contours) or as a gradated color fill. Charts and tables display time-series data obtained from output generated by a transient-state model run or from user-provided text files of time-series data. A display element can be based entirely on output of a single model run, or, to facilitate comparison of results of multiple scenarios, an element can be based on output from multiple model runs. Scenario Analyzer can export display elements and supporting metadata as a Portable Document Format file.

  12. Simulation of multiple scattering in a medium with an anisotropic scattering pattern

    NASA Astrophysics Data System (ADS)

    Kuzmin, V. L.; Val'kov, A. Yu.

    2017-03-01

    Multiple backscattering from layers with various thicknesses, including the case of half-space, is numerically simulated and a comparative analysis is performed for systems with the anisotropy of scattering described by the Henyey-Greenstein and Rayleigh-Gans phase functions. It is shown that the intensity of backscattering depends on the form of the phase function; the difference between the intensities obtained within the two models increases with anisotropy.
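
    For reference, the Henyey-Greenstein phase function named above has the standard form

        $$ p_{\mathrm{HG}}(\cos\theta) = \frac{1}{2}\,\frac{1 - g^{2}}{\left(1 + g^{2} - 2g\cos\theta\right)^{3/2}}, \qquad -1 \le \cos\theta \le 1, $$

    normalized so that $\int_{-1}^{1} p_{\mathrm{HG}}(\cos\theta)\, d(\cos\theta) = 1$, with anisotropy parameter $g = \langle\cos\theta\rangle$: $g = 0$ recovers isotropic scattering, while $g \to 1$ gives strongly forward-peaked scattering.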

  13. Simulation Models for Socioeconomic Inequalities in Health: A Systematic Review

    PubMed Central

    Speybroeck, Niko; Van Malderen, Carine; Harper, Sam; Müller, Birgit; Devleesschauwer, Brecht

    2013-01-01

    Background: The emergence and evolution of socioeconomic inequalities in health involves multiple factors interacting with each other at different levels. Simulation models are suitable for studying such complex and dynamic systems and have the ability to test the impact of policy interventions in silico. Objective: To explore how simulation models were used in the field of socioeconomic inequalities in health. Methods: An electronic search of studies assessing socioeconomic inequalities in health using a simulation model was conducted. Characteristics of the simulation models were extracted and distinct simulation approaches were identified. As an illustration, a simple agent-based model of the emergence of socioeconomic differences in alcohol abuse was developed. Results: We found 61 studies published between 1989 and 2013. Ten different simulation approaches were identified. The agent-based model illustration showed that multilevel, reciprocal and indirect effects of social determinants on health can be modeled flexibly. Discussion and Conclusions: Based on the review, we discuss the utility of using simulation models for studying health inequalities, and refer to good modeling practices for developing such models. The review and the simulation model example suggest that the use of simulation models may enhance the understanding and debate about existing and new frameworks for socioeconomic inequalities in health. PMID:24192788

  14. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs — Calibration Report for San Mateo Testbed.

    DOT National Transportation Integrated Search

    2016-08-22

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  15. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - calibration report for Dallas testbed : final report.

    DOT National Transportation Integrated Search

    2016-10-01

    The primary objective of this project is to develop multiple simulation testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  16. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — gaps, challenges and future research.

    DOT National Transportation Integrated Search

    2017-05-01

    The primary objective of AMS project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. Through this p...

  17. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - Evaluation Report for the San Diego Testbed

    DOT National Transportation Integrated Search

    2017-07-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  18. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs : Evaluation Report for the San Diego Testbed : Draft Report.

    DOT National Transportation Integrated Search

    2017-07-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  19. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - calibration Report for Phoenix Testbed : Final Report.

    DOT National Transportation Integrated Search

    2016-10-01

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  20. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - evaluation summary for the San Diego testbed

    DOT National Transportation Integrated Search

    2017-08-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  1. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic applications (DMA) and active transportation and demand management (ATDM) programs — leveraging AMS testbed outputs for ATDM analysis – a primer.

    DOT National Transportation Integrated Search

    2017-08-01

    The primary objective of AMS Testbed project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. Throug...

  2. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - San Mateo Testbed Analysis Plan : Final Report.

    DOT National Transportation Integrated Search

    2016-06-29

    The primary objective of this project is to develop multiple simulation testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  3. Idealized modeling of convective organization with changing sea surface temperatures using multiple equilibria in weak temperature gradient simulations

    NASA Astrophysics Data System (ADS)

    Sentić, Stipo; Sessions, Sharon L.

    2017-06-01

    The weak temperature gradient (WTG) approximation is a method of parameterizing the influences of the large scale on local convection in limited domain simulations. WTG simulations exhibit multiple equilibria in precipitation; depending on the initial moisture content, simulations can precipitate or remain dry for otherwise identical boundary conditions. We use a hypothesized analogy between multiple equilibria in precipitation in WTG simulations, and dry and moist regions of organized convection to study tropical convective organization. We find that the range of wind speeds that support multiple equilibria depends on sea surface temperature (SST). Compared to the present SST, low SSTs support a narrower range of multiple equilibria at higher wind speeds. In contrast, high SSTs exhibit a narrower range of multiple equilibria at low wind speeds. This suggests that at high SSTs, organized convection might occur with lower surface forcing. To characterize convection at different SSTs, we analyze the change in relationships between precipitation rate, atmospheric stability, moisture content, and the large-scale transport of moist entropy and moisture with increasing SSTs. We find an increase in large-scale export of moisture and moist entropy from dry simulations with increasing SST, which is consistent with a strengthening of the up-gradient transport of moisture from dry regions to moist regions in organized convection. Furthermore, the changes in diagnostic relationships with SST are consistent with more intense convection in precipitating regions of organized convection for higher SSTs.

  4. Application of constraint-based satellite mission planning model in forest fire monitoring

    NASA Astrophysics Data System (ADS)

    Guo, Bingjun; Wang, Hongfei; Wu, Peng

    2017-10-01

    In this paper, a constraint-based satellite mission planning model is established based on the idea of constraint satisfaction. It includes target, request, observation, satellite, payload and other elements, linked by constraints. The optimization goal of the model is to make full use of time and resources and to improve the efficiency of target observation. A greedy algorithm is used to solve the model and produce the observation plan and the data transmission plan. Two simulation experiments are designed and carried out: routine monitoring of global forest fires and emergency monitoring of forest fires in Australia. The simulation results show that the model and algorithm perform well and that the model has good emergency response capability. Efficient and reasonable plans can be worked out to meet users' needs in complex cases involving multiple payloads, multiple targets and variable priorities.

  5. VERAIn

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simunovic, Srdjan

    2015-02-16

    CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete an LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA input into an XML file that is used as input to different VERA codes.

  6. High-throughput landslide modelling using computational grids

    NASA Astrophysics Data System (ADS)

    Wallace, M.; Metson, S.; Holcombe, L.; Anderson, M.; Newbold, D.; Brook, N.

    2012-04-01

    Landslides are an increasing problem in developing countries. Multiple landslides can be triggered by heavy rainfall resulting in loss of life, homes and critical infrastructure. Through computer simulation of individual slopes it is possible to predict the causes, timing and magnitude of landslides and estimate the potential physical impact. Geographical scientists at the University of Bristol have developed software that integrates a physically-based slope hydrology and stability model (CHASM) with an econometric model (QUESTA) in order to predict landslide risk over time. These models allow multiple scenarios to be evaluated for each slope, accounting for data uncertainties, different engineering interventions, risk management approaches and rainfall patterns. Individual scenarios can be computationally intensive; however, each scenario is independent and so multiple scenarios can be executed in parallel. As more simulations are carried out the overhead involved in managing input and output data becomes significant. This is a greater problem if multiple slopes are considered concurrently, as is required both for landslide research and for effective disaster planning at national levels. There are two critical factors in this context: generated data volumes can be in the order of tens of terabytes, and greater numbers of simulations result in long total runtimes. Users of such models, in both the research community and in developing countries, need to develop a means for handling the generation and submission of landslide modelling experiments, and the storage and analysis of the resulting datasets. Additionally, governments in developing countries typically lack the necessary computing resources and infrastructure. Consequently, knowledge that could be gained by aggregating simulation results from many different scenarios across many different slopes remains hidden within the data. To address these data and workload management issues, University of Bristol particle physicists and geographical scientists are collaborating to develop methods for providing simple and effective access to landslide models and associated simulation data. Particle physicists have valuable experience in dealing with data complexity and management due to the scale of data generated by particle accelerators such as the Large Hadron Collider (LHC). The LHC generates tens of petabytes of data every year which is stored and analysed using the Worldwide LHC Computing Grid (WLCG). Tools and concepts from the WLCG are being used to drive the development of a Software-as-a-Service (SaaS) platform to provide access to hosted landslide simulation software and data. It contains advanced data management features and allows landslide simulations to be run on the WLCG, dramatically reducing simulation runtimes by parallel execution. The simulations are accessed using a web page through which users can enter and browse input data, submit jobs and visualise results. Replication of the data ensures a local copy can be accessed should a connection to the platform be unavailable. The platform does not know the details of the simulation software it runs, so it is therefore possible to use it to run alternative models at similar scales. This creates the opportunity for activities such as model sensitivity analysis and performance comparison at scales that are impractical using standalone software.
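
    Because each scenario is independent, the fan-out is embarrassingly parallel. A minimal sketch of the submission pattern in generic Python (not the platform's actual API; run_scenario and its mock return value are ours, standing in for one CHASM/QUESTA execution):

        from concurrent.futures import ProcessPoolExecutor

        def run_scenario(params):
            """Stand-in for one independent slope simulation: in the real
            platform this would run CHASM/QUESTA for a given slope,
            rainfall pattern and engineering intervention."""
            slope_id, rainfall_mm = params
            return slope_id, rainfall_mm, 1.2 - 0.001 * rainfall_mm  # mock factor of safety

        if __name__ == "__main__":
            # One parameter set per slope/rainfall combination.
            scenarios = [(s, r) for s in range(100) for r in (50, 100, 200)]
            with ProcessPoolExecutor() as pool:   # scenarios run in parallel
                results = list(pool.map(run_scenario, scenarios))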

  7. Large ensemble and large-domain hydrologic modeling: Insights from SUMMA applications in the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Ou, G.; Nijssen, B.; Nearing, G. S.; Newman, A. J.; Mizukami, N.; Clark, M. P.

    2016-12-01

    The Structure for Unifying Multiple Modeling Alternatives (SUMMA) provides a unifying modeling framework for process-based hydrologic modeling by defining a general set of conservation equations for mass and energy, with the capability to incorporate multiple choices for spatial discretizations and flux parameterizations. In this study, we provide a first demonstration of large-scale hydrologic simulations using SUMMA through an application to the Columbia River Basin (CRB) in the northwestern United States and Canada for a multi-decadal simulation period. The CRB is discretized into 11,723 hydrologic response units (HRUs) according to the U.S. Geological Survey Geospatial Fabric. The soil parameters are derived from the Natural Resources Conservation Service Soil Survey Geographic (SSURGO) Database. The land cover parameters are based on the National Land Cover Database from the year 2001 created by the Multi-Resolution Land Characteristics (MRLC) Consortium. The forcing data, including hourly air pressure, temperature, specific humidity, wind speed, precipitation, and shortwave and longwave radiation, are based on Phase 2 of the North American Land Data Assimilation System (NLDAS-2) and averaged for each HRU. The simulation results are compared to simulations with the Variable Infiltration Capacity (VIC) model and the Precipitation Runoff Modeling System (PRMS). We are particularly interested in SUMMA's capability to mimic model behaviors of the other two models through the selection of appropriate model parameterizations in SUMMA.

  8. Influence of reanalysis datasets on dynamically downscaling the recent past

    NASA Astrophysics Data System (ADS)

    Moalafhi, Ditiro B.; Evans, Jason P.; Sharma, Ashish

    2017-08-01

    Multiple reanalysis datasets currently exist that can provide boundary conditions for dynamic downscaling and simulating local hydro-climatic processes at finer spatial and temporal resolutions. Previous work has suggested that there are two reanalysis alternatives that provide the best lateral boundary conditions for downscaling over southern Africa. This study dynamically downscales these reanalyses (ERA-I and MERRA) over southern Africa to a high resolution (10 km) grid using the WRF model. Simulations cover the period 1981-2010. Multiple observation datasets were used for both surface temperature and precipitation to account for observational uncertainty when assessing results. Generally, temperature is simulated quite well, except over the Namibian coastal plain, where the simulations show anomalously warm temperatures related to the failure to propagate the influence of the cold Benguela current inland. Precipitation tends to be overestimated in high-altitude areas and over most of southern Mozambique. This could be attributed to challenges in handling complex topography and capturing large-scale circulation patterns. While MERRA-driven WRF exhibits slightly less bias in temperature, especially for La Niña years, ERA-I-driven simulations are on average superior in terms of RMSE. When considering multiple variables and metrics, ERA-I is found to produce the best simulation of the climate over the domain. The influence of the regional model appears to be large enough to overcome the small difference in relative errors present in the lateral boundary conditions derived from these two reanalyses.

  9. Novel patch modelling method for efficient simulation and prediction uncertainty analysis of multi-scale groundwater flow and transport processes

    NASA Astrophysics Data System (ADS)

    Sreekanth, J.; Moore, Catherine

    2018-04-01

    The application of global sensitivity and uncertainty analysis techniques to groundwater models of deep sedimentary basins are typically challenged by large computational burdens combined with associated numerical stability issues. The highly parameterized approaches required for exploring the predictive uncertainty associated with the heterogeneous hydraulic characteristics of multiple aquifers and aquitards in these sedimentary basins exacerbate these issues. A novel Patch Modelling Methodology is proposed for improving the computational feasibility of stochastic modelling analysis of large-scale and complex groundwater models. The method incorporates a nested groundwater modelling framework that enables efficient simulation of groundwater flow and transport across multiple spatial and temporal scales. The method also allows different processes to be simulated within different model scales. Existing nested model methodologies are extended by employing 'joining predictions' for extrapolating prediction-salient information from one model scale to the next. This establishes a feedback mechanism supporting the transfer of information from child models to parent models as well as parent models to child models in a computationally efficient manner. This feedback mechanism is simple and flexible and ensures that while the salient small scale features influencing larger scale prediction are transferred back to the larger scale, this does not require the live coupling of models. This method allows the modelling of multiple groundwater flow and transport processes using separate groundwater models that are built for the appropriate spatial and temporal scales, within a stochastic framework, while also removing the computational burden associated with live model coupling. The utility of the method is demonstrated by application to an actual large scale aquifer injection scheme in Australia.

  10. Laplace transform analysis of a multiplicative asset transfer model

    NASA Astrophysics Data System (ADS)

    Sokolov, Andrey; Melatos, Andrew; Kieu, Tien

    2010-07-01

    We analyze a simple asset transfer model in which the transfer amount is a fixed fraction f of the giver’s wealth. The model is analyzed in a new way by Laplace transforming the master equation, solving it analytically and numerically for the steady-state distribution, and exploring the solutions for various values of f∈(0,1). The Laplace transform analysis is superior to agent-based simulations as it does not depend on the number of agents, enabling us to study entropy and inequality in regimes that are costly to address with simulations. We demonstrate that Boltzmann entropy is not a suitable (e.g. non-monotonic) measure of disorder in a multiplicative asset transfer system and suggest an asymmetric stochastic process that is equivalent to the asset transfer model.
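
    For comparison, the agent-based simulation that the Laplace-transform treatment supersedes takes only a few lines (a toy sketch; the parameter values are illustrative, not the paper's):

        import numpy as np

        rng = np.random.default_rng(0)
        N, f, steps = 1000, 0.25, 200_000
        w = np.ones(N)                      # equal initial wealth

        for _ in range(steps):
            i, j = rng.integers(N, size=2)  # random giver i, receiver j
            if i != j:
                dw = f * w[i]               # transfer a fixed fraction f
                w[i] -= dw
                w[j] += dw

    The drawback noted in the abstract is visible here: the measured steady-state statistics carry finite-N and convergence artifacts, which the transform-based solution avoids entirely.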

  11. Analog quantum simulation of generalized Dicke models in trapped ions

    NASA Astrophysics Data System (ADS)

    Aedo, Ibai; Lamata, Lucas

    2018-04-01

    We propose the analog quantum simulation of generalized Dicke models in trapped ions. By combining bichromatic laser interactions on multiple ions we can generate all regimes of light-matter coupling in these models, where here the light mode is mimicked by a motional mode. We present numerical simulations of the three-qubit Dicke model both in the weak field (WF) regime, where the Jaynes-Cummings behavior arises, and the ultrastrong coupling (USC) regime, where a rotating-wave approximation cannot be considered. We also simulate the two-qubit biased Dicke model in the WF and USC regimes and the two-qubit anisotropic Dicke model in the USC regime and the deep-strong coupling regime. The agreement between the mathematical models and the ion system convinces us that these quantum simulations can be implemented in the laboratory with current or near-future technology. This formalism establishes an avenue for the quantum simulation of many-spin Dicke models in trapped ions.
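
    As context for the regimes listed, a generalized (anisotropic, biased) Dicke Hamiltonian can be written, with ħ = 1, as (one standard form; the paper's exact notation may differ)

        $$ H = \omega\, a^{\dagger}a + \omega_{0} J_{z}
             + \frac{\lambda}{\sqrt{N}}\left(a^{\dagger}J_{-} + a J_{+}\right)
             + \frac{\lambda'}{\sqrt{N}}\left(a^{\dagger}J_{+} + a J_{-}\right)
             + \epsilon J_{x}, $$

    where $\lambda = \lambda'$ with $\epsilon = 0$ recovers the standard Dicke model, $\lambda' = 0$ keeps only the rotating terms (the Jaynes-Cummings-like WF limit), $\lambda \neq \lambda'$ gives the anisotropic model, and $\epsilon \neq 0$ the biased model; the USC and deep-strong regimes correspond to couplings comparable to or exceeding $\omega$.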

  12. Multiple transient memories in sheared suspensions: Robustness, structure, and routes to plasticity

    NASA Astrophysics Data System (ADS)

    Keim, Nathan C.; Paulsen, Joseph D.; Nagel, Sidney R.

    2013-09-01

    Multiple transient memories, originally discovered in charge-density-wave conductors, are a remarkable and initially counterintuitive example of how a system can store information about its driving. In this class of memories, a system can learn multiple driving inputs, nearly all of which are eventually forgotten despite their continual input. If sufficient noise is present, the system regains plasticity so that it can continue to learn new memories indefinitely. Recently, Keim and Nagel [Phys. Rev. Lett. 107, 010603 (2011)] showed how multiple transient memories could be generalized to a generic driven disordered system with noise, giving as an example simulations of a simple model of a sheared non-Brownian suspension. Here, we further explore simulation models of suspensions under cyclic shear, focusing on three main themes: robustness, structure, and overdriving. We show that multiple transient memories are a robust feature independent of many details of the model. The steady-state spatial distribution of the particles is sensitive to the driving algorithm; nonetheless, the memory formation is independent of such a change in particle correlations. Finally, we demonstrate that overdriving provides another means for controlling memory formation and retention.
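
    The qualitative phenomenology can be reproduced in a deliberately crude toy (our own illustration, not the paper's suspension model): give each particle a scalar stability threshold, let a driving cycle of amplitude A disturb and re-settle every particle whose threshold lies below A, and add a small noise rate that re-activates random particles:

        import numpy as np

        rng = np.random.default_rng(1)
        h = rng.random(10_000)        # one stability threshold per particle

        def drive(h, A, noise=0.0):
            """One cycle at amplitude A: disturbed particles re-settle at
            new random thresholds; noise kicks restore plasticity."""
            hit = h < A
            h[hit] = rng.random(hit.sum())
            if noise:
                kicked = rng.random(h.size) < noise
                h[kicked] = rng.random(kicked.sum())
            return hit.sum()          # per-cycle activity readout

        for _ in range(500):          # train on two interleaved amplitudes
            drive(h, 0.3, noise=0.001)
            drive(h, 0.6, noise=0.001)

    Without noise such a system settles into an absorbing state from which only the largest trained amplitude can be read out; with noise, both trained amplitudes stay readable in the activity-versus-test-amplitude curve, mirroring the transient-memory behavior described above.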

  13. Computers in Biological Education: Simulation Approaches. Genetics and Evolution. CAL Research Group Technical Report No. 13.

    ERIC Educational Resources Information Center

    Murphy, P. J.

    Three examples of genetics and evolution simulation concerning Mendelian inheritance, genetic mapping, and natural selection are used to illustrate the use of simulations in modeling scientific/natural processes. First described is the HERED series, which illustrates such phenomena as incomplete dominance, multiple alleles, lethal alleles,…

  14. Mercury and methylmercury stream concentrations in a Coastal Plain watershed: a multi-scale simulation analysis.

    PubMed

    Knightes, C D; Golden, H E; Journey, C A; Davis, G M; Conrads, P A; Marvin-DiPasquale, M; Brigham, M E; Bradley, P M

    2014-04-01

    Mercury is a ubiquitous global environmental toxicant responsible for most US fish advisories. Processes governing mercury concentrations in rivers and streams are not well understood, particularly at multiple spatial scales. We investigate how insights gained from reach-scale mercury data and model simulations can be applied at broader watershed scales using a spatially and temporally explicit watershed hydrology and biogeochemical cycling model, VELMA. We simulate fate and transport using reach-scale (0.1 km²) study data and evaluate applications to multiple watershed scales. Reach-scale VELMA parameterization was applied to two nested sub-watersheds (28 km² and 25 km²) and the encompassing watershed (79 km²). Results demonstrate that simulated flow and total mercury concentrations compare reasonably to observations at different scales, but simulated methylmercury concentrations are out-of-phase with observations. These findings suggest that intricacies of methylmercury biogeochemical cycling and transport are under-represented in VELMA and underscore the complexity of simulating mercury fate and transport.

  15. Lattice Boltzmann simulations of multiple-droplet interaction dynamics.

    PubMed

    Zhou, Wenchao; Loney, Drew; Fedorov, Andrei G; Degertekin, F Levent; Rosen, David W

    2014-03-01

    A lattice Boltzmann (LB) formulation, which is consistent with the phase-field model for two-phase incompressible fluid, is proposed to model the interface dynamics of droplet impingement. The interparticle force is derived by comparing the macroscopic transport equations recovered from LB equations with the governing equations of the continuous phase-field model. The inconsistency between the existing LB implementations and the phase-field model in calculating the relaxation time at the phase interface is identified and an approximation is proposed to ensure the consistency with the phase-field model. It is also shown that the commonly used equilibrium velocity boundary for the binary fluid LB scheme does not conserve momentum at the wall boundary and a modified scheme is developed to ensure the momentum conservation at the boundary. In addition, a geometric formulation of the wetting boundary condition is proposed to replace the popular surface energy formulation and results show that the geometric approach enforces the prescribed contact angle better than the surface energy formulation in both static and dynamic wetting. The proposed LB formulation is applied to simulating droplet impingement dynamics in three dimensions and results are compared to those obtained with the continuous phase-field model, the LB simulations reported in the literature, and experimental data from the literature. The results show that the proposed LB simulation approach yields not only a significant speed improvement over the phase-field model in simulating droplet impingement dynamics on a submillimeter length scale, but also better accuracy than both the phase-field model and the previously reported LB techniques when compared to experimental data. Upon validation, the proposed LB modeling methodology is applied to the study of multiple-droplet impingement and interactions in three dimensions, which demonstrates its powerful capability of simulating extremely complex interface phenomena.

  16. Simulation of land use change in the Three Gorges Reservoir area based on CART-CA

    NASA Astrophysics Data System (ADS)

    Yuan, Min

    2018-05-01

    This study proposes a new method for simulating spatiotemporally complex multiple land uses using a cellular automaton (CA) model based on the classification and regression tree (CART) algorithm. In this model, the CART algorithm is used to calculate land-class conversion probabilities, which are combined with a neighborhood factor and a random factor to extract the cellular transition rules. In the simulation of land-use dynamics in the Three Gorges Reservoir area from 2000 to 2010, the overall Kappa coefficient is 0.8014 and the overall accuracy is 0.8821, and the simulation results are satisfactory.
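
    A sketch of a transition rule of this kind (our own reading of the description above: feature names, weighting scheme and function names are hypothetical, with scikit-learn's decision tree standing in for the CART step):

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        # Hypothetical training data: driver variables per cell (elevation,
        # slope, distance to road, ...) and the observed land class.
        rng = np.random.default_rng(4)
        X_train = rng.random((5000, 4))
        y_train = rng.integers(0, 3, 5000)
        cart = DecisionTreeClassifier(max_depth=8).fit(X_train, y_train)

        def next_class(cell_features, neighbor_share, randomness=0.1):
            """One CA update: CART conversion probabilities weighted by the
            per-class share of like neighbors and a stochastic factor."""
            p = cart.predict_proba(cell_features.reshape(1, -1))[0]
            p = p * neighbor_share                        # neighborhood factor
            p = p * (1.0 + randomness * rng.random(p.size))  # random factor
            return int(np.argmax(p))

        # Example: a cell with uniform neighborhood weights over 3 classes.
        print(next_class(rng.random(4), np.ones(3)))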

  17. Simulating Coupling Complexity in Space Plasmas: First Results from a new code

    NASA Astrophysics Data System (ADS)

    Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.

    2005-12-01

    The development of codes that embrace 'coupling complexity' via the self-consistent incorporation of multiple physical scales and multiple physical processes in models has been identified by the NRC Decadal Survey in Solar and Space Physics as a crucial necessary development in simulation/modeling technology for the coming decade. The National Science Foundation, through its Information Technology Research (ITR) Program, is supporting our efforts to develop a new class of computational code for plasmas and neutral gases that integrates multiple scales and multiple physical processes and descriptions. We are developing a highly modular, parallelized, scalable code that incorporates multiple scales by synthesizing 3 simulation technologies: 1) Computational fluid dynamics (hydrodynamics or magneto-hydrodynamics-MHD) for the large-scale plasma; 2) direct Monte Carlo simulation of atoms/neutral gas, and 3) transport code solvers to model highly energetic particle distributions. We are constructing the code so that a fourth simulation technology, hybrid simulations for microscale structures and particle distributions, can be incorporated in future work, but for the present, this aspect will be addressed at a test-particle level. This synthesis will provide a computational tool that will advance our understanding of the physics of neutral and charged gases enormously. Besides making major advances in basic plasma physics and neutral gas problems, this project will address 3 Grand Challenge space physics problems that reflect our research interests: 1) To develop a temporal global heliospheric model which includes the interaction of solar and interstellar plasma with neutral populations (hydrogen, helium, etc., and dust), test-particle kinetic pickup ion acceleration at the termination shock, anomalous cosmic ray production, interaction with galactic cosmic rays, while incorporating the time variability of the solar wind and the solar cycle. 2) To develop a coronal mass ejection and interplanetary shock propagation model for the inner and outer heliosphere, including, at a test-particle level, wave-particle interactions and particle acceleration at traveling shock waves and compression regions. 3) To develop an advanced Geospace General Circulation Model (GGCM) capable of realistically modeling space weather events, in particular the interaction with CMEs and geomagnetic storms. Furthermore, by implementing scalable run-time supports and sophisticated off- and on-line prediction algorithms, we anticipate important advances in the development of automatic and intelligent system software to optimize a wide variety of 'embedded' computations on parallel computers. Finally, public domain MHD and hydrodynamic codes had a transforming effect on space and astrophysics. We expect that our new generation, open source, public domain multi-scale code will have a similar transformational effect in a variety of disciplines, opening up new classes of problems to physicists and engineers alike.

  18. Techniques and resources for storm-scale numerical weather prediction

    NASA Technical Reports Server (NTRS)

    Droegemeier, Kelvin; Grell, Georg; Doyle, James; Soong, Su-Tzai; Skamarock, William; Bacon, David; Staniforth, Andrew; Crook, Andrew; Wilhelmson, Robert

    1993-01-01

    The topics discussed include the following: multiscale application of the 5th-generation PSU/NCAR mesoscale model, the coupling of nonhydrostatic atmospheric and hydrostatic ocean models for air-sea interaction studies; a numerical simulation of cloud formation over complex topography; adaptive grid simulations of convection; an unstructured grid, nonhydrostatic meso/cloud scale model; efficient mesoscale modeling for multiple scales using variable resolution; initialization of cloud-scale models with Doppler radar data; and making effective use of future computing architectures, networks, and visualization software.

  19. Simulation of Rate-Related (Dead-Time) Losses In Passive Neutron Multiplicity Counting Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, L.G.; Norman, P.I.; Leadbeater, T.W.

    Passive Neutron Multiplicity Counting (PNMC) based on Multiplicity Shift Register (MSR) electronics (a form of time correlation analysis) is a widely used non-destructive assay technique for quantifying spontaneously fissile materials such as Pu. At high event rates, dead-time losses perturb the count rates, with the Singles, Doubles and Triples being increasingly affected. Without correction these perturbations are a major source of inaccuracy in the measured count rates and assay values derived from them. This paper presents the simulation of dead-time losses and investigates the effect of applying different dead-time models on the observed MSR data. Monte Carlo methods have been used to simulate neutron pulse trains for a variety of source intensities and with ideal detection geometry, providing an event by event record of the time distribution of neutron captures within the detection system. The action of the MSR electronics was modelled in software to analyse these pulse trains. Stored pulse trains were perturbed in software to apply the effects of dead-time according to the chosen physical process; for example, the ideal paralysable (extending) and non-paralysable models with an arbitrary dead-time parameter. Results of the simulations demonstrate the change in the observed MSR data when the system dead-time parameter is varied. In addition, the paralysable and non-paralysable models of dead-time are compared. These results form part of a larger study to evaluate existing dead-time corrections and to extend their application to correlated sources. (authors)
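
    The two ideal dead-time models are simple to apply to a stored pulse train. A minimal sketch with illustrative parameters (the function name and the Poisson test train are ours, not the paper's):

        import numpy as np

        def apply_dead_time(times, tau, paralysable):
            """Filter sorted event times through an ideal dead-time tau.
            Non-paralysable: an event is lost only if it arrives within tau
            of the last *recorded* event.  Paralysable (extending): every
            event, recorded or lost, restarts the dead period."""
            recorded, t_dead = [], -np.inf
            for t in times:
                if t >= t_dead:
                    recorded.append(t)
                    t_dead = t + tau
                elif paralysable:
                    t_dead = t + tau      # lost events extend the dead time
            return np.asarray(recorded)

        # Example: ~1 MHz Poisson pulse train, 100 ns dead-time.
        rng = np.random.default_rng(2)
        times = np.cumsum(rng.exponential(1e-6, 100_000))
        out_nonpar = apply_dead_time(times, 100e-9, paralysable=False)
        out_par = apply_dead_time(times, 100e-9, paralysable=True)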

  20. Apollo: Giving application developers a single point of access to public health models using structured vocabularies and Web services

    PubMed Central

    Wagner, Michael M.; Levander, John D.; Brown, Shawn; Hogan, William R.; Millett, Nicholas; Hanna, Josh

    2013-01-01

    This paper describes the Apollo Web Services and Apollo-SV, its related ontology. The Apollo Web Services give an end-user application a single point of access to multiple epidemic simulators. An end user can specify an analytic problem—which we define as a configuration and a query of results—exactly once and submit it to multiple epidemic simulators. The end user represents the analytic problem using a standard syntax and vocabulary, not the native languages of the simulators. We have demonstrated the feasibility of this design by implementing a set of Apollo services that provide access to two epidemic simulators and two visualizer services. PMID:24551417

  1. Apollo: giving application developers a single point of access to public health models using structured vocabularies and Web services.

    PubMed

    Wagner, Michael M; Levander, John D; Brown, Shawn; Hogan, William R; Millett, Nicholas; Hanna, Josh

    2013-01-01

    This paper describes the Apollo Web Services and Apollo-SV, its related ontology. The Apollo Web Services give an end-user application a single point of access to multiple epidemic simulators. An end user can specify an analytic problem-which we define as a configuration and a query of results-exactly once and submit it to multiple epidemic simulators. The end user represents the analytic problem using a standard syntax and vocabulary, not the native languages of the simulators. We have demonstrated the feasibility of this design by implementing a set of Apollo services that provide access to two epidemic simulators and two visualizer services.

  2. A semi-analytical model of a time reversal cavity for high-amplitude focused ultrasound applications

    NASA Astrophysics Data System (ADS)

    Robin, J.; Tanter, M.; Pernot, M.

    2017-09-01

    Time reversal cavities (TRC) have been proposed as an efficient approach for 3D ultrasound therapy. They allow the precise spatio-temporal focusing of high-power ultrasound pulses within a large region of interest with a low number of transducers. Leaky TRCs are usually built by placing a multiple scattering medium, such as a random rod forest, in a reverberating cavity, and the final peak pressure gain of the device only depends on the temporal length of its impulse response. Such multiple scattering in a reverberating cavity is a complex phenomenon, and optimisation of the device’s gain is usually a cumbersome, mostly empirical process requiring numerical simulations with extremely long computation times. In this paper, we present a semi-analytical model for the fast optimisation of a TRC. This model decouples ultrasound propagation in an empty cavity and multiple scattering in a multiple scattering medium. It was validated numerically and experimentally using a 2D-TRC and numerically using a 3D-TRC. Finally, the model was used to rapidly determine the optimal parameters of the 3D-TRC, which were confirmed by numerical simulations.

  3. Multidimensional computer simulation of Stirling cycle engines

    NASA Technical Reports Server (NTRS)

    Hall, C. A.; Porsching, T. A.; Medley, J.; Tew, R. C.

    1990-01-01

    The computer code ALGAE (algorithms for the gas equations) treats incompressible, thermally expandable, or locally compressible flows in complicated two-dimensional flow regions. The solution method, finite differencing schemes, and basic modeling of the field equations in ALGAE are applicable to engineering design settings of the type found in Stirling cycle engines. The use of ALGAE to model multiple components of the space power research engine (SPRE) is reported. Videotape computer simulations of the transient behavior of the working gas (helium) in the heater-regenerator-cooler complex of the SPRE demonstrate the usefulness of such a program in providing information on thermal and hydraulic phenomena in multiple component sections of the SPRE.

  4. Simulations of Ground-Water Flow, Transport, Age, and Particle Tracking near York, Nebraska, for a Study of Transport of Anthropogenic and Natural Contaminants (TANC) to Public-Supply Wells

    USGS Publications Warehouse

    Clark, Brian R.; Landon, Matthew K.; Kauffman, Leon J.; Hornberger, George Z.

    2008-01-01

    Contamination of public-supply wells has resulted in public-health threats and negative economic effects for communities that must treat contaminated water or find alternative water supplies. To investigate factors controlling vulnerability of public-supply wells to anthropogenic and natural contaminants using consistent and systematic data collected in a variety of principal aquifer settings in the United States, a study of Transport of Anthropogenic and Natural Contaminants to public-supply wells was begun in 2001 as part of the U.S. Geological Survey National Water-Quality Assessment Program. The area simulated by the ground-water flow model described in this report was selected for a study of processes influencing contaminant distribution and transport along the direction of ground-water flow towards a public-supply well in southeastern York, Nebraska. Ground-water flow is simulated for a 60-year period from September 1, 1944, to August 31, 2004. Steady-state conditions are simulated prior to September 1, 1944, and represent conditions prior to use of ground water for irrigation. Irrigation, municipal, and industrial wells were simulated using the Multi-Node Well package of the modular three-dimensional ground-water flow model code, MODFLOW-2000, which allows simulation of flow and solutes through wells that are simulated in multiple nodes or layers. Ground-water flow, age, and transport of selected tracers were simulated using the Ground-Water Transport process of MODFLOW-2000. Simulated ground-water age was compared to interpreted ground-water age in six monitoring wells in the unconfined aquifer. The tracer chlorofluorocarbon-11 was simulated directly using Ground-Water Transport for comparison with concentrations measured in six monitoring wells and one public supply well screened in the upper confined aquifer. Three alternative model simulations indicate that simulation results are highly sensitive to the distribution of multilayer well bores where leakage can occur and that the calibrated model resulted in smaller differences than the alternative models between simulated and interpreted ages and measured tracer concentrations in most, but not all, wells. Results of the first alternative model indicate that the distribution of young water in the upper confined aquifer is substantially different when well-bore leakage at known abandoned wells and test holes is removed from the model. In the second alternative model, simulated age near the bottom of the unconfined aquifer was younger than interpreted ages and simulated chlorofluorocarbon-11 concentrations in the upper confined aquifer were zero in five out of six wells because the conventional Well Package fails to account for flow between model layers though well bores. The third alternative model produced differences between simulated and interpreted ground-water ages and measured chlorofluorocarbon-11 concentrations that were comparable to the calibrated model. However, simulated hydraulic heads deviated from measured hydraulic heads by a greater amount than for the calibrated model. Even so, because the third alternative model simulates steady-state flow, additional analysis was possible using steady-state particle tracking to assess the contributing recharge area to a public supply well selected for analysis of factors contributing to well vulnerability. 
Results from particle-tracking software (MODPATH) using the third alternative model indicate that the contributing recharge area of the study public-supply well is a composite of elongated, seemingly isolated areas associated with wells that are screened in multiple aquifers. The simulated age distribution of particles at the study public-supply well indicates that all water younger than 58 years travels through well bores of wells screened in multiple aquifers. The age distribution from the steady-state model using MODPATH estimates the youngest 7 percent of the water to have a flow-weighted mean age

  5. MultiGeMS: detection of SNVs from multiple samples using model selection on high-throughput sequencing data.

    PubMed

    Murillo, Gabriel H; You, Na; Su, Xiaoquan; Cui, Wei; Reilly, Muredach P; Li, Mingyao; Ning, Kang; Cui, Xinping

    2016-05-15

    Single nucleotide variant (SNV) detection procedures are being utilized as never before to analyze the recent abundance of high-throughput DNA sequencing data, both on single and multiple sample datasets. Building on previously published work with the single sample SNV caller genotype model selection (GeMS), a multiple sample version of GeMS (MultiGeMS) is introduced. Unlike other popular multiple sample SNV callers, the MultiGeMS statistical model accounts for enzymatic substitution sequencing errors. It also addresses the multiple testing problem endemic to multiple sample SNV calling and utilizes high performance computing (HPC) techniques. A simulation study demonstrates that MultiGeMS ranks highest in precision among a selection of popular multiple sample SNV callers, while showing exceptional recall in calling common SNVs. Further, both simulation studies and real data analyses indicate that MultiGeMS is robust to low-quality data. We also demonstrate that accounting for enzymatic substitution sequencing errors not only improves SNV call precision at low mapping quality regions, but also improves recall at reference allele-dominated sites with high mapping quality. Availability: The MultiGeMS package can be downloaded from https://github.com/cui-lab/multigems. Contact: xinping.cui@ucr.edu. Supplementary data are available at Bioinformatics online.

  6. Multiscale modeling of mucosal immune responses

    PubMed Central

    2015-01-01

    Computational modeling techniques are playing increasingly important roles in advancing a systems-level mechanistic understanding of biological processes. Computer simulations guide and underpin experimental and clinical efforts. This study presents ENteric Immune Simulator (ENISI), a multiscale modeling tool for modeling the mucosal immune responses. ENISI's modeling environment can simulate in silico experiments from molecular signaling pathways to tissue level events such as tissue lesion formation. ENISI's architecture integrates multiple modeling technologies including ABM (agent-based modeling), ODE (ordinary differential equations), SDE (stochastic differential equations), and PDE (partial differential equations). This paper focuses on the implementation and developmental challenges of ENISI. A multiscale model of mucosal immune responses during colonic inflammation, including CD4+ T cell differentiation and tissue level cell-cell interactions, was developed to illustrate the capabilities, power and scope of ENISI MSM. Background: Computational techniques are becoming increasingly powerful and modeling tools for biological systems are increasingly needed. Biological systems are inherently multiscale, from molecules to tissues and from nanoseconds to a lifespan of several years or decades. ENISI MSM integrates multiple modeling technologies to understand immunological processes from signaling pathways within cells to lesion formation at the tissue level. This paper examines and summarizes the technical details of ENISI, from its initial version to its latest cutting-edge implementation. Implementation: An object-oriented programming approach is adopted to develop a suite of tools based on ENISI. Multiple modeling technologies are integrated to visualize tissues, cells as well as proteins; furthermore, performance matching between the scales is addressed. Conclusion: We used ENISI MSM for developing predictive multiscale models of the mucosal immune system during gut inflammation. Our modeling predictions dissect the mechanisms by which effector CD4+ T cell responses contribute to tissue damage in the gut mucosa following immune dysregulation. PMID:26329787

  7. Multiscale modeling of mucosal immune responses.

    PubMed

    Mei, Yongguo; Abedi, Vida; Carbo, Adria; Zhang, Xiaoying; Lu, Pinyi; Philipson, Casandra; Hontecillas, Raquel; Hoops, Stefan; Liles, Nathan; Bassaganya-Riera, Josep

    2015-01-01

    Computational techniques are becoming increasingly powerful and modeling tools for biological systems are increasingly needed. Biological systems are inherently multiscale, from molecules to tissues and from nanoseconds to a lifespan of several years or decades. ENISI MSM integrates multiple modeling technologies to understand immunological processes from signaling pathways within cells to lesion formation at the tissue level. This paper examines and summarizes the technical details of ENISI, from its initial version to its latest cutting-edge implementation. An object-oriented programming approach is adopted to develop a suite of tools based on ENISI. Multiple modeling technologies are integrated to visualize tissues, cells as well as proteins; furthermore, performance matching between the scales is addressed. We used ENISI MSM for developing predictive multiscale models of the mucosal immune system during gut inflammation. Our modeling predictions dissect the mechanisms by which effector CD4+ T cell responses contribute to tissue damage in the gut mucosa following immune dysregulation. Computational modeling techniques are playing increasingly important roles in advancing a systems-level mechanistic understanding of biological processes. Computer simulations guide and underpin experimental and clinical efforts. This study presents ENteric Immune Simulator (ENISI), a multiscale modeling tool for modeling the mucosal immune responses. ENISI's modeling environment can simulate in silico experiments from molecular signaling pathways to tissue level events such as tissue lesion formation. ENISI's architecture integrates multiple modeling technologies including ABM (agent-based modeling), ODE (ordinary differential equations), SDE (stochastic differential equations), and PDE (partial differential equations). This paper focuses on the implementation and developmental challenges of ENISI. A multiscale model of mucosal immune responses during colonic inflammation, including CD4+ T cell differentiation and tissue level cell-cell interactions, was developed to illustrate the capabilities, power and scope of ENISI MSM.

  8. Using Nucleon Multiplicities to Analyze Anti-Neutrino Interactions with Nuclei

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elkins, Miranda J.

    The most commonly used, simple interaction models have not accurately described the nuclear effects on either neutrino-nucleus or anti-neutrino-nucleus interactions. Comparison of data collected by the MINERvA experiment with these models shows a discrepancy in the reconstructed hadronic energy distribution at momentum transfers below 0.8 GeV. Two nuclear model effects that were previously not modeled are possible culprits for this discrepancy. The first is known as the random phase approximation (RPA), and the second is the addition of a meson exchange current process, also known as two-particle two-hole (2p2h) because it results in two particles leaving the nucleus with two holes left in their place. For the first time, a neutron counting software algorithm has been created and used to compare the multiplicity and spatial distributions of neutrons between the simulation and data. There is localized sensitivity to the RPA and 2p2h effects, and both help the simulation better describe the data. Additional systematic or model effects are present which cause the simulation to overproduce neutrons, and potential causes are discussed.

  9. Application of artificial neural networks in hydrological modeling: A case study of runoff simulation of a Himalayan glacier basin

    NASA Technical Reports Server (NTRS)

    Buch, A. M.; Narain, A.; Pandey, P. C.

    1994-01-01

    The simulation of runoff from a Himalayan Glacier basin using an Artificial Neural Network (ANN) is presented. The performance of the ANN model is found to be superior to the Energy Balance Model and the Multiple Regression model. The RMS Error is used as the figure of merit for judging the performance of the three models, and the RMS Error for the ANN model is the lowest of the three. The ANN is faster in learning and exhibits excellent system generalization characteristics.
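
    For reference, a minimal sketch of the RMS-error figure of merit used above to rank the three models; the runoff series and values below are purely illustrative and are not drawn from the paper.

        import numpy as np

        def rms_error(observed, simulated):
            # Root-mean-square error, the figure of merit used to rank the models.
            observed, simulated = np.asarray(observed), np.asarray(simulated)
            return float(np.sqrt(np.mean((observed - simulated) ** 2)))

        # Hypothetical daily runoff series (m^3/s); values are illustrative only.
        obs = [12.0, 15.5, 14.2, 18.9, 16.4]
        models = {
            "ANN":                 [12.3, 15.1, 14.5, 18.4, 16.2],
            "Energy Balance":      [10.8, 14.0, 15.9, 17.1, 18.0],
            "Multiple Regression": [11.2, 16.9, 13.0, 17.5, 17.3],
        }
        for name, sim in models.items():
            print(f"{name}: RMS error = {rms_error(obs, sim):.3f}")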

  10. Simulation of large-scale rule-based models

    PubMed Central

    Colvin, Joshua; Monine, Michael I.; Faeder, James R.; Hlavacek, William S.; Von Hoff, Daniel D.; Posner, Richard G.

    2009-01-01

    Motivation: Interactions of molecules, such as signaling proteins, with multiple binding sites and/or multiple sites of post-translational covalent modification can be modeled using reaction rules. Rules comprehensively, but implicitly, define the individual chemical species and reactions that molecular interactions can potentially generate. Although rules can be automatically processed to define a biochemical reaction network, the network implied by a set of rules is often too large to generate completely or to simulate using conventional procedures. To address this problem, we present DYNSTOC, a general-purpose tool for simulating rule-based models. Results: DYNSTOC implements a null-event algorithm for simulating chemical reactions in a homogeneous reaction compartment. The simulation method does not require that a reaction network be specified explicitly in advance, but rather takes advantage of the availability of the reaction rules in a rule-based specification of a network to determine if a randomly selected set of molecular components participates in a reaction during a time step. DYNSTOC reads reaction rules written in the BioNetGen language (BNGL), which is useful for modeling protein–protein interactions involved in signal transduction. The method of DYNSTOC is closely related to that of StochSim. DYNSTOC differs from StochSim by allowing for model specification in terms of BNGL, which extends the range of protein complexes that can be considered in a model. DYNSTOC enables the simulation of rule-based models that cannot be simulated by conventional methods. We demonstrate the ability of DYNSTOC to simulate models accounting for multisite phosphorylation and multivalent binding processes that are characterized by large numbers of reactions. Availability: DYNSTOC is free for non-commercial use. The C source code, supporting documentation and example input files are available at http://public.tgen.org/dynstoc/. Contact: dynstoc@tgen.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19213740
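
    To make the null-event idea concrete, here is a minimal Python sketch in the spirit of the StochSim/DYNSTOC approach, not DYNSTOC's actual implementation: at each fixed time step, randomly chosen molecules either react with a probability derived from the rate constants, or a "null event" occurs and only time advances. The single reversible binding rule and the per-step probabilities are illustrative stand-ins.

        import random

        # One reversible rule, A + B <-> AB, with per-step reaction
        # probabilities standing in for rate constants scaled by the time step.
        k_on, k_off = 0.01, 0.1
        molecules = ["A"] * 50 + ["B"] * 50        # initial mixture of monomers

        def step(molecules):
            # Pick two distinct molecules at random and test the rule.
            i, j = random.sample(range(len(molecules)), 2)
            a, b = molecules[i], molecules[j]
            if {a, b} == {"A", "B"} and random.random() < k_on:
                molecules[i] = "AB"                # bind: fuse the pair
                molecules.pop(j)
            elif a == "AB" and random.random() < k_off:
                molecules[i] = "A"                 # unbind: split the complex
                molecules.append("B")
            # otherwise a "null event": time advances, state is unchanged

        for _ in range(10000):
            step(molecules)
        print(molecules.count("AB"), "complexes after 10000 steps")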

  11. Landscape Builder: software for the creation of initial landscapes for LANDIS from FIA data

    Treesearch

    William Dijak

    2013-01-01

    I developed Landscape Builder to create spatially explicit landscapes as starting conditions for LANDIS Pro 7.0 and LANDIS II landscape forest simulation models from classified satellite imagery and Forest Inventory and Analysis (FIA) data collected over multiple years. LANDIS Pro and LANDIS II models project future landscapes by simulating tree growth, tree species...

  12. Performance modeling & simulation of complex systems (A systems engineering design & analysis approach)

    NASA Technical Reports Server (NTRS)

    Hall, Laverne

    1995-01-01

    Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.

  13. Climate change and watershed mercury export: a multiple projection and model analysis.

    PubMed

    Golden, Heather E; Knightes, Christopher D; Conrads, Paul A; Feaster, Toby D; Davis, Gary M; Benedict, Stephen T; Bradley, Paul M

    2013-09-01

    Future shifts in climatic conditions may impact watershed mercury (Hg) dynamics and transport. An ensemble of watershed models was applied in the present study to simulate and evaluate the responses of hydrological and total Hg (THg) fluxes from the landscape to the watershed outlet and in-stream THg concentrations to contrasting climate change projections for a watershed in the southeastern coastal plain of the United States. Simulations were conducted under stationary atmospheric deposition and land cover conditions to explicitly evaluate the effect of projected precipitation and temperature on watershed Hg export (i.e., the flux of Hg at the watershed outlet). Based on downscaled inputs from 2 global circulation models that capture extremes of projected wet (Community Climate System Model, Ver 3 [CCSM3]) and dry (ECHAM4/HOPE-G [ECHO]) conditions for this region, watershed model simulation results suggest a decrease of approximately 19% in ensemble-averaged mean annual watershed THg fluxes using the ECHO climate-change model and an increase of approximately 5% in THg fluxes with the CCSM3 model. Ensemble-averaged mean annual ECHO in-stream THg concentrations increased 20%, while those of CCSM3 decreased by 9% between the baseline and projected simulation periods. Watershed model simulation results using both climate change models suggest that monthly watershed THg fluxes increase during the summer, when projected flow is higher than baseline conditions. The present study's multiple watershed model approach underscores the uncertainty associated with climate change response projections and their use in climate change management decisions. Thus, single-model predictions can be misleading, particularly in developmental stages of watershed Hg modeling. Copyright © 2013 SETAC.

  14. Multimodel simulations of forest harvesting effects on long‐term productivity and CN cycling in aspen forests.

    PubMed

    Wang, Fugui; Mladenoff, David J; Forrester, Jodi A; Blanco, Juan A; Scheller, Robert M; Peckham, Scott D; Keough, Cindy; Lucash, Melissa S; Gower, Stith T

    The effects of forest management on soil carbon (C) and nitrogen (N) dynamics vary by harvest type and species. We simulated the long-term effects of bole-only harvesting of aspen (Populus tremuloides) on stand productivity and the interaction of C and N cycles with a multiple-model approach. Five models, Biome-BGC, CENTURY, FORECAST, LANDIS-II with Century-based soil dynamics, and PnET-CN, were run for 350 yr with seven harvesting events on nutrient-poor, sandy soils representing northwestern Wisconsin, United States. Twenty C and N state and flux variables were summarized from the models' outputs and statistically analyzed using ordination and variance analysis methods. The multiple-model averages suggest that bole-only harvest would not significantly affect the long-term site productivity of aspen, though declines in soil organic matter and soil N were significant. In addition to direct N removal by harvesting, extensive leaching in the period between harvest and canopy closure was another major cause of N depletion. The five models differed notably in the output values of the 20 variables examined, although there were some similarities for certain variables. PnET-CN produced unique results for every variable, while CENTURY showed fewer outliers and temporal patterns similar to the mean of all models. In general, we demonstrated that when there are no site-specific data for fine-scale calibration and evaluation of a single model, the multiple-model approach may be more robust for long-term simulations. In addition, multimodeling may also improve the calibration and evaluation of an individual model.

  15. Multiple-Objective Stepwise Calibration Using Luca

    USGS Publications Warehouse

    Hay, Lauren E.; Umemoto, Makiko

    2007-01-01

    This report documents Luca (Let us calibrate), a multiple-objective, stepwise, automated procedure for hydrologic model calibration and the associated graphical user interface (GUI). Luca is a wizard-style user-friendly GUI that provides an easy systematic way of building and executing a calibration procedure. The calibration procedure uses the Shuffled Complex Evolution global search algorithm to calibrate any model compiled with the U.S. Geological Survey's Modular Modeling System. This process assures that intermediate and final states of the model are simulated consistently with measured values.
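
    A schematic sketch of the stepwise idea, under stated assumptions: a toy three-parameter model, synthetic observations, and SciPy's differential_evolution standing in for the Shuffled Complex Evolution search Luca actually uses. Each step calibrates one parameter subset against its own objective and then freezes the result before the next step runs.

        import numpy as np
        from scipy.optimize import differential_evolution

        def run_model(params):
            # Hypothetical 3-parameter model returning (flow, snow) series.
            a, b, c = params
            t = np.arange(100.0)
            flow = a * np.exp(-t / (10.0 * b))
            snow = c * np.maximum(0.0, 50.0 - t)
            return flow, snow

        def rmse(sim, obs):
            return float(np.sqrt(np.mean((sim - obs) ** 2)))

        obs_flow, obs_snow = run_model([2.0, 1.5, 0.3])       # synthetic "observations"

        params = [1.0, 1.0, 1.0]                              # initial parameter values
        steps = [                                             # (indices, objective) per step
            ([2], lambda flow, snow: rmse(snow, obs_snow)),   # step 1: snow parameter
            ([0, 1], lambda flow, snow: rmse(flow, obs_flow)),# step 2: flow parameters
        ]
        for idx, objective in steps:
            def cost(x, idx=idx, objective=objective):
                trial = list(params)
                for k, v in zip(idx, x):
                    trial[k] = v
                return objective(*run_model(trial))
            result = differential_evolution(cost, bounds=[(0.1, 5.0)] * len(idx), seed=1)
            for k, v in zip(idx, result.x):
                params[k] = v                                 # freeze this step's parameters
        print("calibrated parameters:", np.round(params, 3))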

  16. Reproducing multi-model ensemble average with Ensemble-averaged Reconstructed Forcings (ERF) in regional climate modeling

    NASA Astrophysics Data System (ADS)

    Erfanian, A.; Fomenko, L.; Wang, G.

    2016-12-01

    Multi-model ensemble (MME) averaging is considered the most reliable approach for simulating both present-day and future climates, and it has been a primary reference for drawing conclusions in major coordinated studies, e.g., the IPCC Assessment Reports and CORDEX. The biases of individual models cancel each other out in the MME average, enabling the ensemble mean to outperform individual members in simulating the mean climate. This enhancement, however, comes with tremendous computational cost, which is especially inhibiting for regional climate modeling, as model uncertainties can originate from both the RCMs and the driving GCMs. Here we propose the Ensemble-based Reconstructed Forcings (ERF) approach to regional climate modeling, which achieves a similar level of bias reduction at a fraction of the cost of the conventional MME approach. The new method constructs a single set of initial and boundary conditions (IBCs) by averaging the IBCs of multiple GCMs, and drives the RCM with this ensemble average of IBCs in a single run. Using a regional climate model (RegCM4.3.4-CLM4.5), we tested the method over West Africa for multiple combinations of (up to six) GCMs. Our results indicate that the performance of the ERF method is comparable to that of the MME average in simulating the mean climate. The bias reduction seen in ERF simulations is achieved by using more realistic IBCs in solving the system of equations underlying the RCM physics and dynamics. This endows the new method with a theoretical advantage in addition to reducing computational cost: the ERF output is an unaltered solution of the RCM, as opposed to a climate state that might not be physically plausible due to the averaging of multiple solutions in the conventional MME approach. The ERF approach should be considered for use in major international efforts such as CORDEX. Key words: Multi-model ensemble, ensemble analysis, ERF, regional climate modeling
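
    The core of the ERF construction reduces to an average over GCM forcing fields before the single RCM run; the sketch below illustrates it with random stand-in arrays. Real IBCs comprise many meteorological fields (winds, temperature, humidity, surface pressure) pre-interpolated to a common grid and calendar, which the shapes here only gesture at.

        import numpy as np

        n_gcm, n_time, n_lat, n_lon = 6, 8, 45, 60
        rng = np.random.default_rng(0)

        # Stand-in for one IBC field (e.g., temperature) from each GCM, already
        # interpolated to a common grid and calendar, a prerequisite for averaging.
        gcm_ibcs = rng.normal(288.0, 5.0, size=(n_gcm, n_time, n_lat, n_lon))

        erf_ibc = gcm_ibcs.mean(axis=0)        # single reconstructed forcing set

        # One RCM run is then driven with erf_ibc, instead of running the RCM
        # once per GCM and averaging the outputs afterwards.
        print(erf_ibc.shape)                   # (8, 45, 60)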

  17. Spacecraft Trajectory Analysis and Mission Planning Simulation (STAMPS) Software

    NASA Technical Reports Server (NTRS)

    Puckett, Nancy; Pettinger, Kris; Hallstrom, John; Brownfield, Dana; Blinn, Eric; Williams, Frank; Wiuff, Kelli; McCarty, Steve; Ramirez, Daniel; Lamotte, Nicole

    2014-01-01

    STAMPS simulates either three- or six-degree-of-freedom cases for all spacecraft flight phases using translated HAL flight software or generic GN&C models. Single or multiple trajectories can be simulated for use in optimization and dispersion analysis. It includes math models for the vehicle and environment, and currently features a "C" version of shuttle onboard flight software. The STAMPS software is used for mission planning and analysis within ascent/descent, rendezvous, proximity operations, and navigation flight design areas.

  18. Using Advanced Analysis Approaches to Complete Long-Term Evaluations of Natural Attenuation Processes on the Remediation of Dissolved Chlorinated Solvent Contamination

    DTIC Science & Technology

    2008-10-01

    and UTCHEM (Clement et al., 1998). While all four of these software packages use conservation of mass as the basic principle for tracking NAPL... simulate dissolution of a single NAPL component. UTCHEM can be used to simulate dissolution of multiple NAPL components using either linear or first... parameters. UTCHEM: a 3D, general-purpose NAPL simulator. Virulo: a probabilistic model for predicting leaching of viruses in unsaturated

  19. Space Station communications and tracking systems modeling and RF link simulation

    NASA Technical Reports Server (NTRS)

    Tsang, Chit-Sang; Chie, Chak M.; Lindsey, William C.

    1986-01-01

    In this final report, the effort spent on Space Station Communications and Tracking System Modeling and RF Link Simulation is described in detail. The effort is mainly divided into three parts: frequency division multiple access (FDMA) system simulation modeling and software implementation; a study on design and evaluation of a functional computerized RF link simulation/analysis system for Space Station; and a study on design and evaluation of simulation system architecture. This report documents the results of these studies. In addition, a separate User's Manual on Space Communications Simulation System (SCSS) (Version 1) documents the software developed for the Space Station FDMA communications system simulation. The final report, SCSS user's manual, and the software located in the NASA JSC system analysis division's VAX 750 computer together serve as the deliverables from LinCom for this project effort.

  20. SeqSIMLA2_exact: simulate multiple disease sites in large pedigrees with given disease status for diseases with low prevalence.

    PubMed

    Yao, Po-Ju; Chung, Ren-Hua

    2016-02-15

    It is difficult for current simulation tools to simulate sequence data in a pre-specified pedigree structure and pre-specified affection status. Previously, we developed a flexible tool, SeqSIMLA2, for simulating sequence data in either unrelated case-control or family samples with different disease and quantitative trait models. Here we extended the tool to efficiently simulate sequences with multiple disease sites in large pedigrees with a given disease status for each pedigree member, assuming that the disease prevalence is low. SeqSIMLA2_exact is implemented with C++ and is available at http://seqsimla.sourceforge.net. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  1. Numerical simulation of convection and heat transfer in Czochralski crystal growth by multiple-relaxation-time LBM

    NASA Astrophysics Data System (ADS)

    Liu, Ding; Huang, Weichao; Zhang, Ni

    2017-07-01

    A two-dimensional axisymmetric swirling model based on the lattice Boltzmann method (LBM) in a pseudo-Cartesian coordinate system is proposed to simulate Czochralski (Cz) crystal growth in this paper. Specifically, the multiple-relaxation-time LBM (MRT-LBM) combined with the finite difference method (FDM) is used to analyze the melt convection and heat transfer in the process of Cz crystal growth. An incompressible axisymmetric swirling MRT-LB D2Q9 model is applied to solve for the axial and radial velocities by inserting thermal buoyancy and rotational inertial force into the two-dimensional lattice Boltzmann equation. In addition, the melt temperature and the azimuthal velocity are solved by MRT-LB D2Q5 models, and the crystal temperature is solved by FDM. Comparisons of stream-function values obtained with different methods demonstrate that our hybrid model can simulate the fluid-thermal coupling in the axisymmetric swirling model correctly and effectively. Furthermore, numerical simulations of melt convection and heat transfer are conducted at high Grashof (Gr) numbers, in the range 10^5 to 10^7, and various high Reynolds (Re) numbers. The results show that our hybrid model can obtain accurate solutions of complex crystal-growth models and analyze the fluid-thermal coupling effectively under the combined action of natural convection and forced convection.
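
    For orientation, the collide-and-stream cycle underlying such a solver can be sketched with a single-relaxation-time (BGK) D2Q9 model; the paper's scheme additionally uses multiple relaxation times, buoyancy and rotational forcing, and a D2Q5 lattice for temperature, all of which are omitted in this simplified stand-in.

        import numpy as np

        # D2Q9 lattice: velocity set and weights.
        c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                      [1, 1], [-1, 1], [-1, -1], [1, -1]])
        w = np.array([4/9] + [1/9]*4 + [1/36]*4)
        nx, ny, tau = 64, 64, 0.8                 # grid size and relaxation time

        def equilibrium(rho, ux, uy):
            cu = 3.0 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
            usq = 1.5 * (ux**2 + uy**2)
            return rho * w[:, None, None] * (1.0 + cu + 0.5 * cu**2 - usq)

        # Start from rest with a small Gaussian density bump in the center.
        X, Y = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
        rho0 = 1.0 + 0.05 * np.exp(-((X - nx/2)**2 + (Y - ny/2)**2) / 50.0)
        f = equilibrium(rho0, np.zeros((nx, ny)), np.zeros((nx, ny)))

        for _ in range(100):
            rho = f.sum(axis=0)
            ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
            uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
            f += -(f - equilibrium(rho, ux, uy)) / tau           # BGK collision
            for k in range(9):                                   # streaming, periodic
                f[k] = np.roll(np.roll(f[k], c[k, 0], axis=0), c[k, 1], axis=1)
        print("mass conserved:", np.isclose(f.sum(), rho0.sum()))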

  2. Multiple Model Adaptive Attitude Control of LEO Satellite with Angular Velocity Constraints

    NASA Astrophysics Data System (ADS)

    Shahrooei, Abolfazl; Kazemi, Mohammad Hosein

    2018-04-01

    In this paper, multiple model adaptive control is utilized to improve the transient response of the attitude control system for a rigid spacecraft. An adaptive output feedback control law is proposed for attitude control under angular velocity constraints, and its almost global asymptotic stability is proved. The multiple model adaptive control approach is employed to counteract large uncertainty in the parameter space of the inertia matrix. The nonlinear dynamics of a low-Earth-orbit satellite are simulated and the proposed control algorithm is implemented. The reported results show the effectiveness of the suggested scheme.

  3. A feedback control model for network flow with multiple pure time delays

    NASA Technical Reports Server (NTRS)

    Press, J.

    1972-01-01

    A control model describing a network flow hindered by multiple pure time (or transport) delays is formulated. Feedbacks connect each desired output with a single control sector situated at the origin. The dynamic formulation invokes the use of differential-difference equations. This causes the characteristic equation of the model to consist of transcendental functions instead of a common algebraic polynomial. A general graphical criterion is developed to evaluate the stability of such a problem. A digital computer simulation confirms the validity of this criterion. An optimal decision-making process with multiple delays is presented.
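
    The differential-difference character of such models is easy to reproduce numerically. Below is a hedged sketch, not the paper's model, that Euler-integrates the scalar delay equation dx/dt = -k x(t - tau), the simplest member of this class; for this equation the stability boundary is k*tau = pi/2, the kind of threshold a graphical criterion locates.

        from collections import deque

        # Euler integration of dx/dt = -k * x(t - tau) with a delay buffer.
        # Gain and delay are illustrative; here k * tau = 1.0 < pi/2, so the
        # solution decays (with oscillation) rather than going unstable.
        k, tau, dt, T = 1.0, 1.0, 0.001, 30.0
        history = deque([1.0] * int(tau / dt))   # constant initial function on [-tau, 0]
        x = 1.0
        for _ in range(int(T / dt)):
            x_delayed = history.popleft()        # x(t - tau)
            history.append(x)                    # push current state into the delay line
            x += dt * (-k * x_delayed)
        print("x(T) =", x)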

  4. Integration of multiple theories for the simulation of laser interference lithography processes

    NASA Astrophysics Data System (ADS)

    Lin, Te-Hsun; Yang, Yin-Kuang; Fu, Chien-Chung

    2017-11-01

    For fabricating periodic structures, laser interference lithography (LIL) is superior to other lithography technologies. In contrast to traditional lithography, LIL has the advantages of a simple optical system with no mask requirements, low cost, high depth of focus, and a large patterning area in a single exposure. Generally, a simulation pattern for the periodic structure is obtained through optical interference prior to its fabrication through LIL. However, the LIL process is complex and combines the fields of optics and polymer materials; thus, a single simulation theory cannot reflect the real situation. Therefore, this research integrates multiple theories, including those of optical interference, standing waves, and photoresist characteristics, to create a mathematical model of the LIL process. The mathematical model can accurately estimate the exposure time, reducing the trial and error otherwise needed in the LIL process.
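
    The optical-interference ingredient of such a model is compact enough to sketch: for two equal-intensity plane waves meeting at half-angle theta, the fringe period is lambda/(2 sin theta). The wavelength and angle below are illustrative, and the standing-wave and photoresist components of the paper's combined model are not included.

        import numpy as np

        lam = 355e-9                             # laser wavelength [m], illustrative
        theta = np.deg2rad(20.0)                 # half-angle between the two beams
        period = lam / (2.0 * np.sin(theta))     # fringe period Lambda

        x = np.linspace(0.0, 5 * period, 1000)
        # Two-beam interference of equal-intensity plane waves (normalized):
        intensity = 4.0 * np.cos(np.pi * x / period) ** 2
        print(f"fringe period = {period * 1e9:.1f} nm")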

  5. Integration of multiple theories for the simulation of laser interference lithography processes.

    PubMed

    Lin, Te-Hsun; Yang, Yin-Kuang; Fu, Chien-Chung

    2017-11-24

    For fabricating periodic structures, laser interference lithography (LIL) is superior to other lithography technologies. In contrast to traditional lithography, LIL has the advantages of a simple optical system with no mask requirements, low cost, high depth of focus, and a large patterning area in a single exposure. Generally, a simulation pattern for the periodic structure is obtained through optical interference prior to its fabrication through LIL. However, the LIL process is complex and combines the fields of optics and polymer materials; thus, a single simulation theory cannot reflect the real situation. Therefore, this research integrates multiple theories, including those of optical interference, standing waves, and photoresist characteristics, to create a mathematical model of the LIL process. The mathematical model can accurately estimate the exposure time, reducing the trial and error otherwise needed in the LIL process.

  6. Measurement-based and mathematical investigation of multiple reignitions when switching a motor with vacuum circuit breakers

    NASA Astrophysics Data System (ADS)

    Luxa, Andreas

    The conditions in the switching system and vacuum circuit breaker necessary for the occurrence of multiple reignitions and their accompanying effects were examined. The shape of the occurring voltages was determined in relation to other types of overvoltage. A phenomenological model of the arc, based on an extension of the Mayr arc equation, was used with the simulation program NETOMAC for the switching transients. Factors which affect the arc parameters were analyzed. The results were statistically verified by 3000 three-phase switching tests on 3 standard vacuum circuit breakers under realistic system conditions; the occurring overvoltage level was measured. Dimensioning criteria for motor simulation circuits in power plants were formulated on the basis of a theoretical equivalence analysis and experimental studies. The simulation model allows a sufficiently accurate estimation of all effects.

  7. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty and ignore the model uncertainty in process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is also simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general and can be applied to a wide range of problems in hydrology and beyond.
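
    A Monte Carlo sketch of such an index, under stated assumptions: two stand-in process models per process with assumed averaging weights, an invented scalar output, and a nested estimator of Var(E[y | geology]) / Var(y), where the conditioning fixes the geology process (model choice and parameters) and the inner expectation runs over the recharge process. This illustrates the construction only; the paper's estimator and weights come from formal model averaging.

        import numpy as np

        rng = np.random.default_rng(42)

        def recharge(m, p):                    # two alternative recharge models
            return 0.2 * p if m == 0 else 0.1 * p ** 1.2

        def geology(m, k):                     # two alternative geology models
            return k if m == 0 else 2.0 * np.sqrt(k)

        w_rch = np.array([0.6, 0.4])           # model-averaging weights (assumed)
        w_geo = np.array([0.5, 0.5])
        M, n = 2000, 500                       # outer / inner Monte Carlo sizes

        cond_means, all_y = [], []
        for _ in range(M):
            mg = rng.choice(2, p=w_geo)        # fix the geology process once...
            k = rng.lognormal(0.0, 0.5)
            mr = rng.choice(2, p=w_rch, size=n)          # ...average over recharge
            p = rng.uniform(500.0, 1500.0, size=n)
            y = np.where(mr == 0, recharge(0, p), recharge(1, p)) / geology(mg, k)
            cond_means.append(y.mean())
            all_y.append(y)

        all_y = np.concatenate(all_y)
        ps_geo = np.var(cond_means) / np.var(all_y)      # Var(E[y|geo]) / Var(y)
        print(f"PS(geology) ~ {ps_geo:.3f}")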

  8. Trans-Pacific transport and evolution of aerosols: evaluation of quasi-global WRF-Chem simulation with multiple observations

    NASA Astrophysics Data System (ADS)

    Hu, Zhiyuan; Zhao, Chun; Huang, Jianping; Leung, L. Ruby; Qian, Yun; Yu, Hongbin; Huang, Lei; Kalashnikova, Olga V.

    2016-05-01

    A fully coupled meteorology-chemistry model (WRF-Chem, the Weather Research and Forecasting model coupled with chemistry) has been configured to conduct quasi-global simulation for 5 years (2010-2014) and evaluated with multiple observation data sets for the first time. The evaluation focuses on the simulation over the trans-Pacific transport region using various reanalysis and observational data sets for meteorological fields and aerosol properties. The simulation generally captures the overall spatial and seasonal variability of satellite retrieved aerosol optical depth (AOD) and absorbing AOD (AAOD) over the Pacific that is determined by the outflow of pollutants and dust and the emissions of marine aerosols. The assessment of simulated extinction Ångström exponent (EAE) indicates that the model generally reproduces the variability of aerosol size distributions as seen by satellites. In addition, the vertical profile of aerosol extinction and its seasonality over the Pacific are also well simulated. The difference between the simulation and satellite retrievals can be mainly attributed to model biases in estimating marine aerosol emissions as well as the satellite sampling and retrieval uncertainties. Compared with the surface measurements over the western USA, the model reasonably simulates the observed magnitude and seasonality of dust, sulfate, and nitrate surface concentrations, but significantly underestimates the peak surface concentrations of carbonaceous aerosol likely due to model biases in the spatial and temporal variability of biomass burning emissions and secondary organic aerosol (SOA) production. A sensitivity simulation shows that the trans-Pacific transported dust, sulfate, and nitrate can make significant contribution to surface concentrations over the rural areas of the western USA, while the peaks of carbonaceous aerosol surface concentrations are dominated by the North American emissions. Both the retrievals and simulation show small interannual variability of aerosol characteristics for 2010-2014 averaged over three Pacific sub-regions. The evaluation in this study demonstrates that the WRF-Chem quasi-global simulation can be used for investigating trans-Pacific transport of aerosols and providing reasonable inflow chemical boundaries for the western USA, allowing one to further understand the impact of transported pollutants on the regional air quality and climate with high-resolution nested regional modeling.

  9. Progress in catalytic ignition fabrication, modeling and infrastructure : (part 2) development of a multi-zone engine model simulated using MATLAB software.

    DOT National Transportation Integrated Search

    2014-02-01

    A mathematical model was developed for the purpose of providing students with data : acquisition and engine modeling experience at the University of Idaho. In developing the : model, multiple heat transfer and emissions models were researched and com...

  10. Strong ground motion simulation of the 2016 Kumamoto earthquake of April 16 using multiple point sources

    NASA Astrophysics Data System (ADS)

    Nagasaka, Yosuke; Nozu, Atsushi

    2017-02-01

    The pseudo point-source model approximates the rupture process on faults with multiple point sources for simulating strong ground motions. A simulation with this point-source model is conducted by combining a simple source spectrum following the omega-square model with a path spectrum, an empirical site amplification factor, and phase characteristics. Realistic waveforms can be synthesized using the empirical site amplification factor and phase models even though the source model is simple. The Kumamoto earthquake occurred on April 16, 2016, with M_JMA 7.3. Many strong motions were recorded at stations around the source region. Some records were considered to be affected by the rupture directivity effect. This earthquake was suitable for investigating the applicability of the pseudo point-source model, the current version of which does not consider the rupture directivity effect. Three subevents (point sources) were located on the fault plane, and the parameters of the simulation were determined. The simulated results were compared with the observed records at K-NET and KiK-net stations. It was found that the synthetic Fourier spectra and velocity waveforms generally explained the characteristics of the observed records, except for underestimation in the low frequency range. Troughs in the observed Fourier spectra were also well reproduced by placing multiple subevents near the hypocenter. The underestimation is presumably due to the following two reasons. The first is that the pseudo point-source model targets subevents that generate strong ground motions and does not consider the shallow large slip. The second reason is that the current version of the pseudo point-source model does not consider the rupture directivity effect. Consequently, strong pulses were not reproduced enough at stations northeast of Subevent 3 such as KMM004, where the effect of rupture directivity was significant, while the amplitude was well reproduced at most of the other stations. This result indicates the necessity of improving the pseudo point-source model, by introducing an azimuth-dependent corner frequency for example, so that it can incorporate the effect of rupture directivity.
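
    The omega-square source spectrum attached to each subevent is simple to write down; the sketch below evaluates S(f) = M0 / (1 + (f/fc)^2) with an illustrative moment and corner frequency. The full simulation further combines this with path attenuation, an empirical site amplification factor, and empirical phase, which are omitted here.

        import numpy as np

        def omega_square(f, M0, fc):
            # Displacement source spectrum: flat (proportional to M0) below the
            # corner frequency fc, falling as f^-2 above it.
            return M0 / (1.0 + (f / fc) ** 2)

        f = np.logspace(-1, 1.3, 200)                 # about 0.1 to 20 Hz
        spectrum = omega_square(f, M0=1e18, fc=0.4)   # illustrative values
        print(f"S({f[0]:.1f} Hz) / S({f[-1]:.1f} Hz) = {spectrum[0] / spectrum[-1]:.0f}")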

  11. Simulating fire and forest dynamics for a coordinated landscape fuel treatment project in the Sierra Nevada

    Treesearch

    Brandon M. Collins; Scott L. Stephens; Gary B. Roller; John Battles

    2011-01-01

    We evaluate an actual landscape fuel treatment project that was designed by local U. S. Forest Service managers in the northern Sierra Nevada. We model the effects of this project at reducing landscape-level fire behavior at multiple time steps, up to nearly 30 yr beyond treatment implementation. Additionally, we modeled planned treatments under multiple diameter-...

  12. Performance of Geant4 in simulating semiconductor particle detector response in the energy range below 1 MeV

    NASA Astrophysics Data System (ADS)

    Soti, G.; Wauters, F.; Breitenfeldt, M.; Finlay, P.; Kraev, I. S.; Knecht, A.; Porobić, T.; Zákoucký, D.; Severijns, N.

    2013-11-01

    Geant4 simulations play a crucial role in the analysis and interpretation of experiments providing low energy precision tests of the Standard Model. This paper focuses on the accuracy of the description of the electron processes in the energy range between 100 and 1000 keV. The effect of the different simulation parameters and multiple scattering models on the backscattering coefficients is investigated. Simulations of the response of HPGe and passivated implanted planar Si detectors to β particles are compared to experimental results. An overall good agreement is found between Geant4 simulations and experimental data.

  13. Investigation of the Multiple Model Adaptive Control (MMAC) method for flight control systems

    NASA Technical Reports Server (NTRS)

    Athans, M.; Baram, Y.; Castanon, D.; Dunn, K. P.; Green, C. S.; Lee, W. H.; Sandell, N. R., Jr.; Willsky, A. S.

    1979-01-01

    The stochastic adaptive control of the NASA F-8C digital-fly-by-wire aircraft using the multiple model adaptive control (MMAC) method is presented. The selection of the performance criteria for the lateral and the longitudinal dynamics, the design of the Kalman filters for different operating conditions, the identification algorithm associated with the MMAC method, the control system design, and simulation results obtained using the real time simulator of the F-8 aircraft at the NASA Langley Research Center are discussed.
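
    A scalar sketch of the MMAC mechanism, assuming a made-up one-dimensional system rather than the F-8C dynamics: a bank of Kalman filters, one per candidate model, whose innovation likelihoods drive a recursive update of the model probabilities. The actual design runs multimodel filters over flight conditions and blends per-model controls by these probabilities.

        import numpy as np

        rng = np.random.default_rng(7)
        a_true, q, r = 0.9, 0.05, 0.1          # true dynamics, process/measurement noise
        models = [0.5, 0.9, 1.3]               # candidate values of the unknown parameter

        x_true = 1.0
        x_hat = np.zeros(3)                    # one state estimate per model
        P = np.ones(3)                         # one covariance per model
        prob = np.ones(3) / 3                  # model probabilities (uniform prior)

        for _ in range(200):
            x_true = a_true * x_true + rng.normal(0.0, np.sqrt(q))
            z = x_true + rng.normal(0.0, np.sqrt(r))
            for i, a in enumerate(models):     # bank of Kalman filters
                x_pred = a * x_hat[i]
                P_pred = a * a * P[i] + q
                S = P_pred + r                 # innovation variance
                nu = z - x_pred                # innovation (residual)
                K = P_pred / S
                x_hat[i] = x_pred + K * nu
                P[i] = (1.0 - K) * P_pred
                # MMAC update: weight each model by its innovation likelihood.
                prob[i] *= np.exp(-0.5 * nu * nu / S) / np.sqrt(S)
            prob /= prob.sum()                 # renormalize

        print("posterior model probabilities:", np.round(prob, 3))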

  14. Scattering Models and Basic Experiments in the Microwave Regime

    NASA Technical Reports Server (NTRS)

    Fung, A. K.; Blanchard, A. J. (Principal Investigator)

    1985-01-01

    The objectives of research over the next three years are: (1) to develop a randomly rough surface scattering model which is applicable over the entire frequency band; (2) to develop a computer simulation method and algorithm to simulate scattering from known randomly rough surfaces, Z(x,y); (3) to design and perform laboratory experiments to study geometric and physical target parameters of an inhomogeneous layer; (4) to develop scattering models for an inhomogeneous layer which account for near-field interaction and multiple scattering in both the coherent and the incoherent scattering components; and (5) to compare theoretical models with measurements and numerical simulations.

  15. Monte Carlo based method for fluorescence tomographic imaging with lifetime multiplexing using time gates

    PubMed Central

    Chen, Jin; Venugopal, Vivek; Intes, Xavier

    2011-01-01

    Time-resolved fluorescence optical tomography allows 3-dimensional localization of multiple fluorophores based on lifetime contrast while providing a unique data set for improved resolution. However, to employ the full fluorescence time measurements, a light propagation model that accurately simulates weakly diffused and multiply scattered photons is required. In this article, we derive a computationally efficient Monte Carlo based method to compute time-gated fluorescence Jacobians for the simultaneous imaging of two fluorophores with lifetime contrast. The Monte Carlo based formulation is validated on a synthetic murine model simulating the uptake in the kidneys of two distinct fluorophores with lifetime contrast. Experimentally, the method is validated using capillaries, filled with 2.5 nmol of ICG and IRDye™800CW respectively, embedded in a diffusive medium mimicking the average optical properties of mice. Combining multiple time gates in one inverse problem allows the simultaneous reconstruction of multiple fluorophores with increased resolution and minimal crosstalk using the proposed formulation. PMID:21483610

  16. Predicting the impact of combined therapies on myeloma cell growth using a hybrid multi-scale agent-based model.

    PubMed

    Ji, Zhiwei; Su, Jing; Wu, Dan; Peng, Huiming; Zhao, Weiling; Nlong Zhao, Brian; Zhou, Xiaobo

    2017-01-31

    Multiple myeloma is a malignant, still-incurable plasma cell disorder, owing to refractory disease relapse, immune impairment, and the development of multi-drug resistance. The growth of malignant plasma cells depends on the bone marrow (BM) microenvironment and evasion of the host's anti-tumor immune response. Hence, we hypothesized that targeting the tumor-stromal cell interaction and the endogenous immune system in the BM would potentially improve the treatment response of multiple myeloma (MM). Therefore, we proposed a computational simulation of myeloma development in the complex microenvironment, which includes immune cell components and bone marrow stromal cells, and predicted the effects of combined treatment with multiple drugs on myeloma cell growth. We constructed a hybrid multi-scale agent-based model (HABM) that combines an ODE system and an agent-based model (ABM). The ODE system was used to model the dynamic changes of intracellular signal transduction, and the ABM to model the cell-cell interactions between stromal cells, tumor cells, and immune components in the BM. This model simulated myeloma growth in the bone marrow microenvironment and revealed the important role of the immune system in this process. The predicted outcomes were consistent with experimental observations from previous studies. Moreover, we applied this model to predict the treatment effects of three key therapeutic drugs used for MM, and found that the combination of these three drugs potentially suppresses the growth of myeloma cells and reactivates the immune response. In summary, the proposed model may serve as a novel computational platform for simulating the formation of MM and evaluating its treatment response to multiple drugs.

  17. Abdominal surgery process modeling framework for simulation using spreadsheets.

    PubMed

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, thus programming experience is not essential for replicating or upgrading the model. Unlike the existing methods, the proposed solution employs a modular approach for modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for the simulation of servicing multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". Patients' movements from one activity to the next are tracked with nested "if()" functions, allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
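
    The spreadsheet mechanics described, rand() for uncertain durations and nested if() for routing, translate directly into code. Here is an illustrative Python transcription with invented activities and duration ranges, not the paper's 28 modeling elements.

        import random

        ACTIVITIES = ["admission", "surgery", "recovery", "discharge"]
        RANGES = {"admission": (0.5, 1.0), "surgery": (1.0, 4.0),
                  "recovery": (24.0, 72.0), "discharge": (0.5, 1.0)}  # hours

        def duration(activity):
            # Spreadsheet analogue of: low + rand() * (high - low)
            low, high = RANGES[activity]
            return low + random.random() * (high - low)

        def next_activity(current):
            # Analogue of nested if() cells routing a patient onward.
            i = ACTIVITIES.index(current)
            return ACTIVITIES[i + 1] if i + 1 < len(ACTIVITIES) else None

        for patient in [f"patient_{k}" for k in range(5)]:   # first-in-first-served
            total, activity = 0.0, ACTIVITIES[0]
            while activity is not None:
                total += duration(activity)
                activity = next_activity(activity)
            print(f"{patient}: total stay {total:.1f} h")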

  18. A mechanistic modeling system for estimating large scale emissions and transport of pollen and co-allergens

    PubMed Central

    Efstathiou, Christos; Isukapalli, Sastry

    2011-01-01

    Allergic airway diseases represent a complex health problem which can be exacerbated by the synergistic action of pollen particles and air pollutants such as ozone. Understanding human exposures to aeroallergens requires accurate estimates of the spatial distribution of airborne pollen levels as well as of various air pollutants at different times. However, currently there are no established methods for estimating allergenic pollen emissions and concentrations over large geographic areas such as the United States. A mechanistic modeling system for describing pollen emissions and transport over extensive domains has been developed by adapting components of existing regional scale air quality models and vegetation databases. First, components of the Biogenic Emissions Inventory System (BEIS) were adapted to predict pollen emission patterns. Subsequently, the transport module of the Community Multiscale Air Quality (CMAQ) modeling system was modified to incorporate description of pollen transport. The combined model, CMAQ-pollen, allows for simultaneous prediction of multiple air pollutants and pollen levels in a single model simulation, and uses consistent assumptions related to the transport of multiple chemicals and pollen species. Application case studies for evaluating the combined modeling system included the simulation of birch and ragweed pollen levels for the year 2002, during their corresponding peak pollination periods (April for birch and September for ragweed). The model simulations were driven by previously evaluated meteorological model outputs and emissions inventories for the eastern United States for the simulation period. A semi-quantitative evaluation of CMAQ-pollen was performed using tree and ragweed pollen counts in Newark, NJ for the same time periods. The peak birch pollen concentrations were predicted to occur within two days of the peak measurements, while the temporal patterns closely followed the measured profiles of overall tree pollen. For the case of ragweed pollen, the model was able to capture the patterns observed during September 2002, but did not predict an early peak; this can be associated with a wider species pollination window and inadequate spatial information in current land cover databases. An additional sensitivity simulation was performed to comparatively evaluate the dispersion patterns predicted by CMAQ-pollen with those predicted by the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model, which is used extensively in aerobiological studies. The CMAQ estimated concentration plumes matched the equivalent pollen scenario modeled with HYSPLIT. The novel pollen modeling approach presented here allows simultaneous estimation of multiple airborne allergens and other air pollutants, and is being developed as a central component of an integrated population exposure modeling system, the Modeling Environment for Total Risk studies (MENTOR) for multiple, co-occurring contaminants that include aeroallergens and irritants. PMID:21516207

  19. A mechanistic modeling system for estimating large-scale emissions and transport of pollen and co-allergens

    NASA Astrophysics Data System (ADS)

    Efstathiou, Christos; Isukapalli, Sastry; Georgopoulos, Panos

    2011-04-01

    Allergic airway diseases represent a complex health problem which can be exacerbated by the synergistic action of pollen particles and air pollutants such as ozone. Understanding human exposures to aeroallergens requires accurate estimates of the spatial distribution of airborne pollen levels as well as of various air pollutants at different times. However, currently there are no established methods for estimating allergenic pollen emissions and concentrations over large geographic areas such as the United States. A mechanistic modeling system for describing pollen emissions and transport over extensive domains has been developed by adapting components of existing regional scale air quality models and vegetation databases. First, components of the Biogenic Emissions Inventory System (BEIS) were adapted to predict pollen emission patterns. Subsequently, the transport module of the Community Multiscale Air Quality (CMAQ) modeling system was modified to incorporate description of pollen transport. The combined model, CMAQ-pollen, allows for simultaneous prediction of multiple air pollutants and pollen levels in a single model simulation, and uses consistent assumptions related to the transport of multiple chemicals and pollen species. Application case studies for evaluating the combined modeling system included the simulation of birch and ragweed pollen levels for the year 2002, during their corresponding peak pollination periods (April for birch and September for ragweed). The model simulations were driven by previously evaluated meteorological model outputs and emissions inventories for the eastern United States for the simulation period. A semi-quantitative evaluation of CMAQ-pollen was performed using tree and ragweed pollen counts in Newark, NJ for the same time periods. The peak birch pollen concentrations were predicted to occur within two days of the peak measurements, while the temporal patterns closely followed the measured profiles of overall tree pollen. For the case of ragweed pollen, the model was able to capture the patterns observed during September 2002, but did not predict an early peak; this can be associated with a wider species pollination window and inadequate spatial information in current land cover databases. An additional sensitivity simulation was performed to comparatively evaluate the dispersion patterns predicted by CMAQ-pollen with those predicted by the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model, which is used extensively in aerobiological studies. The CMAQ estimated concentration plumes matched the equivalent pollen scenario modeled with HYSPLIT. The novel pollen modeling approach presented here allows simultaneous estimation of multiple airborne allergens and other air pollutants, and is being developed as a central component of an integrated population exposure modeling system, the Modeling Environment for Total Risk studies (MENTOR) for multiple, co-occurring contaminants that include aeroallergens and irritants.

  20. Influence of transmucosal height in abutments of single and multiple implant-supported prostheses: a non-linear three-dimensional finite element analysis.

    PubMed

    Borie, Eduardo; Leal, Eduardo; Orsi, Iara Augusta; Salamanca, Carlos; Dias, Fernando José; Weber, Benjamin

    2018-01-01

    The aim of this study was to analyze the influence of three different transmucosal heights of the abutments in single and multiple implant-supported prostheses through the finite element method. External hexagon implants, MicroUnit, and EsthetiCone abutments were scanned and placed in an edentulous maxillary model obtained from a tomography database. The simulations were divided into two groups: (1) one implant with 3.75 × 10 mm placed in the upper central incisor, simulating a single implant-supported fixed prosthesis with an EsthetiCone abutment; and (2) two implants with 3.75 × 10 mm placed in the upper lateral incisors with MicroUnit abutments, simulating a multiple implant-supported prosthesis. Subsequently, each group was subdivided into three models according to the transmucosal height (1, 2, and 3 mm). A static oblique load at an angle of 45 degrees to the long axis of the implant in palatal-buccal direction of 150 and 75 N was applied for multiple and single implant-supported prosthesis, respectively. The implants and abutments were assessed according to the equivalent Von Mises stress analyses while the bone and ceramics were analyzed through maximum and minimum principal stresses. The total deformation values increased in all models, while the transmucosal height was augmented. The transmucosal height of the abutments influences the stress values at the bone, ceramics, implants, and abutments of both the single and multiple implant-supported prostheses, with the transmucosal height of 1 mm showing the lowest stress values.

  1. The role of simulation in neurosurgery.

    PubMed

    Rehder, Roberta; Abd-El-Barr, Muhammad; Hooten, Kristopher; Weinstock, Peter; Madsen, Joseph R; Cohen, Alan R

    2016-01-01

    In an era of residency duty-hour restrictions, there has been a recent effort to implement simulation-based training methods in neurosurgery teaching institutions. Several surgical simulators have been developed, ranging from physical models to sophisticated virtual reality systems. To date, there is a paucity of information describing the clinical benefits of existing simulators and the assessment strategies to help implement them into neurosurgical curricula. Here, we present a systematic review of the current models of simulation and discuss the state of the art and future directions for simulation in neurosurgery. Retrospective literature review. Multiple simulators have been developed for neurosurgical training, including those for minimally invasive procedures, vascular, skull base, pediatric, tumor resection, functional neurosurgery, and spine surgery. The pros and cons of existing systems are reviewed. Advances in imaging and computer technology have led to the development of different simulation models to complement traditional surgical training. Sophisticated virtual reality (VR) simulators with haptic feedback and impressive imaging technology have provided novel options for training in neurosurgery. Breakthrough training simulation using 3D printing technology holds promise for future simulation practice, providing high-fidelity patient-specific models to complement residency surgical learning.

  2. Multiple Point Statistics algorithm based on direct sampling and multi-resolution images

    NASA Astrophysics Data System (ADS)

    Julien, S.; Renard, P.; Chugunova, T.

    2017-12-01

    Multiple Point Statistics (MPS) has become popular in Earth Sciences over more than a decade because these methods can generate random fields that reproduce the highly complex spatial features given in a conceptual model, the training image, where classical geostatistics techniques based on two-point statistics (covariance or variogram) fail to generate realistic models. Among MPS methods, direct sampling consists in borrowing patterns from the training image to populate a simulation grid. The grid is filled sequentially by visiting its nodes in random order; the patterns, whose number of nodes is fixed, become spatially narrower during the simulation as the grid becomes more densely informed. Hence, large-scale structures are captured at the beginning of the simulation and small-scale ones at the end. However, MPS may mix spatial characteristics that are distinguishable at different scales in the training image, and thereby lose the spatial arrangement of different structures. To overcome this limitation, we propose to perform MPS simulation using a decomposition of the training image into a set of images at multiple resolutions. Applying a Gaussian kernel to the training image (convolution) yields a lower-resolution image; iterating this process builds a pyramid of images depicting fewer details at each level, as is done in image processing, for example, to reduce the storage size of a photograph. Direct sampling is then employed to simulate the lowest-resolution level first, and then each level, up to the finest resolution, conditioned on the level one rank coarser. This scheme helps reproduce the spatial structures of the training image at every scale and thus generate more realistic models. We illustrate the method with aerial photographs (satellite images) and natural textures. Indeed, these kinds of images often display typical structures at different scales and are well suited to MPS simulation techniques.
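
    The pyramid construction described above is straightforward; a minimal sketch, assuming a random stand-in training image and an illustrative kernel width: repeatedly smooth with a Gaussian and subsample, then simulate coarse-to-fine (the direct-sampling step itself is not shown).

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def build_pyramid(image, levels=4, sigma=1.0):
            # Repeatedly smooth with a Gaussian kernel and subsample by two.
            pyramid = [image]
            for _ in range(levels - 1):
                smoothed = gaussian_filter(pyramid[-1], sigma)
                pyramid.append(smoothed[::2, ::2])
            return pyramid          # pyramid[0] finest, pyramid[-1] coarsest

        training_image = np.random.default_rng(3).random((256, 256))
        for level, img in enumerate(build_pyramid(training_image)):
            print(f"level {level}: {img.shape}")
        # Simulation then proceeds coarse-to-fine: simulate pyramid[-1] first,
        # then each finer level conditioned on the level one rank coarser.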

  3. The agricultural model intercomparison and improvement project (AgMIP): Protocols and pilot studies

    USDA-ARS?s Scientific Manuscript database

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) is a distributed climate-scenario simulation research activity for historical period model intercomparison and future climate change conditions with participation of multiple crop and agricultural economic model groups around the...

  4. Efficient and Extensible Quasi-Explicit Modular Nonlinear Multiscale Battery Model: GH-MSMD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Gi-Heon; Smith, Kandler; Lawrence-Simon, Jake

    Complex physics and long computation time hinder the adoption of computer aided engineering models in the design of large-format battery cells and systems. A modular, efficient battery simulation model -- the multiscale multidomain (MSMD) model -- was previously introduced to aid the scale-up of Li-ion material and electrode designs to complete cell and pack designs, capturing electrochemical interplay with 3-D electronic current pathways and thermal response. Here, this paper enhances the computational efficiency of the MSMD model using a separation of time-scales principle to decompose model field variables. The decomposition provides a quasi-explicit linkage between the multiple length-scale domains and thus reduces time-consuming nested iteration when solving model equations across multiple domains. In addition to particle-, electrode- and cell-length scales treated in the previous work, the present formulation extends to bus bar- and multi-cell module-length scales. We provide example simulations for several variants of GH electrode-domain models.

  5. Efficient and Extensible Quasi-Explicit Modular Nonlinear Multiscale Battery Model: GH-MSMD

    DOE PAGES

    Kim, Gi-Heon; Smith, Kandler; Lawrence-Simon, Jake; ...

    2017-03-24

    Complex physics and long computation time hinder the adoption of computer aided engineering models in the design of large-format battery cells and systems. A modular, efficient battery simulation model -- the multiscale multidomain (MSMD) model -- was previously introduced to aid the scale-up of Li-ion material and electrode designs to complete cell and pack designs, capturing electrochemical interplay with 3-D electronic current pathways and thermal response. Here, this paper enhances the computational efficiency of the MSMD model using a separation of time-scales principle to decompose model field variables. The decomposition provides a quasi-explicit linkage between the multiple length-scale domains and thus reduces time-consuming nested iteration when solving model equations across multiple domains. In addition to particle-, electrode- and cell-length scales treated in the previous work, the present formulation extends to bus bar- and multi-cell module-length scales. We provide example simulations for several variants of GH electrode-domain models.

  6. LSST: Cadence Design and Simulation

    NASA Astrophysics Data System (ADS)

    Cook, Kem H.; Pinto, P. A.; Delgado, F.; Miller, M.; Petry, C.; Saha, A.; Gee, P. A.; Tyson, J. A.; Ivezic, Z.; Jones, L.; LSST Collaboration

    2009-01-01

    The LSST Project has developed an operations simulator to investigate how best to observe the sky to achieve its multiple science goals. The simulator has a sophisticated model of the telescope and dome to properly constrain potential observing cadences. This model has also proven useful for investigating various engineering issues, ranging from sizing of slew motors to design of cryogen lines to the camera. The simulator is capable of balancing cadence goals from multiple science programs, and attempts to minimize time spent slewing as it carries out these goals. The operations simulator has been used to demonstrate a 'universal' cadence which delivers the science requirements for a deep cosmology survey, a Near Earth Object survey, and good sampling in the time domain. We will present the results of simulating 10 years of LSST operations using realistic seeing distributions, historical weather data, scheduled engineering downtime, and current telescope and camera parameters. These simulations demonstrate the capability of the LSST to deliver a 25,000 square degree survey probing the time domain, including 20,000 square degrees for a uniform deep, wide, fast survey, while effectively surveying for NEOs over the same area. We will also present our plans for future development of the simulator: better global minimization of slew time and an eventual transition to a scheduler for the real LSST.

  7. Monthly mean simulation experiments with a coarse-mesh global atmospheric model

    NASA Technical Reports Server (NTRS)

    Spar, J.; Klugman, R.; Lutz, R. J.; Notario, J. J.

    1978-01-01

    Substitution of observed monthly mean sea-surface temperatures (SSTs) as lower boundary conditions, in place of climatological SSTs, failed to improve the model simulations. While the impact of SST anomalies on the model output is greater at sea level than at upper levels, the impact on the monthly mean simulations is not beneficial at any level. Shifts of one and two days in initialization time produced small, but non-trivial, changes in the model-generated monthly mean synoptic fields. No improvements in the mean simulations resulted from the use of either time-averaged initial data or re-initialization with time-averaged early model output. The noise level of the model, as determined from a multiple initial-state perturbation experiment, was found to be generally low, but with a noisier response to initial-state errors in high latitudes than in the tropics.

  8. Explicit Finite Element Modeling of Multilayer Composite Fabric for Gas Turbine Engine Containment Systems, Phase II. Part 3; Material Model Development and Simulation of Experiments

    NASA Technical Reports Server (NTRS)

    Simmons, J.; Erlich, D.; Shockey, D.

    2009-01-01

    A team consisting of Arizona State University, Honeywell Engines, Systems & Services, the National Aeronautics and Space Administration Glenn Research Center, and SRI International collaborated to develop computational models and verification testing for designing and evaluating turbine engine fan blade fabric containment structures. This research was conducted under the Federal Aviation Administration Airworthiness Assurance Center of Excellence and was sponsored by the Aircraft Catastrophic Failure Prevention Program. The research was directed toward improving the modeling of a turbine engine fabric containment structure for an engine blade-out containment demonstration test required for certification of aircraft engines. The research conducted in Phase II established a new level of capability to design and develop fan blade containment systems for turbine engines. Significant progress was made in three areas: (1) further development of the ballistic fabric model to increase confidence and robustness in the material models for the Kevlar(TradeName) and Zylon(TradeName) material models developed in Phase I, (2) improvement of the capability for finite element modeling of multiple layers of fabric using multiple layers of shell elements, and (3) performance of large-scale simulations. This report concentrates on the material model development and simulations of the impact tests.

  9. SpectraPlot.com: Integrated spectroscopic modeling of atomic and molecular gases

    NASA Astrophysics Data System (ADS)

    Goldenstein, Christopher S.; Miller, Victor A.; Mitchell Spearrin, R.; Strand, Christopher L.

    2017-10-01

    SpectraPlot is a web-based application for simulating spectra of atomic and molecular gases. At the time this manuscript was written, SpectraPlot consisted of four primary tools for calculating: (1) atomic and molecular absorption spectra, (2) atomic and molecular emission spectra, (3) transition linestrengths, and (4) blackbody emission spectra. These tools currently employ the NIST ASD, HITRAN2012, and HITEMP2010 databases to perform line-by-line simulations of spectra. SpectraPlot employs a modular, integrated architecture, enabling multiple simulations across multiple databases and/or thermodynamic conditions to be visualized in an interactive plot window. The primary objective of this paper is to describe the architecture and spectroscopic models employed by SpectraPlot in order to provide its users with the knowledge required to understand the capabilities and limitations of simulations performed using SpectraPlot. Further, this manuscript discusses the accuracy of several underlying approximations used to decrease computational time, in particular, the use of far-wing cutoff criteria.
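
    As a concrete illustration of a line-by-line calculation with a far-wing cutoff, the sketch below sums Lorentzian profiles over a wavenumber grid and truncates each line's contribution beyond a fixed distance from line center. The line positions, strengths, and widths are invented for the example; SpectraPlot itself draws line parameters from the NIST ASD, HITRAN2012, and HITEMP2010 databases and uses more complete line shapes.

    ```python
    import numpy as np

    def absorbance(nu, lines, n=2.5e18, L=10.0, cutoff=25.0):
        """A(nu) = sum_j S_j * phi_j(nu) * n * L with Lorentzian phi_j.

        nu     : wavenumber grid [cm^-1]
        lines  : (center [cm^-1], strength [cm^-1/(molec cm^-2)], HWHM [cm^-1])
        n, L   : number density [molec/cm^3] and path length [cm]
        cutoff : far-wing cutoff distance from line center [cm^-1]
        """
        A = np.zeros_like(nu)
        for nu0, S, gamma in lines:
            mask = np.abs(nu - nu0) <= cutoff    # far-wing cutoff criterion
            phi = (gamma / np.pi) / ((nu[mask] - nu0) ** 2 + gamma ** 2)
            A[mask] += S * phi * n * L
        return A

    nu = np.linspace(2000.0, 2100.0, 5000)
    lines = [(2030.0, 1e-19, 0.08), (2055.0, 4e-19, 0.06)]  # made-up lines
    print("peak absorbance:", absorbance(nu, lines).max())
    ```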

  10. Double-null divertor configuration discharge and disruptive heat flux simulation using TSC on EAST

    NASA Astrophysics Data System (ADS)

    Bo, SHI; Jinhong, YANG; Cheng, YANG; Desheng, CHENG; Hui, WANG; Hui, ZHANG; Haifei, DENG; Junli, QI; Xianzu, GONG; Weihua, WANG

    2018-07-01

    The tokamak simulation code (TSC) is employed to simulate the complete evolution of a disruptive discharge in the Experimental Advanced Superconducting Tokamak (EAST). The multiplication factor of the anomalous transport coefficient was adjusted to model a major disruptive discharge with double-null divertor configuration based on shot 61916. The real-time feedback control system for the plasma displacement was employed. The modeled evolution of the poloidal field coil currents, the plasma current, the major radius, and the plasma configuration all agree with experimental measurements. Results from the simulation show that during the disruption, a heat flux of about 8 MW m-2 flows to the upper divertor target plate and about 6 MW m-2 to the lower divertor target plate. Computations predict that different heat fluxes on the divertor target plates result from adjusting the multiplication factor of the anomalous transport coefficient, demonstrating the flexibility and predictive capability of TSC.

  11. Modelling a real-world buried valley system with vertical non-stationarity using multiple-point statistics

    NASA Astrophysics Data System (ADS)

    He, Xiulan; Sonnenborg, Torben O.; Jørgensen, Flemming; Jensen, Karsten H.

    2017-03-01

    Stationarity has traditionally been a requirement of geostatistical simulations. A common way to deal with non-stationarity is to divide the system into stationary sub-regions and subsequently merge the realizations for each region. Recently, the so-called partition approach, which has the flexibility to model non-stationary systems directly, was developed for multiple-point statistics (MPS) simulation. The objective of this study is to apply the MPS partition method with conventional borehole logs and high-resolution airborne electromagnetic (AEM) data, for simulation of a real-world non-stationary geological system characterized by a network of connected buried valleys that incise deeply into layered Miocene sediments (case study in Denmark). The results show that, based on fragmented information of the formation boundaries, the MPS partition method is able to simulate a non-stationary system including valley structures embedded in a layered Miocene sequence in a single run. In addition, statistical information retrieved from the AEM data improved the simulation of the geology significantly, especially for the deep-seated buried valley sediments where borehole information is sparse.

  12. Simulated fault injection - A methodology to evaluate fault tolerant microprocessor architectures

    NASA Technical Reports Server (NTRS)

    Choi, Gwan S.; Iyer, Ravishankar K.; Carreno, Victor A.

    1990-01-01

    A simulation-based fault-injection method for validating fault-tolerant microprocessor architectures is described. The approach uses mixed-mode simulation (electrical/logic analysis), and injects transient errors in run-time to assess the resulting fault impact. As an example, a fault-tolerant architecture which models the digital aspects of a dual-channel real-time jet-engine controller is used. The level of effectiveness of the dual configuration with respect to single and multiple transients is measured. The results indicate 100 percent coverage of single transients. Approximately 12 percent of the multiple transients affect both channels; none result in controller failure since two additional levels of redundancy exist.

  13. A prediction model for lift-fan simulator performance. M.S. Thesis - Cleveland State Univ.

    NASA Technical Reports Server (NTRS)

    Yuska, J. A.

    1972-01-01

    The performance characteristics of a model VTOL lift-fan simulator installed in a two-dimensional wing are presented. The lift-fan simulator consisted of a 15-inch diameter fan driven by a turbine contained in the fan hub. The performance of the lift-fan simulator was measured in two ways: (1) the calculated momentum thrust of the fan and turbine (total thrust loading), and (2) the axial-force measured on a load cell force balance (axial-force loading). Tests were conducted over a wide range of crossflow velocities, corrected tip speeds, and wing angle of attack. A prediction modeling technique was developed to help in analyzing the performance characteristics of lift-fan simulators. A multiple linear regression analysis technique is presented which calculates prediction model equations for the dependent variables.
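
    The prediction-model equations referred to above come from ordinary multiple linear regression. A minimal sketch with invented operating points and a hypothetical dependent variable standing in for thrust loading:

    ```python
    import numpy as np

    # Columns: crossflow velocity, corrected tip speed, angle of attack (made up)
    X = np.array([[30.0, 250.0, 0.0],
                  [45.0, 250.0, 5.0],
                  [30.0, 300.0, 5.0],
                  [60.0, 300.0, 0.0],
                  [45.0, 350.0, 10.0]])
    y = np.array([0.95, 0.91, 0.98, 0.93, 0.96])   # stand-in thrust loading

    # Add an intercept column and solve the least-squares problem y = X_aug @ beta
    X_aug = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X_aug, y, rcond=None)
    print("model coefficients:", beta)
    print("prediction at (40, 280, 3):", np.array([1.0, 40.0, 280.0, 3.0]) @ beta)
    ```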

  14. A robust and flexible Geospatial Modeling Interface (GMI) for environmental model deployment and evaluation

    USDA-ARS?s Scientific Manuscript database

    This paper provides an overview of the GMI (Geospatial Modeling Interface) simulation framework for environmental model deployment and assessment. GMI currently provides access to multiple environmental models including AgroEcoSystem-Watershed (AgES-W), Nitrate Leaching and Economic Analysis 2 (NLEA...

  15. Channel Model Optimization with Reflection Residual Component for Indoor MIMO-VLC System

    NASA Astrophysics Data System (ADS)

    Chen, Yong; Li, Tengfei; Liu, Huanlin; Li, Yichao

    2017-12-01

    A fast channel modeling method is studied to solve the problem of computing reflection channel gain for multiple-input multiple-output visible light communication (MIMO-VLC) systems. To reduce the computational complexity associated with the number of reflections, no more than 3 reflections are taken into consideration in VLC. Each higher-order reflection link (more than 2 reflections) is treated as a composition of multiple line-of-sight links, and a reflection residual component is introduced to characterize these higher-order reflections. Computer simulation results are presented for the point-to-point channel impulse response, received optical power, and received signal-to-noise ratio. Based on theoretical analysis and simulation results, the proposed method can effectively reduce the computational complexity of higher-order reflection in channel modeling.
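
    For context, the line-of-sight building block that such channel models compose is the standard Lambertian LOS gain. The sketch below evaluates it for an illustrative geometry; the emitter half-angle, receiver area, and field of view are assumed values, and the paper's reflection residual component is not reproduced.

    ```python
    import numpy as np

    def los_gain(d, phi, psi, area=1e-4, half_angle_deg=60.0, fov_deg=70.0):
        """Lambertian LOS gain H(0) = (m+1)A/(2*pi*d^2) cos^m(phi) cos(psi),
        zero outside the receiver field of view; m from the emitter half-angle."""
        if np.degrees(psi) > fov_deg:
            return 0.0
        m = -np.log(2) / np.log(np.cos(np.radians(half_angle_deg)))
        return (m + 1) * area / (2 * np.pi * d**2) * np.cos(phi)**m * np.cos(psi)

    # LED 2 m above the receiver plane with 0.5 m lateral offset (illustrative)
    d = np.hypot(2.0, 0.5)
    angle = np.arctan2(0.5, 2.0)   # irradiance and incidence angles coincide here
    print("LOS channel gain:", los_gain(d, angle, angle))
    ```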

  16. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
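
    A toy numerical illustration of the index, assuming two stand-in processes ("recharge" and "geology") with two models each and equal model priors: for each process, the sketch estimates the share of total output variance explained by the choice among that process's models. It is deliberately simplified; the paper's index also folds each process's parametric uncertainty into the conditional expectations.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 200_000

    def output(recharge_model, geology_model):
        # Each stand-in process model has its own random parameters.
        r = rng.normal(1.0, 0.1, N) if recharge_model == 0 else rng.normal(1.4, 0.3, N)
        k = rng.lognormal(0.0, 0.2, N) if geology_model == 0 else rng.lognormal(0.5, 0.2, N)
        return r * k                     # stand-in system response

    samples = {(i, j): output(i, j) for i in (0, 1) for j in (0, 1)}
    total_var = np.concatenate(list(samples.values())).var()

    # Conditional means over each process's alternatives (equal priors)
    m_recharge = [np.concatenate([samples[i, 0], samples[i, 1]]).mean() for i in (0, 1)]
    m_geology = [np.concatenate([samples[0, j], samples[1, j]]).mean() for j in (0, 1)]

    print(f"PS(recharge) = {np.var(m_recharge) / total_var:.3f}")
    print(f"PS(geology)  = {np.var(m_geology) / total_var:.3f}")
    ```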

  17. Consistent radiative transfer modeling of active and passive observations of precipitation

    NASA Astrophysics Data System (ADS)

    Adams, Ian

    2016-04-01

    Spaceborne platforms such as the Tropical Rainfall Measurement Mission (TRMM) and the Global Precipitation Measurement (GPM) mission exploit a combination of active and passive sensors to provide a greater understanding of the three-dimensional structure of precipitation. While "operationalized" retrieval algorithms require fast forward models, the ability to perform higher fidelity simulations is necessary in order to understand the physics of remote sensing problems by testing assumptions and developing parameterizations for the fast models. To ensure proper synergy between active and passive modeling, forward models must be consistent when modeling the responses of radars and radiometers. This work presents a self-consistent radiative transfer model for simulating radar reflectivities and millimeter wave brightness temperatures for precipitating scenes. To accomplish this, we extended the Atmospheric Radiative Transfer Simulator (ARTS) version 2.3 to solve the radiative transfer equation for active sensors and multiple scattering conditions. Early versions of ARTS (1.1) included a passive Monte Carlo solver, and ARTS is capable of handling atmospheres of up to three dimensions with ellipsoidal planetary geometries. The modular nature of ARTS facilitates extensibility, and the well-developed ray-tracing tools are suited for implementation of Monte Carlo algorithms. Finally, since ARTS handles the full Stokes vector, co- and cross-polarized reflectivity products are possible for scenarios that include nonspherical particles, with or without preferential alignment. The accuracy of the forward model will be demonstrated with precipitation events observed by TRMM and GPM, and the effects of multiple scattering will be detailed. The three-dimensional nature of the radiative transfer model will be useful for understanding the effects of nonuniform beamfill and multiple scattering for spatially heterogeneous precipitation events. The targets of this forward model are the GPM instruments: the Dual-frequency Precipitation Radar (DPR) and the GPM Microwave Imager (GMI).

  18. An Analysis of the Relationship Between Atmospheric Heat Transport and the Position of the ITCZ in NASA NEWS products, CMIP5 GCMs, and Multiple Reanalyses

    NASA Astrophysics Data System (ADS)

    Stanfield, R.; Dong, X.; Su, H.; Xi, B.; Jiang, J. H.

    2016-12-01

    In the past few years, studies have found a strong connection between atmospheric heat transport across the equator (AHTEQ) and the position of the ITCZ. This study investigates the seasonal, annual-mean and interannual variability of the ITCZ position and explores the relationships between the ITCZ position and inter-hemispheric energy transport in NASA NEWS products, multiple reanalysis datasets, and CMIP5 simulations. We find that large discrepancies exist in the ITCZ-AHTEQ relationships across these datasets and model simulations. The components of the energy fluxes are examined to identify the primary sources of the discrepancies among the datasets and model results.

  19. Simulation and analysis of main steam control system based on heat transfer calculation

    NASA Astrophysics Data System (ADS)

    Huang, Zhenqun; Li, Ruyan; Feng, Zhongbao; Wang, Songhan; Li, Wenbo; Cheng, Jiwei; Jin, Yingai

    2018-05-01

    In this paper, a 300 MW thermal power plant boiler was studied. MATLAB was used to write a program that calculates the heat transfer between the main steam and the boiler flue gas, and the amount of water needed to hold the main steam temperature at its target value was computed. The heat transfer calculation program was then integrated into the Simulink simulation platform as part of a multiple-model switching control system. The results show that the multiple-model switching control system based on heat transfer calculation not only overcomes the large inertia and large hysteresis of the main steam temperature, but also adapts to changes in boiler load.

  1. D3-Equivariant coupled advertising oscillators model

    NASA Astrophysics Data System (ADS)

    Zhang, Chunrui; Zheng, Huifeng

    2011-04-01

    A ring of three coupled advertising oscillators with delay is considered. Using the theory of symmetric functional differential equations, the multiple Hopf bifurcations of the equilibrium at the origin are demonstrated. The existence of multiple branches of bifurcating periodic solutions is obtained. Numerical simulations support the analytical results.

  2. Assessing uncertainty and sensitivity of model parameterizations and parameters in WRF affecting simulated surface fluxes and land-atmosphere coupling over the Amazon region

    NASA Astrophysics Data System (ADS)

    Qian, Y.; Wang, C.; Huang, M.; Berg, L. K.; Duan, Q.; Feng, Z.; Shrivastava, M. B.; Shin, H. H.; Hong, S. Y.

    2016-12-01

    This study aims to quantify the relative importance and uncertainties of different physical processes and parameters in affecting simulated surface fluxes and land-atmosphere coupling strength over the Amazon region. We used two-legged coupling metrics, which include both terrestrial (soil moisture to surface fluxes) and atmospheric (surface fluxes to atmospheric state or precipitation) legs, to diagnose the land-atmosphere interaction and coupling strength. Observations made using the Department of Energy's Atmospheric Radiation Measurement (ARM) Mobile Facility during the GoAmazon field campaign together with satellite and reanalysis data are used to evaluate model performance. To quantify the uncertainty in physical parameterizations, we performed a 120-member ensemble of simulations with the WRF model using a stratified experimental design including 6 cloud microphysics, 3 convection, 6 PBL and surface layer, and 3 land surface schemes. A multiple-way analysis of variance approach is used to quantitatively analyze the inter- and intra-group (scheme) means and variances. To quantify parameter sensitivity, we conducted an additional 256 WRF simulations in which an efficient sampling algorithm is used to explore the multi-dimensional parameter space. Three uncertainty quantification approaches are applied for sensitivity analysis (SA) of multiple variables of interest to 20 selected parameters in the YSU PBL and MM5 surface layer schemes. Results show consistent parameter sensitivity across different SA methods. We found that 5 out of 20 parameters contribute more than 90% of the total variance, and that first-order effects dominate over interaction effects. Results of this uncertainty quantification study serve as guidance for better understanding the roles of different physical processes in land-atmosphere interactions, quantifying model uncertainties from various sources such as physical processes, parameters and structural errors, and providing insights for improving the model physics parameterizations.
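
    The variance-attribution step can be illustrated with a small synthetic factorial ensemble: each scheme family's main-effect share is the variance of its group means relative to the total ensemble variance. The factor counts and response below are invented, and the study's stratified 120-member design and three SA methods are more elaborate than this sketch.

    ```python
    import numpy as np
    from itertools import product

    rng = np.random.default_rng(1)
    n_micro, n_conv, n_pbl = 4, 3, 2                 # illustrative factor levels
    design = list(product(range(n_micro), range(n_conv), range(n_pbl)))
    # Synthetic ensemble output (e.g., a mean surface flux) per member
    y = np.array([10 + 1.5 * m - 0.5 * c + 0.2 * p + rng.normal(0, 0.3)
                  for m, c, p in design])

    def main_effect_share(factor):
        """Variance of the group means for one factor / total variance."""
        groups = sorted({d[factor] for d in design})
        means = [y[[i for i, d in enumerate(design) if d[factor] == g]].mean()
                 for g in groups]
        return np.var(means) / y.var()

    for name, idx in [("microphysics", 0), ("convection", 1), ("PBL", 2)]:
        print(f"{name}: {main_effect_share(idx):.2%} of ensemble variance")
    ```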

  3. ACIRF user's guide: Theory and examples

    NASA Astrophysics Data System (ADS)

    Dana, Roger A.

    1989-12-01

    Design and evaluation of radio frequency systems that must operate through ionospheric disturbances resulting from high altitude nuclear detonations requires an accurate channel model. This model must include the effects of high gain antennas that may be used to receive the signals. Such a model can then be used to construct realizations of the received signal for use in digital simulations of trans-ionospheric links or for use in hardware channel simulators. The FORTRAN channel model ACIRF (Antenna Channel Impulse Response Function) generates random realizations of the impulse response function at the outputs of multiple antennas. This user's guide describes ACIRF version 2.0, which handles antennas with arbitrary beamwidths, pointing angles, and relative positions. This channel model is valid under strong scattering conditions when Rayleigh fading statistics apply. Both frozen-in and turbulent models for the temporal fluctuations are included in this version of ACIRF. The theory of the channel model is described and several examples are given.
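
    A minimal sketch of the Rayleigh-fading core of such a channel model: impulse-response realizations whose taps are zero-mean complex Gaussian with powers following an assumed exponential delay profile. ACIRF's antenna filtering and its frozen-in and turbulent temporal models are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def rayleigh_impulse_response(n_taps=32, n_real=1000, decay=0.2):
        """Taps h_k ~ CN(0, P_k) with an exponential delay power profile P_k."""
        power = np.exp(-decay * np.arange(n_taps))
        power /= power.sum()                      # normalize total power to 1
        h = (rng.standard_normal((n_real, n_taps))
             + 1j * rng.standard_normal((n_real, n_taps)))
        return h * np.sqrt(power / 2.0)           # per-component variance P_k/2

    h = rayleigh_impulse_response()
    print("mean total power:", np.mean(np.sum(np.abs(h) ** 2, axis=1)))  # ~1.0
    print("tap-0 envelope mean (Rayleigh):", np.abs(h[:, 0]).mean())
    ```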

  4. A coarse grain model for protein-surface interactions

    NASA Astrophysics Data System (ADS)

    Wei, Shuai; Knotts, Thomas A.

    2013-09-01

    The interaction of proteins with surfaces is important in numerous applications in many fields—such as biotechnology, proteomics, sensors, and medicine—but fundamental understanding of how protein stability and structure are affected by surfaces remains incomplete. Over the last several years, molecular simulation using coarse grain models has yielded significant insights, but the formalisms used to represent the surface interactions have been rudimentary. We present a new model for protein-surface interactions that incorporates the chemical specificity of both the surface and the residues comprising the protein in the context of a one-bead-per-residue, coarse grain approach that maintains computational efficiency. The model is parameterized against experimental adsorption energies for multiple model peptides on different types of surfaces. The validity of the model is established by its ability to quantitatively and qualitatively predict the free energy of adsorption and structural changes for multiple biologically-relevant proteins on different surfaces. The validation, done with proteins not used in parameterization, shows that the model produces remarkable agreement between simulation and experiment.

  5. A Taxonomy of Latent Structure Assumptions for Probability Matrix Decomposition Models.

    ERIC Educational Resources Information Center

    Meulders, Michel; De Boeck, Paul; Van Mechelen, Iven

    2003-01-01

    Proposed a taxonomy of latent structure assumptions for probability matrix decomposition (PMD) that includes the original PMD model and a three-way extension of the multiple classification latent class model. Simulation study results show the usefulness of the taxonomy. (SLD)

  6. Flexible Modeling of Survival Data with Covariates Subject to Detection Limits via Multiple Imputation.

    PubMed

    Bernhardt, Paul W; Wang, Huixia Judy; Zhang, Daowen

    2014-01-01

    Models for survival data generally assume that covariates are fully observed. However, in medical studies it is not uncommon for biomarkers to be censored at known detection limits. A computationally-efficient multiple imputation procedure for modeling survival data with covariates subject to detection limits is proposed. This procedure is developed in the context of an accelerated failure time model with a flexible seminonparametric error distribution. The consistency and asymptotic normality of the multiple imputation estimator are established and a consistent variance estimator is provided. An iterative version of the proposed multiple imputation algorithm that approximates the EM algorithm for maximum likelihood is also suggested. Simulation studies demonstrate that the proposed multiple imputation methods work well while alternative methods lead to estimates that are either biased or more variable. The proposed methods are applied to analyze the dataset from a recently-conducted GenIMS study.
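
    The mechanics of the multiple-imputation loop can be sketched as follows: impute each covariate value below the detection limit M times, fit the analysis model to each completed dataset, and pool estimates with Rubin's rules. For brevity this sketch uses a plain linear model and an improper imputation from the covariate's marginal distribution truncated above at the limit; the paper's procedure imputes conditionally on the outcome within an accelerated failure time model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, dl, M = 500, 1.0, 20              # sample size, detection limit, imputations
    x = rng.normal(1.2, 1.0, n)          # true biomarker
    y = 0.8 * x + rng.normal(0, 0.5, n)  # outcome (stand-in for log survival time)
    observed = x >= dl                   # values below dl are censored

    betas, variances = [], []
    for _ in range(M):
        x_imp = x.copy()
        n_cens = int((~observed).sum())
        # Improper imputation: marginal draws truncated below the detection limit
        draws = rng.normal(1.2, 1.0, 20 * n_cens)
        x_imp[~observed] = draws[draws < dl][:n_cens]
        X = np.column_stack([np.ones(n), x_imp])
        beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = res[0] / (n - 2)
        betas.append(beta[1])
        variances.append(sigma2 * np.linalg.inv(X.T @ X)[1, 1])

    qbar, ubar, b = np.mean(betas), np.mean(variances), np.var(betas, ddof=1)
    print(f"pooled slope {qbar:.3f}, total variance {ubar + (1 + 1 / M) * b:.5f}")
    ```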

  7. interThermalPhaseChangeFoam: A framework for two-phase flow simulations with thermally driven phase change

    NASA Astrophysics Data System (ADS)

    Nabil, Mahdi; Rattner, Alexander S.

    The volume-of-fluid (VOF) approach is a mature technique for simulating two-phase flows. However, VOF simulation of phase-change heat transfer is still in its infancy. Multiple closure formulations have been proposed in the literature, each suited to different applications. While these have enabled significant research advances, few implementations are publicly available, actively maintained, or inter-operable. Here, a VOF solver is presented (interThermalPhaseChangeFoam), which incorporates an extensible framework for phase-change heat transfer modeling, enabling simulation of diverse phenomena in a single environment. The solver employs object oriented OpenFOAM library features, including Run-Time-Type-Identification to enable rapid implementation and run-time selection of phase change and surface tension force models. The solver is packaged with multiple phase change and surface tension closure models, adapted and refined from earlier studies. This code has previously been applied to study wavy film condensation, Taylor flow evaporation, nucleate boiling, and dropwise condensation. Tutorial cases are provided for simulation of horizontal film condensation, smooth and wavy falling film condensation, nucleate boiling, and bubble condensation. Validation and grid sensitivity studies, interfacial transport models, effects of spurious currents from surface tension models, effects of artificial heat transfer due to numerical factors, and parallel scaling performance are described in detail in the Supplemental Material (see Appendix A). By incorporating the framework and demonstration cases into a single environment, users can rapidly apply the solver to study phase-change processes of interest.

  8. Simulating the effects of climatic variation on stem carbon accumulation of a ponderosa pine stand: comparison with annual growth increment data.

    PubMed

    Hunt, E R; Martin, F C; Running, S W

    1991-01-01

    Simulation models of ecosystem processes may be necessary to separate the long-term effects of climate change on forest productivity from the effects of year-to-year variations in climate. The objective of this study was to compare simulated annual stem growth with measured annual stem growth from 1930 to 1982 for a uniform stand of ponderosa pine (Pinus ponderosa Dougl.) in Montana, USA. The model, FOREST-BGC, was used to simulate growth assuming leaf area index (LAI) was either constant or increasing. The measured stem annual growth increased exponentially over time; the differences between the simulated and measured stem carbon accumulations were not large. Growth trends were removed from both the measured and simulated annual increments of stem carbon to enhance the year-to-year variations in growth resulting from climate. The detrended increments from the increasing LAI simulation fit the detrended increments of the stand data over time with an R(2) of 0.47; the R(2) increased to 0.65 when the previous year's simulated detrended increment was included with the current year's simulated increment to account for autocorrelation. Stepwise multiple linear regression of the detrended increments of the stand data versus monthly meteorological variables had an R(2) of 0.37, and the R(2) increased to 0.47 when the previous year's meteorological data were included to account for autocorrelation. Thus, FOREST-BGC was more sensitive to the effects of year-to-year climate variation on annual stem growth than were multiple linear regression models.
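
    The detrend-then-regress comparison can be illustrated on synthetic series: fit an exponential growth trend in log space, subtract it from both the measured and simulated increments, and regress the measured residuals on the current and previous year's simulated residuals to account for autocorrelation. All numbers below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.arange(1930, 1983) - 1930
    climate = rng.normal(0, 1, t.size)                 # year-to-year signal
    measured = np.exp(0.02 * t) * (1 + 0.15 * climate) + rng.normal(0, 0.05, t.size)
    simulated = np.exp(0.02 * t) * (1 + 0.12 * climate) + rng.normal(0, 0.05, t.size)

    def detrend(y, t):
        """Remove an exponential trend fitted as a straight line in log space."""
        slope, intercept = np.polyfit(t, np.log(y), 1)
        return y - np.exp(intercept + slope * t)

    dm, ds = detrend(measured, t), detrend(simulated, t)
    X = np.column_stack([np.ones(t.size - 1), ds[1:], ds[:-1]])  # current + lag 1
    beta, res, *_ = np.linalg.lstsq(X, dm[1:], rcond=None)
    r2 = 1 - res[0] / np.sum((dm[1:] - dm[1:].mean()) ** 2)
    print(f"R^2 with lagged simulated increment: {r2:.2f}")
    ```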

  9. A microcontroller-based simulation of dural venous sinus injury for neurosurgical training.

    PubMed

    Cleary, Daniel R; Siler, Dominic A; Whitney, Nathaniel; Selden, Nathan R

    2018-05-01

    OBJECTIVE Surgical simulation has the potential to supplement and enhance traditional resident training. However, the high cost of equipment and limited number of available scenarios have inhibited wider integration of simulation in neurosurgical education. In this study the authors provide initial validation of a novel, low-cost simulation platform that recreates the stress of surgery using a combination of hands-on, model-based, and computer elements. Trainee skill was quantified using multiple time and performance measures. The simulation was initially validated using trainees at the start of their intern year. METHODS The simulation recreates intraoperative superior sagittal sinus injury complicated by air embolism. The simulator model consists of 2 components: a reusable base and a disposable craniotomy pack. The simulator software is flexible and modular to allow adjustments in difficulty or the creation of entirely new clinical scenarios. The reusable simulator base incorporates a powerful microcomputer and multiple sensors and actuators to provide continuous feedback to the software controller, which in turn adjusts both the screen output and physical elements of the model. The disposable craniotomy pack incorporates 3D-printed sections of model skull and brain, as well as artificial dura that incorporates a model sagittal sinus. RESULTS Twelve participants at the 2015 Western Region Society of Neurological Surgeons postgraduate year 1 resident course ("boot camp") provided informed consent and enrolled in a study testing the prototype device. Each trainee was required to successfully create a bilateral parasagittal craniotomy, repair a dural sinus tear, and recognize and correct an air embolus. Participant stress was measured using a heart rate wrist monitor. After participation, each resident completed a 13-question categorical survey. CONCLUSIONS All trainee participants experienced tachycardia during the simulation, although the point in the simulation at which they experienced tachycardia varied. Survey results indicated that participants agreed the simulation was realistic, created stress, and was a useful tool in training neurosurgical residents. This simulator represents a novel, low-cost approach for hands-on training that effectively teaches and tests residents without risk of patient injury.

  10. Characterization of Compton-scatter imaging with an analytical simulation method

    PubMed Central

    Jones, Kevin C; Redler, Gage; Templeton, Alistair; Bernard, Damian; Turian, Julius V; Chu, James C H

    2018-01-01

    By collimating the photons scattered when a megavoltage therapy beam interacts with the patient, a Compton-scatter image may be formed without the delivery of an extra dose. To characterize and assess the potential of the technique, an analytical model for simulating scatter images was developed and validated against Monte Carlo (MC). For three phantoms, the scatter images collected during irradiation with a 6 MV flattening-filter-free therapy beam were simulated. Images, profiles, and spectra were compared for different phantoms and different irradiation angles. The proposed analytical method simulates accurate scatter images up to 1000 times faster than MC. Minor differences between MC and analytical simulated images are attributed to limitations in the isotropic superposition/convolution algorithm used to analytically model multiple-order scattering. For a detector placed at 90° relative to the treatment beam, the simulated scattered photon energy spectrum peaks at 140–220 keV, and 40–50% of the photons are the result of multiple scattering. The high energy photons originate at the beam entrance. Increasing the angle between source and detector increases the average energy of the collected photons and decreases the relative contribution of multiple scattered photons. Multiple scattered photons cause blurring in the image. For an ideal 5 mm diameter pinhole collimator placed 18.5 cm from the isocenter, 10 cGy of deposited dose (2 Hz imaging rate for 1200 MU min−1 treatment delivery) is expected to generate an average 1000 photons per mm2 at the detector. For the considered lung tumor CT phantom, the contrast is high enough to clearly identify the lung tumor in the scatter image. Increasing the treatment beam size perpendicular to the detector plane decreases the contrast, although the scatter subject contrast is expected to be greater than the megavoltage transmission image contrast. With the analytical method, real-time tumor tracking may be possible through comparison of simulated and acquired patient images. PMID:29243663
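
    The quoted scattered-photon energies are consistent with the single-scatter Compton relation E' = E / (1 + (E / 511 keV)(1 - cos θ)): at 90° the scattered energy saturates below 511 keV no matter how energetic the incident photon, and multiple scattering pushes the detected spectrum lower still. A quick check at representative beam energies:

    ```python
    import numpy as np

    ME_C2_KEV = 511.0                           # electron rest energy [keV]

    def compton_scattered_energy(e_kev, theta_rad):
        """Energy of a photon Compton-scattered through angle theta."""
        return e_kev / (1.0 + (e_kev / ME_C2_KEV) * (1.0 - np.cos(theta_rad)))

    theta = np.radians(90.0)                    # detector at 90 deg to the beam
    for e in (200.0, 511.0, 2000.0, 6000.0):    # representative energies [keV]
        print(f"E = {e:6.0f} keV -> E' = {compton_scattered_energy(e, theta):5.1f} keV")
    ```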

  11. Dealing with Multiple Solutions in Structural Vector Autoregressive Models.

    PubMed

    Beltz, Adriene M; Molenaar, Peter C M

    2016-01-01

    Structural vector autoregressive models (VARs) hold great potential for psychological science, particularly for time series data analysis. They capture the magnitude, direction of influence, and temporal (lagged and contemporaneous) nature of relations among variables. Unified structural equation modeling (uSEM) is an optimal structural VAR instantiation, according to large-scale simulation studies, and it is implemented within an SEM framework. However, little is known about the uniqueness of uSEM results. Thus, the goal of this study was to investigate whether multiple solutions result from uSEM analysis and, if so, to demonstrate ways to select an optimal solution. This was accomplished with two simulated data sets, an empirical data set concerning children's dyadic play, and modifications to the group iterative multiple model estimation (GIMME) program, which implements uSEMs with group- and individual-level relations in a data-driven manner. Results revealed multiple solutions when there were large contemporaneous relations among variables. Results also verified several ways to select the correct solution when the complete solution set was generated, such as the use of cross-validation, maximum standardized residuals, and information criteria. This work has immediate and direct implications for the analysis of time series data and for the inferences drawn from those data concerning human behavior.
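
    For orientation, the lagged part of a structural VAR can be estimated by ordinary least squares, as sketched below on synthetic data. Identifying the contemporaneous relations, which is where the multiple-solution issue discussed above arises, requires the additional structure that uSEM and GIMME supply and is not shown.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    A_true = np.array([[0.5, 0.2],
                       [-0.3, 0.4]])            # invented lag-1 coefficients
    T = 500
    X = np.zeros((T, 2))
    for t in range(1, T):
        X[t] = A_true @ X[t - 1] + rng.normal(0, 0.5, 2)

    # Least squares for X_t = A X_{t-1} + e_t: solve X[:-1] @ B = X[1:], A = B.T
    B, *_ = np.linalg.lstsq(X[:-1], X[1:], rcond=None)
    print("estimated lag matrix:\n", B.T.round(2))
    ```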

  12. epiDMS: Data Management and Analytics for Decision-Making From Epidemic Spread Simulation Ensembles.

    PubMed

    Liu, Sicong; Poccia, Silvestro; Candan, K Selçuk; Chowell, Gerardo; Sapino, Maria Luisa

    2016-12-01

    Carefully calibrated large-scale computational models of epidemic spread represent a powerful tool to support the decision-making process during epidemic emergencies. Epidemic models are being increasingly used for generating forecasts of the spatial-temporal progression of epidemics at different spatial scales and for assessing the likely impact of different intervention strategies. However, the management and analysis of simulation ensembles stemming from large-scale computational models pose challenges, particularly when dealing with multiple interdependent parameters, spanning multiple layers and geospatial frames, affected by complex dynamic processes operating at different resolutions. We describe and illustrate with examples a novel epidemic simulation data management system, epiDMS, that was developed to address the challenges that arise from the need to generate, search, visualize, and analyze, in a scalable manner, large volumes of epidemic simulation ensembles and observations during the progression of an epidemic. epiDMS is a publicly available system that facilitates management and analysis of large epidemic simulation ensembles. epiDMS aims to fill an important gap in decision-making during healthcare emergencies by enabling critical services with significant economic and health impact.

  13. Design of Accelerator Online Simulator Server Using Structured Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Guobao (Brookhaven); Chu, Chungming

    2012-07-06

    Model-based control plays an important role for a modern accelerator during beam commissioning, beam study, and even daily operation. With a realistic model, beam behaviour can be predicted and therefore effectively controlled. The approach used by most current high level application environments is to use a built-in simulation engine and feed a realistic model into that simulation engine. Instead of this traditional monolithic structure, a new approach using a client-server architecture is under development. An on-line simulator server is accessed via network-accessible structured data. With this approach, a user can easily access multiple simulation codes. This paper describes the design, implementation, and current status of PVData, which defines the structured data, and PVAccess, which provides network access to the structured data.

  14. Comparison of software models for energy savings from cool roofs

    DOE PAGES

    New, Joshua; Miller, William A.; Huang, Yu; ...

    2015-06-07

    For this study, a web-based Roof Savings Calculator (RSC) has been deployed for the United States Department of Energy as an industry-consensus tool to help building owners, manufacturers, distributors, contractors and researchers easily run complex roof and attic simulations. RSC simulates multiple roof and attic technologies for side-by-side comparison including reflective roofs, different roof slopes, above sheathing ventilation, radiant barriers, low-emittance roof surfaces, duct location, duct leakage rates, multiple substrate types, and insulation levels. Annual simulations of hour-by-hour, whole-building performance are used to provide estimated annual energy and cost savings from reduced HVAC use. While RSC reported cooling savings similar to those of other simulation engines, the heating penalty varied significantly. RSC results show reduced cool roofing cost-effectiveness, thus mitigating expected economic incentives for this countermeasure to the urban heat island effect. This paper consolidates comparison of RSC's projected energy savings to other simulation engines including DOE-2.1E, AtticSim, Micropas, and EnergyPlus. Also included are comparisons to previous simulation-based studies, analysis of RSC cooling savings and heating penalties, the role of radiative heat exchange in an attic assembly, and changes made for increased accuracy of the duct model. Finally, radiant heat transfer and duct interaction not previously modeled are considered major contributors to the heating penalties.

  15. Interaction of multiple biomimetic antimicrobial polymers with model bacterial membranes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baul, Upayan, E-mail: upayanb@imsc.res.in; Vemparala, Satyavani, E-mail: vani@imsc.res.in; Kuroda, Kenichi, E-mail: kkuroda@umich.edu

    Using atomistic molecular dynamics simulations, the interaction of multiple methacrylate-based synthetic random copolymers with prototypical bacterial membranes is investigated. The simulations show that the cationic polymers form a micellar aggregate in the water phase and that the aggregate, when interacting with the bacterial membrane, induces the oppositely charged anionic lipid molecules to form clusters and enhances the ordering of lipid chains. The model bacterial membrane consequently develops lateral inhomogeneity in its thickness profile compared to the polymer-free system. The individual polymers in the aggregate are released into the bacterial membrane in a phased manner, and the simulations suggest that the most probable location of the partitioned polymers is near the 1-palmitoyl-2-oleoyl-phosphatidylglycerol (POPG) clusters. The partitioned polymers preferentially adopt facially amphiphilic conformations at the lipid-water interface, despite lacking intrinsic secondary structures such as the α-helix or β-sheet found in naturally occurring antimicrobial peptides.

  16. Molecular Dynamics Study of Water Flow across Multiple Layers of Pristine, Oxidized, and Mixed Regions of Graphene Oxide.

    PubMed

    Willcox, Jon A L; Kim, Hyung J

    2017-02-28

    A molecular dynamics graphene oxide model is used to shed light on commonly overlooked features of graphene oxide membranes. The model features both perpendicular and parallel water flow across multiple sheets of pristine and/or oxidized graphene to simulate "brick-and-mortar" microstructures. Additionally, regions of pristine/oxidized graphene overlap that have thus far been overlooked in the literature are explored. Differences in orientational and hydrogen-bonding features between adjacent layers of water in this mixed region are found to be even more prominent than differences between pristine and oxidized channels. This region also shows lateral water flow in equilibrium simulations and orthogonal flow in non-equilibrium simulations significantly greater than those in the oxidized region, suggesting it may play a non-negligible role in the mechanism of water flow across graphene oxide membranes.

  17. A weighted multiple-relaxation-time lattice Boltzmann method for multiphase flows and its application to partial coalescence cascades

    NASA Astrophysics Data System (ADS)

    Fakhari, Abbas; Bolster, Diogo; Luo, Li-Shi

    2017-07-01

    We present a lattice Boltzmann method (LBM) with a weighted multiple-relaxation-time (WMRT) collision model and an adaptive mesh refinement (AMR) algorithm for direct numerical simulation of two-phase flows in three dimensions. The proposed WMRT model enhances the numerical stability of the LBM for immiscible fluids at high density ratios, particularly on the D3Q27 lattice. The effectiveness and efficiency of the proposed WMRT-LBM-AMR is validated through simulations of (a) buoyancy-driven motion and deformation of a gas bubble rising in a viscous liquid; (b) the bag-breakup mechanism of a falling drop; (c) crown splashing of a droplet on a wet surface; and (d) the partial coalescence mechanism of a liquid drop at a liquid-liquid interface. The numerical simulations agree well with available experimental data and theoretical approximations where applicable.

  18. ViSBARD: Visual System for Browsing, Analysis and Retrieval of Data

    NASA Astrophysics Data System (ADS)

    Roberts, D. Aaron; Boller, Ryan; Rezapkin, V.; Coleman, J.; McGuire, R.; Goldstein, M.; Kalb, V.; Kulkarni, R.; Luckyanova, M.; Byrnes, J.; Kerbel, U.; Candey, R.; Holmes, C.; Chimiak, R.; Harris, B.

    2018-04-01

    ViSBARD interactively visualizes and analyzes space physics data. It provides an interactive integrated 3-D and 2-D environment to determine correlations between measurements across many spacecraft. It supports a variety of spacecraft data products and MHD models and is easily extensible to others. ViSBARD provides a way of visualizing multiple vector and scalar quantities as measured by many spacecraft at once. The data are displayed three-dimensionally along the orbits, which may be rendered either as connected lines or as points. The data display allows the rapid determination of vector configurations, correlations between many measurements at multiple points, and global relationships. With the addition of magnetohydrodynamic (MHD) model data, this environment can also be used to validate simulation results with observed data, use simulated data to provide a global context for sparse observed data, and apply feature detection techniques to the simulated data.

  19. Slot-like capacity and resource-like coding in a neural model of multiple-item working memory.

    PubMed

    Standage, Dominic; Pare, Martin

    2018-06-27

    For the past decade, research on the storage limitations of working memory has been dominated by two fundamentally different hypotheses. On the one hand, the contents of working memory may be stored in a limited number of 'slots', each with a fixed resolution. On the other hand, any number of items may be stored, but with decreasing resolution. These two hypotheses have been invaluable in characterizing the computational structure of working memory, but neither provides a complete account of the available experimental data, nor speaks to the neural basis of the limitations it characterizes. To address these shortcomings, we simulated a multiple-item working memory task with a cortical network model, the cellular resolution of which allowed us to quantify the coding fidelity of memoranda as a function of memory load, as measured by the discriminability, regularity and reliability of simulated neural spiking. Our simulations account for a wealth of neural and behavioural data from human and non-human primate studies, and they demonstrate that feedback inhibition lowers both capacity and coding fidelity. Because the strength of inhibition scales with the number of items stored by the network, increasing this number progressively lowers fidelity until capacity is reached. Crucially, the model makes specific, testable predictions for neural activity on multiple-item working memory tasks.

  1. Improving ecosystem-scale modeling of evapotranspiration using ecological mechanisms that account for compensatory responses following disturbance

    NASA Astrophysics Data System (ADS)

    Millar, David J.; Ewers, Brent E.; Mackay, D. Scott; Peckham, Scott; Reed, David E.; Sekoni, Adewale

    2017-09-01

    Mountain pine beetle outbreaks in western North America have led to extensive forest mortality, justifiably generating interest in improving our understanding of how this type of ecological disturbance affects hydrological cycles. While observational studies and simulations have been used to elucidate the effects of mountain pine beetle mortality on hydrological fluxes, an ecologically mechanistic model of forest evapotranspiration (ET) evaluated against field data has yet to be developed. In this work, we use the Terrestrial Regional Ecosystem Exchange Simulator (TREES) to incorporate the ecohydrological impacts of mountain pine beetle disturbance on ET for a lodgepole pine-dominated forest equipped with an eddy covariance tower. An existing degree-day model was incorporated that predicted the life cycle of mountain pine beetles, along with an empirically derived submodel that allowed sap flux to decline as a function of temperature-dependent blue stain fungal growth. The eddy covariance footprint was divided into multiple cohorts for multiple growing seasons, including representations of recently attacked trees and the compensatory effects of regenerating understory, using two different spatial scaling methods. Our results showed that the multiple cohort approach matched eddy covariance-measured ecosystem-scale ET fluxes well, and showed improved performance compared to model simulations assuming a binary framework of only areas of live and dead overstory. Cumulative growing season ecosystem-scale ET fluxes were 8-29% greater using the multicohort approach during years in which beetle attacks occurred, highlighting the importance of including compensatory ecological mechanisms in ET models.
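
    The degree-day phenology component mentioned above works by accumulating daily heat above a development threshold until a stage requirement is met. A minimal sketch with invented threshold and requirement values, not the parameters of the model used in the study:

    ```python
    import numpy as np

    def degree_days(daily_mean_temp_c, threshold_c=5.6):
        """Daily degree-day contributions above the development threshold."""
        return np.maximum(daily_mean_temp_c - threshold_c, 0.0)

    def day_requirement_met(daily_mean_temp_c, requirement_dd=300.0, threshold_c=5.6):
        """First day on which cumulative degree-days reach the requirement."""
        cum = np.cumsum(degree_days(daily_mean_temp_c, threshold_c))
        hits = np.nonzero(cum >= requirement_dd)[0]
        return int(hits[0]) if hits.size else None

    rng = np.random.default_rng(4)
    season = 15.0 + 8.0 * np.sin(np.linspace(0, np.pi, 180)) + rng.normal(0, 2, 180)
    print("requirement met on day:", day_requirement_met(season))
    ```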

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Yang; Liu, Zhiqiang, E-mail: lzq@semi.ac.cn; Yi, Xiaoyan, E-mail: spring@semi.ac.cn

    To evaluate electron leakage in InGaN/GaN multiple quantum well (MQW) light-emitting diodes (LEDs), analytic models of ballistic and quasi-ballistic transport are developed. With these models, the impact of critical variables affecting electron leakage, including the electron blocking layer (EBL), the MQW structure, the polarization field, and temperature, is explored. The simulated results shed light on previously reported experimental observations and provide basic criteria for suppressing electron leakage, advancing the design of InGaN/GaN LEDs.

  3. Traffic and Driving Simulator Based on Architecture of Interactive Motion.

    PubMed

    Paz, Alexander; Veeramisti, Naveen; Khaddar, Romesh; de la Fuente-Mella, Hanns; Modorcea, Luiza

    2015-01-01

    This study proposes an architecture for an interactive motion-based traffic simulation environment. In order to enhance modeling realism involving actual human beings, the proposed architecture integrates multiple types of simulation, including: (i) motion-based driving simulation, (ii) pedestrian simulation, (iii) motorcycling and bicycling simulation, and (iv) traffic flow simulation. The architecture has been designed to enable the simulation of the entire network; as a result, the actual driver, pedestrian, and bike rider can navigate anywhere in the system. In addition, the background traffic interacts with the actual human beings. This is accomplished by using a hybrid meso-microscopic traffic flow simulation modeling approach. The mesoscopic traffic flow simulation model loads the results of a user equilibrium traffic assignment solution and propagates the corresponding traffic through the entire system. The microscopic traffic flow simulation model provides background traffic around the vicinities where actual human beings are navigating the system. The two traffic flow simulation models interact continuously to update system conditions based on the interactions between actual humans and the fully simulated entities. Implementation efforts are currently in progress and some preliminary tests of individual components have been conducted. The implementation of the proposed architecture faces significant challenges ranging from multiplatform and multilanguage integration to multievent communication and coordination.

  4. RVA: A Plugin for ParaView 3.14

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-09-04

    RVA is a plugin developed for the 64-bit Windows version of the ParaView 3.14 visualization package. RVA is designed to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoirs visualization and analysis needs, including: sophisticated connectivity analysis, cross sections through simulation results between selected wells, simplified volumetric calculations, global vertical exaggeration adjustments, ingestion of UTChem simulation results, ingestion of Isatis geostatistical framework models, interrogation of joint geologic and reservoir modeling results, joint visualization and analysis of well history files, location-targeted visualization, advanced correlation analysis, visualization of flow paths, and creation of static images and animations highlighting targeted reservoir features.

  7. Advanced Multiple Processor Configuration Study. Final Report.

    ERIC Educational Resources Information Center

    Clymer, S. J.

    This summary of a study on multiple processor configurations includes the objectives, background, approach, and results of research undertaken to provide the Air Force with a generalized model of computer processor combinations for use in the evaluation of proposed flight training simulator computational designs. An analysis of a real-time flight…

  8. Using LANDIS II to study the effects of global change in Siberia

    Treesearch

    Eric J. Gustafson; Brian R. Sturtevant; Anatoly Z. Shvidenko; Robert M. Scheller

    2010-01-01

    Landscape dynamics are characterized by complex interactions among multiple disturbance regimes, anthropogenic use and management, and the mosaic of diverse ecological conditions. LANDIS-II is a landscape forest succession and disturbance model that independently simulates multiple ecological and disturbance processes, accounting for complex interactions to predict...

  9. Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.

    PubMed

    Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U

    2015-05-01

    The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English and assessed relevant clinical endpoints, and we summarized methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. User’s Guide for Biodegradation Reactions in TMVOCBio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jung, Yoojin; Battistelli, Alfredo

    TMVOCBio is an extended version of the TMVOC numerical reservoir simulator, with the capability of simulating multiple biodegradation reactions mediated by different microbial populations or based on different redox reactions, thus involving different electron acceptors. This modeling feature is implemented within the existing TMVOC module in iTOUGH2. TMVOCBio, originally developed by Battistelli (2003; 2004), uses a general modified form of the Monod kinetic rate equation to simulate biodegradation reactions, which effectively simulates the uptake of a substrate while accounting for various limiting factors (i.e., limitation by substrate, electron acceptor, or nutrients). Two approaches are included: 1) a multiple Monod kinetic rate equation, which assumes all the limiting factors simultaneously affect the substrate uptake rate, and 2) a minimum Monod model, which assumes that the substrate uptake rate is controlled by the most limiting factor among those acting on the specific substrate. The limiting factors include biomass growth inhibition, toxicity effects, and competitive and non-competitive inhibition effects. The temperature and moisture dependence of biodegradation reactions is also considered. This report provides the mathematical formulations and assumptions used for modeling the biodegradation reactions, and describes additional modeling capabilities. A detailed description of the input format for biodegradation reactions is presented along with sample problems.
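
    The two uptake formulations described above differ only in how the individual Monod limitation terms are combined. A minimal sketch in Python; the rate constant, half-saturation values and concentrations are illustrative assumptions, not values from TMVOCBio.

      import numpy as np

      def monod_factor(conc, k_half):
          """Single Monod limitation term S / (K_S + S)."""
          return conc / (k_half + conc)

      def uptake_rate(mu_max, biomass, concs, k_halves, mode="multiple"):
          """Substrate uptake rate under several limiting factors.

          mode="multiple": all factors act simultaneously (product of terms).
          mode="minimum":  the most limiting factor alone controls the rate.
          """
          factors = np.array([monod_factor(c, k) for c, k in zip(concs, k_halves)])
          limitation = factors.prod() if mode == "multiple" else factors.min()
          return mu_max * biomass * limitation

      # Illustrative values: substrate, electron acceptor, nutrient (mg/L).
      concs = [2.0, 0.5, 4.0]
      k_halves = [1.0, 0.2, 1.0]
      print(uptake_rate(0.1, 50.0, concs, k_halves, mode="multiple"))
      print(uptake_rate(0.1, 50.0, concs, k_halves, mode="minimum"))

    The multiple Monod rate is always the smaller of the two, since every factor below unity further reduces the product, whereas the minimum model discards all but the tightest constraint.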

  11. A Lunar Surface Operations Simulator

    NASA Technical Reports Server (NTRS)

    Nayar, H.; Balaram, J.; Cameron, J.; Jain, A.; Lim, C.; Mukherjee, R.; Peters, S.; Pomerantz, M.; Reder, L.; Shakkottai, P.; hide

    2008-01-01

    The Lunar Surface Operations Simulator (LSOS) is being developed to support planning and design of space missions to return astronauts to the moon. Vehicles, habitats, dynamic and physical processes and related environment systems are modeled and simulated in LSOS to assist in the visualization and design optimization of systems for lunar surface operations. A parametric analysis tool and a data browser were also implemented to provide an intuitive interface to run multiple simulations and review their results. The simulator and parametric analysis capability are described in this paper.

  12. Carrier trajectory tracking equations for Simple-band Monte Carlo simulation of avalanche multiplication processes

    NASA Astrophysics Data System (ADS)

    Ong, J. S. L.; Charin, C.; Leong, J. H.

    2017-12-01

    Avalanche photodiodes (APDs) with steep electric field gradients generally have low excess noise that arises from carrier multiplication within the internal gain of the devices, and the Monte Carlo (MC) method is a popular device simulation tool for such devices. However, there are few articles relating to carrier trajectory modeling in MC models for such devices. In this work, a set of electric-field-gradient-dependent carrier trajectory tracking equations is developed and used to update the positions of carriers along the path during Simple-band Monte Carlo (SMC) simulations of APDs with non-uniform electric fields. The mean gain and excess noise results obtained from the SMC model employing these equations show good agreement with the results reported for a series of silicon diodes, including a p+n diode with steep electric field gradients. These results confirm the validity and demonstrate the feasibility of the trajectory tracking equations applied in SMC models for simulating mean gain and excess noise in APDs with non-uniform electric fields. Also, the simulation results of mean gain, excess noise, and carrier ionization positions obtained from the SMC model of this work agree well with those of the conventional SMC model employing the concept of a uniform electric field within a carrier free-flight. These results demonstrate that the electric field variation within a carrier free-flight has an insignificant effect on the predicted mean gain and excess noise results. Therefore, both the SMC model of this work and the conventional SMC model can be used to predict the mean gain and excess noise in APDs with highly non-uniform electric fields.

  13. Reduced rank models for travel time estimation of low order mode pulses.

    PubMed

    Chandrayadula, Tarun K; Wage, Kathleen E; Worcester, Peter F; Dzieciuch, Matthew A; Mercer, James A; Andrew, Rex K; Howe, Bruce M

    2013-10-01

    Mode travel time estimation in the presence of internal waves (IWs) is a challenging problem. IWs perturb the sound speed, which results in travel time wander and mode scattering. A standard approach to travel time estimation is to pulse compress the broadband signal, pick the peak of the compressed time series, and average the peak time over multiple receptions to reduce variance. The peak-picking approach implicitly assumes there is a single strong arrival and does not perform well when there are multiple arrivals due to scattering. This article presents a statistical model for the scattered mode arrivals and uses the model to design improved travel time estimators. The model is based on an Empirical Orthogonal Function (EOF) analysis of the mode time series. Range-dependent simulations and data from the Long-range Ocean Acoustic Propagation Experiment (LOAPEX) indicate that the modes are represented by a small number of EOFs. The reduced-rank EOF model is used to construct a travel time estimator based on the Matched Subspace Detector (MSD). Analysis of simulation and experimental data show that the MSDs are more robust to IW scattering than peak picking. The simulation analysis also highlights how IWs affect the mode excitation by the source.
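
    A hedged sketch of the reduced-rank idea behind this estimator: compute EOFs of an ensemble of mode arrival time series via an SVD, keep the few leading ones, and score a reception by the fraction of its energy captured in that subspace (the core quantity of a matched subspace detector). The toy pulse ensemble, noise level and rank below are illustrative assumptions, not LOAPEX processing parameters.

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy ensemble: 200 receptions x 128 time samples of a wandering mode arrival.
      t = np.linspace(-1.0, 1.0, 128)
      shifts = 0.05 * rng.standard_normal((200, 1))
      data = np.exp(-0.5 * ((t - shifts) / 0.1) ** 2)
      data += 0.05 * rng.standard_normal((200, 128))

      # EOFs are the right singular vectors of the demeaned ensemble.
      mean = data.mean(axis=0)
      _, s, vt = np.linalg.svd(data - mean, full_matrices=False)

      rank = 3                      # a small number of EOFs captures most variance
      basis = vt[:rank].T           # 128 x rank orthonormal subspace

      explained = (s[:rank] ** 2).sum() / (s ** 2).sum()
      print(f"variance explained by {rank} EOFs: {explained:.3f}")

      # Matched-subspace statistic: fraction of a reception's energy in the subspace.
      x = data[0] - mean
      proj = basis @ (basis.T @ x)
      print("subspace energy fraction:", (proj @ proj) / (x @ x))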

  14. Risk Assessment and Prediction of Flyrock Distance by Combined Multiple Regression Analysis and Monte Carlo Simulation of Quarry Blasting

    NASA Astrophysics Data System (ADS)

    Armaghani, Danial Jahed; Mahdiyar, Amir; Hasanipanah, Mahdi; Faradonbeh, Roohollah Shirani; Khandelwal, Manoj; Amnieh, Hassan Bakhshandeh

    2016-09-01

    Flyrock is considered as one of the main causes of human injury, fatalities, and structural damage among all undesirable environmental impacts of blasting. Therefore, it seems that the proper prediction/simulation of flyrock is essential, especially in order to determine the blast safety area. If proper control measures are taken, then the flyrock distance can be controlled, and, in turn, the risk of damage can be reduced or eliminated. The first objective of this study was to develop a predictive model for flyrock estimation based on multiple regression (MR) analyses; after that, the flyrock phenomenon was simulated by the Monte Carlo (MC) approach using the developed MR model. In order to achieve the objectives of this study, 62 blasting operations were investigated in Ulu Tiram quarry, Malaysia, and some controllable and uncontrollable factors were carefully recorded/calculated. The obtained results of MC modeling indicated that this approach is capable of simulating flyrock ranges with a good level of accuracy. The mean of simulated flyrock by MC was obtained as 236.3 m, while this value was achieved as 238.6 m for the measured one. Furthermore, a sensitivity analysis was also conducted to investigate the effects of model inputs on the output of the system. The analysis demonstrated that powder factor is the most influential parameter on flyrock among all model inputs. It is noticeable that the proposed MR and MC models should be utilized only in the studied area, and their direct use in other conditions is not recommended.
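
    The combined workflow can be sketched as: fit a multiple regression model to recorded blasting parameters, then push sampled inputs through it Monte Carlo style to obtain a flyrock distance distribution. The coefficients, input distributions and variable names below are illustrative assumptions, not those fitted for the Ulu Tiram quarry.

      import numpy as np

      rng = np.random.default_rng(42)

      # Hypothetical fitted MR model:
      # flyrock = b0 + b1*powder_factor + b2*burden + b3*stemming
      coef = np.array([120.0, 95.0, -14.0, -8.0])   # illustrative coefficients

      def flyrock_mr(powder_factor, burden, stemming):
          x = np.column_stack(
              [np.ones_like(powder_factor), powder_factor, burden, stemming])
          return x @ coef

      # Monte Carlo: sample inputs from assumed distributions of site conditions.
      n = 100_000
      powder_factor = rng.normal(1.2, 0.2, n)   # kg/m^3
      burden = rng.normal(3.0, 0.4, n)          # m
      stemming = rng.normal(2.5, 0.3, n)        # m

      distance = flyrock_mr(powder_factor, burden, stemming)
      print(f"mean simulated flyrock: {distance.mean():.1f} m")
      print(f"95th percentile (blast safety area): {np.percentile(distance, 95):.1f} m")

    A percentile of the simulated distribution, rather than the mean alone, is what would inform a safety perimeter, which is the practical motivation for wrapping the regression in a Monte Carlo loop.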

  15. Simulations of Operation Dynamics of Different Type GaN Particle Sensors

    PubMed Central

    Gaubas, Eugenijus; Ceponis, Tomas; Kalesinskas, Vidas; Pavlov, Jevgenij; Vysniauskas, Juozas

    2015-01-01

    The operation dynamics of the capacitor-type and PIN diode type detectors based on GaN have been simulated using the dynamic and drift-diffusion models. The drift-diffusion current simulations have been implemented by employing the software package Synopsys TCAD Sentaurus. The monopolar and bipolar drift regimes have been analyzed by using dynamic models based on the Shockley-Ramo theorem. The carrier multiplication processes determined by impact ionization have been considered in order to compensate carrier lifetime reduction due to introduction of radiation defects into GaN detector material. PMID:25751080

  16. Evolved Design, Integration, and Test of a Modular, Multi-Link, Spacecraft-Based Robotic Manipulator

    DTIC Science & Technology

    2016-06-01

    of the MATLAB code, the SPART model [24]. The portions of the SPART model relevant to this thesis are contained in (Appendices E–P). While the SPART...the kinematics and the dynamics of the system must be modeled and simulated numerically to understand how the system will behave for a given number... simulators with multiple-link robotic arms has been ongoing. B. STATE OF THE ART 1. An Overarching Context Space-based manipulators and the experimental

  17. A rigorous multiple independent binding site model for determining cell-based equilibrium dissociation constants.

    PubMed

    Drake, Andrew W; Klakamp, Scott L

    2007-01-10

    A new 4-parameter nonlinear equation based on the standard multiple independent binding site model (MIBS) is presented for fitting cell-based ligand titration data in order to calculate the ligand/cell receptor equilibrium dissociation constant and the number of receptors/cell. The most commonly used linear (Scatchard plot) or nonlinear 2-parameter model (a single binding site model found in commercial programs like Prism®) used for analysis of ligand/receptor binding data assumes only the K_D influences the shape of the titration curve. We demonstrate using simulated data sets that, depending upon the cell surface receptor expression level, the number of cells titrated, and the magnitude of the K_D being measured, this assumption of always being under K_D-controlled conditions can be erroneous and can lead to unreliable estimates for the binding parameters. We also compare and contrast the fitting of simulated data sets to the commonly used cell-based binding equation versus our more rigorous 4-parameter nonlinear MIBS model. It is shown through these simulations that the new 4-parameter MIBS model, when used for cell-based titrations under optimal conditions, yields highly accurate estimates of all binding parameters and hence should be the preferred model to fit cell-based experimental nonlinear titration data.
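
    To illustrate why ligand depletion matters, the sketch below fits simulated titration data with the exact depletion-aware quadratic binding solution (a simplified two-parameter cousin of the 4-parameter MIBS equation) instead of a simple one-site hyperbola. The parameter values and the use of scipy.optimize.curve_fit are assumptions for demonstration only.

      import numpy as np
      from scipy.optimize import curve_fit

      def bound_exact(l_total, kd, r_total):
          """Exact complex concentration with ligand depletion (quadratic root)."""
          b = l_total + r_total + kd
          disc = np.maximum(b * b - 4.0 * l_total * r_total, 0.0)  # numerical guard
          return (b - np.sqrt(disc)) / 2.0

      rng = np.random.default_rng(1)

      # Simulated titration where receptor concentration is comparable to K_D,
      # so the usual K_D-controlled (no-depletion) assumption breaks down.
      kd_true, r_total_true = 1.0, 5.0          # nM, illustrative
      l_total = np.logspace(-2, 2, 25)          # nM total ligand
      signal = bound_exact(l_total, kd_true, r_total_true)
      signal += 0.02 * r_total_true * rng.standard_normal(signal.size)

      popt, _ = curve_fit(bound_exact, l_total, signal, p0=[0.5, 1.0])
      print(f"fitted K_D = {popt[0]:.2f} nM, receptor = {popt[1]:.2f} nM")

    Fitting the same data with a hyperbola in total ligand would conflate K_D with the receptor concentration, which is the failure mode the paper documents.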

  18. HexSim: A flexible simulation model for forecasting wildlife responses to multiple interacting stressors

    EPA Science Inventory

    With SERDP funding, we have improved upon a popular life history simulator (PATCH), and in doing so produced a powerful new forecasting tool (HexSim). PATCH, our starting point, was spatially explicit and individual-based, and was useful for evaluating a range of terrestrial lif...

  19. On the Sensitivity of Atmospheric Ensembles to Cloud Microphysics in Long-Term Cloud-Resolving Model Simulations

    NASA Technical Reports Server (NTRS)

    Zeng, Xiping; Tao, Wei-Kuo; Lang, Stephen; Hou, Arthur Y.; Zhang, Minghua; Simpson, Joanne

    2008-01-01

    Month-long large-scale forcing data from two field campaigns are used to drive a cloud-resolving model (CRM) and produce ensemble simulations of clouds and precipitation. Observational data are then used to evaluate the model results. To improve the model results, a new parameterization of the Bergeron process is proposed that incorporates the number concentration of ice nuclei (IN). Numerical simulations reveal that atmospheric ensembles are sensitive to IN concentration and ice crystal multiplication. Two- (2D) and three-dimensional (3D) simulations are carried out to address the sensitivity of atmospheric ensembles to model dimensionality. It is found that the ensembles with high IN concentration are more sensitive to dimensionality than those with low IN concentration. Both the analytic solutions of linear dry models and the CRM output show that there are more convective cores with stronger updrafts in 3D simulations than in 2D, which explains the differing sensitivity of the ensembles to dimensionality at different IN concentrations.

  20. Ergodicity and model quality in template-restrained canonical and temperature/Hamiltonian replica exchange coarse-grained molecular dynamics simulations of proteins.

    PubMed

    Karczyńska, Agnieszka S; Czaplewski, Cezary; Krupa, Paweł; Mozolewska, Magdalena A; Joo, Keehyoung; Lee, Jooyoung; Liwo, Adam

    2017-12-05

    Molecular simulations restrained to single or multiple templates are commonly used in protein-structure modeling. However, the restraints introduce additional barriers, thus impairing the ergodicity of simulations, which can affect the quality of the resulting models. In this work, the effect of restraint types and simulation schemes on ergodicity and model quality was investigated by performing template-restrained canonical molecular dynamics (MD), multiplexed replica-exchange molecular dynamics, and Hamiltonian replica exchange molecular dynamics (HREMD) simulations with the coarse-grained UNRES force field on nine selected proteins, with pseudo-harmonic log-Gaussian (unbounded) or Lorentzian (bounded) restraint functions. The best ergodicity was exhibited by HREMD. It has been found that non-ergodicity does not affect model quality if good templates are used to generate restraints. However, when poor-quality restraints not covering the entire protein are used, the improved ergodicity of HREMD can lead to significantly improved protein models. © 2017 Wiley Periodicals, Inc.

  1. Surface Dimming by the 2013 Rim Fire Simulated by a Sectional Aerosol Model

    NASA Technical Reports Server (NTRS)

    Yu, Pengfei; Toon, Owen B.; Bardeen, Charles G; Bucholtz, Anthony; Rosenlof, Karen; Saide, Pablo E.; Da Silva, Arlindo M.; Ziemba, Luke D.; Thornhill, Kenneth L.; Jimenez, Jose-Luis; hide

    2016-01-01

    The Rim Fire of 2013, the third largest area burned by fire recorded in California history, is simulated by a climate model coupled with a size-resolved aerosol model. Modeled aerosol mass, number and particle size distribution are within variability of data obtained from multiple airborne in-situ measurements. Simulations suggest Rim Fire smoke may block 4-6% of sunlight energy reaching the surface, with a dimming efficiency around 120-150 W m⁻² per unit aerosol optical depth in the mid-visible at 13:00-15:00 local time. Underestimation of simulated smoke single scattering albedo at mid-visible by 0.04 suggests the model overestimates either the particle size or the absorption due to black carbon. This study shows that exceptional events like the 2013 Rim Fire can be simulated by a climate model with one-degree resolution with overall good skill, though that resolution is still not sufficient to resolve the smoke peak near the source region.

  2. Surface dimming by the 2013 Rim Fire simulated by a sectional aerosol model.

    PubMed

    Yu, Pengfei; Toon, Owen B; Bardeen, Charles G; Bucholtz, Anthony; Rosenlof, Karen H; Saide, Pablo E; Da Silva, Arlindo; Ziemba, Luke D; Thornhill, Kenneth L; Jimenez, Jose-Luis; Campuzano-Jost, Pedro; Schwarz, Joshua P; Perring, Anne E; Froyd, Karl D; Wagner, N L; Mills, Michael J; Reid, Jeffrey S

    2016-06-27

    The Rim Fire of 2013, the third largest area burned by fire recorded in California history, is simulated by a climate model coupled with a size-resolved aerosol model. Modeled aerosol mass, number, and particle size distribution are within variability of data obtained from multiple airborne in situ measurements. Simulations suggest that Rim Fire smoke may block 4-6% of sunlight energy reaching the surface, with a dimming efficiency around 120-150 W m⁻² per unit aerosol optical depth in the midvisible at 13:00-15:00 local time. Underestimation of simulated smoke single scattering albedo at midvisible by 0.04 suggests that the model overestimates either the particle size or the absorption due to black carbon. This study shows that exceptional events like the 2013 Rim Fire can be simulated by a climate model with 1° resolution with overall good skill, although that resolution is still not sufficient to resolve the smoke peak near the source region.

  3. Generation of animation sequences of three dimensional models

    NASA Technical Reports Server (NTRS)

    Poi, Sharon (Inventor); Bell, Brad N. (Inventor)

    1990-01-01

    The invention is directed toward a method and apparatus for generating an animated sequence through the movement of three-dimensional graphical models. A plurality of pre-defined graphical models are stored and manipulated in response to interactive commands or by means of a pre-defined command file. The models may be combined as part of a hierarchical structure to represent physical systems without need to create a separate model which represents the combined system. System motion is simulated through the introduction of translation, rotation and scaling parameters upon a model within the system. The motion is then transmitted down through the system hierarchy of models in accordance with hierarchical definitions and joint movement limitations. The present invention also calls for a method of editing hierarchical structure in response to interactive commands or a command file such that a model may be included, deleted, copied or moved within multiple system model hierarchies. The present invention also calls for the definition of multiple viewpoints or cameras which may exist as part of a system hierarchy or as an independent camera. The simulated movement of the models and systems is graphically displayed on a monitor and a frame is recorded by means of a video controller. Multiple movement and hierarchy manipulations are then recorded as a sequence of frames which may be played back as an animation sequence on a video cassette recorder.
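
    The hierarchical propagation described here is essentially a scene graph: each model carries a local transform, and a node's world transform is its parent's world transform composed with its own, so motion applied to a parent is transmitted to all models below it. A minimal 2D sketch (the node names and transforms are invented for illustration):

      import numpy as np

      class Node:
          """A model in the hierarchy with a local transform and child models."""
          def __init__(self, name, angle=0.0, offset=(0.0, 0.0)):
              self.name, self.angle = name, angle
              self.offset = np.asarray(offset, dtype=float)
              self.children = []

          def local_matrix(self):
              # Homogeneous 2D rotation + translation.
              c, s = np.cos(self.angle), np.sin(self.angle)
              return np.array([[c, -s, self.offset[0]],
                               [s,  c, self.offset[1]],
                               [0.0, 0.0, 1.0]])

          def world_matrices(self, parent=np.eye(3)):
              """Propagate motion down the hierarchy to every descendant model."""
              world = parent @ self.local_matrix()
              yield self.name, world
              for child in self.children:
                  yield from child.world_matrices(world)

      base = Node("base")
      arm = Node("arm", angle=0.3, offset=(1.0, 0.0))
      hand = Node("hand", angle=-0.1, offset=(0.8, 0.0))
      base.children.append(arm)
      arm.children.append(hand)

      for name, m in base.world_matrices():
          print(name, m[:2, 2])   # world position of each model

    Rotating "arm" moves "hand" automatically, which is the behavior the patent describes without requiring a separate combined model of the whole system.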

  4. Development of a resource modelling tool to support decision makers in pandemic influenza preparedness: The AsiaFluCap Simulator.

    PubMed

    Stein, Mart Lambertus; Rudge, James W; Coker, Richard; van der Weijden, Charlie; Krumkamp, Ralf; Hanvoravongchai, Piya; Chavez, Irwin; Putthasri, Weerasak; Phommasack, Bounlay; Adisasmito, Wiku; Touch, Sok; Sat, Le Minh; Hsu, Yu-Chen; Kretzschmar, Mirjam; Timen, Aura

    2012-10-12

    Health care planning for pandemic influenza is a challenging task which requires predictive models by which the impact of different response strategies can be evaluated. However, current preparedness plans and simulation exercises, as well as freely available simulation models previously made for policy makers, do not explicitly address the availability of health care resources or determine the impact of shortages on public health. Nevertheless, the feasibility of health systems to implement response measures or interventions described in plans and trained in exercises depends on the available resource capacity. As part of the AsiaFluCap project, we developed a comprehensive and flexible resource modelling tool to support public health officials in understanding and preparing for surges in resource demand during future pandemics. The AsiaFluCap Simulator is a combination of a resource model containing 28 health care resources and an epidemiological model. The tool was built in MS Excel® and contains a user-friendly interface which allows users to select mild or severe pandemic scenarios, change resource parameters and run simulations for one or multiple regions. Besides epidemiological estimations, the simulator provides indications on resource gaps or surpluses, and the impact of shortages on public health for each selected region. It allows for a comparative analysis of the effects of resource availability and consequences of different strategies of resource use, which can provide guidance on resource prioritising and/or mobilisation. Simulation results are displayed in various tables and graphs, and can also be easily exported to GIS software to create maps for geographical analysis of the distribution of resources. The AsiaFluCap Simulator is freely available software (http://www.cdprg.org) which can be used by policy makers, policy advisors, donors and other stakeholders involved in preparedness for providing evidence-based and illustrative information on health care resource capacities during future pandemics. The tool can inform both preparedness plans and simulation exercises and can help increase the general understanding of dynamics in resource capacities during a pandemic. The combination of a mathematical model with multiple resources and the linkage to GIS for creating maps makes the tool unique compared to other available software.

  5. Unscaled Bayes factors for multiple hypothesis testing in microarray experiments.

    PubMed

    Bertolino, Francesco; Cabras, Stefano; Castellanos, Maria Eugenia; Racugno, Walter

    2015-12-01

    Multiple hypothesis testing collects a series of techniques usually based on p-values as a summary of the available evidence from many statistical tests. In hypothesis testing, under a Bayesian perspective, the evidence for a specified hypothesis against an alternative, conditionally on data, is given by the Bayes factor. In this study, we approach multiple hypothesis testing based on both Bayes factors and p-values, regarding multiple hypothesis testing as a multiple model selection problem. To obtain the Bayes factors we assume default priors that are typically improper. In this case, the Bayes factor is usually undetermined due to the ratio of prior pseudo-constants. We show that ignoring prior pseudo-constants leads to unscaled Bayes factors, which do not invalidate the inferential procedure in multiple hypothesis testing, because they are used within a comparative scheme. In fact, using partial information from the p-values, we are able to approximate the sampling null distribution of the unscaled Bayes factor and use it within Efron's multiple testing procedure. The simulation study suggests that under a normal sampling model and even with small sample sizes, our approach provides false positive and false negative proportions that are less than other common multiple hypothesis testing approaches based only on p-values. The proposed procedure is illustrated in two simulation studies, and the advantages of its use are shown in the analysis of two microarray experiments. © The Author(s) 2011.

  6. Mathematical model for self-propelled droplets driven by interfacial tension

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagai, Ken H.; Tachibana, Kunihito; Tobe, Yuta

    2016-03-21

    We propose a model for the spontaneous motion of a droplet induced by inhomogeneity in interfacial tension. The model is derived from a variation of the Lagrangian of the system and we use a time-discretized Morse flow scheme to perform its numerical simulations. Our model can naturally simulate the dynamics of a single droplet, as well as that of multiple droplets, where the volume of each droplet is conserved. We reproduced the ballistic motion and fission of a droplet, and the collision of two droplets was also examined numerically.

  7. Development of a Human Motor Model for the Evaluation of an Integrated Alerting and Notification Flight Deck System

    NASA Technical Reports Server (NTRS)

    Daiker, Ron; Schnell, Thomas

    2010-01-01

    A human motor model was developed on the basis of performance data that was collected in a flight simulator. The motor model is under consideration as one component of a virtual pilot model for the evaluation of NextGen crew alerting and notification systems in flight decks. This model may be used in a digital Monte Carlo simulation to compare flight deck layout design alternatives. The virtual pilot model is being developed as part of a NASA project to evaluate multiple crew alerting and notification flight deck configurations. Model parameters were derived from empirical distributions of pilot data collected in a flight simulator experiment. The goal of this model is to simulate pilot motor performance in the approach-to-landing task. The unique challenges associated with modeling the complex dynamics of humans interacting with the cockpit environment are discussed, along with the current state and future direction of the model.

  8. LES-based filter-matrix lattice Boltzmann model for simulating fully developed turbulent channel flow

    NASA Astrophysics Data System (ADS)

    Zhuo, Congshan; Zhong, Chengwen

    2016-11-01

    In this paper, a three-dimensional filter-matrix lattice Boltzmann (FMLB) model based on large eddy simulation (LES) was verified for simulating wall-bounded turbulent flows. The Vreman subgrid-scale model was employed in the present FMLB-LES framework, which had been proved to be capable of predicting the turbulent near-wall region accurately. The fully developed turbulent channel flows were performed at a friction Reynolds number Reτ of 180. The turbulence statistics computed from the present FMLB-LES simulations, including the mean streamwise velocity profile, Reynolds stress profile and root-mean-square velocity fluctuations, agreed well with the LES results of the multiple-relaxation-time (MRT) LB model, and some discrepancies in comparison with the direct numerical simulation (DNS) data of Kim et al. were also observed due to the relatively low grid resolution. Moreover, to investigate the influence of grid resolution on the present LES simulation, a DNS simulation on a finer grid was also implemented with the present FMLB-D3Q19 model. Detailed comparisons of the computed turbulence statistics with available DNS benchmark data showed quite good agreement.

  9. Predicting Honeybee Colony Failure: Using the BEEHAVE Model to Simulate Colony Responses to Pesticides

    PubMed Central

    2015-01-01

    To simulate effects of pesticides on different honeybee (Apis mellifera L.) life stages, we used the BEEHAVE model to explore how increased mortalities of larvae, in-hive workers, and foragers, as well as reduced egg-laying rate, could impact colony dynamics over multiple years. Stresses were applied for 30 days, both as multiples of the modeled control mortality and as set percentage daily mortalities to assess the sensitivity of the modeled colony both to small fluctuations in mortality and periods of low to very high daily mortality. These stresses simulate stylized exposure of the different life stages to nectar and pollen contaminated with pesticide for 30 days. Increasing adult bee mortality had a much greater impact on colony survival than mortality of bee larvae or reduction in egg laying rate. Importantly, the seasonal timing of the imposed mortality affected the magnitude of the impact at colony level. In line with the LD50, we propose a new index of “lethal imposed stress”: the LIS50 which indicates the level of stress on individuals that results in 50% colony mortality. This (or any LISx) is a comparative index for exploring the effects of different stressors at colony level in model simulations. While colony failure is not an acceptable protection goal, this index could be used to inform the setting of future regulatory protection goals. PMID:26444386
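
    The proposed LIS50 can be read off simulation output by interpolating colony-level mortality as a function of imposed stress, analogously to an LD50 on a dose-response curve. A minimal sketch with made-up colony failure fractions (not BEEHAVE output):

      import numpy as np

      # Hypothetical model output: imposed daily mortality multiplier vs. the
      # fraction of simulated colonies that failed within the simulation horizon.
      stress = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0])
      colony_mortality = np.array([0.02, 0.10, 0.33, 0.58, 0.81, 0.95])

      # LIS50: stress level at which 50% of colonies fail (linear interpolation).
      lis50 = np.interp(0.5, colony_mortality, stress)
      print(f"LIS50 ≈ {lis50:.2f}x control mortality")

    Because the index is defined at colony rather than individual level, two stressors with identical individual LD50s can yield very different LIS50s depending on which life stage and season they hit, which is the comparative use the authors propose.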

  10. Zero-Point Energy Constraint for Unimolecular Dissociation Reactions. Giving Trajectories Multiple Chances To Dissociate Correctly.

    PubMed

    Paul, Amit K; Hase, William L

    2016-01-28

    A zero-point energy (ZPE) constraint model is proposed for classical trajectory simulations of unimolecular decomposition and applied to CH4* → H + CH3 decomposition. With this model trajectories are not allowed to dissociate unless they have ZPE in the CH3 product. If not, they are returned to the CH4* region of phase space and, if necessary, given additional opportunities to dissociate with ZPE. The lifetime for dissociation of an individual trajectory is the time it takes to dissociate with ZPE in CH3, including multiple possible returns to CH4*. With this ZPE constraint the dissociation of CH4* is exponential in time as expected for intrinsic RRKM dynamics and the resulting rate constant is in good agreement with the harmonic quantum value of RRKM theory. In contrast, a model that discards trajectories without ZPE in the reaction products gives a CH4* → H + CH3 rate constant that agrees with the classical and not quantum RRKM value. The rate constant for the purely classical simulation indicates that anharmonicity may be important and the rate constant from the ZPE constrained classical trajectory simulation may not represent the complete anharmonicity of the RRKM quantum dynamics. The ZPE constraint model proposed here is compared with previous models for restricting ZPE flow in intramolecular dynamics, and connecting product and reactant/product quantum energy levels in chemical dynamics simulations.
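
    The retry logic can be caricatured by a toy stochastic model: each dissociation attempt succeeds only if the nascent product happens to receive at least its ZPE, otherwise the trajectory is returned to the complex and waits for another attempt, with attempt times accumulating into the lifetime. The attempt rate, available energy and ZPE below are arbitrary illustrative values, not the CH4 dynamics of the paper; the point is that the constrained decay remains exponential with a thinned rate.

      import numpy as np

      rng = np.random.default_rng(7)

      k_attempt = 1.0    # dissociation attempts per ps (illustrative)
      e_avail = 2.0      # energy available to the product mode (arbitrary units)
      zpe = 1.2          # product zero-point energy (arbitrary units)

      def lifetime():
          """Accumulate attempt times until an attempt leaves ZPE in the product."""
          t = 0.0
          while True:
              t += rng.exponential(1.0 / k_attempt)   # wait for the next attempt
              if rng.uniform(0.0, e_avail) >= zpe:    # classical energy draw >= ZPE?
                  return t                            # dissociates "correctly"

      times = np.array([lifetime() for _ in range(20_000)])

      # Thinning a Poisson process by the per-attempt success probability keeps
      # the decay exponential, with an effective (reduced) rate constant.
      p_success = 1.0 - zpe / e_avail
      print(f"simulated rate: {1.0 / times.mean():.3f}, "
            f"expected: {k_attempt * p_success:.3f}")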

  11. Performance of distributed multiscale simulations

    PubMed Central

    Borgdorff, J.; Ben Belgacem, M.; Bona-Casas, C.; Fazendeiro, L.; Groen, D.; Hoenen, O.; Mizeranschi, A.; Suter, J. L.; Coster, D.; Coveney, P. V.; Dubitzky, W.; Hoekstra, A. G.; Strand, P.; Chopard, B.

    2014-01-01

    Multiscale simulations model phenomena across natural scales using monolithic or component-based code, running on local or distributed resources. In this work, we investigate the performance of distributed multiscale computing of component-based models, guided by six multiscale applications with different characteristics and from several disciplines. Three modes of distributed multiscale computing are identified: supplementing local dependencies with large-scale resources, load distribution over multiple resources, and load balancing of small- and large-scale resources. We find that the first mode has the apparent benefit of increasing simulation speed, and the second mode can increase simulation speed if local resources are limited. Depending on resource reservation and model coupling topology, the third mode may result in a reduction of resource consumption. PMID:24982258

  12. Memory interface simulator: A computer design aid

    NASA Technical Reports Server (NTRS)

    Taylor, D. S.; Williams, T.; Weatherbee, J. E.

    1972-01-01

    Results are presented of a study conducted with a digital simulation model being used in the design of the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. The model simulates the activity involved as instructions are fetched from random access memory for execution in one of the system central processing units. A series of model runs measured instruction execution time under various assumptions pertaining to the CPU's and the interface between the CPU's and RAM. Design tradeoffs are presented in the following areas: Bus widths, CPU microprogram read only memory cycle time, multiple instruction fetch, and instruction mix.

  13. Concurrent heterogeneous neural model simulation on real-time neuromimetic hardware.

    PubMed

    Rast, Alexander; Galluppi, Francesco; Davies, Sergio; Plana, Luis; Patterson, Cameron; Sharp, Thomas; Lester, David; Furber, Steve

    2011-11-01

    Dedicated hardware is becoming increasingly essential to simulate emerging very-large-scale neural models. Equally, however, it needs to be able to support multiple models of the neural dynamics, possibly operating simultaneously within the same system. This may be necessary either to simulate large models with heterogeneous neural types, or to simplify simulation and analysis of detailed, complex models in a large simulation by isolating the new model to a small subpopulation of a larger overall network. The SpiNNaker neuromimetic chip is a dedicated neural processor able to support such heterogeneous simulations. Implementing these models on-chip uses an integrated library-based tool chain incorporating the emerging PyNN interface that allows a modeller to input a high-level description and use an automated process to generate an on-chip simulation. Simulations using both LIF and Izhikevich models demonstrate the ability of the SpiNNaker system to generate and simulate heterogeneous networks on-chip, while illustrating, through the network-scale effects of wavefront synchronisation and burst gating, methods that can provide effective behavioural abstractions for large-scale hardware modelling. SpiNNaker's asynchronous virtual architecture permits greater scope for model exploration, with scalable levels of functional and temporal abstraction, than conventional (or neuromorphic) computing platforms. The complete system illustrates a potential path to understanding the neural model of computation, by building (and breaking) neural models at various scales, connecting the blocks, then comparing them against the biology: computational cognitive neuroscience. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Preferential attachment in multiple trade networks

    NASA Astrophysics Data System (ADS)

    Foschi, Rachele; Riccaboni, Massimo; Schiavo, Stefano

    2014-08-01

    In this paper we develop a model for the evolution of multiple networks which is able to replicate the concentrated and sparse nature of world trade data. Our model is an extension of the preferential attachment growth model to the case of multiple networks. Countries trade a variety of goods of different complexity. Every country progressively evolves from trading less sophisticated to high-tech goods. The probabilities of capturing more trade opportunities at a given level of complexity and of starting to trade more complex goods are both proportional to the number of existing trade links. We provide a set of theoretical predictions and simulation results. A calibration exercise shows that our model replicates the same concentration level of world trade as well as the sparsity pattern of the trade matrix. We also discuss a set of numerical solutions to deal with large multiple networks.
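
    A hedged sketch of preferential attachment over multiple networks: a country captures a new trade link in a product layer with probability proportional to its existing links there, and starts trading the next, more complex good with probability proportional to its total degree. The layer count, step count and upgrade rule are illustrative assumptions, not the calibrated model of the paper.

      import numpy as np

      rng = np.random.default_rng(3)

      n_countries, n_layers, n_steps = 50, 5, 5_000
      links = np.zeros((n_countries, n_layers))
      links[:, 0] = 1.0                        # everyone starts in the simplest layer
      active = np.zeros((n_countries, n_layers), bool)
      active[:, 0] = True

      for _ in range(n_steps):
          layer = rng.integers(n_layers)       # a trade opportunity appears in a layer
          weights = links[:, layer] * active[:, layer]
          if weights.sum() == 0:
              continue                         # no country trades this good yet
          c = rng.choice(n_countries, p=weights / weights.sum())  # rich get richer
          links[c, layer] += 1
          # Chance to start trading the next, more complex good, degree-proportional.
          nxt = layer + 1
          if nxt < n_layers and not active[c, nxt]:
              if rng.random() < 0.001 * links[c].sum():
                  active[c, nxt] = True
                  links[c, nxt] = 1.0

      # Concentration: share of all links held by the top 10% of countries.
      totals = np.sort(links.sum(axis=1))[::-1]
      print("top-10% share:", totals[: n_countries // 10].sum() / totals.sum())

    The same reinforcement acting on both link capture and layer entry is what concentrates complex goods in a few hub countries while keeping higher layers sparse.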

  15. Boltzmann sampling for an XY model using a non-degenerate optical parametric oscillator network

    NASA Astrophysics Data System (ADS)

    Takeda, Y.; Tamate, S.; Yamamoto, Y.; Takesue, H.; Inagaki, T.; Utsunomiya, S.

    2018-01-01

    We present an experimental scheme of implementing multiple spins in a classical XY model using a non-degenerate optical parametric oscillator (NOPO) network. We built an NOPO network to simulate a one-dimensional XY Hamiltonian with 5000 spins and externally controllable effective temperatures. The XY spin variables in our scheme are mapped onto the phases of multiple NOPO pulses in a single ring cavity, and interactions between XY spins are implemented by mutual injections between NOPOs. We show that the steady-state distribution of the optical phases of such NOPO pulses is equivalent to the Boltzmann distribution of the corresponding XY model. The estimated effective temperatures converged to the set values, and the estimated temperatures and the mean energy exhibited good agreement with numerical simulations of the Langevin dynamics of the NOPO phases.
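
    The claimed equivalence can be checked numerically: overdamped Langevin dynamics of the phases samples the Boltzmann distribution of the one-dimensional XY Hamiltonian H = -J Σ cos(θ_{i+1} − θ_i), whose mean bond energy has the closed form −J·I₁(J/T)/I₀(J/T). A minimal sketch (chain length, coupling and temperature are illustrative, and this is plain Langevin sampling, not a model of the NOPO device):

      import numpy as np
      from scipy.special import i0, i1

      rng = np.random.default_rng(0)

      n, J, T = 512, 1.0, 0.5          # spins, coupling, effective temperature
      dt, n_steps = 0.005, 40_000
      theta = rng.uniform(0.0, 2.0 * np.pi, n)

      for _ in range(n_steps):
          d = np.sin(np.diff(theta))    # sin(theta_{i+1} - theta_i) for each bond
          grad = np.zeros(n)            # dH/dtheta for H = -J * sum cos(diff)
          grad[:-1] -= J * d
          grad[1:] += J * d
          theta += -grad * dt + np.sqrt(2.0 * T * dt) * rng.standard_normal(n)

      mean_bond = -J * np.mean(np.cos(np.diff(theta)))
      exact = -J * i1(J / T) / i0(J / T)   # exact 1D XY mean bond energy
      print(f"sampled mean bond energy: {mean_bond:.3f}, exact: {exact:.3f}")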

  16. Nonparametric estimation of the heterogeneity of a random medium using compound Poisson process modeling of wave multiple scattering.

    PubMed

    Le Bihan, Nicolas; Margerin, Ludovic

    2009-07-01

    In this paper, we present a nonparametric method to estimate the heterogeneity of a random medium from the angular distribution of intensity of waves transmitted through a slab of random material. Our approach is based on the modeling of forward multiple scattering using compound Poisson processes on compact Lie groups. The estimation technique is validated through numerical simulations based on radiative transfer theory.

  17. Multiple attractors and dynamics in an OLG model with productive environment

    NASA Astrophysics Data System (ADS)

    Caravaggio, Andrea; Sodini, Mauro

    2018-05-01

    This work analyses an overlapping generations model in which economic activity depends on the exploitation of a free-access natural resource. In addition, public expenditures for environmental maintenance are assumed. By characterising some properties of the map and performing numerical simulations, we investigate the consequences of the interplay between environmental public expenditure and the private sector. In particular, we identify different scenarios in which multiple equilibria as well as complex dynamics may arise.

  18. Autonomous detection of crowd anomalies in multiple-camera surveillance feeds

    NASA Astrophysics Data System (ADS)

    Nordlöf, Jonas; Andersson, Maria

    2016-10-01

    A novel approach for autonomous detection of anomalies in crowded environments is presented in this paper. The proposed model uses a Gaussian mixture probability hypothesis density (GM-PHD) filter as a feature extractor in conjunction with different Gaussian mixture hidden Markov models (GM-HMMs). Results, based on both simulated and recorded data, indicate that this method can track and detect anomalies on-line in individual crowds through multiple camera feeds in a crowded environment.

  19. deltaGseg: macrostate estimation via molecular dynamics simulations and multiscale time series analysis.

    PubMed

    Low, Diana H P; Motakis, Efthymios

    2013-10-01

    Binding free energy calculations obtained through molecular dynamics simulations reflect intermolecular interaction states through a series of independent snapshots. Typically, the free energies of multiple simulated series (each with slightly different starting conditions) need to be estimated. Previous approaches carry out this task by moving averages at certain decorrelation times, assuming that the system comes from a single conformation description of binding events. Here, we discuss a more general approach that uses statistical modeling, wavelets denoising and hierarchical clustering to estimate the significance of multiple statistically distinct subpopulations, reflecting potential macrostates of the system. We present the deltaGseg R package that performs macrostate estimation from multiple replicated series and allows molecular biologists/chemists to gain physical insight into the molecular details that are not easily accessible by experimental techniques. deltaGseg is a Bioconductor R package available at http://bioconductor.org/packages/release/bioc/html/deltaGseg.html.

  20. An image-based reaction field method for electrostatic interactions in molecular dynamics simulations of aqueous solutions

    NASA Astrophysics Data System (ADS)

    Lin, Yuchun; Baumketner, Andrij; Deng, Shaozhong; Xu, Zhenli; Jacobs, Donald; Cai, Wei

    2009-10-01

    In this paper, a new solvation model is proposed for simulations of biomolecules in aqueous solutions that combines the strengths of explicit and implicit solvent representations. Solute molecules are placed in a spherical cavity filled with explicit water, thus providing microscopic detail where it is most needed. Solvent outside of the cavity is modeled as a dielectric continuum whose effect on the solute is treated through the reaction field corrections. With this explicit/implicit model, the electrostatic potential represents a solute molecule in an infinite bath of solvent, thus avoiding unphysical interactions between periodic images of the solute commonly used in the lattice-sum explicit solvent simulations. For improved computational efficiency, our model employs an accurate and efficient multiple-image charge method to compute reaction fields together with the fast multipole method for the direct Coulomb interactions. To minimize the surface effects, periodic boundary conditions are employed for nonelectrostatic interactions. The proposed model is applied to study liquid water. The effect of model parameters, which include the size of the cavity, the number of image charges used to compute reaction field, and the thickness of the buffer layer, is investigated in comparison with the particle-mesh Ewald simulations as a reference. An optimal set of parameters is obtained that allows for a faithful representation of many structural, dielectric, and dynamic properties of the simulated water, while maintaining manageable computational cost. With controlled and adjustable accuracy of the multiple-image charge representation of the reaction field, it is concluded that the employed model achieves convergence with only one image charge in the case of pure water. Future applications to pKa calculations, conformational sampling of solvated biomolecules and electrolyte solutions are briefly discussed.

  1. Can Earth System Model Provide Reasonable Natural Runoff Estimates to Support Water Management Studies?

    NASA Astrophysics Data System (ADS)

    Kao, S. C.; Shi, X.; Kumar, J.; Ricciuto, D. M.; Mao, J.; Thornton, P. E.

    2017-12-01

    With the concern of a changing hydrologic regime, there is a crucial need to better understand how water availability may change and influence water management decisions under projected future climate conditions. Although surface hydrology has long been simulated by the land model within the Earth System modeling (ESM) framework, given the coarse horizontal resolution and lack of engineering-level calibration, raw runoff from ESMs is generally discarded by water resource managers when conducting hydro-climate impact assessments. To identify a likely path to improve the credibility of ESM-simulated natural runoff, we conducted a regional model simulation using the land component (ALM) of the Accelerated Climate Modeling for Energy (ACME) version 1, focusing on the conterminous United States (CONUS). Two very different forcing data sets, including (1) the conventional 0.5° CRUNCEP (v5, 1901-2013) and (2) the 1-km Daymet (v3, 1980-2013) aggregated to 0.5°, were used to conduct 20th-century transient simulations with satellite phenology. Additional meteorological and hydrologic observations, including PRISM precipitation and U.S. Geological Survey WaterWatch runoff, were used for model evaluation. For various CONUS hydrologic regions (such as the Pacific Northwest), we found that Daymet can significantly improve the reasonableness of simulated ALM runoff even without intensive calibration. The large dry bias of CRUNCEP precipitation (evaluated against PRISM) in multiple CONUS hydrologic regions is believed to be the main reason for runoff underestimation. The results suggest that when driven with skillful precipitation estimates, ESM has the ability to produce reasonable natural runoff estimates to support further water management studies. Nevertheless, model calibration will be required for regions (such as the Upper Colorado) where poor performance is shown for multiple different forcings.

  2. Micromechanics of failure waves in glass. 2: Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Espinosa, H.D.; Xu, Y.; Brar, N.S.

    1997-08-01

    In an attempt to elucidate the failure mechanism responsible for the so-called failure waves in glass, numerical simulations of plate and rod impact experiments, with a multiple-plane model, have been performed. These simulations show that the failure wave phenomenon can be modeled by the nucleation and growth of penny-shaped shear defects from the specimen surface to its interior. Lateral stress increase, reduction of spall strength, and progressive attenuation of axial stress behind the failure front are properly predicted by the multiple-plane model. Numerical simulations of high-strain-rate pressure-shear experiments indicate that the model predicts reasonably well the shear resistance of the material at strain rates as high as 1 × 10⁶/s. The agreement is believed to be the result of the model's capability in simulating damage-induced anisotropy. By examining the kinetics of the failure process in plate experiments, the authors show that the progressive glass spallation in the vicinity of the failure front and the rate of increase in lateral stress are more consistent with a representation of inelasticity based on shear-activated flow surfaces, inhomogeneous flow, and microcracking, rather than pure microcracking. In the former mechanism, microcracks are likely formed at a later time at the intersection of flow surfaces. In the case of rod-on-rod impact, stress and radial velocity histories predicted by the microcracking model are in agreement with the experimental measurements. Stress attenuation, pulse duration, and release structure are properly simulated. It is shown that failure wave speeds in excess of 3,600 m/s are required for adequate prediction of rod radial expansion.

  3. Continuum mesoscopic framework for multiple interacting species and processes on multiple site types and/or crystallographic planes.

    PubMed

    Chatterjee, Abhijit; Vlachos, Dionisios G

    2007-07-21

    While recently derived continuum mesoscopic equations successfully bridge the gap between microscopic and macroscopic physics, so far they have been derived only for simple lattice models. In this paper, general deterministic continuum mesoscopic equations are derived rigorously via nonequilibrium statistical mechanics to account for multiple interacting surface species and multiple processes on multiple site types and/or different crystallographic planes. Adsorption, desorption, reaction, and surface diffusion are modeled. It is demonstrated that contrary to conventional phenomenological continuum models, microscopic physics, such as the interaction potential, determines the final form of the mesoscopic equation. Models of single component diffusion and binary diffusion of interacting particles on single-type site lattice and of single component diffusion on complex microporous materials' lattices consisting of two types of sites are derived, as illustrations of the mesoscopic framework. Simplification of the diffusion mesoscopic model illustrates the relation to phenomenological models, such as the Fickian and Maxwell-Stefan transport models. It is demonstrated that the mesoscopic equations are in good agreement with lattice kinetic Monte Carlo simulations for several prototype examples studied.

  4. Coexistence of multiple bifurcation modes in memristive diode-bridge-based canonical Chua's circuit

    NASA Astrophysics Data System (ADS)

    Bao, Bocheng; Xu, Li; Wu, Zhimin; Chen, Mo; Wu, Huagan

    2018-07-01

    Based on a memristive diode bridge cascaded with a series resistor and inductor filter, a modified memristive canonical Chua's circuit is presented in this paper. With the modelling of the memristive circuit, a normalised system model is built. Stability analyses of the equilibrium points are performed and bifurcation behaviours are investigated by numerical simulations and hardware experiments. Most extraordinarily, within a parameter region the memristive circuit exhibits the coexistence of multiple bifurcation modes under six sets of different initial values, resulting in the coexistence of four sets of topologically different and disconnected attractors. These coexisting attractors are easily captured by repeatedly switching the circuit power supplies on and off, which well verifies the numerical simulations.

  5. Effects of cumulus entrainment and multiple cloud types on a January global climate model simulation

    NASA Technical Reports Server (NTRS)

    Yao, Mao-Sung; Del Genio, Anthony D.

    1989-01-01

    An improved version of the GISS Model II cumulus parameterization designed for long-term climate integrations is used to study the effects of entrainment and multiple cloud types on the January climate simulation. Instead of prescribing convective mass as a fixed fraction of the cloud base grid-box mass, it is calculated based on the closure assumption that the cumulus convection restores the atmosphere to a neutral moist convective state at cloud base. This change alone significantly improves the distribution of precipitation, convective mass exchanges, and frequencies in the January climate. The vertical structure of the tropical atmosphere exhibits quasi-equilibrium behavior when this closure is used, even though there is no explicit constraint applied above cloud base.

  6. Multisite EPR oximetry from multiple quadrature harmonics.

    PubMed

    Ahmad, R; Som, S; Johnson, D H; Zweier, J L; Kuppusamy, P; Potter, L C

    2012-01-01

    Multisite continuous wave (CW) electron paramagnetic resonance (EPR) oximetry using multiple quadrature field modulation harmonics is presented. First, a recently developed digital receiver is used to extract multiple harmonics of field modulated projection data. Second, a forward model is presented that relates the projection data to unknown parameters, including linewidth at each site. Third, a maximum likelihood estimator of unknown parameters is reported using an iterative algorithm capable of jointly processing multiple quadrature harmonics. The data modeling and processing are applicable for parametric lineshapes under nonsaturating conditions. Joint processing of multiple harmonics leads to 2-3-fold acceleration of EPR data acquisition. For demonstration in two spatial dimensions, both simulations and phantom studies on an L-band system are reported. Copyright © 2011 Elsevier Inc. All rights reserved.

  7. Synthetic observations of protostellar multiple systems

    NASA Astrophysics Data System (ADS)

    Lomax, O.; Whitworth, A. P.

    2018-04-01

    Observations of protostars are often compared with synthetic observations of models in order to infer the underlying physical properties of the protostars. The majority of these models have a single protostar, attended by a disc and an envelope. However, observational and numerical evidence suggests that a large fraction of protostars form as multiple systems. This means that fitting models of single protostars to observations may be inappropriate. We produce synthetic observations of protostellar multiple systems undergoing realistic, non-continuous accretion. These systems consist of multiple protostars with episodic luminosities, embedded self-consistently in discs and envelopes. We model the gas dynamics of these systems using smoothed particle hydrodynamics and we generate synthetic observations by post-processing the snapshots using the SPAMCART Monte Carlo radiative transfer code. We present simulation results of three model protostellar multiple systems. For each of these, we generate 4 × 10⁴ synthetic spectra at different points in time and from different viewing angles. We propose a Bayesian method, using similar calculations to those presented here, but in greater numbers, to infer the physical properties of protostellar multiple systems from observations.

  8. A simulation study to quantify the impacts of exposure ...

    EPA Pesticide Factsheets

    A simulation study to quantify the impacts of exposure measurement error on air pollution health risk estimates in copollutant time-series models. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  9. Attrition Bias Related to Missing Outcome Data: A Longitudinal Simulation Study.

    PubMed

    Lewin, Antoine; Brondeel, Ruben; Benmarhnia, Tarik; Thomas, Frédérique; Chaix, Basile

    2018-01-01

    Most longitudinal studies do not address potential selection biases due to selective attrition. Using empirical data and simulating additional attrition, we investigated the effectiveness of common approaches to handle missing outcome data from attrition in the association between individual education level and change in body mass index (BMI). Using data from the two waves of the French RECORD Cohort Study (N = 7,172), we first examined how inverse probability weighting (IPW) and multiple imputation handled missing outcome data from attrition in the observed data (stage 1). Second, simulating additional missing data in BMI at follow-up under various missing-at-random scenarios, we quantified the impact of attrition and assessed how multiple imputation performed compared to complete case analysis and to a perfectly specified IPW model as a gold standard (stage 2). With the observed data in stage 1, we found an inverse association between individual education and change in BMI, with complete case analysis, as well as with IPW and multiple imputation. When we simulated additional attrition under a missing-at-random pattern (stage 2), the bias increased with the magnitude of selective attrition, and multiple imputation failed to address it. Our simulations revealed that selective attrition in the outcome heavily biased the association of interest. The present article contributes to raising awareness that for missing outcome data, multiple imputation does not do better than complete case analysis. More effort is thus needed during the design phase to understand attrition mechanisms by collecting information on the reasons for dropout.
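
    The "no better than complete case analysis" finding has a simple deterministic core: outcome values imputed from the same observed covariates fall exactly on the complete-case regression line, so they cannot move the estimate. A toy sketch (proper multiple imputation adds random draws and variance pooling, but brings no new outcome information; all numbers below are illustrative, not RECORD data):

      import numpy as np

      rng = np.random.default_rng(0)

      n = 10_000
      edu = rng.integers(0, 3, n).astype(float)              # education level 0,1,2
      bmi_change = 0.5 - 0.2 * edu + rng.standard_normal(n)  # true slope: -0.2

      # MAR: probability of dropout depends only on the observed covariate.
      p_miss = 0.2 + 0.1 * edu
      observed = rng.random(n) > p_miss

      # Complete-case fit (consistent here, since missingness depends only on edu).
      fit = np.polyfit(edu[observed], bmi_change[observed], 1)
      slope_cc = fit[0]

      # Deterministic imputation from the same covariate: imputed outcomes sit
      # exactly on the complete-case line, so the refitted slope is unchanged.
      y_imp = bmi_change.copy()
      y_imp[~observed] = np.polyval(fit, edu[~observed])
      slope_imp = np.polyfit(edu, y_imp, 1)[0]

      print(f"complete case: {slope_cc:.3f}, "
            f"after imputation: {slope_imp:.3f} (truth -0.200)")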

  10. Trans-Pacific transport and evolution of aerosols: Evaluation of quasi-global WRF-Chem simulation with multiple observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Zhiyuan; Zhao, Chun; Huang, Jianping

    2016-05-10

    A fully coupled meteorology-chemistry model (WRF-Chem, the Weather Research and Forecasting model coupled with chemistry) has been configured to conduct quasi-global simulation for 5 years (2010–2014) and evaluated with multiple observation data sets for the first time. The evaluation focuses on the simulation over the trans-Pacific transport region using various reanalysis and observational data sets for meteorological fields and aerosol properties. The simulation generally captures the overall spatial and seasonal variability of satellite-retrieved aerosol optical depth (AOD) and absorbing AOD (AAOD) over the Pacific that is determined by the outflow of pollutants and dust and the emissions of marine aerosols. The assessment of simulated extinction Ångström exponent (EAE) indicates that the model generally reproduces the variability of aerosol size distributions as seen by satellites. In addition, the vertical profile of aerosol extinction and its seasonality over the Pacific are also well simulated. The difference between the simulation and satellite retrievals can be mainly attributed to model biases in estimating marine aerosol emissions as well as the satellite sampling and retrieval uncertainties. Compared with the surface measurements over the western USA, the model reasonably simulates the observed magnitude and seasonality of dust, sulfate, and nitrate surface concentrations, but significantly underestimates the peak surface concentrations of carbonaceous aerosol, likely due to model biases in the spatial and temporal variability of biomass burning emissions and secondary organic aerosol (SOA) production. A sensitivity simulation shows that the trans-Pacific transported dust, sulfate, and nitrate can make significant contributions to surface concentrations over the rural areas of the western USA, while the peaks of carbonaceous aerosol surface concentrations are dominated by the North American emissions. Both the retrievals and simulation show small interannual variability of aerosol characteristics for 2010–2014 averaged over three Pacific sub-regions. Furthermore, the evaluation in this study demonstrates that the WRF-Chem quasi-global simulation can be used for investigating trans-Pacific transport of aerosols and providing reasonable inflow chemical boundaries for the western USA, allowing one to further understand the impact of transported pollutants on the regional air quality and climate with high-resolution nested regional modeling.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitney, S.E.

    This paper highlights the use of the CAPE-OPEN (CO) standard interfaces in the Advanced Process Engineering Co-Simulator (APECS) developed at the National Energy Technology Laboratory (NETL). The APECS system uses the CO unit operation, thermodynamic, and reaction interfaces to provide its plug-and-play co-simulation capabilities, including the integration of process simulation with computational fluid dynamics (CFD) simulation. APECS also relies heavily on the use of a CO COM/CORBA bridge for running process/CFD co-simulations on multiple operating systems. For process optimization in the face of multiple and sometimes conflicting objectives, APECS offers stochastic modeling and multi-objective optimization capabilities developed to comply with the CO software standard. At NETL, system analysts are applying APECS to a wide variety of advanced power generation systems, ranging from small fuel cell systems to commercial-scale power plants including the coal-fired, gasification-based FutureGen power and hydrogen production plant.

  13. Using multiple group modeling to test moderators in meta-analysis.

    PubMed

    Schoemann, Alexander M

    2016-12-01

    Meta-analysis is a popular and flexible analysis that can be fit in many modeling frameworks. Two methods of fitting meta-analyses that are growing in popularity are structural equation modeling (SEM) and multilevel modeling (MLM). By using SEM or MLM to fit a meta-analysis, researchers have access to powerful techniques associated with SEM and MLM. This paper details how to use one such technique, multiple group analysis, to test categorical moderators in meta-analysis. In a multiple group meta-analysis, a model is fit to each level of the moderator simultaneously. By constraining parameters across groups, any model parameter can be tested for equality. Using multiple groups to test for moderators is especially relevant in random-effects meta-analysis, where both the mean and the between-studies variance of the effect size may be compared across groups. A simulation study and the analysis of a real data set are used to illustrate multiple group modeling with both SEM and MLM. Issues related to multiple group meta-analysis and future directions for research are discussed. Copyright © 2016 John Wiley & Sons, Ltd.
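
    As a rough illustration of the idea, a categorical moderator can be tested by pooling effects within each group and comparing the pooled means; the sketch below uses a simple DerSimonian-Laird random-effects estimator with invented study data, standing in for the SEM/MLM machinery described in the paper.

```python
# Sketch of a two-group random-effects meta-analysis comparison, assuming
# DerSimonian-Laird estimation; study-level data below are invented.
import numpy as np

def dersimonian_laird(effects, variances):
    """Pooled effect and its variance under a random-effects model."""
    w = 1.0 / variances
    ybar = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - ybar) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # between-studies variance
    w_star = 1.0 / (variances + tau2)
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    return pooled, 1.0 / np.sum(w_star)

# hypothetical study-level effect sizes and sampling variances per moderator level
g1 = dersimonian_laird(np.array([0.30, 0.25, 0.40, 0.10]),
                       np.array([0.02, 0.03, 0.05, 0.04]))
g2 = dersimonian_laird(np.array([0.05, 0.00, 0.12]),
                       np.array([0.02, 0.04, 0.03]))

z = (g1[0] - g2[0]) / np.sqrt(g1[1] + g2[1])   # Wald test of equal group means
print(f"group means: {g1[0]:.3f} vs {g2[0]:.3f}, z = {z:.2f}")
```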

  14. An improved null model for assessing the net effects of multiple stressors on communities.

    PubMed

    Thompson, Patrick L; MacLennan, Megan M; Vinebrooke, Rolf D

    2018-01-01

    Ecological stressors (i.e., environmental factors outside their normal range of variation) can mediate each other through their interactions, leading to unexpected combined effects on communities. Determining whether the net effect of stressors is ecologically surprising requires comparing their cumulative impact to a null model that represents the linear combination of their individual effects (i.e., an additive expectation). However, we show that standard additive and multiplicative null models that base their predictions on the effects of single stressors on community properties (e.g., species richness or biomass) do not provide this linear expectation, leading to incorrect interpretations of antagonistic and synergistic responses by communities. We present an alternative, the compositional null model, which instead bases its predictions on the effects of stressors on individual species, and then aggregates them to the community level. Simulations demonstrate the improved ability of the compositional null model to accurately provide a linear expectation of the net effect of stressors. We simulate the response of communities to paired stressors that affect species in a purely additive fashion and compare the relative abilities of the compositional null model and two standard community property null models (additive and multiplicative) to predict these linear changes in species richness and community biomass across different combinations (both positive, both negative, or opposite) and intensities of stressors. The compositional model predicts the linear effects of multiple stressors under almost all scenarios, allowing for proper classification of net effects, whereas the standard null models do not. Our findings suggest that current estimates of the prevalence of ecological surprises on communities based on community property null models are unreliable, and should be improved by integrating the responses of individual species to the community level, as does our compositional null model. © 2017 John Wiley & Sons Ltd.
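
    The distinction between the two null models can be made concrete with a toy community: the compositional null adds stressor effects species by species before aggregating, while the community-property null adds effects on the aggregate metric directly. The numbers below are invented for illustration.

```python
# Minimal sketch contrasting a community-level additive null with the
# compositional null described above; abundances are invented.
import numpy as np

control = np.array([10.0, 8.0, 6.0, 1.0])   # species abundances, no stressor
only_a  = np.array([14.0, 4.0, 6.0, 0.0])   # stressor A alone
only_b  = np.array([ 6.0, 12.0, 6.0, 0.0])  # stressor B alone

def richness(x):
    return int(np.sum(x > 0))

# compositional null: add the per-species effects, then aggregate
per_species = control + (only_a - control) + (only_b - control)
per_species = np.clip(per_species, 0.0, None)      # abundances cannot go negative

# community-property null: add effects on the aggregate metric directly
richness_null_community = richness(control) \
    + (richness(only_a) - richness(control)) \
    + (richness(only_b) - richness(control))

print("compositional null richness :", richness(per_species))
print("community-property richness :", richness_null_community)
print("compositional null biomass  :", per_species.sum())
```

    Even in this additive toy example the two expectations disagree (3 species versus 2), which is the paper's point: the community-property null can misclassify a purely additive outcome as antagonistic or synergistic.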

  15. Power of one nonclean qubit

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki; Fujii, Keisuke; Nishimura, Harumichi

    2017-04-01

    The one-clean-qubit model (or the DQC1 model) is a restricted model of quantum computing where only a single qubit of the initial state is pure and the others are maximally mixed. Although the model is not universal, it can efficiently solve several problems whose classical efficient solutions are not known. Furthermore, it was recently shown that if the one-clean-qubit model is classically efficiently simulated, the polynomial hierarchy collapses to the second level. A disadvantage of the one-clean-qubit model is, however, that the clean qubit is too clean: for example, in realistic NMR experiments, polarizations are not high enough to have a perfectly pure qubit. In this paper, we consider a more realistic one-clean-qubit model, where the clean qubit is not clean but depolarized. We first show that, for any polarization, a multiplicative-error calculation of the output probability distribution of the model is possible in classical polynomial time if we take an appropriately large multiplicative error. This result is in strong contrast with that of the ideal one-clean-qubit model, where a classical efficient multiplicative-error calculation (or even sampling) with the same amount of error causes the collapse of the polynomial hierarchy. We next show that, for any polarization lower-bounded by an inverse polynomial, a classical efficient sampling (in terms of a sufficiently small multiplicative error or an exponentially small additive error) of the output probability distribution of the model is impossible unless BQP (bounded-error quantum polynomial time) is contained in the second level of the polynomial hierarchy, which suggests the hardness of classical efficient simulation of the one-nonclean-qubit model.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Richard Edward; Cetiner, Sacit M.; Fugate, David L.

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the third year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled) concepts, including the use of multiple coupled reactors at a single site. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor SMR models, ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface (ICHMI) technical area, and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the Modular Dynamic SIMulation (MoDSIM) tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the program, (2) developing a library of baseline component modules that can be assembled into full plant models using existing geometry and thermal-hydraulic data, (3) defining modeling conventions for interconnecting component models, and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  17. GiPSi: a framework for open source/open architecture software development for organ-level surgical simulation.

    PubMed

    Cavuşoğlu, M Cenk; Göktekin, Tolga G; Tendick, Frank

    2006-04-01

    This paper presents the architectural details of an evolving open source/open architecture software framework for developing organ-level surgical simulations. Our goal is to facilitate shared development of reusable models, to accommodate heterogeneous models of computation, and to provide a framework for interfacing multiple heterogeneous models. The framework provides an application programming interface for interfacing dynamic models defined over spatial domains. It is specifically designed to be independent of the specifics of the modeling methods used, and therefore facilitates seamless integration of heterogeneous models and processes. Furthermore, each model has separate geometries for visualization, simulation, and interfacing, allowing the model developer to choose the most natural geometric representation for each case. Input/output interfaces for visualization and haptics for real-time interactive applications have also been provided.

  18. Documentation for the MODFLOW 6 framework

    USGS Publications Warehouse

    Hughes, Joseph D.; Langevin, Christian D.; Banta, Edward R.

    2017-08-10

    MODFLOW is a popular open-source groundwater flow model distributed by the U.S. Geological Survey. Growing interest in surface and groundwater interactions, local refinement with nested and unstructured grids, karst groundwater flow, solute transport, and saltwater intrusion has led to the development of numerous MODFLOW versions. Oftentimes, there are incompatibilities between these different MODFLOW versions. This report describes a new MODFLOW framework called MODFLOW 6 that is designed to support multiple models and multiple types of models. The framework is written in Fortran using a modular object-oriented design. The primary framework components include the simulation (or main program), Timing Module, Solutions, Models, Exchanges, and Utilities. The first version of the framework focuses on numerical solutions, numerical models, and numerical exchanges. This focus on numerical models allows multiple numerical models to be tightly coupled at the matrix level.

  19. Testing Group Mean Differences of Latent Variables in Multilevel Data Using Multiple-Group Multilevel CFA and Multilevel MIMIC Modeling.

    PubMed

    Kim, Eun Sook; Cao, Chunhua

    2015-01-01

    Considering that group comparisons are common in social science, we examined two latent group mean testing methods for when groups of interest are at either the between or the within level of multilevel data: multiple-group multilevel confirmatory factor analysis (MG ML CFA) and multilevel multiple-indicators multiple-causes modeling (ML MIMIC). The performance of these methods was investigated through three Monte Carlo studies. In Studies 1 and 2, either factor variances or residual variances were manipulated to be heterogeneous between groups. In Study 3, which focused on within-level multiple-group analysis, six different model specifications were considered depending on how the intra-class group correlation (i.e., the correlation between random-effect factors for groups within a cluster) was modeled. The results of the simulations generally supported the adequacy of MG ML CFA and ML MIMIC for multiple-group analysis with multilevel data. The two methods did not show any notable difference in latent group mean testing across the three studies. Finally, a demonstration with real data and guidelines for selecting an appropriate approach to multilevel multiple-group analysis are provided.

  20. Performance of uncertainty quantification methodologies and linear solvers in cardiovascular simulations

    NASA Astrophysics Data System (ADS)

    Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison

    2017-11-01

    Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from the National Institutes of Health (R01 EB018302) is greatly appreciated.
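
    The preconditioning comparison can be illustrated on a small sparse system: an incomplete-LU preconditioner typically cuts GMRES iteration counts substantially. The sketch below uses SciPy on a stand-in convection-diffusion matrix, not the authors' Navier-Stokes solver.

```python
# Hedged sketch: unpreconditioned vs ILU-preconditioned GMRES on a sparse
# system, standing in for the repeated linear solves inside a UQ loop.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 2000
# simple 1-D convection-diffusion stencil as a stand-in for the flow matrix
A = sp.diags([-1.0, 2.1, -1.05], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

iters = {"plain": 0, "ilu": 0}
def counter(key):
    def cb(_):
        iters[key] += 1
    return cb

x1, info1 = spla.gmres(A, b, callback=counter("plain"), callback_type="pr_norm")

ilu = spla.spilu(A, drop_tol=1e-4)           # incomplete LU factorization
M = spla.LinearOperator((n, n), ilu.solve)   # preconditioner as a linear operator
x2, info2 = spla.gmres(A, b, M=M, callback=counter("ilu"), callback_type="pr_norm")

print("converged flags (0 = success):", info1, info2)
print("GMRES iterations without / with ILU:", iters["plain"], "/", iters["ilu"])
```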

  1. Composite load spectra for select space propulsion structural components

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Kurth, R. E.; Ho, H.

    1991-01-01

    The objective of this program is to develop generic load models, with multiple levels of progressive sophistication, to simulate the composite (combined) load spectra that are induced in space propulsion system components representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades, liquid oxygen posts, and system ducting. The first approach will consist of using state-of-the-art probabilistic methods to describe the individual loading conditions and combinations of these loading conditions to synthesize the composite load spectra simulation. The second approach will consist of developing coupled models for composite load spectra simulation which combine the deterministic models for dynamic, acoustic, high-pressure, and high-rotational-speed (etc.) load simulation using statistically varying coefficients. These coefficients will then be determined using advanced probabilistic simulation methods with and without strategically selected experimental data.

  2. Wafer hotspot prevention using etch aware OPC correction

    NASA Astrophysics Data System (ADS)

    Hamouda, Ayman; Power, Dave; Salama, Mohamed; Chen, Ao

    2016-03-01

    As technology development advances into deep-sub-wavelength nodes, multiple patterning is becoming more essential to achieve the technology shrink requirements. Recently, Optical Proximity Correction (OPC) technology has introduced simultaneous correction of multiple mask patterns to enable multiple-patterning awareness during OPC correction. This is essential to prevent inter-layer hot-spots during the final pattern transfer. In the state-of-the-art literature, multi-layer awareness is achieved using simultaneous resist-contour simulations to predict and correct for hot-spots during mask generation. However, this approach assumes a uniform etch shrink response for all patterns independent of their proximity, which is not sufficient to fully prevent inter-exposure hot-spots, for example, different-color space violations post-etch or via coverage/enclosure failures post-etch. In this paper, we explain the need to include the etch component during multiple-patterning OPC. We also introduce a novel approach for etch-aware simultaneous multiple-patterning OPC, where we calibrate and verify a lumped model that includes the combined resist and etch responses. Adding this extra simulation condition during OPC is suitable for full-chip processing from a computational-intensity point of view. Also, using this model during OPC to predict and correct inter-exposure hot-spots is similar to previously proposed multiple-patterning OPC, yet our proposed approach more accurately corrects post-etch defects as well.

  3. Modeling Smoke Plume-Rise and Dispersion from Southern United States Prescribed Burns with Daysmoke.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Achtemeier, Gary, L.; Goodrick, Scott, A.; Liu, Yongqiang

    2011-08-19

    We present Daysmoke, an empirical-statistical plume rise and dispersion model for simulating smoke from prescribed burns. Prescribed fires are characterized by complex plume structure, including multiple-core updrafts, which makes modeling with simple plume models difficult. Daysmoke accounts for plume structure in a three-dimensional veering/sheering atmospheric environment, multiple-core updrafts, and detrainment of particulate matter. The number of empirical coefficients appearing in the model theory is reduced through a sensitivity analysis with the Fourier Amplitude Sensitivity Test (FAST). Daysmoke simulations for 'bent-over' plumes compare closely with Briggs theory, although the two-thirds law is not explicit in Daysmoke. However, the solutions for the 'highly-tilted' plume characterized by weak buoyancy, low initial vertical velocity, and large initial plume diameter depart considerably from Briggs theory. Results from a study of weak plumes from prescribed burns at Fort Benning, GA, showed simulated ground-level PM2.5 comparing favorably with observations taken within the first eight kilometers of eleven prescribed burns. Daysmoke placed plume tops near the lower end of the range of observed plume tops for six prescribed burns. Daysmoke provides the levels and amounts of smoke injected into regional-scale air quality models. Results from CMAQ with and without an adaptive grid are presented.

  4. A multiple-point geostatistical method for characterizing uncertainty of subsurface alluvial units and its effects on flow and transport

    USGS Publications Warehouse

    Cronkite-Ratcliff, C.; Phelps, G.A.; Boucher, A.

    2012-01-01

    This report provides a proof-of-concept to demonstrate the potential application of multiple-point geostatistics for characterizing geologic heterogeneity and its effect on flow and transport simulation. The study presented in this report is the result of collaboration between the U.S. Geological Survey (USGS) and Stanford University. This collaboration focused on improving the characterization of alluvial deposits by incorporating prior knowledge of geologic structure and estimating the uncertainty of the modeled geologic units. In this study, geologic heterogeneity of alluvial units is characterized as a set of stochastic realizations, and uncertainty is indicated by variability in the results of flow and transport simulations for this set of realizations. This approach is tested on a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. Yucca Flat was chosen as a data source for this test case because it includes both complex geologic and hydrologic characteristics and also contains a substantial amount of both surface and subsurface geologic data. Multiple-point geostatistics is used to model geologic heterogeneity in the subsurface. A three-dimensional (3D) model of spatial variability is developed by integrating alluvial units mapped at the surface with vertical drill-hole data. The SNESIM (Single Normal Equation Simulation) algorithm is used to represent geologic heterogeneity stochastically by generating 20 realizations, each of which represents an equally probable geologic scenario. A 3D numerical model is used to simulate groundwater flow and contaminant transport for each realization, producing a distribution of flow and transport responses to the geologic heterogeneity. From this distribution of flow and transport responses, the frequency of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary.
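
    The uncertainty summary step reduces to counting, across equally probable realizations, how often each cell exceeds the concentration threshold. A minimal sketch follows, with stand-in concentration fields in place of the actual flow and transport output.

```python
# Sketch of the uncertainty summary: across N equally probable geologic
# realizations, estimate the frequency of exceeding a concentration threshold.
import numpy as np

rng = np.random.default_rng(7)
n_real, nx, ny = 20, 50, 50

# stand-in transport results: one simulated concentration field per realization
conc = rng.lognormal(mean=-2.0, sigma=1.0, size=(n_real, nx, ny))

threshold = 0.5
exceed_freq = (conc > threshold).mean(axis=0)   # fraction of realizations exceeding

# cells where the plume boundary is uncertain (neither always in nor always out)
uncertain = (exceed_freq > 0.0) & (exceed_freq < 1.0)
print("cells with uncertain plume membership:", int(uncertain.sum()))
```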

  5. Filters for Improvement of Multiscale Data from Atomistic Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, David J.; Reynolds, Daniel R.

    2017-01-05

    Multiscale computational models strive to produce accurate and efficient numerical simulations of systems involving interactions across multiple spatial and temporal scales that typically differ by several orders of magnitude. Some such models utilize a hybrid continuum-atomistic approach combining continuum approximations with first-principles-based atomistic models to capture multiscale behavior. By following the heterogeneous multiscale method framework for developing multiscale computational models, unknown continuum scale data can be computed from an atomistic model. Concurrently coupling the two models requires performing numerous atomistic simulations which can dominate the computational cost of the method. Furthermore, when the resulting continuum data is noisy due to sampling error, stochasticity in the model, or randomness in the initial conditions, filtering can result in significant accuracy gains in the computed multiscale data without increasing the size or duration of the atomistic simulations. In this work, we demonstrate the effectiveness of spectral filtering for increasing the accuracy of noisy multiscale data obtained from atomistic simulations. Moreover, we present a robust and automatic method for closely approximating the optimum level of filtering in the case of additive white noise. Improving the accuracy of the simulation data through filtering leads to dramatic computational savings by allowing shorter and smaller atomistic simulations to achieve the same desired multiscale simulation precision.
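
    A minimal sketch of the filtering idea, assuming additive white noise: estimate the flat noise floor from the high-frequency tail of the spectrum and zero out modes that do not rise clearly above it. The cutoff rule and signal below are invented for illustration, not the paper's automatic method.

```python
# Illustrative sketch of spectral filtering of noisy multiscale data.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 1024, endpoint=False)
signal = np.sin(2 * np.pi * 4 * t) + 0.5 * np.sin(2 * np.pi * 9 * t)
noisy = signal + rng.normal(0.0, 0.4, t.size)     # additive white noise

spec = np.fft.rfft(noisy)
power = np.abs(spec) ** 2

# estimate the flat noise floor from the top quarter of frequencies
floor = power[3 * power.size // 4:].mean()
filtered_spec = np.where(power > 4.0 * floor, spec, 0.0)  # keep resolved modes only
filtered = np.fft.irfft(filtered_spec, n=t.size)

print("RMS error, noisy    :", np.sqrt(np.mean((noisy - signal) ** 2)))
print("RMS error, filtered :", np.sqrt(np.mean((filtered - signal) ** 2)))
```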

  7. Interpreting Space-Based Trends in Carbon Monoxide with Multiple Models

    NASA Technical Reports Server (NTRS)

    Strode, Sarah A.; Worden, Helen M.; Damon, Megan; Douglass, Anne R.; Duncan, Bryan N.; Emmons, Louisa K.; Lamarque, Jean-Francois; Manyin, Michael; Oman, Luke D.; Rodriguez, Jose M.

    2016-01-01

    We use a series of chemical transport model and chemistry climate model simulations to investigate the observed negative trends in MOPITT CO over several regions of the world, and to examine the consistency of time-dependent emission inventories with observations. We find that simulations driven by the MACCity inventory, used for the Chemistry Climate Modeling Initiative (CCMI), reproduce the negative trends in the CO column observed by MOPITT for 2000-2010 over the eastern United States and Europe. However, the simulations have positive trends over eastern China, in contrast to the negative trends observed by MOPITT. The model bias in CO, after applying MOPITT averaging kernels, contributes to the model-observation discrepancy in the trend over eastern China. This demonstrates that biases in a model's average concentrations can influence the interpretation of the temporal trend compared to satellite observations. The total ozone column plays a role in determining the simulated tropospheric CO trends. A large positive anomaly in the simulated total ozone column in 2010 leads to a negative anomaly in OH and hence a positive anomaly in CO, contributing to the positive trend in simulated CO. These results demonstrate that accurately simulating variability in the ozone column is important for simulating and interpreting trends in CO.

  8. A Social Diffusion Model with an Application on Election Simulation

    PubMed Central

    Wang, Fu-Min; Hung, San-Chuan; Kung, Perng-Hwa; Lin, Shou-De

    2014-01-01

    Issues of opinion diffusion have been studied for decades. However, there has so far been no empirical approach to modeling the interflow and formation of crowd opinion in elections, for two reasons. First, unlike the spread of information or flu, individuals have intrinsic attitudes toward election candidates in advance. Second, opinions are generally assumed to be single values in most diffusion models; in this case, however, an opinion should represent a preference over multiple candidates. Previous models thus may not intuitively capture such a scenario. This work designs a diffusion model capable of managing the aforementioned scenario. To demonstrate the usefulness of our model, we simulate the diffusion on a network built from a publicly available bibliography dataset. We compare the proposed model with other well-known models such as independent cascade. It turns out that our model consistently outperforms the other models. We additionally investigate electoral issues with our model simulator. PMID:24995351

  9. Driver fatigue detection through multiple entropy fusion analysis in an EEG-based system.

    PubMed

    Min, Jianliang; Wang, Ping; Hu, Jianfeng

    2017-01-01

    Driver fatigue is an important contributor to road accidents, and fatigue detection has major implications for transportation safety. The aim of this research is to analyze the multiple entropy fusion method and evaluate several channel regions to effectively detect a driver's fatigue state based on electroencephalogram (EEG) records. First, we fused multiple entropies, i.e., spectral entropy, approximate entropy, sample entropy and fuzzy entropy, as features compared with autoregressive (AR) modeling by four classifiers. Second, we captured four significant channel regions according to weight-based electrodes via a simplified channel selection method. Finally, the evaluation model for detecting driver fatigue was established with four classifiers based on the EEG data from four channel regions. Twelve healthy subjects performed continuous simulated driving for 1-2 hours with EEG monitoring on a static simulator. The leave-one-out cross-validation approach obtained an accuracy of 98.3%, a sensitivity of 98.3% and a specificity of 98.2%. The experimental results verified the effectiveness of the proposed method, indicating that the multiple entropy fusion features are significant factors for inferring the fatigue state of a driver.
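
    Two of the fused features can be sketched directly: spectral entropy of the power spectrum and sample entropy of the raw trace (approximate and fuzzy entropy follow the same template). The code below uses a stand-in random epoch rather than real EEG, and the parameter choices are conventional defaults, not the paper's.

```python
# Sketch of two entropy features that could be fused for fatigue detection.
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def spectral_entropy(x):
    """Shannon entropy of the normalised power spectrum."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def sample_entropy(x, m=2, r_frac=0.2):
    """-log of the conditional probability that templates similar at length m
    (Chebyshev distance, tolerance r) remain similar at length m + 1."""
    r = r_frac * np.std(x)
    def n_matches(mm):
        t = sliding_window_view(x, mm)
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=-1)
        return (np.sum(d <= r) - len(t)) // 2       # exclude self-matches
    return -np.log(n_matches(m + 1) / n_matches(m))

rng = np.random.default_rng(5)
fs, seconds = 128, 8
eeg = rng.normal(0, 1, fs * seconds)    # stand-in single-channel EEG epoch

features = [spectral_entropy(eeg), sample_entropy(eeg)]
print("fused entropy feature vector:", np.round(features, 3))
```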

  10. Markov chains for testing redundant software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1988-01-01

    A preliminary design for a validation experiment has been developed that addresses several problems unique to assuring the extremely high quality of multiple-version programs in process-control software. The procedure uses Markov chains to model the error states of the multiple version programs. The programs are observed during simulated process-control testing, and estimates are obtained for the transition probabilities between the states of the Markov chain. The experimental Markov chain model is then expanded into a reliability model that takes into account the inertia of the system being controlled. The reliability of the multiple version software is computed from this reliability model at a given confidence level using confidence intervals obtained for the transition probabilities during the experiment. An example demonstrating the method is provided.
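
    The estimation step can be sketched as follows: transition counts observed during testing are row-normalised into a transition matrix, and powers of that matrix give the probability of reaching a failure state within a given number of steps. The states and counts below are hypothetical.

```python
# Sketch of Markov-chain estimation for multiple-version software testing.
import numpy as np

states = ["all_correct", "one_version_wrong", "majority_wrong"]
# hypothetical transition counts observed during testing (rows: from-state)
counts = np.array([[9800, 195, 5],
                   [180, 15, 5],
                   [0, 0, 10]], dtype=float)   # last state treated as absorbing

P = counts / counts.sum(axis=1, keepdims=True)  # row-normalised MLE of transitions

# probability of hitting the absorbing failure state within k steps, from state 0
p0 = np.array([1.0, 0.0, 0.0])
for k in (10, 100, 1000):
    pk = p0 @ np.linalg.matrix_power(P, k)
    print(f"P(failure within {k} steps) = {pk[2]:.2e}")
```

    A full reliability estimate, as in the abstract, would replace these point estimates with confidence intervals on each transition probability and propagate them through the same computation.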

  11. Parallelization of a spatial random field characterization process using the Method of Anchored Distributions and the HTCondor high throughput computing system

    NASA Astrophysics Data System (ADS)

    Osorio-Murillo, C. A.; Over, M. W.; Frystacky, H.; Ames, D. P.; Rubin, Y.

    2013-12-01

    A new software application called MAD# has been coupled with the HTCondor high-throughput computing system to aid scientists and educators with the characterization of spatial random fields and to enable understanding of the spatial distribution of parameters used in hydrogeologic and related modeling. MAD# is an open-source desktop software application used to characterize spatial random fields using direct and indirect information through a Bayesian inverse modeling technique called the Method of Anchored Distributions (MAD). MAD relates indirect information to a target spatial random field via a forward simulation model. MAD# executes the inverse process by running the forward model multiple times to transfer information from the indirect data to the target variable. MAD# uses two parallelization profiles according to the computational resources available: one computer with multiple cores, or multiple computers with multiple cores through HTCondor. HTCondor is a system that manages a cluster of desktop computers to submit serial or parallel jobs using scheduling policies, resource monitoring, and a job queuing mechanism. This poster will show how MAD# reduces the execution time of random field characterization using these two parallel approaches in different case studies. A test of the approach was conducted using a 1D problem with 400 cells to characterize the saturated conductivity, residual water content, and shape parameters of the Mualem-van Genuchten model in four materials via the HYDRUS model. The number of simulations evaluated in the inversion was 10 million. Using the one-computer approach (eight cores), 100,000 simulations were evaluated in 12 hours (approximately 1,200 hours for 10 million). In the evaluation on HTCondor, 32 desktop computers (132 cores) were used, with a non-continuous processing time of 60 hours over five days. HTCondor reduced the processing time for uncertainty characterization by a factor of 20 (from 1,200 hours to 60 hours).
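
    The single-machine parallelization profile amounts to farming independent forward-model runs over local cores; HTCondor plays the same role across many desktop machines. A minimal sketch with a placeholder forward model (the real one would be a HYDRUS run):

```python
# Sketch of the multi-core profile: independent forward-model evaluations
# distributed over a local process pool.
import multiprocessing as mp
import numpy as np

def forward_model(sample):
    """Hypothetical stand-in for a HYDRUS-style forward run: map a parameter
    sample to a simulated observable."""
    k_sat, alpha = sample
    return k_sat * np.exp(-alpha)        # placeholder physics

if __name__ == "__main__":
    rng = np.random.default_rng(11)
    samples = rng.uniform([0.1, 0.5], [10.0, 3.0], size=(100_000, 2))
    with mp.Pool(processes=8) as pool:   # one worker per core
        predictions = pool.map(forward_model, samples, chunksize=1000)
    print("evaluated", len(predictions), "forward-model runs")
```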

  12. Use of soft data for multi-criteria calibration and validation of APEX: Impact on model simulations

    USDA-ARS?s Scientific Manuscript database

    It is widely known that the use of soft data and multiple model performance criteria in model calibration and validation is critical to ensuring that the model captures major hydrologic and water quality processes. The Agricultural Policy/Environmental eXtender (APEX) is a hydrologic and water quality mod...

  13. Modeling Smoke Plume-Rise and Dispersion from Southern United States Prescribed Burns with Daysmoke

    Treesearch

    G L Achtemeier; S L Goodrick; Y Liu; F Garcia-Menendez; Y Hu; M. Odman

    2011-01-01

    We present Daysmoke, an empirical-statistical plume rise and dispersion model for simulating smoke from prescribed burns. Prescribed fires are characterized by complex plume structure including multiple-core updrafts which makes modeling with simple plume models difficult. Daysmoke accounts for plume structure in a three-dimensional veering/sheering atmospheric...

  14. Site-Specific Phosphorylation of VEGFR2 Is Mediated by Receptor Trafficking: Insights from a Computational Model

    PubMed Central

    Clegg, Lindsay Wendel; Mac Gabhann, Feilim

    2015-01-01

    Matrix-binding isoforms and non-matrix-binding isoforms of vascular endothelial growth factor (VEGF) are both capable of stimulating vascular remodeling, but the resulting blood vessel networks are structurally and functionally different. Here, we develop and validate a computational model of the binding of soluble and immobilized ligands to VEGF receptor 2 (VEGFR2), the endosomal trafficking of VEGFR2, and site-specific VEGFR2 tyrosine phosphorylation to study differences in induced signaling between these VEGF isoforms. In capturing essential features of VEGFR2 signaling and trafficking, our model suggests that VEGFR2 trafficking parameters are largely consistent across multiple endothelial cell lines. Simulations demonstrate distinct localization of VEGFR2 phosphorylated on Y1175 and Y1214. This is the first model to clearly show that differences in site-specific VEGFR2 activation when stimulated with immobilized VEGF compared to soluble VEGF can be accounted for by altered trafficking of VEGFR2 without an intrinsic difference in receptor activation. The model predicts that Neuropilin-1 can induce differences in the surface-to-internal distribution of VEGFR2. Simulations also show that ligated VEGFR2 and phosphorylated VEGFR2 levels diverge over time following stimulation. Using this model, we identify multiple key levers that alter how VEGF binding to VEGFR2 results in different coordinated patterns of multiple downstream signaling pathways. Specifically, simulations predict that VEGF immobilization, interactions with Neuropilin-1, perturbations of VEGFR2 trafficking, and changes in expression or activity of phosphatases acting on VEGFR2 all affect the magnitude, duration, and relative strength of VEGFR2 phosphorylation on tyrosines 1175 and 1214, and they do so predictably within our single consistent model framework. PMID:26067165

  15. A 1D thermomechanical network transition constitutive model coupled with multiple structural relaxation for shape memory polymers

    NASA Astrophysics Data System (ADS)

    Zeng, Hao; Xie, Zhimin; Gu, Jianping; Sun, Huiyu

    2018-03-01

    A new thermomechanical network transition constitutive model is proposed in this study to describe the viscoelastic behavior of shape memory polymers (SMPs). Based on the microstructure of semi-crystalline SMPs, a new simplified transformation equation is proposed to describe the transformation of transient networks. The generalized fractional Maxwell model is introduced to estimate the temperature-dependent storage modulus. In addition, a neo-KAHR theory with multiple discrete relaxation processes is put forward to study the structural relaxation of the nonlinear thermal strain in cooling/heating processes. The evolution equations of the time- and temperature-dependent stress and strain response are developed. In the model, the thermodynamical and mechanical characteristics of SMPs in the typical thermomechanical cycle are described clearly and the irreversible deformation is studied in detail. Finally, typical thermomechanical cycles are simulated using the present constitutive model, and the simulation results agree well with the experimental results.

  16. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Genser, Krzysztof; Hatcher, Robert; Perdue, Gabriel

    2016-11-10

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.

  17. Constraining Stochastic Parametrisation Schemes Using High-Resolution Model Simulations

    NASA Astrophysics Data System (ADS)

    Christensen, H. M.; Dawson, A.; Palmer, T.

    2017-12-01

    Stochastic parametrisations are used in weather and climate models as a physically motivated way to represent model error due to unresolved processes. Designing new stochastic schemes has been the target of much innovative research over the last decade. While a focus has been on developing physically motivated approaches, many successful stochastic parametrisation schemes are very simple, such as the European Centre for Medium-Range Weather Forecasts (ECMWF) multiplicative scheme 'Stochastically Perturbed Parametrisation Tendencies' (SPPT). The SPPT scheme improves the skill of probabilistic weather and seasonal forecasts, and so is widely used. However, little work has focused on assessing the physical basis of the SPPT scheme. We address this matter by using high-resolution model simulations to explicitly measure the 'error' in the parametrised tendency that SPPT seeks to represent. The high-resolution simulations are first coarse-grained to the desired forecast model resolution before they are used to produce initial conditions and forcing data needed to drive the ECMWF Single Column Model (SCM). By comparing SCM forecast tendencies with the evolution of the high-resolution model, we can measure the 'error' in the forecast tendencies. In this way, we provide justification for the multiplicative nature of SPPT, and for the temporal and spatial scales of the stochastic perturbations. However, we also identify issues with the SPPT scheme. It is therefore hoped these measurements will improve both holistic and process-based approaches to stochastic parametrisation. Figure caption: Instantaneous snapshot of the optimal SPPT stochastic perturbation, derived by comparing high-resolution simulations with a low-resolution forecast model.
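
    The multiplicative structure of SPPT can be written in one line: the net parametrised tendency is scaled by (1 + e), with e drawn from a smoothly evolving random pattern. The sketch below uses a scalar AR(1) process with invented parameters and omits the spatial correlation of the operational scheme.

```python
# Minimal sketch of an SPPT-style multiplicative perturbation.
import numpy as np

rng = np.random.default_rng(2)
n_steps, tau, sigma = 200, 6.0, 0.5        # decorrelation time (steps), std dev

phi = np.exp(-1.0 / tau)                   # AR(1) coefficient
e = np.zeros(n_steps)
for k in range(1, n_steps):
    e[k] = phi * e[k - 1] + sigma * np.sqrt(1 - phi**2) * rng.normal()

tendency = -0.02 * np.ones(n_steps)        # deterministic parametrised tendency
perturbed = (1.0 + np.clip(e, -0.99, 0.99)) * tendency   # SPPT-style scaling

print("mean scaling factor:", (1 + e).mean().round(3))
```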

  18. Multiagent intelligent systems

    NASA Astrophysics Data System (ADS)

    Krause, Lee S.; Dean, Christopher; Lehman, Lynn A.

    2003-09-01

    This paper will discuss a simulation approach based upon a family of agent-based models. As applications such as Effects Based Operations (EBO), evaluation of indicators and warnings surrounding homeland defense, and commercial demands such as financial risk management place growing demands upon simulation technology, current single-thread simulations will continue to show serious deficiencies. The types of "what if" analysis required to support these applications demand rapidly reconfigurable approaches capable of aggregating large models that incorporate multiple viewpoints. The use of agent technology promises to provide a broad spectrum of models incorporating differing viewpoints through a synthesis of a collection of models. Each model would provide estimates for the overall scenario based upon its particular measure or aspect. An agent framework, denoted the "family," would provide a common ontology in support of differing aspects of the scenario. This approach permits the future of modeling to change from viewing the problem as a single-thread simulation to taking into account multiple viewpoints from different models. Even as models are updated or replaced, the agent approach permits rapid inclusion in new or modified simulations. In this approach, the synthesis of a variety of low- and high-resolution information requires a family of models. Each agent "publishes" its support for a given measure, and each model provides its own estimates of the scenario based upon its particular measure or aspect. If more than one agent provides the same measure (e.g., cognitive), then the results from these agents are combined to form an aggregate measure response. The objective would be to inform and help calibrate a qualitative model, rather than merely to present highly aggregated statistical information. As each result is processed, the next action can then be determined. This is done by a top-level decision system that communicates with the family at the ontology level without any specific understanding of the processes (or models) behind each agent. The increasingly complex demands upon simulation, and the necessity to incorporate the breadth and depth of influencing factors, make a family of agent-based models a promising solution. This paper will discuss that solution, with the syntax and semantics necessary to support the approach.

  19. Brainlab: A Python Toolkit to Aid in the Design, Simulation, and Analysis of Spiking Neural Networks with the NeoCortical Simulator.

    PubMed

    Drewes, Rich; Zou, Quan; Goodman, Philip H

    2009-01-01

    Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading "glue" tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS.

  1. Composite Load Spectra for Select Space Propulsion Structural Components

    NASA Technical Reports Server (NTRS)

    Ho, Hing W.; Newell, James F.

    1994-01-01

    Generic load models are described with multiple levels of progressive sophistication to simulate the composite (combined) load spectra (CLS) that are induced in space propulsion system components, representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades and liquid oxygen (LOX) posts. These generic (coupled) models combine the deterministic models for composite load dynamic, acoustic, high-pressure and high rotational speed, etc., load simulation using statistically varying coefficients. These coefficients are then determined using advanced probabilistic simulation methods with and without strategically selected experimental data. The entire simulation process is included in a CLS computer code. Applications of the computer code to various components in conjunction with the PSAM (Probabilistic Structural Analysis Method) to perform probabilistic load evaluation and life prediction evaluations are also described to illustrate the effectiveness of the coupled model approach.

  2. Lattice Boltzmann simulations of immiscible displacement process with large viscosity ratios

    NASA Astrophysics Data System (ADS)

    Rao, Parthib; Schaefer, Laura

    2017-11-01

    Immiscible displacement is a key physical mechanism involved in enhanced oil recovery and carbon sequestration processes. This multiphase flow phenomenon involves a complex interplay of viscous, capillary, inertial and wettability effects. The lattice Boltzmann (LB) method is an accurate and efficient technique for modeling and simulating multiphase/multicomponent flows, especially in complex flow configurations and media. In this presentation, we present numerical simulation results of the displacement process in thin, long channels. The results are based on a new pseudo-potential multicomponent LB model with a multiple-relaxation-time (MRT) collision model and an explicit forcing scheme. We demonstrate that the proposed model is capable of accurately simulating displacement processes involving fluids with a wider range of viscosity ratios (>100), and that it also leads to viscosity-independent interfacial tension and a reduction of some important numerical artifacts.

  3. Experimental identification of a comb-shaped chaotic region in multiple parameter spaces simulated by the Hindmarsh-Rose neuron model

    NASA Astrophysics Data System (ADS)

    Jia, Bing

    2014-03-01

    A comb-shaped chaotic region has been simulated in multiple two-dimensional parameter spaces using the Hindmarsh-Rose (HR) neuron model in many recent studies, which can interpret almost all of the previously simulated bifurcation processes with chaos in neural firing patterns. In the present paper, a comb-shaped chaotic region in a two-dimensional parameter space was reproduced, which presented different processes of period-adding bifurcations with chaos when changing one parameter while fixing the other at different levels. In the biological experiments, different period-adding bifurcation scenarios with chaos induced by decreasing the extra-cellular calcium concentration were observed from some neural pacemakers at different levels of extra-cellular 4-aminopyridine concentration, and from other pacemakers at different levels of extra-cellular caesium concentration. By using the nonlinear time series analysis method, the deterministic dynamics of the experimental chaotic firings were investigated. The period-adding bifurcations with chaos observed in the experiments resembled those simulated in the comb-shaped chaotic region using the HR model. The experimental results show that period-adding bifurcations with chaos are preserved in different two-dimensional parameter spaces, which provides evidence of the existence of the comb-shaped chaotic region and a demonstration of the simulation results in different two-dimensional parameter spaces in the HR neuron model. The results also present relationships between different firing patterns in two-dimensional parameter spaces.
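
    The period-adding behaviour can be explored numerically by integrating the standard Hindmarsh-Rose equations and sweeping the external current while holding the other parameters fixed; the parameter values below are the commonly used ones, and the spike-detection threshold is an arbitrary choice for illustration.

```python
# Sketch: integrate the Hindmarsh-Rose model and sweep the external current I,
# summarising interspike intervals as a crude probe of bursting/chaotic regimes.
import numpy as np
from scipy.integrate import solve_ivp

def hindmarsh_rose(t, u, I, a=1.0, b=3.0, c=1.0, d=5.0, r=0.006, s=4.0, x_r=-1.6):
    x, y, z = u
    return [y - a * x**3 + b * x**2 - z + I,
            c - d * x**2 - y,
            r * (s * (x - x_r) - z)]

for I in (1.5, 2.0, 2.5, 3.0, 3.25):
    sol = solve_ivp(hindmarsh_rose, (0.0, 2000.0), [-1.6, 0.0, 0.0],
                    args=(I,), max_step=0.1)
    keep = sol.t > 500.0                      # discard the initial transient
    x, t = sol.y[0][keep], sol.t[keep]
    spikes = t[1:][(x[:-1] < 1.0) & (x[1:] >= 1.0)]   # upward crossings of x = 1
    isi = np.diff(spikes)
    detail = f"ISI std = {isi.std():.3f}" if isi.size > 1 else "too few spikes"
    print(f"I = {I:.2f}: {spikes.size} spikes, {detail}")
```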

  4. Thread scheduling for GPU-based OPC simulation on multi-thread

    NASA Astrophysics Data System (ADS)

    Lee, Heejun; Kim, Sangwook; Hong, Jisuk; Lee, Sooryong; Han, Hwansoo

    2018-03-01

    As semiconductor product development based on shrinkage continues, the accuracy and difficulty required for model-based optical proximity correction (MBOPC) are increasing. OPC simulation time, which is the most time-consuming part of MBOPC, is rapidly increasing due to high pattern density in a layout and complex OPC models. To reduce OPC simulation time, we attempt to apply graphics processing units (GPUs) to MBOPC, because the OPC process lends itself to parallel programming. We address some issues that typically arise during GPU-based OPC simulation in a multi-threaded system, such as "out of memory" errors and "GPU idle time". To overcome these problems, we propose a thread scheduling method, which manages OPC jobs in multiple threads in such a way that simulation jobs from multiple threads are executed alternately on the GPU while correction jobs are executed at the same time on the CPU cores. It was observed that the amount of GPU peak memory usage decreases by up to 35%, and MBOPC runtime also decreases by 4%. In cases where out-of-memory issues occur in a multi-threaded environment, the thread scheduler improved MBOPC runtime by up to 23%.

  5. Modeling and Simulation of Lab-on-a-Chip Systems

    DTIC Science & Technology

    2005-08-12

    The report models flows in complex chip geometries (including multiple turns) and derives variations of sample concentration profiles in laminar diffusion-based micromixers, including complex electrokinetic passive micromixers and multi-stream (inter-digital) micromixers.

  6. Multiple-point statistical simulation for hydrogeological models: 3-D training image development and conditioning strategies

    NASA Astrophysics Data System (ADS)

    Høyer, Anne-Sophie; Vignoli, Giulio; Mejer Hansen, Thomas; Thanh Vu, Le; Keefer, Donald A.; Jørgensen, Flemming

    2017-12-01

    Most studies on the application of geostatistical simulations based on multiple-point statistics (MPS) to hydrogeological modelling focus on relatively fine-scale models and concentrate on the estimation of facies-level structural uncertainty. Much less attention is paid to the use of input data and optimal construction of training images. For instance, even though the training image should capture a set of spatial geological characteristics to guide the simulations, the majority of the research still relies on 2-D or quasi-3-D training images. In the present study, we demonstrate a novel strategy for 3-D MPS modelling characterized by (i) realistic 3-D training images and (ii) an effective workflow for incorporating a diverse group of geological and geophysical data sets. The study covers an area of 2810 km2 in the southern part of Denmark. MPS simulations are performed on a subset of the geological succession (the lower to middle Miocene sediments) which is characterized by relatively uniform structures and dominated by sand and clay. The simulated domain is large and each of the geostatistical realizations contains approximately 45 million voxels with size 100 m × 100 m × 5 m. Data used for the modelling include water well logs, high-resolution seismic data, and a previously published 3-D geological model. We apply a series of different strategies for the simulations based on data quality, and develop a novel method to effectively create observed spatial trends. The training image is constructed as a relatively small 3-D voxel model covering an area of 90 km2. We use an iterative training image development strategy and find that even slight modifications in the training image create significant changes in simulations. Thus, this study shows how to include both the geological environment and the type and quality of input information in order to achieve optimal results from MPS modelling. We present a practical workflow to build the training image and effectively handle different types of input information to perform large-scale geostatistical modelling.

  7. An integrated modelling framework for neural circuits with multiple neuromodulators.

    PubMed

    Joshi, Alok; Youssofzadeh, Vahab; Vemana, Vinith; McGinnity, T M; Prasad, Girijesh; Wong-Lin, KongFatt

    2017-01-01

    Neuromodulators are endogenous neurochemicals that regulate biophysical and biochemical processes, which control brain function and behaviour, and are often the targets of neuropharmacological drugs. Neuromodulator effects are generally complex, partly owing to the involvement of broad innervation, co-release of neuromodulators, complex intra- and extrasynaptic mechanisms, the existence of multiple receptor subtypes and high interconnectivity within the brain. In this work, we propose an efficient yet sufficiently realistic computational neural modelling framework to study some of these complex behaviours. Specifically, we propose a novel dynamical neural circuit model that integrates the effective neuromodulator-induced currents based on various experimental data (e.g. electrophysiology, neuropharmacology and voltammetry). The model can incorporate multiple interacting brain regions, including neuromodulator sources, simulates efficiently, and is easily extendable to large-scale brain models, e.g. for neuroimaging purposes. As an example, we model a network of mutually interacting neural populations in the lateral hypothalamus, dorsal raphe nucleus and locus coeruleus, which are major sources of the neuromodulators orexin/hypocretin, serotonin and norepinephrine/noradrenaline, respectively, and which play significant roles in regulating many physiological functions. We demonstrate that such a model can provide predictions of the systemic drug effects of popular antidepressants (e.g. reuptake inhibitors), neuromodulator antagonists or their combinations. Finally, we developed user-friendly graphical user interface software for model simulation and visualization for both fundamental sciences and pharmacological studies. © 2017 The Authors.

  8. An integrated modelling framework for neural circuits with multiple neuromodulators

    PubMed Central

    Vemana, Vinith

    2017-01-01

    Neuromodulators are endogenous neurochemicals that regulate biophysical and biochemical processes, which control brain function and behaviour, and are often the targets of neuropharmacological drugs. Neuromodulator effects are generally complex, partly owing to the involvement of broad innervation, co-release of neuromodulators, complex intra- and extrasynaptic mechanisms, the existence of multiple receptor subtypes and high interconnectivity within the brain. In this work, we propose an efficient yet sufficiently realistic computational neural modelling framework to study some of these complex behaviours. Specifically, we propose a novel dynamical neural circuit model that integrates the effective neuromodulator-induced currents based on various experimental data (e.g. electrophysiology, neuropharmacology and voltammetry). The model can incorporate multiple interacting brain regions, including neuromodulator sources, simulates efficiently, and is easily extendable to large-scale brain models, e.g. for neuroimaging purposes. As an example, we model a network of mutually interacting neural populations in the lateral hypothalamus, dorsal raphe nucleus and locus coeruleus, which are major sources of the neuromodulators orexin/hypocretin, serotonin and norepinephrine/noradrenaline, respectively, and which play significant roles in regulating many physiological functions. We demonstrate that such a model can provide predictions of the systemic drug effects of popular antidepressants (e.g. reuptake inhibitors), neuromodulator antagonists or their combinations. Finally, we developed user-friendly graphical user interface software for model simulation and visualization for both fundamental sciences and pharmacological studies. PMID:28100828
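
    A minimal sketch of the kind of dynamical circuit model these two records describe, with three mutually interacting firing-rate populations standing in for LH, DRN, and LC; the weights, gains, and the reuptake-inhibitor effect are invented for illustration and are not the paper's fitted parameters.

      import numpy as np

      W = np.array([[0.0, -0.5,  0.3],    # inputs to LH  from (LH, DRN, LC)
                    [0.4,  0.0, -0.2],    # inputs to DRN
                    [0.3,  0.2,  0.0]])   # inputs to LC
      I_ext = np.array([1.0, 0.8, 0.6])   # tonic external drive
      tau = 0.1                           # population time constant (s)

      def f(x):                           # threshold-linear rate function
          return np.maximum(x, 0.0)

      def run(reuptake_block=0.0, T=5.0, dt=1e-3):
          """reuptake_block scales up effective DRN (serotonin) output,
          crudely mimicking a reuptake-inhibitor antidepressant."""
          r = np.zeros(3)
          gain = np.array([1.0, 1.0 + reuptake_block, 1.0])
          for _ in range(int(T / dt)):    # Euler integration to steady state
              r += dt / tau * (-r + f(W @ (gain * r) + I_ext))
          return r

      print("baseline rates:", run())
      print("with SSRI-like effect:", run(reuptake_block=0.5))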

  9. Multiple transmitter performance with appropriate amplitude modulation for free-space optical communication.

    PubMed

    Tellez, Jason A; Schmidt, Jason D

    2011-08-20

    The propagation of a free-space optical communications signal through atmospheric turbulence experiences random fluctuations in intensity, including signal fades, which negatively impact the performance of the communications link. The gamma-gamma probability density function is commonly used to model the scintillation of a single beam. One proposed method to reduce the occurrence of scintillation-induced fades at the receiver plane involves the use of multiple beams propagating through independent paths, resulting in a sum of independent gamma-gamma random variables. Recently an analytical model for the probability distribution of irradiance from the sum of multiple independent beams was developed. Because truly independent beams are practically impossible to create, we present here a more general but approximate model for the distribution of beams traveling through partially correlated paths. This model compares favorably with wave-optics simulations and highlights the reduced scintillation as the number of transmitted beams is increased. Additionally, a pulse-position modulation scheme is used to reduce the impact of signal fades when they occur. Analytical and simulated results showed significantly improved performance when compared to fixed threshold on/off keying. © 2011 Optical Society of America
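
    A Monte-Carlo sketch of the core idea: each beam's irradiance is a unit-mean gamma-gamma variate (the product of two gamma variates), and averaging over more beams lowers the fade probability. This sketch assumes fully independent paths, so it does not reproduce the paper's partially correlated case; the turbulence parameters and fade threshold are illustrative.

      import numpy as np

      rng = np.random.default_rng(1)

      def gamma_gamma(alpha, beta, size):
          """Unit-mean gamma-gamma irradiance: product of two independent
          unit-mean gamma variates (large- and small-scale scintillation)."""
          return rng.gamma(alpha, 1.0 / alpha, size) * \
                 rng.gamma(beta, 1.0 / beta, size)

      alpha, beta = 4.0, 2.0          # illustrative turbulence parameters
      n_samples = 200_000
      fade_threshold = 0.3            # fade when normalized irradiance < 0.3

      for n_beams in (1, 2, 4, 8):
          irr = gamma_gamma(alpha, beta, (n_samples, n_beams)).mean(axis=1)
          print(f"{n_beams} beams: P(fade) ~ {np.mean(irr < fade_threshold):.4f}")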

  10. PCTO-SIM: Multiple-point geostatistical modeling using parallel conditional texture optimization

    NASA Astrophysics Data System (ADS)

    Pourfard, Mohammadreza; Abdollahifard, Mohammad J.; Faez, Karim; Motamedi, Sayed Ahmad; Hosseinian, Tahmineh

    2017-05-01

    Multiple-point geostatistics is a well-known general statistical framework by which complex geological phenomena can be modeled efficiently. Pixel-based and patch-based methods are its two major categories. In this paper, the optimization-based category is used, which has a dual concept in texture synthesis known as texture optimization. Our extended version of texture optimization uses an energy concept to model geological phenomena. While honoring the hard data, the minimization of our proposed cost function forces simulation grid pixels to be as similar as possible to the training images. Our algorithm has a self-enrichment capability and creates a richer training database from a sparser one by mixing the information of all patches surrounding the simulation nodes. Therefore, it preserves pattern continuity in both continuous and categorical variables very well. It also shows a fuzzy result in each of its realizations, similar to the expected result of multiple realizations of other statistical models. While the main core of most previous multiple-point geostatistics methods is sequential, the parallel main core of our algorithm enables it to use the GPU efficiently to reduce CPU time. A new validation method for MPS is also proposed in this paper.
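
    A toy sketch of the texture-optimization loop underlying this class of methods: alternately match each overlapping patch of the simulation grid to its nearest training-image patch (the energy minimization) and rebuild the grid as the average of the overlapping proposals, re-imposing hard data each pass. The sizes, iteration count, and hard-data handling are illustrative, and this serial version omits the paper's GPU parallelization.

      import numpy as np

      rng = np.random.default_rng(2)

      ti = rng.random((50, 50))            # toy continuous training image
      grid = rng.random((30, 30))          # simulation grid, random start
      hard = {(4, 4): 0.9}                 # hard data to honor
      P = 5                                # patch size

      # Flatten all training patches once for nearest-neighbour search.
      ti_patches = np.array([ti[i:i + P, j:j + P].ravel()
                             for i in range(ti.shape[0] - P)
                             for j in range(ti.shape[1] - P)])

      for _ in range(5):                   # EM-style optimization iterations
          acc = np.zeros_like(grid)
          cnt = np.zeros_like(grid)
          for i in range(0, grid.shape[0] - P, P // 2):
              for j in range(0, grid.shape[1] - P, P // 2):
                  v = grid[i:i + P, j:j + P].ravel()
                  # E-step: nearest training patch under squared-error energy
                  best = ti_patches[np.argmin(((ti_patches - v) ** 2).sum(axis=1))]
                  acc[i:i + P, j:j + P] += best.reshape(P, P)
                  cnt[i:i + P, j:j + P] += 1
          # M-step: each pixel becomes the average of its overlapping proposals
          grid = np.where(cnt > 0, acc / np.maximum(cnt, 1), grid)
          for (hi, hj), hv in hard.items():
              grid[hi, hj] = hv            # re-impose hard data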

  11. Video event classification and image segmentation based on noncausal multidimensional hidden Markov models.

    PubMed

    Ma, Xiang; Schonfeld, Dan; Khokhar, Ashfaq A

    2009-06-01

    In this paper, we propose a novel solution to an arbitrary noncausal, multidimensional hidden Markov model (HMM) for image and video classification. First, we show that the noncausal model can be solved by splitting it into multiple causal HMMs and simultaneously solving each causal HMM using a fully synchronous distributed computing framework, therefore referred to as distributed HMMs. Next we present an approximate solution to the multiple causal HMMs that is based on an alternating updating scheme and assumes a realistic sequential computing framework. The parameters of the distributed causal HMMs are estimated by extending the classical 1-D training and classification algorithms to multiple dimensions. The proposed extension to arbitrary causal, multidimensional HMMs allows state transitions that are dependent on all causal neighbors. We, thus, extend three fundamental algorithms to multidimensional causal systems, i.e., 1) expectation-maximization (EM), 2) general forward-backward (GFB), and 3) Viterbi algorithms. In the simulations, we choose to limit ourselves to a noncausal 2-D model whose noncausality is along a single dimension, in order to significantly reduce the computational complexity. Simulation results demonstrate the superior performance, higher accuracy rate, and applicability of the proposed noncausal HMM framework to image and video classification.
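
    For reference, the classical 1-D Viterbi decoder that the paper generalizes to multidimensional causal HMMs; the two-state model below is illustrative.

      import numpy as np

      A = np.array([[0.7, 0.3],            # state transition probabilities
                    [0.4, 0.6]])
      B = np.array([[0.9, 0.1],            # emission probabilities
                    [0.2, 0.8]])
      pi = np.array([0.5, 0.5])            # initial state distribution
      obs = [0, 1, 1, 0, 1]                # observed symbol sequence

      logA, logB, logpi = np.log(A), np.log(B), np.log(pi)
      delta = logpi + logB[:, obs[0]]      # best log-score ending in each state
      back = []
      for o in obs[1:]:
          scores = delta[:, None] + logA   # scores[i, j] = delta_i + log A[i, j]
          back.append(scores.argmax(axis=0))
          delta = scores.max(axis=0) + logB[:, o]

      state = int(delta.argmax())          # backtrack the MAP state path
      path = [state]
      for bp in reversed(back):
          state = int(bp[state])
          path.append(state)
      print("MAP state path:", path[::-1])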

  12. Measurement and Structural Model Class Separation in Mixture CFA: ML/EM versus MCMC

    ERIC Educational Resources Information Center

    Depaoli, Sarah

    2012-01-01

    Parameter recovery was assessed within mixture confirmatory factor analysis across multiple estimator conditions under different simulated levels of mixture class separation. Mixture class separation was defined in the measurement model (through factor loadings) and the structural model (through factor variances). Maximum likelihood (ML) via the…

  13. The Performance of IRT Model Selection Methods with Mixed-Format Tests

    ERIC Educational Resources Information Center

    Whittaker, Tiffany A.; Chang, Wanchen; Dodd, Barbara G.

    2012-01-01

    When tests consist of multiple-choice and constructed-response items, researchers are confronted with the question of which item response theory (IRT) model combination will appropriately represent the data collected from these mixed-format tests. This simulation study examined the performance of six model selection criteria, including the…

  14. HexSim: A flexible simulation model for forecasting wildlife responses to multiple interacting stressors - ESRP Meeting

    EPA Science Inventory

    With SERDP funding, we have improved upon a popular life history simulator (PATCH), and in doing so produced a powerful new forecasting tool (HexSim). PATCH, our starting point, was spatially explicit and individual-based, and was useful for evaluating a range of terrestrial life...

  15. Spatial application of WEPS for estimating wind erosion in the Pacific Northwest

    USDA-ARS?s Scientific Manuscript database

    The Wind Erosion Prediction System (WEPS) is used to simulate soil erosion on croplands and was originally designed to run field scale simulations. This research is an extension of the WEPS model to run on multiple fields (grids) covering a larger region. We modified the WEPS source code to allow it...

  16. Proton Lateral Broadening Distribution Comparisons Between GRNTRN, MCNPX, and Laboratory Beam Measurements

    NASA Technical Reports Server (NTRS)

    Mertens, Christopher J.; Moyers, Michael F.; Walker, Steven A.; Tweed, John

    2010-01-01

    Recent developments in NASA's deterministic High charge (Z) and Energy TRaNsport (HZETRN) code have included lateral broadening of primary ion beams due to small-angle multiple Coulomb scattering, and coupling of the ion-nuclear scattering interactions with energy loss and straggling. This new version of HZETRN is based on Green function methods, called GRNTRN, and is suitable for modeling transport with both space environment and laboratory boundary conditions. Multiple scattering processes are a necessary extension to GRNTRN in order to accurately model ion beam experiments, to simulate the physical and biological-effective radiation dose, and to develop new methods and strategies for light ion radiation therapy. In this paper we compare GRNTRN simulations of proton lateral broadening distributions with beam measurements taken at Loma Linda University Proton Therapy Facility. The simulated and measured lateral broadening distributions are compared for a 250 MeV proton beam on aluminum, polyethylene, polystyrene, bone substitute, iron, and lead target materials. The GRNTRN results are also compared to simulations from the Monte Carlo MCNPX code for the same projectile-target combinations described above.

  17. An assessment of the cloud signals simulated by NICAM using ISCCP, CALIPSO, and CloudSat satellite simulators

    NASA Astrophysics Data System (ADS)

    Kodama, C.; Noda, A. T.; Satoh, M.

    2012-06-01

    This study presents an assessment of the three-dimensional structures of hydrometeors simulated by NICAM, a global nonhydrostatic atmospheric model without cumulus parameterization, using multiple satellite data sets. A satellite simulator package (COSP: the CFMIP Observation Simulator Package) is employed to consistently compare model output with ISCCP, CALIPSO, and CloudSat satellite observations. Special focus is placed on high thin clouds, which are not observable in the conventional ISCCP data set but can be detected by the CALIPSO observations. For the control run, the NICAM simulation qualitatively captures the geographical distributions of high, middle, and low clouds, even though the horizontal mesh spacing is as coarse as 14 km. The simulated low cloud is very close to the CALIPSO low cloud observations. Both the CloudSat observations and the NICAM simulation show a boomerang-type pattern in the radar reflectivity-height histogram, suggesting that NICAM realistically simulates the deep cloud development process. A striking difference was found in the comparison of high thin cirrus, with the model simulation showing overestimated cloud amounts and a higher cloud top. Several model sensitivity experiments are conducted with different cloud microphysical parameters to reduce the model-observation discrepancies in high thin cirrus. In addition, relationships among clouds, Hadley circulation, outgoing longwave radiation and precipitation are discussed through the sensitivity experiments.

  18. Nonlinear modeling of wave-topography interactions, shear instabilities and shear induced wave breaking using vortex method

    NASA Astrophysics Data System (ADS)

    Guha, Anirban

    2017-11-01

    Theoretical studies on linear shear instabilities as well as different kinds of wave interactions often use simple velocity and/or density profiles (e.g. constant, piecewise) to obtain good qualitative and quantitative predictions of the initial disturbances. Moreover, such simple profiles provide a minimal model for a mechanistic understanding of shear instabilities. Here we have extended this minimal paradigm into the nonlinear domain using the vortex method. Making use of the unsteady Bernoulli equation in the presence of linear shear, and extending the Birkhoff-Rott equation to multiple interfaces, we have numerically simulated the interaction between multiple fully nonlinear waves. This methodology is quite general and has allowed us to simulate diverse problems that can be reduced to a minimal system of interacting waves, e.g. spilling and plunging breakers, stratified shear instabilities (Holmboe, Taylor-Caulfield, stratified Rayleigh), jet flows, and even wave-topography interaction problems like Bragg resonance. We found that the minimal models capture key nonlinear features (e.g. wave breaking features like cusp formation and roll-ups) which are observed in experiments and/or extensive simulations with smooth, realistic profiles.
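
    A minimal sketch of the numerical backbone described here: a vortex sheet discretized into markers and evolved with a Krasny-desingularized Birkhoff-Rott kernel. The single flat sheet, smoothing parameter, and forward-Euler stepping are simplifications of the multiple-interface scheme in the abstract.

      import numpy as np

      N, delta, dt = 128, 0.1, 0.01
      x = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
      z = x + 0.01j * np.sin(x)            # slightly perturbed flat sheet
      gamma = np.full(N, 2 * np.pi / N)    # circulation carried per marker

      def velocity(z, gamma, delta):
          """Krasny delta-smoothed Birkhoff-Rott velocity (u + i v)."""
          dz = z[:, None] - z[None, :]
          kernel = np.conj(dz) / (np.abs(dz) ** 2 + delta ** 2)
          w = (gamma[None, :] * kernel).sum(axis=1) / (2j * np.pi)  # u - i v
          return np.conj(w)

      for _ in range(200):                 # forward-Euler roll-up of the sheet
          z = z + dt * velocity(z, gamma, delta)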

  19. Multiagent Work Practice Simulation: Progress and Challenges

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Sierhuis, Maarten; Shaffe, Michael G. (Technical Monitor)

    2001-01-01

    Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and a computer system. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3D space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).

  20. Multiagent Work Practice Simulation: Progress and Challenges

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Sierhuis, Maarten

    2002-01-01

    Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and computer systems. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3D space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).

  1. Assimilating Flow Data into Complex Multiple-Point Statistical Facies Models Using Pilot Points Method

    NASA Astrophysics Data System (ADS)

    Ma, W.; Jafarpour, B.

    2017-12-01

    We develop a new pilot points method for conditioning discrete multiple-point statistical (MPS) facies simulation on dynamic flow data. While conditioning MPS simulation on static hard data is straightforward, calibration against nonlinear flow data is nontrivial. The proposed method generates conditional models from a conceptual model of geologic connectivity, known as a training image (TI), by strategically placing and estimating pilot points. To place pilot points, a score map is generated based on three sources of information: (i) the uncertainty in facies distribution, (ii) the model response sensitivity information, and (iii) the observed flow data. Once the pilot points are placed, the facies values at these points are inferred from production data and are used, along with available hard data at well locations, to simulate a new set of conditional facies realizations. While facies estimation at the pilot points can be performed using different inversion algorithms, in this study the ensemble smoother (ES) and its multiple data assimilation variant (ES-MDA) are adopted to update permeability maps from production data, which are then used to statistically infer facies types at the pilot point locations. The developed method combines the information in the flow data and the TI by using the former to infer facies values at select locations away from the wells and the latter to ensure consistent facies structure and connectivity away from measurement locations. Several numerical experiments are used to evaluate the performance of the developed method and to discuss its important properties.
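
    A sketch of how such a score map could be assembled and used for pilot-point placement; the three input fields, their equal weighting, and the exclusion radius are assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(3)
      shape = (50, 50)

      facies_prob = rng.random(shape)                # P(sand) per cell
      uncertainty = facies_prob * (1 - facies_prob)  # largest where P ~ 0.5
      sensitivity = rng.random(shape)                # |d(response)/d(cell)|
      data_mismatch = rng.random(shape)              # local flow-data misfit

      def normalize(a):
          return (a - a.min()) / (a.max() - a.min() + 1e-12)

      score = (normalize(uncertainty) + normalize(sensitivity)
               + normalize(data_mismatch)) / 3.0

      def place_pilot_points(score, n_points=10, radius=6):
          """Greedy placement at score maxima with an exclusion radius
          so that pilot points are not clustered."""
          s = score.copy()
          points = []
          ii, jj = np.ogrid[:s.shape[0], :s.shape[1]]
          for _ in range(n_points):
              i, j = np.unravel_index(np.argmax(s), s.shape)
              points.append((int(i), int(j)))
              s[(ii - i) ** 2 + (jj - j) ** 2 <= radius ** 2] = -np.inf
          return points

      print(place_pilot_points(score))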

  2. The Importance of Explicitly Representing Soil Carbon with Depth over the Permafrost Region in Earth System Models: Implications for Atmospheric Carbon Dynamics at Multiple Temporal Scales between 1960 and 2300.

    NASA Astrophysics Data System (ADS)

    McGuire, A. D.

    2014-12-01

    We conducted an assessment of changes in permafrost area and carbon storage simulated by process-based models between 1960 and 2300. The models participating in this comparison were those that had joined the model integration team of the Vulnerability of Permafrost Carbon Research Coordination Network (see http://www.biology.ufl.edu/permafrostcarbon/). Each of the models in this comparison conducted simulations over the permafrost land region in the Northern Hemisphere driven by CCSM4-simulated climate for RCP 4.5 and 8.5 scenarios. Among the models, the area of permafrost (defined as the area for which active layer thickness was less than 3 m) ranged between 13.2 and 20.0 million km2. Between 1960 and 2300, models indicated losses of permafrost area of 5.1 to 6.0 million km2 for RCP 4.5 and 7.1 to 15.2 million km2 for RCP 8.5. Among the models, the density of soil carbon storage in 1960 ranged between 13 and 42 thousand g C m-2; models that explicitly represented carbon with depth had estimates greater than 27 thousand g C m-2. For the RCP 4.5 scenario, changes in soil carbon between 1960 and 2300 ranged from losses of 32 Pg C to gains of 58 Pg C, in which models that explicitly represent soil carbon with depth simulated losses or lower gains of soil carbon in comparison with those that did not. For the RCP 8.5 scenario, changes in soil carbon between 1960 and 2300 ranged from losses of 642 Pg C to gains of 66 Pg C, in which those models that represent soil carbon explicitly with depth all simulated losses, while those that do not all simulated gains. These results indicate that there are substantial differences in the responses of carbon dynamics between models that do and do not explicitly represent soil carbon with depth in the permafrost region. We present analyses of the implications of these differences for atmospheric carbon dynamics at multiple temporal scales between 1960 and 2300.

  3. Accounting for aquifer heterogeneity from geological data to management tools.

    PubMed

    Blouin, Martin; Martel, Richard; Gloaguen, Erwan

    2013-01-01

    A nested workflow of multiple-point geostatistics (MPG) and sequential Gaussian simulation (SGS) was tested on a study area of 6 km² located about 20 km northwest of Quebec City, Canada. In order to assess its geological and hydrogeological parameter heterogeneity and to provide tools to evaluate uncertainties in aquifer management, direct and indirect field measurements are used as inputs in the geostatistical simulations to reproduce large- and small-scale heterogeneities. To do so, the lithological information is first associated to equivalent hydrogeological facies (hydrofacies) according to hydraulic properties measured at several wells. Then, heterogeneous hydrofacies (HF) realizations are generated using a prior geological model as training image (TI) with the MPG algorithm. The hydraulic conductivity (K) heterogeneity modeling within each HF is finally computed using the SGS algorithm. Different K models are integrated in a finite-element hydrogeological model to calculate multiple transport simulations. Different scenarios exhibit variations in mass transport path and dispersion associated with the large- and small-scale heterogeneity, respectively. Three-dimensional maps showing the probability of overpassing different thresholds are presented as examples of management tools. © 2012, The Author(s). Groundwater © 2012, National Ground Water Association.

  4. Study of cosmic ray events with high muon multiplicity using the ALICE detector at the CERN Large Hadron Collider

    DOE PAGES

    Adam, J.

    2016-01-19

    ALICE is one of four large experiments at the CERN Large Hadron Collider near Geneva, specially designed to study particle production in ultra-relativistic heavy-ion collisions. Located 52 meters underground with 28 meters of overburden rock, it has also been used to detect muons produced by cosmic ray interactions in the upper atmosphere. Here, we present the multiplicity distribution of these atmospheric muons and its comparison with Monte Carlo simulations. Our analysis exploits the large size and excellent tracking capability of the ALICE Time Projection Chamber. A special emphasis is given to the study of high multiplicity events containing more than 100 reconstructed muons and corresponding to a muon areal density ρ_μ > 5.9 m⁻². Similar events have been studied in previous underground experiments such as ALEPH and DELPHI at LEP. While these experiments were able to reproduce the measured muon multiplicity distribution with Monte Carlo simulations at low and intermediate multiplicities, their simulations failed to describe the frequency of the highest multiplicity events. In this work we show that the high multiplicity events observed in ALICE stem from primary cosmic rays with energies above 10^16 eV and that the frequency of these events can be successfully described by assuming a heavy mass composition of primary cosmic rays in this energy range. Furthermore, the development of the resulting air showers was simulated using the latest version of QGSJET to model hadronic interactions. This observation places significant constraints on alternative, more exotic, production mechanisms for these events.

  5. Safe motion planning for mobile agents: A model of reactive planning for multiple mobile agents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fujimura, Kikuo.

    1990-01-01

    The problem of motion planning for multiple mobile agents is studied. Each planning agent independently plans its own action based on its map which contains a limited information about the environment. In an environment where more than one mobile agent interacts, the motions of the robots are uncertain and dynamic. A model for reactive agents is described and simulation results are presented to show their behavior patterns. 18 refs., 2 figs.

  6. Integrated Predictive Tools for Customizing Microstructure and Material Properties of Additively Manufactured Aerospace Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radhakrishnan, Balasubramaniam; Fattebert, Jean-Luc; Gorti, Sarma B.

    Additive Manufacturing (AM) refers to a process by which digital three-dimensional (3-D) design data is converted to build up a component by depositing material layer-by-layer. United Technologies Corporation (UTC) is currently involved in fabrication and certification of several AM aerospace structural components made from aerospace materials. This is accomplished by using optimized process parameters determined through numerous design-of-experiments (DOE)-based studies. Certification of these components is broadly recognized as a significant challenge, with long lead times, very expensive new product development cycles and very high energy consumption. Because of these challenges, United Technologies Research Center (UTRC), together with UTC business units have been developing and validating an advanced physics-based process model. The specific goal is to develop a physics-based framework of an AM process and reliably predict fatigue properties of built-up structures as based on detailed solidification microstructures. Microstructures are predicted using process control parameters including energy source power, scan velocity, deposition pattern, and powder properties. The multi-scale multi-physics model requires solution and coupling of governing physics that will allow prediction of the thermal field and enable solution at the microstructural scale. The state-of-the-art approach to solve these problems requires a huge computational framework and this kind of resource is only available within academia and national laboratories. The project utilized the parallel phase-field codes at Oak Ridge National Laboratory (ORNL) and Lawrence Livermore National Laboratory (LLNL), along with the high-performance computing (HPC) capabilities existing at the two labs to demonstrate the simulation of multiple dendrite growth in three dimensions (3-D). The LLNL code AMPE was used to implement the UTRC phase field model that was previously developed for a model binary alloy, and the simulation results were compared against the UTRC simulation results, followed by extension of the UTRC model to simulate multiple dendrite growth in 3-D. The ORNL MEUMAPPS code was used to simulate dendritic growth in a model ternary alloy with the same equilibrium solidification range as the Ni-base alloy 718 using realistic model parameters, including thermodynamic integration with a Calphad based model for the ternary alloy. Implementation of the UTRC model in AMPE met with several numerical and parametric issues that were resolved and good comparison between the simulation results obtained by the two codes was demonstrated for two-dimensional (2-D) dendrites. 3-D dendrite growth was then demonstrated with the AMPE code using nondimensional parameters obtained in 2-D simulations. Multiple dendrite growth in 2-D and 3-D were demonstrated using ORNL’s MEUMAPPS code using simple thermal boundary conditions. MEUMAPPS was then modified to incorporate the complex, time-dependent thermal boundary conditions obtained by UTRC’s thermal modeling of single track AM experiments to drive the phase field simulations. The results were in good agreement with UTRC’s experimental measurements.

  7. Accuracy and performance of 3D mask models in optical projection lithography

    NASA Astrophysics Data System (ADS)

    Agudelo, Viviana; Evanschitzky, Peter; Erdmann, Andreas; Fühner, Tim; Shao, Feng; Limmer, Steffen; Fey, Dietmar

    2011-04-01

    Different mask models have been compared: rigorous electromagnetic field (EMF) modeling, rigorous EMF modeling with decomposition techniques and the thin mask approach (Kirchhoff approach) to simulate optical diffraction from different mask patterns in projection systems for lithography. In addition, each rigorous model was tested for two different formulations for partially coherent imaging: The Hopkins assumption and rigorous simulation of mask diffraction orders for multiple illumination angles. The aim of this work is to closely approximate results of the rigorous EMF method by the thin mask model enhanced with pupil filtering techniques. The validity of this approach for different feature sizes, shapes and illumination conditions is investigated.

  8. Computational investigation of cholesterol binding sites on mitochondrial VDAC.

    PubMed

    Weiser, Brian P; Salari, Reza; Eckenhoff, Roderic G; Brannigan, Grace

    2014-08-21

    The mitochondrial voltage-dependent anion channel (VDAC) allows passage of ions and metabolites across the mitochondrial outer membrane. Cholesterol binds mammalian VDAC, and we investigated the effects of binding to human VDAC1 with atomistic molecular dynamics simulations that totaled 1.4 μs. We docked cholesterol to specific sites on VDAC that were previously identified with NMR, and we tested the reliability of multiple docking results in each site with simulations. The most favorable binding modes were used to build a VDAC model with cholesterol occupying five unique sites, and during multiple 100 ns simulations, cholesterol stably and reproducibly remained bound to the protein. For comparison, VDAC was simulated in systems with identical components but with cholesterol initially unbound. The dynamics of loops that connect adjacent β-strands were most affected by bound cholesterol, with the averaged root-mean-square fluctuation (RMSF) of multiple residues altered by 20-30%. Cholesterol binding also stabilized charged residues inside the channel and localized the surrounding electrostatic potentials. Despite this, ion diffusion through the channel was not significantly affected by bound cholesterol, as evidenced by multi-ion potential of mean force measurements. Although we observed modest effects of cholesterol on the open channel, our model will be particularly useful in experiments that investigate how cholesterol affects VDAC function under applied electrochemical forces and also how other ligands and proteins interact with the channel.

  9. Computational Investigation of Cholesterol Binding Sites on Mitochondrial VDAC

    PubMed Central

    2015-01-01

    The mitochondrial voltage-dependent anion channel (VDAC) allows passage of ions and metabolites across the mitochondrial outer membrane. Cholesterol binds mammalian VDAC, and we investigated the effects of binding to human VDAC1 with atomistic molecular dynamics simulations that totaled 1.4 μs. We docked cholesterol to specific sites on VDAC that were previously identified with NMR, and we tested the reliability of multiple docking results in each site with simulations. The most favorable binding modes were used to build a VDAC model with cholesterol occupying five unique sites, and during multiple 100 ns simulations, cholesterol stably and reproducibly remained bound to the protein. For comparison, VDAC was simulated in systems with identical components but with cholesterol initially unbound. The dynamics of loops that connect adjacent β-strands were most affected by bound cholesterol, with the averaged root-mean-square fluctuation (RMSF) of multiple residues altered by 20–30%. Cholesterol binding also stabilized charged residues inside the channel and localized the surrounding electrostatic potentials. Despite this, ion diffusion through the channel was not significantly affected by bound cholesterol, as evidenced by multi-ion potential of mean force measurements. Although we observed modest effects of cholesterol on the open channel, our model will be particularly useful in experiments that investigate how cholesterol affects VDAC function under applied electrochemical forces and also how other ligands and proteins interact with the channel. PMID:25080204

  10. Pointing System Simulation Toolbox with Application to a Balloon Mission Simulator

    NASA Technical Reports Server (NTRS)

    Maringolo Baldraco, Rosana M.; Aretskin-Hariton, Eliot D.; Swank, Aaron J.

    2017-01-01

    The development of attitude estimation and pointing-control algorithms is necessary in order to achieve high-fidelity modeling for a Balloon Mission Simulator (BMS). A pointing system simulation toolbox was developed to enable this. The toolbox consists of a star-tracker (ST) and Inertial Measurement Unit (IMU) signal generator, a UDP (User Datagram Protocol) communication file (bridge), and an indirect-multiplicative extended Kalman filter (imEKF). This document describes the Python toolbox developed and the results of implementing the imEKF.
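
    Since the abstract names an indirect-multiplicative EKF, the following is a minimal error-state sketch of that filter type (a quaternion propagated from gyro increments, with a star-tracker quaternion forming a small-angle residual). The conventions, noise levels, and simplified covariance propagation are assumptions, not the toolbox's actual implementation.

      import numpy as np

      def quat_mul(a, b):
          """Hamilton product of quaternions [w, x, y, z]."""
          w1, x1, y1, z1 = a
          w2, x2, y2, z2 = b
          return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                           w1*x2 + x1*w2 + y1*z2 - z1*y2,
                           w1*y2 - x1*z2 + y1*w2 + z1*x2,
                           w1*z2 + x1*y2 - y1*x2 + z1*w2])

      def small_quat(dtheta):
          return np.concatenate([[1.0], 0.5 * dtheta])  # first-order error quat

      q = np.array([1.0, 0.0, 0.0, 0.0])   # attitude estimate
      P = np.eye(3) * 1e-4                 # error-state covariance (rad^2)
      Q = np.eye(3) * 1e-6                 # gyro process noise per step
      R = np.eye(3) * 1e-5                 # star-tracker measurement noise

      def predict(q, P, omega, dt):
          q = quat_mul(q, small_quat(omega * dt))   # first-order gyro update
          return q / np.linalg.norm(q), P + Q

      def update(q, P, q_meas):
          q_err = quat_mul(q_meas, q * np.array([1, -1, -1, -1]))  # meas (x) q^-1
          z = 2.0 * q_err[1:] * np.sign(q_err[0])   # small-angle residual
          K = P @ np.linalg.inv(P + R)              # H = I for this residual
          q = quat_mul(small_quat(K @ z), q)        # multiplicative correction
          return q / np.linalg.norm(q), (np.eye(3) - K) @ P

      q, P = predict(q, P, omega=np.array([0.0, 0.0, 0.01]), dt=0.1)
      q, P = update(q, P, q_meas=np.array([0.9999995, 0.0, 0.0, 0.001]))
      print("attitude estimate:", q)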

  11. Equation-based languages – A new paradigm for building energy modeling, simulation and optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael; Bonvini, Marco; Nouidui, Thierry S.

    Most of the state-of-the-art building simulation programs implement models in imperative programming languages. This complicates modeling and excludes the use of certain efficient methods for simulation and optimization. In contrast, equation-based modeling languages declare relations among variables, thereby allowing the use of computer algebra to enable much simpler schematic modeling and to generate efficient code for simulation and optimization. We contrast the two approaches in this paper. We explain how such manipulations support new use cases. In the first of two examples, we couple models of the electrical grid, multiple buildings, HVAC systems and controllers to test a controller that adjusts building room temperatures and PV inverter reactive power to maintain power quality. In the second example, we contrast the computing time for solving an optimal control problem for a room-level model predictive controller with and without symbolic manipulations. As a result, exploiting the equation-based language led to a 2,200 times faster solution.

  12. Equation-based languages – A new paradigm for building energy modeling, simulation and optimization

    DOE PAGES

    Wetter, Michael; Bonvini, Marco; Nouidui, Thierry S.

    2016-04-01

    Most of the state-of-the-art building simulation programs implement models in imperative programming languages. This complicates modeling and excludes the use of certain efficient methods for simulation and optimization. In contrast, equation-based modeling languages declare relations among variables, thereby allowing the use of computer algebra to enable much simpler schematic modeling and to generate efficient code for simulation and optimization. We contrast the two approaches in this paper. We explain how such manipulations support new use cases. In the first of two examples, we couple models of the electrical grid, multiple buildings, HVAC systems and controllers to test a controller that adjusts building room temperatures and PV inverter reactive power to maintain power quality. In the second example, we contrast the computing time for solving an optimal control problem for a room-level model predictive controller with and without symbolic manipulations. As a result, exploiting the equation-based language led to a 2,200 times faster solution.

  13. Smoking cessation treatment and outcomes patterns simulation: a new framework for evaluating the potential health and economic impact of smoking cessation interventions.

    PubMed

    Getsios, Denis; Marton, Jenő P; Revankar, Nikhil; Ward, Alexandra J; Willke, Richard J; Rublee, Dale; Ishak, K Jack; Xenakis, James G

    2013-09-01

    Most existing models of smoking cessation treatments have considered a single quit attempt when modelling long-term outcomes. Our objective was to develop a model that simulates smokers over their lifetimes, accounting for multiple quit attempts and relapses, to allow prediction of the long-term health and economic impact of smoking cessation strategies. A discrete event simulation (DES) that models individuals' life course of smoking behaviours, attempts to quit, and the cumulative impact on health and economic outcomes was developed. Each individual is assigned one of the available strategies used to support each quit attempt; the outcome of each attempt, time to relapse if abstinence is achieved, and time between quit attempts are tracked. Based on each individual's smoking or abstinence patterns, the risk of developing diseases associated with smoking (chronic obstructive pulmonary disease, lung cancer, myocardial infarction and stroke) is determined and the corresponding costs, changes to mortality, and quality of life assigned. Direct costs are assessed from the perspective of a comprehensive US healthcare payer ($US, 2012 values). Quit attempt strategies that can be evaluated in the current simulation include unassisted quit attempts, brief counselling, behavioural modification therapy, nicotine replacement therapy, bupropion, and varenicline, with the selection of strategies and time between quit attempts based on equations derived from survey data. Equations predicting the success of quit attempts as well as the short-term probability of relapse were derived from five varenicline clinical trials. Concordance between the five trials and predictions from the simulation on abstinence at 12 months was high, indicating that the equations predicting success and relapse in the first year following a quit attempt were reliable. Predictions allowing for only a single quit attempt versus unrestricted attempts demonstrate important differences, with the single quit attempt simulation predicting 19% more smoking-related diseases and 10% higher costs associated with smoking-related diseases. Differences are most prominent in predictions of the time that individuals abstain from smoking: 13.2 years on average over a lifetime allowing for multiple quit attempts, versus only 1.2 years with single quit attempts. Differences in abstinence time estimates become substantial only 5 years into the simulation. In the multiple quit attempt simulations, younger individuals survived longer, yet had lower lifetime smoking-related disease and total costs, while the opposite was true for those with high levels of nicotine dependence. By allowing for multiple quit attempts over the course of individuals' lives, the simulation can provide more reliable estimates of the health and economic impact of interventions designed to increase abstinence from smoking. Furthermore, the individual nature of the simulation allows for evaluation of outcomes in populations with different baseline profiles. DES provides a framework for comprehensive and appropriate predictions when applied to smoking cessation over smoker lifetimes.
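
    A stripped-down sketch of the quit/relapse event loop at the heart of such a simulation; every probability and time distribution below is a placeholder rather than the equations derived from the varenicline trials, and disease incidence and costs are omitted.

      import random

      random.seed(42)

      def simulate_smoker(age=30, horizon=85,
                          p_success=0.25,          # per-attempt quit success
                          mean_abstinence=4.0,     # mean years until relapse
                          mean_between_attempts=2.0):
          """Return total abstinent years over one simulated lifetime."""
          t = float(age)
          abstinent_years = 0.0
          while t < horizon:
              # Smoke until the next quit attempt.
              t += random.expovariate(1.0 / mean_between_attempts)
              if t >= horizon:
                  break
              if random.random() < p_success:
                  quit_len = min(random.expovariate(1.0 / mean_abstinence),
                                 horizon - t)
                  abstinent_years += quit_len
                  t += quit_len            # relapse after quit_len years
          return abstinent_years

      results = [simulate_smoker() for _ in range(10_000)]
      print("mean lifetime abstinent years:", sum(results) / len(results))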

  14. Evaluation of the new EMAC-SWIFT chemistry climate model

    NASA Astrophysics Data System (ADS)

    Scheffler, Janice; Langematz, Ulrike; Wohltmann, Ingo; Rex, Markus

    2016-04-01

    It is well known that the representation of atmospheric ozone chemistry in weather and climate models is essential for a realistic simulation of the atmospheric state. Including atmospheric ozone chemistry in climate simulations is usually done by prescribing a climatological ozone field, by including a fast linear ozone scheme in the model, or by using a climate model with complex interactive chemistry. While prescribed climatological ozone fields are often not aligned with the modelled dynamics, a linear ozone scheme may not be applicable for a wide range of climatological conditions. Although interactive chemistry provides a realistic representation of atmospheric chemistry, such model simulations are computationally very expensive and hence not suitable for ensemble simulations or simulations with multiple climate change scenarios. A new approach to representing atmospheric chemistry in climate models, which can cope with non-linearities in ozone chemistry and is applicable to a wide range of climatic states, is the Semi-empirical Weighted Iterative Fit Technique (SWIFT), which is driven by reanalysis data and has been validated against observational satellite data and runs of a full Chemistry and Transport Model. SWIFT has recently been implemented into the ECHAM/MESSy (EMAC) chemistry climate model, which uses a modular approach to climate modelling where individual model components can be switched on and off. Here, we show first results of EMAC-SWIFT simulations and validate these against EMAC simulations using the complex interactive chemistry scheme MECCA, and against observations.

  15. Comparing multiple turbulence restoration algorithms performance on noisy anisoplanatic imagery

    NASA Astrophysics Data System (ADS)

    Rucci, Michael A.; Hardie, Russell C.; Dapore, Alexander J.

    2017-05-01

    In this paper, we compare the performance of multiple turbulence mitigation algorithms to restore imagery degraded by atmospheric turbulence and camera noise. In order to quantify and compare algorithm performance, imaging scenes were simulated by applying noise and varying levels of turbulence. For the simulation, a Monte-Carlo wave optics approach is used to simulate the spatially and temporally varying turbulence in an image sequence. A Poisson-Gaussian noise mixture model is then used to add noise to the observed turbulence image set. These degraded image sets are processed with three separate restoration algorithms: Lucky Look imaging, bispectral speckle imaging, and a block matching method with restoration filter. These algorithms were chosen because they incorporate different approaches and processing techniques. The results quantitatively show how well the algorithms are able to restore the simulated degraded imagery.
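
    A sketch of the Poisson-Gaussian mixture noise model used to degrade the simulated imagery: signal-dependent shot noise plus additive Gaussian read noise. The photon budget and read-noise level are illustrative.

      import numpy as np

      rng = np.random.default_rng(4)

      def add_poisson_gaussian_noise(img, photons=500.0, read_sigma=2.0):
          """img in [0, 1]; photons scales it to expected photon counts."""
          shot = rng.poisson(img * photons).astype(float)   # shot noise
          read = rng.normal(0.0, read_sigma, img.shape)     # read noise
          return (shot + read) / photons                    # rescale

      clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))   # toy scene
      noisy = add_poisson_gaussian_noise(clean)
      print("per-pixel RMS error:", np.sqrt(np.mean((noisy - clean) ** 2)))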

  16. Use of Multiple GPUs to Speedup the Execution of a Three-Dimensional Computational Model of the Innate Immune System

    NASA Astrophysics Data System (ADS)

    Xavier, M. P.; do Nascimento, T. M.; dos Santos, R. W.; Lobosco, M.

    2014-03-01

    The development of computational systems that mimic the physiological response of organs or even the entire body is a complex task. One of the issues that makes this task extremely complex is the huge amount of computational resources needed to execute the simulations. For this reason, the use of parallel computing is mandatory. In this work, we focus on the simulation of the temporal and spatial behaviour of some human innate immune system cells and molecules in a small three-dimensional section of a tissue. To perform this simulation, we use multiple Graphics Processing Units (GPUs) in a shared-memory environment. Despite the high initialization and communication costs imposed by the use of GPUs, the techniques used to implement the HIS simulator have proved very effective for this purpose.

  17. Modeling target normal sheath acceleration using handoffs between multiple simulations

    NASA Astrophysics Data System (ADS)

    McMahon, Matthew; Willis, Christopher; Mitchell, Robert; King, Frank; Schumacher, Douglass; Akli, Kramer; Freeman, Richard

    2013-10-01

    We present a technique to model the target normal sheath acceleration (TNSA) process using full-scale LSP PIC simulations. The technique allows for a realistic laser, a full-size target and pre-plasma, and sufficient propagation length for the accelerated ions and electrons. A first simulation using a 2D Cartesian grid models the laser-plasma interaction (LPI) self-consistently and includes field ionization. Electrons accelerated by the laser are imported into a second simulation using a 2D cylindrical grid optimized for the initial TNSA process and incorporating an equation of state. Finally, all of the particles are imported to a third simulation optimized for the propagation of the accelerated ions and utilizing a static field solver for initialization. We also show the use of 3D LPI simulations. Simulation results are compared to recent ion acceleration experiments using the SCARLET laser at The Ohio State University. This work was performed with support from AFOSR under contract # FA9550-12-1-0341, DARPA, and allocations of computing time from the Ohio Supercomputing Center.

  18. Dynamical simulation of E-ELT segmented primary mirror

    NASA Astrophysics Data System (ADS)

    Sedghi, B.; Muller, M.; Bauvir, B.

    2011-09-01

    The dynamical behavior of the primary mirror (M1) has an important impact on the control of the segments and the performance of the telescope. Control of large segmented mirrors with a large number of actuators and sensors and multiple control loops in real life is a challenging problem. In virtual life, modeling, simulation and analysis of the M1 bears similar difficulties and challenges. In order to capture the dynamics of the segment subunits (high frequency modes) and the telescope back structure (low frequency modes), high order dynamical models with a very large number of inputs and outputs need to be simulated. In this paper, different approaches for dynamical modeling and simulation of the M1 segmented mirror subject to various perturbations, e.g. sensor noise, wind load, vibrations, earthquake are presented.

  19. Cloud-based simulations on Google Exacycle reveal ligand modulation of GPCR activation pathways

    NASA Astrophysics Data System (ADS)

    Kohlhoff, Kai J.; Shukla, Diwakar; Lawrenz, Morgan; Bowman, Gregory R.; Konerding, David E.; Belov, Dan; Altman, Russ B.; Pande, Vijay S.

    2014-01-01

    Simulations can provide tremendous insight into the atomistic details of biological mechanisms, but micro- to millisecond timescales are historically only accessible on dedicated supercomputers. We demonstrate that cloud computing is a viable alternative that brings long-timescale processes within reach of a broader community. We used Google's Exacycle cloud-computing platform to simulate two milliseconds of dynamics of a major drug target, the G-protein-coupled receptor β2AR. Markov state models aggregate independent simulations into a single statistical model that is validated by previous computational and experimental results. Moreover, our models provide an atomistic description of the activation of a G-protein-coupled receptor and reveal multiple activation pathways. Agonists and inverse agonists interact differentially with these pathways, with profound implications for drug design.

  20. A Multi-Scale Method for Dynamics Simulation in Continuum Solvent Models I: Finite-Difference Algorithm for Navier-Stokes Equation.

    PubMed

    Xiao, Li; Cai, Qin; Li, Zhilin; Zhao, Hongkai; Luo, Ray

    2014-11-25

    A multi-scale framework is proposed for more realistic molecular dynamics simulations in continuum solvent models by coupling a molecular mechanics treatment of solute with a fluid mechanics treatment of solvent. This article reports our initial efforts to formulate the physical concepts necessary for coupling the two mechanics and develop a 3D numerical algorithm to simulate the solvent fluid via the Navier-Stokes equation. The numerical algorithm was validated with multiple test cases. The validation shows that the algorithm is effective and stable, with observed accuracy consistent with our design.

  1. Enhanced nonlinearity interval mapping scheme for high-performance simulation-optimization of watershed-scale BMP placement

    NASA Astrophysics Data System (ADS)

    Zou, Rui; Riverson, John; Liu, Yong; Murphy, Ryan; Sim, Youn

    2015-03-01

    Integrated continuous simulation-optimization models can be effective predictors of process-based responses for cost-benefit optimization of best management practices (BMPs) selection and placement. However, practical application of simulation-optimization models is computationally prohibitive for large-scale systems. This study proposes an enhanced Nonlinearity Interval Mapping Scheme (NIMS) to solve large-scale watershed simulation-optimization problems several orders of magnitude faster than other commonly used algorithms. An efficient interval response coefficient (IRC) derivation method was incorporated into the NIMS framework to overcome a computational bottleneck. The proposed algorithm was evaluated using a case study watershed in the Los Angeles County Flood Control District. Using a continuous simulation watershed/stream-transport model, Loading Simulation Program in C++ (LSPC), three nested in-stream compliance points (CP)—each with multiple Total Maximum Daily Load (TMDL) targets—were selected to derive optimal treatment levels for each of the 28 subwatersheds, so that the TMDL targets at all the CP were met with the lowest possible BMP implementation cost. A Genetic Algorithm (GA) and NIMS were both applied and compared. The results showed that NIMS took 11 iterations (about 11 min) to complete, with the resulting optimal solution having a total cost of $67.2 million, while each of the multiple GA executions took 21-38 days to reach near-optimal solutions. The best solution obtained among all the GA executions had a minimized cost of $67.7 million—marginally higher, but approximately equal to that of the NIMS solution. The results highlight the utility for decision making in large-scale watershed simulation-optimization formulations.

  2. Models of emergency departments for reducing patient waiting times.

    PubMed

    Laskowski, Marek; McLeod, Robert D; Friesen, Marcia R; Podaima, Blake W; Alfa, Attahiru S

    2009-07-02

    In this paper, we apply both agent-based models and queuing models to investigate patient access and patient flow through emergency departments. The objective of this work is to gain insights into the comparative contributions and limitations of these complementary techniques, in their ability to contribute empirical input into healthcare policy and practice guidelines. The models were developed independently, with a view to compare their suitability to emergency department simulation. The current models implement relatively simple general scenarios, and rely on a combination of simulated and real data to simulate patient flow in a single emergency department or in multiple interacting emergency departments. In addition, several concepts from telecommunications engineering are translated into this modeling context. The framework of multiple-priority queue systems and the genetic programming paradigm of evolutionary machine learning are applied as a means of forecasting patient wait times and as a means of evolving healthcare policy, respectively. The models' utility lies in their ability to provide qualitative insights into the relative sensitivities and impacts of model input parameters, to illuminate scenarios worthy of more complex investigation, and to iteratively validate the models as they continue to be refined and extended. The paper discusses future efforts to refine, extend, and validate the models with more data and real data relative to physical (spatial-topographical) and social inputs (staffing, patient care models, etc.). Real data obtained through proximity location and tracking system technologies is one example discussed.

  3. Models of Emergency Departments for Reducing Patient Waiting Times

    PubMed Central

    Laskowski, Marek; McLeod, Robert D.; Friesen, Marcia R.; Podaima, Blake W.; Alfa, Attahiru S.

    2009-01-01

    In this paper, we apply both agent-based models and queuing models to investigate patient access and patient flow through emergency departments. The objective of this work is to gain insights into the comparative contributions and limitations of these complementary techniques, in their ability to contribute empirical input into healthcare policy and practice guidelines. The models were developed independently, with a view to compare their suitability to emergency department simulation. The current models implement relatively simple general scenarios, and rely on a combination of simulated and real data to simulate patient flow in a single emergency department or in multiple interacting emergency departments. In addition, several concepts from telecommunications engineering are translated into this modeling context. The framework of multiple-priority queue systems and the genetic programming paradigm of evolutionary machine learning are applied as a means of forecasting patient wait times and as a means of evolving healthcare policy, respectively. The models' utility lies in their ability to provide qualitative insights into the relative sensitivities and impacts of model input parameters, to illuminate scenarios worthy of more complex investigation, and to iteratively validate the models as they continue to be refined and extended. The paper discusses future efforts to refine, extend, and validate the models with more data and real data relative to physical (spatial–topographical) and social inputs (staffing, patient care models, etc.). Real data obtained through proximity location and tracking system technologies is one example discussed. PMID:19572015
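
    A minimal event-driven sketch of the multiple-priority queue idea these two records borrow from telecommunications: two triage classes share one server and the urgent class is always served first from the queue. The arrival and service rates are illustrative.

      import heapq
      import random

      random.seed(7)

      SERVICE_MEAN = 15.0                  # minutes with the physician
      events = []                          # (time, seq, kind, priority)
      seq = 0
      for prio, rate in ((1, 1 / 20.0), (2, 1 / 10.0)):   # 1 = urgent
          t = 0.0
          while t < 8 * 60:                # arrivals over one 8-hour shift
              t += random.expovariate(rate)
              seq += 1
              heapq.heappush(events, (t, seq, "arrive", prio))

      waiting = {1: [], 2: []}             # arrival times per triage class
      server_free = True
      waits = []

      def start_service(now):
          """Serve the highest-priority waiting patient, if any."""
          global server_free, seq
          for p in (1, 2):
              if waiting[p]:
                  waits.append(now - waiting[p].pop(0))
                  seq += 1
                  depart = now + random.expovariate(1.0 / SERVICE_MEAN)
                  heapq.heappush(events, (depart, seq, "depart", p))
                  server_free = False
                  return
          server_free = True

      while events:
          now, _, kind, prio = heapq.heappop(events)
          if kind == "arrive":
              waiting[prio].append(now)
              if server_free:
                  start_service(now)
          else:                            # a departure frees the server
              start_service(now)

      print("mean wait (min): %.1f" % (sum(waits) / len(waits)))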

  4. Layout-aware simulation of soft errors in sub-100 nm integrated circuits

    NASA Astrophysics Data System (ADS)

    Balbekov, A.; Gorbunov, M.; Bobkov, S.

    2016-12-01

    A Single Event Transient (SET) caused by a charged particle traveling through the sensitive volume of an integrated circuit (IC) may lead to errors in digital circuits. In technologies below 180 nm, a single particle can affect multiple devices, causing multiple SETs. This adds complexity to the design of fault-tolerant devices, because schematic-level design techniques become useless without consideration of the layout. The most common layout mitigation technique is spatial separation of the sensitive nodes of hardened circuits. Spatial separation decreases circuit performance and increases power consumption. Spacing should thus be kept reasonable, and its scaling follows the scaling trend of device dimensions. This paper presents the development of an SET simulation approach comprising SPICE simulation with a "double exponent" current source as the SET model. The technique uses the layout in GDSII format to locate nearby devices that can be affected by a single particle and that can share the generated charge. The developed software tool automates multiple simulations and gathers the produced data to present it as a sensitivity map. Examples of conducted simulations of fault-tolerant cells and their sensitivity maps are presented in this paper.
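
    The "double exponent" current source mentioned above is the standard double-exponential compact model for single-event transients; a sketch with illustrative charge and time constants:

      import numpy as np

      def set_current(t, q_coll=50e-15, tau_rise=5e-12, tau_fall=150e-12):
          """Double-exponential SET pulse that integrates to q_coll
          (coulombs); charge and time constants are illustrative."""
          norm = q_coll / (tau_fall - tau_rise)
          return norm * (np.exp(-t / tau_fall) - np.exp(-t / tau_rise))

      t = np.linspace(0.0, 1e-9, 2001)
      i = set_current(t)
      q = np.sum(0.5 * (i[1:] + i[:-1]) * np.diff(t))    # trapezoid rule
      print("peak %.3f mA, collected charge %.1f fC" % (i.max() * 1e3, q * 1e15))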

  5. Toward Rigorous Parameterization of Underconstrained Neural Network Models Through Interactive Visualization and Steering of Connectivity Generation

    PubMed Central

    Nowke, Christian; Diaz-Pier, Sandra; Weyers, Benjamin; Hentschel, Bernd; Morrison, Abigail; Kuhlen, Torsten W.; Peyser, Alexander

    2018-01-01

    Simulation models in many scientific fields can have non-unique solutions or unique solutions which can be difficult to find. Moreover, in evolving systems, unique final state solutions can be reached by multiple different trajectories. Neuroscience is no exception. Often, neural network models are subject to parameter fitting to obtain desirable output comparable to experimental data. Parameter fitting without sufficient constraints and a systematic exploration of the possible solution space can lead to conclusions valid only around local minima or around non-minima. To address this issue, we have developed an interactive tool for visualizing and steering parameters in neural network simulation models. In this work, we focus particularly on connectivity generation, since finding suitable connectivity configurations for neural network models constitutes a complex parameter search scenario. The development of the tool has been guided by several use cases: the tool allows researchers to steer the parameters of the connectivity generation during the simulation, thus quickly growing networks composed of multiple populations with a targeted mean activity. The flexibility of the software allows scientists to explore other connectivity and neuron variables apart from the ones presented as use cases. With this tool, we enable interactive exploration of parameter spaces and a better understanding of neural network models, and grapple with the crucial problem of non-unique network solutions and trajectories. In addition, we observe a reduction in turnaround times for the assessment of these models, due to interactive visualization while the simulation is computed. PMID:29937723

  6. Multiple Sensing Application on Wireless Sensor Network Simulation using NS3

    NASA Astrophysics Data System (ADS)

    Kurniawan, I. F.; Bisma, R.

    2018-01-01

    Hardware enhancements provide the opportunity to install multiple sensor devices on a single monitoring node, enabling users to acquire multiple data streams simultaneously. Constructing a multiple-sensing application in NS3 is a challenging task, since a number of aspects such as wireless communication, packet transmission patterns, and the energy model must be taken into account. Although numerous types of monitoring data are available, this study considers only two: periodic and event-based data. Periodic sensing generates monitoring data at a configured interval, while event-based sensing transmits data when a predetermined condition is met. This study attempts to cover the mentioned aspects in NS3. Several simulations are performed with different numbers of nodes on an arbitrary communication scheme.
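    A minimal sketch of the two traffic types described above, independent of NS3: periodic samples at a fixed interval, plus event-based packets modeled as a Poisson process. The duration, period, and event rate are illustrative assumptions.

    ```python
    import random

    random.seed(1)

    def generate_traffic(duration_s=60.0, period_s=5.0, event_rate_hz=0.1):
        """Return send times for periodic and event-based packets.
        Parameters are placeholders, not values from the paper."""
        periodic = [round(k * period_s, 3)
                    for k in range(1, int(duration_s / period_s) + 1)]
        events, t = [], 0.0
        while True:
            t += random.expovariate(event_rate_hz)  # exponential inter-arrival
            if t > duration_s:
                break
            events.append(round(t, 3))
        return periodic, events

    periodic, events = generate_traffic()
    print(len(periodic), "periodic packets;", len(events), "event packets")
    ```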

  7. Interpreting space-based trends in carbon monoxide with multiple models

    DOE PAGES

    Strode, Sarah A.; Worden, Helen M.; Damon, Megan; ...

    2016-06-10

    Here, we use a series of chemical transport model and chemistry climate model simulations to investigate the observed negative trends in MOPITT CO over several regions of the world, and to examine the consistency of time-dependent emission inventories with observations. We also found that simulations driven by the MACCity inventory, used for the Chemistry Climate Modeling Initiative (CCMI), reproduce the negative trends in the CO column observed by MOPITT for 2000–2010 over the eastern United States and Europe. However, the simulations have positive trends over eastern China, in contrast to the negative trends observed by MOPITT. The model bias in CO, after applying MOPITT averaging kernels, contributes to the model–observation discrepancy in the trend over eastern China. This demonstrates that biases in a model's average concentrations can influence the interpretation of the temporal trend compared to satellite observations. The total ozone column plays a role in determining the simulated tropospheric CO trends. A large positive anomaly in the simulated total ozone column in 2010 leads to a negative anomaly in OH and hence a positive anomaly in CO, contributing to the positive trend in simulated CO. Our results demonstrate that accurately simulating variability in the ozone column is important for simulating and interpreting trends in CO.

  8. Interpreting space-based trends in carbon monoxide with multiple models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strode, Sarah A.; Worden, Helen M.; Damon, Megan

    Here, we use a series of chemical transport model and chemistry climate model simulations to investigate the observed negative trends in MOPITT CO over several regions of the world, and to examine the consistency of time-dependent emission inventories with observations. We also found that simulations driven by the MACCity inventory, used for the Chemistry Climate Modeling Initiative (CCMI), reproduce the negative trends in the CO column observed by MOPITT for 2000–2010 over the eastern United States and Europe. However, the simulations have positive trends over eastern China, in contrast to the negative trends observed by MOPITT. The model bias in CO, after applying MOPITT averaging kernels, contributes to the model–observation discrepancy in the trend over eastern China. This demonstrates that biases in a model's average concentrations can influence the interpretation of the temporal trend compared to satellite observations. The total ozone column plays a role in determining the simulated tropospheric CO trends. A large positive anomaly in the simulated total ozone column in 2010 leads to a negative anomaly in OH and hence a positive anomaly in CO, contributing to the positive trend in simulated CO. Our results demonstrate that accurately simulating variability in the ozone column is important for simulating and interpreting trends in CO.

  9. SU-E-T-378: Evaluation of An Analytical Model for the Inter-Seed Attenuation Effect in 103-Pd Multi-Seed Implant Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Safigholi, H; Soliman, A; Song, W

    Purpose: Brachytherapy treatment planning systems based on the TG-43 protocol calculate the dose in water and neglect the heterogeneity effect of seeds in multi-seed implant brachytherapy. In this research, the accuracy of a novel analytical model that we propose for the inter-seed attenuation (ISA) effect for the 103-Pd seed model is evaluated. Methods: In the analytical model, the dose perturbation due to the ISA effect for each seed in an LDR multi-seed 103-Pd implant is calculated by assuming that the seed of interest is active and the other surrounding seeds are inactive. The cumulative dosimetric effect of all seeds is then summed using the superposition principle. The model is based on pre-computed Monte Carlo (MC) 3D kernels of the dose perturbations caused by the ISA effect. The cumulative ISA effect due to multiple surrounding seeds is obtained by a simple multiplication of the individual ISA effect of each seed, which is determined by the distance from the seed of interest. This novel algorithm is then compared with full MC water-based simulations (FMCW). Results: The results show that the dose perturbation model we propose is in excellent agreement with the FMCW values for a case with three seeds separated by 1 cm. The average difference between the model and the FMCW simulations was less than 8%±2%. Conclusion: Using the proposed novel analytical ISA effect model, one could expedite the corrections due to ISA dose perturbation effects during permanent-seed 103-Pd brachytherapy planning with minimal increase in time, since the model is based on multiplications and superposition. This model can be applied, in principle, to any other brachytherapy seeds. Further work is necessary to validate this model on more complicated geometries as well.
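    A minimal sketch of the multiplicative superposition idea described above: each surrounding seed contributes a distance-dependent attenuation factor, and the cumulative perturbation at a point is the product of the individual factors. The kernel function and geometry here are invented placeholders standing in for the paper's pre-computed MC 3D kernels.

    ```python
    import numpy as np

    def isa_factor(distance_cm):
        """Hypothetical per-seed ISA attenuation factor vs. distance.
        A real implementation would interpolate a pre-computed MC kernel."""
        return 1.0 - 0.05 * np.exp(-distance_cm / 1.0)  # placeholder shape

    def cumulative_isa(point, seed_positions):
        """Cumulative ISA correction at `point`: the product of the
        individual factors of all surrounding seeds (multiplicative model)."""
        factors = [isa_factor(np.linalg.norm(point - s)) for s in seed_positions]
        return float(np.prod(factors))

    # Three seeds separated by 1 cm along x, as in the paper's test case.
    seeds = np.array([[0.0, 0, 0], [1.0, 0, 0], [2.0, 0, 0]])
    point = np.array([0.5, 0.5, 0.0])
    # Exclude the "seed of interest" (index 0) per the model's definition.
    print(cumulative_isa(point, seeds[1:]))
    ```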

  10. Multiple piezo-patch energy harvesters integrated to a thin plate with AC-DC conversion: analytical modeling and numerical validation

    NASA Astrophysics Data System (ADS)

    Aghakhani, Amirreza; Basdogan, Ipek; Erturk, Alper

    2016-04-01

    Plate-like components are widely used in numerous automotive, marine, and aerospace applications, where they can serve as host structures for vibration-based energy harvesting. Piezoelectric patch harvesters can easily be attached to these structures to convert vibrational energy into electrical energy. Power output investigations of these harvesters require accurate models for energy harvesting performance evaluation and optimization. Equivalent circuit modeling of cantilever-based vibration energy harvesters for estimating the electrical response has been proposed in recent years. However, equivalent circuit formulation and analytical modeling of multiple piezo-patch energy harvesters integrated to thin plates, including nonlinear circuits, have not been studied. In this study, an equivalent circuit model of multiple parallel piezoelectric patch harvesters together with a resistive load is built in the electronic circuit simulation software SPICE, and voltage frequency response functions (FRFs) are validated using the analytical distributed-parameter model. An analytical formulation of the piezoelectric patches in parallel configuration for the DC voltage output is derived while the patches are connected to a standard AC-DC circuit. The analytical model is based on the equivalent load impedance approach for the piezoelectric capacitance and AC-DC circuit elements. The analytical results are validated numerically via SPICE simulations. Finally, the DC power outputs of the harvesters are computed and compared with the peak power amplitudes in the AC output case.

  11. A novel biomechanical model assessing continuous orthodontic archwire activation

    PubMed Central

    Canales, Christopher; Larson, Matthew; Grauer, Dan; Sheats, Rose; Stevens, Clarke; Ko, Ching-Chang

    2013-01-01

    Objective The biomechanics of a continuous archwire inserted into multiple orthodontic brackets is poorly understood. The purpose of this research was to apply the birth-death technique to simulate insertion of an orthodontic wire and consequent transfer of forces to the dentition in an anatomically accurate model. Methods A digital model containing the maxillary dentition, periodontal ligament (PDL), and surrounding bone was constructed from human computerized tomography data. Virtual brackets were placed on four teeth (central and lateral incisors, canine and first premolar), and a steel archwire (0.019″ × 0.025″) with a 0.5 mm step bend to intrude the lateral incisor was virtually inserted into the bracket slots. Forces applied to the dentition and surrounding structures were simulated utilizing the birth-death technique. Results The goal of simulating a complete bracket-wire system on accurate anatomy including multiple teeth was achieved. Orthodontic force delivered by the wire-bracket interaction was: central incisor 19.1 N, lateral incisor 21.9 N, and canine 19.9 N. Loading the model with equivalent point forces showed a different stress distribution in the PDL. Conclusions The birth-death technique proved to be a useful biomechanical simulation method for placement of a continuous archwire in orthodontic brackets. The ability to view the stress distribution throughout proper anatomy and appliances advances understanding of orthodontic biomechanics. PMID:23374936

  12. A systematic petri net approach for multiple-scale modeling and simulation of biochemical processes.

    PubMed

    Chen, Ming; Hu, Minjie; Hofestädt, Ralf

    2011-06-01

    A method to exploit hybrid Petri nets for modeling and simulating biochemical processes in a systematic way is introduced, addressing both molecular biology and biochemical engineering aspects. With discrete and continuous elements, hybrid Petri nets can easily handle biochemical factors such as metabolite concentrations and kinetic behaviors. It is possible to translate both molecular biological behavior and biochemical process workflows into hybrid Petri nets in a natural manner. As an example, a penicillin production bioprocess is modeled to illustrate the concepts of the methodology. The dynamics of the production parameters in the bioprocess were simulated and observed diagrammatically. Current problems and post-genomic perspectives are also discussed.

  13. Improvement in precipitation-runoff model simulations by recalibration with basin-specific data, and subsequent model applications, Onondaga Lake Basin, Onondaga County, New York

    USGS Publications Warehouse

    Coon, William F.

    2011-01-01

    Simulation of streamflows in small subbasins was improved by adjusting model parameter values to match base flows, storm peaks, and storm recessions more precisely than had been done with the original model. Simulated recessional and low flows were either increased or decreased as appropriate for a given stream, and simulated peak flows generally were lowered in the revised model. The use of suspended-sediment concentrations rather than concentrations of the surrogate constituent, total suspended solids, resulted in increases in the simulated low-flow sediment concentrations and, in most cases, decreases in the simulated peak-flow sediment concentrations. Simulated orthophosphate concentrations in base flows generally increased but decreased for peak flows in selected headwater subbasins in the revised model. Compared with the original model, phosphorus concentrations simulated by the revised model were comparable in forested subbasins, generally decreased in developed and wetland-dominated subbasins, and increased in agricultural subbasins. A final revision to the model was made by the addition of the simulation of chloride (salt) concentrations in the Onondaga Creek Basin to help water-resource managers better understand the relative contributions of salt from multiple sources in this particular tributary. The calibrated revised model was used to (1) compute loading rates for the various land types that were simulated in the model, (2) conduct a watershed-management analysis that estimated the portion of the total load that was likely to be transported to Onondaga Lake from each of the modeled subbasins, (3) compute and assess chloride loads to Onondaga Lake from the Onondaga Creek Basin, and (4) simulate precolonization (forested) conditions in the basin to estimate the probable minimum phosphorus loads to the lake.

  14. Intrusion of granitic magma into the continental crust facilitated by magma pulsing and dike-diapir interactions: Numerical simulations

    NASA Astrophysics Data System (ADS)

    Cao, Wenrong; Kaus, Boris J. P.; Paterson, Scott

    2016-06-01

    We conducted a 2-D thermomechanical modeling study of the intrusion of granitic magma into the continental crust to explore the roles of multiple pulsing and dike-diapir interactions in the presence of visco-elasto-plastic rheology. Multiple pulsing is simulated by replenishing source regions with new pulses of magma at a certain temporal frequency. Parameterized "pseudo-dike zones" above magma pulses are included. Simulation results show that both diking and pulsing are crucial factors facilitating magma ascent and emplacement. Multiple pulses keep the magmatic system from freezing and facilitate the initiation of pseudo-dike zones, which in turn heat the host rock roof, lower its viscosity, and create pathways for later ascending pulses of magma. Without diking, magma cannot penetrate the highly viscous upper crust. Without multiple pulsing, a single magma body solidifies quickly and cannot ascend over a long distance. Our results shed light on the incremental growth of magma chambers, the recycling of continental crust, and the evolution of continental arcs such as the Sierra Nevada arc in California.

  15. A Multiplicative Cascade Model for High-Resolution Space-Time Downscaling of Rainfall

    NASA Astrophysics Data System (ADS)

    Raut, Bhupendra A.; Seed, Alan W.; Reeder, Michael J.; Jakob, Christian

    2018-02-01

    Distributions of rainfall with time and space resolutions of minutes and kilometers, respectively, are often needed to drive the hydrological models used in a range of engineering, environmental, and urban design applications. The work described here is the first step in constructing a model capable of downscaling rainfall to scales of minutes and kilometers from time and space resolutions of several hours and a hundred kilometers. A multiplicative random cascade model known as the Short-Term Ensemble Prediction System is run with parameters from the radar observations at Melbourne (Australia). Orographic effects are added through a multiplicative correction factor after the model is run. In the first set of model calculations, 112 significant rain events over Melbourne are simulated 100 times. Because of the stochastic nature of the cascade model, the simulations represent 100 possible realizations of the same rain event. The cascade model produces realistic spatial and temporal patterns of rainfall at 6 min and 1 km resolution (the resolution of the radar data), the statistical properties of which are in close agreement with observation. In the second set of calculations, the cascade model is run continuously for all days from January 2008 to August 2015 and the rainfall accumulations are compared at 12 locations in the greater Melbourne area. The statistical properties of the observations lie within the envelope of the 100 ensemble members. The model successfully reproduces the frequency distribution of the 6 min rainfall intensities, storm durations, interarrival times, and the autocorrelation function.
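    The sketch below illustrates the general multiplicative-cascade idea (not the STEPS implementation itself): a coarse rainfall field is repeatedly refined, with each refinement multiplied by mean-one random weights so that variability is injected at progressively finer scales. The lognormal weight distribution and parameter values are illustrative choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def downscale(field, levels=4, sigma=0.4):
        """Multiplicative random cascade: each level doubles the resolution
        in both dimensions and multiplies by mean-one lognormal weights."""
        for _ in range(levels):
            field = np.kron(field, np.ones((2, 2)))               # refine 2x
            w = rng.lognormal(-0.5 * sigma**2, sigma, field.shape)  # E[w] = 1
            field = field * w
        return field

    coarse = np.full((2, 2), 5.0)   # coarse-scale rain rates (mm/h)
    fine = downscale(coarse)        # 32 x 32 field
    print(fine.shape, f"mean {fine.mean():.2f} mm/h")  # mean roughly preserved
    ```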

  16. Rim Fire and its Radiative impact Simulated in CESM/CARMA

    NASA Astrophysics Data System (ADS)

    Yu, P.; Toon, O. B.; Bardeen, C.; Bucholtz, A.; Rosenlof, K. H.; Saide, P. E.; da Silva, A. M., Jr.; Ziemba, L. D.; Jimenez, J. L.; Schwarz, J. P.; Wagner, N. L.; Lack, D. A.; Mills, M. J.; Reid, J. S.

    2015-12-01

    The Rim Fire of 2013, the third-largest fire by burned area in recorded California history, is simulated with CESM1/CARMA. Modeled aerosol mass, number, effective radius, and extinction coefficient are within the variability of data obtained from multiple airborne and satellite measurements. Simulations suggest Rim Fire smoke may block 4-6% of sunlight reaching the surface, with a cooling efficiency around 120-150 W m-2 per unit aerosol optical depth. This study shows that exceptional events like the 2013 Rim Fire can be simulated by a climate model with one-degree resolution, though that resolution is still not sufficient to resolve the smoke peak near the source region.

  17. Self-Tuning of Design Variables for Generalized Predictive Control

    NASA Technical Reports Server (NTRS)

    Lin, Chaung; Juang, Jer-Nan

    2000-01-01

    Three techniques are introduced to determine the order and control weighting for the design of a generalized predictive controller. These techniques are based on the application of fuzzy logic, genetic algorithms, and simulated annealing to conduct an optimal search on specific performance indexes or objective functions. Fuzzy logic is found to be feasible for real-time and on-line implementation due to its smooth and quick convergence. On the other hand, genetic algorithms and simulated annealing are applicable for initial estimation of the model order and control weighting, and for final fine-tuning within a small region of the solution space. Several numerical simulations for a multiple-input and multiple-output system are given to illustrate the techniques developed in this paper.
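    As a loose illustration of one of the search techniques mentioned (simulated annealing over a model order and a control weighting), the sketch below minimizes a stand-in objective. The objective function, parameter ranges, and cooling schedule are invented for illustration; a real performance index would evaluate the closed-loop response of the predictive controller.

    ```python
    import math
    import random

    random.seed(1)

    def objective(order, weight):
        """Stand-in performance index with a minimum at order=4, weight=0.1."""
        return (order - 4) ** 2 + (math.log10(weight) + 1) ** 2

    def anneal(steps=2000, t0=1.0):
        order, weight = 1, 1.0
        best = (objective(order, weight), order, weight)
        for k in range(steps):
            temp = t0 * (1 - k / steps) + 1e-9          # linear cooling
            cand_order = max(1, order + random.choice((-1, 0, 1)))
            cand_weight = weight * 10 ** random.uniform(-0.5, 0.5)
            delta = objective(cand_order, cand_weight) - objective(order, weight)
            if delta < 0 or random.random() < math.exp(-delta / temp):
                order, weight = cand_order, cand_weight  # accept move
                best = min(best, (objective(order, weight), order, weight))
        return best

    cost, order, weight = anneal()
    print(f"order={order}, weight={weight:.3g}, cost={cost:.3g}")
    ```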

  18. Global magnetohydrodynamic simulations on multiple GPUs

    NASA Astrophysics Data System (ADS)

    Wong, Un-Hong; Wong, Hon-Cheng; Ma, Yonghui

    2014-01-01

    Global magnetohydrodynamic (MHD) models play the major role in investigating the solar wind-magnetosphere interaction. However, the huge computational requirements of global MHD simulations remain the main problem to be solved. With the recent development of modern graphics processing units (GPUs) and the Compute Unified Device Architecture (CUDA), it is possible to perform global MHD simulations in a more efficient manner. In this paper, we present a global MHD simulator on multiple GPUs using CUDA 4.0 with GPUDirect 2.0. Our implementation is based on the modified leapfrog scheme, which is a combination of the leapfrog scheme and the two-step Lax-Wendroff scheme. GPUDirect 2.0 is used in our implementation to drive multiple GPUs. All data transfers and kernel processing are managed with the CUDA 4.0 API instead of MPI or OpenMP. Performance measurements are made on a multi-GPU system with eight NVIDIA Tesla M2050 (Fermi architecture) graphics cards. These measurements show that our multi-GPU implementation achieves a peak performance of 97.36 GFLOPS in double precision.
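    A minimal sketch of the modified-leapfrog idea described above, reduced to 1-D linear advection: ordinary leapfrog steps are interspersed with a Lax-Wendroff step to damp the leapfrog computational mode. For brevity this uses the one-step Lax-Wendroff form rather than the two-step scheme named in the paper, and the grid size, CFL number, and interleaving interval are illustrative.

    ```python
    import numpy as np

    nx, c = 64, 0.5                       # grid points, CFL number (a*dt/dx)
    u_prev = np.exp(-((np.arange(nx) - nx / 2) ** 2) / 20.0)  # initial pulse

    def lax_wendroff(u):
        """One Lax-Wendroff step for u_t + u_x = 0 with periodic BCs."""
        up, um = np.roll(u, -1), np.roll(u, 1)
        return u - 0.5 * c * (up - um) + 0.5 * c**2 * (up - 2 * u + um)

    u = lax_wendroff(u_prev)              # bootstrap the three-level scheme
    for step in range(1, 200):
        if step % 8 == 0:                 # periodic LW step damps the
            u_prev, u = u, lax_wendroff(u)    # leapfrog computational mode
        else:                             # leapfrog: u^{n+1} = u^{n-1} - c*du
            u_prev, u = u, u_prev - c * (np.roll(u, -1) - np.roll(u, 1))
    print(f"max after 200 steps: {u.max():.3f}")
    ```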

  19. Multiscale Mechanics of Articular Cartilage: Potentials and Challenges of Coupling Musculoskeletal, Joint, and Microscale Computational Models

    PubMed Central

    Halloran, J. P.; Sibole, S.; van Donkelaar, C. C.; van Turnhout, M. C.; Oomens, C. W. J.; Weiss, J. A.; Guilak, F.; Erdemir, A.

    2012-01-01

    Articular cartilage experiences significant mechanical loads during daily activities. Healthy cartilage provides the capacity for load bearing and regulates the mechanobiological processes for tissue development, maintenance, and repair. Experimental studies at multiple scales have provided a fundamental understanding of macroscopic mechanical function, evaluation of the micromechanical environment of chondrocytes, and the foundations for mechanobiological response. In addition, computational models of cartilage have offered a concise description of experimental data at many spatial levels under healthy and diseased conditions, and have served to generate hypotheses for the mechanical and biological function. Further, modeling and simulation provides a platform for predictive risk assessment, management of dysfunction, as well as a means to relate multiple spatial scales. Simulation-based investigation of cartilage comes with many challenges including both the computational burden and often insufficient availability of data for model development and validation. This review outlines recent modeling and simulation approaches to understand cartilage function from a mechanical systems perspective, and illustrates pathways to associate mechanics with biological function. Computational representations at single scales are provided from the body down to the microstructure, along with attempts to explore multiscale mechanisms of load sharing that dictate the mechanical environment of the cartilage and chondrocytes. PMID:22648577

  20. Space Shuttle Propulsion Systems Plume Modeling and Simulation for the Lift-Off Computational Fluid Dynamics Model

    NASA Technical Reports Server (NTRS)

    Strutzenberg, L. L.; Dougherty, N. S.; Liever, P. A.; West, J. S.; Smith, S. D.

    2007-01-01

    This paper details advances being made in the development of Reynolds-Averaged Navier-Stokes numerical simulation tools, models, and methods for the integrated Space Shuttle Vehicle at launch. The conceptual model and modeling approach described includes the development of multiple computational models to appropriately analyze the potential debris transport for critical debris sources at Lift-Off. The conceptual model described herein involves the integration of propulsion analysis for the nozzle/plume flow with the overall 3D vehicle flowfield at Lift-Off. Debris Transport Analyses are being performed using the Shuttle Lift-Off models to assess the risk to the vehicle from Lift-Off debris and appropriately prioritized mitigation of potential debris sources to continue to reduce vehicle risk. These integrated simulations are being used to evaluate plume-induced debris environments where the multi-plume interactions with the launch facility can potentially accelerate debris particles toward the vehicle.

  1. Technical note: Simultaneous fully dynamic characterization of multiple input–output relationships in climate models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kravitz, Ben; MacMartin, Douglas G.; Rasch, Philip J.

    We introduce system identification techniques to climate science wherein multiple dynamic input–output relationships can be simultaneously characterized in a single simulation. This method, involving multiple small perturbations (in space and time) of an input field while monitoring output fields to quantify responses, allows for identification of different timescales of climate response to forcing without substantially pushing the climate far away from a steady state. We use this technique to determine the steady-state responses of low cloud fraction and latent heat flux to heating perturbations over 22 regions spanning Earth's oceans. We show that the response characteristics are similar to those of step-change simulations, but in this new method the responses for 22 regions can be characterized simultaneously. Moreover, we can estimate the timescale over which the steady-state response emerges. The proposed methodology could be useful for a wide variety of purposes in climate science, including characterization of teleconnections and uncertainty quantification to identify the effects of climate model tuning parameters.
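    A toy illustration of the general idea of recovering an input-output response from small, continuously applied perturbations (a sketch, not the authors' method in detail): a low-amplitude random input drives a synthetic linear system, and the impulse response is recovered by least squares on lagged inputs. The system, noise level, and lag count are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n, lags = 5000, 30

    # Small random perturbation input; synthetic "climate" response with an
    # exponentially decaying impulse response plus observation noise.
    u = 0.1 * rng.standard_normal(n)
    h_true = 0.5 * np.exp(-np.arange(lags) / 5.0)
    y = np.convolve(u, h_true)[:n] + 0.01 * rng.standard_normal(n)

    # Least-squares estimate of the impulse response from lagged inputs:
    # X[t, k] = u[t - k], so y ~= X @ h.
    X = np.column_stack([np.roll(u, k) for k in range(lags)])
    X[:lags] = 0.0                       # zero out wrapped-around samples
    h_est, *_ = np.linalg.lstsq(X, y, rcond=None)

    print("steady-state gain  true: %.3f  est: %.3f"
          % (h_true.sum(), h_est.sum()))
    ```

    The steady-state gain (the sum of the impulse response) is the analogue of the steady-state response the paper extracts, and the shape of h_est carries the response timescale.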

  2. Neural-Fuzzy model Based Steel Pipeline Multiple Cracks Classification

    NASA Astrophysics Data System (ADS)

    Elwalwal, Hatem Mostafa; Mahzan, Shahruddin Bin Hj.; Abdalla, Ahmed N.

    2017-10-01

    While pipes are cheaper than other means of transportation, this cost saving comes at a major price: pipes are subject to cracks, corrosion, etc., which in turn can cause leakage and environmental damage. In this paper, a neural-fuzzy model for multiple-crack classification based on guided Lamb waves is presented. Simulation results for 42 samples were collected using the ANSYS software. The research combines numerical simulation and experimental study, aiming to find an effective way to detect and localize the crack and hole defects that may exist in the main body of a pipeline and to determine their respective positions in the steel pipe. The technique used in this research is a guided-Lamb-wave-based structural health monitoring method in which piezoelectric transducers serve as exciting and receiving sensors in a pitch-catch arrangement. A simple learning mechanism was developed specifically for the ANN that represents the fuzzy system.

  3. Technical note: Simultaneous fully dynamic characterization of multiple input–output relationships in climate models

    DOE PAGES

    Kravitz, Ben; MacMartin, Douglas G.; Rasch, Philip J.; ...

    2017-02-17

    We introduce system identification techniques to climate science wherein multiple dynamic input–output relationships can be simultaneously characterized in a single simulation. This method, involving multiple small perturbations (in space and time) of an input field while monitoring output fields to quantify responses, allows for identification of different timescales of climate response to forcing without substantially pushing the climate far away from a steady state. We use this technique to determine the steady-state responses of low cloud fraction and latent heat flux to heating perturbations over 22 regions spanning Earth's oceans. We show that the response characteristics are similar to those of step-change simulations, but in this new method the responses for 22 regions can be characterized simultaneously. Moreover, we can estimate the timescale over which the steady-state response emerges. The proposed methodology could be useful for a wide variety of purposes in climate science, including characterization of teleconnections and uncertainty quantification to identify the effects of climate model tuning parameters.

  4. A kinetic Monte Carlo simulation method of van der Waals epitaxy for atomistic nucleation-growth processes of transition metal dichalcogenides.

    PubMed

    Nie, Yifan; Liang, Chaoping; Cha, Pil-Ryung; Colombo, Luigi; Wallace, Robert M; Cho, Kyeongjae

    2017-06-07

    Controlled growth of crystalline solids is critical for device applications, and atomistic modeling methods have been developed for bulk crystalline solids. The kinetic Monte Carlo (KMC) simulation method provides detailed atomic-scale processes during solid growth over realistic time scales, but its application to growth modeling of van der Waals (vdW) heterostructures has not yet been developed. Specifically, the growth of single-layered transition metal dichalcogenides (TMDs) is currently facing tremendous challenges, and a detailed understanding based on KMC simulations would provide critical guidance to enable controlled growth of vdW heterostructures. In this work, a KMC simulation method is developed for growth modeling of the vdW epitaxy of TMDs. The KMC method introduces full material parameters for TMDs in bottom-up synthesis: metal and chalcogen adsorption/desorption/diffusion on the substrate and grown TMD surface, TMD stacking sequence, chalcogen/metal ratio, flake edge diffusion, and vacancy diffusion. The KMC processes result in multiple kinetic behaviors associated with various growth behaviors observed in experiments. Different phenomena observed during the vdW epitaxy process are analysed in terms of complex competitions among multiple kinetic processes. The KMC method is used in the investigation and prediction of growth mechanisms, providing qualitative suggestions to guide experimental studies.
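    For context, the core step such KMC growth models build on is the rejection-free (Gillespie-type) update: pick an event with probability proportional to its rate, then advance time by an exponentially distributed increment. A generic sketch; the event list and rates are placeholders, not the paper's parameterization.

    ```python
    import math
    import random

    random.seed(3)

    # Placeholder event rates (1/s): adsorption, diffusion, desorption.
    events = {"adsorb": 5.0, "diffuse": 50.0, "desorb": 0.5}

    def kmc_step(t):
        """One rejection-free KMC step: choose an event with probability
        proportional to its rate, then advance time by an exponential dt."""
        total = sum(events.values())
        r = random.random() * total
        for name, rate in events.items():
            r -= rate
            if r <= 0.0:
                break
        dt = -math.log(1.0 - random.random()) / total
        return name, t + dt

    t, counts = 0.0, {k: 0 for k in events}
    for _ in range(10000):
        name, t = kmc_step(t)
        counts[name] += 1
    print(counts, f"simulated time: {t:.3f} s")
    ```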

  5. Model predictive control of a wind turbine modelled in Simpack

    NASA Astrophysics Data System (ADS)

    Jassmann, U.; Berroth, J.; Matzke, D.; Schelenz, R.; Reiter, M.; Jacobs, G.; Abel, D.

    2014-06-01

    Wind turbines (WT) are steadily growing in size to increase their power production, which also increases the loads acting on the turbine's components. At the same time, large structures such as the blades and the tower become more flexible. To minimize this impact, the classical control loops for keeping the power production in an optimum state are increasingly extended by load alleviation strategies. These additional control loops can be unified by a multiple-input multiple-output (MIMO) controller to achieve better balancing of tuning parameters. An example of MIMO control, which has recently received more attention from the wind industry, is Model Predictive Control (MPC). In an MPC framework, a simplified model of the WT is used to predict its controlled outputs. Based on a user-defined cost function, an online optimization calculates the optimal control sequence; thereby MPC can intrinsically incorporate constraints, e.g., of actuators. Turbine models used for calculation within the MPC are typically simplified. For testing and verification, multi-body simulations such as FAST, BLADED, or FLEX5 are usually used to model the system dynamics, but they are still limited in the number of degrees of freedom (DOF). Detailed information about load distribution (e.g., inside the gearbox) cannot be provided by such models. In this paper a Model Predictive Controller is presented and tested in a co-simulation with SIMPACK, a multi-body system (MBS) simulation framework used for detailed load analysis. The analyses are performed on the basis of the IME6.0 MBS WT model, described in this paper. It is based on the rotor of the NREL 5MW WT and consists of a detailed representation of the drive train. This takes into account a flexible main shaft and its main bearings with a planetary gearbox, where all components are modelled flexible, as well as a supporting flexible main frame. The wind loads are simulated using the NREL AERODYN v13 code, which has been implemented as a routine in SIMPACK. This modeling approach makes it possible to investigate the nonlinear behavior of wind loads and nonlinear drive train dynamics. Thereby the MPC's impact on specific loads, and effects not covered by standard simulation tools, can be assessed and investigated. Keywords: wind turbine simulation, model predictive control, multi-body simulation, MIMO, load alleviation.

  6. Two-dimensional simulation and modeling in scanning electron microscope imaging and metrology research.

    PubMed

    Postek, Michael T; Vladár, András E; Lowney, Jeremiah R; Keery, William J

    2002-01-01

    Traditional Monte Carlo modeling of the electron beam-specimen interactions in a scanning electron microscope (SEM) produces information about electron beam penetration and output signal generation at either a single beam-landing location, or multiple landing positions. If the multiple landings lie on a line, the results can be graphed in a line scan-like format. Monte Carlo results formatted as line scans have proven useful in providing one-dimensional information about the sample (e.g., linewidth). When used this way, this process is called forward line scan modeling. In the present work, the concept of image simulation (or the first step in the inverse modeling of images) is introduced where the forward-modeled line scan data are carried one step further to construct theoretical two-dimensional (2-D) micrographs (i.e., theoretical SEM images) for comparison with similar experimentally obtained micrographs. This provides an ability to mimic and closely match theory and experiment using SEM images. Calculated and/or measured libraries of simulated images can be developed with this technique. The library concept will prove to be very useful in the determination of dimensional and other properties of simple structures, such as integrated circuit parts, where the shape of the features is preferably measured from a single top-down image or a line scan. This paper presents one approach to the generation of 2-D simulated images and presents some suggestions as to their application to critical dimension metrology.
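    A toy illustration of the forward image-simulation idea described above: a forward-modeled line-scan profile across a feature edge is replicated along the scan direction, with noise added, to form a 2-D simulated micrograph. The sigmoid profile and noise level are invented placeholders standing in for Monte Carlo-derived line scans.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # Hypothetical forward-modeled line scan: signal rises across an edge.
    x = np.linspace(-1.0, 1.0, 256)
    line_scan = 1.0 / (1.0 + np.exp(-x / 0.05))     # sigmoid edge profile

    # Stack the line scan over 128 scan lines and add noise to mimic a
    # 2-D simulated SEM micrograph of a straight feature edge.
    image = np.tile(line_scan, (128, 1))
    image += rng.normal(0.0, 0.05, image.shape)
    print(image.shape,
          f"edge contrast: {image[:, -1].mean() - image[:, 0].mean():.2f}")
    ```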

  7. Digital Full-Scope Simulation of a Conventional Nuclear Power Plant Control Room, Phase 2: Installation of a Reconfigurable Simulator to Support Nuclear Plant Sustainability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; Vivek Agarwal; Kirk Fitzgerald

    2013-03-01

    The U.S. Department of Energy’s Light Water Reactor Sustainability program has developed a control room simulator in support of control room modernization at nuclear power plants in the U.S. This report highlights the recent completion of this reconfigurable, full-scale, full-scope control room simulator buildout at the Idaho National Laboratory. The simulator is fully reconfigurable, meaning it supports multiple plant models developed by different simulator vendors. The simulator is full-scale, using glasstop virtual panels to display the analog control boards found at current plants. The present installation features 15 glasstop panels, uniquely achieving a complete control room representation. The simulator is also full-scope, meaning it uses the same plant models used for training simulators at actual plants. Unlike in the plant training simulators, the deployment on glasstop panels allows a high degree of customization of the panels, allowing the simulator to be used for research on the design of new digital control systems for control room modernization. This report includes separate sections discussing the glasstop panels, their layout to mimic control rooms at actual plants, technical details on creating a multi-plant and multi-vendor reconfigurable simulator, and current efforts to support control room modernization at U.S. utilities. The glasstop simulator provides an ideal testbed for prototyping and validating new control room concepts. Equally importantly, it is helping create a standardized and vetted human factors engineering process that can be used across the nuclear industry to ensure control room upgrades maintain and even improve current reliability and safety.

  8. Modelling and simulation of biased agonism dynamics at a G protein-coupled receptor.

    PubMed

    Bridge, L J; Mead, J; Frattini, E; Winfield, I; Ladds, G

    2018-04-07

    Theoretical models of G protein-coupled receptor (GPCR) concentration-response relationships often assume an agonist producing a single functional response via a single active state of the receptor. These models have largely been analysed assuming steady-state conditions. There is now much experimental evidence to suggest that many GPCRs can exist in multiple receptor conformations and elicit numerous functional responses, with ligands having the potential to activate different signalling pathways to varying extents, a concept referred to as biased agonism, functional selectivity or pluri-dimensional efficacy. Moreover, recent experimental results indicate a clear possibility for time-dependent bias, whereby an agonist's bias with respect to different pathways may vary dynamically. Efforts towards understanding the implications of temporal bias by characterising and quantifying ligand effects on multiple pathways will clearly be aided by extending current equilibrium binding and biased activation models to include G protein activation dynamics. Here, we present a new model of time-dependent biased agonism, based on ordinary differential equations for multiple cubic ternary complex activation models with G protein cycle dynamics. This model allows simulation and analysis of multi-pathway activation bias dynamics at a single receptor for the first time, at the level of active G protein (αGTP), towards the analysis of dynamic functional responses. The model is generally applicable to systems with NG G proteins and N* active receptor states. Numerical simulations for NG = N* = 2 reveal new insights into the effects of system parameters (including cooperativities, and ligand and receptor concentrations) on bias dynamics, highlighting new phenomena including the dynamic inter-conversion of bias direction. Further, we fit this model to 'wet' experimental data for two competing G proteins (Gi and Gs) that become activated upon stimulation of the adenosine A1 receptor with adenosine derivative compounds. Finally, we show that our model can qualitatively describe the temporal dynamics of this competing G protein activation. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
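    A heavily simplified sketch of the kind of ODE system involved, just to show the simulation pattern: a single agonist-bound receptor activates two G proteins, each with a deactivation (hydrolysis) step. The rate constants and the reduced two-pathway structure are illustrative stand-ins for the paper's cubic ternary complex models.

    ```python
    from scipy.integrate import solve_ivp

    # Illustrative rate constants (1/s): activation by agonist-bound
    # receptor and hydrolysis back to the inactive state, per pathway.
    k_act = {"Gi": 0.8, "Gs": 0.3}
    k_hyd = {"Gi": 0.2, "Gs": 0.05}

    def rhs(t, y):
        """y = [Gi_GTP, Gs_GTP]; total of each G protein normalized to 1."""
        gi, gs = y
        return [k_act["Gi"] * (1 - gi) - k_hyd["Gi"] * gi,
                k_act["Gs"] * (1 - gs) - k_hyd["Gs"] * gs]

    sol = solve_ivp(rhs, (0.0, 60.0), [0.0, 0.0])
    gi, gs = sol.y[:, -1]
    print(f"steady state: Gi_GTP={gi:.2f}, Gs_GTP={gs:.2f}, "
          f"ratio={gi/gs:.2f}")
    ```

    In the full model, bias dynamics appear because the two pathways approach their steady states on different timescales, so the ratio of active G proteins changes over time.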

  9. How will climate change affect watershed mercury export in a representative Coastal Plain watershed?

    NASA Astrophysics Data System (ADS)

    Golden, H. E.; Knightes, C. D.; Conrads, P. A.; Feaster, T.; Davis, G. M.; Benedict, S. T.; Bradley, P. M.

    2012-12-01

    Future climate change is expected to drive variations in watershed hydrological processes and water quality across a wide range of physiographic provinces, ecosystems, and spatial scales. How such shifts in climatic conditions will impact watershed mercury (Hg) dynamics and hydrologically-driven Hg transport is a significant concern. We simulate the responses of watershed hydrological and total Hg (HgT) fluxes and concentrations to a unified set of past and future climate change projections in a Coastal Plain basin using multiple watershed models. We use two statistically downscaled global precipitation and temperature models, ECHO, a hybrid of the ECHAM4 and HOPE-G models, and the Community Climate System Model (CCSM3), across two thirty-year simulations (1980 to 2010 and 2040 to 2070). We apply three watershed models to quantify and bracket potential changes in hydrologic and HgT fluxes, including the Visualizing Ecosystems for Land Management Assessment Model for Hg (VELMA-Hg), the Grid Based Mercury Model (GBMM), and TOPLOAD, a water quality constituent model linked to TOPMODEL hydrological simulations. We estimate a decrease in average annual HgT fluxes in response to climate change using the ECHO projections and an increase with the CCSM3 projections in the study watershed. Average monthly HgT fluxes increase under both climate change projections in the late spring (March through May), when HgT concentrations and flow are high. Results suggest that hydrological transport associated with changes in precipitation and temperature is the primary mechanism driving the HgT flux response to climate change. Our multiple model/multiple projection approach allows us to bracket the relative response of HgT fluxes to climate change, thereby illustrating the uncertainty associated with the projections. In addition, our approach allows us to examine potential variations in climate change-driven water and HgT export based on different conceptualizations of watershed HgT dynamics and the representative mathematical structures underpinning existing watershed Hg models.

  10. Using Monte Carlo Ray tracing to Understand the Vibrational Response of UN as Measured by Neutron Spectroscopy

    NASA Astrophysics Data System (ADS)

    Lin, J. Y. Y.; Aczel, A. A.; Abernathy, D. L.; Nagler, S. E.; Buyers, W. J. L.; Granroth, G. E.

    2014-03-01

    Recent neutron spectroscopy measurements using the ARCS and SEQUOIA time-of-flight chopper spectrometers observed an extended series of equally spaced modes in UN that are well described by quantum harmonic oscillator behavior of the N atoms. Additional contributions to the scattering are also observed. Monte Carlo ray-tracing simulations with various sample kernels have allowed us to distinguish between the response from the N oscillator scattering, contributions that arise from the U partial phonon density of states (PDOS), and all forms of multiple scattering. These simulations confirm that multiple scattering contributes an approximately Q-independent background to the spectrum at the oscillator mode positions. All three of the aforementioned contributions are necessary to accurately model the experimental data. These simulations were also used to compare the temperature dependence of the oscillator modes in SEQUOIA data to that predicted by the binary solid model. This work was sponsored by the Scientific User Facilities Division, Office of Basic Energy Sciences, U.S. Department of Energy.
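    The equally spaced modes follow from the quantum harmonic oscillator spectrum, E_n = (n + 1/2)ħω, so successive modes are separated by the constant quantum ħω. A quick check, with an illustrative oscillator energy (not the measured UN value):

    ```python
    hbar_omega_meV = 50.0  # illustrative oscillator quantum (meV)
    modes = [(n + 0.5) * hbar_omega_meV for n in range(5)]
    spacings = [b - a for a, b in zip(modes, modes[1:])]
    print(modes)     # [25.0, 75.0, 125.0, 175.0, 225.0]
    print(spacings)  # constant spacing equal to hbar*omega
    ```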

  11. Evaluation of mean climate in a chemistry-climate model simulation

    NASA Astrophysics Data System (ADS)

    Hong, S.; Park, H.; Wie, J.; Park, R.; Lee, S.; Moon, B. K.

    2017-12-01

    Incorporation of interactive chemistry is essential for understanding chemistry-climate interactions and feedback processes in climate models. Here we assess a newly developed chemistry-climate model (GRIMs-Chem), which is based on the Global/Regional Integrated Model system (GRIMs) and includes the aerosol direct effect as well as stratospheric linearized ozone chemistry (LINOZ). We ran GRIMs-Chem with observed sea surface temperatures for the period 1979-2010, and compared the simulation results with observations and with CMIP models. To measure the relative performance of our model, we define a quantitative performance metric based on the Taylor diagram. This metric allows us to assess overall skill in simulating multiple variables. Overall, our model reproduces the zonal-mean spatial patterns of temperature, horizontal wind, vertical motion, and relative humidity better than the other models. However, the model does not produce good simulations in the upper troposphere (200 hPa); it is currently unclear which model processes are responsible for this. Acknowledgements: This research was supported by the Korea Ministry of Environment (MOE) under the "Climate Change Correspondence Program."
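    For reference, a Taylor diagram summarizes a simulation against a reference field through three related statistics: the correlation, the normalized standard deviation, and the centered RMS difference. A sketch of computing them (the specific skill metric used in the paper is not given here, so this stops at the raw statistics; the fields are synthetic):

    ```python
    import numpy as np

    def taylor_stats(model, ref):
        """Correlation, normalized std. dev., and normalized centered RMS
        difference between a model field and a reference (flattened)."""
        m, r = model - model.mean(), ref - ref.mean()   # center both fields
        corr = (m * r).mean() / (m.std() * r.std())
        sigma_norm = m.std() / r.std()
        crmsd = np.sqrt(((m - r) ** 2).mean()) / r.std()
        return corr, sigma_norm, crmsd

    rng = np.random.default_rng(4)
    ref = rng.standard_normal(1000)
    model = 0.8 * ref + 0.4 * rng.standard_normal(1000)  # synthetic model
    print("R=%.2f  sigma=%.2f  cRMSD=%.2f" % taylor_stats(model, ref))
    ```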

  12. Model Selection for the Multiple Model Adaptive Algorithm for In-Flight Simulation.

    DTIC Science & Technology

    1987-12-01

    of the two models, while the other model was given a probability of approximately zero. If the probabilities were exactly one and zero for the...Figures 6-103 through 6-107. From Figure 6-103, it can be seen that the probability of the model associated with the 10,000 ft, 0.35 Mach flight con

  13. Sampling ARG of multiple populations under complex configurations of subdivision and admixture.

    PubMed

    Carrieri, Anna Paola; Utro, Filippo; Parida, Laxmi

    2016-04-01

    Simulating complex evolution scenarios of multiple populations is an important task for answering many basic questions relating to population genomics. Apart from the population samples, the underlying Ancestral Recombination Graph (ARG) is an additional important means in hypothesis checking and reconstruction studies. Furthermore, complex simulations require a plethora of interdependent parameters, making even the scenario specification highly non-trivial. We present an algorithm, SimRA, that simulates a generic multiple-population evolution model with admixture. It is based on random graphs that dramatically improve the time and space requirements of the classical single-population algorithm. Using the underlying random-graphs model, we also derive closed forms for the expected values of the ARG characteristics, i.e., the height of the graph, the number of recombinations, the number of mutations, and the population diversity, in terms of its defining parameters. This is crucial in aiding the user to specify meaningful parameters for complex scenario simulations, not through trial and error based on raw compute power but through intelligent parameter estimation. To the best of our knowledge, this is the first time closed-form expressions have been computed for the ARG properties. We show that the expected values closely match the empirical values through simulations. Finally, we demonstrate that SimRA produces the ARG in compact form without compromising accuracy, as shown through extensive experiments. SimRA (Simulation based on Random graph Algorithms) source, executable, user manual, and sample input-output sets are available for download at https://github.com/ComputationalGenomics/SimRA. CONTACT: parida@us.ibm.com. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
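    As a flavor of the closed-form expectations mentioned, in the simplest single-population case without recombination or admixture: under the standard coalescent, the expected time to the most recent common ancestor of n samples is 2(1 − 1/n) in coalescent units, which a quick simulation reproduces.

    ```python
    import random

    random.seed(5)

    def sim_tmrca(n):
        """Coalescent TMRCA: with k lineages, the waiting time to the next
        coalescence is exponential with rate k*(k-1)/2 (coalescent units)."""
        t, k = 0.0, n
        while k > 1:
            t += random.expovariate(k * (k - 1) / 2.0)
            k -= 1
        return t

    n = 10
    sims = [sim_tmrca(n) for _ in range(20000)]
    print("expected: %.3f  simulated: %.3f"
          % (2 * (1 - 1 / n), sum(sims) / len(sims)))
    ```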

  14. Improving and Understanding Climate Models: Scale-Aware Parameterization of Cloud Water Inhomogeneity and Sensitivity of MJO Simulation to Physical Parameters in a Convection Scheme

    NASA Astrophysics Data System (ADS)

    Xie, Xin

    Microphysics and convection parameterizations are two key components a climate model needs to simulate realistic climatology and variability of cloud distribution and the cycles of energy and water. When a model has varying grid size, or simulations have to be run at different resolutions, a scale-aware parameterization is desirable so that model parameters do not have to be retuned for each particular grid size. The subgrid variability of cloud hydrometeors is known to impact microphysics processes in climate models and is found to depend strongly on spatial scale. A scale-aware liquid cloud subgrid variability parameterization is derived and implemented in the Community Earth System Model (CESM) in this study, using long-term radar-based ground measurements from the Atmospheric Radiation Measurement (ARM) program. When used in the default CESM1 with the finite-volume dynamical core, where a constant liquid inhomogeneity parameter had been assumed, the newly developed parameterization reduces the cloud inhomogeneity at high latitudes and increases it at low latitudes. This is due both to the smaller grid cells at high latitudes and larger grid cells at low latitudes in the longitude-latitude grid of CESM, and to variations in atmospheric stability. Single-column model and general circulation model (GCM) sensitivity experiments show that the new parameterization increases the cloud liquid water path in polar regions and decreases it at low latitudes. The current CESM1 simulation suffers from both a Pacific double-ITCZ precipitation bias and a weak Madden-Julian oscillation (MJO). Previous studies show that convective parameterization with multiple plumes may have the capability to alleviate such biases in a more uniform and physical way. A multiple-plume mass-flux convective parameterization is used in the Community Atmosphere Model (CAM) to investigate the sensitivity of MJO simulations. We show that the MJO simulation is sensitive to the entrainment rate specification, and we find that shallow plumes can generate and sustain MJO propagation in the model.

  15. Use of multiple picosecond high-mass molecular dynamics simulations to predict crystallographic B-factors of folded globular proteins.

    PubMed

    Pang, Yuan-Ping

    2016-09-01

    Predicting the crystallographic B-factors of a protein from a conventional molecular dynamics simulation is challenging, in part because B-factors calculated by sampling the atomic positional fluctuations in a picosecond molecular dynamics simulation are unreliable, while sampling a longer simulation yields overly large root mean square deviations between calculated and experimental B-factors. This article reports improved B-factor prediction achieved by sampling the atomic positional fluctuations in multiple picosecond molecular dynamics simulations in which all atomic masses are uniformly increased 100-fold to increase time resolution. Using the third immunoglobulin-binding domain of protein G, bovine pancreatic trypsin inhibitor, ubiquitin, and lysozyme as model systems, the B-factor root mean square deviations (mean ± standard error) of these proteins were 3.1 ± 0.2 to 9 ± 1 Å² for Cα and 7.3 ± 0.9 to 9.6 ± 0.2 Å² for Cγ, when the sampling was done for each protein over 20 distinct, independent, 50-picosecond high-mass molecular dynamics simulations with AMBER forcefield FF12MC or FF14SB. These results suggest that sampling the atomic positional fluctuations in multiple picosecond high-mass molecular dynamics simulations may be conducive to a priori prediction of the crystallographic B-factors of a folded globular protein.
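    For reference, the B-factor relates to the mean-square positional fluctuation by B = (8π²/3)⟨|Δr|²⟩. A sketch of computing per-atom B-factors from a trajectory of coordinates; the synthetic Gaussian data below stands in for MD snapshots.

    ```python
    import numpy as np

    def b_factors(traj):
        """B = (8*pi^2/3) * <|r - <r>|^2> per atom.
        traj: array of shape (n_frames, n_atoms, 3), in Angstroms."""
        mean = traj.mean(axis=0)                             # average structure
        msf = ((traj - mean) ** 2).sum(axis=2).mean(axis=0)  # per-atom MSF
        return (8.0 * np.pi**2 / 3.0) * msf

    rng = np.random.default_rng(6)
    traj = rng.normal(0.0, 0.3, size=(500, 4, 3))  # 4 atoms, sigma = 0.3 A
    print(b_factors(traj))  # ~ (8*pi^2/3) * 3*sigma^2 = 7.1 A^2 each
    ```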

  16. Binary variable multiple-model multiple imputation to address missing data mechanism uncertainty: Application to a smoking cessation trial

    PubMed Central

    Siddique, Juned; Harel, Ofer; Crespi, Catherine M.; Hedeker, Donald

    2014-01-01

    The true missing data mechanism is never known in practice. We present a method for generating multiple imputations for binary variables that formally incorporates missing data mechanism uncertainty. Imputations are generated from a distribution of imputation models rather than a single model, with the distribution reflecting subjective notions of missing data mechanism uncertainty. Parameter estimates and standard errors are obtained using rules for nested multiple imputation. Using simulation, we investigate the impact of missing data mechanism uncertainty on post-imputation inferences and show that incorporating this uncertainty can increase the coverage of parameter estimates. We apply our method to a longitudinal smoking cessation trial where nonignorably missing data were a concern. Our method provides a simple approach for formalizing subjective notions regarding nonresponse and can be implemented using existing imputation software. PMID:24634315
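    For context, standard multiple-imputation combining (Rubin's rules) pools the per-imputation estimates and inflates the variance by the between-imputation spread; the nested rules used in the paper extend this across imputation models. A sketch of the basic (non-nested) rules, with made-up numbers:

    ```python
    import numpy as np

    def rubins_rules(estimates, variances):
        """Combine point estimates and variances from M imputed datasets."""
        estimates, variances = np.asarray(estimates), np.asarray(variances)
        m = len(estimates)
        qbar = estimates.mean()              # pooled point estimate
        ubar = variances.mean()              # within-imputation variance
        b = estimates.var(ddof=1)            # between-imputation variance
        total_var = ubar + (1 + 1 / m) * b
        return qbar, total_var

    est, var = rubins_rules([0.42, 0.47, 0.39, 0.45],
                            [0.010, 0.012, 0.009, 0.011])
    print(f"pooled estimate {est:.3f}, total variance {var:.4f}")
    ```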

  17. BOREAS RSS-8 BIOME-BGC Model Simulations at Tower Flux Sites in 1994

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Nickeson, Jaime (Editor); Kimball, John

    2000-01-01

    BIOME-BGC is a general ecosystem process model designed to simulate biogeochemical and hydrologic processes across multiple scales (Running and Hunt, 1993). In this investigation, BIOME-BGC was used to estimate daily water and carbon budgets for the BOREAS tower flux sites for 1994. Carbon variables estimated by the model include gross primary production (i.e., net photosynthesis), maintenance and heterotrophic respiration, net primary production, and net ecosystem carbon exchange. Hydrologic variables estimated by the model include snowcover, evaporation, transpiration, evapotranspiration, soil moisture, and outflow. The information provided by the investigation includes input initialization and model output files for various sites in tabular ASCII format.

  18. Multiple model self-tuning control for a class of nonlinear systems

    NASA Astrophysics Data System (ADS)

    Huang, Miao; Wang, Xin; Wang, Zhenlei

    2015-10-01

    This study develops a novel nonlinear multiple model self-tuning control method for a class of nonlinear discrete-time systems. An incremental system model and a modified robust adaptive law are proposed to expand the application range, thus eliminating the assumption that either the nonlinear term of the nonlinear system or its differential term is globally bounded. The nonlinear self-tuning control method can address the situation wherein the nonlinear system is not subject to globally uniformly asymptotically stable zero dynamics by incorporating a pole-placement scheme. A novel nonlinear control structure based on this scheme is presented to improve control precision. Stability and convergence can be confirmed when the proposed multiple model self-tuning control method is applied. Furthermore, simulation results demonstrate the effectiveness of the proposed method.

  19. MOAB: a spatially explicit, individual-based expert system for creating animal foraging models

    USGS Publications Warehouse

    Carter, J.; Finn, John T.

    1999-01-01

    We describe the development, structure, and corroboration process of a simulation model of animal behavior (MOAB). MOAB can create spatially explicit, individual-based animal foraging models. Users can create or replicate heterogeneous landscape patterns, and place resources and individual animals of a given species on that landscape to simultaneously simulate the foraging behavior of multiple species. The heuristic rules for animal behavior are maintained in a user-modifiable expert system. MOAB can be used to explore hypotheses concerning the influence of landscape pattern on animal movement and foraging behavior. A red fox (Vulpes vulpes L.) foraging and nest predation model was created to test MOAB's capabilities. Foxes were simulated for 30-day periods using both expert-system and random movement rules. Home range size, territory formation, and movement patterns were compared against other available simulation studies. A striped skunk (Mephitis mephitis L.) model also was developed. The expert-system model proved superior to the random-movement model with respect to territory formation, general movement patterns, and home range size.

  20. GO2OGS 1.0: a versatile workflow to integrate complex geological information with fault data into numerical simulation models

    NASA Astrophysics Data System (ADS)

    Fischer, T.; Naumov, D.; Sattler, S.; Kolditz, O.; Walther, M.

    2015-11-01

    We offer a versatile workflow to convert geological models built with the Paradigm™ GOCAD© (Geological Object Computer Aided Design) software into the open-source VTU (Visualization Toolkit unstructured grid) format for use in numerical simulation models. Tackling relevant scientific questions or engineering tasks often involves multidisciplinary approaches. Conversion workflows are needed as a means of communication between the diverse tools of the various disciplines. Our approach offers an open-source, platform-independent, robust, and comprehensible method that is potentially useful for a multitude of environmental studies. With two application examples in the Thuringian Syncline, we show how a heterogeneous geological GOCAD model including multiple layers and faults can be used for numerical groundwater flow modeling, in our case employing the OpenGeoSys open-source numerical toolbox for groundwater flow simulations. The presented workflow offers the chance to incorporate increasingly detailed data, utilizing the growing availability of computational power to simulate numerical models.

  1. Trend and uncertainty analysis of simulated climate change impacts with multiple GCMs and emission scenarios

    USDA-ARS?s Scientific Manuscript database

    Impacts of climate change on hydrology, soil erosion, and wheat production during 2010-2039 at El Reno in central Oklahoma, USA, were simulated using the Water Erosion Prediction Project (WEPP) model. Projections from four GCMs (CCSR/NIES, CGCM2, CSIRO-Mk2, and HadCM3) under three emissions scenari...

  2. A THREE-DIMENSIONAL AIR FLOW MODEL FOR SOIL VENTING: SUPERPOSITION OF ANALYTICAL FUNCTIONS

    EPA Science Inventory

    A three-dimensional computer model was developed for the simulation of the soil-air pressure distribution at steady state and specific discharge vectors during soil venting with multiple wells in unsaturated soil. The Kirchhoff transformation of dependent variables and coordinate...

  3. Optimal averaging of soil moisture predictions from ensemble land surface model simulations

    USDA-ARS?s Scientific Manuscript database

    The correct interpretation of ensemble information obtained from the parallel implementation of multiple land surface models (LSMs) requires information concerning the LSM ensemble’s mutual error covariance. Here we propose a new technique for obtaining such information using an instrumental variabl...

  4. Simulation of Plant Physiological Process Using Fuzzy Variables

    Treesearch

    Daniel L. Schmoldt

    1991-01-01

    Qualitative modelling can help us understand and project effects of multiple stresses on trees. It is not practical to collect and correlate empirical data for all combinations of plant/environments and human/climate stresses, especially for mature trees in natural settings. Therefore, a mechanistic model was developed to describe ecophysiological processes. This model...

  5. Development and comparison of multiple regression models to predict bankfull channel dimensions for use in hydrologic models

    USDA-ARS?s Scientific Manuscript database

    Bankfull hydraulic geometry relationships are used to estimate channel dimensions for streamflow simulation models, which require channel geometry data as input parameters. Often, one nationwide curve is used across the entire United States (U.S.) (e.g., in Soil and Water Assessment Tool), even tho...

  6. Slow walking model for children with multiple disabilities via an application of humanoid robot

    NASA Astrophysics Data System (ADS)

    Wang, ZeFeng; Peyrodie, Laurent; Cao, Hua; Agnani, Olivier; Watelain, Eric; Wang, HaoPing

    2016-02-01

    Walk training research with children having multiple disabilities is presented. Orthosis-assisted walking for children with multiple disabilities such as cerebral palsy continues to be a clinical and technological challenge. In order to reduce pain and improve treatment strategies, an intermediate structure - the humanoid robot NAO - is proposed as an assay platform to study walking training models, to be transferred to future special exoskeletons for children. A suitable and stable walking model is proposed for walk training and is simulated and tested on NAO. A comparative study of zero moment point (ZMP) support polygons and energy consumption validates the model as more stable than the conventional NAO gait. According to the direction variation of the center of mass and the slopes of the linear regression of knee/ankle angles, the Slow Walk model faithfully emulates the gait pattern of children.

  7. Adaptive control of a jet turboshaft engine driving a variable pitch propeller using multiple models

    NASA Astrophysics Data System (ADS)

    Ahmadian, Narjes; Khosravi, Alireza; Sarhadi, Pouria

    2017-08-01

    In this paper, a multiple model adaptive control (MMAC) method is proposed for a gas turbine engine. The model of a twin-spool turboshaft engine driving a variable pitch propeller includes various operating points. Variations in fuel flow and propeller pitch inputs produce different operating conditions which force the controller to adapt rapidly. The important operating points are the idle, cruise, and full-thrust cases across the entire flight envelope. A multi-input multi-output (MIMO) version of second-level adaptation using multiple models is developed, and a stability analysis based on the Lyapunov method is presented. The proposed method is compared with two conventional techniques: first-level adaptation and model reference adaptive control. Simulation results for the JetCat SPT5 turboshaft engine demonstrate the performance and fidelity of the proposed method.

  8. ALTERNATIVES TO HELIUM-3 FOR NEUTRON MULTIPLICITY DETECTORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ely, James H.; Siciliano, Edward R.; Swinhoe, Martyn T.

    Collaboration between the Pacific Northwest National Laboratory (PNNL) and the Los Alamos National Laboratory (LANL) is underway to evaluate neutron detection technologies that might replace the high-pressure helium (3He) tubes currently used in neutron multiplicity counters for safeguards applications. The current stockpile of 3He is diminishing and alternatives are needed for a variety of neutron detection applications including multiplicity counters. The first phase of this investigation uses a series of Monte Carlo calculations to simulate the performance of an existing neutron multiplicity counter configuration by replacing the 3He tubes in a model for that counter with candidate alternative technologies. These alternative technologies are initially placed in approximately the same configuration as the 3He tubes to establish a reference level of performance against the 3He-based system. After these reference-level results are established, the configurations of the alternative models will be further modified for performance optimization. The 3He model for these simulations is the one used by LANL to develop and benchmark the Epithermal Neutron Multiplicity Counter (ENMC) detector, as documented by H.O. Menlove, et al. in the 2004 LANL report LA-14088. The alternative technologies being evaluated are boron-trifluoride-filled proportional tubes, boron-lined tubes, and lithium-coated materials previously tested as possible replacements in portal monitor screening applications, as documented by R.T. Kouzes, et al. in the 2010 PNNL report PNNL-72544 and NIM A 623 (2010) 1035–1045. The models and methods used for these comparative calculations will be described and preliminary results shown.

  9. Use of Multi-class Empirical Orthogonal Function for Identification of Hydrogeological Parameters and Spatiotemporal Pattern of Multiple Recharges in Groundwater Modeling

    NASA Astrophysics Data System (ADS)

    Huang, C. L.; Hsu, N. S.; Yeh, W. W. G.; Hsieh, I. H.

    2017-12-01

    This study develops an innovative calibration method for regional groundwater modeling by using multi-class empirical orthogonal functions (EOFs). The developed method is an iterative approach. Prior to carrying out the iterative procedures, the groundwater storage hydrographs associated with the observation wells are calculated. The combined multi-class EOF amplitudes and EOF expansion coefficients of the storage hydrographs are then used to compute the initial guess of the temporal and spatial pattern of multiple recharges. The initial guess of the hydrogeological parameters is also assigned according to in-situ pumping experiments. The recharges include net rainfall recharge and boundary recharge, and the hydrogeological parameters are riverbed leakage conductivity, horizontal hydraulic conductivity, vertical hydraulic conductivity, storage coefficient, and specific yield. The first step of the iterative algorithm is to run the numerical model (i.e., MODFLOW) with the initial guess or adjusted values of the recharges and parameters. Second, in order to determine the best EOF combination of the error storage hydrographs for determining the correction vectors, the objective function is devised as minimizing the root mean square error (RMSE) of the simulated storage hydrographs. The error storage hydrographs are the differences between the storage hydrographs computed from observed and simulated groundwater level fluctuations. Third, the values of recharges and parameters are adjusted and the iterative procedures repeated until the stopping criterion is reached. The established methodology was applied to the groundwater system of Ming-Chu Basin, Taiwan. The study period is from January 1st to December 2nd, 2012. Results showed that the optimal EOF combination for the multiple recharges and hydrogeological parameters can decrease the RMSE of the simulated storage hydrographs dramatically within three calibration iterations. This indicates that the iterative approach using EOF techniques can capture the groundwater flow tendency and detect the correction vectors of the simulated error sources. Hence, the established EOF-based methodology can effectively and accurately identify the multiple recharges and hydrogeological parameters.
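
    The EOF decomposition at the core of such a method can be sketched with an SVD of the anomaly matrix of storage hydrographs (time by wells): the right singular vectors give the spatial patterns and the scaled left singular vectors give the temporal expansion coefficients. The synthetic hydrographs below are illustrative assumptions, not the Ming-Chu Basin data.

        # Minimal sketch of an EOF decomposition of storage hydrographs.
        import numpy as np

        rng = np.random.default_rng(0)
        t = np.arange(365)
        storage = (np.outer(np.sin(2 * np.pi * t / 365), [1.0, 0.8, 0.5])
                   + 0.1 * rng.standard_normal((365, 3)))   # time x wells

        anom = storage - storage.mean(axis=0)                # remove temporal mean
        u, s, vt = np.linalg.svd(anom, full_matrices=False)

        eofs = vt                       # rows: spatial patterns over the wells
        pcs = u * s                     # columns: expansion coefficients in time
        var_frac = s**2 / np.sum(s**2)  # variance explained by each mode
        print("variance explained:", np.round(var_frac, 3))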

  10. Epidemic Spreading in a Multi-compartment System

    NASA Astrophysics Data System (ADS)

    Gao, Zong-Mao; Gu, Jiao; Li, Wei

    2012-02-01

    We introduce the variant rate and white noise into the susceptible-infected-removed (SIR) model for epidemics, discuss the epidemic dynamics of a multiple-compartment system, and describe this system by using master equations. For both the local epidemic spreading system and the whole multiple-compartment system, we find that a threshold could be useful in forecasting when the epidemic vanishes. Furthermore, numerical simulations show that a model with the variant infection rate and white noise can improve fitting with real SARS data.
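
    A minimal sketch of such a model, assuming a periodically varying infection rate and a white-noise perturbation integrated with the Euler-Maruyama scheme, is given below. The functional form of beta(t), the noise amplitude, and all rate constants are illustrative assumptions.

        # Minimal sketch of an SIR model with variant rate and white noise.
        import numpy as np

        rng = np.random.default_rng(1)
        N, dt, steps = 1.0, 0.1, 2000
        S, I, R = 0.99, 0.01, 0.0
        gamma, sigma = 0.1, 0.02

        def beta(t):                    # assumed time-varying infection rate
            return 0.3 * (1 + 0.5 * np.cos(2 * np.pi * t / 50))

        for k in range(steps):
            t = k * dt
            dW = np.sqrt(dt) * rng.standard_normal()
            new_inf = beta(t) * S * I / N * dt + sigma * S * I * dW
            new_rec = gamma * I * dt
            S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec

        print(f"final fractions  S={S:.3f}  I={I:.3f}  R={R:.3f}")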

  11. Simulator for beam-based LHC collimator alignment

    NASA Astrophysics Data System (ADS)

    Valentino, Gianluca; Aßmann, Ralph; Redaelli, Stefano; Sammut, Nicholas

    2014-02-01

    In the CERN Large Hadron Collider, collimators need to be set up to form a multistage hierarchy to ensure efficient multiturn cleaning of halo particles. Automatic algorithms were introduced during the first run to reduce the beam time required for beam-based setup, improve the alignment accuracy, and reduce the risk of human errors. Simulating the alignment procedure would allow for off-line tests of alignment policies and algorithms. A simulator was developed based on a diffusion beam model to generate the characteristic beam loss signal spike and decay produced when a collimator jaw touches the beam, which is observed in a beam loss monitor (BLM). Empirical models derived from the available measurement data are used to simulate the steady-state beam loss and crosstalk between multiple BLMs. The simulator design is presented, together with simulation results and comparison to measurement data.

  12. iTOUGH2: A multiphysics simulation-optimization framework for analyzing subsurface systems

    NASA Astrophysics Data System (ADS)

    Finsterle, S.; Commer, M.; Edmiston, J. K.; Jung, Y.; Kowalsky, M. B.; Pau, G. S. H.; Wainwright, H. M.; Zhang, Y.

    2017-11-01

    iTOUGH2 is a simulation-optimization framework for the TOUGH suite of nonisothermal multiphase flow models and related simulators of geophysical, geochemical, and geomechanical processes. After appropriate parameterization of subsurface structures and their properties, iTOUGH2 runs simulations for multiple parameter sets and analyzes the resulting output for parameter estimation through automatic model calibration, local and global sensitivity analyses, data-worth analyses, and uncertainty propagation analyses. Development of iTOUGH2 is driven by scientific challenges and user needs, with new capabilities continually added to both the forward simulator and the optimization framework. This review article provides a summary description of methods and features implemented in iTOUGH2, and discusses the usefulness and limitations of an integrated simulation-optimization workflow in support of the characterization and analysis of complex multiphysics subsurface systems.
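
    The inverse-modeling loop at the heart of such a framework can be illustrated with a toy example: a forward model is run for trial parameter sets and a least-squares optimizer adjusts the parameters until simulated observations match the data. The exponential-decay "forward model" and noise level below are illustrative assumptions, not TOUGH2 physics or the iTOUGH2 code.

        # Minimal sketch of simulation-optimization parameter estimation.
        import numpy as np
        from scipy.optimize import least_squares

        rng = np.random.default_rng(8)
        t = np.linspace(0, 10, 40)

        def forward(params, t):
            # Toy "simulator": amplitude and decay rate of a pressure signal.
            amp, rate = params
            return amp * np.exp(-rate * t)

        obs = forward([2.0, 0.4], t) + 0.02 * rng.standard_normal(t.size)

        def residuals(params):
            return forward(params, t) - obs

        fit = least_squares(residuals, x0=[1.0, 1.0])
        print("estimated parameters:", np.round(fit.x, 3))   # ~ [2.0, 0.4]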

  13. Adjustment of Measurements with Multiplicative Errors: Error Analysis, Estimates of the Variance of Unit Weight, and Effect on Volume Estimation from LiDAR-Type Digital Elevation Models

    PubMed Central

    Shi, Yun; Xu, Peiliang; Peng, Junhuan; Shi, Chuang; Liu, Jingnan

    2014-01-01

    Modern observation technology has verified that measurement errors can be proportional to the true values of measurements such as GPS, VLBI baselines and LiDAR. Observational models of this type are called multiplicative error models. This paper extends the work of Xu and Shimada, published in 2000, on multiplicative error models to the analytical error analysis of quantities of practical interest and estimates of the variance of unit weight. We analytically derive the variance-covariance matrices of the three least squares (LS) adjustments, the adjusted measurements and the corrections of measurements in multiplicative error models. For quality evaluation, we construct five estimators for the variance of unit weight in association with the three LS adjustment methods. Although LiDAR measurements are contaminated with multiplicative random errors, LiDAR-based digital elevation models (DEM) have been constructed as if the errors were additive. We will simulate a model landslide, which is assumed to be surveyed with LiDAR, and investigate the effect of LiDAR-type multiplicative error measurements on DEM construction and its effect on the estimate of landslide mass volume from the constructed DEM. PMID:24434880
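
    The practical difference between treating such errors as additive or multiplicative can be sketched with a weighted least-squares fit: when the error variance grows with the signal, weights proportional to the inverse squared observations down-weight the noisy large values. The linear model and 5% noise level are illustrative assumptions, not the paper's estimators.

        # Minimal sketch: ordinary vs. weighted LS under multiplicative errors.
        import numpy as np

        rng = np.random.default_rng(2)
        x = np.linspace(1, 10, 50)
        truth = 2.5 * x
        y = truth * (1 + 0.05 * rng.standard_normal(x.size))  # multiplicative noise

        A = x[:, None]
        # Ordinary LS: assumes additive errors of equal variance.
        beta_ols = np.linalg.lstsq(A, y, rcond=None)[0]
        # Weighted LS: variance grows with the signal, so weight by ~1/y^2.
        w = 1.0 / y**2
        beta_wls = np.linalg.lstsq(A * np.sqrt(w)[:, None], y * np.sqrt(w),
                                   rcond=None)[0]

        print("OLS slope:", beta_ols, " WLS slope:", beta_wls, " true slope: 2.5")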

  14. A flexible object-oriented software framework for developing complex multimedia simulations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sydelko, P. J.; Dolph, J. E.; Christiansen, J. H.

    Decision makers involved in brownfields redevelopment and long-term stewardship must consider environmental conditions, future-use potential, site ownership, area infrastructure, funding resources, cost recovery, regulations, risk and liability management, community relations, and expected return on investment in a comprehensive and integrated fashion to achieve desired results. Successful brownfields redevelopment requires the ability to assess the impacts of redevelopment options on multiple interrelated aspects of the ecosystem, both natural and societal. Computer-based tools, such as simulation models, databases, and geographical information systems (GISs) can be used to address brownfields planning and project execution. The transparent integration of these tools into a comprehensive and dynamic decision support system would greatly enhance the brownfields assessment process. Such a system needs to be able to adapt to shifting and expanding analytical requirements and contexts. The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-oriented framework for developing and maintaining complex multidisciplinary simulations of a wide variety of application domains. The modeling domain of a specific DIAS-based simulation is determined by (1) software objects that represent the real-world entities that comprise the problem space (atmosphere, watershed, human), and (2) simulation models and other data processing applications that express the dynamic behaviors of the domain entities. Models and applications used to express dynamic behaviors can be either internal or external to DIAS, including existing legacy models written in various languages (FORTRAN, C, etc.). The flexible design framework of DIAS makes the objects adjustable to the context of the problem without a great deal of recoding. The DIAS Spatial Data Set facility allows parameters to vary spatially depending on the simulation context according to any of a number of 1-D, 2-D, or 3-D topologies. DIAS is also capable of interacting with other GIS packages and can import many standard spatial data formats. DIAS simulation capabilities can also be extended by including societal process models. Models that implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations allow for interaction and feedback among natural and societal processes. The ability to simulate the complex interplay of multimedia processes makes DIAS a promising tool for constructing applications for comprehensive community planning, including the assessment of multiple development and redevelopment scenarios.

  15. Development of a resource modelling tool to support decision makers in pandemic influenza preparedness: The AsiaFluCap Simulator

    PubMed Central

    2012-01-01

    Background Health care planning for pandemic influenza is a challenging task which requires predictive models by which the impact of different response strategies can be evaluated. However, current preparedness plans and simulation exercises, as well as freely available simulation models previously made for policy makers, do not explicitly address the availability of health care resources or determine the impact of shortages on public health. Nevertheless, the feasibility of health systems to implement response measures or interventions described in plans and trained in exercises depends on the available resource capacity. As part of the AsiaFluCap project, we developed a comprehensive and flexible resource modelling tool to support public health officials in understanding and preparing for surges in resource demand during future pandemics. Results The AsiaFluCap Simulator is a combination of a resource model containing 28 health care resources and an epidemiological model. The tool was built in MS Excel© and contains a user-friendly interface which allows users to select mild or severe pandemic scenarios, change resource parameters and run simulations for one or multiple regions. Besides epidemiological estimations, the simulator provides indications on resource gaps or surpluses, and the impact of shortages on public health for each selected region. It allows for a comparative analysis of the effects of resource availability and consequences of different strategies of resource use, which can provide guidance on resource prioritising and/or mobilisation. Simulation results are displayed in various tables and graphs, and can also be easily exported to GIS software to create maps for geographical analysis of the distribution of resources. Conclusions The AsiaFluCap Simulator is freely available software (http://www.cdprg.org) which can be used by policy makers, policy advisors, donors and other stakeholders involved in preparedness, by providing evidence-based and illustrative information on health care resource capacities during future pandemics. The tool can inform both preparedness plans and simulation exercises and can help increase the general understanding of dynamics in resource capacities during a pandemic. The combination of a mathematical model with multiple resources and the linkage to GIS for creating maps makes the tool unique compared to other available software. PMID:23061807

  16. Analysis of in vitro fertilization data with multiple outcomes using discrete time-to-event analysis

    PubMed Central

    Maity, Arnab; Williams, Paige; Ryan, Louise; Missmer, Stacey; Coull, Brent; Hauser, Russ

    2014-01-01

    In vitro fertilization (IVF) is an increasingly common method of assisted reproductive technology. Because of the careful observation and followup required as part of the procedure, IVF studies provide an ideal opportunity to identify and assess clinical and demographic factors along with environmental exposures that may impact successful reproduction. A major challenge in analyzing data from IVF studies is handling the complexity and multiplicity of outcomes, resulting from both multiple opportunities for pregnancy loss within a single IVF cycle and multiple IVF cycles. To date, most evaluations of IVF studies do not make full use of the data due to their complex structure. In this paper, we develop statistical methodology for analysis of IVF data with multiple cycles and possibly multiple failure types observed for each individual. We develop a general analysis framework based on a generalized linear modeling formulation that allows implementation of various types of models including shared frailty models, failure specific frailty models, and transitional models, using standard software. We apply our methodology to data from an IVF study conducted at the Brigham and Women’s Hospital, Massachusetts. We also summarize the performance of our proposed methods based on a simulation study. PMID:24317880
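
    The discrete time-to-event idea behind such analyses can be sketched by expanding each subject into one row per at-risk cycle and fitting a logistic model to the per-cycle event probability. The data-generating values, the single covariate, and the constant hazard are illustrative assumptions, not the Brigham and Women's Hospital data or the paper's frailty models.

        # Minimal sketch of discrete-time survival via person-period logistic fit.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(9)
        n, max_cycles = 300, 4
        x = rng.standard_normal(n)                 # e.g., an exposure covariate

        rows, events = [], []
        for i in range(n):
            for cycle in range(1, max_cycles + 1):
                p = 1 / (1 + np.exp(-(-1.0 + 0.5 * x[i])))   # per-cycle hazard
                event = rng.random() < p
                rows.append([1.0, cycle, x[i]])    # intercept, period, covariate
                events.append(int(event))
                if event:
                    break                          # subject exits the risk set

        X, y = np.array(rows), np.array(events)
        fit = sm.Logit(y, X).fit(disp=0)
        print("coefficients (intercept, cycle, x):", np.round(fit.params, 2))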

  17. AgMIP: Next Generation Models and Assessments

    NASA Astrophysics Data System (ADS)

    Rosenzweig, C.

    2014-12-01

    Next steps in developing next-generation crop models fall into several categories: significant improvements in simulation of important crop processes and responses to stress; extension from simplified crop models to complex cropping systems models; and scaling up from site-based models to landscape, national, continental, and global scales. Crop processes that require major leaps in understanding and simulation in order to narrow uncertainties around how crops will respond to changing atmospheric conditions include genetics; carbon, temperature, water, and nitrogen; ozone; and nutrition. The field of crop modeling has been built on a single crop-by-crop approach. It is now time to create a new paradigm, moving from 'crop' to 'cropping system.' A first step is to set up the simulation technology so that modelers can rapidly incorporate multiple crops within fields, and multiple crops over time. Then the response of these more complex cropping systems can be tested under different sustainable intensification management strategies utilizing the updated simulation environments. Model improvements for diseases, pests, and weeds include developing process-based models for important diseases, frameworks for coupling air-borne diseases to crop models, gathering significantly more data on crop impacts, and enabling the evaluation of pest management strategies. Most smallholder farming in the world involves integrated crop-livestock systems that cannot be represented by crop modeling alone. Thus, next-generation cropping system models need to include key linkages to livestock. Livestock linkages to be incorporated include growth and productivity models for grasslands and rangelands as well as the usual annual crops. There are several approaches for scaling up, including use of gridded models and development of simpler quasi-empirical models for landscape-scale analysis. On the assessment side, AgMIP is leading a community process for coordinated contributions to IPCC AR6 that involves the key modeling groups from around the world including North America, Europe, South America, Sub-Saharan Africa, South Asia, East Asia, and Australia and Oceania. This community process will lead to mutually agreed protocols for coordinated global and regional assessments.

  18. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages.

    PubMed

    Kim, Yoonsang; Choi, Young-Ku; Emery, Sherry

    2013-08-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages, SAS GLIMMIX Laplace and SuperMix Gaussian quadrature, perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes.
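
    The kind of design such a simulation study considers can be sketched by generating two-level logistic data with three correlated random effects (a random intercept and two random slopes) drawn from a non-diagonal covariance matrix. All parameter values below are illustrative assumptions, not the study's simulation settings.

        # Minimal sketch: two-level logistic data with correlated random effects.
        import numpy as np

        rng = np.random.default_rng(3)
        n_groups, n_per = 100, 20
        # Covariance of the random effects: correlated, not diagonal.
        G = np.array([[1.0, 0.3, 0.2],
                      [0.3, 0.5, 0.1],
                      [0.2, 0.1, 0.5]])
        b = rng.multivariate_normal(np.zeros(3), G, size=n_groups)
        beta = np.array([-0.5, 1.0, -0.8])          # fixed effects

        rows = []
        for g in range(n_groups):
            X = np.column_stack([np.ones(n_per),
                                 rng.standard_normal(n_per),
                                 rng.standard_normal(n_per)])
            eta = X @ (beta + b[g])                 # group-specific coefficients
            y = rng.binomial(1, 1 / (1 + np.exp(-eta)))
            rows.append((X, y))

        print("overall event rate:",
              np.mean(np.concatenate([y for _, y in rows])))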

  19. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages

    PubMed Central

    Kim, Yoonsang; Emery, Sherry

    2013-01-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods’ performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages—SAS GLIMMIX Laplace and SuperMix Gaussian quadrature—perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes. PMID:24288415

  20. Design of a multi-agent hydroeconomic model to simulate a complex human-water system: Early insights from the Jordan Water Project

    NASA Astrophysics Data System (ADS)

    Yoon, J.; Klassert, C. J. A.; Lachaut, T.; Selby, P. D.; Knox, S.; Gorelick, S.; Rajsekhar, D.; Tilmant, A.; Avisse, N.; Harou, J. J.; Gawel, E.; Klauer, B.; Mustafa, D.; Talozi, S.; Sigel, K.

    2015-12-01

    Our work focuses on development of a multi-agent, hydroeconomic model for purposes of water policy evaluation in Jordan. The model adopts a modular approach, integrating biophysical modules that simulate natural and engineered phenomena with human modules that represent behavior at multiple levels of decision making. The hydrologic modules are developed using spatially-distributed groundwater and surface water models, which are translated into compact simulators for efficient integration into the multi-agent model. For the groundwater model, we adopt a response matrix method approach in which a 3-dimensional MODFLOW model of a complex regional groundwater system is converted into a linear simulator of groundwater response by pre-processing drawdown results from several hundred numerical simulation runs. Surface water models for each major surface water basin in the country are developed in SWAT and similarly translated into simple rainfall-runoff functions for integration with the multi-agent model. The approach balances physically-based, spatially-explicit representation of hydrologic systems with the efficiency required for integration into a complex multi-agent model that is computationally amenable to robust scenario analysis. For the multi-agent model, we explicitly represent human agency at multiple levels of decision making, with agents representing riparian, management, supplier, and water user groups. The agents' decision making models incorporate both rule-based heuristics as well as economic optimization. The model is programmed in Python using Pynsim, a generalizable, open-source object-oriented code framework for modeling network-based water resource systems. The Jordan model is one of the first applications of Pynsim to a real-world water management case study. Preliminary results from a tanker market scenario run through year 2050 are presented in which several salient features of the water system are investigated: competition between urban and private farmer agents, the emergence of a private tanker market, disparities in economic wellbeing to different user groups caused by unique supply conditions, and response of the complex system to various policy interventions.
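
    The response matrix idea mentioned above can be sketched as linear superposition: unit drawdown responses pre-computed with the numerical model are scaled by the agents' pumping rates and summed, so the multi-agent model never calls MODFLOW directly. The synthetic unit responses and rates below are illustrative stand-ins for the pre-processed simulation runs.

        # Minimal sketch of the response matrix (superposition) method.
        import numpy as np

        n_obs, n_wells, n_t = 4, 3, 120
        t = np.arange(1, n_t + 1)

        # R[i, j, t]: drawdown at observation point i from unit pumping at
        # well j, t steps after pumping starts (from MODFLOW runs in practice).
        R = np.stack([[0.1 * (1 - np.exp(-t / (20 + 10 * i + 5 * j)))
                       for j in range(n_wells)] for i in range(n_obs)])

        q = np.array([50.0, 20.0, 35.0])         # pumping rates chosen by agents

        drawdown = np.einsum("ijt,j->it", R, q)  # linear superposition
        print("drawdown at final step:", np.round(drawdown[:, -1], 2))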

  1. Modeling techniques for quantum cascade lasers

    NASA Astrophysics Data System (ADS)

    Jirauschek, Christian; Kubis, Tillmann

    2014-03-01

    Quantum cascade lasers are unipolar semiconductor lasers covering a wide range of the infrared and terahertz spectrum. Lasing action is achieved by using optical intersubband transitions between quantized states in specifically designed multiple-quantum-well heterostructures. A systematic improvement of quantum cascade lasers with respect to operating temperature, efficiency, and spectral range requires detailed modeling of the underlying physical processes in these structures. Moreover, the quantum cascade laser constitutes a versatile model device for the development and improvement of simulation techniques in nano- and optoelectronics. This review provides a comprehensive survey and discussion of the modeling techniques used for the simulation of quantum cascade lasers. The main focus is on the modeling of carrier transport in the nanostructured gain medium, while the simulation of the optical cavity is covered at a more basic level. Specifically, the transfer matrix and finite difference methods for solving the one-dimensional Schrödinger equation and Schrödinger-Poisson system are discussed, providing the quantized states in the multiple-quantum-well active region. The modeling of the optical cavity is covered with a focus on basic waveguide resonator structures. Furthermore, various carrier transport simulation methods are discussed, ranging from basic empirical approaches to advanced self-consistent techniques. The methods include empirical rate equation and related Maxwell-Bloch equation approaches, self-consistent rate equation and ensemble Monte Carlo methods, as well as quantum transport approaches, in particular the density matrix and non-equilibrium Green's function formalism. The derived scattering rates and self-energies are generally valid for n-type devices based on one-dimensional quantum confinement, such as quantum well structures.
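
    The finite difference solution of the one-dimensional Schrödinger equation mentioned above can be sketched for a multiple-quantum-well potential in dimensionless units (hbar = m = 1). The well geometry is an illustrative assumption; real QCL modeling also requires the coupled Poisson equation and material-dependent effective masses.

        # Minimal sketch: 1-D Schrödinger eigenstates by finite differences.
        import numpy as np

        n, L = 1000, 60.0
        x = np.linspace(0, L, n)
        dx = x[1] - x[0]

        V = np.ones(n)                       # barrier level
        for x0 in (15.0, 25.0, 35.0):        # three coupled wells
            V[(x > x0) & (x < x0 + 5.0)] = 0.0

        # Tridiagonal Hamiltonian: -(1/2) d^2/dx^2 + V
        main = 1.0 / dx**2 + V
        off = -0.5 / dx**2 * np.ones(n - 1)
        H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

        E, psi = np.linalg.eigh(H)           # energies and wavefunctions
        print("lowest subband energies:", np.round(E[:4], 4))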

  2. Modeling techniques for quantum cascade lasers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jirauschek, Christian; Kubis, Tillmann

    2014-03-15

    Quantum cascade lasers are unipolar semiconductor lasers covering a wide range of the infrared and terahertz spectrum. Lasing action is achieved by using optical intersubband transitions between quantized states in specifically designed multiple-quantum-well heterostructures. A systematic improvement of quantum cascade lasers with respect to operating temperature, efficiency, and spectral range requires detailed modeling of the underlying physical processes in these structures. Moreover, the quantum cascade laser constitutes a versatile model device for the development and improvement of simulation techniques in nano- and optoelectronics. This review provides a comprehensive survey and discussion of the modeling techniques used for the simulation of quantum cascade lasers. The main focus is on the modeling of carrier transport in the nanostructured gain medium, while the simulation of the optical cavity is covered at a more basic level. Specifically, the transfer matrix and finite difference methods for solving the one-dimensional Schrödinger equation and Schrödinger-Poisson system are discussed, providing the quantized states in the multiple-quantum-well active region. The modeling of the optical cavity is covered with a focus on basic waveguide resonator structures. Furthermore, various carrier transport simulation methods are discussed, ranging from basic empirical approaches to advanced self-consistent techniques. The methods include empirical rate equation and related Maxwell-Bloch equation approaches, self-consistent rate equation and ensemble Monte Carlo methods, as well as quantum transport approaches, in particular the density matrix and non-equilibrium Green's function formalism. The derived scattering rates and self-energies are generally valid for n-type devices based on one-dimensional quantum confinement, such as quantum well structures.

  3. Monte Carlo based statistical power analysis for mediation models: methods and software.

    PubMed

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
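
    The general idea behind such tools (this is a generic sketch, not the bmem code, which is an R package) is a Monte Carlo loop: simulate data from an assumed mediation model, apply a percentile bootstrap test to the indirect effect a*b in each replication, and report the rejection rate as power. Effect sizes, sample size, and replication counts below are illustrative assumptions, and the estimate is deliberately coarse.

        # Minimal sketch: bootstrap-based power for a simple mediation model.
        import numpy as np

        rng = np.random.default_rng(5)
        a, b, n = 0.3, 0.3, 100
        n_rep, n_boot = 100, 300

        hits = 0
        for _ in range(n_rep):
            x = rng.standard_normal(n)
            m = a * x + rng.standard_normal(n)
            y = b * m + rng.standard_normal(n)
            ab = np.empty(n_boot)
            for i in range(n_boot):
                idx = rng.integers(0, n, n)
                xs, ms, ys = x[idx], m[idx], y[idx]
                a_hat = np.polyfit(xs, ms, 1)[0]
                # b path: slope of m in the regression of y on m and x.
                b_hat = np.linalg.lstsq(
                    np.column_stack([ms, xs, np.ones(n)]), ys, rcond=None)[0][0]
                ab[i] = a_hat * b_hat
            lo, hi = np.percentile(ab, [2.5, 97.5])
            hits += (lo > 0) or (hi < 0)   # CI excludes zero: effect detected

        print(f"estimated power: {hits / n_rep:.2f}")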

  4. Targeted numerical simulations of binary black holes for GW170104

    NASA Astrophysics Data System (ADS)

    Healy, J.; Lange, J.; O'Shaughnessy, R.; Lousto, C. O.; Campanelli, M.; Williamson, A. R.; Zlochower, Y.; Calderón Bustillo, J.; Clark, J. A.; Evans, C.; Ferguson, D.; Ghonge, S.; Jani, K.; Khamesra, B.; Laguna, P.; Shoemaker, D. M.; Boyle, M.; García, A.; Hemberger, D. A.; Kidder, L. E.; Kumar, P.; Lovelace, G.; Pfeiffer, H. P.; Scheel, M. A.; Teukolsky, S. A.

    2018-03-01

    In response to LIGO's observation of GW170104, we performed a series of full numerical simulations of binary black holes, each designed to replicate likely realizations of its dynamics and radiation. These simulations have been performed at multiple resolutions and with two independent techniques to solve Einstein's equations. For the nonprecessing and precessing simulations, we demonstrate the two techniques agree mode by mode, at a precision substantially in excess of statistical uncertainties in LIGO's current observations. Conversely, we demonstrate our full numerical solutions contain information which is not accurately captured with the approximate phenomenological models commonly used to infer compact binary parameters. To quantify the impact of these differences on parameter inference for GW170104 specifically, we compare the predictions of our simulations and these approximate models to LIGO's observations of GW170104.

  5. The Mediated MIMIC Model for Understanding the Underlying Mechanism of DIF.

    PubMed

    Cheng, Ying; Shao, Can; Lathrop, Quinn N

    2016-02-01

    Due to its flexibility, the multiple-indicator, multiple-causes (MIMIC) model has become an increasingly popular method for the detection of differential item functioning (DIF). In this article, we propose the mediated MIMIC model method to uncover the underlying mechanism of DIF. This method extends the usual MIMIC model by including one variable or multiple variables that may completely or partially mediate the DIF effect. If complete mediation effect is found, the DIF effect is fully accounted for. Through our simulation study, we find that the mediated MIMIC model is very successful in detecting the mediation effect that completely or partially accounts for DIF, while keeping the Type I error rate well controlled for both balanced and unbalanced sample sizes between focal and reference groups. Because it is successful in detecting such mediation effects, the mediated MIMIC model may help explain DIF and give guidance in the revision of a DIF item.

  6. The Mediated MIMIC Model for Understanding the Underlying Mechanism of DIF

    PubMed Central

    Cheng, Ying; Shao, Can; Lathrop, Quinn N.

    2015-01-01

    Due to its flexibility, the multiple-indicator, multiple-causes (MIMIC) model has become an increasingly popular method for the detection of differential item functioning (DIF). In this article, we propose the mediated MIMIC model method to uncover the underlying mechanism of DIF. This method extends the usual MIMIC model by including one variable or multiple variables that may completely or partially mediate the DIF effect. If complete mediation effect is found, the DIF effect is fully accounted for. Through our simulation study, we find that the mediated MIMIC model is very successful in detecting the mediation effect that completely or partially accounts for DIF, while keeping the Type I error rate well controlled for both balanced and unbalanced sample sizes between focal and reference groups. Because it is successful in detecting such mediation effects, the mediated MIMIC model may help explain DIF and give guidance in the revision of a DIF item.

  7. Distribution of model uncertainty across multiple data streams

    NASA Astrophysics Data System (ADS)

    Wutzler, Thomas

    2014-05-01

    When confronting biogeochemical models with a diversity of observational data streams, we are faced with the problem of weighting the data streams. Without weighting or multiple blocked cost functions, model uncertainty is allocated to the sparse data streams, and possible bias in processes that are strongly constrained is exported to processes that are constrained only by sparse data streams. In this study we propose an approach that aims at making model uncertainty a factor of observation uncertainty that is constant over all data streams. Further, we propose an implementation based on Markov chain Monte Carlo sampling combined with simulated annealing that is able to determine this variance factor. The method is exemplified both with very simple models and artificial data and with an inversion of the DALEC ecosystem carbon model against multiple observations of Howland Forest. We argue that the presented approach can help, and perhaps resolve, the problem of bias export to sparse data streams.

  8. Selection of latent variables for multiple mixed-outcome models

    PubMed Central

    ZHOU, LING; LIN, HUAZHEN; SONG, XINYUAN; LI, YI

    2014-01-01

    Latent variable models have been widely used for modeling the dependence structure of multiple outcomes data. However, the formulation of a latent variable model is often unknown a priori, and misspecification will distort the dependence structure and lead to unreliable model inference. Moreover, multiple outcomes with varying types present enormous analytical challenges. In this paper, we present a class of general latent variable models that can accommodate mixed types of outcomes. We propose a novel selection approach that simultaneously selects latent variables and estimates parameters. We show that the proposed estimator is consistent, asymptotically normal and has the oracle property. The practical utility of the methods is confirmed via simulations as well as an application to the analysis of the World Values Survey, a global research project that explores people's values and beliefs and the social and personal characteristics that might influence them. PMID:27642219

  9. LDRD project final report : hybrid AI/cognitive tactical behavior framework for LVC.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Djordjevich, Donna D.; Xavier, Patrick Gordon; Brannon, Nathan Gregory

    This Lab-Directed Research and Development (LDRD) project sought to develop technology that enhances scenario construction speed, entity behavior robustness, and scalability in Live-Virtual-Constructive (LVC) simulation. We investigated issues in both simulation architecture and behavior modeling. We developed path-planning technology that improves the ability to express intent in the planning task while still permitting an efficient search algorithm. An LVC simulation demonstrated how this enables 'one-click' layout of squad tactical paths, as well as dynamic re-planning for simulated squads and for real and simulated mobile robots. We identified human response latencies that can be exploited in parallel/distributed architectures. We did an experimental study to determine where parallelization would be productive in Umbra-based force-on-force (FOF) simulations. We developed and implemented a data-driven simulation composition approach that solves entity class hierarchy issues and supports assurance of simulation fairness. Finally, we proposed a flexible framework to enable integration of multiple behavior modeling components that model working memory phenomena with different degrees of sophistication.

  10. Solar Occultation Retrieval Algorithm Development

    NASA Technical Reports Server (NTRS)

    Lumpe, Jerry D.

    2004-01-01

    This effort addresses the comparison and validation of currently operational solar occultation retrieval algorithms, and the development of generalized algorithms for future application to multiple platforms. Work to date includes initial development of generalized forward model algorithms capable of simulating transmission data from the POAM II/III and SAGE II/III instruments. Work in the 2nd quarter will focus on completion of forward model algorithms, including accurate spectral characteristics for all instruments, and comparison of simulated transmission data with actual level 1 instrument data for specific occultation events.

  11. A new fault diagnosis algorithm for AUV cooperative localization system

    NASA Astrophysics Data System (ADS)

    Shi, Hongyang; Miao, Zhiyong; Zhang, Yi

    2017-10-01

    Cooperative localization of multiple AUVs is a new kind of underwater positioning technology that not only improves positioning accuracy but also has many advantages a single AUV does not have. It is necessary to detect and isolate faults to increase the reliability and availability of the AUV cooperative localization system. In this paper, the Extended Multiple Model Adaptive Cubature Kalman Filter (EMMACKF) method is presented to detect faults. Sensor failures are simulated based on off-line experimental data. Experimental results have shown that faulty apparatus can be diagnosed effectively using the proposed method. Compared with the Multiple Model Adaptive Extended Kalman Filter and the Multi-Model Adaptive Unscented Kalman Filter, both accuracy and timeliness have been improved to some extent.

  12. An Investigation of Jogging Biomechanics using the Full-Body Lumbar Spine Model: Model Development and Validation

    PubMed Central

    Raabe, Margaret E.; Chaudhari, Ajit M.W.

    2016-01-01

    The ability of a biomechanical simulation to produce results that can translate to real-life situations is largely dependent on the physiological accuracy of the musculoskeletal model. There are a limited number of freely-available, full-body models that exist in OpenSim, and those that do exist are very limited in terms of trunk musculature and degrees of freedom in the spine. Properly modeling the motion and musculature of the trunk is necessary to most accurately estimate lower extremity and spinal loading. The objective of this study was to develop and validate a more physiologically accurate OpenSim full-body model. By building upon three previously developed OpenSim models, the Full-Body Lumbar Spine (FBLS) model, comprised of 21 segments, 30 degrees-of-freedom, and 324 musculotendon actuators, was developed. The five lumbar vertebrae were modeled as individual bodies, and coupled constraints were implemented to describe the net motion of the spine. The eight major muscle groups of the lumbar spine were modeled (rectus abdominis, external and internal obliques, erector spinae, multifidus, quadratus lumborum, psoas major, and latissimus dorsi), and many of these muscle groups were modeled as multiple fascicles allowing the large muscles to act in multiple directions. The resulting FBLS model's trunk muscle geometry, maximal isometric joint moments, and simulated muscle activations compare well to experimental data. The FBLS model will be made freely available (https://simtk.org/home/fullbodylumbar) for others to perform additional analyses and develop simulations investigating full-body dynamics and contributions of the trunk muscles to dynamic tasks. PMID:26947033

  13. Simulating the complex output of rainfall and hydrological processes using the information contained in large data sets: the Direct Sampling approach.

    NASA Astrophysics Data System (ADS)

    Oriani, Fabio

    2017-04-01

    The unpredictable nature of rainfall makes its estimation as difficult as it is essential to hydrological applications. Stochastic simulation is often considered a convenient approach to assess the uncertainty of rainfall processes, but preserving their irregular behavior and variability at multiple scales is a challenge even for the most advanced techniques. In this presentation, an overview of the Direct Sampling technique [1] and its recent application to rainfall and hydrological data simulation [2, 3] is given. The algorithm, having its roots in multiple-point statistics, makes use of a training data set to simulate the outcome of a process without inferring any explicit probability measure: the data are simulated in time or space by sampling the training data set where a sufficiently similar group of neighbor data exists. This approach allows preserving complex statistical dependencies at different scales with a good approximation, while reducing the parameterization to the minimum. The strengths and weaknesses of the Direct Sampling approach are shown through a series of applications to rainfall and hydrological data: from time-series simulation to spatial rainfall fields conditioned by elevation or a climate scenario. In the era of vast databases, is this data-driven approach a valid alternative to parametric simulation techniques? [1] Mariethoz G., Renard P., and Straubhaar J. (2010), The Direct Sampling method to perform multiple-point geostatistical simulations, Water Resour. Res., 46(11), http://dx.doi.org/10.1029/2008WR007621 [2] Oriani F., Straubhaar J., Renard P., and Mariethoz G. (2014), Simulation of rainfall time series from different climatic regions using the direct sampling technique, Hydrol. Earth Syst. Sci., 18, 3015-3031, http://dx.doi.org/10.5194/hess-18-3015-2014 [3] Oriani F., Borghi A., Straubhaar J., Mariethoz G., Renard P. (2016), Missing data simulation inside flow rate time-series using multiple-point statistics, Environ. Model. Softw., vol. 86, pp. 264-276, http://dx.doi.org/10.1016/j.envsoft.2016.10.002
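
    The sampling step described above can be sketched for a time series: to simulate the next value, scan randomly through the training series for a past neighbourhood similar to the one just simulated and copy the value that followed it. The threshold, neighbourhood size, scan budget, and training series below are illustrative assumptions, not the Direct Sampling implementation of [1].

        # Minimal sketch of Direct Sampling for a univariate time series.
        import numpy as np

        rng = np.random.default_rng(6)
        train = np.sin(np.arange(2000) * 0.1) + 0.1 * rng.standard_normal(2000)

        k, thresh, max_scan = 5, 0.15, 500
        sim = list(train[:k])                  # seed with the first k values

        for _ in range(300):
            pattern = np.array(sim[-k:])
            best_pos, best_dist = None, np.inf
            for pos in rng.integers(k, len(train) - 1, size=max_scan):
                d = np.mean(np.abs(train[pos - k:pos] - pattern))
                if d < best_dist:
                    best_pos, best_dist = pos, d
                if d <= thresh:                # good enough: stop scanning
                    break
            sim.append(train[best_pos])        # copy the value that followed

        print("simulated mean/std:", np.mean(sim), np.std(sim))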

  14. One-dimensional collision carts computer model and its design ideas for productive experiential learning

    NASA Astrophysics Data System (ADS)

    Wee, Loo Kang

    2012-05-01

    We develop an Easy Java Simulation (EJS) model for students to experience the physics of idealized one-dimensional collision carts. The physics model is described and simulated by both continuous dynamics and a discrete transition during collision. In designing the simulations, we briefly discuss three pedagogical considerations, namely (1) a consistent simulation world view with a pen and paper representation, (2) a data table, scientific graphs and symbolic mathematical representations for ease of data collection and multiple representational visualizations and (3) a game for simple concept testing that can further support learning. We also suggest using a physical world setup augmented by simulation by highlighting three advantages of real collision cart equipment such as a tacit 3D experience, random errors in measurement and the conceptual significance of conservation of momentum applied just before and after collision. General feedback from the students has been relatively positive, and we hope teachers will find the simulation useful in their own classes.
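
    The discrete collision transition of such a model can be sketched with the standard coefficient-of-restitution formulas (e = 1 elastic, e = 0 perfectly inelastic), which conserve momentum for any e; the masses and velocities below are illustrative. This is a sketch of the textbook physics, not the EJS source code.

        # Minimal sketch of the discrete collision step for two carts.
        def collide(m1, v1, m2, v2, e=1.0):
            """Return post-collision velocities; momentum is conserved for any e."""
            v1p = (m1 * v1 + m2 * v2 + m2 * e * (v2 - v1)) / (m1 + m2)
            v2p = (m1 * v1 + m2 * v2 + m1 * e * (v1 - v2)) / (m1 + m2)
            return v1p, v2p

        v1p, v2p = collide(m1=2.0, v1=1.0, m2=1.0, v2=-1.0, e=1.0)
        print(v1p, v2p)                        # elastic case: -1/3 and 5/3
        # Check momentum conservation: 2*1 + 1*(-1) == 2*v1p + 1*v2p
        assert abs(2.0 * 1.0 + 1.0 * -1.0 - (2.0 * v1p + 1.0 * v2p)) < 1e-12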

  15. An ensemble-based dynamic Bayesian averaging approach for discharge simulations using multiple global precipitation products and hydrological models

    NASA Astrophysics Data System (ADS)

    Qi, Wei; Liu, Junguo; Yang, Hong; Sweetapple, Chris

    2018-03-01

    Global precipitation products are very important datasets in flow simulations, especially in poorly gauged regions. Uncertainties resulting from precipitation products, hydrological models and their combinations vary with time and data magnitude, and undermine their application to flow simulations. However, previous studies have not quantified these uncertainties individually and explicitly. This study developed an ensemble-based dynamic Bayesian averaging approach (e-Bay) for deterministic discharge simulations using multiple global precipitation products and hydrological models. In this approach, the joint probability of precipitation products and hydrological models being correct is quantified based on uncertainties in maximum and mean estimation, posterior probability is quantified as functions of the magnitude and timing of discharges, and the law of total probability is implemented to calculate expected discharges. Six global fine-resolution precipitation products and two hydrological models of different complexities are included in an illustrative application. e-Bay can effectively quantify uncertainties and therefore generate better deterministic discharges than traditional approaches (weighted average methods with equal and varying weights and maximum likelihood approach). The mean Nash-Sutcliffe Efficiency values of e-Bay are up to 0.97 and 0.85 in training and validation periods respectively, which are at least 0.06 and 0.13 higher than traditional approaches. In addition, with increased training data, assessment criteria values of e-Bay show smaller fluctuations than traditional approaches and its performance becomes outstanding. The proposed e-Bay approach bridges the gap between global precipitation products and their pragmatic applications to discharge simulations, and is beneficial to water resources management in ungauged or poorly gauged regions across the world.
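
    The averaging step of such an approach can be sketched as sequential Bayesian model averaging: each ensemble member's weight is updated from its likelihood at every time step, and the expected discharge follows from the law of total probability. The synthetic member outputs, the Gaussian likelihood, and sigma are illustrative assumptions; the actual e-Bay posterior depends on discharge magnitude and timing as described above.

        # Minimal sketch of sequential Bayesian averaging over an ensemble.
        import numpy as np

        rng = np.random.default_rng(7)
        n_t = 200
        obs = 10 + 3 * np.sin(np.arange(n_t) * 0.1)        # observed discharge
        members = np.stack([obs + rng.normal(0.0, s, n_t)  # ensemble members
                            for s in (0.5, 1.0, 2.0)])

        w = np.ones(3) / 3                                 # prior weights
        sigma = 1.0                                        # assumed obs. error
        expected = np.empty(n_t)
        for t in range(n_t):
            expected[t] = w @ members[:, t]     # law of total probability
            like = np.exp(-0.5 * ((members[:, t] - obs[t]) / sigma) ** 2)
            w = w * like
            w /= w.sum()                        # Bayes update of the weights

        print("final weights:", np.round(w, 3))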

  16. FastSim: A Fast Simulation for the SuperB Detector

    NASA Astrophysics Data System (ADS)

    Andreassen, R.; Arnaud, N.; Brown, D. N.; Burmistrov, L.; Carlson, J.; Cheng, C.-h.; Di Simone, A.; Gaponenko, I.; Manoni, E.; Perez, A.; Rama, M.; Roberts, D.; Rotondo, M.; Simi, G.; Sokoloff, M.; Suzuki, A.; Walsh, J.

    2011-12-01

    We have developed a parameterized (fast) simulation for detector optimization and physics reach studies of the proposed SuperB Flavor Factory in Italy. Detector components are modeled as thin sections of planes, cylinders, disks or cones. Particle-material interactions are modeled using simplified cross-sections and formulas. Active detectors are modeled using parameterized response functions. Geometry and response parameters are configured using xml files with a custom-designed schema. Reconstruction algorithms adapted from BaBar are used to build tracks and clusters. Multiple sources of background signals can be merged with primary signals. Pattern recognition errors are modeled statistically by randomly misassigning nearby tracking hits. Standard BaBar analysis tuples are used as an event output. Hadronic B meson pair events can be simulated at roughly 10Hz.

  17. pynoddy 1.0: an experimental platform for automated 3-D kinematic and potential field modelling

    NASA Astrophysics Data System (ADS)

    Florian Wellmann, J.; Thiele, Sam T.; Lindsay, Mark D.; Jessell, Mark W.

    2016-03-01

    We present a novel methodology for performing experiments with subsurface structural models using a set of flexible and extensible Python modules. We utilize the ability of kinematic modelling techniques to describe major deformational, tectonic, and magmatic events at low computational cost to develop experiments testing the interactions between multiple kinematic events, effect of uncertainty regarding event timing, and kinematic properties. These tests are simple to implement and perform, as they are automated within the Python scripting language, allowing the encapsulation of entire kinematic experiments within high-level class definitions and fully reproducible results. In addition, we provide a link to geophysical potential-field simulations to evaluate the effect of parameter uncertainties on maps of gravity and magnetics. We provide relevant fundamental information on kinematic modelling and our implementation, and showcase the application of our novel methods to investigate the interaction of multiple tectonic events on a pre-defined stratigraphy, the effect of changing kinematic parameters on simulated geophysical potential fields, and the distribution of uncertain areas in a full 3-D kinematic model, based on estimated uncertainties in kinematic input parameters. Additional possibilities for linking kinematic modelling to subsequent process simulations are discussed, as well as additional aspects of future research. Our modules are freely available on github, including documentation and tutorial examples, and we encourage the contribution to this project.

  18. pynoddy 1.0: an experimental platform for automated 3-D kinematic and potential field modelling

    NASA Astrophysics Data System (ADS)

    Wellmann, J. F.; Thiele, S. T.; Lindsay, M. D.; Jessell, M. W.

    2015-11-01

    We present a novel methodology for performing experiments with subsurface structural models using a set of flexible and extensible Python modules. We utilise the ability of kinematic modelling techniques to describe major deformational, tectonic, and magmatic events at low computational cost to develop experiments testing the interactions between multiple kinematic events, effect of uncertainty regarding event timing, and kinematic properties. These tests are simple to implement and perform, as they are automated within the Python scripting language, allowing the encapsulation of entire kinematic experiments within high-level class definitions and fully reproducible results. In addition, we provide a link to geophysical potential-field simulations to evaluate the effect of parameter uncertainties on maps of gravity and magnetics. We provide relevant fundamental information on kinematic modelling and our implementation, and showcase the application of our novel methods to investigate the interaction of multiple tectonic events on a pre-defined stratigraphy, the effect of changing kinematic parameters on simulated geophysical potential-fields, and the distribution of uncertain areas in a full 3-D kinematic model, based on estimated uncertainties in kinematic input parameters. Additional possibilities for linking kinematic modelling to subsequent process simulations are discussed, as well as additional aspects of future research. Our modules are freely available on github, including documentation and tutorial examples, and we encourage the contribution to this project.

  19. Direct coupling of a genome-scale microbial in silico model and a groundwater reactive transport model.

    PubMed

    Fang, Yilin; Scheibe, Timothy D; Mahadevan, Radhakrishnan; Garg, Srinath; Long, Philip E; Lovley, Derek R

    2011-03-25

    The activity of microorganisms often plays an important role in dynamic natural attenuation or engineered bioremediation of subsurface contaminants, such as chlorinated solvents, metals, and radionuclides. To evaluate and/or design bioremediated systems, quantitative reactive transport models are needed. State-of-the-art reactive transport models often ignore the microbial effects or simulate the microbial effects with static growth yield and constant reaction rate parameters over simulated conditions, while in reality microorganisms can dynamically modify their functionality (such as utilization of alternative respiratory pathways) in response to spatial and temporal variations in environmental conditions. Constraint-based genome-scale microbial in silico models, using genomic data and multiple-pathway reaction networks, have been shown to be able to simulate transient metabolism of some well studied microorganisms and identify growth rate, substrate uptake rates, and byproduct rates under different growth conditions. These rates can be identified and used to replace specific microbially-mediated reaction rates in a reactive transport model using local geochemical conditions as constraints. We previously demonstrated the potential utility of integrating a constraint-based microbial metabolism model with a reactive transport simulator as applied to bioremediation of uranium in groundwater. However, that work relied on an indirect coupling approach that was effective for initial demonstration but may not be extensible to more complex problems that are of significant interest (e.g., communities of microbial species and multiple constraining variables). Here, we extend that work by presenting and demonstrating a method of directly integrating a reactive transport model (FORTRAN code) with constraint-based in silico models solved with the IBM ILOG CPLEX linear optimizer base system (C library). The models were integrated with BABEL, a language interoperability tool. The modeling system is designed in such a way that constraint-based models targeting different microorganisms or competing organism communities can be easily plugged into the system. Constraint-based modeling is very costly given the size of a genome-scale reaction network. To save computation time, a binary tree is traversed to examine the concentration and solution pool generated during the simulation in order to decide whether the constraint-based model should be called. We also show preliminary results from the integrated model, including a comparison of the direct and indirect coupling approaches and an evaluation of the ability of the approach to simulate a field experiment. Published by Elsevier B.V.
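
    The coupling can be sketched as operator splitting: each time step first advances transport, then queries the cell-scale model for local uptake rates under the current concentrations. In the sketch below the genome-scale FBA solve is replaced by a hypothetical stand-in function uptake_rate (a Monod-type closure), since the real system couples FORTRAN and CPLEX through BABEL; the grid, velocity, and rate constants are also illustrative assumptions.

        # Minimal sketch of operator-split transport-biogeochemistry coupling.
        import numpy as np

        nx, nt = 50, 200
        dx, dt, v = 1.0, 0.5, 0.5              # grid spacing, step, pore velocity
        c = np.zeros(nx)                       # substrate concentration
        c[0] = 1.0                             # constant inflow boundary

        def uptake_rate(conc):
            # Stand-in for the genome-scale model: in the real coupling this
            # is a constraint-based FBA solve given local geochemical constraints.
            vmax, km = 0.05, 0.2
            return vmax * conc / (km + conc)

        for _ in range(nt):
            # Transport operator: explicit upwind advection step.
            c[1:] = c[1:] - v * dt / dx * (c[1:] - c[:-1])
            c[0] = 1.0
            # Reaction operator: microbially mediated uptake from the stand-in.
            c = np.maximum(c - dt * uptake_rate(c), 0.0)

        print("downstream concentration:", round(float(c[-1]), 4))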

  20. Identification of treatment responders based on multiple longitudinal outcomes with applications to multiple sclerosis patients.

    PubMed

    Kondo, Yumi; Zhao, Yinshan; Petkau, John

    2017-05-30

    Identification of treatment responders is a challenge in comparative studies where treatment efficacy is measured by multiple longitudinally collected continuous and count outcomes. Existing procedures often identify responders on the basis of only a single outcome. We propose a novel multiple longitudinal outcome mixture model that assumes that, conditionally on a cluster label, each longitudinal outcome follows a generalized linear mixed effect model. We utilize a Monte Carlo expectation-maximization algorithm to obtain the maximum likelihood estimates of our high-dimensional model and classify patients according to their estimated posterior probability of being a responder. We demonstrate the flexibility of our novel procedure on two multiple sclerosis clinical trial datasets with distinct data structures. Our simulation study shows that incorporating multiple outcomes improves responder identification performance; this can occur even if some of the outcomes are ineffective. Our general procedure facilitates the identification of responders who are comprehensively defined by multiple outcomes from various distributions. Copyright © 2017 John Wiley & Sons, Ltd.
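
    The classification step itself reduces to Bayes' rule: given the likelihood of a patient's observed outcomes under the responder and non-responder clusters and the responder mixing proportion, compute the posterior probability of being a responder. A minimal sketch with fabricated inputs; in the actual procedure these likelihoods come from the generalized linear mixed effect models fitted by Monte Carlo EM.

      # Posterior probability of being a responder via Bayes' rule.
      # The likelihood values and mixing proportion are fabricated.
      def posterior_responder_prob(lik_resp, lik_nonresp, pi_resp):
          num = pi_resp * lik_resp
          return num / (num + (1.0 - pi_resp) * lik_nonresp)

      p = posterior_responder_prob(lik_resp=2.4e-5, lik_nonresp=8.0e-6,
                                   pi_resp=0.3)
      print(p, "responder" if p > 0.5 else "non-responder")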

  1. Sea Surface Salinity Variability from Simulations and Observations: Preparing for Aquarius

    NASA Technical Reports Server (NTRS)

    Jacob, S. Daniel; LeVine, David M.

    2010-01-01

    Oceanic fresh water transport has been shown to play an important role in the global hydrological cycle. Sea surface salinity (SSS) is representative of the surface fresh water fluxes, and the upcoming Aquarius mission, scheduled to be launched in December 2010, will provide excellent spatial and temporal SSS coverage to better estimate the net exchange. In most ocean general circulation models, SSS is relaxed to climatology to prevent model drift. While SST is a well-observed variable, relaxing SSS to climatology reduces the range of SSS variability in the simulations (Fig. 1). The main objective of the present study is to simulate surface tracers using a primitive equation ocean model for multiple forcing data sets to identify and establish a baseline SSS variability. The simulated variability scales are compared to those from near-surface Argo salinity measurements.

  2. Electromagnetic Modeling of Human Body Using High Performance Computing

    NASA Astrophysics Data System (ADS)

    Ng, Cho-Kuen; Beall, Mark; Ge, Lixin; Kim, Sanghoek; Klaas, Ottmar; Poon, Ada

    Realistic simulation of electromagnetic wave propagation in the actual human body can expedite investigation of wirelessly powering implanted devices through energy coupled from external sources. The parallel electromagnetics code suite ACE3P, developed at SLAC National Accelerator Laboratory for high-fidelity accelerator simulation, is based on the finite element method and can be enhanced to model electromagnetic wave propagation in the human body. Starting with a CAD model of a human phantom characterized by a number of tissues, a finite element mesh representing the complex geometries of the individual tissues is built for simulation. Employing an optimal power source with a specific pattern of field distribution, the propagation and focusing of electromagnetic waves in the phantom have been demonstrated. Substantial speedup of the simulation is achieved by using multiple compute cores on supercomputers.

  3. Study of cosmic ray events with high muon multiplicity using the ALICE detector at the CERN Large Hadron Collider

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collaboration: ALICE Collaboration

    2016-01-01

    ALICE is one of four large experiments at the CERN Large Hadron Collider near Geneva, specially designed to study particle production in ultra-relativistic heavy-ion collisions. Located 52 meters underground with 28 meters of overburden rock, it has also been used to detect muons produced by cosmic ray interactions in the upper atmosphere. In this paper, we present the multiplicity distribution of these atmospheric muons and its comparison with Monte Carlo simulations. This analysis exploits the large size and excellent tracking capability of the ALICE Time Projection Chamber. A special emphasis is given to the study of high multiplicity events containing more than 100 reconstructed muons and corresponding to a muon areal density ρ_μ > 5.9 m^−2. Similar events have been studied in previous underground experiments such as ALEPH and DELPHI at LEP. While these experiments were able to reproduce the measured muon multiplicity distribution with Monte Carlo simulations at low and intermediate multiplicities, their simulations failed to describe the frequency of the highest multiplicity events. In this work we show that the high multiplicity events observed in ALICE stem from primary cosmic rays with energies above 10^16 eV and that the frequency of these events can be successfully described by assuming a heavy mass composition of primary cosmic rays in this energy range. The development of the resulting air showers was simulated using the latest version of QGSJET to model hadronic interactions. This observation places significant constraints on alternative, more exotic, production mechanisms for these events.

  4. QUANTIFYING SEASONAL SHIFTS IN NITROGEN SOURCES TO OREGON ESTUARIES: PART II: TRANSPORT MODELING

    EPA Science Inventory

    Identifying the sources of dissolved inorganic nitrogen (DIN) in estuaries is complicated by multiple sources, temporal variability in inputs, and variations in transport. We used a hydrodynamic model to simulate the transport and uptake of three sources of DIN (oceanic, riv...

  5. Simulating Mercury And Methyl Mercury Stream Concentrations At Multiple Scales in a Wetland Influenced Coastal Plain Watershed (McTier Creek, SC, USA)

    EPA Science Inventory

    Use of mechanistic models to improve understanding: differential, mass-balance, process-based; spatial and temporal resolution; necessary simplifications of system complexity; combining field monitoring and modeling efforts; balance between capturing complexity and maintaining...

  6. Optimal averaging of soil moisture predictions from ensemble land surface model simulations

    USDA-ARS?s Scientific Manuscript database

    The correct interpretation of ensemble soil moisture information obtained from the parallel implementation of multiple land surface models (LSMs) requires information concerning the LSM ensemble’s mutual error covariance. Here we propose a new technique for obtaining such information using an inst...

  7. Pursuing the method of multiple working hypotheses to understand differences in process-based snow models

    NASA Astrophysics Data System (ADS)

    Clark, Martyn; Essery, Richard

    2017-04-01

    When faced with the complex and interdisciplinary challenge of building process-based land models, different modelers make different decisions at different points in the model development process. These modeling decisions are generally based on several considerations, including fidelity (e.g., what approaches faithfully simulate observed processes), complexity (e.g., which processes should be represented explicitly), practicality (e.g., what is the computational cost of the model simulations; are there sufficient resources to implement the desired modeling concepts), and data availability (e.g., is there sufficient data to force and evaluate models). Consequently, the research community, comprising modelers of diverse backgrounds, experience, and modeling philosophy, has amassed a wide range of models, which differ in almost every aspect of their conceptualization and implementation. Model comparison studies have been undertaken to explore model differences, but have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the different models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than on a systematic analysis of model shortcomings. This presentation will summarize the use of "multiple-hypothesis" modeling frameworks to understand differences in process-based snow models. Multiple-hypothesis frameworks define a master modeling template, and include a wide variety of process parameterizations and spatial configurations that are used in existing models. Such frameworks provide the capability to decompose complex models into the individual decisions that are made as part of model development, and evaluate each decision in isolation. It is hence possible to attribute differences in system-scale model predictions to individual modeling decisions, providing scope to mimic the behavior of existing models, understand why models differ, characterize model uncertainty, and identify productive pathways to model improvement. Results will be presented applying multiple hypothesis frameworks to snow model comparison projects, including PILPS, SnowMIP, and the upcoming ESM-SnowMIP project.
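
    A minimal sketch of the master-template idea, with invented option names: each process decision (here, snow albedo) is a swappable entry in a registry, so one model instance corresponds to one combination of decisions and alternatives can be evaluated in isolation.

      # Toy multiple-hypothesis template: swap one parameterization at a
      # time and attribute output differences to that single decision.
      # Option names and formulas are invented for illustration.
      def albedo_constant(age_days):
          return 0.75

      def albedo_decay(age_days):
          return 0.45 + 0.40 * (0.94 ** age_days)  # aging snow darkens

      DECISIONS = {"albedo": {"constant": albedo_constant,
                              "decay": albedo_decay}}

      def run_snow_model(albedo="decay", age_days=10):
          """One model instance = one combination of decisions."""
          a = DECISIONS["albedo"][albedo](age_days)
          return (1.0 - a) * 250.0  # absorbed W/m2 from assumed 250 W/m2 insolation

      for choice in DECISIONS["albedo"]:
          print(choice, run_snow_model(albedo=choice))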

  8. Multimodel ensembles of wheat growth: many models are better than one.

    PubMed

    Martre, Pierre; Wallach, Daniel; Asseng, Senthold; Ewert, Frank; Jones, James W; Rötter, Reimund P; Boote, Kenneth J; Ruane, Alex C; Thorburn, Peter J; Cammarano, Davide; Hatfield, Jerry L; Rosenzweig, Cynthia; Aggarwal, Pramod K; Angulo, Carlos; Basso, Bruno; Bertuzzi, Patrick; Biernath, Christian; Brisson, Nadine; Challinor, Andrew J; Doltra, Jordi; Gayler, Sebastian; Goldberg, Richie; Grant, Robert F; Heng, Lee; Hooker, Josh; Hunt, Leslie A; Ingwersen, Joachim; Izaurralde, Roberto C; Kersebaum, Kurt Christian; Müller, Christoph; Kumar, Soora Naresh; Nendel, Claas; O'leary, Garry; Olesen, Jørgen E; Osborne, Tom M; Palosuo, Taru; Priesack, Eckart; Ripoche, Dominique; Semenov, Mikhail A; Shcherbak, Iurii; Steduto, Pasquale; Stöckle, Claudio O; Stratonovitch, Pierre; Streck, Thilo; Supit, Iwan; Tao, Fulu; Travasso, Maria; Waha, Katharina; White, Jeffrey W; Wolf, Joost

    2015-02-01

    Models of crop growth are increasingly used to quantify the impact of global changes due to climate or crop management. Therefore, accuracy of simulation results is a major concern. Studies with ensembles of crop models can give valuable information about model accuracy and uncertainty, but such studies are difficult to organize and have only recently begun. We report on the largest ensemble study to date, of 27 wheat models tested in four contrasting locations for their accuracy in simulating multiple crop growth and yield variables. The relative error averaged over models was 24-38% for the different end-of-season variables including grain yield (GY) and grain protein concentration (GPC). There was little relation between error of a model for GY or GPC and error for in-season variables. Thus, most models did not arrive at accurate simulations of GY and GPC by accurately simulating preceding growth dynamics. Ensemble simulations, taking either the mean (e-mean) or median (e-median) of simulated values, gave better estimates than any individual model when all variables were considered. Compared to individual models, e-median ranked first in simulating measured GY and third in GPC. The error of e-mean and e-median declined with an increasing number of ensemble members, with little decrease beyond 10 models. We conclude that multimodel ensembles can be used to create new estimators with improved accuracy and consistency in simulating growth dynamics. We argue that these results are applicable to other crop species, and hypothesize that they apply more generally to ecological system models. © 2014 John Wiley & Sons Ltd.
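
    The ensemble estimators themselves are simple: e-mean and e-median are the mean and median of the individual model simulations for each variable. A minimal sketch with fabricated grain yield values:

      # e-mean and e-median across an ensemble of simulated grain yields
      # (values fabricated for illustration, in t/ha).
      import statistics

      simulated_gy = [6.1, 7.3, 5.8, 6.9, 8.0, 6.4]
      print(f"e-mean   = {statistics.mean(simulated_gy):.2f} t/ha")
      print(f"e-median = {statistics.median(simulated_gy):.2f} t/ha")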

  9. Multimodel Ensembles of Wheat Growth: More Models are Better than One

    NASA Technical Reports Server (NTRS)

    Martre, Pierre; Wallach, Daniel; Asseng, Senthold; Ewert, Frank; Jones, James W.; Rotter, Reimund P.; Boote, Kenneth J.; Ruane, Alex C.; Thorburn, Peter J.; Cammarano, Davide; hide

    2015-01-01

    Models of crop growth are increasingly used to quantify the impact of global changes due to climate or crop management. Therefore, accuracy of simulation results is a major concern. Studies with ensembles of crop models can give valuable information about model accuracy and uncertainty, but such studies are difficult to organize and have only recently begun. We report on the largest ensemble study to date, of 27 wheat models tested in four contrasting locations for their accuracy in simulating multiple crop growth and yield variables. The relative error averaged over models was 24-38% for the different end-of-season variables including grain yield (GY) and grain protein concentration (GPC). There was little relation between error of a model for GY or GPC and error for in-season variables. Thus, most models did not arrive at accurate simulations of GY and GPC by accurately simulating preceding growth dynamics. Ensemble simulations, taking either the mean (e-mean) or median (e-median) of simulated values, gave better estimates than any individual model when all variables were considered. Compared to individual models, e-median ranked first in simulating measured GY and third in GPC. The error of e-mean and e-median declined with an increasing number of ensemble members, with little decrease beyond 10 models. We conclude that multimodel ensembles can be used to create new estimators with improved accuracy and consistency in simulating growth dynamics. We argue that these results are applicable to other crop species, and hypothesize that they apply more generally to ecological system models.

  10. Multimodel Ensembles of Wheat Growth: Many Models are Better than One

    NASA Technical Reports Server (NTRS)

    Martre, Pierre; Wallach, Daniel; Asseng, Senthold; Ewert, Frank; Jones, James W.; Rotter, Reimund P.; Boote, Kenneth J.; Ruane, Alexander C.; Thorburn, Peter J.; Cammarano, Davide; hide

    2015-01-01

    Models of crop growth are increasingly used to quantify the impact of global changes due to climate or crop management. Therefore, accuracy of simulation results is a major concern. Studies with ensembles of crop models can give valuable information about model accuracy and uncertainty, but such studies are difficult to organize and have only recently begun. We report on the largest ensemble study to date, of 27 wheat models tested in four contrasting locations for their accuracy in simulating multiple crop growth and yield variables. The relative error averaged over models was 24-38% for the different end-of-season variables including grain yield (GY) and grain protein concentration (GPC). There was little relation between error of a model for GY or GPC and error for in-season variables. Thus, most models did not arrive at accurate simulations of GY and GPC by accurately simulating preceding growth dynamics. Ensemble simulations, taking either the mean (e-mean) or median (e-median) of simulated values, gave better estimates than any individual model when all variables were considered. Compared to individual models, e-median ranked first in simulating measured GY and third in GPC. The error of e-mean and e-median declined with an increasing number of ensemble members, with little decrease beyond 10 models. We conclude that multimodel ensembles can be used to create new estimators with improved accuracy and consistency in simulating growth dynamics. We argue that these results are applicable to other crop species, and hypothesize that they apply more generally to ecological system models.

  11. Steady-states for shear flows of a liquid-crystal model: Multiplicity, stability, and hysteresis

    NASA Astrophysics Data System (ADS)

    Dorn, Tim; Liu, Weishi

    In this work, we study shear flows of a fluid layer between two solid blocks via a liquid-crystal type model proposed in [C.H.A. Cheng, L.H. Kellogg, S. Shkoller, D.L. Turcotte, A liquid-crystal model for friction, Proc. Natl. Acad. Sci. USA 21 (2007) 1-5] for an understanding of friction. A characterization of the existence and multiplicity of steady-states is provided. The stability of the steady-states is examined, focusing mainly on bifurcations of zero eigenvalues. The stability result suggests that this simple model exhibits hysteresis, which is supported by a numerical simulation.

  12. Computational modeling of cardiovascular response to orthostatic stress

    NASA Technical Reports Server (NTRS)

    Heldt, Thomas; Shim, Eun B.; Kamm, Roger D.; Mark, Roger G.

    2002-01-01

    The objective of this study is to develop a model of the cardiovascular system capable of simulating the short-term (≤5 min) transient and steady-state hemodynamic responses to head-up tilt and lower body negative pressure. The model consists of a closed-loop lumped-parameter representation of the circulation connected to set-point models of the arterial and cardiopulmonary baroreflexes. Model parameters are largely based on literature values. Model verification was performed by comparing the simulation output under baseline conditions and at different levels of orthostatic stress to sets of population-averaged hemodynamic data reported in the literature. On the basis of experimental evidence, we adjusted some model parameters to simulate experimental data. Orthostatic stress simulations are not statistically different from experimental data (two-sided test of significance with Bonferroni adjustment for multiple comparisons). Transient response characteristics of heart rate to tilt also compare well with reported data. A case study is presented on how the model is intended to be used in the future to investigate the effects of post-spaceflight orthostatic intolerance.

  13. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with physically motivated parameters that are tuned to cover many application domains. To study the uncertainties associated with the Geant4 physics models, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g., a flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.

  14. Simulation of gaseous pollutant dispersion around an isolated building using the k-ω SST (shear stress transport) turbulence model.

    PubMed

    Yu, Hesheng; Thé, Jesse

    2017-05-01

    The dispersion of gaseous pollutants around buildings is difficult to model due to complex turbulence features such as flow detachment and zones of high shear. Computational fluid dynamics (CFD) models are among the most promising tools to describe the pollutant distribution in the near field of buildings. Reynolds-averaged Navier-Stokes (RANS) models are the most commonly used CFD techniques to address turbulent transport of the pollutant. This work studies the use of the k-ω SST (shear stress transport) closure model for gas dispersion around a building by fully resolving the viscous sublayer for the first time. The performance of the standard k-ε model is also included for comparison, along with results of an extensively validated Gaussian dispersion model, the U.S. Environmental Protection Agency (EPA) AERMOD (American Meteorological Society/U.S. Environmental Protection Agency Regulatory Model). This study's CFD models apply the standard k-ε and the k-ω SST turbulence models to obtain the wind flow field. A passive concentration transport equation is then solved on the resolved flow field to simulate the distribution of pollutant concentrations. The resulting simulations of both wind flow and concentration fields are validated rigorously against extensive data using multiple validation metrics. The wind flow field can be acceptably modeled by the k-ε model; however, the k-ε model fails to simulate the gas dispersion. The k-ω SST model outperforms k-ε in both flow and dispersion simulations, with higher hit rates for dimensionless velocity components and a higher "factor of 2" of observations (FAC2) for normalized concentration. All validation metrics of the k-ω SST model pass the quality assurance criteria recommended by the Association of German Engineers (Verein Deutscher Ingenieure, VDI) guideline, and are better than or comparable to those in the literature. Comparison between the performance of k-ω SST and AERMOD shows that the CFD simulation is superior to the Gaussian-type model for pollutant dispersion in the near wake of obstacles; AERMOD can still serve as a screening tool for near-field gas dispersion due to its expeditious calculation and ability to handle complicated cases. The use of k-ω SST to simulate gaseous pollutant dispersion around an isolated building is appropriate and is expected to be suitable for complex urban environments. Multiple validation metrics quantitatively indicate that this turbulence model is appropriate for the simulation of gas dispersion around buildings; CFD is therefore an attractive alternative to wind tunnel testing for modeling gas dispersion in urban environments, given its performance and lower cost.
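
    The FAC2 metric cited above has a simple definition: the fraction of model predictions within a factor of two of the paired observations. A minimal sketch with fabricated values (the VDI acceptance thresholds themselves are not encoded here):

      # Fraction of predictions within a factor of 2 of observations.
      def fac2(predicted, observed):
          pairs = [(p, o) for p, o in zip(predicted, observed) if o != 0]
          hits = sum(1 for p, o in pairs if 0.5 <= p / o <= 2.0)
          return hits / len(pairs)

      pred = [0.9, 1.4, 0.3, 2.2, 1.0]  # fabricated normalized concentrations
      obs = [1.0, 1.0, 1.0, 1.0, 1.0]
      print(f"FAC2 = {fac2(pred, obs):.2f}")  # 3 of 5 within a factor of 2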

  15. Evaluating the Sensitivity of Agricultural Model Performance to Different Climate Inputs: Supplemental Material

    NASA Technical Reports Server (NTRS)

    Glotter, Michael J.; Ruane, Alex C.; Moyer, Elisabeth J.; Elliott, Joshua W.

    2015-01-01

    Projections of future food production necessarily rely on models, which must themselves be validated through historical assessments comparing modeled and observed yields. Reliable historical validation requires both accurate agricultural models and accurate climate inputs; problems with either may compromise the validation exercise. Previous studies have compared the effects of different climate inputs on agricultural projections, but either incompletely or without a ground truth of observed yields that would allow distinguishing errors due to climate inputs from those intrinsic to the crop model. This study is a systematic evaluation of the reliability of a widely used crop model for simulating U.S. maize yields when driven by multiple observational data products. The parallelized Decision Support System for Agrotechnology Transfer (pDSSAT) is driven with climate inputs from multiple sources (reanalysis, reanalysis bias-corrected with observed climate, and a control dataset) and compared with observed historical yields. The simulations show that model output is more accurate when driven by any observation-based precipitation product than when driven by non-bias-corrected reanalysis. The simulations also suggest, in contrast to previous studies, that biased precipitation distribution is significant for yields only in arid regions. Some issues persist for all choices of climate inputs: crop yields appear to be oversensitive to precipitation fluctuations but undersensitive to floods and heat waves. These results suggest that the most important issue for agricultural projections may be not climate inputs but structural limitations in the crop models themselves.
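
    One simple flavour of the bias correction mentioned above scales reanalysis precipitation so that its long-term monthly means match an observation-based product. A minimal sketch with fabricated monthly climatologies; operational products use more elaborate methods.

      # Monthly mean-ratio bias correction of precipitation (fabricated data).
      def scale_factors(reanalysis_clim, observed_clim):
          return [o / r if r > 0 else 1.0
                  for r, o in zip(reanalysis_clim, observed_clim)]

      def bias_correct(precip, month_idx, factors):
          return [p * factors[m] for p, m in zip(precip, month_idx)]

      factors = scale_factors([50.0, 80.0], [60.0, 72.0])  # e.g. Jan, Feb
      print(bias_correct([45.0, 90.0], [0, 1], factors))   # [54.0, 81.0]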

  16. Evaluating the sensitivity of agricultural model performance to different climate inputs

    PubMed Central

    Glotter, Michael J.; Moyer, Elisabeth J.; Ruane, Alex C.; Elliott, Joshua W.

    2017-01-01

    Projections of future food production necessarily rely on models, which must themselves be validated through historical assessments comparing modeled to observed yields. Reliable historical validation requires both accurate agricultural models and accurate climate inputs. Problems with either may compromise the validation exercise. Previous studies have compared the effects of different climate inputs on agricultural projections, but either incompletely or without a ground truth of observed yields that would allow distinguishing errors due to climate inputs from those intrinsic to the crop model. This study is a systematic evaluation of the reliability of a widely-used crop model for simulating U.S. maize yields when driven by multiple observational data products. The parallelized Decision Support System for Agrotechnology Transfer (pDSSAT) is driven with climate inputs from multiple sources – reanalysis, reanalysis bias-corrected with observed climate, and a control dataset – and compared to observed historical yields. The simulations show that model output is more accurate when driven by any observation-based precipitation product than when driven by un-bias-corrected reanalysis. The simulations also suggest, in contrast to previous studies, that biased precipitation distribution is significant for yields only in arid regions. However, some issues persist for all choices of climate inputs: crop yields appear oversensitive to precipitation fluctuations but undersensitive to floods and heat waves. These results suggest that the most important issue for agricultural projections may be not climate inputs but structural limitations in the crop models themselves. PMID:29097985

  17. Driver fatigue detection through multiple entropy fusion analysis in an EEG-based system

    PubMed Central

    Min, Jianliang; Wang, Ping

    2017-01-01

    Driver fatigue is an important contributor to road accidents, and fatigue detection has major implications for transportation safety. The aim of this research is to analyze the multiple entropy fusion method and evaluate several channel regions to effectively detect a driver's fatigue state based on electroencephalogram (EEG) records. First, we fused multiple entropies, i.e., spectral entropy, approximate entropy, sample entropy, and fuzzy entropy, as features and compared them with autoregressive (AR) modeling using four classifiers. Second, we captured four significant channel regions according to weight-based electrodes via a simplified channel selection method. Finally, the evaluation model for detecting driver fatigue was established with four classifiers based on the EEG data from the four channel regions. Twelve healthy subjects performed continuous simulated driving for 1–2 hours with EEG monitoring on a static simulator. The leave-one-out cross-validation approach obtained an accuracy of 98.3%, a sensitivity of 98.3%, and a specificity of 98.2%. The experimental results verified the effectiveness of the proposed method, indicating that the multiple entropy fusion features are significant factors for inferring the fatigue state of a driver. PMID:29220351
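
    One of the fused features, sample entropy, is defined as SampEn(m, r) = -ln(A/B), where B counts pairs of matching templates of length m and A pairs of length m+1, excluding self-matches. A simplified O(n^2) sketch, assuming the conventional choices m = 2 and r = 0.2 times the signal's standard deviation:

      # Simplified sample entropy; a regular (periodic) signal gives a
      # low value, an irregular one a high value.
      import math

      def sample_entropy(x, m=2, r=None):
          n = len(x)
          if r is None:
              mean = sum(x) / n
              r = 0.2 * math.sqrt(sum((v - mean) ** 2 for v in x) / n)

          def count_matches(length):
              # Pairs of length-`length` templates agreeing within r.
              count = 0
              for i in range(n - length):
                  for j in range(i + 1, n - length):
                      if all(abs(x[i + k] - x[j + k]) <= r
                             for k in range(length)):
                          count += 1
              return count

          b = count_matches(m)
          a = count_matches(m + 1)
          return -math.log(a / b) if a > 0 and b > 0 else float("inf")

      print(sample_entropy([1, 2, 3, 1, 2, 3, 1, 2, 3, 1, 2, 3]))  # ~0.29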

  18. Detecting seasonal variations of soil parameters via field measurements and stochastic simulations in the hillslope

    NASA Astrophysics Data System (ADS)

    Noh, Seong Jin; An, Hyunuk; Kim, Sanghyun

    2015-04-01

    Soil moisture, a critical factor in hydrologic systems, plays a key role in synthesizing interactions among soil, climate, hydrological response, solute transport, and ecosystem dynamics. The spatial and temporal distribution of soil moisture at a hillslope scale is essential for understanding hillslope runoff generation processes. In this study, we implement Monte Carlo simulations at the hillslope scale using a three-dimensional surface-subsurface integrated model (3D model). Numerical simulations are compared with soil moisture measurements collected with TDR (Mini_TRASE) at 22 locations and two or three depths over a full year at a hillslope (area: 2100 square meters) in the Bongsunsa Watershed, South Korea. In the stochastic Monte Carlo simulations, uncertainty in the soil parameters and input forcing is considered, and well-performing model ensembles are selected separately for several seasonal periods. The presentation will focus on the characterization of seasonal variations of model parameters based on simulations and field measurements. In addition, structural limitations of the contemporary modeling method will be discussed.

  19. GPU based 3D feature profile simulation of high-aspect ratio contact hole etch process under fluorocarbon plasmas

    NASA Astrophysics Data System (ADS)

    Chun, Poo-Reum; Lee, Se-Ah; Yook, Yeong-Geun; Choi, Kwang-Sung; Cho, Deog-Geun; Yu, Dong-Hun; Chang, Won-Seok; Kwon, Deuk-Chul; Im, Yeon-Ho

    2013-09-01

    Although plasma etch profile simulation has attracted much interest for developing reliable plasma etching, large gaps remain between the current state of research and predictive modeling due to the inherent complexity of plasma processes. As an effort to address this issue, we present a 3-D feature profile simulation coupled with a well-defined plasma-surface kinetic model for the silicon dioxide etching process under fluorocarbon plasmas. To capture realistic plasma-surface reaction behavior, a polymer-layer-based surface kinetic model was proposed that considers simultaneous polymer deposition and oxide etching. This surface model was then used to calculate the speed function for the 3-D topology simulation, which consists of a multiple-level-set moving algorithm and a ballistic transport module. The time-consuming ballistic transport computations were accelerated drastically by GPU-based numerical computation, approaching real-time performance. Finally, we demonstrate that the surface kinetic model can be coupled successfully for 3-D etch profile simulations of high-aspect-ratio contact hole plasma etching.

  20. Joint estimation over multiple individuals improves behavioural state inference from animal movement data.

    PubMed

    Jonsen, Ian

    2016-02-08

    State-space models provide a powerful way to scale up inference of movement behaviours from individuals to populations when the inference is made across multiple individuals. Here, I show how a joint estimation approach that assumes individuals share identical movement parameters can lead to improved inference of behavioural states associated with different movement processes. I use simulated movement paths with known behavioural states to compare estimation error between nonhierarchical and joint estimation formulations of an otherwise identical state-space model. Behavioural state estimation error was strongly affected by the degree of similarity between movement patterns characterising the behavioural states, with less error when movements were strongly dissimilar between states. The joint estimation model improved behavioural state estimation relative to the nonhierarchical model for simulated data with heavy-tailed Argos location errors. When applied to Argos telemetry datasets from 10 Weddell seals, the nonhierarchical model estimated highly uncertain behavioural state switching probabilities for most individuals whereas the joint estimation model yielded substantially less uncertainty. The joint estimation model better resolved the behavioural state sequences across all seals. Hierarchical or joint estimation models should be the preferred choice for estimating behavioural states from animal movement data, especially when location data are error-prone.
