Sample records for general simulation framework

  1. A geostatistical extreme-value framework for fast simulation of natural hazard events

    PubMed Central

    Stephenson, David B.

    2016-01-01

    We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student’s t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements. PMID:27279768
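
    A minimal sketch of this kind of event simulation (not the authors' code; the site grid, spatial kernel, and generalized Pareto parameters are illustrative assumptions): a Student's t random field is pushed through its CDF to uniform margins and then to generalized Pareto margins.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        xy = rng.uniform(0, 1, size=(50, 2))               # 50 illustrative sites
        d = np.linalg.norm(xy[:, None] - xy[None], axis=-1)
        corr = np.exp(-d / 0.3)                            # assumed spatial kernel
        nu = 5.0                                           # t-process degrees of freedom (assumed)

        # One simulated event: a multivariate-t draw mapped to uniform margins
        z = stats.multivariate_normal(cov=corr, allow_singular=True).rvs(random_state=rng)
        t = z * np.sqrt(nu / rng.chisquare(nu))
        u = stats.t(df=nu).cdf(t)

        # Transform to generalized Pareto margins; the paper fits these
        # parameters with generalized additive models rather than constants.
        gust = stats.genpareto(c=-0.1, scale=8.0).ppf(u)   # shape < 0: bounded tail
        print(gust.round(1))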

  2. Gas turbine system simulation: An object-oriented approach

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.; Follen, Gregory J.; Putt, Charles W.

    1993-01-01

    A prototype gas turbine engine simulation has been developed that offers a generalized framework for the simulation of engines subject to steady-state and transient operating conditions. The prototype is in preliminary form, but it successfully demonstrates the viability of an object-oriented approach for generalized simulation applications. Although object-oriented programming languages are, relative to FORTRAN, somewhat austere, it is proposed that gas turbine simulations of an interdisciplinary nature will benefit significantly in terms of code reliability, maintainability, and manageability. This report elucidates specific gas turbine simulation obstacles that an object-oriented framework can overcome and describes the opportunity for interdisciplinary simulation that the approach offers.

  3. A general CFD framework for fault-resilient simulations based on multi-resolution information fusion

    NASA Astrophysics Data System (ADS)

    Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em

    2017-10-01

    We develop a general CFD framework for multi-resolution simulations that targets multiscale problems as well as resilience in exascale simulations, where faulty processors may lead to gappy, in space-time, simulated fields. We combine approximation theory and domain decomposition together with statistical learning techniques, e.g. coKriging, to estimate boundary conditions and minimize communications by performing independent parallel runs. To demonstrate this new simulation approach, we consider two benchmark problems. First, we solve the heat equation (a) on a small number of spatial "patches" distributed across the domain, simulated by finite differences at fine resolution and (b) on the entire domain simulated at very low resolution, thus fusing multi-resolution models to obtain the final answer. Second, we simulate the flow in a lid-driven cavity in an analogous fashion, by fusing finite difference solutions obtained with fine and low resolution assuming gappy data sets. We investigate the influence of various parameters for this framework, including the correlation kernel, the size of a buffer employed in estimating boundary conditions, the coarseness of the resolution of auxiliary data, and the communication frequency across different patches in fusing the information at different resolution levels. In addition to its robustness and resilience, the new framework can be employed to generalize previous multiscale approaches involving heterogeneous discretizations or even fundamentally different flow descriptions, e.g. in continuum-atomistic simulations.
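
    A minimal sketch of the boundary-estimation step on assumed data (scikit-learn's Gaussian process regressor standing in for coKriging): coarse, noisy heat-equation values are interpolated to supply a fine patch with boundary conditions plus an uncertainty estimate.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        x_coarse = np.linspace(0.0, 1.0, 11)[:, None]       # coarse grid
        u_coarse = np.sin(np.pi * x_coarse).ravel()         # stand-in coarse solution
        u_coarse += 0.01 * np.random.default_rng(1).normal(size=11)  # noisy/gappy

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-4)
        gp.fit(x_coarse, u_coarse)

        # Fine patch on [0.42, 0.58]: estimate its boundary values, with an
        # uncertainty that could trigger more communication if it grows too large.
        mean, std = gp.predict(np.array([[0.42], [0.58]]), return_std=True)
        print(mean, std)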

  4. Games and Simulations in Online Learning: Research and Development Frameworks

    ERIC Educational Resources Information Center

    Gibson, David; Aldrich, Clark; Prensky, Marc

    2007-01-01

    Games and Simulations in Online Learning: Research and Development Frameworks examines the potential of games and simulations in online learning, and how the future could look as developers learn to use the emerging capabilities of the Semantic Web. It presents a general understanding of how the Semantic Web will impact education and how games and…

  5. Operational framework for quantum measurement simulability

    NASA Astrophysics Data System (ADS)

    Guerini, Leonardo; Bavaresco, Jessica; Terra Cunha, Marcelo; Acín, Antonio

    2017-09-01

    We introduce a framework for simulating quantum measurements based on classical processing of a set of accessible measurements. Well-known concepts such as joint measurability and projective simulability naturally emerge as particular cases of our framework, but our study also leads to novel results and questions. First, a generalisation of joint measurability is derived, which yields a hierarchy for the incompatibility of sets of measurements. A similar hierarchy is defined based on the number of outcomes necessary to perform a simulation of a given measurement. This general approach also allows us to identify connections between different kinds of simulability and, in particular, we characterise the qubit measurements that are projective-simulable in terms of joint measurability. Finally, we discuss how our framework can be interpreted in the context of resource theories.

  6. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed when optimizing via stochastic simulation models: the optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
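
    A minimal sketch of the chance-constraint idea with invented numbers (the paper's application is launch vehicle operations and support): accept the smallest resource level whose probability of meeting a response-time limit, estimated from simulation replications, is at least a target level.

        import numpy as np

        rng = np.random.default_rng(2)

        def simulate_turnaround(n_crews, n_reps=200):
            """Stand-in for replications of a terminating discrete-event model."""
            return rng.gamma(shape=4.0, scale=20.0 / n_crews, size=n_reps)

        alpha, limit = 0.95, 24.0            # chance constraint: P(T <= 24 h) >= 0.95
        for n_crews in range(1, 8):          # minimize the resource level ...
            p_ok = (simulate_turnaround(n_crews) <= limit).mean()
            if p_ok >= alpha:                # ... subject to the chance constraint
                print(f"smallest feasible crew count: {n_crews} (P = {p_ok:.2f})")
                break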

  7. Framework for Architecture Trade Study Using MBSE and Performance Simulation

    NASA Technical Reports Server (NTRS)

    Ryan, Jessica; Sarkani, Shahram; Mazzuchi, Thomas

    2012-01-01

    Increasing complexity in modern systems, as well as cost and schedule constraints, requires a new paradigm of system engineering to fulfill stakeholder needs. Challenges facing efficient trade studies include poor tool interoperability, lack of simulation coordination (design parameters), and requirements flowdown. A recent trend toward Model Based System Engineering (MBSE) includes flexible architecture definition, program documentation, requirements traceability, and system engineering reuse. As a new domain, MBSE still lacks governing standards and commonly accepted frameworks. This paper proposes a framework for efficient architecture definition using MBSE in conjunction with domain-specific simulation to evaluate trade studies. A general framework is provided, followed by a specific example that includes a method for designing a trade study, defining candidate architectures, planning simulations to fulfill requirements, and finally a weighted decision analysis to optimize system objectives.

  8. Quantifying uncertainty and computational complexity for pore-scale simulations

    NASA Astrophysics Data System (ADS)

    Chen, C.; Yuan, Z.; Wang, P.; Yang, X.; Zhenyan, L.

    2016-12-01

    Pore-scale simulation is an essential tool for understanding the complex physical processes in many environmental problems, from multi-phase flow in the subsurface to fuel cells. In practice, however, factors such as sample heterogeneity, data sparsity and, in general, our insufficient knowledge of the underlying process render many simulation parameters, and hence the prediction results, uncertain. Meanwhile, most pore-scale simulations (in particular, direct numerical simulation) incur high computational cost due to finely resolved spatio-temporal scales, which further limits our data/sample collection. To address these challenges, we propose a novel framework based on generalized polynomial chaos (gPC) and build a surrogate model representing the essential features of the underlying system. Specifically, we apply the framework to analyze the uncertainties of the system behavior based on a series of pore-scale numerical experiments, such as flow and reactive transport in 2D heterogeneous porous media and 3D packed beds. Compared with recent pore-scale uncertainty quantification studies using Monte Carlo techniques, our new framework requires fewer realizations and hence considerably reduces the overall computational cost, while maintaining the desired accuracy.
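
    A minimal sketch of a gPC surrogate for one Gaussian input (probabilists' Hermite basis, least-squares fit); the "model" is a cheap stand-in, not a pore-scale simulator, and the order and sample count are assumptions.

        import numpy as np
        from math import factorial
        from numpy.polynomial.hermite_e import hermevander

        def expensive_model(xi):                  # stand-in for a pore-scale run
            return np.exp(0.3 * xi) + 0.1 * xi**2

        rng = np.random.default_rng(3)
        xi_train = rng.normal(size=20)            # a few random realizations
        y_train = expensive_model(xi_train)

        P = 4                                     # polynomial order (assumed)
        V = hermevander(xi_train, P)              # probabilists' Hermite basis
        coef, *_ = np.linalg.lstsq(V, y_train, rcond=None)

        # Surrogate statistics are nearly free: the mean is coef[0]; by
        # orthogonality, Var = sum_{k>=1} k! * coef[k]^2 under a standard normal.
        var = sum(factorial(k) * coef[k] ** 2 for k in range(1, P + 1))
        print(coef[0], var)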

  9. Developing interprofessional health competencies in a virtual world

    PubMed Central

    King, Sharla; Chodos, David; Stroulia, Eleni; Carbonaro, Mike; MacKenzie, Mark; Reid, Andrew; Torres, Lisa; Greidanus, Elaine

    2012-01-01

    Background Virtual worlds provide a promising means of delivering simulations for developing interprofessional health skills. However, developing and implementing a virtual world simulation is a challenging process, in part because of the novelty of virtual worlds as a simulation platform and also because of the degree of collaboration required among technical and subject experts. Thus, it can be difficult to ensure that the simulation is both technically satisfactory and educationally appropriate. Methods To address this challenge, we propose the use of de Freitas and Oliver's four-dimensional framework as a means of guiding the development process. We give an overview of the framework and describe how its principles can be applied to the development of virtual world simulations. Results We present two virtual world simulation pilot projects that adopted this approach, and describe our development experience in these projects. We directly connect this experience to the four-dimensional framework, thus validating the framework's applicability to the projects and to the context of virtual world simulations in general. Conclusions We present a series of recommendations for developing virtual world simulations for interprofessional health education. These recommendations are based on the four-dimensional framework and are also informed by our experience with the pilot projects. PMID:23195649

  10. Small-scale multi-axial hybrid simulation of a shear-critical reinforced concrete frame

    NASA Astrophysics Data System (ADS)

    Sadeghian, Vahid; Kwon, Oh-Sung; Vecchio, Frank

    2017-10-01

    This study presents a numerical multi-scale simulation framework which is extended to accommodate hybrid simulation (numerical-experimental integration). The framework is enhanced with a standardized data exchange format and connected to a generalized controller interface program which facilitates communication with various types of laboratory equipment and testing configurations. A small-scale experimental program was conducted using six degree-of-freedom hydraulic testing equipment to verify the proposed framework and provide additional data for small-scale testing of shear-critical reinforced concrete structures. The specimens were tested in a multi-axial hybrid simulation manner under a reversed cyclic loading condition simulating earthquake forces. The physical models were 1/3.23-scale representations of a beam and two columns. A mixed-type modelling technique was employed to analyze the remainder of the structures. The hybrid simulation results were compared against those obtained from a large-scale test and finite element analyses. The study found that if precautions are taken in preparing model materials and if the shear-related mechanisms are accurately considered in the numerical model, small-scale hybrid simulations can adequately simulate the behaviour of shear-critical structures. Although the findings of the study are promising, additional test data are required to draw general conclusions.

  11. Event-based simulation of networks with pulse delayed coupling

    NASA Astrophysics Data System (ADS)

    Klinshov, Vladimir; Nekorkin, Vladimir

    2017-10-01

    Pulse-mediated interactions are common in networks of different nature. Here we develop a general framework for simulation of networks with pulse delayed coupling. We introduce the discrete map governing the dynamics of such networks and describe the computation algorithm for its numerical simulation.
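
    A minimal sketch of event-based simulation with pulse delayed coupling (illustrative leaky integrate-and-fire dynamics, not the authors' discrete map): the state is advanced analytically between events, and delayed pulses live in a time-ordered event queue.

        import heapq
        import numpy as np

        N, delay, eps, tau, T = 5, 0.3, 0.15, 1.0, 20.0
        rng = np.random.default_rng(4)
        v = rng.uniform(0.0, 0.9, N)                   # membrane variables
        events = [(rng.uniform(0.2, 1.0), "fire", i) for i in range(N)]
        heapq.heapify(events)

        t = 0.0
        while events and t < T:
            t_ev, kind, i = heapq.heappop(events)
            v *= np.exp(-(t_ev - t) / tau)             # exact decay between events
            t = t_ev
            if kind == "fire":
                v[i] = 0.0                             # reset after firing
                heapq.heappush(events, (t + 2.0 * tau, "fire", i))
                for j in range(N):                     # emit delayed pulses
                    if j != i:
                        heapq.heappush(events, (t + delay, "pulse", j))
            else:
                v[i] += eps                            # delayed pulse arrives
        print(v.round(3))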

  12. Simulation Framework for Teaching in Modeling and Simulation Areas

    ERIC Educational Resources Information Center

    De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan

    2008-01-01

    Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…

  13. Efficient particle-in-cell simulation of auroral plasma phenomena using a CUDA enabled graphics processing unit

    NASA Astrophysics Data System (ADS)

    Sewell, Stephen

    This thesis introduces a software framework that effectively utilizes low-cost commercially available Graphics Processing Units (GPUs) to simulate complex scientific plasma phenomena that are modeled using the Particle-In-Cell (PIC) paradigm. The software framework that was developed conforms to the Compute Unified Device Architecture (CUDA), a standard for general purpose graphics processing that was introduced by NVIDIA Corporation. This framework has been verified for correctness and applied to advance the state of understanding of the electromagnetic aspects of the development of the Aurora Borealis and Aurora Australis. For each phase of the PIC methodology, this research has identified one or more methods to exploit the problem's natural parallelism and effectively map it for execution on the graphics processing unit and its host processor. The sources of overhead that can reduce the effectiveness of parallelization for each of these methods have also been identified. One of the novel aspects of this research was the utilization of particle sorting during the grid interpolation phase. The final representation resulted in simulations that executed about 38 times faster than simulations that were run on a single-core general-purpose processing system. The scalability of this framework to larger problem sizes and future generation systems has also been investigated.
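
    A minimal sketch of the particle-sorting idea in numpy (the thesis implements this as CUDA kernels): sorting particles by owning cell lets the deposition/interpolation phase stream through memory coherently.

        import numpy as np

        rng = np.random.default_rng(5)
        nx, n_part = 64, 100_000
        x = rng.uniform(0.0, 1.0, n_part)          # particle positions in [0, 1)

        cell = (x * nx).astype(np.int64)           # owning cell of each particle
        order = np.argsort(cell, kind="stable")    # sort particles by cell index
        x_sorted, cell_sorted = x[order], cell[order]

        # Deposition now reads each cell's particles contiguously; here a
        # simple nearest-grid-point charge accumulation:
        rho = np.bincount(cell_sorted, minlength=nx).astype(float)
        print(rho[:8])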

  14. Use of a PhET Interactive Simulation in General Chemistry Laboratory: Models of the Hydrogen Atom

    ERIC Educational Resources Information Center

    Clark, Ted M.; Chamberlain, Julia M.

    2014-01-01

    An activity supporting the PhET interactive simulation, Models of the Hydrogen Atom, has been designed and used in the laboratory portion of a general chemistry course. This article describes the framework used to successfully accomplish implementation on a large scale. The activity guides students through a comparison and analysis of the six…

  15. Generalizing Gillespie’s Direct Method to Enable Network-Free Simulations

    DOE PAGES

    Suderman, Ryan T.; Mitra, Eshan David; Lin, Yen Ting; ...

    2018-03-28

    Gillespie’s direct method for stochastic simulation of chemical kinetics is a staple of computational systems biology research. However, the algorithm requires explicit enumeration of all reactions and all chemical species that may arise in the system. In many cases, this is not feasible due to the combinatorial explosion of reactions and species in biological networks. Rule-based modeling frameworks provide a way to exactly represent networks containing such combinatorial complexity, and generalizations of Gillespie’s direct method have been developed as simulation engines for rule-based modeling languages. Here, we provide both a high-level description of the algorithms underlying the simulation engines, termed network-free simulation algorithms, and how they have been applied in systems biology research. We also define a generic rule-based modeling framework and describe a number of technical details required for adapting Gillespie’s direct method for network-free simulation. Lastly, we briefly discuss potential avenues for advancing network-free simulation and the role they continue to play in modeling dynamical systems in biology.
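
    For contrast with the network-free generalization the paper describes, a minimal sketch of the classic direct method for an explicitly enumerated two-reaction network (illustrative rate constants):

        import numpy as np

        rng = np.random.default_rng(6)
        x = np.array([100, 0])                 # species counts: A, B
        stoich = np.array([[-1, +1],           # reaction 0: A -> B
                           [+1, -1]])          # reaction 1: B -> A
        rates = np.array([1.0, 0.5])           # illustrative rate constants

        t, t_end = 0.0, 10.0
        while t < t_end:
            a = rates * x                      # propensities
            a0 = a.sum()
            if a0 == 0.0:
                break
            t += rng.exponential(1.0 / a0)     # time to next reaction
            r = rng.choice(len(a), p=a / a0)   # which reaction fires
            x = x + stoich[r]
        print(t, x)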

  16. Generalizing Gillespie’s Direct Method to Enable Network-Free Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suderman, Ryan T.; Mitra, Eshan David; Lin, Yen Ting

    Gillespie’s direct method for stochastic simulation of chemical kinetics is a staple of computational systems biology research. However, the algorithm requires explicit enumeration of all reactions and all chemical species that may arise in the system. In many cases, this is not feasible due to the combinatorial explosion of reactions and species in biological networks. Rule-based modeling frameworks provide a way to exactly represent networks containing such combinatorial complexity, and generalizations of Gillespie’s direct method have been developed as simulation engines for rule-based modeling languages. Here, we provide both a high-level description of the algorithms underlying the simulation engines, termed network-free simulation algorithms, and how they have been applied in systems biology research. We also define a generic rule-based modeling framework and describe a number of technical details required for adapting Gillespie’s direct method for network-free simulation. Lastly, we briefly discuss potential avenues for advancing network-free simulation and the role they continue to play in modeling dynamical systems in biology.

  17. Software Geometry in Simulations

    NASA Astrophysics Data System (ADS)

    Alion, Tyler; Viren, Brett; Junk, Tom

    2015-04-01

    The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other Liquid Argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it be a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way which allows multiple authors to collaborate easily. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called the General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett Viren is the author of the framework discussed here, the General Geometry Description (GGD).

  18. Etomica: an object-oriented framework for molecular simulation.

    PubMed

    Schultz, Andrew J; Kofke, David A

    2015-03-30

    We describe the design of an object-oriented library of software components that are suitable for constructing simulations of systems of interacting particles. The emphasis of the discussion is on the general design of the components and how they interact, and less on details of the programming interface or its implementation. Example code is provided as an aid to understanding object-oriented programming structures and to demonstrate how the framework is applied. © 2015 Wiley Periodicals, Inc.

  19. A general framework for numerical simulation of improvised explosive device (IED)-detection scenarios using density functional theory (DFT) and terahertz (THz) spectra.

    PubMed

    Shabaev, Andrew; Lambrakos, Samuel G; Bernstein, Noam; Jacobs, Verne L; Finkenstadt, Daniel

    2011-04-01

    We have developed a general framework for numerical simulation of various types of scenarios that can occur in the detection of improvised explosive devices (IEDs) through excitation by incident electromagnetic waves. A central component model of this framework is an S-matrix representation of a multilayered composite material system. Each layer of the system is characterized by an average thickness and an effective electric permittivity function. The outputs of this component are the reflectivity and the transmissivity as functions of frequency and angle of the incident electromagnetic wave. The input of the component is a parameterized analytic-function representation of the electric permittivity as a function of frequency, which is provided by another component model of the framework. The permittivity function is constructed by fitting response spectra calculated using density functional theory (DFT) and by parameter adjustment according to any additional information that may be available, e.g., experimentally measured spectra or theory-based assumptions concerning spectral features. A prototype simulation is described that considers response characteristics for THz excitation of the high explosive β-HMX. This prototype simulation includes a description of a procedure for calculating response spectra using DFT as input to the S-matrix model. For this purpose, the DFT software NRLMOL was adopted. © 2011 Society for Applied Spectroscopy
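
    A minimal sketch of the multilayer reflectivity/transmissivity computation via the standard transfer-matrix formulation at normal incidence (illustrative permittivity and thickness, not the paper's DFT-derived values):

        import numpy as np

        def rt_multilayer(freq_thz, eps_layers, d_um):
            """Transfer-matrix R and T for layers between vacuum half-spaces."""
            k0 = 2.0 * np.pi * freq_thz / 299.792458        # 1/um (c in um*THz)
            n = [1.0] + [np.sqrt(complex(e)) for e in eps_layers] + [1.0]
            M = np.eye(2, dtype=complex)
            for a in range(len(n) - 1):
                r = (n[a] - n[a + 1]) / (n[a] + n[a + 1])   # Fresnel, normal incidence
                t = 2.0 * n[a] / (n[a] + n[a + 1])
                M = M @ (np.array([[1.0, r], [r, 1.0]], dtype=complex) / t)
                if a + 1 < len(n) - 1:                      # propagate through layer
                    ph = k0 * n[a + 1] * d_um[a]
                    M = M @ np.diag([np.exp(-1j * ph), np.exp(1j * ph)])
            return abs(M[1, 0] / M[0, 0]) ** 2, abs(1.0 / M[0, 0]) ** 2

        R, T = rt_multilayer(1.0, eps_layers=[2.9 + 0.1j], d_um=[200.0])
        print(R, T, 1.0 - R - T)    # last value: fraction absorbed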

  20. Noise in Neuronal and Electronic Circuits: A General Modeling Framework and Non-Monte Carlo Simulation Techniques.

    PubMed

    Kilinc, Deniz; Demir, Alper

    2017-08-01

    The brain is extremely energy efficient and remarkably robust in what it does despite the considerable variability and noise caused by the stochastic mechanisms in neurons and synapses. Computational modeling is a powerful tool that can help us gain insight into this important aspect of brain mechanism. A deep understanding and computational design tools can help develop robust neuromorphic electronic circuits and hybrid neuroelectronic systems. In this paper, we present a general modeling framework for biological neuronal circuits that systematically captures the nonstationary stochastic behavior of ion channels and synaptic processes. In this framework, fine-grained, discrete-state, continuous-time Markov chain models of both ion channels and synaptic processes are treated in a unified manner. Our modeling framework features a mechanism for the automatic generation of the corresponding coarse-grained, continuous-state, continuous-time stochastic differential equation models for neuronal variability and noise. Furthermore, we repurpose non-Monte Carlo noise analysis techniques, which were previously developed for analog electronic circuits, for the stochastic characterization of neuronal circuits both in time and frequency domain. We verify that the fast non-Monte Carlo analysis methods produce results with the same accuracy as computationally expensive Monte Carlo simulations. We have implemented the proposed techniques in a prototype simulator, where both biological neuronal and analog electronic circuits can be simulated together in a coupled manner.
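
    A minimal sketch of the two modeling levels for a single population of two-state ion channels (illustrative rates): (a) the fine-grained discrete-state Markov chain and (b) the coarse-grained stochastic differential equation, which should produce fluctuations of comparable size.

        import numpy as np

        rng = np.random.default_rng(7)
        N, alpha, beta = 1000, 2.0, 1.0        # channels; open/close rates (1/ms)
        dt, steps = 0.01, 5000                 # ms time step; number of steps

        n_open = 0                             # (a) discrete-state Markov chain
        f_mc = np.empty(steps)
        for k in range(steps):
            n_open += rng.binomial(N - n_open, alpha * dt) - rng.binomial(n_open, beta * dt)
            f_mc[k] = n_open / N

        f = 0.0                                # (b) coarse-grained SDE (Langevin)
        f_sde = np.empty(steps)
        for k in range(steps):
            drift = alpha * (1.0 - f) - beta * f
            noise = np.sqrt((alpha * (1.0 - f) + beta * f) / N)
            f = np.clip(f + drift * dt + noise * np.sqrt(dt) * rng.normal(), 0.0, 1.0)
            f_sde[k] = f

        print(f_mc[-1000:].std(), f_sde[-1000:].std())   # comparable fluctuations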

  1. A Dynamic Multi-Projection-Contour Approximating Framework for the 3D Reconstruction of Buildings by Super-Generalized Optical Stereo-Pairs.

    PubMed

    Yan, Yiming; Su, Nan; Zhao, Chunhui; Wang, Liguo

    2017-09-19

    In this paper, a novel framework for the 3D reconstruction of buildings is proposed, focusing on remote sensing super-generalized stereo-pairs (SGSPs). 3D reconstruction cannot be performed well using nonstandard stereo pairs, since reliable stereo matching cannot be achieved when the images are collected from very different views; dense 3D points then cannot be obtained for building regions, and further 3D shape reconstruction fails. We define SGSPs as two or more optical images collected from less constrained views but covering the same buildings. It is even more difficult to reconstruct the 3D shape of a building from SGSPs using traditional frameworks. As a result, a dynamic multi-projection-contour approximating (DMPCA) framework is introduced for SGSP-based 3D reconstruction. The key idea is an optimization that finds a group of parameters of a simulated 3D model, together with a binary feature-image, that minimizes the total difference between the projection-contours of the building in the SGSPs and those of the simulated 3D model. The simulated 3D model, defined by the group of parameters, then approximates the actual 3D shape of the building. Certain parameterized 3D basic-unit-models of typical buildings were designed, and a simulated projection system was established to obtain simulated projection-contours in different views. Moreover, the artificial bee colony algorithm was employed to solve the optimization. With SGSPs collected by satellite and by our unmanned aerial vehicle, the DMPCA framework was verified in a group of experiments, which demonstrated the reliability and advantages of this work.

  2. HexSim - A general purpose framework for spatially-explicit, individual-based modeling

    EPA Science Inventory

    HexSim is a framework for constructing spatially-explicit, individual-based computer models designed for simulating terrestrial wildlife population dynamics and interactions. HexSim is useful for a broad set of modeling applications. This talk will focus on a subset of those ap...

  3. Dyadic brain modelling, mirror systems and the ontogenetic ritualization of ape gesture

    PubMed Central

    Arbib, Michael; Ganesh, Varsha; Gasser, Brad

    2014-01-01

    The paper introduces dyadic brain modelling, offering both a framework for modelling the brains of interacting agents and a general framework for simulating and visualizing the interactions generated when the brains (and the two bodies) are each coded up in computational detail. It models selected neural mechanisms in ape brains supportive of social interactions, including putative mirror neuron systems inspired by macaque neurophysiology but augmented by increased access to proprioceptive state. Simulation results for a reduced version of the model show ritualized gesture emerging from interactions between a simulated child and mother ape. PMID:24778382

  4. Dyadic brain modelling, mirror systems and the ontogenetic ritualization of ape gesture.

    PubMed

    Arbib, Michael; Ganesh, Varsha; Gasser, Brad

    2014-01-01

    The paper introduces dyadic brain modelling, offering both a framework for modelling the brains of interacting agents and a general framework for simulating and visualizing the interactions generated when the brains (and the two bodies) are each coded up in computational detail. It models selected neural mechanisms in ape brains supportive of social interactions, including putative mirror neuron systems inspired by macaque neurophysiology but augmented by increased access to proprioceptive state. Simulation results for a reduced version of the model show ritualized gesture emerging from interactions between a simulated child and mother ape.

  5. The Framework for 0-D Atmospheric Modeling (F0AM) v3.1

    NASA Technical Reports Server (NTRS)

    Wolfe, Glenn M.; Marvin, Margaret R.; Roberts, Sandra J.; Travis, Katherine R.; Liao, Jin

    2016-01-01

    The Framework for 0-D Atmospheric Modeling (F0AM) is a flexible and user-friendly MATLAB-based platform for simulation of atmospheric chemistry systems. The F0AM interface incorporates front-end configuration of observational constraints and model setups, making it readily adaptable to simulation of photochemical chambers, Lagrangian plumes, and steady-state or time-evolving solar cycles. Six different chemical mechanisms and three options for calculation of photolysis frequencies are currently available. Example simulations are presented to illustrate model capabilities and, more generally, highlight some of the advantages and challenges of 0-D box modeling.
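
    A minimal sketch of a 0-D box model in this spirit (a toy NO-NO2-O3 photostationary mechanism with assumed rate constants, not one of F0AM's six mechanisms): constrained inputs go in, concentration time series come out.

        from scipy.integrate import solve_ivp

        def mech(t, c, j_no2, k_no_o3):
            no, no2, o3 = c
            r1 = j_no2 * no2             # NO2 + hv -> NO + O3 (net)
            r2 = k_no_o3 * no * o3       # NO + O3 -> NO2 + O2
            return [r1 - r2, r2 - r1, r1 - r2]

        c0 = [1e10, 2.5e10, 7.5e11]      # molecules/cm^3 (illustrative)
        sol = solve_ivp(mech, (0.0, 3600.0), c0, args=(8e-3, 1.9e-14), rtol=1e-8)
        print(sol.y[:, -1])              # near photostationary state after 1 h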

  6. General framework for constraints in molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Kneller, Gerald R.

    2017-06-01

    The article presents a theoretical framework for molecular dynamics simulations of complex systems subject to any combination of holonomic and non-holonomic constraints. Using the concept of constrained inverse matrices both the particle accelerations and the associated constraint forces can be determined from given external forces and kinematical conditions. The formalism enables in particular the construction of explicit kinematical conditions which lead to the well-known Nosé-Hoover type equations of motion for the simulation of non-standard molecular dynamics ensembles. Illustrations are given for a few examples and an outline is presented for a numerical implementation of the method.
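
    A minimal sketch of the underlying idea for a single holonomic constraint (a fixed bond between two unit-mass particles, invented forces): the Lagrange multiplier, and hence the constraint force, follows from requiring the second time derivative of the constraint to vanish.

        import numpy as np

        # Two unit-mass particles with bond constraint g = |r1 - r2|^2 - d^2 = 0
        r1, r2 = np.zeros(3), np.array([1.0, 0.0, 0.0])    # d = 1
        v1, v2 = np.zeros(3), np.array([0.0, 1.0, 0.0])
        f1, f2 = np.zeros(3), np.array([-2.0, 0.0, 0.0])   # external forces

        r12, v12 = r1 - r2, v1 - v2
        # g'' = 2|v12|^2 + 2 r12.(a1 - a2) = 0, with a_i = f_i +/- 2*lam*r12
        lam = -(v12 @ v12 + r12 @ (f1 - f2)) / (4.0 * (r12 @ r12))
        a1 = f1 + 2.0 * lam * r12          # constraint force on particle 1
        a2 = f2 - 2.0 * lam * r12          # equal and opposite on particle 2
        print(lam, r12 @ (a1 - a2) + v12 @ v12)   # second value ~ 0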

  7. A Generalized Framework for Different Drought Indices: Testing its Suitability in a Simulation of the last two Millennia for Europe

    NASA Astrophysics Data System (ADS)

    Raible, Christoph C.; Baerenbold, Oliver; Gomez-Navarro, Juan Jose

    2016-04-01

    Over the past decades, different drought indices have been suggested in the literature. This study tackles the problem of how to characterize drought by defining a general framework and proposing a generalized family of drought indices that is flexible regarding the use of different water balance models. The sensitivity of various indices and their skill in representing drought conditions are evaluated using a regional model simulation for Europe spanning the last two millennia as a test bed. The framework combines an exponentially damped memory with a normalization method based on quantile mapping. Both approaches are more robust and physically meaningful than the existing methods used to define drought indices. Still, the framework is flexible with respect to the water balance, enabling users to adapt the index formulation to the data availability of different locations. Based on the framework, indices with water balances of different complexity are compared with each other. The comparison shows that a drought index considering only precipitation in the water balance is sufficient for Western to Central Europe. However, in the Mediterranean, temperature effects via evapotranspiration need to be considered in order to produce meaningful indices representative of actual water deficit. Similarly, our results indicate that in north-eastern Europe and Scandinavia, snow and runoff effects need to be considered in the index definition to obtain accurate results.
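
    A minimal sketch of the two framework ingredients on synthetic data (memory time scale, kernel length, and record length are assumptions): an exponentially damped memory of the monthly water balance, then quantile mapping onto a standard normal to form the index.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(8)
        wb = rng.gamma(2.0, 40.0, 1200) - 80.0        # monthly P-E anomaly (stand-in)

        tau = 6.0                                      # memory time scale in months
        w = np.exp(-np.arange(24) / tau)               # damped weights, newest first
        mem = np.convolve(wb, w / w.sum())[:wb.size]   # exponentially damped memory

        # Quantile mapping: empirical quantile -> standard normal quantile
        ranks = stats.rankdata(mem) / (mem.size + 1.0)
        index = stats.norm.ppf(ranks)                  # negative values = drought
        print(index.min(), index.max())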

  8. Open-source framework for documentation of scientific software written on MATLAB-compatible programming languages

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.; Welsh, James

    2012-09-01

    Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequate documentation of astronomical software for adaptive optics simulators can complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LATEX, Mercurial, Doxygen, and Perl. Using a Perl script that translates MATLAB M-file comments into a C-like format, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.

  9. A generalized framework for nucleosynthesis calculations

    NASA Astrophysics Data System (ADS)

    Sprouse, Trevor; Mumpower, Matthew; Aprahamian, Ani

    2014-09-01

    Simulating astrophysical events is a difficult process, requiring a detailed pairing of knowledge from both astrophysics and nuclear physics. Astrophysics guides the thermodynamic evolution of an astrophysical event. We present a nucleosynthesis framework written in Fortran that combines as inputs a thermodynamic evolution and nuclear data to time evolve the abundances of nuclear species. Through our coding practices, we have emphasized the applicability of our framework to any astrophysical event, including those involving nuclear fission. Because these calculations are often very complicated, our framework dynamically optimizes itself based on the conditions at each time step in order to greatly minimize total computation time. To highlight the power of this new approach, we demonstrate the use of our framework to simulate both Big Bang nucleosynthesis and r-process nucleosynthesis with speeds competitive with current solutions dedicated to either process alone.

  10. KMCLib 1.1: Extended random number support and technical updates to the KMCLib general framework for kinetic Monte-Carlo simulations

    NASA Astrophysics Data System (ADS)

    Leetmaa, Mikael; Skorodumova, Natalia V.

    2015-11-01

    We here present a revised version, v1.1, of the KMCLib general framework for kinetic Monte-Carlo (KMC) simulations. The generation of random numbers in KMCLib now relies on the C++11 standard library implementation, and support has been added for the user to choose from a set of C++11 implemented random number generators. The Mersenne-twister, the 24 and 48 bit RANLUX and a 'minimal-standard' PRNG are supported. We have also included the possibility to use true random numbers via the C++11 std::random_device generator. This release also includes technical updates to support the use of an extended range of operating systems and compilers.

  11. Representing Water Scarcity in Future Agricultural Assessments

    NASA Technical Reports Server (NTRS)

    Winter, Jonathan M.; Lopez, Jose R.; Ruane, Alexander C.; Young, Charles A.; Scanlon, Bridget R.; Rosenzweig, Cynthia

    2017-01-01

    Globally, irrigated agriculture is both essential for food production and the largest user of water. A major challenge for hydrologic and agricultural research communities is assessing the sustainability of irrigated croplands under climate variability and change. Simulations of irrigated croplands generally lack key interactions between water supply, water distribution, and agricultural water demand. In this article, we explore the critical interface between water resources and agriculture by motivating, developing, and illustrating the application of an integrated modeling framework to advance simulations of irrigated croplands. We motivate the framework by examining historical dynamics of irrigation water withdrawals in the United States and quantitatively reviewing previous modeling studies of irrigated croplands with a focus on representations of water supply, agricultural water demand, and impacts on crop yields when water demand exceeds water supply. We then describe the integrated modeling framework for simulating irrigated croplands, which links trends and scenarios with water supply, water allocation, and agricultural water demand. Finally, we provide examples of efforts that leverage the framework to improve simulations of irrigated croplands as well as identify opportunities for interventions that increase agricultural productivity, resiliency, and sustainability.

  12. Early Validation of Failure Detection, Isolation, and Recovery Design Using Heterogeneous Modelling and Simulation

    NASA Astrophysics Data System (ADS)

    van der Plas, Peter; Guerriero, Suzanne; Leorato, Cristiano; Rugina, Ana

    2012-08-01

    Modelling and simulation can support a number of use cases across the spacecraft development life-cycle. Given the increasing complexity of space missions, the observed general trend is towards more extensive usage of simulation already in the early phases. A major perceived advantage is that modelling and simulation can enable the validation of critical aspects of the spacecraft design before the actual development is started, thus reducing the risk in later phases. Failure Detection, Isolation, and Recovery (FDIR) is one of the areas with a high potential to benefit from early modelling and simulation. With the increasing level of required spacecraft autonomy, FDIR specifications can grow in such a way that the traditional document-based review process soon becomes inadequate. This paper shows that FDIR modelling and simulation in a system context can provide a powerful tool to support the FDIR verification process. It is highlighted that FDIR modelling at this early stage requires heterogeneous modelling tools and languages, in order to provide an adequate functional description of the different components (i.e. FDIR functions, environment, equipment, etc.) to be modelled. For this reason, an FDIR simulation framework is proposed in this paper. This framework is based on a number of tools already available in the Avionics Systems Laboratory at ESTEC: the Avionics Test Bench Functional Engineering Simulator (ATB FES), Matlab/Simulink, TASTE, and Real Time Developer Studio (RTDS). The paper then discusses the application of the proposed simulation framework to a real case study, i.e. the FDIR modelling of a satellite in support of an actual ESA mission. Challenges and benefits of the approach are described. Finally, lessons learned and the generality of the proposed approach are discussed.

  13. Predicting the excess solubility of acetanilide, acetaminophen, phenacetin, benzocaine, and caffeine in binary water/ethanol mixtures via molecular simulation.

    PubMed

    Paluch, Andrew S; Parameswaran, Sreeja; Liu, Shuai; Kolavennu, Anasuya; Mobley, David L

    2015-01-28

    We present a general framework to predict the excess solubility of small molecular solids (such as pharmaceutical solids) in binary solvents via molecular simulation free energy calculations at infinite dilution with conventional molecular models. The present study used molecular dynamics with the General AMBER Force Field to predict the excess solubility of acetanilide, acetaminophen, phenacetin, benzocaine, and caffeine in binary water/ethanol solvents. The simulations are able to predict the existence of solubility enhancement and the results are in good agreement with available experimental data. The accuracy of the predictions in addition to the generality of the method suggests that molecular simulations may be a valuable design tool for solvent selection in drug development processes.
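
    A minimal numerical sketch of why the free energy calculations suffice (illustrative values, not the paper's results): at infinite dilution, the solubility ratio of a solid solute between two solvents depends only on the difference of its solvation free energies, since the solid-phase reference state cancels in the ratio.

        import numpy as np

        RT = 8.314e-3 * 298.15        # kJ/mol at 298.15 K
        dG_water = -40.0              # solvation free energy in water (assumed)
        dG_mix = -45.0                # in a water/ethanol mixture (assumed)

        # Solid-phase reference cancels in the ratio, so only the difference
        # of solvation free energies matters at infinite dilution.
        ratio = np.exp(-(dG_mix - dG_water) / RT)
        print(f"solubility enhancement ~ {ratio:.1f}x")   # > 1: enhancement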

  14. Predicting the excess solubility of acetanilide, acetaminophen, phenacetin, benzocaine, and caffeine in binary water/ethanol mixtures via molecular simulation

    NASA Astrophysics Data System (ADS)

    Paluch, Andrew S.; Parameswaran, Sreeja; Liu, Shuai; Kolavennu, Anasuya; Mobley, David L.

    2015-01-01

    We present a general framework to predict the excess solubility of small molecular solids (such as pharmaceutical solids) in binary solvents via molecular simulation free energy calculations at infinite dilution with conventional molecular models. The present study used molecular dynamics with the General AMBER Force Field to predict the excess solubility of acetanilide, acetaminophen, phenacetin, benzocaine, and caffeine in binary water/ethanol solvents. The simulations are able to predict the existence of solubility enhancement and the results are in good agreement with available experimental data. The accuracy of the predictions in addition to the generality of the method suggests that molecular simulations may be a valuable design tool for solvent selection in drug development processes.

  15. Monitoring and modeling as a continuing learning process: the use of hydrological models in a general probabilistic framework.

    NASA Astrophysics Data System (ADS)

    Baroni, G.; Gräff, T.; Reinstorf, F.; Oswald, S. E.

    2012-04-01

    Nowadays, uncertainty and sensitivity analysis are considered basic tools for the assessment of hydrological models and the evaluation of the most important sources of uncertainty. In this context, several methods have been developed over the last decades and applied under different hydrological conditions. In most cases, however, studies have mainly investigated the influence of parameter uncertainty on the simulated outputs, and only a few approaches have also considered other sources of uncertainty, i.e. input and model structure. Moreover, several constraints arise when spatially distributed parameters are involved. To overcome these limitations, a general probabilistic framework based on Monte Carlo simulations and the Sobol method has been proposed. In this study, the general probabilistic framework was applied at field scale using a 1D physically based hydrological model (SWAP). Furthermore, the framework was extended to catchment scale in combination with a spatially distributed hydrological model (SHETRAN). The models are applied at two experimental sites in Germany: a relatively flat cropped field close to Potsdam (Brandenburg) and a small mountainous catchment with agricultural land use (Schaefertal, Harz Mountains). In both cases, input and parameters are considered the major sources of uncertainty. Evaluation of the models was based on soil moisture measured at plot scale at different depths and, for the catchment site, also on daily discharge values. The study shows how the framework can take into account the various sources of uncertainty, i.e. input data, parameters (either in scalar or spatially distributed form) and model structures. The framework can be used in a loop to optimize further monitoring activities that improve the performance of the model. In these applications, the results show that the sources of uncertainty are specific to each process considered. The influence of the input data, as well as the presence of compensating errors, becomes clear from the different processes simulated.
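
    A minimal sketch of Monte Carlo-based Sobol sensitivity analysis (Saltelli sampling with standard first-order and Jansen total-order estimators) on a stand-in function of three factors; in the study the factors would be inputs and parameters of SWAP or SHETRAN.

        import numpy as np

        def model(x):                 # stand-in for a SWAP/SHETRAN output metric
            return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

        rng = np.random.default_rng(9)
        n, k = 4096, 3
        A = rng.uniform(0, 1, (n, k))             # two independent sample blocks
        B = rng.uniform(0, 1, (n, k))
        fA, fB = model(A), model(B)
        var = np.var(np.concatenate([fA, fB]))

        for i in range(k):
            ABi = A.copy()
            ABi[:, i] = B[:, i]                   # resample only factor i
            fABi = model(ABi)
            S1 = np.mean(fB * (fABi - fA)) / var          # first-order index
            ST = 0.5 * np.mean((fA - fABi) ** 2) / var    # total-order (Jansen)
            print(f"x{i}: S1={S1:.2f}, ST={ST:.2f}")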

  16. A Computational Framework for Bioimaging Simulation.

    PubMed

    Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, formidable challenges remain in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.
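
    A minimal sketch of such a systematic-effects chain on synthetic data (all optics numbers assumed): point emitters are blurred by a Gaussian point-spread function, photon shot noise is applied, and camera gain, offset, and read noise produce the final digital image.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(10)
        img = np.zeros((128, 128))
        for y, x in rng.integers(16, 112, size=(30, 2)):
            img[y, x] += 400.0                   # expected photons per molecule

        img = gaussian_filter(img, sigma=2.0)    # PSF blur (sigma in pixels)
        img += 5.0                               # uniform background
        photons = rng.poisson(img)               # photon shot noise
        counts = 100.0 + 0.9 * photons + rng.normal(0.0, 2.0, img.shape)
        print(counts.mean())                     # camera offset, gain, read noise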

  17. Molecular dynamics simulation of framework flexibility effects on noble gas diffusion in HKUST-1 and ZIF-8

    DOE PAGES

    Parkes, Marie V.; Demir, Hakan; Teich-McGoldrick, Stephanie L.; ...

    2014-03-28

    Molecular dynamics simulations were used to investigate trends in noble gas (Ar, Kr, Xe) diffusion in the metal-organic frameworks HKUST-1 and ZIF-8. Diffusion occurs primarily through inter-cage jump events, with much greater diffusion of guest atoms in HKUST-1 compared to ZIF-8 due to the larger cage and window sizes in the former. We compare diffusion coefficients calculated for both rigid and flexible frameworks. For rigid framework simulations, in which the framework atoms were held at their crystallographic or geometry optimized coordinates, sometimes dramatic differences in guest diffusion were seen depending on the initial framework structure or the choice of framework force field parameters. When framework flexibility effects were included, argon and krypton diffusion increased significantly compared to rigid-framework simulations using general force field parameters. Additionally, for argon and krypton in ZIF-8, guest diffusion increased with loading, demonstrating that guest-guest interactions between cages enhance inter-cage diffusion. No inter-cage jump events were seen for xenon atoms in ZIF-8 regardless of force field or initial structure, and the loading dependence of xenon diffusion in HKUST-1 is different for rigid and flexible frameworks. Diffusion of krypton and xenon in HKUST-1 depends on two competing effects: the steric effect that decreases diffusion as loading increases, and the “small cage effect” that increases diffusion as loading increases. Finally, a detailed analysis of the window size in ZIF-8 reveals that the window increases beyond its normal size to permit passage of a (nominally) larger krypton atom.
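
    A minimal sketch of how diffusion coefficients are typically extracted from such trajectories (random-walk stand-in data, not MOF simulations): the Einstein relation applied to the mean-squared displacement.

        import numpy as np

        rng = np.random.default_rng(11)
        dt, n_steps, n_atoms = 1e-3, 5000, 50                  # ns, steps, guest atoms
        steps = rng.normal(0.0, 0.05, (n_steps, n_atoms, 3))   # nm per step
        traj = np.cumsum(steps, axis=0)                        # unwrapped positions

        lags = np.arange(1, 250, 5)
        msd = np.array([np.mean(np.sum((traj[l:] - traj[:-l]) ** 2, axis=-1))
                        for l in lags])                        # 3-D mean-squared displacement
        D = np.polyfit(lags * dt, msd, 1)[0] / 6.0             # Einstein: MSD = 6 D t
        print(f"D ~ {D:.2f} nm^2/ns")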

  18. Development and application of numerical techniques for general-relativistic magnetohydrodynamics simulations of black hole accretion

    NASA Astrophysics Data System (ADS)

    White, Christopher Joseph

    We describe the implementation of sophisticated numerical techniques for general-relativistic magnetohydrodynamics simulations in the Athena++ code framework. Improvements over many existing codes include the use of advanced Riemann solvers and of staggered-mesh constrained transport. Combined with considerations for computational performance and parallel scalability, these allow us to investigate black hole accretion flows with unprecedented accuracy. The capability of the code is demonstrated by exploring magnetically arrested disks.

  19. Predicting the excess solubility of acetanilide, acetaminophen, phenacetin, benzocaine, and caffeine in binary water/ethanol mixtures via molecular simulation

    PubMed Central

    Paluch, Andrew S.; Parameswaran, Sreeja; Liu, Shuai; Kolavennu, Anasuya; Mobley, David L.

    2015-01-01

    We present a general framework to predict the excess solubility of small molecular solids (such as pharmaceutical solids) in binary solvents via molecular simulation free energy calculations at infinite dilution with conventional molecular models. The present study used molecular dynamics with the General AMBER Force Field to predict the excess solubility of acetanilide, acetaminophen, phenacetin, benzocaine, and caffeine in binary water/ethanol solvents. The simulations are able to predict the existence of solubility enhancement and the results are in good agreement with available experimental data. The accuracy of the predictions in addition to the generality of the method suggests that molecular simulations may be a valuable design tool for solvent selection in drug development processes. PMID:25637996

  20. Simulation of Auger electron emission from nanometer-size gold targets using the Geant4 Monte Carlo simulation toolkit

    NASA Astrophysics Data System (ADS)

    Incerti, S.; Suerfu, B.; Xu, J.; Ivantchenko, V.; Mantero, A.; Brown, J. M. C.; Bernal, M. A.; Francis, Z.; Karamitros, M.; Tran, H. N.

    2016-04-01

    A revised atomic deexcitation framework for the Geant4 general purpose Monte Carlo toolkit, capable of simulating full Auger deexcitation cascades, was implemented in the June 2015 release (version 10.2 Beta). An overview of this refined framework and testing of its capabilities is presented for the irradiation of gold nanoparticles (NPs) with keV photon and MeV proton beams. The resultant energy spectra of secondary particles created within, and escaping from, the NP are analyzed and discussed. It is anticipated that this new functionality will improve and increase the use of Geant4 in medical physics, radiobiology, nanomedicine research, and other low energy physics fields.

  1. Robust Decision Making in a Nonlinear World

    ERIC Educational Resources Information Center

    Dougherty, Michael R.; Thomas, Rick P.

    2012-01-01

    The authors propose a general modeling framework called the general monotone model (GeMM), which allows one to model psychological phenomena that manifest as nonlinear relations in behavior data without the need for making (overly) precise assumptions about functional form. Using both simulated and real data, the authors illustrate that GeMM…

  2. Unsteady Analyses of Valve Systems in Rocket Engine Testing Environments

    NASA Technical Reports Server (NTRS)

    Shipman, Jeremy; Hosangadi, Ashvin; Ahuja, Vineet

    2004-01-01

    This paper discusses simulation technology used to support the testing of rocket propulsion systems by performing high fidelity analyses of feed system components. A generalized multi-element framework has been used to perform simulations of control valve systems. This framework provides the flexibility to resolve the structural and functional complexities typically associated with valve-based high pressure feed systems that are difficult to deal with using traditional Computational Fluid Dynamics (CFD) methods. In order to validate this framework for control valve systems, results are presented for simulations of a cryogenic control valve at various plug settings and compared to both experimental data and simulation results obtained at NASA Stennis Space Center. A detailed unsteady analysis has also been performed for a pressure regulator type control valve used to support rocket engine and component testing at Stennis Space Center. The transient simulation captures the onset of a modal instability that has been observed in the operation of the valve. A discussion of the flow physics responsible for the instability and a prediction of the dominant modes associated with the fluctuations is presented.

  3. An Overview of Atmospheric Composition OSSE Activities at NASA's Global Modeling and Assimilation Office

    NASA Technical Reports Server (NTRS)

    da Silva, Arlindo

    2012-01-01

    A model-based Observing System Simulation Experiment (OSSE) is a framework for numerical experimentation in which observables are simulated from fields generated by an earth system model, including a parameterized description of observational error characteristics. Simulated observations can be used for sampling studies, for quantifying errors in analysis or retrieval algorithms, and ultimately as a planning tool for designing new observing missions. While this framework has traditionally been used to assess the impact of observations on numerical weather prediction, it has a much broader applicability, in particular to aerosols and chemical constituents. In this talk we give a general overview of Observing System Simulation Experiment (OSSE) activities at NASA's Global Modeling and Assimilation Office, with a focus on its emerging atmospheric composition component.
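
    A minimal sketch of the core OSSE step with a toy 1-D nature run (error statistics assumed): apply an observation operator to the model truth and add parameterized observation error.

        import numpy as np

        rng = np.random.default_rng(12)
        grid = np.linspace(0.0, 360.0, 361)                     # nature-run longitudes
        truth = 300.0 + 10.0 * np.sin(np.radians(3.0 * grid))   # stand-in model field

        obs_lon = rng.uniform(0.0, 360.0, 200)                  # instrument sampling
        def H(field):                                           # observation operator
            return np.interp(obs_lon, grid, field)

        sigma_o, bias = 1.5, 0.3                                # assumed error model
        y = H(truth) + bias + rng.normal(0.0, sigma_o, obs_lon.size)
        print(y[:5])   # simulated observations for sampling or impact studies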

  4. A Collection of Nonlinear Aircraft Simulations in MATLAB

    NASA Technical Reports Server (NTRS)

    Garza, Frederico R.; Morelli, Eugene A.

    2003-01-01

    Nonlinear six degree-of-freedom simulations for a variety of aircraft were created using MATLAB. Data for aircraft geometry, aerodynamic characteristics, mass / inertia properties, and engine characteristics were obtained from open literature publications documenting wind tunnel experiments and flight tests. Each nonlinear simulation was implemented within a common framework in MATLAB, and includes an interface with another commercially-available program to read pilot inputs and produce a three-dimensional (3-D) display of the simulated airplane motion. Aircraft simulations include the General Dynamics F-16 Fighting Falcon, Convair F-106B Delta Dart, Grumman F-14 Tomcat, McDonnell Douglas F-4 Phantom, NASA Langley Free-Flying Aircraft for Sub-scale Experimental Research (FASER), NASA HL-20 Lifting Body, NASA / DARPA X-31 Enhanced Fighter Maneuverability Demonstrator, and the Vought A-7 Corsair II. All nonlinear simulations and 3-D displays run in real time in response to pilot inputs, using contemporary desktop personal computer hardware. The simulations can also be run in batch mode. Each nonlinear simulation includes the full nonlinear dynamics of the bare airframe, with a scaled direct connection from pilot inputs to control surface deflections to provide adequate pilot control. Since all the nonlinear simulations are implemented entirely in MATLAB, user-defined control laws can be added in a straightforward fashion, and the simulations are portable across various computing platforms. Routines for trim, linearization, and numerical integration are included. The general nonlinear simulation framework and the specifics for each particular aircraft are documented.

  5. A Generic Multibody Parachute Simulation Model

    NASA Technical Reports Server (NTRS)

    Neuhaus, Jason Richard; Kenney, Patrick Sean

    2006-01-01

    Flight simulation of dynamic atmospheric vehicles with parachute systems is a complex task that is not easily modeled in many simulation frameworks. In the past, the performance of vehicles with parachutes was analyzed by simulations dedicated to parachute operations and were generally not used for any other portion of the vehicle flight trajectory. This approach required multiple simulation resources to completely analyze the performance of the vehicle. Recently, improved software engineering practices and increased computational power have allowed a single simulation to model the entire flight profile of a vehicle employing a parachute.

  6. skeleSim: an extensible, general framework for population genetic simulation in R.

    PubMed

    Parobek, Christian M; Archer, Frederick I; DePrenger-Levin, Michelle E; Hoban, Sean M; Liggins, Libby; Strand, Allan E

    2017-01-01

    Simulations are a key tool in molecular ecology for inference and forecasting, as well as for evaluating new methods. Due to growing computational power and a diversity of software with different capabilities, simulations are becoming increasingly powerful and useful. However, the widespread use of simulations by geneticists and ecologists is hindered by difficulties in understanding these softwares' complex capabilities, composing code and input files, a daunting bioinformatics barrier and a steep conceptual learning curve. skeleSim (an R package) guides users in choosing appropriate simulations, setting parameters, calculating genetic summary statistics and organizing data output, in a reproducible pipeline within the R environment. skeleSim is designed to be an extensible framework that can 'wrap' around any simulation software (inside or outside the R environment) and be extended to calculate and graph any genetic summary statistics. Currently, skeleSim implements coalescent and forward-time models available in the fastsimcoal2 and rmetasim simulation engines to produce null distributions for multiple population genetic statistics and marker types, under a variety of demographic conditions. skeleSim is intended to make simulations easier while still allowing full model complexity to ensure that simulations play a fundamental role in molecular ecology investigations. skeleSim can also serve as a teaching tool: demonstrating the outcomes of stochastic population genetic processes; teaching general concepts of simulations; and providing an introduction to the R environment with a user-friendly graphical user interface (using shiny). © 2016 John Wiley & Sons Ltd.

  7. skeleSim: an extensible, general framework for population genetic simulation in R

    PubMed Central

    Parobek, Christian M.; Archer, Frederick I.; DePrenger-Levin, Michelle E.; Hoban, Sean M.; Liggins, Libby; Strand, Allan E.

    2016-01-01

    Simulations are a key tool in molecular ecology for inference and forecasting, as well as for evaluating new methods. Due to growing computational power and a diversity of software with different capabilities, simulations are becoming increasingly powerful and useful. However, the widespread use of simulations by geneticists and ecologists is hindered by difficulties in understanding these software packages’ complex capabilities, composing code and input files, a daunting bioinformatics barrier, and a steep conceptual learning curve. skeleSim (an R package) guides users in choosing appropriate simulations, setting parameters, calculating genetic summary statistics, and organizing data output, in a reproducible pipeline within the R environment. skeleSim is designed to be an extensible framework that can ‘wrap’ around any simulation software (inside or outside the R environment) and be extended to calculate and graph any genetic summary statistics. Currently, skeleSim implements coalescent and forward-time models available in the fastsimcoal2 and rmetasim simulation engines to produce null distributions for multiple population genetic statistics and marker types, under a variety of demographic conditions. skeleSim is intended to make simulations easier while still allowing full model complexity to ensure that simulations play a fundamental role in molecular ecology investigations. skeleSim can also serve as a teaching tool: demonstrating the outcomes of stochastic population genetic processes; teaching general concepts of simulations; and providing an introduction to the R environment with a user-friendly graphical user interface (using shiny). PMID:27736016

  8. Generalized interactions using virtual tools within the spring framework: probing, piercing, cauterizing and ablating

    NASA Technical Reports Server (NTRS)

    Montgomery, Kevin; Bruyns, Cynthia D.

    2002-01-01

    We present schemes for real-time generalized interactions such as probing, piercing, cauterizing and ablating virtual tissues. These methods have been implemented in a robust, real-time (haptic rate) surgical simulation environment allowing us to model procedures including animal dissection, microsurgery, hysteroscopy, and cleft lip repair.

  9. A motion sensing-based framework for robotic manipulation.

    PubMed

    Deng, Hao; Xia, Zeyang; Weng, Shaokui; Gan, Yangzhou; Fang, Peng; Xiong, Jing

    2016-01-01

    To date, outside of controlled environments, robots normally perform manipulation tasks in cooperation with human operators. This pattern requires operators to undergo extensive technical training on varied teach-pendant operating systems. Motion sensing technology, which enables human-machine interaction through a novel and natural gesture interface, inspired us to adopt this user-friendly and straightforward operation mode for robotic manipulation. Thus, in this paper, we present a motion sensing-based framework for robotic manipulation, which recognizes gesture commands captured from a motion sensing input device and drives the action of robots. For compatibility, a general hardware interface layer was also developed in the framework. Simulation and physical experiments have been conducted for preliminary validation. The results show that the proposed framework is an effective approach for general robotic manipulation with motion sensing control.

  10. A Computational Framework for Bioimaging Simulation

    PubMed Central

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508
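
    The key step the framework automates is turning a noise-free model prediction into a photon-limited digital image. A minimal Python sketch of one such systematic effect (Poisson shot noise plus camera gain, baseline, and readout noise) is shown below; all parameter values are illustrative, not those of the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      def render(intensity, exposure=0.1, gain=2.0, baseline=100, read_noise=1.5):
          """intensity: expected photon flux per pixel (photons/s)."""
          photons = rng.poisson(intensity * exposure)                # shot noise
          counts = gain * photons + baseline                         # camera conversion
          counts = counts + rng.normal(0, read_noise, counts.shape)  # readout noise
          return counts

      # Example: a blurred point source on a 64x64 detector.
      yy, xx = np.mgrid[0:64, 0:64]
      flux = 5000.0 * np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / (2 * 3.0 ** 2))
      image = render(flux)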

  11. Simulation of Auger electron emission from nanometer-size gold targets using the Geant4 Monte Carlo simulation toolkit

    DOE PAGES

    Incerti, S.; Suerfu, B.; Xu, J.; ...

    2016-02-16

    We report that a revised atomic deexcitation framework for the Geant4 general-purpose Monte Carlo toolkit, capable of simulating full Auger deexcitation cascades, was implemented in the June 2015 release (version 10.2 Beta). An overview of this refined framework and testing of its capabilities is presented for the irradiation of gold nanoparticles (NPs) with keV photon and MeV proton beams. The resultant energy spectra of secondary particles created within and escaping the NP are analyzed and discussed. It is anticipated that this new functionality will improve and increase the use of Geant4 in medical physics, radiobiology, nanomedicine research and other low-energy physics fields.

  12. Use of the quasi-geostrophic dynamical framework to reconstruct the 3-D ocean state in a high-resolution realistic simulation of North Atlantic.

    NASA Astrophysics Data System (ADS)

    Fresnay, Simon; Ponte, Aurélien

    2017-04-01

    The quasi-geostrophic (QG) framework has been, and will remain for years to come, a cornerstone method linking observations with estimates of the ocean circulation and state. We have used the QG framework to reconstruct dynamical variables of the 3-D ocean in a state-of-the-art high-resolution (1/60 deg, 300 vertical levels) numerical simulation of the North Atlantic (NATL60). The work was carried out in three boxes of the simulation: Gulf Stream, Azores and Reykjanes Ridge. In a first part, general diagnostics describing the eddying dynamics were performed and show that the QG scaling holds in general, at depths distant from the mixed layer and from bathymetric gradients. Correlations with surface observable variables (e.g. temperature, sea level) were computed, and estimates of quasi-geostrophic potential vorticity (QGPV) were reconstructed by means of regression laws. It is shown that the reconstruction of QGPV exhibits valuable skill for a restricted scale range, mainly using sea level as the regression variable. Additional discussion is given, based on the flow balanced with QGPV. This work is part of the DIMUP project, which aims to improve our ability to operationally estimate the ocean state.

  13. Selected Topics in Overset Technology Development and Applications At NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    This paper presents a general overview of overset technology development and applications at NASA Ames Research Center. The topics include: 1) Overview of overset activities at NASA Ames; 2) Recent developments in Chimera Grid Tools; 3) A general framework for multiple component dynamics; 4) A general script module for automating liquid rocket sub-systems simulations; and 5) Critical future work.

  14. Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator.

    PubMed

    Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus

    2017-01-01

    Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation.
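
    The waveform-relaxation idea can be illustrated compactly: instead of exchanging rates at every time step, each unit integrates over a whole window using the other units' trajectories from the previous sweep, so communication happens once per sweep. Below is a minimal Python sketch for a linear rate network tau*dx/dt = -x + W x + I; it is a Gauss-Jacobi-style toy, not the paper's implementation, and all values are illustrative.

      import numpy as np

      def waveform_relaxation(W, I, x0, T=1.0, dt=1e-3, tau=0.05, sweeps=10):
          n = len(x0)
          steps = int(T / dt)
          X = np.tile(x0, (steps + 1, 1))       # initial guess: constant trajectories
          for _ in range(sweeps):
              X_prev = X.copy()                 # "communicated" trajectories
              for i in range(n):
                  x = x0[i]
                  for k in range(steps):
                      coupling = W[i] @ X_prev[k]   # uses last sweep, not current state
                      x = x + dt / tau * (-x + coupling + I[i])
                      X[k + 1, i] = x
          return X

      rng = np.random.default_rng(1)
      W = 0.1 * rng.standard_normal((5, 5))
      X = waveform_relaxation(W, I=np.ones(5), x0=np.zeros(5))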

  15. Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator

    PubMed Central

    Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus

    2017-01-01

    Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation. PMID:28596730

  16. Covert rapid action-memory simulation (CRAMS): a hypothesis of hippocampal-prefrontal interactions for adaptive behavior.

    PubMed

    Wang, Jane X; Cohen, Neal J; Voss, Joel L

    2015-01-01

    Effective choices generally require memory, yet little is known regarding the cognitive or neural mechanisms that allow memory to influence choices. We outline a new framework proposing that covert memory processing of hippocampus interacts with action-generation processing of prefrontal cortex in order to arrive at optimal, memory-guided choices. Covert, rapid action-memory simulation (CRAMS) is proposed here as a framework for understanding cognitive and/or behavioral choices, whereby prefrontal-hippocampal interactions quickly provide multiple simulations of potential outcomes used to evaluate the set of possible choices. We hypothesize that this CRAMS process is automatic, obligatory, and covert, meaning that many cycles of action-memory simulation occur in response to choice conflict without an individual's necessary intention and generally without awareness of the simulations, leading to adaptive behavior with little perceived effort. CRAMS is thus distinct from influential proposals that adaptive memory-based behavior in humans requires consciously experienced memory-based construction of possible future scenarios and deliberate decisions among possible future constructions. CRAMS provides an account of why hippocampus has been shown to make critical contributions to the short-term control of behavior, and it motivates several new experimental approaches and hypotheses that could be used to better understand the ubiquitous role of prefrontal-hippocampal interactions in situations that require adaptively using memory to guide choices. Importantly, this framework provides a perspective that allows for testing decision-making mechanisms in a manner that translates well across human and nonhuman animal model systems. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Integrated corridor management analysis, modeling and simulation (AMS) methodology.

    DOT National Transportation Integrated Search

    2008-03-01

    This AMS Methodologies Document provides a discussion of potential ICM analytical approaches for the assessment of generic corridor operations. The AMS framework described in this report identifies strategies and procedures for tailoring AMS general ...

  18. The evaluation of a framework for measuring the non-technical ward round skills of final year nursing students: An observational study.

    PubMed

    Murray, Kara; McKenzie, Karen; Kelleher, Michael

    2016-10-01

    The importance of non-technical skills (NTS) to patient outcomes is increasingly being recognised; however, there is limited research into how such skills can be taught and evaluated in student nurses in relation to ward rounds. This pilot study describes an evaluation of a NTS framework that could potentially be used to measure the ward round skills of student nurses. The study used an observational design. Potential key NTS were identified from existing literature and NTS taxonomies. The proposed framework was then used to evaluate whether the identified NTS were evident in a series of ward round simulations that final year general nursing students undertook as part of their training. Finally, the views of a small group of qualified nurse educators, qualified nurses and general nursing students were sought about whether the identified NTS were important and relevant to practice. The proposed NTS framework included seven categories: Communication, Decision Making, Situational Awareness, Teamwork, Task Management, Student Initiative and Responsiveness to Patient. All were rated as important and relevant to practice. The pilot study suggests that the proposed NTS framework could be used as a means of evaluating student nurse competencies in respect of many non-technical skills required for a successful ward round. Further work is required to establish the validity of the framework in educational settings and to determine the extent to which it is of use in a non-simulated ward round setting. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Pattern-oriented modeling of agent-based complex systems: Lessons from ecology

    USGS Publications Warehouse

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.

    2005-01-01

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.

  20. Pattern-Oriented Modeling of Agent-Based Complex Systems: Lessons from Ecology

    NASA Astrophysics Data System (ADS)

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.

    2005-11-01

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.

  1. Multipolar Ewald Methods, 2: Applications Using a Quantum Mechanical Force Field

    PubMed Central

    2015-01-01

    A fully quantum mechanical force field (QMFF) based on a modified “divide-and-conquer” (mDC) framework is applied to a series of molecular simulation applications, using a generalized Particle Mesh Ewald method extended to multipolar charge densities. Simulation results are presented for three example applications: liquid water, p-nitrophenylphosphate reactivity in solution, and crystalline N,N-dimethylglycine. Simulations of liquid water using a parametrized mDC model are compared to TIP3P and TIP4P/Ew water models and experiment. The mDC model is shown to be superior for cluster binding energies and generally comparable for bulk properties. Examination of the dissociative pathway for dephosphorylation of p-nitrophenylphosphate shows that the mDC method evaluated with the DFTB3/3OB and DFTB3/OPhyd semiempirical models bracket the experimental barrier, whereas DFTB2 and AM1/d-PhoT QM/MM simulations exhibit deficiencies in the barriers, the latter of which is related, in part, to the anomalous underestimation of the p-nitrophenylate leaving group pKa. Simulations of crystalline N,N-dimethylglycine are performed and the overall structure and atomic fluctuations are compared with experiment and the general AMBER force field (GAFF). The QMFF, which was not parametrized for this application, was shown to be in better agreement with crystallographic data than GAFF. Our simulations highlight some of the application areas that may benefit from using new QMFFs, and they demonstrate progress toward the development of accurate QMFFs using the recently developed mDC framework. PMID:25691830

  2. LIPID11: A Modular Framework for Lipid Simulations using Amber

    PubMed Central

    Skjevik, Åge A.; Madej, Benjamin D.; Walker, Ross C.; Teigen, Knut

    2013-01-01

    Accurate simulation of complex lipid bilayers has long been a goal in condensed phase molecular dynamics (MD). Structure and function of membrane-bound proteins are highly dependent on the lipid bilayer environment and are challenging to study through experimental methods. Within Amber, there has been limited focus on lipid simulations, although some success has been seen with the use of the General Amber Force Field (GAFF). However, to date there are no dedicated Amber lipid force fields. In this paper we describe a new charge derivation strategy for lipids consistent with the Amber RESP approach, and a new atom and residue naming and type convention. In the first instance, we have combined this approach with GAFF parameters. The result is LIPID11, a flexible, modular framework for the simulation of lipids that is fully compatible with the existing Amber force fields. The charge derivation procedure, capping strategy and nomenclature for LIPID11, along with preliminary simulation results and a discussion of the planned long-term parameter development are presented here. Our findings suggest that LIPID11 is a modular framework feasible for phospholipids and a flexible starting point for the development of a comprehensive, Amber-compatible lipid force field. PMID:22916730

  3. Variational coarse-graining procedure for dynamic homogenization

    NASA Astrophysics Data System (ADS)

    Liu, Chenchen; Reina, Celia

    2017-07-01

    We present a variational coarse-graining framework for heterogeneous media in the spirit of FE2 methods, that allows for a seamless transition from the traditional static scenario to dynamic loading conditions, while being applicable to general material behavior as well as to discrete or continuous representations of the material and its deformation, e.g., finite element discretizations or atomistic systems. The method automatically delivers the macroscopic equations of motion together with the generalization of Hill's averaging relations to the dynamic setting. These include the expression of the macroscopic stresses and linear momentum as a function of the microscopic fields. We further demonstrate with a proof of concept example, that the proposed theoretical framework can be used to perform multiscale numerical simulations. The results are compared with standard single-scale finite element simulations, showcasing the capability of the method to capture the dispersive nature of the medium in the range of frequencies permitted by the multiscale strategy.
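
    For orientation, the static Hill relations being generalized express the macroscopic stress, and in the dynamic setting also the linear momentum, as volume averages of the microscopic fields over the unit cell Omega. A standard-notation sketch follows; the paper's dynamic generalization contains further terms not reproduced here:

      \bar{\boldsymbol{\sigma}} = \frac{1}{|\Omega|}\int_{\Omega}\boldsymbol{\sigma}\,\mathrm{d}V,
      \qquad
      \bar{\mathbf{p}} = \frac{1}{|\Omega|}\int_{\Omega}\rho\,\dot{\mathbf{u}}\,\mathrm{d}V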

  4. A stochastic agent-based model of pathogen propagation in dynamic multi-relational social networks

    PubMed Central

    Khan, Bilal; Dombrowski, Kirk; Saad, Mohamed

    2015-01-01

    We describe a general framework for modeling and stochastic simulation of epidemics in realistic dynamic social networks, which incorporates heterogeneity in the types of individuals, types of interconnecting risk-bearing relationships, and types of pathogens transmitted across them. Dynamism is supported through arrival and departure processes, continuous restructuring of risk relationships, and changes to pathogen infectiousness, as mandated by natural history; dynamism is regulated through constraints on the local agency of individual nodes and their risk behaviors, while simulation trajectories are validated using system-wide metrics. To illustrate its utility, we present a case study that applies the proposed framework towards a simulation of HIV in artificial networks of intravenous drug users (IDUs) modeled using data collected in the Social Factors for HIV Risk survey. PMID:25859056
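
    The three ingredients named above (arrival/departure of agents, continuous restructuring of risk ties, and transmission across current ties) fit in a few lines of Python. The sketch below is a generic SI toy with illustrative rates, not the paper's HIV/IDU model.

      import random

      random.seed(42)

      agents = {i: {"infected": i == 0} for i in range(100)}
      ties = set()

      def rewire(p_add=0.02, p_drop=0.05):
          """Continuous restructuring of risk relationships."""
          for tie in list(ties):
              if random.random() < p_drop:
                  ties.discard(tie)
          ids = list(agents)
          for _ in range(int(p_add * len(ids))):
              a, b = random.sample(ids, 2)
              ties.add((min(a, b), max(a, b)))

      def transmit(beta=0.3):
          """Pathogen spreads across currently active ties."""
          for a, b in ties:
              if agents[a]["infected"] != agents[b]["infected"] and random.random() < beta:
                  agents[a]["infected"] = agents[b]["infected"] = True

      for step in range(200):
          rewire()
          transmit()
          # Arrival/departure processes: occasional turnover of agents.
          if random.random() < 0.1:
              new_id = max(agents) + 1
              agents[new_id] = {"infected": False}
              departing = random.choice(list(agents))
              agents.pop(departing)
              ties.difference_update({t for t in ties if departing in t})

      print(sum(a["infected"] for a in agents.values()), "infected of", len(agents))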

  5. Symphony: A Framework for Accurate and Holistic WSN Simulation

    PubMed Central

    Riliskis, Laurynas; Osipov, Evgeny

    2015-01-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles. PMID:25723144

  6. Pitting corrosion as a mixed system: coupled deterministic-probabilistic simulation of pit growth

    NASA Astrophysics Data System (ADS)

    Ibrahim, Israr B. M.; Fonna, S.; Pidaparti, R.

    2018-05-01

    Stochastic behavior of pitting corrosion poses a unique challenge for computational analysis; however, pitting stems from the same electrochemical activity that causes general corrosion. In this paper, a framework for corrosion pit growth simulation based on the coupling of the Cellular Automaton (CA) and Boundary Element Methods (BEM) is presented. The framework assumes that pitting corrosion is controlled by electrochemical activity inside the pit cavity. The BEM provides the prediction of electrochemical activity given the geometrical data and polarization curves, while the CA is used to simulate the evolution of pit shapes based on the electrochemical activity provided by the BEM. To demonstrate the methodology, a sample case of local corrosion cells formed in pitting corrosion with varied dimensions and polarization functions is considered. Results show that certain shapes tend to grow in certain types of environments. Some pit shapes appear to pose a higher risk by being potentially significant stress raisers or by potentially increasing the rate of corrosion under the surface. Furthermore, these pits are comparable to commonly observed pit shapes in general corrosion environments.
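
    The CA half of the coupling reduces to a simple rule: cells bordering the cavity dissolve with a probability driven by the local anodic current density. In the minimal Python sketch below, a hypothetical analytic decay stands in for the BEM solve so the example is self-contained; grid size, rates, and the decay law are all illustrative.

      import numpy as np

      rng = np.random.default_rng(3)
      metal = np.ones((60, 60), dtype=bool)   # True = intact metal
      metal[0, 28:32] = False                 # seed pit at the surface (row 0)

      def current_density(depth):
          """Stand-in for the BEM solve: activity decays with pit depth."""
          return np.exp(-0.05 * depth)

      for step in range(400):
          # Cells adjacent to the cavity are candidates for dissolution.
          cavity = ~metal
          pad = np.pad(cavity, 1, constant_values=False)
          neighbors = (pad[:-2, 1:-1] | pad[2:, 1:-1] |
                       pad[1:-1, :-2] | pad[1:-1, 2:])
          for i, j in np.argwhere(metal & neighbors):
              if rng.random() < 0.1 * current_density(i):
                  metal[i, j] = False

      print("dissolved cells:", int((~metal).sum()))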

  7. Accurate and efficient integration for molecular dynamics simulations at constant temperature and pressure

    NASA Astrophysics Data System (ADS)

    Lippert, Ross A.; Predescu, Cristian; Ierardi, Douglas J.; Mackenzie, Kenneth M.; Eastwood, Michael P.; Dror, Ron O.; Shaw, David E.

    2013-10-01

    In molecular dynamics simulations, control over temperature and pressure is typically achieved by augmenting the original system with additional dynamical variables to create a thermostat and a barostat, respectively. These variables generally evolve on timescales much longer than those of particle motion, but typical integrator implementations update the additional variables along with the particle positions and momenta at each time step. We present a framework that replaces the traditional integration procedure with separate barostat, thermostat, and Newtonian particle motion updates, allowing thermostat and barostat updates to be applied infrequently. Such infrequent updates provide a particularly substantial performance advantage for simulations parallelized across many computer processors, because thermostat and barostat updates typically require communication among all processors. Infrequent updates can also improve accuracy by alleviating certain sources of error associated with limited-precision arithmetic. In addition, separating the barostat, thermostat, and particle motion update steps reduces certain truncation errors, bringing the time-average pressure closer to its target value. Finally, this framework, which we have implemented on both general-purpose and special-purpose hardware, reduces software complexity and improves software modularity.
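
    The core of the scheme is a loop of plain Newtonian velocity-Verlet steps with the thermostat applied only every N steps. A minimal Python sketch follows, using independent harmonic wells and simple velocity rescaling as stand-ins for a real force field and the paper's thermostat/barostat; in a parallel code, the infrequent branch is where global communication would occur.

      import numpy as np

      rng = np.random.default_rng(7)
      n, dt, k, target_T = 64, 1e-3, 1.0, 1.0
      x = rng.standard_normal((n, 3))
      v = rng.standard_normal((n, 3))

      def force(x):
          return -k * x                       # harmonic wells, purely illustrative

      def kinetic_temperature(v):
          return (v ** 2).sum() / (3 * n)     # units with m = k_B = 1

      f = force(x)
      for step in range(5000):
          # Newtonian particle-motion update (velocity Verlet).
          v += 0.5 * dt * f
          x += dt * v
          f = force(x)
          v += 0.5 * dt * f
          # Infrequent thermostat update (no communication in between).
          if step % 100 == 0:
              v *= np.sqrt(target_T / kinetic_temperature(v))

      print("T =", kinetic_temperature(v))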

  8. Heartbeat-based error diagnosis framework for distributed embedded systems

    NASA Astrophysics Data System (ADS)

    Mishra, Swagat; Khilar, Pabitra Mohan

    2012-01-01

    Distributed embedded systems have significant applications in the automobile industry as steer-by-wire, fly-by-wire and brake-by-wire systems. In this paper, we provide a general framework for fault detection in a distributed embedded real-time system. We use heartbeat monitoring, checkpointing and model-based redundancy to design a scalable framework that takes care of task scheduling, temperature control and diagnosis of faulty nodes in a distributed embedded system. This helps in diagnosing and shutting down faulty actuators before the system becomes unsafe. The framework is designed and tested using a new simulation model consisting of virtual nodes working on a message passing system.
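
    A minimal Python sketch of the heartbeat-monitoring ingredient: worker threads stand in for the virtual nodes, and a monitor flags any node whose most recent heartbeat is older than a timeout. The faulty node is simulated by simply going silent; periods and timeouts are illustrative.

      import threading, time

      last_beat = {}
      lock = threading.Lock()

      def node(node_id, period=0.1, fail_after=None):
          start = time.monotonic()
          while True:
              if fail_after and time.monotonic() - start > fail_after:
                  return                      # silent failure: heartbeats stop
              with lock:
                  last_beat[node_id] = time.monotonic()
              time.sleep(period)

      # Three nodes; the third fails silently after 0.5 s.
      for i, fail in enumerate([None, None, 0.5]):
          threading.Thread(target=node, args=(i, 0.1, fail), daemon=True).start()

      time.sleep(1.0)
      timeout = 0.3
      with lock:
          faulty = [n for n, t in last_beat.items()
                    if time.monotonic() - t > timeout]
      print("suspected faulty nodes:", faulty)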

  9. Heartbeat-based error diagnosis framework for distributed embedded systems

    NASA Astrophysics Data System (ADS)

    Mishra, Swagat; Khilar, Pabitra Mohan

    2011-12-01

    Distributed embedded systems have significant applications in the automobile industry as steer-by-wire, fly-by-wire and brake-by-wire systems. In this paper, we provide a general framework for fault detection in a distributed embedded real-time system. We use heartbeat monitoring, checkpointing and model-based redundancy to design a scalable framework that takes care of task scheduling, temperature control and diagnosis of faulty nodes in a distributed embedded system. This helps in diagnosing and shutting down faulty actuators before the system becomes unsafe. The framework is designed and tested using a new simulation model consisting of virtual nodes working on a message passing system.

  10. GENET note no. 1

    NASA Technical Reports Server (NTRS)

    Yeh, J. W.

    1971-01-01

    The general features of the GENET system for simulating networks are described. A set of features is presented which are desirable for network simulations and which are expected to be achieved by this system. Among these features are: (1) two-level network modeling; and (2) problem-oriented operations. Several typical network systems are modeled in the GENET framework to illustrate several of its features and to show its applicability.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCaskey, Alexander J.

    There is a lack of state-of-the-art HPC simulation tools for simulating general quantum computing. Furthermore, there are no real software tools that integrate current quantum computers into existing classical HPC workflows. This product, the Quantum Virtual Machine (QVM), solves this problem by providing an extensible framework for pluggable virtual, or physical, quantum processing units (QPUs). It enables the execution of low-level quantum assembly codes and returns the results of such executions.

  12. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. In quantifying the uncertainty, it is most important to analyze how uncertainties arise and propagate, and how simulations progress from benchmark models to new models. Based on the practical needs of engineering and the technology of verification & validation, a framework for QU (quantification of uncertainty) is put forward for the case in which simulation of a detonation system is used for scientific prediction. An example is offered to illustrate the general idea of quantifying simulation uncertainties.

  13. 1D-3D hybrid modeling-from multi-compartment models to full resolution models in space and time.

    PubMed

    Grein, Stephan; Stepniewski, Martin; Reiter, Sebastian; Knodel, Markus M; Queisser, Gillian

    2014-01-01

    Investigation of cellular and network dynamics in the brain by means of modeling and simulation has evolved into a highly interdisciplinary field that uses sophisticated modeling and simulation approaches to understand distinct areas of brain function. Depending on the underlying complexity, these models vary in their level of detail in order to cope with the attached computational cost. Hence, for large network simulations, single neurons are typically reduced to time-dependent signal processors, dismissing the spatial aspect of each cell. For single cells or networks with relatively small numbers of neurons, general purpose simulators allow for space- and time-dependent simulations of electrical signal processing, based on cable equation theory. An emerging field in Computational Neuroscience encompasses a new level of detail by incorporating the full three-dimensional morphology of cells and organelles into three-dimensional, space- and time-dependent, simulations. While every approach has its advantages and limitations, such as computational cost, integrated and methods-spanning simulation approaches, depending on the network size, could establish new ways to investigate the brain. In this paper we present a hybrid simulation approach that makes use of reduced 1D-models, using e.g. the NEURON simulator, which couples to fully resolved models for simulating cellular and sub-cellular dynamics, including the detailed three-dimensional morphology of neurons and organelles. In order to couple 1D- and 3D-simulations, we present a geometry-, membrane potential- and intracellular concentration mapping framework, with which graph-based morphologies, e.g. in the swc- or hoc-format, are mapped to full surface and volume representations of the neuron, and computational data from 1D-simulations can be used as boundary conditions for full 3D simulations and vice versa. Thus, established models and data, based on general purpose 1D-simulators, can be directly coupled to the emerging field of fully resolved, highly detailed 3D-modeling approaches. We present the developed general framework for 1D/3D hybrid modeling and apply it to investigate electrically active neurons and their intracellular spatio-temporal calcium dynamics.
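
    The mapping step can be illustrated with a toy version in Python: potentials computed on a 1D graph morphology (swc-like points with 3-D coordinates) are transferred to the vertices of a 3-D surface mesh by nearest-node lookup, ready to serve as boundary conditions for a 3-D solver. The geometry and potential profile below are synthetic, and the nearest-node rule is a simplification of the paper's mapping framework.

      import numpy as np
      from scipy.spatial import cKDTree

      # 1D morphology: points along a straight "dendrite" with a potential profile.
      nodes_1d = np.column_stack([np.linspace(0, 100, 51),
                                  np.zeros(51), np.zeros(51)])    # swc-like x, y, z
      v_1d = -65.0 + 30.0 * np.exp(-((nodes_1d[:, 0] - 40) / 10) ** 2)

      # 3D surface mesh vertices: a tube of radius 1 around the dendrite.
      theta = np.linspace(0, 2 * np.pi, 16, endpoint=False)
      surf = np.array([[xx, np.cos(t), np.sin(t)]
                       for xx in np.linspace(0, 100, 51) for t in theta])

      # Map 1D values onto the surface by nearest 1D node.
      _, idx = cKDTree(nodes_1d).query(surf)
      v_surface = v_1d[idx]       # boundary condition for the 3-D simulation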

  14. Development of an integrated economic and ecological framework for ecosystem-based fisheries management in New England

    NASA Astrophysics Data System (ADS)

    Jin, D.; Hoagland, P.; Dalton, T. M.; Thunberg, E. M.

    2012-09-01

    We present an integrated economic-ecological framework designed to help assess the implementation of ecosystem-based fisheries management (EBFM) in New England. We develop the framework by linking a computable general equilibrium (CGE) model of a coastal economy to an end-to-end (E2E) model of a marine food web for Georges Bank. We focus on the New England region using coastal county economic data for a restricted set of industry sectors and marine ecological data for three top level trophic feeding guilds: planktivores, benthivores, and piscivores. We undertake numerical simulations to model the welfare effects of changes in alternative combinations of yields from feeding guilds and alternative manifestations of biological productivity. We estimate the economic and distributional effects of these alternative simulations across a range of consumer income levels. This framework could be used to extend existing methodologies for assessing the impacts on human communities of groundfish stock rebuilding strategies, such as those expected through the implementation of the sector management program in the US northeast fishery. We discuss other possible applications of and modifications and limitations to the framework.

  15. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    NASA Astrophysics Data System (ADS)

    Han, Xue; Sandels, Claes; Zhu, Kun; Nordström, Lars

    2013-08-01

    A large body of work claims that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent power system operation will be changed by various parameters of DERs. This article proposes a modelling framework for an overview analysis of the correlation between DER deployment and distribution network operation. Furthermore, to validate the framework, the authors describe reference models for different categories of DERs with their unique characteristics, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation results show that, in general, DER deployment brings in possibilities to reduce power losses and voltage drops by compensating power from local generation and optimizing local load profiles.

  16. A constitutive model for magnetostriction based on thermodynamic framework

    NASA Astrophysics Data System (ADS)

    Ho, Kwangsoo

    2016-08-01

    This work presents a general framework for the continuum-based formulation of dissipative materials with magneto-mechanical coupling from the viewpoint of irreversible thermodynamics. The thermodynamically consistent model developed for magnetic hysteresis is extended to include the magnetostrictive effect. The dissipative and hysteretic response of magnetostrictive materials is captured through the introduction of internal state variables. The evolution rate of magnetostrictive strain, as well as magnetization, is derived from thermodynamic and dissipative potentials in accordance with the general principles of thermodynamics. It is then demonstrated that the constitutive model can describe the magneto-mechanical behavior by comparing simulation results with experimental data reported in the literature.

  17. 1D-3D hybrid modeling—from multi-compartment models to full resolution models in space and time

    PubMed Central

    Grein, Stephan; Stepniewski, Martin; Reiter, Sebastian; Knodel, Markus M.; Queisser, Gillian

    2014-01-01

    Investigation of cellular and network dynamics in the brain by means of modeling and simulation has evolved into a highly interdisciplinary field that uses sophisticated modeling and simulation approaches to understand distinct areas of brain function. Depending on the underlying complexity, these models vary in their level of detail in order to cope with the attached computational cost. Hence, for large network simulations, single neurons are typically reduced to time-dependent signal processors, dismissing the spatial aspect of each cell. For single cells or networks with relatively small numbers of neurons, general purpose simulators allow for space- and time-dependent simulations of electrical signal processing, based on cable equation theory. An emerging field in Computational Neuroscience encompasses a new level of detail by incorporating the full three-dimensional morphology of cells and organelles into three-dimensional, space- and time-dependent, simulations. While every approach has its advantages and limitations, such as computational cost, integrated and methods-spanning simulation approaches, depending on the network size, could establish new ways to investigate the brain. In this paper we present a hybrid simulation approach that makes use of reduced 1D-models, using e.g. the NEURON simulator, which couples to fully resolved models for simulating cellular and sub-cellular dynamics, including the detailed three-dimensional morphology of neurons and organelles. In order to couple 1D- and 3D-simulations, we present a geometry-, membrane potential- and intracellular concentration mapping framework, with which graph-based morphologies, e.g. in the swc- or hoc-format, are mapped to full surface and volume representations of the neuron, and computational data from 1D-simulations can be used as boundary conditions for full 3D simulations and vice versa. Thus, established models and data, based on general purpose 1D-simulators, can be directly coupled to the emerging field of fully resolved, highly detailed 3D-modeling approaches. We present the developed general framework for 1D/3D hybrid modeling and apply it to investigate electrically active neurons and their intracellular spatio-temporal calcium dynamics. PMID:25120463

  18. Coupled Thermo-Hydro-Mechanical Numerical Framework for Simulating Unconventional Formations

    NASA Astrophysics Data System (ADS)

    Garipov, T. T.; White, J. A.; Lapene, A.; Tchelepi, H.

    2016-12-01

    Unconventional deposits are found in all world oil provinces. Modeling these systems is challenging, however, due to the complex thermo-hydro-mechanical processes that govern their behavior. As a motivating example, we consider in situ thermal processing of oil shale deposits. When oil shale is heated to sufficient temperatures, kerogen can be converted to oil and gas products over a relatively short timespan. This phase change dramatically impacts both the mechanical and hydrologic properties of the rock, leading to strongly coupled THMC interactions. Here, we present a numerical framework for simulating tightly-coupled chemistry, geomechanics, and multiphase flow within a reservoir simulator (the AD-GPRS General Purpose Research Simulator). We model changes in the constitutive behavior of the rock using a thermoplasticity model that accounts for microstructural evolution. The multi-component, multiphase flow and transport processes of both mass and heat are modeled at the macroscopic (e.g., Darcy) scale. The phase compositions and properties are described by a cubic equation of state; Arrhenius-type chemical reactions are used to represent kerogen conversion. The system of partial differential equations is discretized using a combination of finite volumes and finite elements, respectively, for the flow and mechanics problems. Fully implicit and sequentially implicit methods are used to solve the resulting nonlinear problem. The proposed framework is verified against available analytical and numerical benchmark cases. We demonstrate the efficiency, performance, and capabilities of the proposed simulation framework by analyzing near-well deformation in an oil shale formation.
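
    The kerogen-conversion kinetics mentioned above follow the standard Arrhenius form, with pre-exponential factor A, activation energy E_a, and gas constant R (the paper's specific parameter values are not reproduced here):

      k(T) = A \exp\!\left(-\frac{E_a}{R\,T}\right)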

  19. A Simulation Study of Acoustic-Assisted Tracking of Whales for Mark-Recapture Surveys

    PubMed Central

    Peel, David; Miller, Brian S.; Kelly, Natalie; Dawson, Steve; Slooten, Elisabeth; Double, Michael C.

    2014-01-01

    Collecting enough data to obtain reasonable abundance estimates of whales is often difficult, particularly when studying rare species. Passive acoustics can be used to detect whale sounds and are increasingly used to estimate whale abundance. Much of the existing effort centres on the use of acoustics to estimate abundance directly, e.g. analysing detections in a distance sampling framework. Here, we focus on acoustics as a tool incorporated within mark-recapture surveys. In this context, acoustic tools are used to detect and track whales, which are then photographed or biopsied to provide data for mark-recapture analyses. The purpose of incorporating acoustics is to increase the encounter rate beyond using visual searching only. While this general approach is not new, its utility is rarely quantified. This paper predicts the “acoustically-assisted” encounter rate using a discrete-time individual-based simulation of whales and a survey vessel. We validate the simulation framework using existing data from studies of sperm whales. We then use the framework to predict potential encounter rates in a study of Antarctic blue whales. We also investigate the effects of a number of the key parameters on encounter rate. Mean encounter rates from the simulation of sperm whales matched well with empirical data. Variance of encounter rate, however, was underestimated. The simulation of Antarctic blue whales found that passive acoustics should provide a 1.7–3.0 fold increase in encounter rate over visual-only methods. Encounter rate was most sensitive to acoustic detection range, followed by vocalisation rate. During survey planning and design, some indication of the relationship between expected sample size and effort is paramount; this simulation framework can be used to predict encounter rates and establish this relationship. For a case in point, the simulation framework indicates unequivocally that real-time acoustic tracking should be considered for quantifying the abundance of Antarctic blue whales via mark-recapture methods. PMID:24827919
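
    The discrete-time individual-based setup is simple enough to sketch in a few lines of Python: whales move at random in a square region, a vessel runs a transect, and an encounter is scored whenever a whale falls within the detection range. Comparing a short (visual) range against a long (acoustic) range reproduces the qualitative encounter-rate gain; every number below is illustrative, not the paper's.

      import numpy as np

      rng = np.random.default_rng(11)

      def encounters(detect_range, n_whales=50, steps=2000, box=100.0):
          whales = rng.uniform(0, box, (n_whales, 2))
          vessel = np.array([0.0, box / 2])
          count = 0
          for _ in range(steps):
              whales += rng.normal(0, 0.05, whales.shape)   # random whale movement
              vessel[0] = (vessel[0] + 0.1) % box           # vessel transect
              d = np.linalg.norm(whales - vessel, axis=1)
              hit = d < detect_range
              count += int(hit.sum())
              whales[hit] = rng.uniform(0, box, (hit.sum(), 2))  # replace encountered whales
          return count

      print("visual only :", encounters(detect_range=2.0))
      print("acoustic aid:", encounters(detect_range=8.0))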

  20. A simulation study of acoustic-assisted tracking of whales for mark-recapture surveys.

    PubMed

    Peel, David; Miller, Brian S; Kelly, Natalie; Dawson, Steve; Slooten, Elisabeth; Double, Michael C

    2014-01-01

    Collecting enough data to obtain reasonable abundance estimates of whales is often difficult, particularly when studying rare species. Passive acoustics can be used to detect whale sounds and are increasingly used to estimate whale abundance. Much of the existing effort centres on the use of acoustics to estimate abundance directly, e.g. analysing detections in a distance sampling framework. Here, we focus on acoustics as a tool incorporated within mark-recapture surveys. In this context, acoustic tools are used to detect and track whales, which are then photographed or biopsied to provide data for mark-recapture analyses. The purpose of incorporating acoustics is to increase the encounter rate beyond using visual searching only. While this general approach is not new, its utility is rarely quantified. This paper predicts the "acoustically-assisted" encounter rate using a discrete-time individual-based simulation of whales and a survey vessel. We validate the simulation framework using existing data from studies of sperm whales. We then use the framework to predict potential encounter rates in a study of Antarctic blue whales. We also investigate the effects of a number of the key parameters on encounter rate. Mean encounter rates from the simulation of sperm whales matched well with empirical data. Variance of encounter rate, however, was underestimated. The simulation of Antarctic blue whales found that passive acoustics should provide a 1.7-3.0 fold increase in encounter rate over visual-only methods. Encounter rate was most sensitive to acoustic detection range, followed by vocalisation rate. During survey planning and design, some indication of the relationship between expected sample size and effort is paramount; this simulation framework can be used to predict encounter rates and establish this relationship. For a case in point, the simulation framework indicates unequivocally that real-time acoustic tracking should be considered for quantifying the abundance of Antarctic blue whales via mark-recapture methods.

  1. A Hybrid Multiscale Framework for Subsurface Flow and Transport Simulations

    DOE PAGES

    Scheibe, Timothy D.; Yang, Xiaofan; Chen, Xingyuan; ...

    2015-06-01

    Extensive research efforts have been invested in reducing model errors to improve the predictive ability of biogeochemical earth and environmental system simulators, with applications ranging from contaminant transport and remediation to impacts of biogeochemical elemental cycling (e.g., carbon and nitrogen) on local ecosystems and regional to global climate. While the bulk of this research has focused on improving model parameterizations in the face of observational limitations, the more challenging type of model error/uncertainty to identify and quantify is model structural error which arises from incorrect mathematical representations of (or failure to consider) important physical, chemical, or biological processes, properties, or system states in model formulations. While improved process understanding can be achieved through scientific study, such understanding is usually developed at small scales. Process-based numerical models are typically designed for a particular characteristic length and time scale. For application-relevant scales, it is generally necessary to introduce approximations and empirical parameterizations to describe complex systems or processes. This single-scale approach has been the best available to date because of limited understanding of process coupling combined with practical limitations on system characterization and computation. While computational power is increasing significantly and our understanding of biological and environmental processes at fundamental scales is accelerating, using this information to advance our knowledge of the larger system behavior requires the development of multiscale simulators. Accordingly there has been much recent interest in novel multiscale methods in which microscale and macroscale models are explicitly coupled in a single hybrid multiscale simulation. A limited number of hybrid multiscale simulations have been developed for biogeochemical earth systems, but they mostly utilize application-specific and sometimes ad-hoc approaches for model coupling. We are developing a generalized approach to hierarchical model coupling designed for high-performance computational systems, based on the Swift computing workflow framework. In this presentation we will describe the generalized approach and provide two use cases: 1) simulation of a mixing-controlled biogeochemical reaction coupling pore- and continuum-scale models, and 2) simulation of biogeochemical impacts of groundwater – river water interactions coupling fine- and coarse-grid model representations. This generalized framework can be customized for use with any pair of linked models (microscale and macroscale) with minimal intrusiveness to the at-scale simulators. It combines a set of python scripts with the Swift workflow environment to execute a complex multiscale simulation utilizing an approach similar to the well-known Heterogeneous Multiscale Method. User customization is facilitated through user-provided input and output file templates and processing function scripts, and execution within a high-performance computing environment is handled by Swift, such that minimal to no user modification of at-scale codes is required.
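
    The coupling pattern, in the spirit of the Heterogeneous Multiscale Method mentioned above, is a macroscale time step that calls a microscale model wherever a closure term is needed. The Python sketch below uses a 1-D advection-reaction toy with an analytic stand-in for the pore-scale solve; it illustrates the control flow only, not the Swift-based workflow itself, and both models are illustrative.

      import numpy as np

      def micro_model(c):
          """Stand-in for a pore-scale solve returning an effective reaction
          rate for macroscale concentration c (e.g., mixing-limited reaction)."""
          return 0.5 * c / (1.0 + c)

      def macro_step(c, dt=0.01, u=1.0, dx=0.1):
          """Upwind advection plus the micro-informed reaction sink."""
          rates = np.array([micro_model(ci) for ci in c])   # one micro call per cell
          adv = -u * (c - np.roll(c, 1)) / dx               # periodic upwind advection
          return c + dt * (adv - rates)

      c = np.exp(-((np.arange(100) * 0.1 - 2.0) ** 2))      # initial pulse
      for _ in range(200):
          c = macro_step(c)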

  2. Parallelizing Timed Petri Net simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1993-01-01

    The possibility of using parallel processing to accelerate the simulation of Timed Petri Nets (TPN's) was studied. It was recognized that complex system development tools often transform system descriptions into TPN's or TPN-like models, which are then simulated to obtain information about system behavior. Viewed this way, it was important that the parallelization of TPN's be as automatic as possible, to admit the possibility of the parallelization being embedded in the system design tool. Later years of the grant were devoted to examining the problem of joint performance and reliability analysis, to explore whether both types of analysis could be accomplished within a single framework. In this final report, the results of our studies are summarized. We believe that the problem of parallelizing TPN's automatically for MIMD architectures has been almost completely solved for a large and important class of problems. Our initial investigations into joint performance/reliability analysis were twofold: it was shown that Monte Carlo simulation, with importance sampling, offers promise of joint analysis in the context of a single tool, and methods for the parallel simulation of general Continuous Time Markov Chains, a model framework within which joint performance/reliability models can be cast, were developed. However, very much more work is needed to determine the scope and generality of these approaches. The results obtained in our two studies, future directions for this type of work, and a list of publications are included.

  3. Scalable High-order Methods for Multi-Scale Problems: Analysis, Algorithms and Application

    DTIC Science & Technology

    2016-02-26

    Publications include: 1. ... Karniadakis, “Resilient algorithms for reconstructing and simulating gappy flow fields in CFD”, Fluid Dynamics Research, vol. 47, 051402, 2015. 2. Y. Yu, H. ... Subject terms: simulation, domain decomposition, CFD, gappy data, estimation theory, and gap-tooth algorithm. The objective of this project was to develop a general CFD framework for multifidelity simulations to target multiscale problems but also resilience in ...

  4. A Fuzzy Logic Optimal Control Law Solution to the CMMCA Tracking Problem

    DTIC Science & Technology

    1993-03-01

    ... or from a transfer function. Many times, however, the resulting algorithms are so complex as to be completely or essentially useless. Applications ... implemented in a nearly real-time computer simulation. Located within the LQ framework are all the performance data for both the CMMCA and the CX ... required nor desired. A more general and less exacting framework was used. In order to concentrate on the theory and problem solution, it was ...

  5. A Petascale Non-Hydrostatic Atmospheric Dynamical Core in the HOMME Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tufo, Henry

    The High-Order Method Modeling Environment (HOMME) is a framework for building scalable, conservative atmospheric models for climate simulation and general atmospheric-modeling applications. Its spatial discretizations are based on Spectral-Element (SE) and Discontinuous Galerkin (DG) methods. These are local methods employing high-order accurate spectral basis-functions that have been shown to perform well on massively parallel supercomputers at any resolution and scale particularly well at high resolutions. HOMME provides the framework upon which the CAM-SE community atmosphere model dynamical-core is constructed. In its current incarnation, CAM-SE employs the hydrostatic primitive-equations (PE) of motion, which limits its resolution to simulations coarser than 0.1° per grid cell. The primary objective of this project is to remove this resolution limitation by providing HOMME with the capabilities needed to build nonhydrostatic models that solve the compressible Euler/Navier-Stokes equations.

  6. A Framework for Optimal Control Allocation with Structural Load Constraints

    NASA Technical Reports Server (NTRS)

    Frost, Susan A.; Taylor, Brian R.; Jutte, Christine V.; Burken, John J.; Trinh, Khanh V.; Bodson, Marc

    2010-01-01

    Conventional aircraft generally employ mixing algorithms or lookup tables to determine control surface deflections needed to achieve moments commanded by the flight control system. Control allocation is the problem of converting desired moments into control effector commands. Next generation aircraft may have many multipurpose, redundant control surfaces, adding considerable complexity to the control allocation problem. These issues can be addressed with optimal control allocation. Most optimal control allocation algorithms have control surface position and rate constraints. However, these constraints are insufficient to ensure that the aircraft's structural load limits will not be exceeded by commanded surface deflections. In this paper, a framework is proposed to enable a flight control system with optimal control allocation to incorporate real-time structural load feedback and structural load constraints. A proof of concept simulation that demonstrates the framework in a simulation of a generic transport aircraft is presented.
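
    Optimal control allocation is typically posed as a bounded least-squares problem: find deflections u minimizing ||Bu - m_des|| subject to position limits. The Python sketch below folds a structural-load term in as a weighted penalty row, which is one simple way (not necessarily the paper's) to bias the allocator toward low-load solutions; the effectiveness matrix and all numbers are made up for illustration.

      import numpy as np
      from scipy.optimize import lsq_linear

      B = np.array([[1.0, 0.8, -0.8, 0.0],      # roll effectiveness of 4 surfaces
                    [0.2, 0.5,  0.5, 1.0],      # pitch effectiveness
                    [0.0, 0.1, -0.1, 0.3]])     # yaw effectiveness
      m_des = np.array([0.4, 0.2, 0.05])        # commanded moments

      load_row = np.array([0.6, 0.3, 0.3, 0.1]) # structural load per unit deflection
      load_weight = 2.0                         # trade moment error vs. load

      A = np.vstack([B, load_weight * load_row])
      b = np.concatenate([m_des, [0.0]])        # drive the load term toward zero

      res = lsq_linear(A, b, bounds=(-0.35, 0.35))   # about +/- 20 deg, in radians
      print("deflections:", res.x, "achieved moments:", B @ res.x)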

  7. Continuity-based model interfacing for plant-wide simulation: a general approach.

    PubMed

    Volcke, Eveline I P; van Loosdrecht, Mark C M; Vanrolleghem, Peter A

    2006-08-01

    In plant-wide simulation studies of wastewater treatment facilities, existing models of different origins often need to be coupled. However, as these submodels are likely to contain different state variables, their coupling is not straightforward. The continuity-based interfacing method (CBIM) provides a general framework for constructing model interfaces for models of wastewater systems, taking conservation principles into account. In this contribution, the CBIM approach is applied to study the effect of sludge digestion reject water treatment with a SHARON-Anammox process on a plant-wide scale. Separate models were available for the SHARON process and for the Anammox process. The Benchmark simulation model no. 2 (BSM2) is used to simulate the behaviour of the complete WWTP including sludge digestion. The CBIM approach is followed to develop three different model interfaces. At the same time, the generally applicable CBIM approach was further refined, and particular issues that arise when coupling models in which pH is considered a state variable are pointed out.

  8. A comparison between rate-and-state friction and microphysical models, based on numerical simulations of fault slip

    NASA Astrophysics Data System (ADS)

    van den Ende, M. P. A.; Chen, J.; Ampuero, J.-P.; Niemeijer, A. R.

    2018-05-01

    Rate-and-state friction (RSF) is commonly used for the characterisation of laboratory friction experiments, such as velocity-step tests. However, the RSF framework provides little physical basis for the extrapolation of these results to the scales and conditions of natural fault systems, and so open questions remain regarding the applicability of the experimentally obtained RSF parameters for predicting seismic cycle transients. As an alternative to classical RSF, microphysics-based models offer means for interpreting laboratory and field observations, but are generally over-simplified with respect to heterogeneous natural systems. In order to bridge the temporal and spatial gap between the laboratory and nature, we have implemented existing microphysical model formulations into an earthquake cycle simulator. Through this numerical framework, we make a direct comparison between simulations exhibiting RSF-controlled fault rheology, and simulations in which the fault rheology is dictated by the microphysical model. Even though the input parameters for the RSF simulation are directly derived from the microphysical model, the microphysics-based simulations produce significantly smaller seismic event sizes than the RSF-based simulation, and suggest a more stable fault slip behaviour. Our results reveal fundamental limitations in using classical rate-and-state friction for the extrapolation of laboratory results. The microphysics-based approach offers a more complete framework in this respect, and may be used for a more detailed study of the seismic cycle in relation to material properties and fault zone pressure-temperature conditions.
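
    For orientation, a standard form of the rate-and-state friction law discussed here (with the Dieterich "aging" law for the state variable; a reference sketch, not the paper's specific parameterization) is

    ```latex
    \mu(V,\theta) \;=\; \mu_0 + a\,\ln\!\frac{V}{V_0} + b\,\ln\!\frac{V_0\,\theta}{D_c},
    \qquad
    \frac{d\theta}{dt} \;=\; 1 - \frac{V\theta}{D_c},
    ```

    where $a$ and $b$ are the direct- and evolution-effect parameters and $D_c$ is the critical slip distance; the sign of $a-b$ determines whether steady sliding is velocity-strengthening (stable) or velocity-weakening (potentially unstable).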

  9. Multivariate cross-frequency coupling via generalized eigendecomposition

    PubMed Central

    Cohen, Michael X

    2017-01-01

    This paper presents a new framework for analyzing cross-frequency coupling in multichannel electrophysiological recordings. The generalized eigendecomposition-based cross-frequency coupling framework (gedCFC) is inspired by source-separation algorithms combined with dynamics of mesoscopic neurophysiological processes. It is unaffected by factors that confound traditional CFC methods—such as non-stationarities, non-sinusoidality, and non-uniform phase angle distributions—attractive properties considering that brain activity is neither stationary nor perfectly sinusoidal. The gedCFC framework opens new opportunities for conceptualizing CFC as network interactions with diverse spatial/topographical distributions. Five specific methods within the gedCFC framework are detailed; these are validated in simulated data and applied to several empirical datasets. gedCFC accurately recovers physiologically plausible CFC patterns embedded in noise that causes traditional CFC methods to perform poorly. The paper also demonstrates that spike-field coherence in multichannel local field potential data can be analyzed using the gedCFC framework, which provides significant advantages over traditional spike-field coherence analyses. Null-hypothesis testing is also discussed. DOI: http://dx.doi.org/10.7554/eLife.21792.001 PMID:28117662
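
    The linear-algebra core of this approach is a generalized eigendecomposition of two channel covariance matrices; a minimal sketch on synthetic data (not the published gedCFC code) might look like:

    ```python
    # Minimal generalized eigendecomposition (GED) for source separation:
    # find spatial filters w maximizing w'Sw / w'Rw for a "signal" covariance S
    # and a "reference" covariance R. All data below are synthetic.
    import numpy as np
    from scipy.linalg import eigh

    rng = np.random.default_rng(0)
    n_chan, n_time = 16, 5000
    noise = rng.standard_normal((n_chan, n_time))
    pattern = rng.standard_normal(n_chan)                          # spatial projection
    source = np.sin(2 * np.pi * 10 * np.arange(n_time) / 1000.0)   # 10 Hz source
    data = noise + 2.0 * np.outer(pattern, source)

    S = np.cov(data)            # covariance containing the embedded source
    R = np.cov(noise)           # reference covariance
    evals, evecs = eigh(S, R)   # solves S w = lambda R w (eigenvalues ascending)
    w = evecs[:, -1]            # filter with the largest generalized eigenvalue
    component = w @ data        # estimated source time course
    ```

    In the actual gedCFC setting, S and R would be built from appropriately filtered or peri-event data segments rather than raw noise, but the eigendecomposition step is the same.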

  10. Automating Embedded Analysis Capabilities and Managing Software Complexity in Multiphysics Simulation, Part I: Template-Based Generic Programming

    DOE PAGES

    Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.

    2012-01-01

    An approach for incorporating embedded simulation and analysis capabilities in complex simulation codes through template-based generic programming is presented. This approach relies on templating and operator overloading within the C++ language to transform a given calculation into one that can compute a variety of additional quantities that are necessary for many state-of-the-art simulation and analysis algorithms. An approach for incorporating these ideas into complex simulation codes through general graph-based assembly is also presented. These ideas have been implemented within a set of packages in the Trilinos framework and are demonstrated on a simple problem from chemical engineering.
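
    The core idea translates directly across languages: a scalar type that overloads arithmetic operators can propagate derivatives through unchanged calculation code. A rough Python analogue (not the Trilinos/Sacado implementation; `Dual` and `residual` are invented for illustration):

    ```python
    # Operator overloading turns an existing calculation into one that also
    # computes derivatives (forward-mode AD via dual numbers).
    class Dual:
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der
        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.der + o.der)
        __radd__ = __add__
        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
        __rmul__ = __mul__

    def residual(x):
        # generic "simulation" kernel, written once, evaluated with any scalar type
        return 3.0 * x * x + 2.0 * x + 1.0

    x = Dual(2.0, 1.0)       # seed dx/dx = 1
    r = residual(x)
    print(r.val, r.der)      # value 17.0 and derivative dr/dx = 6x + 2 = 14.0
    ```

    In the C++ setting described in the paper, templating plays the role of Python's dynamic typing: the kernel is written once against a generic scalar type and instantiated with value, derivative, or other augmented types as needed.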

  11. OpenDanubia - An integrated, modular simulation system to support regional water resource management

    NASA Astrophysics Data System (ADS)

    Muerth, M.; Waldmann, D.; Heinzeller, C.; Hennicker, R.; Mauser, W.

    2012-04-01

    The already completed, multi-disciplinary research project GLOWA-Danube has developed a regional-scale, integrated modeling system, which was successfully applied to the 77,000 km² Upper Danube basin to investigate the impact of Global Change on both the natural and anthropogenic water cycle. At the end of the last project phase, the integrated modeling system was transferred into the open source project OpenDanubia, which now provides both the core system and all major model components to the general public. First, this will enable decision makers from government, business and management to use OpenDanubia as a tool for proactive management of water resources in the context of global change. Second, the model framework to support integrated simulations and all simulation models developed for OpenDanubia in the scope of GLOWA-Danube remain available for future developments and research questions. OpenDanubia allows for the investigation of water-related scenarios considering different ecological and economic aspects to support both scientists and policy makers in designing policies for sustainable environmental management. OpenDanubia is designed as a framework-based, distributed system. The model system couples spatially distributed physical and socio-economic processes at run-time, taking into account their mutual influence. To simulate the potential future impacts of Global Change on agriculture, industrial production, water supply, households and tourism businesses, so-called deep actor models are implemented in OpenDanubia. All important water-related fluxes and storages in the natural environment are implemented in OpenDanubia as spatially explicit, process-based modules. This includes the land surface water and energy balance, dynamic plant water uptake, groundwater recharge and flow, as well as river routing and reservoirs. Although the complete system is relatively demanding in terms of data and hardware requirements, the modular structure and the generic core system (Core Framework, Actor Framework) allow application in new regions and the selection of a reduced number of modules for simulation. As part of the Open Source Initiative in GLOWA-Danube (opendanubia.glowa-danube.de), comprehensive documentation for the system installation was created, and the program code of both the framework and all major components is licensed under the GNU General Public License. In addition, some helpful programs and scripts necessary for the operation and processing of input and result data sets are provided.

  12. Quasi-classical approaches to vibronic spectra revisited

    NASA Astrophysics Data System (ADS)

    Karsten, Sven; Ivanov, Sergei D.; Bokarev, Sergey I.; Kühn, Oliver

    2018-03-01

    The framework to approach quasi-classical dynamics in the electronic ground state is well established and is based on the Kubo-transformed time correlation function (TCF), being the most classical-like quantum TCF. Here we discuss whether the choice of the Kubo-transformed TCF as a starting point for simulating vibronic spectra is as unambiguous as it is for vibrational ones. Employing imaginary-time path integral techniques in combination with the interaction representation allowed us to formulate a method for simulating vibronic spectra in the adiabatic regime that takes nuclear quantum effects and dynamics on multiple potential energy surfaces into account. Further, a generalized quantum TCF is proposed that contains many well-established TCFs, including the Kubo one, as particular cases. Importantly, it also provides a framework to construct new quantum TCFs. Applying the developed methodology to the generalized TCF leads to a plethora of simulation protocols, which are based on the well-known TCFs as well as on new ones. Their performance is investigated on 1D anharmonic model systems at finite temperatures. It is shown that the protocols based on the new TCFs may lead to superior results with respect to those based on the common ones. The strategies to find the optimal approach are discussed.
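
    For reference, the Kubo-transformed TCF of operators $\hat A$ and $\hat B$ takes the standard form

    ```latex
    C^{\mathrm{Kubo}}_{AB}(t)
    \;=\;
    \frac{1}{\beta Z}\int_0^{\beta}\! d\lambda\;
    \mathrm{Tr}\!\left[e^{-(\beta-\lambda)\hat H}\,\hat A\,
    e^{-\lambda \hat H}\,\hat B(t)\right],
    \qquad
    \hat B(t) = e^{i\hat H t/\hbar}\,\hat B\,e^{-i\hat H t/\hbar},
    ```

    with $Z=\mathrm{Tr}\,e^{-\beta\hat H}$ and $\beta=1/k_{\mathrm B}T$. The $\lambda$-integration is what makes it the "most classical-like" quantum TCF: like its classical counterpart, it is real and even in time for Hermitian operators.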

  13. Unified theory for stochastic modelling of hydroclimatic processes: Preserving marginal distributions, correlation structures, and intermittency

    NASA Astrophysics Data System (ADS)

    Papalexiou, Simon Michael

    2018-05-01

    Hydroclimatic processes come in all "shapes and sizes". They are characterized by different spatiotemporal correlation structures and probability distributions that can be continuous, mixed-type, discrete or even binary. Simulating such processes by reproducing precisely their marginal distribution and linear correlation structure, including features like intermittency, can greatly improve hydrological analysis and design. Traditionally, modelling schemes are case specific and typically attempt to preserve only a few statistical moments, providing inadequate and potentially risky distribution approximations. Here, a single framework is proposed that unifies, extends, and improves a general-purpose modelling strategy, based on the assumption that any process can emerge by transforming a specific "parent" Gaussian process. A novel mathematical representation of this scheme, introducing parametric correlation transformation functions, enables straightforward estimation of the parent-Gaussian process yielding the target process after the marginal back-transformation, while it provides a general description that supersedes previous specific parameterizations, offering a simple, fast and efficient simulation procedure for every stationary process at any spatiotemporal scale. This framework, also applicable to cyclostationary and multivariate modelling, is augmented with flexible parametric correlation structures that parsimoniously describe observed correlations. Real-world simulations of various hydroclimatic processes with different correlation structures and marginals, such as precipitation, river discharge, wind speed, humidity, extreme events per year, etc., as well as a multivariate example, highlight the flexibility, advantages, and complete generality of the method.
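
    A minimal sketch of the parent-Gaussian strategy (illustrative parameters; the paper's parametric correlation transformation functions, which correct the correlation attenuation noted in the last line, are not reproduced here):

    ```python
    # Simulate a correlated standard-Gaussian "parent" process, then map it
    # through the target inverse CDF to obtain the desired marginal.
    import numpy as np
    from scipy.stats import norm, gamma

    rng = np.random.default_rng(42)
    n, rho = 10_000, 0.7                  # length and lag-1 autocorrelation of the parent
    z = np.empty(n)
    z[0] = rng.standard_normal()
    for t in range(1, n):                 # AR(1) parent Gaussian process
        z[t] = rho * z[t - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()

    u = norm.cdf(z)                       # probability integral transform
    x = gamma(a=0.6, scale=2.0).ppf(u)    # target skewed marginal (rainfall-like)
    # marginal is exact; the lag-1 correlation of x is attenuated relative to rho,
    # which is precisely what the correlation transformation functions compensate
    print(x.mean(), np.corrcoef(x[:-1], x[1:])[0, 1])
    ```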

  14. Consistent forcing scheme in the cascaded lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Fei, Linlin; Luo, Kai Hong

    2017-11-01

    In this paper, we give an alternative derivation for the cascaded lattice Boltzmann method (CLBM) within a general multiple-relaxation-time (MRT) framework by introducing a shift matrix. When the shift matrix is a unit matrix, the CLBM degrades into an MRT LBM. Based on this, a consistent forcing scheme is developed for the CLBM. The consistency of the nonslip rule, the second-order convergence rate in space, and the property of isotropy for the consistent forcing scheme are demonstrated through numerical simulations of several canonical problems. Several existing forcing schemes previously used in the CLBM are also examined. The study clarifies the relation between MRT LBM and CLBM under a general framework.

  15. Assessing the Benefits and Costs of Motion for C-17 Flight Simulators: Technical Appendixes.

    DTIC Science & Technology

    1986-06-01

    Only fragments of this scanned report's abstract are recoverable; they indicate that the report addresses the fidelity of different simulator motion cueing alternatives and a suggested methodology for assessing and evaluating the benefits and costs of incorporating motion systems in C-17 transport aircraft flight simulators, within a general framework.

  16. UAV Swarm Tactics: An Agent-Based Simulation and Markov Process Analysis

    DTIC Science & Technology

    2013-06-01

    Only fragments of this scanned thesis's abstract are recoverable; they indicate an agent-based simulation built on the Java-based Mason (Multi-Agent Simulator Of Networks) framework, a design-of-experiments study in which each factor set is run several times with results written to CSV files, and agent behavior that is stochastic rather than deterministically scripted.

  17. A framework for the direct evaluation of large deviations in non-Markovian processes

    NASA Astrophysics Data System (ADS)

    Cavallaro, Massimo; Harris, Rosemary J.

    2016-11-01

    We propose a general framework to simulate stochastic trajectories with arbitrarily long memory dependence and efficiently evaluate large deviation functions associated with time-extensive observables. This extends the ‘cloning’ procedure of Giardinà et al (2006 Phys. Rev. Lett. 96 120603) to non-Markovian systems. We demonstrate the validity of this method by testing non-Markovian variants of an ion-channel model and the totally asymmetric exclusion process, recovering results obtainable by other means.

  18. Generalized interactions using virtual tools within the spring framework: cutting

    NASA Technical Reports Server (NTRS)

    Montgomery, Kevin; Bruyns, Cynthia D.

    2002-01-01

    We present schemes for real-time generalized mesh cutting. Starting with a basic example, we describe the details of implementing cutting on single and multiple surface objects as well as hybrid and volumetric meshes, using virtual tools with single and multiple cutting surfaces. These methods have been implemented in a robust surgical simulation environment, allowing us to model procedures ranging from animal dissection to cleft lip correction.

  19. DISPATCH: a numerical simulation framework for the exa-scale era - I. Fundamentals

    NASA Astrophysics Data System (ADS)

    Nordlund, Åke; Ramsey, Jon P.; Popovas, Andrius; Küffmeier, Michael

    2018-06-01

    We introduce a high-performance simulation framework that permits the semi-independent, task-based solution of sets of partial differential equations, typically manifesting as updates to a collection of `patches' in space-time. A hybrid MPI/OpenMP execution model is adopted, where work tasks are controlled by a rank-local `dispatcher' which selects, from a set of tasks generally much larger than the number of physical cores (or hardware threads), tasks that are ready for updating. The definition of a task can vary, for example, with some solving the equations of ideal magnetohydrodynamics (MHD), others non-ideal MHD, radiative transfer, or particle motion, and yet others applying particle-in-cell (PIC) methods. Tasks do not have to be grid based, while tasks that are, may use either Cartesian or orthogonal curvilinear meshes. Patches may be stationary or moving. Mesh refinement can be static or dynamic. A feature of decisive importance for the overall performance of the framework is that time-steps are determined and applied locally; this allows potentially large reductions in the total number of updates required in cases when the signal speed varies greatly across the computational domain, and therefore a corresponding reduction in computing time. Another feature is a load balancing algorithm that operates `locally' and aims to simultaneously minimize load and communication imbalance. The framework generally relies on already existing solvers, whose performance is augmented when run under the framework, due to more efficient cache usage, vectorization, local time-stepping, plus near-linear and, in principle, unlimited OpenMP and MPI scaling.
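
    As a toy illustration of local time-stepping under a dispatcher (invented names; not DISPATCH's actual API), the sketch below always advances the task that is furthest behind, subject to its neighbours having reached its local time:

    ```python
    # Rank-local dispatcher sketch: pop the task with the smallest local time;
    # it may update only if every neighbour can supply guard-zone data at that time.
    import heapq

    class Task:
        def __init__(self, name, dt):
            self.name, self.dt, self.t, self.neighbours = name, dt, 0.0, []
        def ready(self):
            # a patch may step from t to t + dt only if all neighbours have reached t
            return all(nb.t >= self.t for nb in self.neighbours)
        def update(self):
            self.t += self.dt            # stand-in for an MHD/RT/PIC solver step

    a, b = Task("coarse", 0.10), Task("fine", 0.02)
    a.neighbours, b.neighbours = [b], [a]
    queue, t_end, k = [(a.t, 0, a), (b.t, 1, b)], 1.0, 2
    while queue:
        t, _, task = heapq.heappop(queue)    # least-advanced task first
        if task.ready():
            task.update()
        if task.t < t_end:
            heapq.heappush(queue, (task.t, k, task)); k += 1
    print(a.t, b.t)
    ```

    The coarse patch here is updated a fifth as often as the fine one, which is the source of the update-count savings the text describes when signal speeds vary across the domain.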

  20. Accurate and general treatment of electrostatic interaction in Hamiltonian adaptive resolution simulations

    NASA Astrophysics Data System (ADS)

    Heidari, M.; Cortes-Huerto, R.; Donadio, D.; Potestio, R.

    2016-10-01

    In adaptive resolution simulations the same system is concurrently modeled with different resolution in different subdomains of the simulation box, thereby enabling an accurate description in a small but relevant region, while the rest is treated with a computationally parsimonious model. In this framework, electrostatic interaction, whose accurate treatment is a crucial aspect in the realistic modeling of soft matter and biological systems, represents a particularly acute problem due to the intrinsic long-range nature of the Coulomb potential. In the present work we propose and validate the usage of a short-range modification of the Coulomb potential, the damped shifted force (DSF) model, in the context of the Hamiltonian adaptive resolution simulation (H-AdResS) scheme. This approach, which is here validated on bulk water, ensures a reliable reproduction of the structural and dynamical properties of the liquid, and enables a seamless embedding in the H-AdResS framework. The resulting dual-resolution setup is implemented in the LAMMPS simulation package, and its customized version employed in the present work is made publicly available.
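
    For reference, the DSF pair potential is commonly written (Fennell-Gezelter form, Gaussian units; a reference sketch of the published formula, not taken from this paper) as

    ```latex
    V_{\mathrm{DSF}}(r) \;=\; q_i q_j\!\left[
    \frac{\operatorname{erfc}(\alpha r)}{r}
    - \frac{\operatorname{erfc}(\alpha R_c)}{R_c}
    + \left(\frac{\operatorname{erfc}(\alpha R_c)}{R_c^{2}}
    + \frac{2\alpha}{\sqrt{\pi}}\,\frac{e^{-\alpha^{2}R_c^{2}}}{R_c}\right)(r - R_c)
    \right],
    \qquad r \le R_c,
    ```

    so that both the potential and its derivative (the force) go continuously to zero at the cutoff $R_c$, with the damping parameter $\alpha$ controlling how strongly the $1/r$ tail is screened.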

  1. Locomotion Dynamics for Bio-inspired Robots with Soft Appendages: Application to Flapping Flight and Passive Swimming

    NASA Astrophysics Data System (ADS)

    Boyer, Frédéric; Porez, Mathieu; Morsli, Ferhat; Morel, Yannick

    2017-08-01

    In animal locomotion, whether in fish or flying insects, the use of flexible terminal organs or appendages greatly improves the performance of locomotion (thrust and lift). In this article, we propose a general unified framework for modeling and simulating the (bio-inspired) locomotion of robots using soft organs. The proposed approach is based on the model of Mobile Multibody Systems (MMS). The distributed flexibilities are modeled according to two major approaches: the Floating Frame Approach (FFA) and the Geometrically Exact Approach (GEA). Encompassing these two approaches in the Newton-Euler modeling formalism of robotics, this article proposes a unique modeling framework suited to the fast numerical integration of the dynamics of an MMS in both the FFA and the GEA. This general framework is applied to two illustrative examples drawn from bio-inspired locomotion: passive swimming in a von Kármán vortex street, and hovering flight with flexible flapping wings.

  2. A Framework to Learn Physics from Atomically Resolved Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vlcek, L.; Maksov, A.; Pan, M.

    Here, we present a generalized framework for physics extraction, i.e., the extraction of knowledge, from atomically resolved images, and show its utility by applying it to a model system of segregation of chalcogen atoms in an FeSe0.45Te0.55 superconductor system. We emphasize that the framework can be used for any imaging data for which a generative physical model exists. Consider that a generative physical model can produce a very large number of configurations, not all of which are observable. By applying a microscope function to a subset of this generated data, we form a simulated dataset on which statistics can be computed.

  3. POLARIS: Agent-based modeling framework development and implementation for integrated travel demand and network and operations simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Auld, Joshua; Hope, Michael; Ley, Hubert

    This paper discusses the development of an agent-based modelling software development kit, and the implementation and validation of a model using it that integrates dynamic simulation of travel demand, network supply and network operations. A description is given of the core utilities in the kit: a parallel discrete event engine, interprocess exchange engine, and memory allocator, as well as a number of ancillary utilities: visualization library, database IO library, and scenario manager. The overall framework emphasizes the design goals of generality, code agility, and high performance. This framework allows the modeling of several aspects of transportation systems that are typically done with separate stand-alone software applications, in a high-performance and extensible manner. Integrating such models as dynamic traffic assignment and disaggregate demand models has been a long-standing challenge for transportation modelers. The integrated approach shows a possible way to resolve this difficulty. The simulation model built from the POLARIS framework is a single, shared-memory process for handling all aspects of the integrated urban simulation. The resulting gains in computational efficiency and performance allow planning models to be extended to include previously separate aspects of the urban system, enhancing the utility of such models from the planning perspective. Initial tests with case studies involving traffic management center impacts on various network events, such as accidents, congestion and weather events, show the potential of the system.

  4. A generalized theoretical framework for the description of spin decoupling in solid-state MAS NMR: Offset effect on decoupling performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Kong Ooi; Meier, Beat H., E-mail: beme@ethz.ch; Ernst, Matthias, E-mail: maer@ethz.ch

    2016-09-07

    We present a generalized theoretical framework that allows the approximate but rapid analysis of residual couplings of arbitrary decoupling sequences in solid-state NMR under magic-angle spinning conditions. It is a generalization of the tri-modal Floquet analysis of TPPM decoupling [Scholz et al., J. Chem. Phys. 130, 114510 (2009)] where three characteristic frequencies are used to describe the pulse sequence. Such an approach can be used to describe arbitrary periodic decoupling sequences that differ only in the magnitude of the Fourier coefficients of the interaction-frame transformation. It allows a ∼100 times faster calculation of second-order residual couplings as a function of pulse sequence parameters than full spin-dynamics simulations. By comparing the theoretical calculations with full numerical simulations, we show the potential of the new approach to examine the performance of decoupling sequences. We exemplify the usefulness of this framework by analyzing the performance of commonly used high-power decoupling sequences and low-power decoupling sequences such as amplitude-modulated XiX (AM-XiX) and its super-cycled variant SC-AM-XiX. In addition, the effect of chemical-shift offset is examined for both high- and low-power decoupling sequences. The results show that the cross-terms between the dipolar couplings are the main contributions to the line broadening when offset is present. We also show that the SC-AM-XiX shows a better offset compensation.

  5. A generalized theoretical framework for the description of spin decoupling in solid-state MAS NMR: Offset effect on decoupling performance.

    PubMed

    Tan, Kong Ooi; Agarwal, Vipin; Meier, Beat H; Ernst, Matthias

    2016-09-07

    We present a generalized theoretical framework that allows the approximate but rapid analysis of residual couplings of arbitrary decoupling sequences in solid-state NMR under magic-angle spinning conditions. It is a generalization of the tri-modal Floquet analysis of TPPM decoupling [Scholz et al., J. Chem. Phys. 130, 114510 (2009)] where three characteristic frequencies are used to describe the pulse sequence. Such an approach can be used to describe arbitrary periodic decoupling sequences that differ only in the magnitude of the Fourier coefficients of the interaction-frame transformation. It allows a ∼100 times faster calculation of second-order residual couplings as a function of pulse sequence parameters than full spin-dynamics simulations. By comparing the theoretical calculations with full numerical simulations, we show the potential of the new approach to examine the performance of decoupling sequences. We exemplify the usefulness of this framework by analyzing the performance of commonly used high-power decoupling sequences and low-power decoupling sequences such as amplitude-modulated XiX (AM-XiX) and its super-cycled variant SC-AM-XiX. In addition, the effect of chemical-shift offset is examined for both high- and low-power decoupling sequences. The results show that the cross-terms between the dipolar couplings are the main contributions to the line broadening when offset is present. We also show that the SC-AM-XiX shows a better offset compensation.

  6. Particle acceleration in solar active regions being in the state of self-organized criticality.

    NASA Astrophysics Data System (ADS)

    Vlahos, Loukas

    We review the recent observational results on flare initiation and particle acceleration in solar active regions. Elaborating a statistical approach to describe the spatiotemporally intermittent electric field structures formed inside a flaring solar active region, we investigate the efficiency of such structures in accelerating charged particles (electrons and protons). The large-scale magnetic configuration in the solar atmosphere responds to the strong turbulent flows that convey perturbations across the active region by initiating avalanche-type processes. The resulting unstable structures correspond to small-scale dissipation regions hosting strong electric fields. Previous research on particle acceleration in strongly turbulent plasmas provides a general framework for addressing such a problem. This framework combines various electromagnetic field configurations obtained by magnetohydrodynamical (MHD) or cellular automata (CA) simulations, or by employing a statistical description of the field's strength and configuration, with test particle simulations. We work on data-driven 3D magnetic field extrapolations based on self-organized criticality (SOC) models. A relativistic test-particle simulation traces each particle's guiding center within these configurations. Using the simulated particle-energy distributions we test our results against observations in the framework of the collisional thick target model (CTTM) of solar hard X-ray (HXR) emission, and compare our results with current observations.

  7. WavePropaGator: interactive framework for X-ray free-electron laser optics design and simulations.

    PubMed

    Samoylova, Liubov; Buzmakov, Alexey; Chubar, Oleg; Sinn, Harald

    2016-08-01

    This article describes the WavePropaGator (WPG) package, a new interactive software framework for coherent and partially coherent X-ray wavefront propagation simulations. The package has been developed at European XFEL for users at existing and emerging free-electron laser (FEL) facilities, as well as at third-generation synchrotron sources and future diffraction-limited storage rings. The WPG addresses the needs of beamline scientists and user groups to facilitate the design, optimization and improvement of X-ray optics to meet their experimental requirements. The package uses the Synchrotron Radiation Workshop (SRW) C/C++ library and its Python binding for numerical wavefront propagation simulations. The framework runs reliably under Linux, Microsoft Windows 7 and Apple Mac OS X and is distributed under an open-source license. The available tools allow for varying source parameters and optics layouts and visualizing the results interactively. The wavefront history structure can be used for tracking changes in every particular wavefront during propagation. The batch propagation mode enables processing of multiple wavefronts in workflow mode. The paper presents a general description of the package and gives some recent application examples, including modeling of full X-ray FEL beamlines and start-to-end simulation of experiments.

  8. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries.

    PubMed

    Drawert, Brian; Engblom, Stefan; Hellander, Andreas

    2012-06-22

    Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be both computationally efficient, employing sophisticated algorithms, yet at the same time flexible enough to meet present and future needs of increasingly complex biological modeling. We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to a mature geometry and mesh handling external software (Comsol Multiphysics) provides for a stable and interactive environment for model construction. The core simulation routines are logically separated from the model building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much as an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines. Since the overall design effectively hides the complexity of managing the geometry and meshes, this means that newly developed methods may be tested in a realistic setting already at an early stage of development. In this paper we demonstrate, in a series of examples with high relevance to the molecular systems biology community, that the proposed software framework is a useful tool for both practitioners and developers of spatial stochastic simulation algorithms. Through the combined efforts of algorithm development and improved modeling accuracy, increasingly complex biological models become feasible to study through computational methods. URDME is freely available at http://www.urdme.org.

  9. An integrated assessment modeling framework for uncertainty studies in global and regional climate change: the MIT IGSM-CAM (version 1.0)

    NASA Astrophysics Data System (ADS)

    Monier, E.; Scott, J. R.; Sokolov, A. P.; Forest, C. E.; Schlosser, C. A.

    2013-12-01

    This paper describes a computationally efficient framework for uncertainty studies in global and regional climate change. In this framework, the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity to a human activity model, is linked to the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM). Since the MIT IGSM-CAM framework (version 1.0) incorporates a human activity model, it is possible to analyze uncertainties in emissions resulting from both uncertainties in the underlying socio-economic characteristics of the economic model and in the choice of climate-related policies. Another major feature is the flexibility to vary key climate parameters controlling the climate system response to changes in greenhouse gas and aerosol concentrations, e.g., climate sensitivity, ocean heat uptake rate, and strength of the aerosol forcing. The IGSM-CAM is not only able to realistically simulate the present-day mean climate and the observed trends at the global and continental scale, but it also simulates ENSO variability with realistic time scales, seasonality and patterns of SST anomalies, albeit with stronger magnitudes than observed. The IGSM-CAM shares the same general strengths and limitations as the Coupled Model Intercomparison Project Phase 3 (CMIP3) models in simulating present-day annual mean surface temperature and precipitation. Over land, the IGSM-CAM shows similar biases to the NCAR Community Climate System Model (CCSM) version 3, which shares the same atmospheric model. This study also presents 21st century simulations based on two emissions scenarios (an unconstrained scenario and a stabilization scenario at 660 ppm CO2-equivalent), similar respectively to the Representative Concentration Pathways RCP8.5 and RCP4.5 scenarios, and three sets of climate parameters. Results of the simulations with the chosen climate parameters provide a good approximation for the median and the 5th and 95th percentiles of the probability distribution of 21st century changes in global mean surface air temperature from previous work with the IGSM. Because the IGSM-CAM framework only considers one particular climate model, it cannot be used to assess the structural modeling uncertainty arising from differences in the parameterization suites of climate models. However, comparison of the IGSM-CAM projections with simulations of 31 CMIP5 models under the RCP4.5 and RCP8.5 scenarios shows that the range of warming at the continental scale is in very good agreement between the two ensembles, except over Antarctica, where the IGSM-CAM overestimates the warming. This demonstrates that by sampling the climate system response, the IGSM-CAM, even though it relies on one single climate model, can essentially reproduce the range of future continental warming simulated by more than 30 different models. Precipitation changes projected in the IGSM-CAM simulations and the CMIP5 multi-model ensemble both display a large uncertainty at the continental scale. The two ensembles show good agreement over Asia and Europe. However, the ranges of precipitation changes do not overlap, though they are of similar size, over Africa and South America, two continents where models generally show little agreement in the sign of precipitation changes and where CCSM3 tends to be an outlier.
Overall, the IGSM-CAM provides an efficient and consistent framework to explore the large uncertainty in future projections of global and regional climate change associated with uncertainty in the climate response and projected emissions.

  10. Pressure calculation in hybrid particle-field simulations

    NASA Astrophysics Data System (ADS)

    Milano, Giuseppe; Kawakatsu, Toshihiro

    2010-12-01

    In the framework of a recently developed scheme for hybrid particle-field simulation techniques, in which self-consistent field (SCF) theory and particle models (molecular dynamics) are combined [J. Chem. Phys. 130, 214106 (2009)], we developed a general formulation for the calculation of the instantaneous pressure and stress tensor. The expressions have been derived from the statistical mechanical definition of the pressure, starting from the expression for the free energy functional in the SCF theory. An implementation of the derived formulation suitable for hybrid particle-field molecular dynamics-self-consistent field simulations is described. A series of test simulations on model systems is reported, comparing the calculated pressure with values obtained from standard molecular dynamics simulations based on pair potentials.

  11. A framework for human-hydrologic system model development integrating hydrology and water management: application to the Cutzamala water system in Mexico

    NASA Astrophysics Data System (ADS)

    Wi, S.; Freeman, S.; Brown, C.

    2017-12-01

    This study presents a general approach to developing computational models of human-hydrologic systems where human modification of hydrologic surface processes is significant or dominant. A river basin system is represented by a network of human-hydrologic response units (HHRUs) identified based on locations where river regulation occurs (e.g., reservoir operation and diversions). Natural and human processes in HHRUs are simulated in a holistic framework that integrates component models representing rainfall-runoff, river routing, reservoir operation, flow diversion and water use processes. We illustrate the approach in a case study of the Cutzamala water system (CWS) in Mexico, a complex inter-basin water transfer system supplying the Mexico City Metropolitan Area (MCMA). The human-hydrologic system model for the CWS (CUTZSIM) is evaluated against streamflow and reservoir storages measured across the CWS and against the water supplied to the MCMA. The CUTZSIM improves the representation of hydrology and river-operation interaction and, in so doing, advances evaluation of system-wide water management consequences under altered climatic and demand regimes. The integrated modeling framework enables evaluation and simulation of model errors throughout the river basin, including errors in representation of the human component processes. Heretofore, model error evaluation, predictive error intervals and the resultant improved understanding have been limited to hydrologic processes. The general framework represents an initial step towards fuller understanding and prediction of the many and varied processes that determine the hydrologic fluxes and state variables in real river basins.

  12. Modeling the Transfer Function for the Dark Energy Survey

    DOE PAGES

    Chang, C.

    2015-03-04

    We present a forward-modeling simulation framework designed to model the data products from the Dark Energy Survey (DES). This forward-model process can be thought of as a transfer function—a mapping from cosmological/astronomical signals to the final data products used by the scientists. Using output from the cosmological simulations (the Blind Cosmology Challenge), we generate simulated images (the Ultra Fast Image Simulator) and catalogs representative of the DES data. In this work we demonstrate the framework by simulating the 244 deg² coadd images and catalogs in five bands for the DES Science Verification data. The simulation output is compared with the corresponding data to show that major characteristics of the images and catalogs can be captured. We also point out several directions for future improvements. Two practical examples—star-galaxy classification and proximity effects on object detection—are then used to illustrate how one can use the simulations to address systematics issues in data analysis. With a clear understanding of the simplifications in our model, we show that one can use the simulations side-by-side with data products to interpret the measurements. This forward-modeling approach is generally applicable to other upcoming and future surveys. It provides a powerful tool for systematics studies that is sufficiently realistic and highly controllable.

  13. Simulability of observables in general probabilistic theories

    NASA Astrophysics Data System (ADS)

    Filippov, Sergey N.; Heinosaari, Teiko; Leppäjärvi, Leevi

    2018-06-01

    The existence of incompatibility is one of the most fundamental features of quantum theory and can be found at the core of many of the theory's distinguishing features, such as Bell inequality violations and the no-broadcasting theorem. A scheme for obtaining new observables from existing ones via classical operations, the so-called simulation of observables, has led to an extension of the notion of compatibility for measurements. We consider the simulation of observables within the operational framework of general probabilistic theories and introduce the concept of simulation irreducibility. While a simulation irreducible observable can only be simulated by itself, we show that any observable can be simulated by simulation irreducible observables, which in the quantum case correspond to extreme rank-1 positive-operator-valued measures. We also consider cases where the set of simulators is restricted in one of two ways: in terms of either the number of simulating observables or their number of outcomes. The former is seen to be closely connected to compatibility and k compatibility, whereas the latter leads to a partial characterization for dichotomic observables. In addition to the quantum case, we further demonstrate these concepts in state spaces described by regular polygons.

  14. Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology.

    PubMed

    Schaff, James C; Gao, Fei; Li, Ye; Novak, Igor L; Slepchenko, Boris M

    2016-12-01

    Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium 'sparks' as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell.

  15. A Framework for Performing Multiscale Stochastic Progressive Failure Analysis of Composite Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2006-01-01

    A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis - Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.

  16. A Framework for Performing Multiscale Stochastic Progressive Failure Analysis of Composite Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2007-01-01

    A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis-Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.

  17. A consensus-based framework for design, validation, and implementation of simulation-based training curricula in surgery.

    PubMed

    Zevin, Boris; Levy, Jeffrey S; Satava, Richard M; Grantcharov, Teodor P

    2012-10-01

    Simulation-based training can improve technical and nontechnical skills in surgery. To date, there is no consensus on the principles for design, validation, and implementation of a simulation-based surgical training curriculum. The aim of this study was to define such principles and formulate them into an interoperable framework using international expert consensus based on the Delphi method. Literature was reviewed, 4 international experts were queried, and consensus conference of national and international members of surgical societies was held to identify the items for the Delphi survey. Forty-five international experts in surgical education were invited to complete the online survey by ranking each item on a Likert scale from 1 to 5. Consensus was predefined as Cronbach's α ≥0.80. Items that 80% of experts ranked as ≥4 were included in the final framework. Twenty-four international experts with training in general surgery (n = 11), orthopaedic surgery (n = 2), obstetrics and gynecology (n = 3), urology (n = 1), plastic surgery (n = 1), pediatric surgery (n = 1), otolaryngology (n = 1), vascular surgery (n = 1), military (n = 1), and doctorate-level educators (n = 2) completed the iterative online Delphi survey. Consensus among participants was achieved after one round of the survey (Cronbach's α = 0.91). The final framework included predevelopment analysis; cognitive, psychomotor, and team-based training; curriculum validation evaluation and improvement; and maintenance of training. The Delphi methodology allowed for determination of international expert consensus on the principles for design, validation, and implementation of a simulation-based surgical training curriculum. These principles were formulated into a framework that can be used internationally across surgical specialties as a step-by-step guide for the development and validation of future simulation-based training curricula. Copyright © 2012 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  18. Direct simulation Monte Carlo method for gas flows in micro-channels with bends with added curvature

    NASA Astrophysics Data System (ADS)

    Tisovský, Tomáš; Vít, Tomáš

    Gas flows in micro-channels are simulated using the open-source Direct Simulation Monte Carlo (DSMC) code dsmcFOAM, written for general rarefied gas flow applications within the framework of the open-source C++ toolbox OpenFOAM. The aim of this paper is to investigate the flow in a micro-channel with a bend with added curvature. Results are compared with flows in a channel without added curvature and in an equivalent straight channel. The effects of micro-channel bends were already thoroughly investigated by White et al.; the geometry proposed by White is also used here for reference.

  19. Quasi-optical simulation of the electron cyclotron plasma heating in a mirror magnetic trap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shalashov, A. G., E-mail: ags@appl.sci-nnov.ru; Balakin, A. A.; Khusainov, T. A.

    The resonance microwave plasma heating in a large-scale open magnetic trap is simulated taking into account all the basic wave effects during the propagation of short-wavelength wave beams (diffraction, dispersion, and aberration) within the framework of the consistent quasi-optical approximation of Maxwell's equations. The quasi-optical method is generalized to the case of inhomogeneous media with absorption and dispersion, a new form of the quasi-optical equation is obtained, an efficient method for numerical integration is found, and simulation results are verified on the GDT facility (Novosibirsk).

  20. Robot, computer problem solving system

    NASA Technical Reports Server (NTRS)

    Becker, J. D.

    1972-01-01

    The development of a computer problem solving system is reported that considers physical problems faced by an artificial robot moving around in a complex environment. Fundamental interaction constraints with a real environment are simulated for the robot by visual scan and creation of an internal environmental model. The programming system used in constructing the problem solving system for the simulated robot and its simulated world environment is outlined together with the task that the system is capable of performing. A very general framework for understanding the relationship between an observed behavior and an adequate description of that behavior is included.

  1. Prognostic residual mean flow in an ocean general circulation model and its relation to prognostic Eulerian mean flow

    DOE PAGES

    Saenz, Juan A.; Chen, Qingshan; Ringler, Todd

    2015-05-19

    Recent work has shown that taking the thickness-weighted average (TWA) of the Boussinesq equations in buoyancy coordinates results in exact equations governing the prognostic residual mean flow where eddy–mean flow interactions appear in the horizontal momentum equations as the divergence of the Eliassen–Palm flux tensor (EPFT). It has been proposed that, given the mathematical tractability of the TWA equations, the physical interpretation of the EPFT, and its relation to potential vorticity fluxes, the TWA is an appropriate framework for modeling ocean circulation with parameterized eddies. The authors test the feasibility of this proposition and investigate the connections between the TWA framework and the conventional framework used in models, where Eulerian mean flow prognostic variables are solved for. Using the TWA framework as a starting point, this study explores the well-known connections between vertical transfer of horizontal momentum by eddy form drag and eddy overturning by the bolus velocity, used by Greatbatch and Lamb and Gent and McWilliams to parameterize eddies. After implementing the TWA framework in an ocean general circulation model, we verify our analysis by comparing the flows in an idealized Southern Ocean configuration simulated using the TWA and conventional frameworks with the same mesoscale eddy parameterization.

  2. Artificial intelligence framework for simulating clinical decision-making: a Markov decision process approach.

    PubMed

    Bennett, Casey C; Hauser, Kris

    2013-01-01

    In the modern healthcare system, rapidly expanding costs/complexity, the growing myriad of treatment options, and exploding information streams that often do not effectively reach the front lines hinder the ability to choose optimal treatment decisions over time. The goal in this paper is to develop a general-purpose (non-disease-specific) computational/artificial intelligence (AI) framework to address these challenges. This framework serves two potential functions: (1) a simulation environment for exploring various healthcare policies, payment methodologies, etc., and (2) the basis for clinical artificial intelligence, an AI that can "think like a doctor". This approach combines Markov decision processes and dynamic decision networks to learn from clinical data and develop complex plans via simulation of alternative sequential decision paths, while capturing the sometimes conflicting, sometimes synergistic interactions of various components in the healthcare system. It can operate in partially observable environments (in the case of missing observations or data) by maintaining belief states about patient health status, and functions as an online agent that plans and re-plans as actions are performed and new observations are obtained. This framework was evaluated using real patient data from an electronic health record. The results demonstrate the feasibility of this approach; such an AI framework easily outperforms the current treatment-as-usual (TAU) case-rate/fee-for-service models of healthcare. The cost per unit of outcome change (CPUC) was $189 vs. $497 for AI vs. TAU (where lower is considered optimal), while at the same time the AI approach could obtain a 30-35% increase in patient outcomes. Tweaking certain AI model parameters could further enhance this advantage, obtaining approximately 50% more improvement (outcome change) for roughly half the costs. Given careful design and problem formulation, an AI simulation framework can approximate optimal decisions even in complex and uncertain environments. Future work is described that outlines potential lines of research and integration of machine learning algorithms for personalized medicine. Copyright © 2012 Elsevier B.V. All rights reserved.
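
    At its core, such a framework searches over sequential decision policies on a stochastic model of patient state; a toy value-iteration sketch of the underlying MDP machinery (all states, probabilities, and costs invented for illustration, far simpler than the paper's clinical model) is:

    ```python
    # Toy MDP: choose "treat" or "wait" to minimize discounted expected cost.
    import numpy as np

    states = ["stable", "deteriorating"]
    actions = ["treat", "wait"]
    # P[a][s, s']: assumed transition probabilities; C[a][s]: assumed immediate costs
    P = {"treat": np.array([[0.9, 0.1], [0.6, 0.4]]),
         "wait":  np.array([[0.7, 0.3], [0.2, 0.8]])}
    C = {"treat": np.array([50.0, 80.0]), "wait": np.array([0.0, 100.0])}
    discount = 0.95

    V = np.zeros(len(states))
    for _ in range(500):                  # value iteration to a fixed point
        Q = np.array([C[a] + discount * P[a] @ V for a in actions])
        V = Q.min(axis=0)
    policy = [actions[i] for i in Q.argmin(axis=0)]
    print(dict(zip(states, policy)), V.round(1))
    ```

    The paper's agent extends this picture with belief states over partially observed health status and online re-planning as new observations arrive.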

  3. Interactive Resource Planning—An Anticipative Concept in the Simulation-Based Decision Support System EXPOSIM

    NASA Astrophysics Data System (ADS)

    Leopold-Wildburger, Ulrike; Pickl, Stefan

    2008-10-01

    In our research we intend to use experiments to study human behavior in a simulation environment based on a simple Lotka-Volterra predator-prey ecology. The aim is to study the influence of participants' harvesting strategies and certain personality traits derived from [1] on the outcome in terms of sustainability and economic performance. Such an approach is embedded in a research program which intends to develop and understand interactive resource planning processes. We present the general framework as well as the new decision support system EXPOSIM. The key element is the combination of experimental design, analytical understanding of time-discrete systems (especially Lotka-Volterra systems) and economic performance. In the first part, the general role of laboratory experiments is discussed. The second part summarizes the concept of sustainable development, taken from [18]. As we use Lotka-Volterra systems as the basis for our simulations, a theoretical framework is described afterwards. It is possible to determine optimal behavior for those systems. The empirical setting puts the subjects in the position of a decision-maker: they are able to model the environment in such a way that harvesting can be observed. We suggest an experimental setting which might lead to new insights in an anticipatory sense.
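
    A minimal sketch of the kind of time-discrete predator-prey dynamics with harvesting that such an environment simulates (all coefficients and the harvesting rule are illustrative assumptions, not EXPOSIM's actual values):

    ```python
    # Discrete-time Lotka-Volterra step with a harvesting decision on the prey.
    def step(prey, pred, harvest, a=1.1, b=0.002, c=0.95, d=0.002):
        new_prey = prey * (a - b * pred) - harvest   # subject's harvesting choice
        new_pred = pred * (c + d * prey)
        return max(new_prey, 0.0), max(new_pred, 0.0)

    prey, pred, total_yield = 100.0, 20.0, 0.0
    for period in range(40):                         # one simulated planning horizon
        harvest = 0.05 * prey                        # e.g., a constant-rate strategy
        prey, pred = step(prey, pred, harvest)
        total_yield += harvest
    print(round(prey, 1), round(pred, 1), round(total_yield, 1))
    ```

    Comparing the cumulative yield against the surviving stocks across different harvesting rules is exactly the sustainability-versus-economic-performance trade-off the experiments examine.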

  4. MISO: Mixed Integer Surrogate Optimization framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Juliane

    MISO is an optimization framework for solving computationally expensive mixed-integer, black-box, global optimization problems. MISO uses surrogate models to approximate the computationally expensive objective function. Hence, derivative information, which is generally unavailable for black-box simulation objective functions, is not needed. MISO allows the user to choose the initial experimental design strategy, the type of surrogate model, and the sampling strategy.
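
    As a rough sketch of the surrogate-model loop such solvers implement (illustrative only: `expensive_black_box` and all settings are invented, and the real MISO offers several surrogates, initial designs, and sampling strategies, and handles integer variables more carefully):

    ```python
    # Surrogate-based optimization: fit a cheap model to expensive evaluations,
    # then spend most of the search effort on the surrogate instead of the truth.
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    def expensive_black_box(x):                    # stand-in for a costly simulation
        return (x[0] - 3) ** 2 + (x[1] - 0.7) ** 2

    rng = np.random.default_rng(1)
    lo, hi = np.array([0.0, 0.0]), np.array([10.0, 1.0])
    X = rng.uniform(lo, hi, size=(8, 2)); X[:, 0] = np.round(X[:, 0])  # x0 integer
    y = np.array([expensive_black_box(x) for x in X])

    for _ in range(20):                            # iterative sampling loop
        surrogate = RBFInterpolator(X, y)          # cheap approximation of the objective
        cand = rng.uniform(lo, hi, size=(500, 2)); cand[:, 0] = np.round(cand[:, 0])
        x_next = cand[np.argmin(surrogate(cand))]  # minimize the surrogate, not the truth
        X = np.vstack([X, x_next]); y = np.append(y, expensive_black_box(x_next))

    print("best point:", X[np.argmin(y)], "value:", y.min())
    ```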

  5. A generalized Poisson solver for first-principles device simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bani-Hashemian, Mohammad Hossein; VandeVondele, Joost, E-mail: joost.vandevondele@mat.ethz.ch; Brück, Sascha

    2016-01-28

    Electronic structure calculations of atomistic systems based on density functional theory involve solving the Poisson equation. In this paper, we present a plane-wave based algorithm for solving the generalized Poisson equation subject to periodic or homogeneous Neumann conditions on the boundaries of the simulation cell and Dirichlet type conditions imposed at arbitrary subdomains. In this way, source, drain, and gate voltages can be imposed across atomistic models of electronic devices. Dirichlet conditions are enforced as constraints in a variational framework giving rise to a saddle point problem. The resulting system of equations is then solved using a stationary iterative method in which the generalized Poisson operator is preconditioned with the standard Laplace operator. The solver can make use of any sufficiently smooth function modelling the dielectric constant, including density dependent dielectric continuum models. For all the boundary conditions, consistent derivatives are available and molecular dynamics simulations can be performed. The convergence behaviour of the scheme is investigated and its capabilities are demonstrated.
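
    A one-dimensional toy version of the preconditioning idea (a concept sketch, not the paper's plane-wave algorithm; the grid, dielectric profile, and source are invented):

    ```python
    # Solve d/dx( eps(x) dphi/dx ) = f with Dirichlet ends via a stationary
    # iteration in which the constant-coefficient Laplacian is the preconditioner.
    import numpy as np

    n, h = 200, 1.0 / 201
    x = np.linspace(h, 1 - h, n)
    eps = 1.0 + 0.5 * np.sin(2 * np.pi * x)         # smooth dielectric profile
    f = np.full(n, -1.0)                            # source term; phi = 0 at both ends

    def apply_gen_poisson(phi):
        # divergence form with eps evaluated at cell faces
        phi_ext = np.concatenate(([0.0], phi, [0.0]))
        eps_ext = np.concatenate(([eps[0]], eps, [eps[-1]]))
        eps_f = 0.5 * (eps_ext[:-1] + eps_ext[1:])  # face-centred eps
        flux = eps_f * np.diff(phi_ext) / h
        return np.diff(flux) / h

    # dense standard Laplacian, inverted once, used as the preconditioner
    L0 = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
          + np.diag(np.ones(n - 1), -1)) / h**2
    L0_inv = np.linalg.inv(L0)

    phi = np.zeros(n)
    for _ in range(200):                            # preconditioned Richardson iteration
        residual = f - apply_gen_poisson(phi)
        phi += L0_inv @ residual
    print("residual norm:", np.linalg.norm(f - apply_gen_poisson(phi)))
    ```

    Because the generalized operator differs from the Laplacian only through the smooth factor eps(x), the preconditioned iteration contracts at a rate set by the spread of eps, which is the intuition behind the paper's choice of preconditioner.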

  6. Non-technical skills evaluation in the critical care air ambulance environment: introduction of an adapted rating instrument--an observational study.

    PubMed

    Myers, Julia A; Powell, David M C; Psirides, Alex; Hathaway, Karyn; Aldington, Sarah; Haney, Michael F

    2016-03-08

    In the isolated and dynamic health-care setting of critical care air ambulance transport, the quality of clinical care is strongly influenced by non-technical skills such as anticipating, recognising and understanding, decision making, and teamwork. However, there are no published reports identifying or applying a non-technical skills framework specific to an intensive care air ambulance setting. The objective of this study was to adapt and evaluate a non-technical skills rating framework for the air ambulance clinical environment. In the first phase of the project the anaesthetists' non-technical skills (ANTS) framework was adapted to the air ambulance setting, using data collected directly from clinician groups, published literature, and field observation. In the second phase experienced and inexperienced inter-hospital transport clinicians completed a simulated critical care air transport scenario, and their non-technical skills performance was independently rated by two blinded assessors. Observed and self-rated general clinical performance ratings were also collected. Rank-based statistical tests were used to examine differences in the performance of experienced and inexperienced clinicians, and relationships between different assessment approaches and assessors. The framework developed during phase one was referred to as an aeromedical non-technical skills framework, or AeroNOTS. During phase two 16 physicians from speciality training programmes in intensive care, emergency medicine and anaesthesia took part in the clinical simulation study. Clinicians with inter-hospital transport experience performed more highly than those without experience, according to both AeroNOTS non-technical skills ratings (p = 0.001) and general performance ratings (p = 0.003). Self-ratings did not distinguish experienced from inexperienced transport clinicians (p = 0.32) and were not strongly associated with either observed general performance (r(s) = 0.4, p = 0.11) or observed non-technical skills performance (r(s) = 0.4, p = 0.1). This study describes a framework which characterises the non-technical skills required by critical care air ambulance clinicians, and distinguishes higher and lower levels of performance. The AeroNOTS framework could be used to facilitate education and training in non-technical skills for air ambulance clinicians, and further evaluation of this rating system is merited.

  7. Interaction of light with hematite hierarchical structures: Experiments and simulations

    NASA Astrophysics Data System (ADS)

    Distaso, Monica; Zhuromskyy, Oleksander; Seemann, Benjamin; Pflug, Lukas; Mačković, Mirza; Encina, Ezequiel; Taylor, Robin Klupp; Müller, Rolf; Leugering, Günter; Spiecker, Erdmann; Peschel, Ulf; Peukert, Wolfgang

    2017-03-01

    Mesocrystalline particles have been recognized as a class of multifunctional materials with potential applications in different fields. However, the internal organization of nanocomposite mesocrystals and its influence on the final properties have not yet been investigated. In this paper, a novel strategy based on electrodynamic simulations is developed to shed light on how the internal structure of mesocrystals influences their optical properties. First, a unified design protocol is reported for the fabrication of hematite/PVP particles with different morphologies, such as pseudo-cubic, rod-like and apple-like structures, and controlled particle size distributions. The optical properties of hematite/PVP mesocrystals are effectively simulated by taking their aggregate and nanocomposite structure into consideration. The superposition T-matrix approach accounts for the aggregate nature of mesocrystalline particles and validates the effective medium approximation used in the framework of Mie theory and in electromagnetic simulations such as the finite element method. The approach described in our paper provides the framework to understand and predict the optical properties of mesocrystals and, more generally, of hierarchical nanostructured particles.
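
    Effective medium approximations of this kind are typically of the Maxwell Garnett type; a generic version is sketched below (the permittivities and fill fraction are placeholders, not the paper's hematite/PVP values):

    ```python
    # Maxwell Garnett effective-medium mixing for spherical inclusions
    # in a host medium (generic formula; inputs below are placeholders).
    def maxwell_garnett(eps_incl, eps_host, f):
        """Effective permittivity for inclusions at fill fraction f."""
        num = eps_incl + 2 * eps_host + 2 * f * (eps_incl - eps_host)
        den = eps_incl + 2 * eps_host - f * (eps_incl - eps_host)
        return eps_host * num / den

    print(maxwell_garnett(eps_incl=6.5 + 1.0j, eps_host=2.4, f=0.6))
    ```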

  8. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933
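
    A toy illustration of the decoupled-rate idea, with threads standing in for the framework's controllers (this is not the authors' framework, and the model, rates, and update rule are invented):

    ```python
    # Toy decoupled simulation: a haptics loop at ~1 kHz and a viewer
    # loop at ~60 Hz share one model object, each at its own rate.
    import threading, time

    class Model:
        def __init__(self):
            self.lock = threading.Lock()
            self.position = 0.0

        def haptic_update(self, dt):
            with self.lock:
                self.position += 0.1 * dt   # placeholder force response

    model, stop = Model(), time.time() + 1.0

    def haptics():                           # ~1000 Hz controller
        while time.time() < stop:
            model.haptic_update(0.001)
            time.sleep(0.001)

    def viewer():                            # ~60 Hz renderer
        while time.time() < stop:
            with model.lock:
                print(f"render position={model.position:.3f}")
            time.sleep(1 / 60)

    t1 = threading.Thread(target=haptics)
    t2 = threading.Thread(target=viewer)
    t1.start(); t2.start(); t1.join(); t2.join()
    ```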

  9. Surgical model-view-controller simulation software framework for local and collaborative applications.

    PubMed

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.

  10. Establishing a Numerical Modeling Framework for Hydrologic Engineering Analyses of Extreme Storm Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xiaodong; Hossain, Faisal; Leung, L. Ruby

    In this study a numerical modeling framework for simulating extreme storm events was established using the Weather Research and Forecasting (WRF) model. Such a framework is necessary for the derivation of engineering parameters such as probable maximum precipitation that are the cornerstone of large water management infrastructure design. Here this framework was built based on a heavy storm that occurred in Nashville (USA) in 2010, and verified using two other extreme storms. To achieve the optimal setup, several combinations of model resolutions, initial/boundary conditions (IC/BC), cloud microphysics and cumulus parameterization schemes were evaluated using multiple metrics of precipitation characteristics. The evaluation suggests that WRF is most sensitive to the IC/BC option. Simulation generally benefits from finer resolutions up to 5 km. At the 15 km level, NCEP2 IC/BC produces better results, while NAM IC/BC performs best at the 5 km level. The recommended model configuration from this study is: NAM or NCEP2 IC/BC (depending on data availability), 15 km or 15 km-5 km nested grids, Morrison microphysics and Kain-Fritsch cumulus schemes. Validation of the optimal framework suggests that these options are good starting choices for modeling extreme events similar to the test cases. This optimal framework is proposed in response to emerging engineering demands of extreme storm events forecasting and analyses for design, operations and risk assessment of large water infrastructures.

  11. FEAST fundamental framework for electronic structure calculations: Reformulation and solution of the muffin-tin problem

    NASA Astrophysics Data System (ADS)

    Levin, Alan R.; Zhang, Deyin; Polizzi, Eric

    2012-11-01

    In a recent article by Polizzi (2009) [15], the FEAST algorithm has been presented as a general purpose eigenvalue solver which is ideally suited for addressing the numerical challenges in electronic structure calculations. Here, FEAST is presented beyond the “black-box” solver as a fundamental modeling framework which can naturally address the original numerical complexity of the electronic structure problem as formulated by Slater in 1937 [3]. The non-linear eigenvalue problem arising from the muffin-tin decomposition of the real-space domain is first derived and then reformulated to be solved exactly within the FEAST framework. This new framework is presented as a fundamental and practical solution for performing both accurate and scalable electronic structure calculations, bypassing the various issues of using traditional approaches such as linearization and pseudopotential techniques. A finite element implementation of this FEAST framework, along with simulation results for various molecular systems, is also presented and discussed.
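
    The contour-integration idea behind FEAST can be sketched in a few lines of dense linear algebra: approximate the spectral projector for an interval by quadrature over a circular contour, then solve a small Rayleigh-Ritz problem in the projected subspace. The sketch below does a single projection pass on a random symmetric matrix (the full algorithm iterates to convergence and uses sparse solvers; all sizes and the quadrature order are arbitrary):

    ```python
    # One projection pass of the FEAST idea (illustrative, not the
    # FEAST library): contour quadrature approximates the spectral
    # projector for the interval [lo, hi], then Rayleigh-Ritz.
    import numpy as np

    rng = np.random.default_rng(1)
    n, m = 200, 12                        # matrix size, subspace size
    A = rng.standard_normal((n, n)); A = (A + A.T) / 2

    lo, hi = -1.0, 1.0                    # search interval
    c, r = (lo + hi) / 2, (hi - lo) / 2
    Y = rng.standard_normal((n, m))
    Q = np.zeros((n, m))
    for theta in np.pi * (np.arange(8) + 0.5) / 8:   # 8 quadrature nodes
        z = c + r * np.exp(1j * theta)
        Q += np.real(r * np.exp(1j * theta)
                     * np.linalg.solve(z * np.eye(n) - A, Y)) / 8

    Q, _ = np.linalg.qr(Q)                # orthonormal subspace basis
    evals, _ = np.linalg.eigh(Q.T @ A @ Q)           # Rayleigh-Ritz
    print("eigenvalues found in interval:", evals[(evals > lo) & (evals < hi)])
    ```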

  12. Analysis of Gas-Particle Flows through Multi-Scale Simulations

    NASA Astrophysics Data System (ADS)

    Gu, Yile

    Multi-scale structures are inherent in gas-solid flows, which render the modeling efforts challenging. On one hand, detailed simulations in which the fine structures are resolved and particle properties can be directly specified can account for complex flow behaviors, but they are too computationally expensive to apply to larger systems. On the other hand, coarse-grained simulations demand much less computation but necessitate constitutive models which are often not readily available for given particle properties. The present study focuses on addressing this issue, as it seeks to provide a general framework through which one can obtain the required constitutive models from detailed simulations. To demonstrate the viability of this general framework, in which closures can be proposed for different particle properties, we focus on the van der Waals force of interaction between particles. We start with Computational Fluid Dynamics (CFD) - Discrete Element Method (DEM) simulations where the fine structures are resolved and the van der Waals force between particles can be directly specified, and obtain closures for stress and drag that are required for coarse-grained simulations. Specifically, we develop a new cohesion model that appropriately accounts for the van der Waals force between particles in CFD-DEM simulations. We then validate this cohesion model and the CFD-DEM approach by showing that they qualitatively capture experimental results in which the addition of small particles to gas fluidization reduces bubble sizes. Based on the DEM and CFD-DEM simulation results, we propose stress models that account for the van der Waals force between particles. Finally, we apply machine learning, specifically neural networks, to obtain a drag model that captures the effects of fine structures and inter-particle cohesion. We show that this approach, which can readily be applied to closures other than drag, can take advantage of the large amount of data generated from simulations, and therefore offers superior modeling performance over traditional approaches.
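
    The van der Waals closure in DEM cohesion models of this kind is typically a Hamaker-type sphere-sphere force; a generic sketch follows (the Hamaker constant, radii, and separation cutoff are placeholders, not the paper's calibrated values):

    ```python
    # Illustrative Hamaker-type van der Waals attraction between two
    # spheres, as commonly used in CFD-DEM cohesion closures.
    def vdw_force(r1, r2, h, hamaker=1e-19, h_min=1e-9):
        """Attractive force magnitude [N] in the Derjaguin limit:
        F = A * R_eff / (6 h^2), with the surface separation h floored
        at h_min to avoid the contact singularity (common practice)."""
        r_eff = r1 * r2 / (r1 + r2)
        h = max(h, h_min)
        return hamaker * r_eff / (6.0 * h ** 2)

    # Two 100-micron particles, 10 nm apart:
    print(vdw_force(50e-6, 50e-6, 10e-9))   # ~4e-9 N
    ```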

  13. TESSIM: a simulator for the Athena-X-IFU

    NASA Astrophysics Data System (ADS)

    Wilms, J.; Smith, S. J.; Peille, P.; Ceballos, M. T.; Cobo, B.; Dauser, T.; Brand, T.; den Hartog, R. H.; Bandler, S. R.; de Plaa, J.; den Herder, J.-W. A.

    2016-07-01

    We present the design of tessim, a simulator for the physics of transition edge sensors developed in the framework of the Athena end to end simulation effort. Designed to represent the general behavior of transition edge sensors and to provide input for engineering and science studies for Athena, tessim implements a numerical solution of the linearized equations describing these devices. The simulation includes a model for the relevant noise sources and several implementations of possible trigger algorithms. Input and output of the software are standard FITS files which can be visualized and processed using standard X-ray astronomical tool packages. Tessim is freely available as part of the SIXTE package (http://www.sternwarte.uni-erlangen.de/research/sixte/).
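
    For orientation, a toy nonlinear version of the electrothermal device equations that such simulators linearize can be integrated directly (all parameter values, the logistic transition shape, and the photon impulse below are invented; noise sources and triggering are omitted):

    ```python
    # Toy transition-edge-sensor electrothermal model: a bias circuit
    # coupled to a heat balance through the resistive transition R(T).
    import numpy as np
    from scipy.integrate import solve_ivp

    Tc, w, Rn = 0.100, 0.001, 0.01     # transition temp [K], width, normal R
    Lind, C, Vb = 1e-9, 1e-12, 1e-7    # inductance, heat capacity, bias [V]
    kappa, nexp, Tb = 2e-8, 3.5, 0.055 # thermal link constants, bath temp [K]

    def R(T):                          # logistic superconducting transition
        return Rn / (1.0 + np.exp(-(T - Tc) / w))

    def rhs(t, y):
        I, T = y
        dI = (Vb - I * R(T)) / Lind                            # bias circuit
        dT = (I * I * R(T) - kappa * (T**nexp - Tb**nexp)) / C # heat balance
        return [dI, dT]

    # Relax to the operating point, then deposit a photon's energy in T.
    eq = solve_ivp(rhs, (0, 0.05), [1e-5, 0.099], method="Radau").y[:, -1]
    eq[1] += 1e-18 / C                 # instantaneous heating (toy photon)
    pulse = solve_ivp(rhs, (0, 0.005), eq, method="Radau", max_step=1e-5)
    print("minimum current during pulse [A]:", pulse.y[0].min())
    ```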

  14. TESSIM: A Simulator for the Athena-X-IFU

    NASA Technical Reports Server (NTRS)

    Wilms, J.; Smith, S. J.; Peille, P.; Ceballos, M. T.; Cobo, B.; Dauser, T.; Brand, T.; Den Hartog, R. H.; Bandler, S. R.; De Plaa, J.

    2016-01-01

    We present the design of tessim, a simulator for the physics of transition edge sensors developed in the framework of the Athena end to end simulation effort. Designed to represent the general behavior of transition edge sensors and to provide input for engineering and science studies for Athena, tessim implements a numerical solution of the linearized equations describing these devices. The simulation includes a model for the relevant noise sources and several implementations of possible trigger algorithms. Input and output of the software are standard FITS files which can be visualized and processed using standard X-ray astronomical tool packages. Tessim is freely available as part of the SIXTE package (http://www.sternwarte.uni-erlangen.de/research/sixte/).

  15. Template-Based Geometric Simulation of Flexible Frameworks

    PubMed Central

    Wells, Stephen A.; Sartbaeva, Asel

    2012-01-01

    Specialised modelling and simulation methods implementing simplified physical models are valuable generators of insight. Template-based geometric simulation is a specialised method for modelling flexible framework structures made up of rigid units. We review the background, development and implementation of the method, and its applications to the study of framework materials such as zeolites and perovskites. The “flexibility window” property of zeolite frameworks is a particularly significant discovery made using geometric simulation. Software implementing geometric simulation of framework materials, “GASP”, is freely available to researchers. PMID:28817055

  16. Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology

    PubMed Central

    Gao, Fei; Li, Ye; Novak, Igor L.; Slepchenko, Boris M.

    2016-01-01

    Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium ‘sparks’ as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell. PMID:27959915
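
    A cartoon of the alternating deterministic/stochastic update (not the actual Smoldyn/Virtual Cell coupling; the geometry, rates, and coupling rule are invented):

    ```python
    # Hybrid step cartoon: a deterministic diffusion step for a
    # continuous field alternates with a stochastic event step for
    # discrete particles whose firing rate depends on the local field.
    import numpy as np

    rng = np.random.default_rng(2)
    nx, D, dx, dt = 100, 1.0, 0.1, 0.002
    u = np.zeros(nx)                                 # continuous species
    particles = list(rng.integers(0, nx, size=20))   # discrete species

    for step in range(500):
        # 1) deterministic: explicit diffusion step (D*dt/dx^2 = 0.2, stable)
        u[1:-1] += D * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
        # 2) stochastic: each particle releases into u at rate k(u_local)
        for pos in particles:
            rate = 5.0 / (1.0 + u[pos])              # saturating release rate
            if rng.random() < rate * dt:             # first-order event test
                u[pos] += 1.0
    print("total released mass:", u.sum())
    ```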

  17. Distribution of Steps with Finite-Range Interactions: Analytic Approximations and Numerical Results

    NASA Astrophysics Data System (ADS)

    González, Diego Luis; Jaramillo, Diego Felipe; Téllez, Gabriel; Einstein, T. L.

    2013-03-01

    While most Monte Carlo simulations assume that only nearest-neighbor steps interact elastically, most analytic frameworks (especially the generalized Wigner distribution) posit that each step elastically repels all others. In addition to the elastic repulsions, we allow for possible surface-state-mediated interactions. We investigate analytically and numerically how next-nearest-neighbor (NNN) interactions and, more generally, interactions out to the q'th nearest neighbor alter the form of the terrace-width distribution and of pair correlation functions (i.e. the sum over n'th neighbor distribution functions), which we investigated recently [2]. For physically plausible interactions, we find modest changes when NNN interactions are included and generally negligible changes when more distant interactions are allowed. We discuss methods for extracting, from simulated experimental data, the characteristic scale-setting terms in assumed potential forms.
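
    For reference, the generalized Wigner form of the terrace-width distribution discussed above, with its constants fixed by normalization and unit mean (standard conventions for this distribution):

    ```python
    # Generalized Wigner surmise P(s) = a * s**rho * exp(-b * s**2),
    # with a and b determined by unit normalization and unit mean.
    import numpy as np
    from scipy.special import gamma

    def wigner_twd(s, rho):
        b = (gamma((rho + 2) / 2) / gamma((rho + 1) / 2)) ** 2
        a = 2 * b ** ((rho + 1) / 2) / gamma((rho + 1) / 2)
        return a * s ** rho * np.exp(-b * s ** 2)

    s = np.linspace(0.01, 3, 300)
    ds = s[1] - s[0]
    for rho in (2, 4):              # rho encodes the repulsion strength
        p = wigner_twd(s, rho)
        print(rho, "mean ≈", (s * p).sum() * ds)   # ≈ 1 by construction
    ```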

  18. Modelling biological behaviours with the unified modelling language: an immunological case study and critique.

    PubMed

    Read, Mark; Andrews, Paul S; Timmis, Jon; Kumar, Vipin

    2014-10-06

    We present a framework to assist the diagrammatic modelling of complex biological systems using the unified modelling language (UML). The framework comprises three levels of modelling, ranging in scope from the dynamics of individual model entities to system-level emergent properties. By way of an immunological case study of the mouse disease experimental autoimmune encephalomyelitis, we show how the framework can be used to produce models that capture and communicate the biological system, detailing how biological entities, interactions and behaviours lead to higher-level emergent properties observed in the real world. We demonstrate how the UML can be successfully applied within our framework, and provide a critique of UML's ability to capture concepts fundamental to immunology and biology more generally. We show how specialized, well-explained diagrams with less formal semantics can be used where no suitable UML formalism exists. We highlight UML's lack of expressive ability concerning cyclic feedbacks in cellular networks, and the compounding concurrency arising from huge numbers of stochastic, interacting agents. To compensate for this, we propose several additional relationships for expressing these concepts in UML's activity diagram. We also demonstrate the ambiguous nature of class diagrams when applied to complex biology, and question their utility in modelling such dynamic systems. Models created through our framework are non-executable, and expressly free of simulation implementation concerns. They are a valuable complement and precursor to simulation specifications and implementations, focusing purely on thoroughly exploring the biology, recording hypotheses and assumptions, and serve as a communication medium detailing exactly how a simulation relates to the real biology.

  19. Modelling biological behaviours with the unified modelling language: an immunological case study and critique

    PubMed Central

    Read, Mark; Andrews, Paul S.; Timmis, Jon; Kumar, Vipin

    2014-01-01

    We present a framework to assist the diagrammatic modelling of complex biological systems using the unified modelling language (UML). The framework comprises three levels of modelling, ranging in scope from the dynamics of individual model entities to system-level emergent properties. By way of an immunological case study of the mouse disease experimental autoimmune encephalomyelitis, we show how the framework can be used to produce models that capture and communicate the biological system, detailing how biological entities, interactions and behaviours lead to higher-level emergent properties observed in the real world. We demonstrate how the UML can be successfully applied within our framework, and provide a critique of UML's ability to capture concepts fundamental to immunology and biology more generally. We show how specialized, well-explained diagrams with less formal semantics can be used where no suitable UML formalism exists. We highlight UML's lack of expressive ability concerning cyclic feedbacks in cellular networks, and the compounding concurrency arising from huge numbers of stochastic, interacting agents. To compensate for this, we propose several additional relationships for expressing these concepts in UML's activity diagram. We also demonstrate the ambiguous nature of class diagrams when applied to complex biology, and question their utility in modelling such dynamic systems. Models created through our framework are non-executable, and expressly free of simulation implementation concerns. They are a valuable complement and precursor to simulation specifications and implementations, focusing purely on thoroughly exploring the biology, recording hypotheses and assumptions, and serve as a communication medium detailing exactly how a simulation relates to the real biology. PMID:25142524

  20. A Non-Incompressible Non-Boussinesq (NINB) framework for studying atmospheric turbulence

    NASA Astrophysics Data System (ADS)

    Yan, C.; Archer, C. L.; Xie, S.; Ghaisas, N.

    2015-12-01

    The incompressible assumption is widely used for studying the turbulent atmospheric boundary layer (ABL) and is generally accepted when the Mach number < ~0.3 (velocity < ~100 m/s). Since the tips of modern wind turbine blades can reach and exceed this threshold, neglecting air compressibility will introduce errors. In addition, if air incompressibility does not hold, then the Boussinesq approximation, by which air density is treated as a constant except in the gravity term of the Navier-Stokes equation, is also invalidated. Here, we propose a new theoretical framework, called NINB for Non-Incompressible Non-Boussinesq, in which air is not considered incompressible and air density is treated as a non-turbulent 4D variable. First, the NINB mass, momentum, and energy conservation equations are developed using Reynolds averaging. Second, numerical simulations of the NINB equations, coupled with a k-epsilon turbulence model, are performed with the finite-volume method. Wind turbines are modeled with the actuator-line model using SOWFA (Software for Offshore/onshore Wind Farm Applications). Third, NINB results are compared with the traditional incompressible buoyant simulations performed by SOWFA with the same setup. The results show differences between NINB and traditional simulations in the neutral atmosphere with a wind turbine. The largest differences in wind speed (up to 1 m/s), turbulent kinetic energy (~10%), dissipation rate (~5%), and shear stress (~10%) occur near the turbine tip region. The power generation differences are 5-15% (depending on setup). These preliminary results suggest that compressibility effects are non-negligible around wind turbines and should be taken into account when forecasting wind power. Since only a few extra terms are introduced, the NINB framework may be an alternative to the traditional incompressible Boussinesq framework for studying the turbulent ABL in general (i.e., without turbines) in the absence of shock waves.

  1. An intelligent interactive simulator of clinical reasoning in general surgery.

    PubMed Central

    Wang, S.; el Ayeb, B.; Echavé, V.; Preiss, B.

    1993-01-01

    We introduce an interactive computer environment for teaching in general surgery and for diagnostic assistance. The environment consists of a knowledge-based system coupled with an intelligent interface that allows users to acquire conceptual knowledge and clinical reasoning techniques. Knowledge is represented internally within a probabilistic framework and externally through an interface inspired by Concept Graphics. Given a set of symptoms, the internal knowledge framework computes the most probable set of diseases as well as the best alternatives. The interface displays CGs illustrating the results and prompting essential facts of a medical situation or a process. The system is then ready to receive additional information or to suggest further investigation. Based on the new information, the system will narrow the solutions with increased belief coefficients. PMID:8130508
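
    A hedged sketch of this kind of probabilistic ranking (a naive-Bayes stand-in with invented diseases, symptoms, and probabilities; not the authors' knowledge base):

    ```python
    # Rank diseases by posterior belief given observed symptoms,
    # naive-Bayes style. All entries below are illustrative placeholders.
    priors = {"appendicitis": 0.3, "cholecystitis": 0.2, "pancreatitis": 0.1}
    p_symptom = {   # P(symptom | disease), invented values
        "appendicitis":  {"rlq_pain": 0.8, "fever": 0.6, "nausea": 0.7},
        "cholecystitis": {"rlq_pain": 0.1, "fever": 0.5, "nausea": 0.6},
        "pancreatitis":  {"rlq_pain": 0.05, "fever": 0.4, "nausea": 0.8},
    }

    def rank(symptoms):
        scores = {}
        for d, prior in priors.items():
            p = prior
            for s in symptoms:
                p *= p_symptom[d].get(s, 0.01)   # unseen symptom: small prob
            scores[d] = p
        total = sum(scores.values())
        return sorted(((p / total, d) for d, p in scores.items()), reverse=True)

    print(rank(["rlq_pain", "fever"]))   # most probable first, with beliefs
    ```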

  2. Automated numerical simulation of biological pattern formation based on visual feedback simulation framework

    PubMed Central

    Sun, Mingzhu; Xu, Hui; Zeng, Xingjuan; Zhao, Xin

    2017-01-01

    Biological pattern formation exhibits a variety of striking phenomena. Mathematical modeling using reaction-diffusion partial differential equation systems is employed to study the mechanisms of pattern formation. However, model parameter selection is both difficult and time consuming. In this paper, a visual feedback simulation framework is proposed to calculate the parameters of a mathematical model automatically, based on the basic principle of feedback control. In the simulation framework, the simulation results are visualized and the image features are extracted as the system feedback. Then, the unknown model parameters are obtained by comparing the image features of the simulation image and the target biological pattern. Considering two typical applications, the visual feedback simulation framework is applied to perform pattern formation simulations for vascular mesenchymal cells and lung development. In the simulation framework, the spot, stripe and labyrinthine patterns of vascular mesenchymal cells, as well as the normal branching pattern and the branching pattern lacking side branching for lung development, are obtained in a finite number of iterations. The simulation results indicate that it is easy to achieve the simulation targets, especially when the simulation patterns are sensitive to the model parameters. Moreover, this simulation framework can be extended to other types of biological pattern formation. PMID:28225811
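
    The loop reduces to: simulate, extract an image feature, compare with the target, update the parameter. A schematic sketch with a placeholder simulator and feature extractor (not the authors' code):

    ```python
    # Visual-feedback parameter tuning, reduced to a proportional
    # controller on a scalar image feature. Both functions are toys.
    def simulate(param):
        return [param * 0.8]            # placeholder simulation output

    def extract_features(result):
        return result[0]                # placeholder feature (e.g. stripe width)

    target, param, gain = 4.0, 1.0, 0.5
    for it in range(50):
        feature = extract_features(simulate(param))
        error = target - feature        # feedback signal
        if abs(error) < 1e-3:
            break
        param += gain * error           # proportional parameter update
    print(f"converged in {it} iterations, param = {param:.3f}")
    ```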

  3. Automated numerical simulation of biological pattern formation based on visual feedback simulation framework.

    PubMed

    Sun, Mingzhu; Xu, Hui; Zeng, Xingjuan; Zhao, Xin

    2017-01-01

    Biological pattern formation exhibits a variety of striking phenomena. Mathematical modeling using reaction-diffusion partial differential equation systems is employed to study the mechanisms of pattern formation. However, model parameter selection is both difficult and time consuming. In this paper, a visual feedback simulation framework is proposed to calculate the parameters of a mathematical model automatically, based on the basic principle of feedback control. In the simulation framework, the simulation results are visualized and the image features are extracted as the system feedback. Then, the unknown model parameters are obtained by comparing the image features of the simulation image and the target biological pattern. Considering two typical applications, the visual feedback simulation framework is applied to perform pattern formation simulations for vascular mesenchymal cells and lung development. In the simulation framework, the spot, stripe and labyrinthine patterns of vascular mesenchymal cells, as well as the normal branching pattern and the branching pattern lacking side branching for lung development, are obtained in a finite number of iterations. The simulation results indicate that it is easy to achieve the simulation targets, especially when the simulation patterns are sensitive to the model parameters. Moreover, this simulation framework can be extended to other types of biological pattern formation.

  4. Integrated Instrument Simulator Suites for Earth Science

    NASA Technical Reports Server (NTRS)

    Tanelli, Simone; Tao, Wei-Kuo; Matsui, Toshihisa; Hostetler, Chris; Hair, Johnathan; Butler, Carolyn; Kuo, Kwo-Sen; Niamsuwan, Noppasin; Johnson, Michael P.; Jacob, Joseph C.

    2012-01-01

    The NASA Earth Observing System Simulators Suite (NEOS3) is a modular framework of forward simulation tools for remote sensing of Earth's atmosphere from space. It was initiated as the Instrument Simulator Suite for Atmospheric Remote Sensing (ISSARS) under the NASA Advanced Information Systems Technology (AIST) program of the Earth Science Technology Office (ESTO) to enable science users to perform simulations based on advanced atmospheric and simple land surface models, and to rapidly integrate into a broad framework any experimental or innovative tools that they may have developed in this context. The name was changed to NEOS3 when the project was expanded to include more advanced modeling tools for the surface contributions, accounting for scattering and emission properties of layered surfaces (e.g., soil moisture, vegetation, snow and ice, subsurface layers). NEOS3 relies on a web-based graphic user interface and a three-stage processing strategy to generate simulated measurements. The user has full control over a wide range of customizations, both in terms of a priori assumptions and in terms of specific solvers or models used to calculate the measured signals. This presentation will demonstrate the general architecture and configuration procedures, and illustrate some sample products and the fundamental interface requirements for candidate modules for integration.

  5. A hybrid framework for coupling arbitrary summation-by-parts schemes on general meshes

    NASA Astrophysics Data System (ADS)

    Lundquist, Tomas; Malan, Arnaud; Nordström, Jan

    2018-06-01

    We develop a general interface procedure to couple both structured and unstructured parts of a hybrid mesh in a non-collocated, multi-block fashion. The target is to gain optimal computational efficiency in fluid dynamics simulations involving complex geometries. While guaranteeing stability, the proposed procedure is optimized for accuracy and requires minimal algorithmic modifications to already existing schemes. Initial numerical investigations confirm considerable efficiency gains of up to an order of magnitude compared to non-hybrid calculations.

  6. An Analytical Model for the Performance Analysis of Concurrent Transmission in IEEE 802.15.4

    PubMed Central

    Gezer, Cengiz; Zanella, Alberto; Verdone, Roberto

    2014-01-01

    Interference is a serious cause of performance degradation for IEEE 802.15.4 devices. The effect of concurrent transmissions in IEEE 802.15.4 has generally been investigated by means of simulation or experimental activities. In this paper, a mathematical framework for the derivation of the chip, symbol and packet error probability of a typical IEEE 802.15.4 receiver in the presence of interference is proposed. Both non-coherent and coherent demodulation schemes are considered by our model under the assumption of the absence of thermal noise. Simulation results are also added to assess the validity of the mathematical framework when the effect of thermal noise cannot be neglected. Numerical results show that the proposed analysis is in agreement with the measurement results in the literature under realistic working conditions. PMID:24658624
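
    A back-of-envelope version of the chip-to-symbol-to-packet error chain gives the flavor of such a derivation (hard-decision chip errors and an assumed minimum chip-sequence distance of 12 are illustrative simplifications; the paper's analysis is more refined):

    ```python
    # Rough chip -> symbol -> packet error chain for an IEEE
    # 802.15.4-style link (32-chip sequences, 4 bits per symbol).
    from math import comb

    def symbol_error(p_chip, n_chips=32, d_min=12):
        """A symbol is assumed to survive if fewer than d_min/2 of its
        n_chips chips are flipped (illustrative decoding criterion)."""
        t = d_min // 2 - 1
        ok = sum(comb(n_chips, i) * p_chip**i * (1 - p_chip)**(n_chips - i)
                 for i in range(t + 1))
        return 1 - ok

    def packet_error(p_sym, payload_bytes=20):
        n_symbols = payload_bytes * 2          # 4 bits per symbol
        return 1 - (1 - p_sym) ** n_symbols

    for p_chip in (0.01, 0.05, 0.1):
        ps = symbol_error(p_chip)
        print(p_chip, round(ps, 6), round(packet_error(ps), 6))
    ```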

  7. An analytical model for the performance analysis of concurrent transmission in IEEE 802.15.4.

    PubMed

    Gezer, Cengiz; Zanella, Alberto; Verdone, Roberto

    2014-03-20

    Interference is a serious cause of performance degradation for IEEE 802.15.4 devices. The effect of concurrent transmissions in IEEE 802.15.4 has generally been investigated by means of simulation or experimental activities. In this paper, a mathematical framework for the derivation of the chip, symbol and packet error probability of a typical IEEE 802.15.4 receiver in the presence of interference is proposed. Both non-coherent and coherent demodulation schemes are considered by our model under the assumption of the absence of thermal noise. Simulation results are also added to assess the validity of the mathematical framework when the effect of thermal noise cannot be neglected. Numerical results show that the proposed analysis is in agreement with the measurement results in the literature under realistic working conditions.

  8. A two-step hierarchical hypothesis set testing framework, with applications to gene expression data on ordered categories

    PubMed Central

    2014-01-01

    Background In complex large-scale experiments, in addition to simultaneously considering a large number of features, multiple hypotheses are often being tested for each feature. This leads to a problem of multi-dimensional multiple testing. For example, in gene expression studies over ordered categories (such as time-course or dose-response experiments), interest is often in testing differential expression across several categories for each gene. In this paper, we consider a framework for testing multiple sets of hypotheses, which can be applied to a wide range of problems. Results We adopt the concept of the overall false discovery rate (OFDR) for controlling false discoveries on the hypothesis set level. Based on an existing procedure for identifying differentially expressed gene sets, we discuss a general two-step hierarchical hypothesis set testing procedure, which controls the overall false discovery rate under independence across hypothesis sets. In addition, we discuss the concept of the mixed-directional false discovery rate (mdFDR), and extend the general procedure to enable directional decisions for two-sided alternatives. We applied the framework to the case of microarray time-course/dose-response experiments, and proposed three procedures for testing differential expression and making multiple directional decisions for each gene. Simulation studies confirm the control of the OFDR and mdFDR by the proposed procedures under independence and positive correlations across genes. Simulation results also show that two of our new procedures achieve higher power than previous methods. Finally, the proposed methodology is applied to a microarray dose-response study, to identify 17 β-estradiol-sensitive genes in breast cancer cells that are induced at low concentrations. Conclusions The framework we discuss provides a platform for multiple testing procedures covering situations involving two (or potentially more) sources of multiplicity. The framework is easy to use and adaptable to various practical settings that frequently occur in large-scale experiments. Procedures generated from the framework are shown to maintain control of the OFDR and mdFDR, quantities that are especially relevant in the case of multiple hypothesis set testing. The procedures work well in both simulations and real datasets, and are shown to have better power than existing methods. PMID:24731138
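
    A simplified two-step hierarchy in the spirit of this framework: combine each set's p-values (Simes), screen sets with Benjamini-Hochberg, then test within selected sets at a level scaled by the selection fraction. This sketch is not the paper's exact procedure, and the simulated data are invented:

    ```python
    # Two-step hierarchical set testing: BH on per-set Simes p-values,
    # then within-set BH at level alpha * (fraction of sets selected).
    import numpy as np

    def simes(pvals):
        p = np.sort(np.asarray(pvals))
        return float(np.min(p * len(p) / np.arange(1, len(p) + 1)))

    def bh_reject(pvals, alpha):
        p = np.asarray(pvals); m = len(p)
        order = np.argsort(p)
        passed = np.nonzero(p[order] <= alpha * np.arange(1, m + 1) / m)[0]
        k = passed.max() + 1 if passed.size else 0
        reject = np.zeros(m, bool); reject[order[:k]] = True
        return reject

    rng = np.random.default_rng(3)
    genes = [rng.uniform(size=3) for _ in range(900)]           # null genes
    genes += [rng.uniform(size=3) * 0.001 for _ in range(100)]  # signal genes

    set_p = [simes(g) for g in genes]
    selected = bh_reject(set_p, alpha=0.05)                     # step 1: sets
    frac = selected.mean()
    within = [bh_reject(g, 0.05 * frac) if sel else None        # step 2: within
              for g, sel in zip(genes, selected)]
    print("sets selected:", selected.sum())
    ```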

  9. Electromagnetic gyrokinetic simulation in GTS

    NASA Astrophysics Data System (ADS)

    Ma, Chenhao; Wang, Weixing; Startsev, Edward; Lee, W. W.; Ethier, Stephane

    2017-10-01

    We report recent developments in electromagnetic simulations for general toroidal geometry based on the particle-in-cell gyrokinetic code GTS. Because of the cancellation problem, EM gyrokinetic simulation has numerical difficulties in the MHD limit, where k⊥ρi → 0 and/or β > me/mi. Recently, several approaches have been developed to circumvent this problem: (1) a p∥ formulation with the analytical skin term iteratively approximated by simulation particles (Yang Chen); (2) a modified p∥ formulation with ∫ dt E∥ used in place of A∥ (Mishchenko); (3) a conservative scheme where the electron density perturbation for the Poisson equation is calculated from an electron continuity equation (Bao); (4) a double-split-weight scheme with two weights, one for the Poisson equation and one for the time derivative of Ampere's law, each with different splits designed to remove large terms from the Vlasov equation (Startsev). These algorithms are being implemented into the GTS framework for general toroidal geometry. The performance of these different algorithms will be compared for various EM modes.

  10. Enhanced embodied response following ambiguous emotional processing.

    PubMed

    Beffara, Brice; Ouellet, Marc; Vermeulen, Nicolas; Basu, Anamitra; Morisseau, Tiffany; Mermillod, Martial

    2012-08-01

    It has generally been assumed that high-level cognitive and emotional processes are based on amodal conceptual information. In contrast, however, "embodied simulation" theory states that the perception of an emotional signal can trigger a simulation of the related state in the motor, somatosensory, and affective systems. To study the effect of social context on the mimicry effect predicted by the "embodied simulation" theory, we recorded the electromyographic (EMG) activity of participants when looking at emotional facial expressions. We observed an increase in embodied responses when the participants were exposed to a context involving social valence before seeing the emotional facial expressions. An examination of the dynamic EMG activity induced by two socially relevant emotional expressions (namely joy and anger) revealed enhanced EMG responses of the facial muscles associated with the related social prime (either positive or negative). These results are discussed within the general framework of embodiment theory.

  11. Direct Method Transcription for a Human-Class Translunar Injection Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Witzberger, Kevin E.; Zeiler, Tom

    2012-01-01

    This paper presents a new trajectory optimization software package developed in the framework of a low-to-high fidelity 3 degrees-of-freedom (DOF)/6-DOF vehicle simulation program named Mission Analysis Simulation Tool in Fortran (MASTIF) and its application to a translunar trajectory optimization problem. The functionality of the developed optimization package is implemented as a new "mode" in generalized settings to make it applicable to a general trajectory optimization problem. In doing so, a direct optimization method using collocation is employed for solving the problem. Trajectory optimization problems in MASTIF are transcribed into constrained nonlinear programming (NLP) problems and solved with SNOPT, a commercially available NLP solver. A detailed description of the optimization software developed is provided, as well as the transcription specifics for the translunar injection (TLI) problem. The analysis includes a 3-DOF TLI trajectory optimization and a 3-DOF vehicle TLI simulation using closed-loop guidance.
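
    For orientation, direct transcription via collocation can be shown on a minimal problem: trapezoidal defect constraints turn the dynamics into algebraic equality constraints for an NLP solver. The sketch below uses SciPy's SLSQP instead of SNOPT, on an invented double-integrator problem:

    ```python
    # Minimal direct collocation: steer a double integrator from rest
    # at x=0 to rest at x=1 in unit time, minimizing control effort.
    import numpy as np
    from scipy.optimize import minimize

    N, T = 20, 1.0
    h = T / (N - 1)

    def unpack(z):
        return z[:N], z[N:2*N], z[2*N:]          # position, velocity, control

    def objective(z):
        _, _, u = unpack(z)                      # trapezoidal integral of u^2
        return h * (np.sum(u**2) - (u[0]**2 + u[-1]**2) / 2)

    def defects(z):                              # collocation constraints
        x, v, u = unpack(z)
        dx = x[1:] - x[:-1] - h * (v[1:] + v[:-1]) / 2
        dv = v[1:] - v[:-1] - h * (u[1:] + u[:-1]) / 2
        bc = [x[0], v[0], x[-1] - 1.0, v[-1]]    # boundary conditions
        return np.concatenate([dx, dv, bc])

    z0 = np.concatenate([np.linspace(0, 1, N), np.zeros(N), np.zeros(N)])
    sol = minimize(objective, z0, method="SLSQP",
                   constraints={"type": "eq", "fun": defects})
    x, v, u = unpack(sol.x)
    print("success:", sol.success, "u(0) ≈", round(u[0], 2))  # analytic: 6
    ```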

  12. DFTB+ and lanthanides

    NASA Astrophysics Data System (ADS)

    Hourahine, B.; Aradi, B.; Frauenheim, T.

    2010-07-01

    DFTB+ is a recent general purpose implementation of density-functional based tight binding. One of the early motivators to develop this code was to investigate lanthanide impurities in nitride semiconductors, leading to a series of successful studies into structure and electrical properties of these systems. Here we describe our general framework to treat the physical effects needed for these problematic impurities within a tight-binding formalism, additionally discussing forces and stresses in DFTB. We also present an approach to evaluate the general case of Slater-Koster transforms and all of their derivatives in Cartesian coordinates. These developments are illustrated by simulating isolated Gd impurities in GaN.

  13. A Framework of Covariance Projection on Constraint Manifold for Data Fusion.

    PubMed

    Bakr, Muhammad Abu; Lee, Sukhan

    2018-05-17

    A general framework of data fusion is presented based on projecting the probability distribution of true states and measurements around the predicted states and actual measurements onto the constraint manifold. The constraint manifold represents the constraints to be satisfied among true states and measurements, which is defined in the extended space with all the redundant sources of data such as state predictions and measurements considered as independent variables. By the general framework, we mean that it is able to fuse any correlated data sources while directly incorporating constraints and identifying inconsistent data without any prior information. The proposed method, referred to here as the Covariance Projection (CP) method, provides an unbiased and optimal solution in the sense of minimum mean square error (MMSE), if the projection is based on the minimum weighted distance on the constraint manifold. The proposed method not only offers a generalization of the conventional formula for handling constraints and data inconsistency, but also provides a new insight into data fusion in terms of a geometric-algebraic point of view. Simulation results are provided to show the effectiveness of the proposed method in handling constraints and data inconsistency.
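
    The projection the abstract describes reduces, for a linear constraint A z = b, to the minimum Mahalanobis-distance update of the estimate and its covariance. A sketch (the two-sensor example is invented; fusing two readings of the same scalar recovers the inverse-variance weighted average):

    ```python
    # Covariance projection onto the linear constraint manifold A z = b:
    # the minimum weighted-distance (MMSE) projection of (z, Sigma).
    import numpy as np

    def project(z, Sigma, A, b):
        S = A @ Sigma @ A.T
        K = Sigma @ A.T @ np.linalg.inv(S)
        z_new = z - K @ (A @ z - b)
        Sigma_new = Sigma - K @ A @ Sigma
        return z_new, Sigma_new

    z = np.array([10.0, 12.0])          # two measurements of one quantity
    Sigma = np.diag([1.0, 4.0])         # their (uncorrelated) variances
    A, b = np.array([[1.0, -1.0]]), np.array([0.0])  # constraint z1 = z2

    z_f, Sigma_f = project(z, Sigma, A, b)
    print(z_f)   # both entries -> 10.4 = (10/1 + 12/4) / (1/1 + 1/4)
    ```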

  14. Open-source framework for power system transmission and distribution dynamics co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Fan, Rui; Daily, Jeff

    The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools to provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, currently there are only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open source co-simulation framework “Framework for Network Co-Simulation” (FNCS), together with the decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.
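
    A cartoon of the kind of lock-step exchange such middleware coordinates (toy models and an invented coupling, not the FNCS API):

    ```python
    # Lock-step co-simulation cartoon: a transmission model and a
    # distribution model exchange boundary values once per time step.
    def transmission_step(load_kw):
        return 1.00 - 1e-5 * load_kw          # voltage sags with load (toy)

    def distribution_step(voltage_pu):
        return 500.0 * voltage_pu ** 2        # load responds to voltage (toy)

    voltage, load = 1.0, 500.0
    for t in range(10):                       # synchronized time steps
        voltage = transmission_step(load)     # transmission solves first
        load = distribution_step(voltage)     # then distribution, using v(t)
        print(f"t={t}s  V={voltage:.5f} pu  P={load:.2f} kW")
    ```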

  15. Next generation extended Lagrangian first principles molecular dynamics

    NASA Astrophysics Data System (ADS)

    Niklasson, Anders M. N.

    2017-08-01

    Extended Lagrangian Born-Oppenheimer molecular dynamics [A. M. N. Niklasson, Phys. Rev. Lett. 100, 123004 (2008)] is formulated for general Hohenberg-Kohn density-functional theory and compared with the extended Lagrangian framework of first principles molecular dynamics by Car and Parrinello [Phys. Rev. Lett. 55, 2471 (1985)]. It is shown how extended Lagrangian Born-Oppenheimer molecular dynamics overcomes several shortcomings of regular, direct Born-Oppenheimer molecular dynamics, while improving or maintaining important features of Car-Parrinello simulations. The accuracy of the electronic degrees of freedom in extended Lagrangian Born-Oppenheimer molecular dynamics, with respect to the exact Born-Oppenheimer solution, is of second-order in the size of the integration time step and of fourth order in the potential energy surface. Improved stability over recent formulations of extended Lagrangian Born-Oppenheimer molecular dynamics is achieved by generalizing the theory to finite temperature ensembles, using fractional occupation numbers in the calculation of the inner-product kernel of the extended harmonic oscillator that appears as a preconditioner in the electronic equations of motion. Material systems that normally exhibit slow self-consistent field convergence can be simulated using integration time steps of the same order as in direct Born-Oppenheimer molecular dynamics, but without the requirement of an iterative, non-linear electronic ground-state optimization prior to the force evaluations and without a systematic drift in the total energy. In combination with proposed low-rank and on the fly updates of the kernel, this formulation provides an efficient and general framework for quantum-based Born-Oppenheimer molecular dynamics simulations.
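
    A scalar toy model conveys the core trick (this is not the paper's density-functional formulation; the "ground state" function and all parameters are invented): an auxiliary dynamical variable is harmonically driven toward the exact ground state and integrated alongside the slow coordinate, so it tracks the ground state without any per-step iterative optimization.

    ```python
    # Extended-Lagrangian toy: the auxiliary variable n is driven
    # toward q(x(t)) by a stiff harmonic term and integrated with a
    # symplectic (semi-implicit Euler) scheme, no SCF loop per step.
    import numpy as np

    def q_exact(x):                  # stand-in for the SCF ground state
        return np.tanh(x)

    dt, omega = 0.01, 20.0           # omega >> nuclear frequency
    x, vx = 1.0, 0.0                 # "nuclear" coordinate in a harmonic well
    n, vn = q_exact(x), 0.0          # auxiliary "electronic" variable

    for step in range(2000):
        ax = -x                      # nuclear force (toy potential)
        an = omega**2 * (q_exact(x) - n)   # harmonic driving of n
        vx += ax * dt; x += vx * dt
        vn += an * dt; n += vn * dt
    print("tracking error:", abs(n - q_exact(x)))   # stays small
    ```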

  16. Next generation extended Lagrangian first principles molecular dynamics.

    PubMed

    Niklasson, Anders M N

    2017-08-07

    Extended Lagrangian Born-Oppenheimer molecular dynamics [A. M. N. Niklasson, Phys. Rev. Lett. 100, 123004 (2008)] is formulated for general Hohenberg-Kohn density-functional theory and compared with the extended Lagrangian framework of first principles molecular dynamics by Car and Parrinello [Phys. Rev. Lett. 55, 2471 (1985)]. It is shown how extended Lagrangian Born-Oppenheimer molecular dynamics overcomes several shortcomings of regular, direct Born-Oppenheimer molecular dynamics, while improving or maintaining important features of Car-Parrinello simulations. The accuracy of the electronic degrees of freedom in extended Lagrangian Born-Oppenheimer molecular dynamics, with respect to the exact Born-Oppenheimer solution, is of second-order in the size of the integration time step and of fourth order in the potential energy surface. Improved stability over recent formulations of extended Lagrangian Born-Oppenheimer molecular dynamics is achieved by generalizing the theory to finite temperature ensembles, using fractional occupation numbers in the calculation of the inner-product kernel of the extended harmonic oscillator that appears as a preconditioner in the electronic equations of motion. Material systems that normally exhibit slow self-consistent field convergence can be simulated using integration time steps of the same order as in direct Born-Oppenheimer molecular dynamics, but without the requirement of an iterative, non-linear electronic ground-state optimization prior to the force evaluations and without a systematic drift in the total energy. In combination with proposed low-rank and on the fly updates of the kernel, this formulation provides an efficient and general framework for quantum-based Born-Oppenheimer molecular dynamics simulations.

  17. Voxel2MCNP: a framework for modeling, simulation and evaluation of radiation transport scenarios for Monte Carlo codes.

    PubMed

    Pölz, Stefan; Laubersheimer, Sven; Eberhardt, Jakob S; Harrendorf, Marco A; Keck, Thomas; Benzler, Andreas; Breustedt, Bastian

    2013-08-21

    The basic idea of Voxel2MCNP is to provide a framework supporting users in modeling radiation transport scenarios using voxel phantoms and other geometric models, generating corresponding input for the Monte Carlo code MCNPX, and evaluating simulation output. Applications at Karlsruhe Institute of Technology are primarily whole and partial body counter calibration and calculation of dose conversion coefficients. A new generic data model describing data related to radiation transport, including phantom and detector geometries and their properties, sources, tallies and materials, has been developed. It is modular and generally independent of the targeted Monte Carlo code. The data model has been implemented as an XML-based file format to facilitate data exchange, and integrated with Voxel2MCNP to provide a common interface for modeling, visualization, and evaluation of data. Also, extensions to allow compatibility with several file formats, such as ENSDF for nuclear structure properties and radioactive decay data, SimpleGeo for solid geometry modeling, ImageJ for voxel lattices, and MCNPX's MCTAL for simulation results, have been added. The framework is presented and discussed in this paper, and example workflows for body counter calibration and calculation of dose conversion coefficients are given to illustrate its application.

  18. Action Recommendation for Cyber Resilience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhury, Sutanay; Rodriguez, Luke R.; Curtis, Darren S.

    2015-09-01

    This paper presents a unifying graph-based model for representing the infrastructure, behavior and missions of an enterprise. We describe how the model can be used to achieve resiliency against a wide class of failures and attacks. We introduce an algorithm for recommending resilience-establishing actions based on dynamic updates to the models. Without loss of generality, we show the effectiveness of the algorithm for preserving latency-based quality of service (QoS). Our models and the recommendation algorithms are implemented in a software framework that we seek to release as open source for simulating resilient cyber systems.

  19. A modelling framework to simulate foliar fungal epidemics using functional–structural plant models

    PubMed Central

    Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe

    2014-01-01

    Background and Aims Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional–structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Methods Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant–environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Key Results Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. Conclusions This study provides a framework for modelling a large number of pathosystems using FSPMs. This structure can accommodate both previously developed models for individual aspects of pathosystems and new ones. Complex models are deconstructed into separate ‘knowledge sources’ originating from different specialist areas of expertise and these can be shared and reassembled into multidisciplinary models. The framework thus provides a beneficial tool for a potential diverse and dynamic research community. PMID:24925323

  20. Simulations and Evaluation of Mesoscale Convective Systems in a Multi-scale Modeling Framework (MMF)

    NASA Astrophysics Data System (ADS)

    Chern, J. D.; Tao, W. K.

    2017-12-01

    It is well known that mesoscale convective systems (MCSs) produce more than 50% of the rainfall in most tropical regions and play important roles in regional and global water cycles. Simulation of MCSs in global and climate models is a very challenging problem. Typical MCSs have a horizontal scale of a few hundred kilometers. Models with a domain of several hundred kilometers and fine enough resolution to properly simulate individual clouds are required to realistically simulate MCSs. The multiscale modeling framework (MMF), which replaces traditional cloud parameterizations with cloud-resolving models (CRMs) within a host atmospheric general circulation model (GCM), has shown some capability for simulating organized MCS-like storm signals and propagation. However, its embedded CRMs typically have a small domain (less than 128 km) and coarse resolution (4 km) that cannot realistically simulate MCSs and individual clouds. In this study, a series of simulations was performed using the Goddard MMF. The impacts of the domain size and grid resolution of the embedded CRMs on simulating MCSs are examined. The changes of cloud structure, occurrence, and properties such as cloud types, updraft and downdraft, latent heating profile, and cold pool strength in the embedded CRMs are examined in detail. The simulated MCS characteristics are evaluated against satellite measurements using the Goddard Satellite Data Simulator Unit. The results indicate that embedded CRMs with a large domain and fine resolution tend to produce better simulations compared to those with the typical MMF configuration (128 km domain size and 4 km model grid spacing).

  1. Generalized random sign and alert delay models for imperfect maintenance.

    PubMed

    Dijoux, Yann; Gaudoin, Olivier

    2014-04-01

    This paper considers the modelling of the process of corrective and condition-based preventive maintenance for complex repairable systems. In order to take into account the dependency between both types of maintenance and the possibility of imperfect maintenance, Generalized Competing Risks models were introduced by Doyen and Gaudoin (J Appl Probab 43:825-839, 2006). In this paper, we study two classes of these models, the Generalized Random Sign and Generalized Alert Delay models. A Generalized Competing Risks model can be built as a generalization of a particular Usual Competing Risks model, either by using a virtual age framework or not. The models' properties are studied and their parameterizations are discussed. Finally, simulation results and an application to real data are presented.

  2. Particle Acceleration in a Statistically Modeled Solar Active-Region Corona

    NASA Astrophysics Data System (ADS)

    Toutounzi, A.; Vlahos, L.; Isliker, H.; Dimitropoulou, M.; Anastasiadis, A.; Georgoulis, M.

    2013-09-01

    Elaborating a statistical approach to describe the spatiotemporally intermittent electric field structures formed inside a flaring solar active region, we investigate the efficiency of such structures in accelerating charged particles (electrons). The large-scale magnetic configuration in the solar atmosphere responds to the strong turbulent flows that convey perturbations across the active region by initiating avalanche-type processes. The resulting unstable structures correspond to small-scale dissipation regions hosting strong electric fields. Previous research on particle acceleration in strongly turbulent plasmas provides a general framework for addressing such a problem. This framework combines various electromagnetic field configurations, obtained by magnetohydrodynamical (MHD) or cellular automata (CA) simulations or by employing a statistical description of the field's strength and configuration, with test particle simulations. Our objective is to complement previous work done on the subject. As in previous efforts, a set of three probability distribution functions describes our ad-hoc electromagnetic field configurations. In addition, we work on data-driven 3D magnetic field extrapolations. A collisional relativistic test-particle simulation traces each particle's guiding center within these configurations. We find that an interplay between different electron populations (thermal/non-thermal, ambient/injected) in our simulations may also address, via a re-acceleration mechanism, the so-called 'number problem'. Using the simulated particle-energy distributions at different heights of the cylinder, we test our results against observations in the framework of the collisional thick target model (CTTM) of solar hard X-ray (HXR) emission. This work is supported by the Hellenic National Space Weather Research Network (HNSWRN) via the THALIS Programme.

  3. GNSS-ISR data fusion: General framework with application to the high-latitude ionosphere

    NASA Astrophysics Data System (ADS)

    Semeter, Joshua; Hirsch, Michael; Lind, Frank; Coster, Anthea; Erickson, Philip; Pankratius, Victor

    2016-03-01

    A mathematical framework is presented for the fusion of electron density measured by incoherent scatter radar (ISR) and total electron content (TEC) measured using global navigation satellite systems (GNSS). Both measurements are treated as projections of an unknown density field (for GNSS-TEC the projection is tomographic; for ISR it is a weighted average over a local spatial region), and discrete inverse theory is applied to obtain a higher fidelity representation of the field than could be obtained from either modality individually. The specific implementation explored herein uses the interpolated ISR density field as the initial guess for the combined inverse problem, which is subsequently solved using maximum entropy regularization. Simulations involving a dense meridional network of GNSS receivers near the Poker Flat ISR demonstrate the potential of this approach to resolve sub-beam structure in ISR measurements. Several future directions are outlined, including (1) data fusion using lower level (lag product) ISR data, (2) consideration of the different temporal sampling rates, (3) application of physics-based regularization, (4) consideration of nonoptimal observing geometries, and (5) use of an ISR simulation framework for optimal experiment design.
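
    To make the projection view above concrete, here is a minimal one-dimensional sketch of the fusion idea: ISR-like rows of the forward operator are local weighted averages, a GNSS-TEC-like row is a path integral, and the stacked system is inverted with simple Tikhonov regularization standing in for the paper's maximum-entropy scheme. The field size, kernels, noise level, and `lam` weight are all invented for illustration.

    ```python
    # Toy projection-based fusion of "ISR" and "TEC" measurements of a 1-D field.
    # Tikhonov regularization is a stand-in for the maximum-entropy method.
    import numpy as np

    n = 50
    x_true = np.exp(-((np.arange(n) - 25) / 6.0) ** 2)   # synthetic density field

    # ISR-like rows: normalized local averaging kernels at a few beam positions.
    A_isr = np.array([np.exp(-((np.arange(n) - c) / 3.0) ** 2) for c in range(5, n, 10)])
    A_isr /= A_isr.sum(axis=1, keepdims=True)
    A_tec = np.ones((1, n))                              # one line-integral (TEC-like) row
    A = np.vstack([A_isr, A_tec])

    rng = np.random.default_rng(0)
    d = A @ x_true + 0.01 * rng.standard_normal(A.shape[0])

    lam = 0.1                                            # regularization weight (assumed)
    x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ d)
    ```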

  4. Deorbitalization strategies for meta-generalized-gradient-approximation exchange-correlation functionals

    NASA Astrophysics Data System (ADS)

    Mejia-Rodriguez, Daniel; Trickey, S. B.

    2017-11-01

    We explore the simplification of widely used meta-generalized-gradient approximation (mGGA) exchange-correlation functionals to the Laplacian level of refinement by use of approximate kinetic-energy density functionals (KEDFs). Such deorbitalization is motivated by the prospect of reducing computational cost while recovering a strictly Kohn-Sham local potential framework (rather than the usual generalized Kohn-Sham treatment of mGGAs). A KEDF that has been rather successful in solid simulations proves to be inadequate for deorbitalization, but we produce other forms which, with parametrization to Kohn-Sham results (not experimental data) on a small training set, yield rather good results on standard molecular test sets when used to deorbitalize the meta-GGA made very simple, Tao-Perdew-Staroverov-Scuseria, and strongly constrained and appropriately normed functionals. We also study the difference between high-fidelity and best-performing deorbitalizations and discuss possible implications for use in ab initio molecular dynamics simulations of complicated condensed phase systems.

  5. Time-ordered product expansions for computational stochastic system biology.

    PubMed

    Mjolsness, Eric

    2013-06-01

    The time-ordered product framework of quantum field theory can also be used to understand salient phenomena in stochastic biochemical networks. It is used here to derive Gillespie's stochastic simulation algorithm (SSA) for chemical reaction networks; consequently, the SSA can be interpreted in terms of Feynman diagrams. It is also used here to derive other, more general simulation and parameter-learning algorithms including simulation algorithms for networks of stochastic reaction-like processes operating on parameterized objects, and also hybrid stochastic reaction/differential equation models in which systems of ordinary differential equations evolve the parameters of objects that can also undergo stochastic reactions. Thus, the time-ordered product expansion can be used systematically to derive simulation and parameter-fitting algorithms for stochastic systems.
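
    For context, the algorithm the paper rederives via time-ordered products is the standard Gillespie SSA; a minimal textbook implementation for an invented two-reaction network (A -> B, B -> A, with mass-action propensities and arbitrary rate constants) is sketched below.

    ```python
    # Minimal Gillespie stochastic simulation algorithm (SSA) sketch.
    import numpy as np

    def ssa(x0, stoich, propensities, t_end, seed=0):
        rng = np.random.default_rng(seed)
        x, t, path = np.array(x0, dtype=float), 0.0, []
        while t < t_end:
            a = np.array([f(x) for f in propensities])   # reaction propensities
            a0 = a.sum()
            if a0 == 0.0:
                break                                    # no reaction can fire
            t += rng.exponential(1.0 / a0)               # exponential waiting time
            j = rng.choice(len(a), p=a / a0)             # pick the firing reaction
            x = x + stoich[j]
            path.append((t, x.copy()))
        return path

    # Two reactions: A -> B and B -> A (toy rates).
    path = ssa(x0=[100, 0],
               stoich=[np.array([-1, 1]), np.array([1, -1])],
               propensities=[lambda x: 1.0 * x[0], lambda x: 0.5 * x[1]],
               t_end=10.0)
    ```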

  6. Allowing for crystalline structure effects in Geant4

    DOE PAGES

    Bagli, Enrico; Asai, Makoto; Dotti, Andrea; ...

    2017-03-24

    In recent years, the Geant4 toolkit for the Monte Carlo simulation of the interaction of radiation with matter has seen large growth in its diverse user community. A fundamental aspect of a successful physics experiment is the availability of a reliable and precise simulation code. Geant4 currently does not allow for the simulation of particle interactions with anything other than amorphous matter. To overcome this limitation, the GECO (GEant4 Crystal Objects) project developed a general framework for managing solid-state structures in the Geant4 kernel and validated it against experimental data. As a result, accounting for detailed geometrical structures allows, for example, simulation of diffraction from crystal planes or of the channeling of charged particles.

  7. Model structure of the stream salmonid simulator (S3)—A dynamic model for simulating growth, movement, and survival of juvenile salmonids

    USGS Publications Warehouse

    Perry, Russell W.; Plumb, John M.; Jones, Edward C.; Som, Nicholas A.; Hetrick, Nicholas J.; Hardy, Thomas B.

    2018-04-06

    Fisheries and water managers often use population models to aid in understanding the effect of alternative water management or restoration actions on anadromous fish populations. We developed the Stream Salmonid Simulator (S3) to help resource managers evaluate the effect of management alternatives on juvenile salmonid populations. S3 is a deterministic stage-structured population model that tracks daily growth, movement, and survival of juvenile salmon. A key theme of the model is that river flow affects habitat availability and capacity, which in turn drives density-dependent population dynamics. To explicitly link population dynamics to habitat quality and quantity, the river environment is constructed as a one-dimensional series of linked habitat units, each of which has an associated daily time series of discharge, water temperature, and usable habitat area or carrying capacity. The physical characteristics of each habitat unit and the number of fish occupying each unit, in turn, drive survival and growth within each habitat unit and movement of fish among habitat units. The purpose of this report is to outline the underlying general structure of the S3 model that is common among different applications of the model. We have developed applications of the S3 model for juvenile fall Chinook salmon (Oncorhynchus tshawytscha) in the lower Klamath River. Thus, this report is a companion to the current application of the S3 model to the Trinity River (in review). The general S3 model structure provides a biological and physical framework for the salmonid freshwater life cycle. This framework captures important demographics of juvenile salmonids aimed at translating management alternatives into simulated population responses. Although the S3 model is built on this common framework, the model has been constructed to allow much flexibility in application to specific river systems. The ability of practitioners to include system-specific information for the physical stream structure and the survival, growth, and movement processes ensures that simulations provide results that are relevant to the questions asked about the population under study.
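
    A minimal sketch of the kind of daily update described above is given below; it is not the USGS S3 code, and the unit capacities, density-dependent survival rule, and downstream overflow movement are invented stand-ins for the system-specific submodels.

    ```python
    # Toy daily update for a 1-D chain of linked habitat units.
    import numpy as np

    n_units, days = 5, 30
    capacity = np.array([2000.0, 1500.0, 1000.0, 1200.0, 1800.0])  # fish per unit (toy)
    fish = np.zeros(n_units)
    fish[0] = 5000.0                       # cohort enters at the upstream unit

    for _ in range(days):
        # Density-dependent daily survival: units above capacity survive less well.
        crowding = np.clip(fish / capacity - 1.0, 0.0, None)
        fish *= np.clip(0.999 - 0.05 * crowding, 0.0, 1.0)
        # Fish in excess of capacity move one unit downstream.
        overflow = np.clip(fish - capacity, 0.0, None)
        fish -= overflow
        fish[1:] += overflow[:-1]          # overflow from the last unit exits the reach
    ```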

  8. Simulating Sand Behavior through Terrain Subdivision and Particle Refinement

    NASA Astrophysics Data System (ADS)

    Clothier, M.

    2013-12-01

    Advances in computer graphics, GPUs, and parallel processing hardware have provided researchers with new methods to visualize scientific data. In fact, these advances have spurred new research opportunities between computer graphics and other disciplines, such as the Earth sciences. Through collaboration, Earth and planetary scientists have benefited by using these advances in hardware technology to process large amounts of data for visualization and analysis. At Oregon State University, we are collaborating with the Oregon Space Grant and IGERT Ecosystem Informatics programs to investigate techniques for simulating the behavior of sand. In addition, we have been collaborating with the Jet Propulsion Laboratory's DARTS Lab to exchange ideas on our research. The DARTS Lab specializes in the simulation of planetary vehicles, such as the Mars rovers. One aspect of their work is testing these vehicles in a virtual "sand box" to assess their performance in different environments. Our research builds upon this idea to create a sand simulation framework that allows for more complex and diverse environments. As a basis for our framework, we have focused on planetary environments, such as the harsh, sandy regions on Mars. To evaluate our framework, we have used simulated planetary vehicles, such as a rover, to gain insight into the performance and interaction between the surface sand and the vehicle. Unfortunately, simulating the vast number of individual sand particles and their interactions with each other has been a computationally complex problem in the past. However, through the use of high-performance computing, we have developed a technique to subdivide physically active terrain regions across a large landscape. To achieve this, we only subdivide terrain regions where sand particles are actively interacting with another object or force, such as a rover wheel. This is similar to a Level of Detail (LOD) technique, except that the density of subdivisions is determined by proximity to the object or force interacting with the sand. For example, as a rover wheel moves forward and approaches a particular sand region, that region will continue to subdivide until individual sand particles are represented. Conversely, if the rover wheel moves away, previously subdivided sand regions will recombine. Thus, individual sand particles are available when an interacting force is present but stored away when none is present. As such, this technique allows many particles to be represented without prohibitive computational complexity. We have also generalized these subdivision regions in our sand framework into any volumetric area suitable for use in the simulation. This allows for more compact subdivision regions and has fine-tuned our framework so that more emphasis can be placed on regions of actively participating sand. We feel that this increases the framework's usefulness across scientific applications and can provide other research opportunities within the Earth and planetary sciences. Through continued collaboration with our academic partners, we continue to build upon our sand simulation framework and look for other opportunities to utilize this research.
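
    The proximity-driven refinement can be illustrated with a toy quadtree-style recursion; the wheel position, region sizes, and the `dist < 4 * size` refinement test below are made-up values, not those of the actual framework.

    ```python
    # Toy distance-driven terrain subdivision: refine only near the interactor.
    def subdivide(x, y, size, wheel, min_size, out):
        cx, cy = x + size / 2.0, y + size / 2.0
        dist = ((cx - wheel[0]) ** 2 + (cy - wheel[1]) ** 2) ** 0.5
        if size > min_size and dist < 4.0 * size:     # refine near the wheel
            half = size / 2.0
            for dx in (0.0, half):
                for dy in (0.0, half):
                    subdivide(x + dx, y + dy, half, wheel, min_size, out)
        else:
            out.append((x, y, size))                  # leaf region; particles live here

    regions = []
    subdivide(0.0, 0.0, 64.0, wheel=(10.0, 12.0), min_size=0.5, out=regions)
    ```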

  9. Interpreting the NLN Jeffries Framework in the context of Nurse Educator preparation.

    PubMed

    Young, Patricia K; Shellenbarger, Teresa

    2012-08-01

    The NLN Jeffries Framework describing simulation in nursing education has been used widely to guide the construction of human patient simulation scenarios and to serve as a theoretical framework for research on the use of simulation. This framework was developed with a focus on prelicensure nursing education. However, human patient simulation scenarios are also a way of providing practice experiences for graduate students learning the educator role. High-fidelity human patient simulation offers nurse educator faculty a unique opportunity to cultivate the practical knowledge of teaching in an interactive and dynamic environment. This article describes how the components of the NLN Jeffries Framework can help to guide simulation design for nurse educator preparation. Adapting the components of the framework (teacher, student, educational practices, design characteristics, and outcomes) helps to ensure that future faculty gain hands-on experience with nurse educator core competencies. Copyright 2012, SLACK Incorporated.

  10. Evolutionary squeaky wheel optimization: a new framework for analysis.

    PubMed

    Li, Jingpeng; Parkes, Andrew J; Burke, Edmund K

    2011-01-01

    Squeaky wheel optimization (SWO) is a relatively new metaheuristic that has been shown to be effective for many real-world problems. At each iteration SWO does a complete construction of a solution starting from the empty assignment. Although the construction uses information from previous iterations, the complete rebuilding does mean that SWO is generally effective at diversification but can suffer from relatively weak intensification. Evolutionary SWO (ESWO) is a recent extension to SWO that is designed to improve the intensification by keeping the good components of solutions and only using SWO to reconstruct other, poorer components of the solution. In such algorithms a standard challenge is to understand how the various parameters affect the search process. In order to support the future study of such issues, we propose a formal framework for the analysis of ESWO. The framework is based on Markov chains, and the main novelty arises because ESWO moves through the space of partial assignments. This makes it significantly different from the analyses used in local search (such as simulated annealing), which only move through complete assignments. Generally, the exact details of ESWO will depend on various heuristics, so we focus our approach on a case of ESWO that we call ESWO-II and that has probabilistic as opposed to heuristic selection and construction operators. For ESWO-II, we study a simple problem instance and explicitly compute the stationary distribution probability over the states of the search space. We find interesting properties of the distribution. In particular, we find that the probabilities of states generally, but not always, increase with their fitness. This nonmonotonicity is quite different from the monotonicity expected in algorithms such as simulated annealing.
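
    The core computation in such an analysis can be sketched generically: for a small row-stochastic transition matrix P (a toy chain here, not the ESWO-II chain), the stationary distribution is the normalized left eigenvector of P with eigenvalue 1.

    ```python
    # Stationary distribution of a toy 3-state Markov chain.
    import numpy as np

    P = np.array([[0.5, 0.3, 0.2],
                  [0.2, 0.6, 0.2],
                  [0.1, 0.3, 0.6]])          # row-stochastic transition matrix

    vals, vecs = np.linalg.eig(P.T)          # left eigenvectors of P
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi /= pi.sum()                           # stationary distribution: pi @ P == pi
    ```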

  11. Improved discrete swarm intelligence algorithms for endmember extraction from hyperspectral remote sensing images

    NASA Astrophysics Data System (ADS)

    Su, Yuanchao; Sun, Xu; Gao, Lianru; Li, Jun; Zhang, Bing

    2016-10-01

    Endmember extraction is a key step in hyperspectral unmixing. A new framework is proposed for hyperspectral endmember extraction. The proposed approach is based on swarm intelligence (SI) algorithms, where a discrete version of the SI algorithm is used because pixels in a hyperspectral image are naturally defined within a discrete space. Moreover, a "distance" factor is introduced into the objective function to limit the number of endmembers, which is generally small in real scenarios, whereas traditional SI algorithms tend to produce superabundant spectral signatures that generally belong to the same classes. Three endmember extraction methods are proposed, based on the artificial bee colony, ant colony optimization, and particle swarm optimization algorithms. Experiments with both simulated and real hyperspectral images indicate that the proposed framework can improve the accuracy of endmember extraction.

  12. Annual Research Briefs

    NASA Technical Reports Server (NTRS)

    Spinks, Debra (Compiler)

    1997-01-01

    This report contains the 1997 annual progress reports of the research fellows and students supported by the Center for Turbulence Research (CTR). Titles include: Invariant modeling in large-eddy simulation of turbulence; Validation of large-eddy simulation in a plane asymmetric diffuser; Progress in large-eddy simulation of trailing-edge turbulence and aeroacoustics; Resolution requirements in large-eddy simulations of shear flows; A general theory of discrete filtering for LES in complex geometry; On the use of discrete filters for large eddy simulation; Wall models in large eddy simulation of separated flow; Perspectives for ensemble average LES; Anisotropic grid-based formulas for subgrid-scale models; Some modeling requirements for wall models in large eddy simulation; Numerical simulation of 3D turbulent boundary layers using the V2F model; Accurate modeling of impinging jet heat transfer; Application of turbulence models to high-lift airfoils; Advances in structure-based turbulence modeling; Incorporating realistic chemistry into direct numerical simulations of turbulent non-premixed combustion; Effects of small-scale structure on turbulent mixing; Turbulent premixed combustion in the laminar flamelet and the thin reaction zone regime; Large eddy simulation of combustion instabilities in turbulent premixed burners; On the generation of vorticity at a free surface; Active control of turbulent channel flow; A generalized framework for robust control in fluid mechanics; Combined immersed-boundary/B-spline methods for simulations of flow in complex geometries; and DNS of shock boundary-layer interaction - preliminary results for compression ramp flow.

  13. Simulation Methods for Design of Networked Power Electronics and Information Systems

    DTIC Science & Technology

    2014-07-01

    Insertion of latency in every branch and at every node permits the system model to be efficiently distributed across many separate computing cores. An... the system. We demonstrated extensibility and generality of the Virtual Test Bed (VTB) framework to support multiple solvers and their associated... Information Systems. Objectives: The overarching objective of this program is to develop methods for fast...

  14. Beyond a Fad: Why Video Games Should Be Part of 21st Century Libraries

    ERIC Educational Resources Information Center

    Buchanan, Kym; Elzen, Angela M. Vanden

    2012-01-01

    We believe video games have a place in libraries. We start by describing two provocative video games. Next, we offer a framework for the general mission of libraries, including access, motivation, and guidance. As a medium, video games have some distinguishing traits: they are visual, interactive, and based on simulations. We explain how these…

  15. Simulation-Based Joint Estimation of Body Deformation and Elasticity Parameters for Medical Image Analysis

    PubMed Central

    Foskey, Mark; Niethammer, Marc; Krajcevski, Pavel; Lin, Ming C.

    2014-01-01

    Estimation of tissue stiffness is an important means of noninvasive cancer detection. Existing elasticity reconstruction methods usually depend on a dense displacement field (inferred from ultrasound or MR images) and known external forces. Many imaging modalities, however, cannot provide details within an organ and therefore cannot provide such a displacement field. Furthermore, force exertion and measurement can be difficult for some internal organs, making boundary forces another missing parameter. We propose a general method for estimating elasticity and boundary forces automatically using an iterative optimization framework, given the desired (target) output surface. During the optimization, the input model is deformed by the simulator, and an objective function based on the distance between the deformed surface and the target surface is minimized numerically. The optimization framework does not depend on a particular simulation method and is therefore suitable for different physical models. We show a positive correlation between clinical prostate cancer stage (a clinical measure of severity) and the recovered elasticity of the organ. Since the surface correspondence is established, our method also provides a non-rigid image registration, where the quality of the deformation fields is guaranteed, as they are computed using a physics-based simulation. PMID:22893381
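
    A minimal sketch of the optimization loop follows; a one-line toy deformation law stands in for the physics-based simulator, and the "surface" is a short synthetic vector, so only the structure (simulate, measure surface distance, minimize) reflects the described method.

    ```python
    # Sketch: jointly estimate an elasticity value and a boundary force by
    # minimizing the distance between simulated and target surfaces.
    import numpy as np
    from scipy.optimize import minimize

    target = np.array([0.0, 0.8, 1.5, 0.8, 0.0])        # desired (target) surface

    def deform(elasticity, force):
        base = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
        return base * force / max(elasticity, 1e-6)     # toy stand-in simulator

    def objective(params):
        elasticity, force = params
        return np.sum((deform(elasticity, force) - target) ** 2)

    res = minimize(objective, x0=[1.0, 1.0], method="Nelder-Mead")
    elasticity_hat, force_hat = res.x
    ```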

  16. Executing Medical Guidelines on the Web: Towards Next Generation Healthcare

    NASA Astrophysics Data System (ADS)

    Argüello, M.; Des, J.; Fernandez-Prieto, M. J.; Perez, R.; Paniagua, H.

    There is still a lack of full integration between current Electronic Health Records (EHRs) and the medical guidelines that encapsulate evidence-based medicine. Thus, general practitioners (GPs) and specialised physicians still have to read document-based medical guidelines and decide among various options for managing common non-life-threatening conditions, where selecting the most appropriate therapeutic option for each individual patient can be a difficult task. This paper presents a simulation framework and computational test-bed, called the V.A.F. Framework, for supporting simulations of clinical situations. It integrates Health Level Seven (HL7) and Semantic Web technologies (OWL, SWRL, and OWL-S) to achieve content-layer interoperability between online clinical cases and medical guidelines, thereby demonstrating that closer integration between EHRs and evidence-based medicine can be accomplished, which could lead to a next generation of healthcare systems that provide more support to physicians and increase patients' safety.

  17. Benchmarking global land surface models in CMIP5: analysis of ecosystem water use efficiency (WUE) and Budyko framework

    NASA Astrophysics Data System (ADS)

    Li, Longhui

    2015-04-01

    Twelve Earth System Models (ESMs) from phase 5 of the Coupled Model Intercomparison Project (CMIP5) are evaluated in terms of ecosystem water use efficiency (WUE) and the Budyko framework. Simulated values of GPP and ET from the ESMs were validated against FLUXNET measurements; the slope of the linear regression between measurement and model ranged from 0.24 in CanESM2 to 0.8 in GISS-E2 for GPP, and from 0.51 to 0.86 for ET. The 12 ESMs generally simulate ET better than GPP. Compared with the flux-tower-based estimates by Jung et al. [Journal of Geophysical Research 116 (2011) G00J07] (JU11), all ESMs could capture the latitudinal variations of GPP and ET, but the majority of models strongly overestimated GPP and ET, particularly around the equator. The 12 ESMs showed much larger variations in latitudinal WUE. Four of the 12 ESMs predicted global annual GPP higher than 150 Pg C year-1, while the other eight predicted global GPP within ±15% of the JU11 estimate. In contrast, all ESMs predicted moderate bias for global ET. The coefficient of variation (CV) of ET (0.11) is significantly less than that of GPP (0.25). More than half of the 12 ESMs generally comply with the Budyko framework, but some models deviated considerably. Spatial analysis of the errors in GPP and ET indicated that model results differ largely among regions. This study suggested that ET is estimated much better than GPP. Incorporating the convergence of WUE and the Budyko framework into ESMs as constraints in the next round of the CMIP scheme is expected to decrease the uncertainties of carbon and water flux estimates.
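
    A sketch of this kind of site-level evaluation, with synthetic arrays standing in for ESM output and FLUXNET data: regress modeled GPP on observed GPP to obtain the slope, and form WUE as GPP/ET.

    ```python
    # Toy validation: regression slope of model vs. observation, and WUE = GPP/ET.
    import numpy as np

    rng = np.random.default_rng(1)
    gpp_obs = rng.uniform(500.0, 3000.0, 40)                 # g C m-2 yr-1 (toy sites)
    gpp_mod = 0.6 * gpp_obs + rng.normal(0.0, 150.0, 40)     # a deliberately biased "model"
    slope = np.polyfit(gpp_obs, gpp_mod, 1)[0]               # cf. the 0.24-0.8 range above

    et = rng.uniform(300.0, 1200.0, 40)                      # mm yr-1 (toy)
    wue = gpp_obs / et                                       # ecosystem water use efficiency
    ```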

  18. Coevolutionary dynamics in large, but finite populations

    NASA Astrophysics Data System (ADS)

    Traulsen, Arne; Claussen, Jens Christian; Hauert, Christoph

    2006-07-01

    Coevolving and competing species or game-theoretic strategies exhibit rich and complex dynamics for which a general theoretical framework based on finite populations is still lacking. Recently, an explicit mean-field description in the form of a Fokker-Planck equation was derived for frequency-dependent selection with two strategies in finite populations based on microscopic processes [A. Traulsen, J. C. Claussen, and C. Hauert, Phys. Rev. Lett. 95, 238701 (2005)]. Here we generalize this approach in a twofold way: First, we extend the framework to an arbitrary number of strategies and second, we allow for mutations in the evolutionary process. The deterministic limit of infinite population size of the frequency-dependent Moran process yields the adjusted replicator-mutator equation, which describes the combined effect of selection and mutation. For finite populations, we provide an extension taking random drift into account. In the limit of neutral selection, i.e., whenever the process is determined by random drift and mutations, the stationary strategy distribution is derived. This distribution forms the background for the coevolutionary process. In particular, a critical mutation rate u_c is obtained separating two scenarios: above u_c the population predominantly consists of a mixture of strategies, whereas below u_c the population tends to be in homogeneous states. For one of the fundamental problems in evolutionary biology, the evolution of cooperation under Darwinian selection, we demonstrate that the analytical framework provides excellent approximations to individual-based simulations even for rather small population sizes. This approach complements simulation results and provides a deeper, systematic understanding of coevolutionary dynamics.
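
    To make the microscopic process concrete, a minimal frequency-dependent Moran process with mutation for two strategies is sketched below; the payoff matrix, mutation rate, and population size are illustrative, not those of the paper.

    ```python
    # Frequency-dependent Moran process with mutation (two strategies, toy numbers).
    import numpy as np

    rng = np.random.default_rng(2)
    N, u, steps = 100, 0.01, 20000
    payoff = np.array([[1.0, 2.0], [0.5, 1.5]])
    i = N // 2                                    # count of strategy-0 individuals

    for _ in range(steps):
        x = np.array([i, N - i]) / N              # strategy frequencies
        f = payoff @ x                            # frequency-dependent payoffs
        birth = rng.choice(2, p=x * f / (x @ f))  # fitness-proportional reproduction
        if rng.random() < u:
            birth = 1 - birth                     # mutation flips the offspring
        death = rng.choice(2, p=x)                # uniformly random death
        i = min(max(i + (birth == 0) - (death == 0), 0), N)
    ```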

  19. A Modular Simulation Framework for Assessing Swarm Search Models

    DTIC Science & Technology

    2014-09-01

    A Modular Simulation Framework for Assessing Swarm Search Models. Author: Blake M. Wanier. Numerical studies demonstrate the ability to leverage the developed simulation and analysis framework to investigate three canonical swarm search models... as benchmarks for future exploration of more sophisticated swarm search scenarios. Subject terms: swarm search, search theory, modeling framework.

  20. Population genetics and molecular evolution of DNA sequences in transposable elements. I. A simulation framework.

    PubMed

    Kijima, T E; Innan, Hideki

    2013-11-01

    A population genetic simulation framework is developed to understand the behavior and molecular evolution of DNA sequences of transposable elements. Our model incorporates random transposition and excision of transposable element (TE) copies, two modes of selection against TEs, and degeneration of transpositional activity by point mutations. We first investigated the relationships between the behavior of the copy number of TEs and these parameters. Our results show that when selection is weak, the genome can maintain a relatively large number of TEs, but most of them are less active. In contrast, with strong selection, the genome can maintain only a limited number of TEs but the proportion of active copies is large. In such a case, there could be substantial fluctuations of the copy number over generations. We also explored how DNA sequences of TEs evolve through the simulations. In general, active copies form clusters around the original sequence, while less active copies have long branches specific to themselves, exhibiting a star-shaped phylogeny. It is demonstrated that the phylogeny of TE sequences could be informative to understand the dynamics of TE evolution.
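
    A toy per-generation update with the ingredients listed above (transposition of active copies, excision, deactivating point mutations, and a crude copy-number cost) is sketched below for a single lineage; all rates are arbitrary, and the paper's population-level and sequence-evolution machinery is omitted.

    ```python
    # Toy TE copy-number dynamics: active copies transpose; mutation deactivates.
    import numpy as np

    rng = np.random.default_rng(3)
    u_trans, u_exc, u_mut = 0.05, 0.01, 0.02
    active, inactive = 10, 0

    for _ in range(500):
        s = 0.002 * (active + inactive)              # crude selection penalty
        gains = rng.binomial(active, u_trans * max(1.0 - s, 0.0))
        deact = rng.binomial(active, u_mut)          # point mutations deactivate copies
        active += gains - deact - rng.binomial(active, u_exc)
        inactive += deact - rng.binomial(inactive, u_exc)
        active, inactive = max(active, 0), max(inactive, 0)
    ```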

  1. Non-Gaussian spatiotemporal simulation of multisite daily precipitation: downscaling framework

    NASA Astrophysics Data System (ADS)

    Ben Alaya, M. A.; Ouarda, T. B. M. J.; Chebana, F.

    2018-01-01

    Probabilistic regression approaches are very useful for downscaling daily precipitation. They provide the whole conditional distribution at each forecast step, to better represent the temporal variability. The question addressed in this paper is: how can the spatiotemporal characteristics of multisite daily precipitation be simulated from probabilistic regression models? Recent publications point out the complexity of the multisite properties of daily precipitation and highlight the need for a flexible non-Gaussian tool. This work proposes a reasonable compromise between simplicity and flexibility that avoids model misspecification. A suitable nonparametric bootstrapping (NB) technique is adopted. A downscaling model that merges a vector generalized linear model (VGLM, as the probabilistic regression tool) with the proposed bootstrapping technique is introduced to simulate realistic multisite precipitation series. The model is applied to data sets from the southern part of the province of Quebec, Canada. It is shown that the model is capable of reproducing both the at-site properties and the spatial structure of daily precipitation. Results indicate the superiority of the proposed NB technique over a multivariate autoregressive Gaussian framework (i.e., a Gaussian copula).

  2. Friendship Dissolution Within Social Networks Modeled Through Multilevel Event History Analysis

    PubMed Central

    Dean, Danielle O.; Bauer, Daniel J.; Prinstein, Mitchell J.

    2018-01-01

    A social network perspective can bring important insight into the processes that shape human behavior. Longitudinal social network data, measuring relations between individuals over time, has become increasingly common—as have the methods available to analyze such data. A friendship duration model utilizing discrete-time multilevel survival analysis with a multiple membership random effect structure is developed and applied here to study the processes leading to undirected friendship dissolution within a larger social network. While the modeling framework is introduced in terms of understanding friendship dissolution, it can be used to understand microlevel dynamics of a social network more generally. These models can be fit with standard generalized linear mixed-model software, after transforming the data to a pair-period data set. An empirical example highlights how the model can be applied to understand the processes leading to friendship dissolution between high school students, and a simulation study is used to test the use of the modeling framework under representative conditions that would be found in social network data. Advantages of the modeling framework are highlighted, and potential limitations and future directions are discussed. PMID:28463022
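
    The pair-period transformation itself is easy to sketch: expand each friendship into one row per period it was at risk, with a binary event indicator in its final period, then fit a discrete-time hazard by logistic regression. The toy below uses invented data and omits the multiple-membership random effects of the full model.

    ```python
    # Pair-period expansion plus a bare-bones discrete-time hazard fit.
    import pandas as pd
    import statsmodels.api as sm

    pairs = pd.DataFrame({"pair": ["a-b", "a-c", "b-c"],
                          "duration": [3, 2, 4],      # periods observed
                          "dissolved": [1, 0, 1]})    # did the tie end?

    rows = []
    for _, r in pairs.iterrows():
        for t in range(1, r["duration"] + 1):
            rows.append({"pair": r["pair"], "period": t,
                         "event": int(t == r["duration"] and r["dissolved"] == 1)})
    pp = pd.DataFrame(rows)                           # one row per pair-period

    X = sm.add_constant(pp[["period"]].astype(float))
    fit = sm.GLM(pp["event"], X, family=sm.families.Binomial()).fit()
    ```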

  3. Performance and accuracy of criticality calculations performed using WARP – A framework for continuous energy Monte Carlo neutron transport in general 3D geometries on GPUs

    DOE PAGES

    Bergmann, Ryan M.; Rowland, Kelly L.; Radnović, Nikola; ...

    2017-05-01

    In this companion paper to "Algorithmic Choices in WARP - A Framework for Continuous Energy Monte Carlo Neutron Transport in General 3D Geometries on GPUs" (doi:10.1016/j.anucene.2014.10.039), the WARP Monte Carlo neutron transport framework for graphics processing units (GPUs) is benchmarked against production-level central processing unit (CPU) Monte Carlo neutron transport codes for both performance and accuracy. We compare neutron flux spectra, multiplication factors, runtimes, speedup factors, and costs of various GPU and CPU platforms running either WARP, Serpent 2.1.24, or MCNP 6.1. WARP compares well with the results of the production-level codes, and it is shown that on the newest hardware considered, GPU platforms running WARP are between 0.8 and 7.6 times as fast as CPU platforms running production codes. Also, the GPU platforms running WARP were between 15% and 50% as expensive to purchase and between 80% and 90% as expensive to operate as equivalent CPU platforms performing at an equal simulation rate.

  4. Performance and accuracy of criticality calculations performed using WARP – A framework for continuous energy Monte Carlo neutron transport in general 3D geometries on GPUs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergmann, Ryan M.; Rowland, Kelly L.; Radnović, Nikola

    In this companion paper to "Algorithmic Choices in WARP - A Framework for Continuous Energy Monte Carlo Neutron Transport in General 3D Geometries on GPUs" (doi:10.1016/j.anucene.2014.10.039), the WARP Monte Carlo neutron transport framework for graphics processing units (GPUs) is benchmarked against production-level central processing unit (CPU) Monte Carlo neutron transport codes for both performance and accuracy. We compare neutron flux spectra, multiplication factors, runtimes, speedup factors, and costs of various GPU and CPU platforms running either WARP, Serpent 2.1.24, or MCNP 6.1. WARP compares well with the results of the production-level codes, and it is shown that on the newest hardware considered, GPU platforms running WARP are between 0.8 and 7.6 times as fast as CPU platforms running production codes. Also, the GPU platforms running WARP were between 15% and 50% as expensive to purchase and between 80% and 90% as expensive to operate as equivalent CPU platforms performing at an equal simulation rate.

  5. Multiplicative Multitask Feature Learning

    PubMed Central

    Wang, Xin; Bi, Jinbo; Yu, Shipeng; Sun, Jiangwen; Song, Minghu

    2016-01-01

    We investigate a general framework of multiplicative multitask feature learning which decomposes each task's model parameters into a product of two components: one component shared across all tasks and one task-specific component. Several previous methods can be proved to be special cases of our framework. We study the theoretical properties of this framework when different regularization conditions are applied to the two decomposed components. We prove that this framework is mathematically equivalent to the widely used multitask feature learning methods that are based on a joint regularization of all model parameters, but with a more general form of regularizers. Further, an analytical formula is derived for the across-task component as related to the task-specific component for all these regularizers, leading to a better understanding of the shrinkage effects of different regularizers. Study of this framework motivates new multitask learning algorithms. We propose two new learning formulations by varying the parameters in the proposed framework. An efficient blockwise coordinate descent algorithm is developed, suitable for solving the entire family of formulations with rigorous convergence analysis. Simulation studies have identified the statistical properties of data that favor the new formulations. Extensive empirical studies on various classification and regression benchmark data sets have revealed the relative advantages of the two new formulations compared with the state of the art, providing instructive insights into the feature learning problem with multiple tasks. PMID:28428735

  6. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission—With an application to the 2014-2015 West Africa Ebola outbreak

    PubMed Central

    McClelland, Amanda; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D.; Grenfell, Bryan T.

    2017-01-01

    In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have proven useful for inferring disease transmission at a more refined level than previously possible. However, there remains a lack of statistically sound frameworks for modelling the underlying transmission dynamics in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data describing a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models, which are often computationally challenging. PMID:29084216

  7. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission-With an application to the 2014-2015 West Africa Ebola outbreak.

    PubMed

    Lau, Max S Y; Gibson, Gavin J; Adrakey, Hola; McClelland, Amanda; Riley, Steven; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D; Grenfell, Bryan T

    2017-10-01

    In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have proven useful for inferring disease transmission at a more refined level than previously possible. However, there remains a lack of statistically sound frameworks for modelling the underlying transmission dynamics in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data describing a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models, which are often computationally challenging.

  8. Analyzing Evolving Social Network 2 (EVOLVE2)

    DTIC Science & Technology

    2015-04-01

    Facebook friendship graph. We simulated two different interaction models: one-to-one and one-to-many interactions. Both types of models revealed... to an unbiased random walk on the reweighted "interaction graph" W with entries w_{ij} = α_i A_{ij} α_j. The generalized Laplacian framework is flexible enough...

  9. Counter Unmanned Aerial System Decision-Aid Logic Process (C-UAS DALP)

    DTIC Science & Technology

    ...decision-aid or logic process that bridges the middle elements of the kill... of use, location, general logic process, and reference mission. This is the framework for the IDEF0 functional architecture diagrams, decision-aid diagrams, logic process, and modeling and simulation... chain between detection to countermeasure response. This capstone project creates the logic for a decision process that transitions from the...

  10. The distribution of density in supersonic turbulence

    NASA Astrophysics Data System (ADS)

    Squire, Jonathan; Hopkins, Philip F.

    2017-11-01

    We propose a model for the statistics of the mass density in supersonic turbulence, which plays a crucial role in star formation and the physics of the interstellar medium (ISM). The model is derived by considering the density to be arranged as a collection of strong shocks of width ∼ M^{-2}, where M is the turbulent Mach number. With two physically motivated parameters, the model predicts all density statistics for M>1 turbulence: the density probability distribution and its intermittency (deviation from lognormality), the density variance-Mach number relation, power spectra and structure functions. For the proposed model parameters, reasonable agreement is seen between model predictions and numerical simulations, albeit within the large uncertainties associated with current simulation results. More generally, the model could provide a useful framework for more detailed analysis of future simulations and observational data. Due to the simple physical motivations for the model in terms of shocks, it is straightforward to generalize to more complex physical processes, which will be helpful in future more detailed applications to the ISM. We see good qualitative agreement between such extensions and recent simulations of non-isothermal turbulence.
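
    For reference, the lognormal baseline against which such models are usually compared (and from which the shock model predicts intermittent deviations) takes the conventional form below, with b the driving parameter; this is the standard relation, not the paper's model.

    ```latex
    p(s)\,\mathrm{d}s = \frac{1}{\sqrt{2\pi\sigma_s^{2}}}
      \exp\!\left[-\frac{(s-s_{0})^{2}}{2\sigma_s^{2}}\right]\mathrm{d}s,
    \qquad s \equiv \ln(\rho/\rho_{0}),
    \qquad \sigma_s^{2} = \ln\!\left(1+b^{2}M^{2}\right),
    \qquad s_{0} = -\tfrac{1}{2}\sigma_s^{2}.
    ```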

  11. Atom based grain extraction and measurement of geometric properties

    NASA Astrophysics Data System (ADS)

    Martine La Boissonière, Gabriel; Choksi, Rustum

    2018-04-01

    We introduce an accurate, self-contained and automatic atom-based numerical algorithm to characterize grain distributions in two-dimensional Phase Field Crystal (PFC) simulations. We compare the method with hand-segmented and known test grain distributions to show that the algorithm is able to extract grains and measure their area, perimeter and other geometric properties with high accuracy. Four input parameters must be set by the user, and their influence on the results is described. The method is currently tuned to extract data from PFC simulations in the hexagonal lattice regime, but the framework may be extended to more general problems.

  12. Simulation of the microwave heating of a thin multilayered composite material: A parameter analysis

    NASA Astrophysics Data System (ADS)

    Tertrais, Hermine; Barasinski, Anaïs; Chinesta, Francisco

    2018-05-01

    Microwave (MW) technology relies on volumetric heating: thermal energy is transferred to the material, which can absorb it at specific frequencies. The complex physics involved in this process is far from fully understood, which is why a simulation tool has been developed to solve the electromagnetic and thermal equations in a material as complex as a multilayered composite part. The code is based on the in-plane-out-of-plane separated representation within the Proper Generalized Decomposition framework. To improve knowledge of the process, a parameter study is carried out in this paper.

  13. Beamforming approaches for untethered, ultrasonic neural dust motes for cortical recording: a simulation study.

    PubMed

    Bertrand, Alexander; Seo, Dongjin; Maksimovic, Filip; Carmena, Jose M; Maharbiz, Michel M; Alon, Elad; Rabaey, Jan M

    2014-01-01

    In this paper, we examine the use of beamforming techniques to interrogate a multitude of neural implants in a distributed, ultrasound-based intra-cortical recording platform known as Neural Dust. We propose a general framework to analyze system design tradeoffs in the ultrasonic beamformer that extracts neural signals from modulated ultrasound waves that are backscattered by free-floating neural dust (ND) motes. Simulations indicate that high-resolution linearly-constrained minimum variance beamforming sufficiently suppresses interference from unselected ND motes and can be incorporated into the ND-based cortical recording system.
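
    The linearly-constrained minimum variance weights referred to above have the standard closed form w = R^{-1} a / (a^H R^{-1} a); a toy numpy version with an invented steering vector and a sample covariance is sketched below.

    ```python
    # Minimal LCMV (single-constraint, i.e. MVDR) beamformer: unit gain toward
    # the selected mote's steering vector while minimizing total output power.
    import numpy as np

    rng = np.random.default_rng(4)
    n_elems, n_snap = 8, 500
    a = np.exp(1j * np.pi * np.arange(n_elems) * np.sin(0.3))     # toy steering vector

    snaps = rng.standard_normal((n_elems, n_snap)) + 1j * rng.standard_normal((n_elems, n_snap))
    R = snaps @ snaps.conj().T / n_snap + 1e-3 * np.eye(n_elems)  # sample covariance

    w = np.linalg.solve(R, a)
    w /= a.conj() @ w                    # w = R^{-1} a / (a^H R^{-1} a)
    output = w.conj() @ snaps            # beamformed time series for this mote
    ```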

  14. Parallel kinetic Monte Carlo simulation framework incorporating accurate models of adsorbate lateral interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nielsen, Jens; D’Avezac, Mayeul; Hetherington, James

    2013-12-14

    Ab initio kinetic Monte Carlo (KMC) simulations have been successfully applied for over two decades to elucidate the underlying physico-chemical phenomena on the surfaces of heterogeneous catalysts. These simulations necessitate detailed knowledge of the kinetics of the elementary reactions constituting the reaction mechanism and of the energetics of the species participating in the chemistry. The information about the energetics is encoded in the formation energies of gas and surface-bound species, and in the lateral interactions between adsorbates on the catalytic surface, which can be modeled at different levels of detail. The majority of previous works accounted for only pairwise-additive first nearest-neighbor interactions. More recently, cluster-expansion Hamiltonians incorporating long-range interactions and many-body terms have been used for detailed estimations of catalytic rates [C. Wu, D. J. Schmidt, C. Wolverton, and W. F. Schneider, J. Catal. 286, 88 (2012)]. In view of the increasing interest in accurate predictions of catalytic performance, there is a need for general-purpose KMC approaches incorporating detailed cluster expansion models of the adlayer energetics. We have addressed this need by building on the previously introduced graph-theoretical KMC framework, and we have developed Zacros, a FORTRAN2003 KMC package for simulating catalytic chemistries. To tackle the high computational cost in the presence of long-range interactions we introduce parallelization with OpenMP. We further benchmark our framework by simulating a KMC analogue of the NO oxidation system established by Schneider and co-workers [J. Catal. 286, 88 (2012)]. We show that taking into account only first nearest-neighbor interactions may lead to large errors in the prediction of the catalytic rate, whereas for accurate estimates thereof, one needs to include long-range terms in the cluster expansion.
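
    A minimal lattice KMC with pairwise first-nearest-neighbor interactions (the simplest adlayer energetics discussed above) is sketched below; the lattice size, rates, and interaction energy are toy values unrelated to Zacros or the NO oxidation system.

    ```python
    # Toy rejection-free KMC: adsorption/desorption on a periodic square lattice,
    # with first-nearest-neighbor interactions modifying the desorption rate.
    import numpy as np

    rng = np.random.default_rng(5)
    L, eps, beta, k_ads = 20, -0.1, 1.0, 1.0     # lattice, NN energy, 1/kT, rate
    occ = np.zeros((L, L), dtype=int)

    def k_des(i, j):
        nn = occ[(i + 1) % L, j] + occ[(i - 1) % L, j] + occ[i, (j + 1) % L] + occ[i, (j - 1) % L]
        return np.exp(beta * eps * nn)           # attractive neighbors slow desorption

    t = 0.0
    for _ in range(2000):
        rates = np.array([[k_ads if occ[i, j] == 0 else k_des(i, j)
                           for j in range(L)] for i in range(L)])
        total = rates.sum()
        t += rng.exponential(1.0 / total)        # KMC time increment
        site = rng.choice(L * L, p=(rates / total).ravel())
        i, j = divmod(site, L)
        occ[i, j] ^= 1                           # adsorb or desorb at the chosen site
    ```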

  15. Effective time closures: quantifying the conservation benefits of input control for the Pacific chub mackerel fishery.

    PubMed

    Ichinokawa, Momoko; Okamura, Hiroshi; Watanabe, Chikako; Kawabata, Atsushi; Oozeki, Yoshioki

    2015-09-01

    Restricting human access to a specific wildlife species, community, or ecosystem, i.e., input control, is one of the most popular tools to control human impacts for natural resource management and wildlife conservation. However, quantitative evaluations of input control are generally difficult, because it is unclear how much human impacts can actually be reduced by the control. We present a model framework to quantify the effectiveness of input control using day closures to reduce actual fishing impact by considering the observed fishery dynamics. The model framework was applied to the management of the Pacific stock of the chub mackerel (Scomber japonicus) fishery, in which fishing was suspended for one day following any day when the total mackerel catch exceeded a threshold level. We evaluated the management measure according to the following steps: (1) we fitted the daily observed catch and fishing effort data to a generalized linear model (GLM) or generalized autoregressive state-space model (GASSM), (2) we conducted population dynamics simulations based on annual catches randomly generated from the parameters estimated in the first step, (3) we quantified the effectiveness of day closures by comparing the results of two simulation scenarios with and without day closures, and (4) we conducted additional simulations based on different sets of explanatory variables and statistical models (sensitivity analysis). In the first step, we found that the GASSM explained the observed data far better than the simple GLM. The model parameterized with the estimates from the GASSM demonstrated that the day closures implemented from 2004 to 2009 would have decreased exploitation fractions by ~10% every year and increased the 2009 stock biomass by 37-46% (median), relative to the values without day closures. The sensitivity analysis revealed that the effectiveness of day closures was particularly influenced by autoregressive processes in the fishery data and by positive relationships between fishing effort and total biomass. Those results indicated the importance of human behavioral dynamics under input control in quantifying the conservation benefit of natural resource management and the applicability of our model framework to the evaluation of the input controls that are actually implemented.
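
    The day-closure rule itself is simple to sketch: suspend fishing for one day whenever the previous day's total catch exceeds the threshold. The toy below uses invented catchability, effort, and threshold values in place of the fitted GASSM dynamics.

    ```python
    # Toy day-closure simulation: one-day suspension after a threshold-exceeding catch.
    import numpy as np

    rng = np.random.default_rng(6)
    biomass, q, threshold = 1e6, 1e-3, 900.0     # all values invented
    closed = False
    catches = []

    for day in range(365):
        if closed:
            catch, closed = 0.0, False           # one-day suspension, then reopen
        else:
            effort = rng.uniform(0.5, 1.5)       # toy daily effort
            catch = q * effort * biomass
            closed = catch > threshold           # triggers tomorrow's closure
        biomass -= catch
        catches.append(catch)
    ```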

  16. Simulation Framework for Intelligent Transportation Systems

    DOT National Transportation Integrated Search

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System. The simulator is designed for running on parallel computers and distributed (networked) computer systems, but ca...

  17. Higher-Order Extended Lagrangian Born–Oppenheimer Molecular Dynamics for Classical Polarizable Models

    DOE PAGES

    Albaugh, Alex; Head-Gordon, Teresa; Niklasson, Anders M. N.

    2018-01-09

    Generalized extended Lagrangian Born−Oppenheimer molecular dynamics (XLBOMD) methods provide a framework for fast iteration-free simulations of models that normally require expensive electronic ground state optimizations prior to the force evaluations at every time step. XLBOMD uses dynamically driven auxiliary degrees of freedom that fluctuate about a variationally optimized ground state of an approximate “shadow” potential which approximates the true reference potential. While the requirements for such shadow potentials are well understood, constructing such potentials in practice has previously been ad hoc, and in this work, we present a systematic development of XLBOMD shadow potentials that match the reference potential to any order. We also introduce a framework for combining friction-like dissipation for the auxiliary degrees of freedom with general-order integration, a combination that was not previously possible. These developments are demonstrated with a simple fluctuating charge model and point induced dipole polarization models.
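
    Schematically, and with notation simplified relative to the paper, the XLBOMD idea propagates nuclear coordinates R on a shadow potential U(R; n) while auxiliary variables n oscillate harmonically about the optimized ground state q[n]; the friction-like dissipation mentioned above adds a damping term with coefficient gamma:

    ```latex
    M\,\ddot{R} = -\nabla_{R}\,\mathcal{U}(R; n),
    \qquad
    \ddot{n} = \omega^{2}\bigl(q[n] - n\bigr) - \gamma\,\dot{n}.
    ```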

  18. Higher-Order Extended Lagrangian Born-Oppenheimer Molecular Dynamics for Classical Polarizable Models.

    PubMed

    Albaugh, Alex; Head-Gordon, Teresa; Niklasson, Anders M N

    2018-02-13

    Generalized extended Lagrangian Born-Oppenheimer molecular dynamics (XLBOMD) methods provide a framework for fast iteration-free simulations of models that normally require expensive electronic ground state optimizations prior to the force evaluations at every time step. XLBOMD uses dynamically driven auxiliary degrees of freedom that fluctuate about a variationally optimized ground state of an approximate "shadow" potential which approximates the true reference potential. While the requirements for such shadow potentials are well understood, constructing such potentials in practice has previously been ad hoc, and in this work, we present a systematic development of XLBOMD shadow potentials that match the reference potential to any order. We also introduce a framework for combining friction-like dissipation for the auxiliary degrees of freedom with general-order integration, a combination that was not previously possible. These developments are demonstrated with a simple fluctuating charge model and point induced dipole polarization models.

  19. Higher-Order Extended Lagrangian Born–Oppenheimer Molecular Dynamics for Classical Polarizable Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Albaugh, Alex; Head-Gordon, Teresa; Niklasson, Anders M. N.

    Generalized extended Lagrangian Born−Oppenheimer molecular dynamics (XLBOMD) methods provide a framework for fast iteration-free simulations of models that normally require expensive electronic ground state optimizations prior to the force evaluations at every time step. XLBOMD uses dynamically driven auxiliary degrees of freedom that fluctuate about a variationally optimized ground state of an approximate “shadow” potential which approximates the true reference potential. While the requirements for such shadow potentials are well understood, constructing such potentials in practice has previously been ad hoc, and in this work, we present a systematic development of XLBOMD shadow potentials that match the reference potential to any order. We also introduce a framework for combining friction-like dissipation for the auxiliary degrees of freedom with general-order integration, a combination that was not previously possible. These developments are demonstrated with a simple fluctuating charge model and point induced dipole polarization models.

  20. Extending the FairRoot framework to allow for simulation and reconstruction of free streaming data

    NASA Astrophysics Data System (ADS)

    Al-Turany, M.; Klein, D.; Manafov, A.; Rybalchenko, A.; Uhlig, F.

    2014-06-01

    The FairRoot framework is the standard framework for simulation, reconstruction and data analysis for the FAIR experiments. The framework is designed to optimise accessibility for beginners and developers, to be flexible and to cope with future developments. FairRoot enhances the synergy between the different physics experiments. As a first step toward the simulation of free streaming data, time-based simulation was introduced to the framework. The next step is the simulation of the event source. This is achieved via a client-server system: after digitization, the so-called "samplers" can be started, where each sampler reads the data of the corresponding detector from the simulation files and makes it available to the reconstruction clients. The system makes it possible to develop and validate the online reconstruction algorithms. In this work, the design and implementation of the new architecture and the communication layer are described.

  1. Flexible Residential Smart Grid Simulation Framework

    NASA Astrophysics Data System (ADS)

    Xiang, Wang

    Different scheduling and coordination algorithms controlling household appliances' operations can potentially lead to energy consumption reduction and/or load balancing, in conjunction with the different electricity pricing methods used in smart grid programs. In order to easily implement different algorithms and evaluate their efficiency against other ideas, a flexible simulation framework is desirable in both research and business settings. However, such a platform is currently lacking or underdeveloped. In this thesis, we provide a simulation framework focused on demand-side residential energy consumption coordination in response to different pricing methods. This simulation framework, equipped with an appliance consumption library using realistic values, aims to closely represent the average usage of different types of appliances. The simulation results for traditional usage yield values closely matching surveyed real-life consumption records. Several sample coordination algorithms, pricing schemes, and communication scenarios are also implemented to illustrate the use of the simulation framework.

  2. MRXCAT: Realistic numerical phantoms for cardiovascular magnetic resonance

    PubMed Central

    2014-01-01

    Background: Computer simulations are important for validating novel image acquisition and reconstruction strategies. In cardiovascular magnetic resonance (CMR), numerical simulations need to combine anatomical information and the effects of cardiac and/or respiratory motion. To this end, a framework for realistic CMR simulations is proposed and its use for image reconstruction from undersampled data is demonstrated. Methods: The extended Cardiac-Torso (XCAT) anatomical phantom framework with various motion options was used as a basis for the numerical phantoms. Different tissue, dynamic contrast and signal models, multiple receiver coils and noise are simulated. Arbitrary trajectories and undersampled acquisition can be selected. The utility of the framework is demonstrated for accelerated cine and first-pass myocardial perfusion imaging using k-t PCA and k-t SPARSE. Results: MRXCAT phantoms allow for realistic simulation of CMR including optional cardiac and respiratory motion. Example reconstructions from simulated undersampled k-t parallel imaging demonstrate the feasibility of simulated acquisition and reconstruction using the presented framework. Myocardial blood flow assessment from simulated myocardial perfusion images highlights the suitability of MRXCAT for quantitative post-processing simulation. Conclusion: The proposed MRXCAT phantom framework enables versatile and realistic simulations of CMR including breathhold and free-breathing acquisitions. PMID:25204441

  3. Framework to model neutral particle flux in convex high aspect ratio structures using one-dimensional radiosity

    NASA Astrophysics Data System (ADS)

    Manstetten, Paul; Filipovic, Lado; Hössinger, Andreas; Weinbub, Josef; Selberherr, Siegfried

    2017-02-01

    We present a computationally efficient framework to compute the neutral flux in high aspect ratio structures during three-dimensional plasma etching simulations. The framework is based on a one-dimensional radiosity approach and is applicable to simulations of convex rotationally symmetric holes and convex symmetric trenches with a constant cross-section. The framework is intended to replace the full three-dimensional simulation step required to calculate the neutral flux during plasma etching simulations. Especially for high aspect ratio structures, the computational effort, required to perform the full three-dimensional simulation of the neutral flux at the desired spatial resolution, conflicts with practical simulation time constraints. Our results are in agreement with those obtained by three-dimensional Monte Carlo based ray tracing simulations for various aspect ratios and convex geometries. With this framework we present a comprehensive analysis of the influence of the geometrical properties of high aspect ratio structures as well as of the particle sticking probability on the neutral particle flux.

  4. A finite element framework for multiscale/multiphysics analysis of structures with complex microstructures

    NASA Astrophysics Data System (ADS)

    Varghese, Julian

    This research work has contributed in various ways to help develop a better understanding of textile composites and materials with complex microstructures in general. An instrumental part of this work was the development of an object-oriented framework that made it convenient to perform multiscale/multiphysics analyses of advanced materials with complex microstructures such as textile composites. In addition to the studies conducted in this work, this framework lays the groundwork for continued research of these materials. This framework enabled a detailed multiscale stress analysis of a woven DCB specimen that revealed the effect of the complex microstructure on the stress and strain energy release rate distribution along the crack front. In addition to implementing an oxidation model, the framework was also used to implement strategies that expedited the simulation of oxidation in textile composites so that it would take only a few hours. The simulation showed that the tow architecture played a significant role in the oxidation behavior in textile composites. Finally, a coupled diffusion/oxidation and damage progression analysis was implemented that was used to study the mechanical behavior of textile composites under mechanical loading as well as oxidation. A parametric study was performed to determine the effect of material properties and the number of plies in the laminate on its mechanical behavior. The analyses indicated a significant effect of the tow architecture and other parameters on the damage progression in the laminates.

  5. Argonne Simulation Framework for Intelligent Transportation Systems

    DOT National Transportation Integrated Search

    1996-01-01

    A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distribu...

  6. The Linear Bias in the Zeldovich Approximation and a Relation between the Number Density and the Linear Bias of Dark Halos

    NASA Astrophysics Data System (ADS)

    Fan, Zuhui

    2000-01-01

    The linear bias of the dark halos from a model under the Zeldovich approximation is derived and compared with the fitting formula of simulation results. While qualitatively similar to the Press-Schechter formula, this model gives a better description for the linear bias around the turnaround point. This advantage, however, may be compromised by the large uncertainty of the actual behavior of the linear bias near the turnaround point. For a broad class of structure formation models in the cold dark matter framework, a general relation exists between the number density and the linear bias of dark halos. This relation can be readily tested by numerical simulations. Thus, instead of laboriously checking these models one by one, numerical simulation studies can falsify a whole category of models. The general validity of this relation is important in identifying key physical processes responsible for the large-scale structure formation in the universe.
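
    For reference, the Press-Schechter-based linear halo bias that the abstract uses as its point of comparison is commonly written (in Mo-White form) as

      b(M) = 1 + \frac{\nu^2 - 1}{\delta_c}, \qquad \nu = \frac{\delta_c}{\sigma(M)},

    where \delta_c \approx 1.686 is the critical linear overdensity for collapse and \sigma(M) is the rms linear density fluctuation on mass scale M. The number density and the bias of halos are linked because both are functions of the same peak height \nu, which is the kind of relation the abstract proposes to test in simulations.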

  7. Dshell++: A Component Based, Reusable Space System Simulation Framework

    NASA Technical Reports Server (NTRS)

    Lim, Christopher S.; Jain, Abhinandan

    2009-01-01

    This paper describes the multi-mission Dshell++ simulation framework for high-fidelity, physics-based simulation of spacecraft, robotic manipulation and mobility systems. Dshell++ is a C++/Python library which uses modern script-driven object-oriented techniques to allow component reuse and a dynamic run-time interface for complex, high-fidelity simulation of spacecraft and robotic systems. The goal of the Dshell++ architecture is to manage the inherent complexity of physics-based simulations while supporting component model reuse across missions. The framework provides several features that support a large degree of simulation configurability and usability.

  8. A data-driven dynamics simulation framework for railway vehicles

    NASA Astrophysics Data System (ADS)

    Nie, Yinyu; Tang, Zhao; Liu, Fengjia; Chang, Jian; Zhang, Jianjun

    2018-03-01

    The finite element (FE) method is essential for simulating vehicle dynamics with fine details, especially for train crash simulations. However, factors such as the complexity of meshes and the distortion involved in a large deformation would undermine its calculation efficiency. An alternative method, the multi-body (MB) dynamics simulation, provides satisfying time efficiency but limited accuracy when highly nonlinear dynamic processes are involved. To maintain the advantages of both methods, this paper proposes a data-driven simulation framework for dynamics simulation of railway vehicles. This framework uses machine learning techniques to extract nonlinear features from training data generated by FE simulations so that specific mesh structures can be formulated by a surrogate element (or surrogate elements) to replace the original mechanical elements, and the dynamics simulation can be implemented by co-simulation with the surrogate element(s) embedded into an MB model. This framework consists of a series of techniques including data collection, feature extraction, training data sampling, surrogate element building, and model evaluation and selection. To verify the feasibility of this framework, we present two case studies, a vertical dynamics simulation and a longitudinal dynamics simulation, based on co-simulation with MATLAB/Simulink and Simpack, and a further comparison with a popular data-driven model (the Kriging model) is provided. The simulation results show that using the Legendre polynomial regression model to build surrogate elements can largely cut down the simulation time without sacrificing accuracy.
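
    The surrogate-building step can be made concrete with a minimal sketch: fit a Legendre polynomial regression to training pairs that stand in for FE runs, then evaluate it cheaply inside a multi-body time loop. The force law and all values below are synthetic placeholders, not data from the paper.

      import numpy as np
      from numpy.polynomial import legendre

      x_train = np.linspace(-1, 1, 200)            # normalized deflection samples
      f_train = 1e4 * np.tanh(3 * x_train) + 50 * np.random.randn(200)  # fake "FE" forces

      coeffs = legendre.legfit(x_train, f_train, deg=7)   # fit the surrogate once

      def surrogate_force(deflection):
          """Cheap replacement for the FE element inside the MB loop."""
          return legendre.legval(deflection, coeffs)

      print(surrogate_force(0.25))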

  9. NEVESIM: event-driven neural simulation framework with a Python interface.

    PubMed

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.
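
    NEVESIM's core is C++; the Python toy below only mirrors the organizing idea the abstract emphasizes, namely a network-level priority queue of spike events kept separate from per-neuron internals behind a single interface. Class and method names are invented for illustration, not the NEVESIM API.

      import heapq

      class LIFNeuron:
          """Neuron internals hidden behind one event-handling method."""
          def __init__(self, threshold=1.0, weight=0.4, delay=1.0):
              self.v, self.threshold, self.weight, self.delay = 0.0, threshold, weight, delay
              self.targets = []

          def on_spike(self, t):
              self.v += self.weight
              if self.v >= self.threshold:          # fire and reset
                  self.v = 0.0
                  return [(t + self.delay, tgt) for tgt in self.targets]
              return []

      neurons = [LIFNeuron() for _ in range(3)]
      neurons[0].targets = [neurons[1]]
      neurons[1].targets = [neurons[2]]

      # Network layer: a queue of (time, tie-breaker, neuron) spike events.
      events = [(0.0, 0, neurons[0]), (0.1, 1, neurons[0]), (0.2, 2, neurons[0])]
      heapq.heapify(events)
      seq = len(events)
      while events:
          t, _, nrn = heapq.heappop(events)
          for t_next, tgt in nrn.on_spike(t):
              heapq.heappush(events, (t_next, seq, tgt)); seq += 1
              print(f"spike delivered at t={t_next:.1f}")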

  10. NEVESIM: event-driven neural simulation framework with a Python interface

    PubMed Central

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291

  11. From bricks to buildings: adapting the Medical Research Council framework to develop programs of research in simulation education and training for the health professions.

    PubMed

    Haji, Faizal A; Da Silva, Celina; Daigle, Delton T; Dubrowski, Adam

    2014-08-01

    Presently, health care simulation research is largely conducted on a study-by-study basis. Although such "project-based" research generates a plethora of evidence, it can be chaotic and contradictory. A move toward sustained, thematic, theory-based programs of research is necessary to advance knowledge in the field. Recognizing that simulation is a complex intervention, we present a framework for developing research programs in simulation-based education adapted from the Medical Research Council (MRC) guidance. This framework calls for an iterative approach to developing, refining, evaluating, and implementing simulation interventions. The adapted framework guidance emphasizes: (1) identification of theory and existing evidence; (2) modeling and piloting interventions to clarify active ingredients and identify mechanisms linking the context, intervention, and outcomes; and (3) evaluation of intervention processes and outcomes in both the laboratory and real-world setting. The proposed framework will aid simulation researchers in developing more robust interventions that optimize simulation-based education and advance our understanding of simulation pedagogy.

  12. The Integrated Plasma Simulator: A Flexible Python Framework for Coupled Multiphysics Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foley, Samantha S; Elwasif, Wael R; Bernholdt, David E

    2011-11-01

    High-fidelity coupled multiphysics simulations are an increasingly important aspect of computational science. In many domains, however, there has been very limited experience with simulations of this sort; therefore, research in coupled multiphysics often requires computational frameworks with significant flexibility to respond to the changing directions of the physics and mathematics. This paper presents the Integrated Plasma Simulator (IPS), a framework designed for loosely coupled simulations of fusion plasmas. The IPS provides users with a simple component architecture into which a wide range of existing plasma physics codes can be inserted as components. Simulations can take advantage of multiple levels of parallelism supported in the IPS, and can be controlled by a high-level "driver" component, or by other coordination mechanisms, such as an asynchronous event service. We describe the requirements and design of the framework, and how they were implemented in the Python language. We also illustrate the flexibility of the framework by providing examples of different types of simulations that utilize various features of the IPS.

  13. Video event classification and image segmentation based on noncausal multidimensional hidden Markov models.

    PubMed

    Ma, Xiang; Schonfeld, Dan; Khokhar, Ashfaq A

    2009-06-01

    In this paper, we propose a novel solution to an arbitrary noncausal, multidimensional hidden Markov model (HMM) for image and video classification. First, we show that the noncausal model can be solved by splitting it into multiple causal HMMs and simultaneously solving each causal HMM using a fully synchronous distributed computing framework, therefore referred to as distributed HMMs. Next, we present an approximate solution to the multiple causal HMMs that is based on an alternating updating scheme and assumes a realistic sequential computing framework. The parameters of the distributed causal HMMs are estimated by extending the classical 1-D training and classification algorithms to multiple dimensions. The proposed extension to arbitrary causal, multidimensional HMMs allows state transitions that are dependent on all causal neighbors. We thus extend three fundamental algorithms to multidimensional causal systems, i.e., 1) expectation-maximization (EM), 2) general forward-backward (GFB), and 3) Viterbi algorithms. In the simulations, we choose to limit ourselves to a noncausal 2-D model whose noncausality is along a single dimension, in order to significantly reduce the computational complexity. Simulation results demonstrate the superior performance, higher accuracy rate, and applicability of the proposed noncausal HMM framework to image and video classification.
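
    The classical 1-D forward recursion that the paper's general forward-backward (GFB) algorithm extends to multiple dimensions is compact enough to state directly; the transition, emission, and initial distributions below are illustrative.

      import numpy as np

      A = np.array([[0.7, 0.3], [0.4, 0.6]])   # state transition matrix
      B = np.array([[0.9, 0.1], [0.2, 0.8]])   # emission probabilities
      pi = np.array([0.5, 0.5])                # initial state distribution
      obs = [0, 1, 1, 0]                       # observed symbol sequence

      alpha = pi * B[:, obs[0]]                # initialize with the first observation
      for o in obs[1:]:
          alpha = (alpha @ A) * B[:, o]        # propagate, then weight by emission
      print(f"P(observations) = {alpha.sum():.6f}")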

  14. A neurally plausible parallel distributed processing model of event-related potential word reading data.

    PubMed

    Laszlo, Sarah; Plaut, David C

    2012-03-01

    The Parallel Distributed Processing (PDP) framework has significant potential for producing models of cognitive tasks that approximate how the brain performs the same tasks. To date, however, there has been relatively little contact between PDP modeling and data from cognitive neuroscience. In an attempt to advance the relationship between explicit, computational models and physiological data collected during the performance of cognitive tasks, we developed a PDP model of visual word recognition which simulates key results from the ERP reading literature, while simultaneously being able to successfully perform lexical decision, a benchmark task for reading models. Simulations reveal that the model's success depends on the implementation of several neurally plausible features in its architecture which are sufficiently domain-general to be relevant to cognitive modeling more generally. Copyright © 2011 Elsevier Inc. All rights reserved.

  15. A Kernel-Based Low-Rank (KLR) Model for Low-Dimensional Manifold Recovery in Highly Accelerated Dynamic MRI.

    PubMed

    Nakarmi, Ukash; Wang, Yanhua; Lyu, Jingyuan; Liang, Dong; Ying, Leslie

    2017-11-01

    While many low rank and sparsity-based approaches have been developed for accelerated dynamic magnetic resonance imaging (dMRI), they all use low rankness or sparsity in input space, overlooking the intrinsic nonlinear correlation in most dMRI data. In this paper, we propose a kernel-based framework to allow nonlinear manifold models in reconstruction from sub-Nyquist data. Within this framework, many existing algorithms can be extended to kernel framework with nonlinear models. In particular, we have developed a novel algorithm with a kernel-based low-rank model generalizing the conventional low rank formulation. The algorithm consists of manifold learning using kernel, low rank enforcement in feature space, and preimaging with data consistency. Extensive simulation and experiment results show that the proposed method surpasses the conventional low-rank-modeled approaches for dMRI.

  16. Fragmentation-based QM/MM simulations: length dependence of chain dynamics and hydrogen bonding of polyethylene oxide and polyethylene in aqueous solutions.

    PubMed

    Li, Hui; Li, Wei; Li, Shuhua; Ma, Jing

    2008-06-12

    Molecular fragmentation quantum mechanics (QM) calculations have been combined with molecular mechanics (MM) to construct the fragmentation QM/MM method for simulations of dilute solutions of macromolecules. We adopt the electrostatic embedding QM/MM model, where the low-cost generalized energy-based fragmentation calculations are employed for the QM part. Conformation energy calculations, geometry optimizations, and Born-Oppenheimer molecular dynamics simulations of poly(ethylene oxide), PEO(n) (n = 6-20), and polyethylene, PE(n) (n = 9-30), in aqueous solution have been performed within the framework of both fragmentation and conventional QM/MM methods. The intermolecular hydrogen bonding and chain configurations obtained from the fragmentation QM/MM simulations are consistent with the conventional QM/MM method. The length dependence of chain conformations and dynamics of PEO and PE oligomers in aqueous solutions is also investigated through the fragmentation QM/MM molecular dynamics simulations.

  17. SPOKES: An end-to-end simulation facility for spectroscopic cosmological surveys

    DOE PAGES

    Nord, B.; Amara, A.; Refregier, A.; ...

    2016-03-03

    The nature of dark matter, dark energy and large-scale gravity poses some of the most pressing questions in cosmology today. These fundamental questions require highly precise measurements, and a number of wide-field spectroscopic survey instruments are being designed to meet this requirement. A key component in these experiments is the development of a simulation tool to forecast science performance, define requirement flow-downs, optimize implementation, demonstrate feasibility, and prepare for exploitation. We present SPOKES (SPectrOscopic KEn Simulation), an end-to-end simulation facility for spectroscopic cosmological surveys designed to address this challenge. SPOKES is based on an integrated infrastructure, modular function organization, coherent data handling and fast data access. These key features allow reproducibility of pipeline runs, enable ease of use and provide flexibility to update functions within the pipeline. The cyclic nature of the pipeline offers the possibility to make the science output an efficient measure for design optimization and feasibility testing. We present the architecture, first science, and computational performance results of the simulation pipeline. The framework is general, but for the benchmark tests, we use the Dark Energy Spectrometer (DESpec), one of the early concepts for the upcoming project, the Dark Energy Spectroscopic Instrument (DESI). As a result, we discuss how the SPOKES framework enables a rigorous process to optimize and exploit spectroscopic survey experiments in order to derive high-precision cosmological measurements optimally.

  18. Redundancy Maintenance and Garbage Collection Strategies in Peer-to-Peer Storage Systems

    NASA Astrophysics Data System (ADS)

    Liu, Xin; Datta, Anwitaman

    Maintaining redundancy in P2P storage systems is essential for reliability guarantees. Numerous P2P storage system maintenance algorithms have been proposed in recent years, each supposedly improving upon previous approaches. We perform a systematic comparative study of the various strategies, also taking into account the influence of different garbage collection mechanisms, an issue not studied so far. Our experiments show that while some strategies generally perform better than others, there is no universally best strategy, and their relative superiority depends on various other design choices as well as the specific evaluation criterion. Our results can be used by P2P storage system designers to make prudent design decisions, and our exploration of the various evaluation metrics also provides a more comprehensive framework to compare algorithms for P2P storage systems. While there are numerous network simulators developed specifically to simulate peer-to-peer networks, there existed no P2P storage simulator; a byproduct of this work is a generic, modular P2P storage system simulator which we provide as open source. Different redundancy, maintenance, placement, and garbage-collection policies, as well as churn scenarios, can easily be integrated into the simulator to try out new schemes in the future, and it provides a common framework to compare (future) P2P storage system designs - something which has not been possible so far.
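
    The skeleton below illustrates the plug-in structure such a simulator implies, with the maintenance policy as a swappable strategy object; class names, parameters, and the churn model are invented for illustration and are not the released simulator's API.

      import random

      class EagerRepair:
          def maintain(self, replicas, target):
              while len(replicas) < target:        # repair as soon as any copy is lost
                  replicas.append(object())

      class LazyRepair:
          def __init__(self, floor):
              self.floor = floor
          def maintain(self, replicas, target):
              if len(replicas) <= self.floor:      # repair only below a threshold
                  while len(replicas) < target:
                      replicas.append(object())

      def simulate(policy, target=5, churn=0.3, rounds=1000, seed=1):
          random.seed(seed)
          replicas, lost = [object() for _ in range(target)], 0
          for _ in range(rounds):
              replicas = [r for r in replicas if random.random() > churn]
              if not replicas:                     # data loss: count and reinsert
                  lost += 1
                  replicas = [object() for _ in range(target)]
              policy.maintain(replicas, target)
          return lost

      for policy in (EagerRepair(), LazyRepair(floor=2)):
          print(type(policy).__name__, simulate(policy), "loss events")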

  19. Protection motivation theory and social distancing behaviour in response to a simulated infectious disease epidemic.

    PubMed

    Williams, Lynn; Rasmussen, Susan; Kleczkowski, Adam; Maharaj, Savi; Cairns, Nicole

    2015-01-01

    Epidemics of respiratory infectious disease remain one of the most serious health risks facing the population. Non-pharmaceutical interventions (e.g. hand-washing or wearing face masks) can have a significant impact on the course of an infectious disease epidemic. The current study investigated whether protection motivation theory (PMT) is a useful framework for understanding social distancing behaviour (i.e. the tendency to reduce social contacts) in response to a simulated infectious disease epidemic. There were 230 participants (109 males, 121 females, mean age 32.4 years) from the general population who completed self-report measures assessing the components of PMT. In addition, participants completed a computer game which simulated an infectious disease epidemic in order to provide a measure of social distancing behaviour. The regression analyses revealed that none of the PMT variables were significant predictors of social distancing behaviour during the simulation task. However, fear (β = .218, p < .001), response efficacy (β = .175, p < .01) and self-efficacy (β = .251, p < .001) were all significant predictors of intention to engage in social distancing behaviour. Overall, the PMT variables (and demographic factors) explain 21.2% of the variance in intention. The findings demonstrated that PMT was a useful framework for understanding intention to engage in social distancing behaviour, but not actual behaviour during the simulated epidemic. These findings may reflect an intention-behaviour gap in relation to social distancing behaviour.

  20. A Generalized Hybrid Multiscale Modeling Approach for Flow and Reactive Transport in Porous Media

    NASA Astrophysics Data System (ADS)

    Yang, X.; Meng, X.; Tang, Y. H.; Guo, Z.; Karniadakis, G. E.

    2017-12-01

    Using emerging understanding of biological and environmental processes at fundamental scales to advance predictions of the larger system behavior requires the development of multiscale approaches, and there is strong interest in coupling models at different scales together in a hybrid multiscale simulation framework. A limited number of hybrid multiscale simulation methods have been developed for subsurface applications, mostly using application-specific approaches for model coupling. The proposed generalized hybrid multiscale approach is designed with minimal intrusiveness to the at-scale simulators (pre-selected) and provides a set of lightweight C++ scripts to manage a complex multiscale workflow utilizing a concurrent coupling approach. The workflow includes at-scale simulators (using the lattice-Boltzmann method, LBM, at the pore and Darcy scale, respectively), scripts for boundary treatment (coupling and kriging), and a multiscale universal interface (MUI) for data exchange. The current study aims to apply the generalized hybrid multiscale modeling approach to couple pore- and Darcy-scale models for flow and mixing-controlled reaction with precipitation/dissolution in heterogeneous porous media. The model domain is packed heterogeneously, so that the mixing front geometry is more complex and not known a priori. To address those challenges, the generalized hybrid multiscale modeling approach is further developed to 1) adaptively define the locations of pore-scale subdomains, 2) provide a suite of physical boundary coupling schemes, and 3) consider the dynamic change of the pore structures due to mineral precipitation/dissolution. The results are validated and evaluated by comparing with single-scale simulations in terms of velocities, reactive concentrations, and computing cost.

  1. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NASA Astrophysics Data System (ADS)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D. A.; Brogaard, Sara; van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-11-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS (Lund-Potsdam-Jena General Ecosystem Simulator) using socio-economic data from the SSPs and climate data from the RCPs (representative concentration pathways). The simulated range of global cropland is 893-2380 Mha in 2100 (± 1 standard deviation), with the main uncertainties arising from differences in the socio-economic conditions prescribed by the SSP scenarios and the assumptions that underpin the translation of qualitative SSP storylines into quantitative model input parameters. Uncertainties in the assumptions for population growth, technological change and cropland degradation were found to be the most important for global cropland, while uncertainty in food consumption had less influence on the results. The uncertainties arising from climate variability and the differences between climate change scenarios do not strongly affect the range of global cropland futures. Some overlap occurred across all of the conditional probabilistic futures, except for those based on SSP3. We conclude that completely different socio-economic and climate change futures, although sharing low to medium population development, can result in very similar cropland areas on the aggregated global scale.

  2. Design of the HELICS High-Performance Transmission-Distribution-Communication-Market Co-Simulation Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan S; Krishnamurthy, Dheepak; Top, Philip

    This paper describes the design rationale for a new cyber-physical-energy co-simulation framework for electric power systems. This new framework will support very large-scale (100,000+ federates) co-simulations with off-the-shelf power-systems, communication, and end-use models. Other key features include cross-platform operating system support, integration of both event-driven (e.g. packetized communication) and time-series (e.g. power flow) simulation, and the ability to co-iterate among federates to ensure model convergence at each time step. After describing requirements, we begin by evaluating existing co-simulation frameworks, including HLA and FMI, and conclude that none provide the required features. Then we describe the design for the new layered co-simulation architecture.
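
    Of the listed features, co-iteration is the least self-explanatory; it is a fixed-point exchange between federates repeated within one time step until the shared boundary variables stop changing. The sketch below shows the bare pattern with two toy models; it is not the HELICS API.

      # Co-iteration within a single time step: exchange boundary values
      # between two toy federates until they agree to a tolerance.

      def transmission(dist_load):
          return 1.02 - 0.05 * dist_load       # feeder-head voltage (toy model)

      def distribution(voltage):
          return 0.8 / voltage                 # constant-power load (toy model)

      def cosimulate_step(tol=1e-8, max_iter=50):
          load = 0.8                           # initial guess for the coupled variable
          for it in range(max_iter):
              voltage = transmission(load)
              new_load = distribution(voltage)
              if abs(new_load - load) < tol:   # converged: advance to next time step
                  return voltage, new_load, it + 1
              load = new_load
          raise RuntimeError("co-iteration did not converge")

      v, p, iters = cosimulate_step()
      print(f"converged in {iters} iterations: V={v:.4f}, P={p:.4f}")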

  3. Design of the HELICS High-Performance Transmission-Distribution-Communication-Market Co-Simulation Framework: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan S; Krishnamurthy, Dheepak; Top, Philip

    This paper describes the design rationale for a new cyber-physical-energy co-simulation framework for electric power systems. This new framework will support very large-scale (100,000+ federates) co-simulations with off-the-shelf power-systems, communication, and end-use models. Other key features include cross-platform operating system support, integration of both event-driven (e.g. packetized communication) and time-series (e.g. power flow) simulation, and the ability to co-iterate among federates to ensure model convergence at each time step. After describing requirements, we begin by evaluating existing co-simulation frameworks, including HLA and FMI, and conclude that none provide the required features. Then we describe the design for the new layered co-simulation architecture.

  4. Interleaved EPI diffusion imaging using SPIRiT-based reconstruction with virtual coil compression.

    PubMed

    Dong, Zijing; Wang, Fuyixue; Ma, Xiaodong; Zhang, Zhe; Dai, Erpeng; Yuan, Chun; Guo, Hua

    2018-03-01

    To develop a novel diffusion imaging reconstruction framework based on iterative self-consistent parallel imaging reconstruction (SPIRiT) for multishot interleaved echo planar imaging (iEPI), with computation acceleration by virtual coil compression. As a general approach for autocalibrating parallel imaging, SPIRiT improves the performance of traditional generalized autocalibrating partially parallel acquisitions (GRAPPA) methods in that the formulation with self-consistency is better conditioned, suggesting SPIRiT to be a better candidate in k-space-based reconstruction. In this study, a general SPIRiT framework is adopted to incorporate both coil sensitivity and phase variation information as virtual coils and then is applied to 2D navigated iEPI diffusion imaging. To reduce the reconstruction time when using a large number of coils and shots, a novel shot-coil compression method is proposed for computation acceleration in Cartesian sampling. Simulations and in vivo experiments were conducted to evaluate the performance of the proposed method. Compared with the conventional coil compression, the shot-coil compression achieved higher compression rates with reduced errors. The simulation and in vivo experiments demonstrate that the SPIRiT-based reconstruction outperformed the existing method, realigned GRAPPA, and provided superior images with reduced artifacts. The SPIRiT-based reconstruction with virtual coil compression is a reliable method for high-resolution iEPI diffusion imaging. Magn Reson Med 79:1525-1531, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  5. A multiscale modeling framework model (superparameterized CAM5) with a higher-order turbulence closure: Model description and low-cloud simulations

    DOE PAGES

    Wang, Minghuai; Larson, Vincent E.; Ghan, Steven; ...

    2015-04-18

    In this study, a higher-order turbulence closure scheme, called Cloud Layers Unified by Binormals (CLUBB), is implemented into a Multi-scale Modeling Framework (MMF) model to improve low cloud simulations. The performance of CLUBB in MMF simulations with two different microphysics configurations (one-moment cloud microphysics without aerosol treatment and two-moment cloud microphysics coupled with aerosol treatment) is evaluated against observations and further compared with results from the Community Atmosphere Model, Version 5 (CAM5) with conventional cloud parameterizations. CLUBB is found to improve low cloud simulations in the MMF, and the improvement is particularly evident in the stratocumulus-to-cumulus transition regions. Compared to the single-moment cloud microphysics, CLUBB with two-moment microphysics produces clouds that are closer to the coast, and agrees better with observations. In the stratocumulus-to-cumulus transition regions, CLUBB with two-moment cloud microphysics produces shortwave cloud forcing in better agreement with observations, while CLUBB with single-moment cloud microphysics overestimates shortwave cloud forcing. CLUBB is further found to produce quantitatively similar improvements in the MMF and CAM5, with slightly better performance in the MMF simulations (e.g., MMF with CLUBB generally produces low clouds that are closer to the coast than CAM5 with CLUBB). As a result, improved low cloud simulations in MMF make it an even more attractive tool for studying aerosol-cloud-precipitation interactions.

  6. Developing a Novel Parameter Estimation Method for Agent-Based Model in Immune System Simulation under the Framework of History Matching: A Case Study on Influenza A Virus Infection

    PubMed Central

    Li, Tingting; Cheng, Zhengguo; Zhang, Le

    2017-01-01

    Since they provide a natural and flexible description of the nonlinear dynamic behavior of complex systems, agent-based models (ABM) have been commonly used for immune system simulation. However, it is crucial for an ABM to obtain an appropriate estimate of the key model parameters by incorporating experimental data. In this paper, a systematic procedure for immune system simulation that integrates the ABM and a regression method under the framework of history matching is developed. A novel parameter estimation method that incorporates the experimental data for the simulator ABM during the procedure is proposed. First, we employ the ABM as a simulator to simulate the immune system. Then, a dimension-reduced generalized additive model (GAM) is trained as a statistical regression model on the input and output data of the ABM and plays the role of an emulator during history matching. Next, we reduce the input parameter space by introducing an implausibility measure to discard implausible input values. Finally, the model parameter estimates are obtained using the particle swarm optimization algorithm (PSO) by fitting the experimental data over the non-implausible input values. A real Influenza A Virus (IAV) data set is employed to demonstrate the performance of the proposed method, and the results show that it not only has good fitting and prediction accuracy but also favorable computational efficiency. PMID:29194393
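
    The implausibility test at the heart of the history-matching step has a standard form: a candidate parameter vector is discarded when the emulator's prediction sits too many standard deviations from the observation. The sketch below uses a scalar output and an arbitrary stand-in for the trained GAM emulator; everything named here is hypothetical.

      import numpy as np

      def emulator(theta):
          """Stand-in for the trained GAM: returns (mean, emulator variance)."""
          return 3.0 * theta[0] - theta[1] ** 2, 0.05

      z_obs, var_obs = 2.5, 0.1           # observation and its variance (illustrative)
      threshold = 3.0                     # conventional 3-sigma cutoff

      candidates = np.random.uniform(0, 2, size=(5000, 2))
      kept = [theta for theta in candidates
              if abs(emulator(theta)[0] - z_obs)
              / np.sqrt(emulator(theta)[1] + var_obs) <= threshold]
      print(f"{len(kept)} of {len(candidates)} candidates are non-implausible")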

  7. Developing a Novel Parameter Estimation Method for Agent-Based Model in Immune System Simulation under the Framework of History Matching: A Case Study on Influenza A Virus Infection.

    PubMed

    Li, Tingting; Cheng, Zhengguo; Zhang, Le

    2017-12-01

    Since they provide a natural and flexible description of the nonlinear dynamic behavior of complex systems, agent-based models (ABM) have been commonly used for immune system simulation. However, it is crucial for an ABM to obtain an appropriate estimate of the key model parameters by incorporating experimental data. In this paper, a systematic procedure for immune system simulation that integrates the ABM and a regression method under the framework of history matching is developed. A novel parameter estimation method that incorporates the experimental data for the simulator ABM during the procedure is proposed. First, we employ the ABM as a simulator to simulate the immune system. Then, a dimension-reduced generalized additive model (GAM) is trained as a statistical regression model on the input and output data of the ABM and plays the role of an emulator during history matching. Next, we reduce the input parameter space by introducing an implausibility measure to discard implausible input values. Finally, the model parameter estimates are obtained using the particle swarm optimization algorithm (PSO) by fitting the experimental data over the non-implausible input values. A real Influenza A Virus (IAV) data set is employed to demonstrate the performance of the proposed method, and the results show that it not only has good fitting and prediction accuracy but also favorable computational efficiency.

  8. Breathing pulses in singularly perturbed reaction-diffusion systems

    NASA Astrophysics Data System (ADS)

    Veerman, Frits

    2015-07-01

    The weakly nonlinear stability of pulses in general singularly perturbed reaction-diffusion systems near a Hopf bifurcation is determined using a centre manifold expansion. A general framework to obtain leading order expressions for the (Hopf) centre manifold expansion for scale separated, localised structures is presented. Using the scale separated structure of the underlying pulse, directly calculable expressions for the Hopf normal form coefficients are obtained in terms of solutions to classical Sturm-Liouville problems. The developed theory is used to establish the existence of breathing pulses in a slowly nonlinear Gierer-Meinhardt system, and is confirmed by direct numerical simulation.

  9. Local Spatial Analysis and Dynamic Simulation of Childhood Obesity and Neighbourhood Walkability in a Major Canadian City.

    PubMed

    Shahid, Rizwan; Bertazzon, Stefania

    2015-01-01

    Body weight is an important indicator of current and future health and it is even more critical in children, who are tomorrow's adults. This paper analyzes the relationship between childhood obesity and neighbourhood walkability in Calgary, Canada. A multivariate analytical framework recognizes that childhood obesity is also associated with many factors, including socioeconomic status, foodscapes, and environmental factors, as well as less measurable factors, such as individual preferences, that could not be included in this analysis. In contrast with more conventional global analysis, this research employs localized analysis and assesses need-based interventions. A one-size-fits-all strategy may not effectively control obesity rates, since each neighbourhood has unique characteristics that need to be addressed individually. This paper presents an innovative framework combining local analysis with simulation modeling to analyze childhood obesity. Spatial models generally do not deal with simulation over time, making it cumbersome for health planners and policy makers to effectively design and implement interventions and to quantify their impact over time. This research fills this gap by integrating geographically weighted regression (GWR), which identifies vulnerable neighbourhoods and critical factors for childhood obesity, with simulation modeling, which evaluates the impact of the suggested interventions on the targeted neighbourhoods. Neighbourhood walkability was chosen as a potential target for localized interventions, owing to the crucial role of walking in developing a healthy lifestyle, as well as because increasing walkability is relatively more feasible and less expensive than modifying other factors, such as income. Simulation results suggest that local walkability interventions can achieve measurable declines in childhood obesity rates. The results are encouraging, as improvements are likely to compound over time. The results demonstrate that the integration of GWR and simulation modeling is effective, and the proposed framework can assist in designing local interventions to control and prevent childhood obesity.

  10. A Software Framework for Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.

    2008-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center has a long history in developing simulations of experimental fixed-wing aircraft from gliders to suborbital vehicles on platforms ranging from desktop simulators to pilot-in-the-loop/aircraft-in-the-loop simulators. Regardless of the aircraft or simulator hardware, much of the software framework is common to all NASA Dryden simulators. Some of this software has withstood the test of time, but in recent years the push toward high-fidelity user-friendly simulations has resulted in some significant changes. This report presents an overview of the current NASA Dryden simulation software framework and capabilities with an emphasis on the new features that have permitted NASA to develop more capable simulations while maintaining the same staffing levels.

  11. Impact of Functionally Graded Cylinders: Theory

    NASA Technical Reports Server (NTRS)

    Aboudi, Jacob; Pindera, Marek-Jerzy; Arnold, S. M. (Technical Monitor)

    2001-01-01

    This final report summarizes the work funded under the Grant NAG3-2411 during the 04/05/2000-04/04/2001 period. The objective of this one-year project was to generalize the theoretical framework of the two-dimensional higher-order theory for the analysis of cylindrical functionally graded materials/structural components employed in advanced aircraft engines developed under past NASA Glenn funding. The completed generalization significantly broadens the theory's range of applicability through the incorporation of dynamic impact loading capability into its framework. Thus, it makes possible the assessment of the effect of damage due to fuel impurities, or the presence of submicron-level debris, on the life of functionally graded structural components. Applications involving advanced turbine blades and structural components for the reusable-launch vehicle (RLV) currently under development will benefit from the completed work. The theory's predictive capability is demonstrated through a numerical simulation of a one-dimensional wave propagation set up by an impulse load in a layered half-plane. Full benefit of the completed generalization of the higher-order theory described in this report will be realized upon the development of a related computer code.

  12. Wavelet neural networks: a practical guide.

    PubMed

    Alexandridis, Antonios K; Zapranis, Achilleas D

    2013-06-01

    Wavelet networks (WNs) are a new class of networks which have been used with great success in a wide range of applications. However, a generally accepted framework for applying WNs is missing from the literature. In this study, we present a complete statistical model identification framework in order to apply WNs in various applications. The following subjects were thoroughly examined: the structure of a WN, training methods, initialization algorithms, variable significance and variable selection algorithms, model selection methods and finally methods to construct confidence and prediction intervals. In addition, the complexity of each algorithm is discussed. Our proposed framework was tested in two simulated cases, in one chaotic time series described by the Mackey-Glass equation and in three real datasets described by daily temperatures in Berlin, daily wind speeds in New York and breast cancer classification. Our results have shown that the proposed algorithms produce stable and robust results, indicating that our proposed framework can be applied in various applications. Copyright © 2013 Elsevier Ltd. All rights reserved.
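
    A minimal WN can be written down directly: one hidden layer of Mexican-hat wavelons with fixed translations and dilations, and output weights fitted by least squares. This sketch omits the initialization, variable selection, and interval-construction procedures the paper examines; all values are illustrative.

      import numpy as np

      def mexican_hat(u):
          return (1 - u**2) * np.exp(-u**2 / 2)   # a common mother wavelet

      x = np.linspace(-3, 3, 300)
      y = np.sin(2 * x) + 0.1 * np.random.randn(300)   # noisy target function

      translations = np.linspace(-3, 3, 12)            # fixed wavelon centres
      dilation = 0.5
      H = mexican_hat((x[:, None] - translations[None, :]) / dilation)
      H = np.hstack([H, np.ones((len(x), 1))])         # append a bias column

      w, *_ = np.linalg.lstsq(H, y, rcond=None)        # fit output weights only
      rmse = np.sqrt(np.mean((y - H @ w) ** 2))
      print(f"training RMSE: {rmse:.3f}")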

  13. Mobile Autonomous Sensing Unit (MASU): A Framework That Supports Distributed Pervasive Data Sensing

    PubMed Central

    Medina, Esunly; Lopez, David; Meseguer, Roc; Ochoa, Sergio F.; Royo, Dolors; Santos, Rodrigo

    2016-01-01

    Pervasive data sensing is a major issue that traverses various research areas and application domains. It makes it possible to identify people’s behaviour and patterns without overwhelming the monitored persons. Although there are many pervasive data sensing applications, they are typically focused on addressing specific problems in a single application domain, making them difficult to generalize or reuse. On the other hand, the platforms for supporting pervasive data sensing impose restrictions on the devices and operational environments that make them unsuitable for monitoring loosely-coupled or fully distributed work. To help address this challenge, this paper presents a framework that supports distributed pervasive data sensing in a generic way. Developers can use this framework to facilitate the implementation of their applications, thus reducing complexity and effort in such an activity. The framework was evaluated using simulations and also through an empirical test, and the obtained results indicate that it is useful to support such a sensing activity in loosely-coupled or fully distributed work scenarios. PMID:27409617

  14. A penalized framework for distributed lag non-linear models.

    PubMed

    Gasparrini, Antonio; Scheipl, Fabian; Armstrong, Ben; Kenward, Michael G

    2017-09-01

    Distributed lag non-linear models (DLNMs) are a modelling tool for describing potentially non-linear and delayed dependencies. Here, we illustrate an extension of the DLNM framework through the use of penalized splines within generalized additive models (GAM). This extension offers built-in model selection procedures and the possibility of accommodating assumptions on the shape of the lag structure through specific penalties. In addition, this framework includes, as special cases, simpler models previously proposed for linear relationships (DLMs). Alternative versions of penalized DLNMs are compared with each other and with the standard unpenalized version in a simulation study. Results show that this penalized extension to the DLNM class provides greater flexibility and improved inferential properties. The framework exploits recent theoretical developments of GAMs and is implemented using efficient routines within freely available software. Real-data applications are illustrated through two reproducible examples in time series and survival analysis. © 2017 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.

  15. Why is a computational framework for motivational and metacognitive control needed?

    NASA Astrophysics Data System (ADS)

    Sun, Ron

    2018-01-01

    This paper discusses, in the context of computational modelling and simulation of cognition, the relevance of deeper structures in the control of behaviour. Such deeper structures include motivational control of behaviour, which provides underlying causes for actions, and also metacognitive control, which provides higher-order processes for monitoring and regulation. It is argued that such deeper structures are important and thus cannot be ignored in computational cognitive architectures. A general framework based on the Clarion cognitive architecture is outlined that emphasises the interaction amongst action selection, motivation, and metacognition. The upshot is that it is necessary to incorporate all essential processes; short of that, the understanding of cognition can only be incomplete.

  16. Off-diagonal Jacobian support for Nodal BCs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, John W.; Andrs, David; Gaston, Derek R.

    In this brief note, we describe the implementation of off-diagonal Jacobian computations for nodal boundary conditions in the Multiphysics Object Oriented Simulation Environment (MOOSE) [1] framework. There are presently a number of applications [2-5] based on the MOOSE framework that solve complicated physical systems of partial differential equations whose boundary conditions are often highly nonlinear. Accurately computing the on- and off-diagonal Jacobian and preconditioner entries associated with these constraints is crucial for enabling efficient numerical solvers in these applications. Two key ingredients are required for properly specifying the Jacobian contributions of nonlinear nodal boundary conditions in MOOSE and finite element codes in general: 1. The ability to zero out entire Jacobian matrix rows after \
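
    Although the record is cut off, the first ingredient is standard sparse-matrix surgery: the constrained row is cleared and rewritten with the derivatives of the boundary-condition residual, including off-diagonal entries for any coupled variables. The numpy sketch below uses a stand-in matrix and a hypothetical constraint u = g(v); it is not MOOSE code.

      import numpy as np

      n = 6
      J = np.random.rand(n, n)                  # stand-in assembled Jacobian

      bc_dof, coupled_dof = 0, 3                # node with BC u = g(v); v at dof 3
      def dg_dv(v):
          return 2.0 * v                        # derivative of a hypothetical g(v) = v**2

      v_current = 1.5
      J[bc_dof, :] = 0.0                        # 1) zero the entire Jacobian row
      J[bc_dof, bc_dof] = 1.0                   # 2) on-diagonal: d(u - g(v))/du
      J[bc_dof, coupled_dof] = -dg_dv(v_current)  # off-diagonal: d(u - g(v))/dv
      print(J[bc_dof])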

  17. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    The following reports are presented on this project:A first year progress report on: Development of a Dynamically Configurable,Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; A second year progress report on: Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.

  18. A new approach for turbulent simulations in complex geometries

    NASA Astrophysics Data System (ADS)

    Israel, Daniel M.

    Historically turbulence modeling has been sharply divided into Reynolds averaged Navier-Stokes (RANS), in which all the turbulent scales of motion are modeled, and large-eddy simulation (LES), in which only a portion of the turbulent spectrum is modeled. In recent years there have been numerous attempts to couple these two approaches either by patching RANS and LES calculations together (zonal methods) or by blending the two sets of equations. In order to create a proper bridging model, that is, a single set of equations which captures both RANS and LES like behavior, it is necessary to place both RANS and LES in a more general framework. The goal of the current work is threefold: to provide such a framework, to demonstrate how the Flow Simulation Methodology (FSM) fits into this framework, and to evaluate the strengths and weaknesses of the current version of the FSM. To do this, first a set of filtered Navier-Stokes (FNS) equations are introduced in terms of an arbitrary generalized filter. Additional exact equations are given for the second order moments and the generalized subfilter dissipation rate tensor. This is followed by a discussion of the role of implicit and explicit filters in turbulence modeling. The FSM is then described with particular attention to its role as a bridging model. In order to evaluate the method a specific implementation of the FSM approach is proposed. Simulations are presented using this model for the case of a separating flow over a "hump" with and without flow control. Careful attention is paid to error estimation, and, in particular, how using flow statistics and time series affects the error analysis. Both mean flow and Reynolds stress profiles are presented, as well as the phase averaged turbulent structures and wall pressure spectra. Using the phase averaged data it is possible to examine how the FSM partitions the energy between the coherent resolved scale motions, the random resolved scale fluctuations, and the subfilter quantities. The method proves to be qualitatively successful at reproducing large turbulent structures. However, like other hybrid methods, it has difficulty in the region where the model behavior transitions from RANS to LES. Consequently the phase averaged structures reproduce the experiments quite well, and the forcing does significantly reduce the length of the separated region. Nevertheless, the recirculation length is significantly too large for all the cases. Overall the current results demonstrate the promise of bridging models in general and the FSM in particular. However, current bridging techniques are still in their infancy. There is still important progress to be made and it is hoped that this work points out the more important avenues for exploration.
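
    In the usual notation, the generalized-filter starting point the abstract refers to can be written as

      \bar{u}_i(\mathbf{x},t) = \int G(\mathbf{x},\mathbf{x}')\, u_i(\mathbf{x}',t)\, d\mathbf{x}', \qquad
      \tau_{ij} = \overline{u_i u_j} - \bar{u}_i \bar{u}_j,

    where G is an arbitrary generalized filter and \tau_{ij} is the subfilter stress that any bridging model must close; RANS and LES are recovered as limiting choices of G.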

  19. Branch length estimation and divergence dating: estimates of error in Bayesian and maximum likelihood frameworks.

    PubMed

    Schwartz, Rachel S; Mueller, Rachel L

    2010-01-11

    Estimates of divergence dates between species improve our understanding of processes ranging from nucleotide substitution to speciation. Such estimates are frequently based on molecular genetic differences between species; therefore, they rely on accurate estimates of the number of such differences (i.e. substitutions per site, measured as branch length on phylogenies). We used simulations to determine the effects of dataset size, branch length heterogeneity, branch depth, and analytical framework on branch length estimation across a range of branch lengths. We then reanalyzed an empirical dataset for plethodontid salamanders to determine how inaccurate branch length estimation can affect estimates of divergence dates. The accuracy of branch length estimation varied with branch length, dataset size (both number of taxa and sites), branch length heterogeneity, branch depth, dataset complexity, and analytical framework. For simple phylogenies analyzed in a Bayesian framework, branches were increasingly underestimated as branch length increased; in a maximum likelihood framework, longer branch lengths were somewhat overestimated. Longer datasets improved estimates in both frameworks; however, when the number of taxa was increased, estimation accuracy for deeper branches was less than for tip branches. Increasing the complexity of the dataset produced more misestimated branches in a Bayesian framework; however, in an ML framework, more branches were estimated more accurately. Using ML branch length estimates to re-estimate plethodontid salamander divergence dates generally resulted in an increase in the estimated age of older nodes and a decrease in the estimated age of younger nodes. Branch lengths are misestimated in both statistical frameworks for simulations of simple datasets. However, for complex datasets, length estimates are quite accurate in ML (even for short datasets), whereas few branches are estimated accurately in a Bayesian framework. Our reanalysis of empirical data demonstrates the magnitude of effects of Bayesian branch length misestimation on divergence date estimates. Because the length of branches for empirical datasets can be estimated most reliably in an ML framework when branches are <1 substitution/site and datasets are ≥1 kb, we suggest that divergence date estimates using datasets, branch lengths, and/or analytical techniques that fall outside of these parameters should be interpreted with caution.

  20. Detector Simulations with DD4hep

    NASA Astrophysics Data System (ADS)

    Petrič, M.; Frank, M.; Gaede, F.; Lu, S.; Nikiforou, N.; Sailer, A.

    2017-10-01

    Detector description is a key component of detector design studies, test beam analyses, and most particle physics experiments, which require the simulation of more and more different detector geometries and event types. This paper describes DD4hep, which is an easy-to-use yet flexible and powerful detector description framework that can be used for detector simulation and also extended to specific needs for a particular working environment. Linear collider detector concepts ILD, SiD and CLICdp as well as detector development collaborations CALICE and FCal have chosen to adopt the DD4hep geometry framework and its DDG4 pathway to Geant4 as their core simulation and reconstruction tools. The DDG4 plugins suite includes a wide variety of input formats, provides access to the Geant4 particle gun or general particle source and allows for handling of Monte Carlo truth information, e.g. by linking hits to the primary particle that caused them, which is indispensable for performance and efficiency studies. An extendable array of segmentations and sensitive detectors allows the simulation of a wide variety of detector technologies. This paper shows how DD4hep allows users to perform complex Geant4 detector simulations without compiling a single line of additional code by providing a palette of sub-detector components that can be combined and configured via compact XML files. Simulation is controlled either completely via the command line or via simple Python steering files interpreted by a Python executable. It also discusses how additional plugins and extensions can be created to increase the functionality.

  1. A Simulation-as-a-Service Framework Facilitating WebGIS-Based Installation Planning

    NASA Astrophysics Data System (ADS)

    Zheng, Z.; Chang, Z. Y.; Fei, Y. F.

    2017-09-01

    Installation planning is constrained by both natural and social conditions, especially for spatially sparse but functionally connected facilities. Simulation is important for properly deploying facilities in space and configuring their functions so that they form a cohesive, mutually supportive system that meets users' operational needs. Based on a requirement analysis, we propose a framework that combines GIS and agent-based simulation to overcome the weaknesses of traditional GIS in temporal analysis and task simulation. In this framework, agent-based simulation runs as a service on the server, exposing basic simulation functions, such as scenario configuration, simulation control, and simulation data retrieval, to installation planners. At the same time, the simulation service is able to utilize various kinds of geoprocessing services in agents' process logic to make sophisticated spatial inferences and analyses. This simulation-as-a-service framework has many potential benefits, such as ease of use, on-demand availability, shared understanding, and improved performance. Finally, we present a preliminary implementation of this concept using the ArcGIS JavaScript API 4.0 and ArcGIS for Server, showing how trip planning and driving can be carried out by agents.

  2. In silico analysis of antibiotic-induced Clostridium difficile infection: Remediation techniques and biological adaptations

    PubMed Central

    Carlson, Jean M.

    2018-01-01

    In this paper we study antibiotic-induced C. difficile infection (CDI), caused by the toxin-producing C. difficile (CD), and implement clinically-inspired simulated treatments in a computational framework that synthesizes a generalized Lotka-Volterra (gLV) model with SIR modeling techniques. The gLV model uses parameters derived from an experimental mouse model, in which the mice are administered antibiotics and subsequently dosed with CD. We numerically identify which of the experimentally measured initial conditions are vulnerable to CD colonization, then formalize the notion of CD susceptibility analytically. We simulate fecal transplantation, a clinically successful treatment for CDI, and discover that both the transplant timing and transplant donor are relevant to the efficacy of the treatment, a result which has clinical implications. We incorporate two nongeneric yet dangerous attributes of CD into the gLV model, sporulation and antibiotic-resistant mutation, and for each identify relevant SIR techniques that describe the desired attribute. Finally, we rely on the results of our framework to analyze an experimental study of fecal transplants in mice, and are able to explain observed experimental results, validate our simulated results, and suggest model-motivated experiments. PMID:29451873
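
    A minimal sketch of the gLV backbone described above: the code integrates a toy three-species community and applies a crude "transplant" re-seeding. All rates, interactions, and doses are invented for illustration; they are not the paper's mouse-derived parameters, and the SIR-style extensions are omitted.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Toy 3-species community: two commensals plus C. difficile (index 2).
        r = np.array([0.8, 0.5, 0.3])        # intrinsic growth rates (assumed)
        A = np.array([[-1.0, -0.2, -0.4],    # interaction matrix (assumed);
                      [-0.1, -1.0, -0.3],    # negative diagonal = self-limitation
                      [-0.5, -0.6, -1.0]])

        def glv(t, x):
            """Generalized Lotka-Volterra: dx_i/dt = x_i * (r_i + sum_j A_ij x_j)."""
            return x * (r + A @ x)

        # Antibiotic-depleted initial state with a small CD inoculum.
        sol = solve_ivp(glv, (0.0, 50.0), [0.01, 0.01, 0.05])
        # Crude "fecal transplant": re-seed the commensals, then continue.
        x_post = sol.y[:, -1] + np.array([0.5, 0.5, 0.0])
        sol2 = solve_ivp(glv, (50.0, 100.0), x_post)
        print("CD abundance before/after transplant:", sol.y[2, -1], sol2.y[2, -1])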

  3. Modelling cell motility and chemotaxis with evolving surface finite elements

    PubMed Central

    Elliott, Charles M.; Stinner, Björn; Venkataraman, Chandrasekhar

    2012-01-01

    We present a mathematical and a computational framework for the modelling of cell motility. The cell membrane is represented by an evolving surface, with the movement of the cell determined by the interaction of various forces that act normal to the surface. We consider external forces such as those that may arise owing to inhomogeneities in the medium and a pressure that constrains the enclosed volume, as well as internal forces that arise from the reaction of the cell's surface to stretching and bending. We also consider a protrusive force associated with a reaction–diffusion system (RDS) posed on the cell membrane, with cell polarization modelled by this surface RDS. The computational method is based on an evolving surface finite-element method. The general method can account for the large deformations that arise in cell motility and allows the simulation of cell migration in three dimensions. We illustrate applications of the proposed modelling framework and numerical method by reporting on numerical simulations of a model for eukaryotic chemotaxis and a model for the persistent movement of keratocytes in two and three space dimensions. Movies of the simulated cells can be obtained from http://homepages.warwick.ac.uk/∼maskae/CV_Warwick/Chemotaxis.html. PMID:22675164

  4. Interface COMSOL-PHREEQC (iCP), an efficient numerical framework for the solution of coupled multiphysics and geochemistry

    NASA Astrophysics Data System (ADS)

    Nardi, Albert; Idiart, Andrés; Trinchero, Paolo; de Vries, Luis Manuel; Molinero, Jorge

    2014-08-01

    This paper presents the development, verification and application of an efficient interface, denoted as iCP, which couples two standalone simulation programs: the general purpose Finite Element framework COMSOL Multiphysics® and the geochemical simulator PHREEQC. The main goal of the interface is to maximize the synergies between the aforementioned codes, providing a numerical platform that can efficiently simulate a wide range of multiphysics problems coupled with geochemistry. iCP is written in Java and uses the IPhreeqc C++ dynamic library and the COMSOL Java-API. Given the large computational requirements of the aforementioned coupled models, special emphasis has been placed on numerical robustness and efficiency. To this end, the geochemical reactions are solved in parallel by balancing the computational load over multiple threads. First, a benchmark exercise is used to test the reliability of iCP regarding flow and reactive transport. Then, a large scale thermo-hydro-chemical (THC) problem is solved to show the code capabilities. The results of the verification exercise are successfully compared with those obtained using PHREEQC and the application case demonstrates the scalability of a large scale model, at least up to 32 threads.
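
    The essence of such a coupling is a sequential operator-split loop: the multiphysics code advances transport, then each cell's chemistry is solved independently and in parallel. The sketch below mimics that structure in plain Python with toy stand-ins for the COMSOL and PHREEQC calls; it illustrates the coupling pattern, not the iCP API.

        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        def transport_step(c, dt, dx, D=1e-9):
            """Explicit finite-difference diffusion step (stand-in for the
            COMSOL physics solve)."""
            c_new = c.copy()
            c_new[1:-1] += D * dt / dx**2 * (c[2:] - 2*c[1:-1] + c[:-2])
            return c_new

        def react_cell(c):
            """Stand-in for a PHREEQC equilibrium call on one cell."""
            return max(c - 0.1 * c**2, 0.0)   # toy reaction sink

        def chemistry_step(c, pool):
            # Each cell's geochemistry is independent, so it parallelizes
            # trivially across workers, as iCP does across threads.
            return np.array(list(pool.map(react_cell, c)))

        if __name__ == "__main__":
            c = np.linspace(1.0, 0.0, 50)     # initial concentration profile
            with ProcessPoolExecutor() as pool:
                for _ in range(100):          # sequential operator splitting
                    c = transport_step(c, dt=1.0, dx=0.01)
                    c = chemistry_step(c, pool)
            print("final mean concentration:", c.mean())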

  5. In silico analysis of antibiotic-induced Clostridium difficile infection: Remediation techniques and biological adaptations.

    PubMed

    Jones, Eric W; Carlson, Jean M

    2018-02-01

    In this paper we study antibiotic-induced C. difficile infection (CDI), caused by the toxin-producing C. difficile (CD), and implement clinically-inspired simulated treatments in a computational framework that synthesizes a generalized Lotka-Volterra (gLV) model with SIR modeling techniques. The gLV model uses parameters derived from an experimental mouse model, in which the mice are administered antibiotics and subsequently dosed with CD. We numerically identify which of the experimentally measured initial conditions are vulnerable to CD colonization, then formalize the notion of CD susceptibility analytically. We simulate fecal transplantation, a clinically successful treatment for CDI, and discover that both the transplant timing and transplant donor are relevant to the efficacy of the treatment, a result which has clinical implications. We incorporate two nongeneric yet dangerous attributes of CD into the gLV model, sporulation and antibiotic-resistant mutation, and for each identify relevant SIR techniques that describe the desired attribute. Finally, we rely on the results of our framework to analyze an experimental study of fecal transplants in mice, and are able to explain observed experimental results, validate our simulated results, and suggest model-motivated experiments.

  6. A Generalized Decision Framework Using Multi-objective Optimization for Water Resources Planning

    NASA Astrophysics Data System (ADS)

    Basdekas, L.; Stewart, N.; Triana, E.

    2013-12-01

    Colorado Springs Utilities (CSU) is currently engaged in an Integrated Water Resource Plan (IWRP) to address the complex planning scenarios, across multiple time scales, currently faced by CSU. The modeling framework developed for the IWRP uses a flexible, data-centered Decision Support System (DSS) with a MODSIM-based modeling system to represent the operation of the current CSU raw water system, coupled with a state-of-the-art multi-objective optimization algorithm. Three basic components are required for the framework, which can be implemented for planning horizons ranging from seasonal to interdecadal. First, a water resources system model is required that is capable of reasonable system simulation to resolve performance metrics at the appropriate temporal and spatial scales of interest. The system model should be an existing simulation model, or one developed during the planning process with stakeholders, so that 'buy-in' has already been achieved. Second, a hydrologic scenario tool capable of generating a range of plausible inflows for the planning period of interest is required; this may include paleo-informed or climate-change-informed sequences. Third, a multi-objective optimization model that can be wrapped around the system simulation model is required. The new generation of multi-objective optimization models does not require parameterization, which greatly reduces problem complexity. Bridging the gap between research and practice, we use a case study from CSU's planning process to demonstrate this framework with specific competing water management objectives. Careful formulation of objective functions, choice of decision variables, and system constraints will be discussed. Rather than treating results as theoretically Pareto-optimal in a planning process, we use the powerful multi-objective optimization models as tools to more efficiently and effectively move out of the inferior decision space. The use of this framework will help CSU evaluate tradeoffs in a continually changing world.
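
    The notion of "moving out of the inferior decision space" comes down to Pareto dominance. As a self-contained illustration (with invented cost/risk numbers, not CSU data), the sketch below filters a set of candidate plans down to its non-dominated front:

        import numpy as np

        def pareto_front(objs):
            """Return indices of non-dominated rows; all objectives minimized.
            Row i is dominated if some j is <= everywhere and < somewhere."""
            objs = np.asarray(objs)
            keep = []
            for i, row in enumerate(objs):
                dominated = any(
                    np.all(objs[j] <= row) and np.any(objs[j] < row)
                    for j in range(len(objs)) if j != i
                )
                if not dominated:
                    keep.append(i)
            return keep

        # Toy portfolio of plans: (cost, shortage risk) -- hypothetical numbers.
        plans = [(10, 0.30), (12, 0.10), (11, 0.10), (15, 0.05), (12, 0.40)]
        print([plans[i] for i in pareto_front(plans)])
        # -> keeps (10, 0.30), (11, 0.10), (15, 0.05); drops the inferior plans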

  7. Validation of educational assessments: a primer for simulation and beyond.

    PubMed

    Cook, David A; Hatala, Rose

    2016-01-01

    Simulation plays a vital role in health professions assessment. This review provides a primer on assessment validation for educators and education researchers. We focus on simulation-based assessment of health professionals, but the principles apply broadly to other assessment approaches and topics. Validation refers to the process of collecting validity evidence to evaluate the appropriateness of the interpretations, uses, and decisions based on assessment results. Contemporary frameworks view validity as a hypothesis, and validity evidence is collected to support or refute the validity hypothesis (i.e., that the proposed interpretations and decisions are defensible). In validation, the educator or researcher defines the proposed interpretations and decisions, identifies and prioritizes the most questionable assumptions in making these interpretations and decisions (the "interpretation-use argument"), empirically tests those assumptions using existing or newly-collected evidence, and then summarizes the evidence as a coherent "validity argument." A framework proposed by Messick identifies potential evidence sources: content, response process, internal structure, relationships with other variables, and consequences. Another framework proposed by Kane identifies key inferences in generating useful interpretations: scoring, generalization, extrapolation, and implications/decision. We propose an eight-step approach to validation that applies to either framework: Define the construct and proposed interpretation, make explicit the intended decision(s), define the interpretation-use argument and prioritize needed validity evidence, identify candidate instruments and/or create/adapt a new instrument, appraise existing evidence and collect new evidence as needed, keep track of practical issues, formulate the validity argument, and make a judgment: does the evidence support the intended use? Rigorous validation first prioritizes and then empirically evaluates key assumptions in the interpretation and use of assessment scores. Validation science would be improved by more explicit articulation and prioritization of the interpretation-use argument, greater use of formal validation frameworks, and more evidence informing the consequences and implications of assessment.

  8. A Bayesian framework to estimate diversification rates and their variation through time and space

    PubMed Central

    2011-01-01

    Background Patterns of species diversity are the result of speciation and extinction processes, and molecular phylogenetic data can provide valuable information to derive their variability through time and across clades. Bayesian Markov chain Monte Carlo methods offer a promising framework to incorporate phylogenetic uncertainty when estimating rates of diversification. Results We introduce a new approach to estimate diversification rates in a Bayesian framework over a distribution of trees under various constant and variable rate birth-death and pure-birth models, and test it on simulated phylogenies. Furthermore, speciation and extinction rates and their posterior credibility intervals can be estimated while accounting for non-random taxon sampling. The framework is particularly suitable for hypothesis testing using Bayes factors, as we demonstrate analyzing dated phylogenies of Chondrostoma (Cyprinidae) and Lupinus (Fabaceae). In addition, we develop a model that extends the rate estimation to a meta-analysis framework in which different data sets are combined in a single analysis to detect general temporal and spatial trends in diversification. Conclusions Our approach provides a flexible framework for the estimation of diversification parameters and hypothesis testing while simultaneously accounting for uncertainties in the divergence times and incomplete taxon sampling. PMID:22013891

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saenz, Juan A.; Chen, Qingshan; Ringler, Todd

    Recent work has shown that taking the thickness-weighted average (TWA) of the Boussinesq equations in buoyancy coordinates results in exact equations governing the prognostic residual mean flow where eddy–mean flow interactions appear in the horizontal momentum equations as the divergence of the Eliassen–Palm flux tensor (EPFT). It has been proposed that, given the mathematical tractability of the TWA equations, the physical interpretation of the EPFT, and its relation to potential vorticity fluxes, the TWA is an appropriate framework for modeling ocean circulation with parameterized eddies. The authors test the feasibility of this proposition and investigate the connections between the TWA framework and the conventional framework used in models, where Eulerian mean flow prognostic variables are solved for. Using the TWA framework as a starting point, this study explores the well-known connections between vertical transfer of horizontal momentum by eddy form drag and eddy overturning by the bolus velocity, used by Greatbatch and Lamb and Gent and McWilliams to parameterize eddies. After implementing the TWA framework in an ocean general circulation model, we verify our analysis by comparing the flows in an idealized Southern Ocean configuration simulated using the TWA and conventional frameworks with the same mesoscale eddy parameterization.

  10. A configurable distributed high-performance computing framework for satellite's TDI-CCD imaging simulation

    NASA Astrophysics Data System (ADS)

    Xue, Bo; Mao, Bingjing; Chen, Xiaomei; Ni, Guoqiang

    2010-11-01

    This paper presents a configurable distributed high-performance computing (HPC) framework for TDI-CCD imaging simulation. It uses the strategy pattern to accommodate multiple algorithms, which helps decrease simulation time at low expense. Imaging simulation for a satellite-mounted TDI-CCD comprises four processes: 1) degradation due to the atmosphere, 2) degradation due to the optical system, 3) degradation and re-sampling due to the TDI-CCD electronics, and 4) data integration. Processes 1) to 3) rely on diverse data-intensive algorithms such as FFT, convolution, and Lagrange interpolation, which demand substantial CPU power. Even on an Intel Xeon X5550 processor, a conventional serial approach takes more than 30 hours for a simulation whose resulting image is 1500 × 1462 pixels. A literature review found no mature distributed HPC framework in this field. We therefore developed a distributed computing framework for TDI-CCD imaging simulation based on WCF [1]; it uses a client/server (C/S) architecture and harvests idle CPU resources on the LAN, with the server pushing the tasks of processes 1) to 3) to that free computing capacity, yielding HPC at low cost. In a computing experiment with 4 symmetric nodes and 1 server, the framework reduced simulation time by about 74%, and adding further asymmetric nodes to the computing network decreased it correspondingly. In conclusion, this framework can provide effectively unbounded computing capacity, provided that the network and the task-management server can sustain it, and constitutes a new HPC solution for TDI-CCD imaging simulation and similar applications.
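
    The two ideas in this framework, swappable per-stage algorithms (the strategy pattern) and farming independent image tiles out to idle machines, can be sketched in a few lines. The stand-in stage functions and the use of local processes in place of LAN nodes are illustrative assumptions, not the WCF implementation:

        from concurrent.futures import ProcessPoolExecutor
        import numpy as np

        # Strategy pattern: each degradation stage is a swappable callable
        # (toy stand-ins for the atmosphere/optics/electronics models).
        def atmosphere(img):  return img * 0.9
        def optics(img):      return img + 0.01
        def electronics(img): return np.clip(img, 0.0, 1.0)

        PIPELINE = [atmosphere, optics, electronics]

        def simulate_tile(tile):
            """Run the full degradation chain on one image tile."""
            for stage in PIPELINE:
                tile = stage(tile)
            return tile

        if __name__ == "__main__":
            image = np.random.rand(1500, 1462)
            tiles = np.array_split(image, 8)       # split rows across workers
            with ProcessPoolExecutor() as pool:     # server -> free LAN nodes;
                parts = list(pool.map(simulate_tile, tiles))  # here: processes
            result = np.vstack(parts)
            print(result.shape)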

  11. Numerical simulation of the casting process of titanium removable partial denture frameworks.

    PubMed

    Wu, Menghuai; Wagner, Ingo; Sahm, Peter R; Augthun, Michael

    2002-03-01

    The objective of this work was to study filling incompleteness and porosity defects in titanium removable partial denture frameworks by means of numerical simulation. Two frameworks, one for the lower jaw and one for the upper jaw, were chosen for simulation on dentists' recommendation. The framework geometries were laser-digitized and imported into simulation software (MAGMASOFT). Both mold filling and solidification of the castings with different sprue designs (e.g. tree, ball, and runner-bar) were numerically calculated. Shrinkage porosity was quantitatively predicted by a feeding criterion, and the potential filling defects and gas pore sensitivity were estimated from the filling and solidification results. A satisfactory sprue design with process parameters was finally recommended for real casting trials (four replicas of each framework). All the frameworks were successfully cast. X-ray radiographic inspection found all the castings acceptably sound except for one case, in which gas bubbles were detected in the grasp region of the frame. It is concluded that numerical simulation helps to build understanding of the casting process and defect formation in titanium frameworks, and hence to minimize the risk of producing defective castings by improving the sprue design and process parameters.

  12. State-and-transition simulation models: a framework for forecasting landscape change

    USGS Publications Warehouse

    Daniel, Colin; Frid, Leonardo; Sleeter, Benjamin M.; Fortin, Marie-Josée

    2016-01-01

    Summary: A wide range of spatially explicit simulation models have been developed to forecast landscape dynamics, including models for projecting changes in both vegetation and land use. While these models have generally been developed as separate applications, each with a separate purpose and audience, they share many common features. We present a general framework, called a state-and-transition simulation model (STSM), which captures a number of these common features, accompanied by a software product, called ST-Sim, to build and run such models. The STSM method divides a landscape into a set of discrete spatial units and simulates the discrete state of each cell forward as a discrete-time inhomogeneous stochastic process. The method differs from a spatially interacting Markov chain in several important ways, including the ability to add discrete counters such as age and time-since-transition as state variables, to specify one-step transition rates as either probabilities or target areas, and to represent multiple types of transitions between pairs of states. We demonstrate the STSM method using a model of land-use/land-cover (LULC) change for the state of Hawai'i, USA. Processes represented in this example include expansion/contraction of agricultural lands, urbanization, wildfire, shrub encroachment into grassland and harvest of tree plantations; the model also projects shifts in moisture zones due to climate change. Key model output includes projections of the future spatial and temporal distribution of LULC classes and moisture zones across the landscape over the next 50 years. State-and-transition simulation models can be applied to a wide range of landscapes, including questions of both land-use change and vegetation dynamics. Because the method is inherently stochastic, it is well suited for characterizing uncertainty in model projections. When combined with the ST-Sim software, STSMs offer a simple yet powerful means for developing a wide range of models of landscape dynamics.
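
    A minimal sketch of the cell-level stochastic process: each cell carries a discrete state plus an age counter, and one-step transitions fire with fixed annual probabilities. The states, transition types, and probabilities here are invented for illustration; they are not ST-Sim's or the Hawai'i model's values.

        import numpy as np

        rng = np.random.default_rng(1)

        STATES = ["grassland", "shrubland", "agriculture"]
        P = {  # per-cell one-step transitions: (name, destination, probability)
            "grassland":   [("shrub_encroachment", "shrubland", 0.04),
                            ("ag_expansion", "agriculture", 0.01)],
            "shrubland":   [("wildfire", "grassland", 0.02)],
            "agriculture": [("ag_contraction", "grassland", 0.005)],
        }

        def step(state, age):
            """Advance one cell one year; age is a discrete counter variable."""
            for _name, dest, p in P[state]:
                if rng.random() < p:
                    return dest, 0    # transition resets time-since-transition
            return state, age + 1

        cells = [("grassland", 0)] * 1000
        for year in range(50):
            cells = [step(s, a) for s, a in cells]
        print({s: sum(1 for c in cells if c[0] == s) for s in STATES})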

  13. A unified stochastic formulation of dissipative quantum dynamics. I. Generalized hierarchical equations

    NASA Astrophysics Data System (ADS)

    Hsieh, Chang-Yu; Cao, Jianshu

    2018-01-01

    We extend a standard stochastic theory to study open quantum systems coupled to a generic quantum environment. We exemplify the general framework by studying a two-level quantum system coupled bilinearly to the three fundamental classes of non-interacting particles: bosons, fermions, and spins. In this unified stochastic approach, the generalized stochastic Liouville equation (SLE) formally captures the exact quantum dissipations when noise variables with appropriate statistics for different bath models are applied. Anharmonic effects of a non-Gaussian bath are precisely encoded in the bath multi-time correlation functions that noise variables have to satisfy. Starting from the SLE, we devise a family of generalized hierarchical equations by averaging out the noise variables and expand bath multi-time correlation functions in a complete basis of orthonormal functions. The general hierarchical equations constitute systems of linear equations that provide numerically exact simulations of quantum dynamics. For bosonic bath models, our general hierarchical equation of motion reduces exactly to an extended version of hierarchical equation of motion which allows efficient simulation for arbitrary spectral densities and temperature regimes. Similar efficiency and flexibility can be achieved for the fermionic bath models within our formalism. The spin bath models can be simulated with two complementary approaches in the present formalism. (I) They can be viewed as an example of non-Gaussian bath models and be directly handled with the general hierarchical equation approach given their multi-time correlation functions. (II) Alternatively, each bath spin can be first mapped onto a pair of fermions and be treated as fermionic environments within the present formalism.

  14. Object-oriented approach for gas turbine engine simulation

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.; Felder, James L.

    1995-01-01

    An object-oriented gas turbine engine simulation program was developed. This program is a prototype for a more complete, commercial-grade engine performance program now being proposed as part of the Numerical Propulsion System Simulator (NPSS). This report discusses architectural issues of this complex software system and the lessons learned from developing the prototype code. The prototype code is a fully functional, general purpose engine simulation program; however, only the component models necessary to model a transient compressor test rig have been written. The production system will be capable of steady state and transient modeling of almost any turbine engine configuration. Chief among the architectural considerations for this code was the framework in which the various software modules will interact. These modules include the equation solver, simulation code, data model, event handler, and user interface. Also documented in this report are the component-based design of the simulation module and the inter-component communication paradigm. Object class hierarchies for some of the code modules are given.

  15. A conceptual framework for using Doppler radar acquired atmospheric data for flight simulation

    NASA Technical Reports Server (NTRS)

    Campbell, W.

    1983-01-01

    A concept is presented which can permit turbulence simulation in the vicinity of microbursts. The method involves a large data base, but should be fast enough for use with flight simulators. The model permits any pilot to simulate any flight maneuver in any aircraft. The model simulates a wind field with three-component mean winds and three-component turbulent gusts, and gust variation over the body of an aircraft so that all aerodynamic loads and moments can be calculated. The time and space variation of mean winds and turbulent intensities associated with a particular atmospheric phenomenon such as a microburst is used in the model. In fact, Doppler radar data such as provided by JAWS is uniquely suited for use with the proposed model. The concept is completely general and is not restricted to microburst studies. Reentry and flight in terrestrial or planetary atmospheres could be realistically simulated if supporting data of sufficient resolution were available.

  16. An Object-Oriented Serial DSMC Simulation Package

    NASA Astrophysics Data System (ADS)

    Liu, Hongli; Cai, Chunpei

    2011-05-01

    A newly developed three-dimensional direct simulation Monte Carlo (DSMC) simulation package, named GRASP ("Generalized Rarefied gAs Simulation Package"), is reported in this paper. This package utilizes the concept of a simulation engine, many C++ features, and software design patterns. The package has an open architecture which can benefit further development and maintenance of the code. To reduce the engineering time for three-dimensional models, a hybrid grid scheme combined with a flexible C++ data structure is implemented in this package. This scheme utilizes a local data structure based on the computational cell to achieve high performance on workstation processors. This data structure allows the DSMC algorithm to be very efficiently parallelized with domain decomposition and provides much flexibility in terms of grid types. This package can utilize traditional structured, unstructured or hybrid grids within the framework of a single code to model arbitrarily complex geometries and to simulate rarefied gas flows. Benchmark test cases indicate that this package has satisfactory accuracy for complex rarefied gas flows.

  17. System Software Framework for System of Systems Avionics

    NASA Technical Reports Server (NTRS)

    Ferguson, Roscoe C.; Peterson, Benjamin L; Thompson, Hiram C.

    2005-01-01

    Project Constellation implements NASA's vision for space exploration to expand human presence in our solar system. The engineering focus of this project is developing a system of systems architecture. This architecture allows for the incremental development of the overall program: systems can be built and connected in a "Lego style" manner to generate configurations supporting various mission objectives. The development of the avionics or control systems of such a massive project will result in concurrent engineering. Also, each system will have software and the need to communicate with other (possibly heterogeneous) systems. Fortunately, this design problem has already been solved during the creation and evolution of systems such as the Internet and the Department of Defense's successful effort to standardize distributed simulation (now IEEE 1516). The solution relies on the use of a standard layered software framework and a communication protocol. A standard framework and communication protocol are suggested for the development and maintenance of Project Constellation systems. The ARINC 653 standard is a great start for such a common software framework. This paper proposes a common system software framework that uses the Real Time Publish/Subscribe protocol for framework-to-framework communication to extend ARINC 653. It is highly recommended that such a framework be established before development, as this is important for the success of concurrent engineering. The framework provides an infrastructure for general system services and is designed for flexibility to support a spiral development effort.
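
    A toy sketch of the publish/subscribe style the framework proposes for framework-to-framework communication; this is a generic topic bus in Python, not the Real Time Publish/Subscribe wire protocol or an ARINC 653 API:

        from collections import defaultdict
        from typing import Any, Callable

        class Bus:
            """Minimal topic-based publish/subscribe bus."""
            def __init__(self):
                self._subs = defaultdict(list)

            def subscribe(self, topic: str, handler: Callable[[Any], None]):
                self._subs[topic].append(handler)

            def publish(self, topic: str, payload: Any):
                for handler in self._subs[topic]:
                    handler(payload)

        bus = Bus()
        # A guidance system subscribes to nav state published by another system.
        bus.subscribe("nav/state", lambda msg: print("guidance received", msg))
        bus.publish("nav/state", {"alt_m": 120.0, "vel_mps": 7.6})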

  18. Identifying tacit strategies in aircraft maneuvers

    NASA Technical Reports Server (NTRS)

    Lewis, Charles M.; Heidorn, P. B.

    1991-01-01

    Two machine-learning methods are used to characterize the avoidance strategies used by skilled pilots in simulated aircraft encounters, and a general framework for the characterization of the strategic components of skilled behavior via qualitative representation of situations and responses is presented. Descriptions of pilot maneuvers that were 'conceptually equivalent' were ascertained by a concept-learning algorithm in conjunction with a classifier system that employed a genetic algorithm; satisficing and 'buggy' strategies were thereby revealed.

  19. Gay-Berne and electrostatic multipole based coarse-grain potential in implicit solvent

    NASA Astrophysics Data System (ADS)

    Wu, Johnny; Zhen, Xia; Shen, Hujun; Li, Guohui; Ren, Pengyu

    2011-10-01

    A general, transferable coarse-grain (CG) framework based on the Gay-Berne potential and electrostatic point multipole expansion is presented for polypeptide simulations. The solvent effect is described by the Generalized Kirkwood theory. The CG model is calibrated using the results of all-atom simulations of model compounds in solution. Instead of matching the overall effective forces produced by atomic models, the fundamental intermolecular forces such as electrostatic, repulsion-dispersion, and solvation are represented explicitly at a CG level. We demonstrate that the CG alanine dipeptide model is able to reproduce quantitatively the conformational energy of all-atom force fields in both gas and solution phases, including the electrostatic and solvation components. Replica exchange molecular dynamics and microsecond dynamic simulations of polyalanine of 5 and 12 residues reveal that the CG polyalanines fold into "alpha helix" and "beta sheet" structures. The 5-residue polyalanine displays a substantial increase in the "beta strand" fraction relative to the 12-residue polyalanine. The detailed conformational distribution is compared with those reported from recent all-atom simulations and experiments. The results suggest that the new coarse-graining approach presented in this study has the potential to offer both accuracy and efficiency for biomolecular modeling.

  20. Smoothed Particle Hydrodynamics: A consistent model for interfacial multiphase fluid flow simulations

    NASA Astrophysics Data System (ADS)

    Krimi, Abdelkader; Rezoug, Mehdi; Khelladi, Sofiane; Nogueira, Xesús; Deligant, Michael; Ramírez, Luis

    2018-04-01

    In this work, a consistent Smoothed Particle Hydrodynamics (SPH) model for the simulation of interfacial multiphase fluid flows is proposed. A modification to the Continuum Surface Stress (CSS) formulation [1] that enhances stability near the fluid interface is developed in the framework of the SPH method. A non-conservative first-order consistency operator is used to compute the divergence of the surface stress tensor. This formulation retains all the advantages of the one proposed by Adami et al. [2] and, in addition, can be applied to simulations with more than two fluid phases. Moreover, the generalized wall boundary conditions [3] are modified to be well adapted to multiphase fluid flows with different densities and viscosities, which allows the application of the SPH method to wall-bounded multiphase flows. We also present a particle redistribution strategy, an extension of the damping technique presented in [3], to smooth the initial transient phase of gravitational multiphase fluid flow simulations. Several computational tests are investigated to show the accuracy, convergence and applicability of the proposed SPH interfacial multiphase model.

  1. Improving Fidelity of Launch Vehicle Liftoff Acoustic Simulations

    NASA Technical Reports Server (NTRS)

    Liever, Peter; West, Jeff

    2016-01-01

    Launch vehicles experience high acoustic loads during ignition and liftoff, affected by the interaction of rocket plume generated acoustic waves with launch pad structures. Application of highly parallelized Computational Fluid Dynamics (CFD) analysis tools optimized for the NAS computer systems, such as the Loci/CHEM program, now enables simulation of time-accurate, turbulent, multi-species plume formation and interaction with launch pad geometry, capturing the generation of acoustic noise at the source regions in the plume shear layers and impingement regions. These CFD solvers are robust in capturing the acoustic fluctuations, but they are too dissipative to accurately resolve the propagation of the acoustic waves throughout the launch environment domain along the vehicle. A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed to improve such liftoff acoustic environment predictions. The framework combines the existing highly scalable NASA production CFD code, Loci/CHEM, with a high-order accurate discontinuous Galerkin (DG) solver, Loci/THRUST, developed in the same computational framework. Loci/THRUST employs a low-dissipation, high-order, unstructured DG method to accurately propagate acoustic waves away from the source regions across large distances. The DG solver is currently capable of solving up to 4th-order solutions for non-linear, conservative acoustic field propagation. Higher-order boundary conditions are implemented to accurately model the reflection and refraction of acoustic waves on launch pad components. The DG solver accepts generalized unstructured meshes, enabling efficient application of common mesh generation tools for CHEM and THRUST simulations. The DG solution is coupled with the CFD solution at interface boundaries placed near the CFD acoustic source regions; both simulations are executed simultaneously with coordinated boundary condition data exchange.

  2. On the stability and dynamics of stochastic spiking neuron models: Nonlinear Hawkes process and point process GLMs

    PubMed Central

    Truccolo, Wilson

    2017-01-01

    Point process generalized linear models (PP-GLMs) provide an important statistical framework for modeling spiking activity in single-neurons and neuronal networks. Stochastic stability is essential when sampling from these models, as done in computational neuroscience to analyze statistical properties of neuronal dynamics and in neuro-engineering to implement closed-loop applications. Here we show, however, that despite passing common goodness-of-fit tests, PP-GLMs estimated from data are often unstable, leading to divergent firing rates. The inclusion of absolute refractory periods is not a satisfactory solution since the activity then typically settles into unphysiological rates. To address these issues, we derive a framework for determining the existence and stability of fixed points of the expected conditional intensity function (CIF) for general PP-GLMs. Specifically, in nonlinear Hawkes PP-GLMs, the CIF is expressed as a function of the previous spike history and exogenous inputs. We use a mean-field quasi-renewal (QR) approximation that decomposes spike history effects into the contribution of the last spike and an average of the CIF over all spike histories prior to the last spike. Fixed points for stationary rates are derived as self-consistent solutions of integral equations. Bifurcation analysis and the number of fixed points predict that the original models can show stable, divergent, and metastable (fragile) dynamics. For fragile models, fluctuations of the single-neuron dynamics predict expected divergence times after which rates approach unphysiologically high values. This metric can be used to estimate the probability of rates to remain physiological for given time periods, e.g., for simulation purposes. We demonstrate the use of the stability framework using simulated single-neuron examples and neurophysiological recordings. Finally, we show how to adapt PP-GLM estimation procedures to guarantee model stability. Overall, our results provide a stability framework for data-driven PP-GLMs and shed new light on the stochastic dynamics of state-of-the-art statistical models of neuronal spiking activity. PMID:28234899
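
    The fixed-point idea reduces, in the simplest mean-field case, to solving a one-dimensional self-consistency equation for the stationary rate. The sketch below assumes an exponential-link model with an integrated history weight w (a simplification of the quasi-renewal treatment in the paper) and shows both a stable and a runaway regime:

        import numpy as np

        def mean_field_rate(b, w, rho0=1.0, iters=200):
            """Self-consistent stationary rate for an exponential-link model:
            rho = exp(b + w * rho), with w the integrated history kernel.
            Fixed-point iteration; blow-up mirrors the divergent regime."""
            rho = rho0
            for _ in range(iters):
                rho_new = np.exp(b + w * rho)
                if not np.isfinite(rho_new) or rho_new > 1e6:
                    return np.inf             # runaway (divergent) rates
                if abs(rho_new - rho) < 1e-12:
                    break
                rho = rho_new
            return rho

        print(mean_field_rate(b=0.5, w=-0.5))  # inhibitory history -> stable
        print(mean_field_rate(b=0.5, w=+0.9))  # excitatory history -> divergence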

  3. On the stability and dynamics of stochastic spiking neuron models: Nonlinear Hawkes process and point process GLMs.

    PubMed

    Gerhard, Felipe; Deger, Moritz; Truccolo, Wilson

    2017-02-01

    Point process generalized linear models (PP-GLMs) provide an important statistical framework for modeling spiking activity in single-neurons and neuronal networks. Stochastic stability is essential when sampling from these models, as done in computational neuroscience to analyze statistical properties of neuronal dynamics and in neuro-engineering to implement closed-loop applications. Here we show, however, that despite passing common goodness-of-fit tests, PP-GLMs estimated from data are often unstable, leading to divergent firing rates. The inclusion of absolute refractory periods is not a satisfactory solution since the activity then typically settles into unphysiological rates. To address these issues, we derive a framework for determining the existence and stability of fixed points of the expected conditional intensity function (CIF) for general PP-GLMs. Specifically, in nonlinear Hawkes PP-GLMs, the CIF is expressed as a function of the previous spike history and exogenous inputs. We use a mean-field quasi-renewal (QR) approximation that decomposes spike history effects into the contribution of the last spike and an average of the CIF over all spike histories prior to the last spike. Fixed points for stationary rates are derived as self-consistent solutions of integral equations. Bifurcation analysis and the number of fixed points predict that the original models can show stable, divergent, and metastable (fragile) dynamics. For fragile models, fluctuations of the single-neuron dynamics predict expected divergence times after which rates approach unphysiologically high values. This metric can be used to estimate the probability of rates to remain physiological for given time periods, e.g., for simulation purposes. We demonstrate the use of the stability framework using simulated single-neuron examples and neurophysiological recordings. Finally, we show how to adapt PP-GLM estimation procedures to guarantee model stability. Overall, our results provide a stability framework for data-driven PP-GLMs and shed new light on the stochastic dynamics of state-of-the-art statistical models of neuronal spiking activity.

  4. Optimization of parameter values for complex pulse sequences by simulated annealing: application to 3D MP-RAGE imaging of the brain.

    PubMed

    Epstein, F H; Mugler, J P; Brookeman, J R

    1994-02-01

    A number of pulse sequence techniques, including magnetization-prepared gradient echo (MP-GRE), segmented GRE, and hybrid RARE, employ a relatively large number of variable pulse sequence parameters and acquire the image data during a transient signal evolution. These sequences have recently been proposed and/or used for clinical applications in the brain, spine, liver, and coronary arteries. Thus, the need for a method of deriving optimal pulse sequence parameter values for this class of sequences now exists. Due to the complexity of these sequences, conventional optimization approaches, such as applying differential calculus to signal difference equations, are inadequate. We have developed a general framework for adapting the simulated annealing algorithm to pulse sequence parameter value optimization, and applied this framework to the specific case of optimizing the white matter-gray matter signal difference for a T1-weighted variable flip angle 3D MP-RAGE sequence. Using our algorithm, the values of 35 sequence parameters, including the magnetization-preparation RF pulse flip angle and delay time, 32 flip angles in the variable flip angle gradient-echo acquisition sequence, and the magnetization recovery time, were derived. Optimized 3D MP-RAGE achieved up to a 130% increase in white matter-gray matter signal difference compared with optimized 3D RF-spoiled FLASH with the same total acquisition time. The simulated annealing approach was effective at deriving optimal parameter values for a specific 3D MP-RAGE imaging objective, and may be useful for other imaging objectives and sequences in this general class.
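
    The general framework amounts to standard simulated annealing over a parameter vector. The sketch below uses a stand-in quadratic objective in place of the Bloch-simulated white matter-gray matter signal difference, and invented move and cooling settings:

        import math, random

        random.seed(0)

        def objective(params):
            """Stand-in for the simulated WM-GM signal difference (maximized);
            the real objective would come from pulse-sequence simulation."""
            return -sum((p - 0.3) ** 2 for p in params)

        def anneal(n_params=35, T0=1.0, cooling=0.995, steps=5000):
            x = [random.random() for _ in range(n_params)]
            f = objective(x)
            best, best_f, T = x[:], f, T0
            for _ in range(steps):
                y = x[:]
                i = random.randrange(n_params)
                y[i] = min(1.0, max(0.0, y[i] + random.gauss(0, 0.05)))
                fy = objective(y)
                # Accept uphill moves; downhill with Boltzmann probability.
                if fy > f or random.random() < math.exp((fy - f) / T):
                    x, f = y, fy
                    if f > best_f:
                        best, best_f = x[:], f
                T *= cooling
            return best, best_f

        _, score = anneal()
        print("best objective:", score)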

  5. Shock-driven fluid-structure interaction for civil design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Stephen L; Deiterding, Ralf

    The multiphysics fluid-structure interaction simulation of shock-loaded structures requires the dynamic coupling of a shock-capturing flow solver to a solid mechanics solver for large deformations. The Virtual Test Facility combines a Cartesian embedded boundary approach with dynamic mesh adaptation in a generic software framework of flow solvers using hydrodynamic finite volume upwind schemes that are coupled to various explicit finite element solid dynamics solvers (Deiterding et al., 2006). This paper gives a brief overview of the computational approach and presents first simulations that utilize the general purpose solid dynamics code DYNA3D for complex 3D structures of interest in civil engineering. Results from simulations of a reinforced column, highway bridge, multistory building, and nuclear reactor building are presented.

  6. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, volume 2, part 1. Appendix A: Software documentation

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.

  7. Fast Learning for Immersive Engagement in Energy Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian W; Bugbee, Bruce; Gruchalla, Kenny M

    The fast computation that is critical for immersive engagement with, and learning from, energy simulations would be furthered by a general method for creating rapidly computed, simplified versions of NREL's computation-intensive energy simulations. Created using machine learning techniques, these 'reduced-form' simulations can provide statistically sound estimates of the results of the full simulations at a fraction of the computational cost, with response times - typically less than one minute of wall-clock time - suitable for real-time human-in-the-loop design and analysis. Additionally, uncertainty quantification techniques can document the accuracy of the approximate models and their domain of validity. Approximation methods are applicable to a wide range of computational models, including supply-chain models, electric power grid simulations, and building models. These reduced-form representations cannot replace or re-implement existing simulations, but instead supplement them by enabling rapid scenario design and quality assurance for large sets of simulations. We present an overview of the framework and methods we have implemented for developing these reduced-form representations.
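
    A reduced-form representation is, in essence, a supervised model fit to input/output pairs from the full simulation. The sketch below uses a random forest on a toy stand-in simulator, with the spread across trees as a rough uncertainty signal; the model choice and settings are illustrative, not NREL's:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)

        # Stand-in for an expensive energy simulation: y = f(design params).
        def expensive_sim(x):
            return np.sin(3 * x[:, 0]) + x[:, 1] ** 2 \
                   + 0.1 * rng.standard_normal(len(x))

        X_train = rng.uniform(0, 1, size=(500, 2))  # one-time batch of full runs
        y_train = expensive_sim(X_train)

        surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
        surrogate.fit(X_train, y_train)              # the "reduced-form" model

        X_new = rng.uniform(0, 1, size=(5, 2))
        print(surrogate.predict(X_new))              # ~instant vs. a full run
        # Spread across trees gives a rough per-prediction uncertainty estimate.
        tree_preds = np.stack([t.predict(X_new) for t in surrogate.estimators_])
        print(tree_preds.std(axis=0))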

  8. Topographica: Building and Analyzing Map-Level Simulations from Python, C/C++, MATLAB, NEST, or NEURON Components

    PubMed Central

    Bednar, James A.

    2008-01-01

    Many neural regions are arranged into two-dimensional topographic maps, such as the retinotopic maps in mammalian visual cortex. Computational simulations have led to valuable insights about how cortical topography develops and functions, but further progress has been hindered by the lack of appropriate tools. It has been particularly difficult to bridge across levels of detail, because simulators are typically geared to a specific level, while interfacing between simulators has been a major technical challenge. In this paper, we show that the Python-based Topographica simulator makes it straightforward to build systems that cross levels of analysis, as well as providing a common framework for evaluating and comparing models implemented in other simulators. These results rely on the general-purpose abstractions around which Topographica is designed, along with the Python interfaces becoming available for many simulators. In particular, we present a detailed, general-purpose example of how to wrap an external spiking PyNN/NEST simulation as a Topographica component using only a dozen lines of Python code, making it possible to use any of the extensive input presentation, analysis, and plotting tools of Topographica. Additional examples show how to interface easily with models in other types of simulators. Researchers simulating topographic maps externally should consider using Topographica's analysis tools (such as preference map, receptive field, or tuning curve measurement) to compare results consistently, and for connecting models at different levels. This seamless interoperability will help neuroscientists and computational scientists to work together to understand how neurons in topographic maps organize and operate. PMID:19352443
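
    The wrapping pattern is the classic adapter: the host framework talks to a small interface, and the wrapper forwards calls to the external simulator. The method names below are hypothetical, chosen to show the shape of such a wrapper rather than Topographica's actual component API:

        class ExternalSimAdapter:
            """Adapter sketch: wrap an external simulator behind the small
            interface a host framework expects (all names hypothetical)."""
            def __init__(self, backend):
                self.backend = backend          # e.g. a PyNN/NEST network object

            def set_input(self, pattern):
                self.backend.inject(pattern)    # assumed backend method

            def step(self, dt):
                self.backend.run(dt)            # assumed backend method

            def activity(self):
                return self.backend.read_out()  # assumed backend method

        class ToyBackend:                        # stands in for the external sim
            def __init__(self): self.state = 0.0
            def inject(self, p): self.state += p
            def run(self, dt):   self.state *= 0.9 ** dt
            def read_out(self):  return self.state

        sim = ExternalSimAdapter(ToyBackend())
        sim.set_input(1.0); sim.step(1.0)
        print(sim.activity())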

  9. Composing problem solvers for simulation experimentation: a case study on steady state estimation.

    PubMed

    Leye, Stefan; Ewald, Roland; Uhrmacher, Adelinde M

    2014-01-01

    Simulation experiments involve various sub-tasks, e.g., parameter optimization, simulation execution, or output data analysis. Many algorithms can be applied to such tasks, but their performance depends on the given problem. Steady state estimation in systems biology is a typical example for this: several estimators have been proposed, each with its own (dis-)advantages. Experimenters, therefore, must choose from the available options, even though they may not be aware of the consequences. To support those users, we propose a general scheme to aggregate such algorithms to so-called synthetic problem solvers, which exploit algorithm differences to improve overall performance. Our approach subsumes various aggregation mechanisms, supports automatic configuration from training data (e.g., via ensemble learning or portfolio selection), and extends the plugin system of the open source modeling and simulation framework James II. We show the benefits of our approach by applying it to steady state estimation for cell-biological models.
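
    A minimal sketch of the aggregation idea: run several steady-state estimators and combine them, here by median vote, one of several possible aggregation mechanisms. The estimators and the trace are toy stand-ins, not the James II implementations:

        import statistics

        def estimator_mean(xs):   return sum(xs) / len(xs)
        def estimator_median(xs): return statistics.median(xs)
        def estimator_tail(xs):   # mean of the second half of the trace
            return sum(xs[len(xs)//2:]) / (len(xs) - len(xs)//2)

        PORTFOLIO = [estimator_mean, estimator_median, estimator_tail]

        def synthetic_solver(xs):
            """Aggregate several steady-state estimators (median vote), so
            the composite is more robust than any single estimator."""
            return statistics.median(e(xs) for e in PORTFOLIO)

        # Output of a toy simulation that settles near 10.
        trace = [0.0, 4.0, 7.0, 9.0, 9.8, 10.1, 9.9, 10.0, 10.2, 9.9]
        print(synthetic_solver(trace))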

  10. Death of a Simulated Pediatric Patient: Toward a More Robust Theoretical Framework.

    PubMed

    McBride, Mary E; Schinasi, Dana Aronson; Moga, Michael Alice; Tripathy, Shreepada; Calhoun, Aaron

    2017-12-01

    A theoretical framework was recently proposed that encapsulates learner responses to simulated death due to action or inaction in the pediatric context. This framework, however, was developed at an institution that allows simulated death and thus does not address the experience of those centers at which this technique is not used. To address this, we performed a parallel qualitative study with the intent of augmenting the initial framework. Using a constructivist grounded theory approach, we conducted focus groups with physicians and nurses who had experienced a simulated cardiac arrest, recruited via e-mail. Transcripts were analyzed by coders blinded to the original framework to generate a list of provisional themes that were iteratively refined. These themes were then compared with the themes from the original article and used to derive a consensus model that incorporated the most relevant features of each. Focus group data yielded 7 themes, six of them similar to those developed in the original framework. One important exception was noted, however: learners not exposed to patient death due to action or inaction often felt that the mannequin's survival was artificial. This additional theme was incorporated into a revised framework. The original framework addresses most aspects of learner reactions to simulated death. Our work suggests that adding the theme pertaining to the lack of realism that can be perceived when the mannequin is unexpectedly saved results in a more robust theoretical framework transferable to centers that do not allow mannequin death.

  11. A Simulation and Modeling Framework for Space Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  12. SSAGES: Software Suite for Advanced General Ensemble Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods, and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite.

  13. SSAGES: Software Suite for Advanced General Ensemble Simulations.

    PubMed

    Sidky, Hythem; Colón, Yamil J; Helfferich, Julian; Sikora, Benjamin J; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S; Reid, Daniel R; Sevgen, Emre; Thapar, Vikram; Webb, Michael A; Whitmer, Jonathan K; de Pablo, Juan J

    2018-01-28

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques-including adaptive biasing force, string methods, and forward flux sampling-that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.

  14. SSAGES: Software Suite for Advanced General Ensemble Simulations

    NASA Astrophysics Data System (ADS)

    Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian; Sikora, Benjamin J.; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z.; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J.; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S.; Reid, Daniel R.; Sevgen, Emre; Thapar, Vikram; Webb, Michael A.; Whitmer, Jonathan K.; de Pablo, Juan J.

    2018-01-01

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.

  15. A conditional stochastic weather generator for seasonal to multi-decadal simulations

    NASA Astrophysics Data System (ADS)

    Verdin, Andrew; Rajagopalan, Balaji; Kleiber, William; Podestá, Guillermo; Bert, Federico

    2018-01-01

    We present the application of a parametric stochastic weather generator within a nonstationary context, enabling simulations of weather sequences conditioned on interannual and multi-decadal trends. The generalized linear model framework of the weather generator allows any number of covariates to be included, such as large-scale climate indices, local climate information, seasonal precipitation and temperature, among others. Here we focus on the Salado A basin of the Argentine Pampas as a case study, but the methodology is portable to any region. We include domain-averaged (e.g., areal) seasonal total precipitation and mean maximum and minimum temperatures as covariates for conditional simulation. Areal covariates are motivated by a principal component analysis that indicates the seasonal spatial average is the dominant mode of variability across the domain. We find this modification to be effective in capturing the nonstationarity prevalent in interseasonal precipitation and temperature data. We further illustrate the ability of this weather generator to act as a spatiotemporal downscaler of seasonal forecasts and multidecadal projections, both of which are generally of coarse resolution.
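
    A minimal occurrence-model sketch in the same GLM spirit: wet/dry days are modeled with a Bernoulli GLM whose covariates include harmonics of the day of year and a stand-in areal seasonal total, and the fitted probabilities drive simulation. The data and coefficients are synthetic; precipitation amounts and temperatures would get analogous GLMs in the full generator.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)

        n = 3650
        day = np.arange(n) % 365
        seasonal_total = rng.normal(0, 1, n)     # stand-in areal covariate
        X = sm.add_constant(np.column_stack([
            np.sin(2 * np.pi * day / 365),
            np.cos(2 * np.pi * day / 365),
            seasonal_total,
        ]))
        # Synthetic "truth": seasonal cycle plus the areal covariate effect.
        logit_p = -0.5 + 0.8 * X[:, 1] + 0.4 * X[:, 3]
        wet = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

        # Occurrence model: Bernoulli GLM with logit link.
        fit = sm.GLM(wet, X, family=sm.families.Binomial()).fit()
        p_sim = fit.predict(X)                   # conditional wet-day probs
        sim_wet = rng.binomial(1, p_sim)         # one simulated sequence
        print(fit.params)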

  16. A unified framework for weighted parametric multiple test procedures.

    PubMed

    Xi, Dong; Glimm, Ekkehard; Maurer, Willi; Bretz, Frank

    2017-09-01

    We describe a general framework for weighted parametric multiple test procedures based on the closure principle. We utilize general weighting strategies that can reflect complex study objectives and include many procedures in the literature as special cases. The proposed weighted parametric tests bridge the gap between rejection rules using either adjusted significance levels or adjusted p-values. This connection is made by allowing intersection hypotheses of the underlying closed test procedure to be tested at a level smaller than α. This may also be necessary to accommodate certain study situations. For such cases we introduce a subclass of exact α-level parametric tests that satisfy the consonance property. When the correlation is known only for certain subsets of the test statistics, a new procedure is proposed to fully utilize this knowledge within each subset. We illustrate the proposed weighted parametric tests using a clinical trial example and conduct a simulation study to investigate their operating characteristics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
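
    A small sketch of the simplest member of this family (a weighted Bonferroni closed test, which ignores the correlation information the paper additionally exploits): a hypothesis H_i is rejected at level alpha exactly when every intersection hypothesis containing i is rejected by its local weighted test.

    ```python
    from itertools import combinations

    def closed_weighted_bonferroni(pvals, weights, alpha=0.05):
        # closure principle: H_i rejected iff every intersection J containing i
        # is rejected by its local weighted Bonferroni test (weights renormalized
        # within J so they sum to one)
        m = len(pvals)
        rejected = set(range(m))
        for k in range(1, m + 1):
            for J in combinations(range(m), k):
                w_sum = sum(weights[j] for j in J)
                local_reject = any(pvals[j] <= weights[j] / w_sum * alpha
                                   for j in J)
                if not local_reject:
                    rejected -= set(J)        # members of J cannot be rejected
        return sorted(rejected)

    # toy example with three hypotheses and unequal weights
    print(closed_weighted_bonferroni([0.012, 0.04, 0.3], [0.5, 0.3, 0.2]))
    ```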

  17. Unified Computational Methods for Regression Analysis of Zero-Inflated and Bound-Inflated Data

    PubMed Central

    Yang, Yan; Simpson, Douglas

    2010-01-01

    Bounded data with excess observations at the boundary are common in many areas of application. Various individual cases of inflated mixture models have been studied in the literature for bound-inflated data, yet the computational methods have been developed separately for each type of model. In this article we use a common framework for computing these models, and expand the range of models for both discrete and semi-continuous data with point inflation at the lower boundary. The quasi-Newton and EM algorithms are adapted and compared for estimation of model parameters. The numerical Hessian and generalized Louis method are investigated as means for computing standard errors after optimization. Correlated data are included in this framework via generalized estimating equations. The estimation of parameters and effectiveness of standard errors are demonstrated through simulation and in the analysis of data from an ultrasound bioeffect study. The unified approach enables reliable computation for a wide class of inflated mixture models and comparison of competing models. PMID:20228950
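
    To make the EM adaptation concrete for one case in this class, here is a hedged sketch (not the authors' code) of EM for a zero-inflated Poisson model, where each observation comes from a point mass at zero with probability pi or a Poisson(lam); only the observed zeros have uncertain membership.

    ```python
    import numpy as np

    def zip_em(y, n_iter=200):
        y = np.asarray(y, dtype=float)
        pi, lam = 0.3, max(y.mean(), 1e-6)          # crude initial values
        for _ in range(n_iter):
            # E-step: posterior probability that an observed zero is structural
            z = np.where(y == 0, pi / (pi + (1 - pi) * np.exp(-lam)), 0.0)
            # M-step: update mixture weight and Poisson mean
            pi = z.mean()
            lam = ((1 - z) * y).sum() / (1 - z).sum()
        return pi, lam

    rng = np.random.default_rng(2)
    structural = rng.binomial(1, 0.25, 5000)        # 1 = structural zero
    y = np.where(structural == 1, 0, rng.poisson(3.0, 5000))
    print(zip_em(y))                                # approx (0.25, 3.0)
    ```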

  18. Experimental Validation of L1 Adaptive Control: Rohrs' Counterexample in Flight

    NASA Technical Reports Server (NTRS)

    Xargay, Enric; Hovakimyan, Naira; Dobrokhodov, Vladimir; Kaminer, Issac; Kitsios, Ioannis; Cao, Chengyu; Gregory, Irene M.; Valavani, Lena

    2010-01-01

    The paper presents new results on the verification and in-flight validation of an L1 adaptive flight control system, and proposes a general methodology for verification and validation of adaptive flight control algorithms. The proposed framework is based on Rohrs' counterexample, a benchmark problem presented in the early 80s to show the limitations of adaptive controllers developed at that time. In this paper, the framework is used to evaluate the performance and robustness characteristics of an L1 adaptive control augmentation loop implemented onboard a small unmanned aerial vehicle. Hardware-in-the-loop simulations and flight test results confirm the ability of the L1 adaptive controller to maintain stability and predictable performance of the closed loop adaptive system in the presence of general (artificially injected) unmodeled dynamics. The results demonstrate the advantages of L1 adaptive control as a verifiable robust adaptive control architecture with the potential of reducing flight control design costs and facilitating the transition of adaptive control into advanced flight control systems.

  19. A comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements.

    PubMed

    Abdelgaied, A; Fisher, J; Jennings, L M

    2018-02-01

    A more robust pre-clinical wear simulation framework is required in order to simulate the wider and higher ranges of activities observed in different patient populations, such as younger, more active patients. Such a framework will help to understand and address the reported higher failure rates for younger and more active patients (National_Joint_Registry, 2016). The current study has developed and validated a comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements (TKR). The input mechanical (elastic modulus and Poisson's ratio) and wear parameters of the moderately cross-linked ultra-high molecular weight polyethylene (UHMWPE) bearing material were independently measured from experimental studies under realistic test conditions, similar to the loading conditions found in total knee replacements. The wear predictions from the computational wear simulation were validated against direct experimental wear measurements for size 3 Sigma curved total knee replacements (DePuy, UK) in an independent experimental wear simulation study under three different daily activities: walking, deep squat, and stair ascent kinematic conditions. The measured compressive mechanical properties of the moderately cross-linked UHMWPE material were more than 20% lower than those reported in the literature under tensile test conditions. The pin-on-plate wear coefficient of moderately cross-linked UHMWPE was significantly dependent on the contact stress and the degree of cross-shear at the articulating surfaces. The computational wear predictions for the TKR from the current framework were consistent and in good agreement with the independent full TKR experimental wear simulation measurements, with a coefficient of determination of 0.94 for the framework. In addition, the comprehensive combined experimental and computational framework was able to explain the complex experimental wear trends from the three different daily activities investigated. Therefore, such a framework can be adopted as a pre-clinical simulation approach to optimise different designs and materials, as well as patient-specific total knee replacements, for a range of activities. Copyright © 2017. Published by Elsevier Ltd.
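
    A hedged sketch of the kind of wear update such a computational framework accumulates (this is not the authors' fitted model; the coefficient form and all numbers below are hypothetical, for illustration only): an Archard-type law in which the wear coefficient depends on contact stress and cross-shear ratio, the two dependencies the study reports.

    ```python
    # Archard-type wear: volume = k(stress, cross-shear) * load * sliding distance
    def wear_volume_mm3(load_N, sliding_dist_mm, contact_stress_MPa, cross_shear):
        k0 = 1.0e-9                               # mm^3 / (N mm), assumed scale
        # hypothetical illustrative dependence, NOT fitted pin-on-plate data:
        k = k0 * (1.0 + 2.0 * cross_shear) / (1.0 + 0.05 * contact_stress_MPa)
        return k * load_N * sliding_dist_mm

    # accumulate wear over roughly one year of walking (~1e6 gait cycles)
    cycles = 1_000_000
    total = cycles * wear_volume_mm3(2600.0, 20.0, 25.0, 0.15)
    print(f"predicted wear ~ {total:.2f} mm^3/year (illustrative numbers only)")
    ```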

  20. Safer passenger car front shapes for pedestrians: A computational approach to reduce overall pedestrian injury risk in realistic impact scenarios.

    PubMed

    Li, Guibing; Yang, Jikuang; Simms, Ciaran

    2017-03-01

    Vehicle front shape has a significant influence on pedestrian injuries and the optimal design for overall pedestrian protection remains an elusive goal, especially considering the variability of vehicle-to-pedestrian accident scenarios. Therefore this study aims to develop and evaluate an efficient framework for vehicle front shape optimization for pedestrian protection accounting for the broad range of real world impact scenarios and their distributions in recent accident data. Firstly, a framework for vehicle front shape optimization for pedestrian protection was developed based on coupling of multi-body simulations and a genetic algorithm. This framework was then applied for optimizing passenger car front shape for pedestrian protection, and its predictions were evaluated using accident data and kinematic analyses. The results indicate that the optimization shows a good convergence and predictions of the optimization framework are corroborated when compared to the available accident data, and the optimization framework can distinguish 'good' and 'poor' vehicle front shapes for pedestrian safety. Thus, it is feasible and reliable to use the optimization framework for vehicle front shape optimization for reducing overall pedestrian injury risk. The results also show the importance of considering the broad range of impact scenarios in vehicle front shape optimization. A safe passenger car for overall pedestrian protection should have a wide and flat bumper (covering pedestrians' legs from the lower leg up to the shaft of the upper leg with generally even contacts), a bonnet leading edge height around 750 mm, a short bonnet (<800 mm) with a shallow or steep angle (either >17° or <12°) and a shallow windscreen (≤30°). Sensitivity studies based on simulations at the population level indicate that the demands for a safe passenger car front shape for head and leg protection are generally consistent, but partially conflict with pelvis protection. In particular, both head and leg injury risk increase with increasing bumper lower height and depth, and decrease with increasing bonnet leading edge height, while pelvis injury risk increases with increasing bonnet leading edge height. However, the effects of bonnet leading edge height and windscreen design on head injury risk are complex and require further analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
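
    A hedged sketch of the optimization loop described above, with the multi-body pedestrian-impact simulation replaced by a hypothetical surrogate injury-risk function of two shape variables (bonnet leading-edge height and bonnet angle; both the surrogate and the bounds are assumptions). The genetic-algorithm skeleton itself is generic: truncation selection, arithmetic crossover, Gaussian mutation.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    lo = np.array([600.0, 5.0])          # bounds: LE height [mm], bonnet angle [deg]
    hi = np.array([900.0, 25.0])

    def injury_risk(x):                  # stand-in for the multi-body simulation
        h, a = x
        return ((h - 750.0) / 150.0) ** 2 + 0.5 * np.sin(np.radians(4 * a)) + 1.0

    pop = rng.uniform(lo, hi, size=(40, 2))
    for gen in range(60):
        fit = np.array([injury_risk(x) for x in pop])
        parents = pop[np.argsort(fit)[:20]]                   # truncation selection
        mates = parents[rng.permutation(20)]
        children = 0.5 * (parents + mates)                    # arithmetic crossover
        children += rng.normal(0, 0.02, children.shape) * (hi - lo)  # mutation
        pop = np.clip(np.vstack([parents, children]), lo, hi)

    best = pop[np.argmin([injury_risk(x) for x in pop])]
    print("best front-shape variables:", best)
    ```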

  1. Distributed Simulation as a modelling tool for the development of a simulation-based training programme for cardiovascular specialties.

    PubMed

    Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando

    2017-01-01

    Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) initial conceptual stage for mapping structural criteria and parameters of the simulation training framework and scenario development (n = 16), (2) training facility design using Distributed Simulation, (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for realism of the simulation training framework, realism of the training facility and relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally. This corresponded with the qualitative component, which augmented the quantitative findings; trainees' user feedback was used to perform iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from conceptual, development, and implementation stages, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.

  2. Generalized three-dimensional lattice Boltzmann color-gradient method for immiscible two-phase pore-scale imbibition and drainage in porous media

    NASA Astrophysics Data System (ADS)

    Leclaire, Sébastien; Parmigiani, Andrea; Malaspinas, Orestis; Chopard, Bastien; Latt, Jonas

    2017-03-01

    This article presents a three-dimensional numerical framework for the simulation of fluid-fluid immiscible compounds in complex geometries, based on the multiple-relaxation-time lattice Boltzmann method to model the fluid dynamics and the color-gradient approach to model multicomponent flow interaction. New lattice weights for the lattices D3Q15, D3Q19, and D3Q27 that improve the Galilean invariance of the color-gradient model as well as for modeling the interfacial tension are derived and provided in the Appendix. The presented method proposes in particular an approach to model the interaction between the fluid compound and the solid, and to maintain a precise contact angle between the two-component interface and the wall. Contrarily to previous approaches proposed in the literature, this method yields accurate solutions even in complex geometries and does not suffer from numerical artifacts like nonphysical mass transfer along the solid wall, which is crucial for modeling imbibition-type problems. The article also proposes an approach to model inflow and outflow boundaries with the color-gradient method by generalizing the regularized boundary conditions. The numerical framework is first validated for three-dimensional (3D) stationary state (Jurin's law) and time-dependent (Washburn's law and capillary waves) problems. Then, the usefulness of the method for practical problems of pore-scale flow imbibition and drainage in porous media is demonstrated. Through the simulation of nonwetting displacement in two-dimensional random porous media networks, we show that the model properly reproduces three main invasion regimes (stable displacement, capillary fingering, and viscous fingering) as well as the saturating zone transition between these regimes. Finally, the ability to simulate immiscible two-component flow imbibition and drainage is validated, with excellent results, by numerical simulations in a Berea sandstone, a frequently used benchmark case used in this field, using a complex geometry that originates from a 3D scan of a porous sandstone. The methods presented in this article were implemented in the open-source PALABOS library, a general C++ matrix-based library well adapted for massive fluid flow parallel computation.
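
    As a much simpler companion to the 3D multiple-relaxation-time color-gradient model above, here is a single-component D2Q9 BGK kernel (a deliberately reduced, swapped-in technique) showing the stream-and-collide skeleton that such lattice Boltzmann models build on; all parameters are illustrative.

    ```python
    import numpy as np

    w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)            # D2Q9 weights
    c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                  [1, 1], [-1, 1], [-1, -1], [1, -1]])      # lattice velocities
    nx = ny = 64
    tau = 0.8                                               # relaxation time

    def equilibrium(rho, ux, uy):
        cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
        usq = ux ** 2 + uy ** 2
        return w[:, None, None] * rho * (1 + 3 * cu + 4.5 * cu ** 2 - 1.5 * usq)

    # initialize with a decaying shear wave on a periodic box
    X, Y = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
    ux = 0.05 * np.sin(2 * np.pi * Y / ny)
    f = equilibrium(np.ones((nx, ny)), ux, np.zeros((nx, ny)))

    for step in range(200):
        rho = f.sum(axis=0)
        ux = (c[:, 0, None, None] * f).sum(axis=0) / rho
        uy = (c[:, 1, None, None] * f).sum(axis=0) / rho
        f += -(f - equilibrium(rho, ux, uy)) / tau          # BGK collision
        for i in range(9):                                  # periodic streaming
            f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)

    print("max |u| after 200 steps:", np.abs(ux).max())     # viscous decay
    ```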

  3. SPOKES: An end-to-end simulation facility for spectroscopic cosmological surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nord, B.; Amara, A.; Refregier, A.

    The nature of dark matter, dark energy and large-scale gravity pose some of the most pressing questions in cosmology today. These fundamental questions require highly precise measurements, and a number of wide-field spectroscopic survey instruments are being designed to meet this requirement. A key component in these experiments is the development of a simulation tool to forecast science performance, define requirement flow-downs, optimize implementation, demonstrate feasibility, and prepare for exploitation. We present SPOKES (SPectrOscopic KEn Simulation), an end-to-end simulation facility for spectroscopic cosmological surveys designed to address this challenge. SPOKES is based on an integrated infrastructure, modular function organization, coherent data handling and fast data access. These key features allow reproducibility of pipeline runs, enable ease of use and provide flexibility to update functions within the pipeline. The cyclic nature of the pipeline offers the possibility to make the science output an efficient measure for design optimization and feasibility testing. We present the architecture, first science, and computational performance results of the simulation pipeline. The framework is general, but for the benchmark tests, we use the Dark Energy Spectrometer (DESpec), one of the early concepts for the upcoming project, the Dark Energy Spectroscopic Instrument (DESI). As a result, we discuss how the SPOKES framework enables a rigorous process to optimize and exploit spectroscopic survey experiments in order to derive high-precision cosmological measurements optimally.

  4. A new scheme of general hybrid projective complete dislocated synchronization

    NASA Astrophysics Data System (ADS)

    Chu, Yan-dong; Chang, Ying-Xiang; An, Xin-lei; Yu, Jian-Ning; Zhang, Jian-Gang

    2011-03-01

    Based on the Lyapunov stability theorem, a new type of chaos synchronization, general hybrid projective complete dislocated synchronization (GHPCDS), is proposed under the framework of drive-response systems. The difference between GHPCDS and complete synchronization is that under GHPCDS each state variable of the drive system does not equal its counterpart in the response system, but instead equals a different (dislocated) state variable of the response system as the systems evolve in time. GHPCDS includes complete dislocated synchronization, dislocated anti-synchronization and projective dislocated synchronization as special cases. As examples, the Lorenz chaotic system, Rössler chaotic system, hyperchaotic Chen system and hyperchaotic Lü system are discussed. Numerical simulations are given to show the effectiveness of these methods.
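
    A hedged restatement of the synchronization target as I read the abstract (the notation below is assumed, not taken from the paper): with drive state x, response state y, a permutation σ that "dislocates" the component pairing, and scaling factors α_i,

    ```latex
    \begin{equation*}
      e_i(t) = y_{\sigma(i)}(t) - \alpha_i\, x_i(t), \qquad
      \text{GHPCDS} \iff \lim_{t\to\infty} \lVert e(t) \rVert = 0 .
    \end{equation*}
    % \alpha_i = 1 with \sigma \neq \mathrm{id} gives complete dislocated
    % synchronization; \alpha_i = -1 gives dislocated anti-synchronization;
    % general \alpha_i gives projective dislocated synchronization.
    ```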

  5. pyGIMLi: An open-source library for modelling and inversion in geophysics

    NASA Astrophysics Data System (ADS)

    Rücker, Carsten; Günther, Thomas; Wagner, Florian M.

    2017-12-01

    Many tasks in applied geosciences cannot be solved by single measurements, but require the integration of geophysical, geotechnical and hydrological methods. Numerical simulation techniques are essential both for planning and interpretation, as well as for the process understanding of modern geophysical methods. These trends encourage open, simple, and modern software architectures aiming at a uniform interface for interdisciplinary and flexible modelling and inversion approaches. We present pyGIMLi (Python Library for Inversion and Modelling in Geophysics), an open-source framework that provides tools for modelling and inversion of various geophysical but also hydrological methods. The modelling component supplies discretization management and the numerical basis for finite-element and finite-volume solvers in 1D, 2D and 3D on arbitrarily structured meshes. The generalized inversion framework solves the minimization problem with a Gauss-Newton algorithm for any physical forward operator and provides opportunities for uncertainty and resolution analyses. More general requirements, such as flexible regularization strategies, time-lapse processing and different ways of coupling individual methods, are provided independently of the actual methods used. The usage of pyGIMLi is first demonstrated by solving the steady-state heat equation, followed by a demonstration of more complex capabilities for the combination of different geophysical data sets. A fully coupled hydrogeophysical inversion of electrical resistivity tomography (ERT) data from a simulated tracer experiment is presented, which allows the underlying hydraulic conductivity distribution of the aquifer to be reconstructed directly. Another example demonstrates the improvement of jointly inverting ERT and ultrasonic data with respect to saturation by a new approach that incorporates petrophysical relations in the inversion. Potential applications of the presented framework are manifold and include time-lapse, constrained, joint, and coupled inversions of various geophysical and hydrological data sets.
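
    A generic numpy sketch (not the pyGIMLi API) of the regularized Gauss-Newton iteration such an inversion framework solves, min over m of ||d - f(m)||^2 + lam ||L m||^2, demonstrated on a toy two-parameter exponential forward operator; the forward model, damping matrix, and all constants are assumptions for illustration.

    ```python
    import numpy as np

    def gauss_newton(f, jac, d, m0, L, lam=1.0, n_iter=20):
        m = m0.copy()
        for _ in range(n_iter):
            r = d - f(m)                              # data residual
            J = jac(m)
            A = J.T @ J + lam * L.T @ L               # regularized normal equations
            b = J.T @ r - lam * L.T @ (L @ m)
            m += np.linalg.solve(A, b)                # model update
        return m

    # toy forward operator: d_i = exp(-t_i * m_1) + m_2
    t = np.linspace(0, 3, 30)
    f = lambda m: np.exp(-t * m[0]) + m[1]
    jac = lambda m: np.column_stack([-t * np.exp(-t * m[0]), np.ones_like(t)])

    rng = np.random.default_rng(4)
    d_obs = f(np.array([1.5, 0.3])) + 0.01 * rng.standard_normal(t.size)
    L = np.eye(2)                                     # simple damping
    print(gauss_newton(f, jac, d_obs, np.array([0.5, 0.0]), L, lam=1e-3))
    ```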

  6. A formal model of interpersonal inference

    PubMed Central

    Moutoussis, Michael; Trujillo-Barreto, Nelson J.; El-Deredy, Wael; Dolan, Raymond J.; Friston, Karl J.

    2014-01-01

    Introduction: We propose that active Bayesian inference—a general framework for decision-making—can equally be applied to interpersonal exchanges. Social cognition, however, entails special challenges. We address these challenges through a novel formulation of a formal model and demonstrate its psychological significance. Method: We review relevant literature, especially with regards to interpersonal representations, formulate a mathematical model and present a simulation study. The model accommodates normative models from utility theory and places them within the broader setting of Bayesian inference. Crucially, we endow people's prior beliefs, into which utilities are absorbed, with preferences of self and others. The simulation illustrates the model's dynamics and furnishes elementary predictions of the theory. Results: (1) Because beliefs about self and others inform both the desirability and plausibility of outcomes, in this framework interpersonal representations become beliefs that have to be actively inferred. This inference, akin to “mentalizing” in the psychological literature, is based upon the outcomes of interpersonal exchanges. (2) We show how some well-known social-psychological phenomena (e.g., self-serving biases) can be explained in terms of active interpersonal inference. (3) Mentalizing naturally entails Bayesian updating of how people value social outcomes. Crucially this includes inference about one's own qualities and preferences. Conclusion: We inaugurate a Bayes optimal framework for modeling intersubject variability in mentalizing during interpersonal exchanges. Here, interpersonal representations are endowed with explicit functional and affective properties. We suggest the active inference framework lends itself to the study of psychiatric conditions where mentalizing is distorted. PMID:24723872

  7. Stacking Faults and Mechanical Behavior beyond the Elastic Limit of an Imidazole-Based Metal Organic Framework: ZIF-8.

    PubMed

    Hegde, Vinay I; Tan, Jin-Chong; Waghmare, Umesh V; Cheetham, Anthony K

    2013-10-17

    We determine the nonlinear mechanical behavior of a prototypical zeolitic imidazolate framework (ZIF-8) along two modes of mechanical failure in response to tensile and shear forces using first-principles simulations. Our generalized stacking fault energy surface reveals an intrinsic stacking fault of surprisingly low energy comparable to that in copper, though the energy barrier associated with its formation is much higher. The lack of vibrational spectroscopic evidence for such faults in experiments can be explained with the structural instability of the barrier state to form a denser and disordered state of ZIF-8 seen in our analysis, that is, large shear leads to its amorphization rather than formation of faults.

  8. On Connectivity of Wireless Sensor Networks with Directional Antennas

    PubMed Central

    Wang, Qiu; Dai, Hong-Ning; Zheng, Zibin; Imran, Muhammad; Vasilakos, Athanasios V.

    2017-01-01

    In this paper, we investigate the network connectivity of wireless sensor networks with directional antennas. In particular, we establish a general framework to analyze the network connectivity while considering various antenna models and the channel randomness. Since existing directional antenna models have their pros and cons in the accuracy of reflecting realistic antennas and the computational complexity, we propose a new analytical directional antenna model called the iris model to balance the accuracy against the complexity. We conduct extensive simulations to evaluate the analytical framework. Our results show that our proposed analytical model on the network connectivity is accurate, and our iris antenna model can provide a better approximation to realistic directional antennas than other existing antenna models. PMID:28085081
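
    A hedged Monte Carlo sketch of the connectivity question studied above (this is not the paper's iris model; the sector link rule, field size, and parameters are assumptions): nodes point a sector antenna in a random direction, a link exists when each endpoint lies within the other's sector and within range, and connectivity is checked with a BFS.

    ```python
    import numpy as np
    from collections import deque

    def connectivity_prob(n=50, r=0.35, beamwidth=np.pi/2, trials=200, seed=5):
        rng = np.random.default_rng(seed)
        connected = 0
        for _ in range(trials):
            pts = rng.uniform(0, 1, (n, 2))
            theta = rng.uniform(0, 2 * np.pi, n)      # antenna orientations
            adj = [[] for _ in range(n)]
            for i in range(n):
                for j in range(i + 1, n):
                    dvec = pts[j] - pts[i]
                    if np.hypot(*dvec) > r:
                        continue
                    a = np.arctan2(dvec[1], dvec[0])  # bearing i -> j
                    in_i = abs((a - theta[i] + np.pi) % (2*np.pi) - np.pi) <= beamwidth/2
                    in_j = abs((a + np.pi - theta[j] + np.pi) % (2*np.pi) - np.pi) <= beamwidth/2
                    if in_i and in_j:                 # mutual sector coverage
                        adj[i].append(j); adj[j].append(i)
            seen, queue = {0}, deque([0])
            while queue:                              # BFS from node 0
                u = queue.popleft()
                for v in adj[u]:
                    if v not in seen:
                        seen.add(v); queue.append(v)
            connected += (len(seen) == n)
        return connected / trials

    print(connectivity_prob())
    ```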

  9. Imposing a Lagrangian Particle Framework on an Eulerian Hydrodynamics Infrastructure in Flash

    NASA Technical Reports Server (NTRS)

    Dubey, A.; Daley, C.; ZuHone, J.; Ricker, P. M.; Weide, K.; Graziani, C.

    2012-01-01

    In many astrophysical simulations, both Eulerian and Lagrangian quantities are of interest. For example, in a galaxy cluster merger simulation, the intracluster gas can have Eulerian discretization, while dark matter can be modeled using particles. FLASH, a component-based scientific simulation code, superimposes a Lagrangian framework atop an adaptive mesh refinement Eulerian framework to enable such simulations. The discretization of the field variables is Eulerian, while the Lagrangian entities occur in many different forms including tracer particles, massive particles, charged particles in particle-in-cell mode, and Lagrangian markers to model fluid structure interactions. These widely varying roles for Lagrangian entities are possible because of the highly modular, flexible, and extensible architecture of the Lagrangian framework. In this paper, we describe the Lagrangian framework in FLASH in the context of two very different applications, Type Ia supernovae and galaxy cluster mergers, which use the Lagrangian entities in fundamentally different ways.
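
    A hedged sketch of the core operation any tracer-particle capability performs on an Eulerian grid (not the FLASH implementation; grid, interpolation order, and time stepper are illustrative choices): interpolate the grid velocity to each particle position and advance the particles with a midpoint (RK2) step.

    ```python
    import numpy as np

    def bilinear(field, x, y, dx):
        i, j = int(x // dx), int(y // dx)
        fx, fy = x / dx - i, y / dx - j
        return ((1-fx)*(1-fy)*field[i, j]   + fx*(1-fy)*field[i+1, j]
              + (1-fx)*fy    *field[i, j+1] + fx*fy    *field[i+1, j+1])

    def advance_tracers(px, py, ux, uy, dx, dt):
        for k in range(len(px)):
            # midpoint (RK2): half-step velocity, then full step
            u1 = bilinear(ux, px[k], py[k], dx)
            v1 = bilinear(uy, px[k], py[k], dx)
            xm, ym = px[k] + 0.5*dt*u1, py[k] + 0.5*dt*v1
            px[k] += dt * bilinear(ux, xm, ym, dx)
            py[k] += dt * bilinear(uy, xm, ym, dx)
        return px, py

    # rigid-body rotation test: the tracer should circle the grid center
    n, dx = 65, 1.0 / 64
    xs = np.arange(n) * dx
    X, Y = np.meshgrid(xs, xs, indexing="ij")
    ux, uy = -(Y - 0.5), (X - 0.5)
    px, py = np.array([0.75]), np.array([0.5])
    for _ in range(100):
        px, py = advance_tracers(px, py, ux, uy, dx, dt=0.01)
    print(px, py)
    ```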

  10. Imposing a Lagrangian Particle Framework on an Eulerian Hydrodynamics Infrastructure in FLASH

    NASA Astrophysics Data System (ADS)

    Dubey, A.; Daley, C.; ZuHone, J.; Ricker, P. M.; Weide, K.; Graziani, C.

    2012-08-01

    In many astrophysical simulations, both Eulerian and Lagrangian quantities are of interest. For example, in a galaxy cluster merger simulation, the intracluster gas can have Eulerian discretization, while dark matter can be modeled using particles. FLASH, a component-based scientific simulation code, superimposes a Lagrangian framework atop an adaptive mesh refinement Eulerian framework to enable such simulations. The discretization of the field variables is Eulerian, while the Lagrangian entities occur in many different forms including tracer particles, massive particles, charged particles in particle-in-cell mode, and Lagrangian markers to model fluid-structure interactions. These widely varying roles for Lagrangian entities are possible because of the highly modular, flexible, and extensible architecture of the Lagrangian framework. In this paper, we describe the Lagrangian framework in FLASH in the context of two very different applications, Type Ia supernovae and galaxy cluster mergers, which use the Lagrangian entities in fundamentally different ways.

  11. Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification

    DTIC Science & Technology

    2014-09-18

    Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification. Dissertation by Jeffrey D. Morris, presented to the Faculty, Department of Systems.

  12. Attractors in complex networks

    NASA Astrophysics Data System (ADS)

    Rodrigues, Alexandre A. P.

    2017-10-01

    In the framework of the generalized Lotka-Volterra model, solutions representing multispecies sequential competition can be predictable with high probability. In this paper, we show that it occurs because the corresponding "heteroclinic channel" forms part of an attractor. We prove that, generically, in an attracting heteroclinic network involving a finite number of hyperbolic and non-resonant saddle-equilibria whose linearization has only real eigenvalues, the connections corresponding to the most positive expanding eigenvalues form part of an attractor (observable in numerical simulations).

  13. Attractors in complex networks.

    PubMed

    Rodrigues, Alexandre A P

    2017-10-01

    In the framework of the generalized Lotka-Volterra model, solutions representing multispecies sequential competition can be predictable with high probability. In this paper, we show that it occurs because the corresponding "heteroclinic channel" forms part of an attractor. We prove that, generically, in an attracting heteroclinic network involving a finite number of hyperbolic and non-resonant saddle-equilibria whose linearization has only real eigenvalues, the connections corresponding to the most positive expanding eigenvalues form part of an attractor (observable in numerical simulations).

  14. Enabling Functional Neural Circuit Simulations with Distributed Computing of Neuromodulated Plasticity

    PubMed Central

    Potjans, Wiebke; Morrison, Abigail; Diesmann, Markus

    2010-01-01

    A major puzzle in the field of computational neuroscience is how to relate system-level learning in higher organisms to synaptic plasticity. Recently, plasticity rules depending not only on pre- and post-synaptic activity but also on a third, non-local neuromodulatory signal have emerged as key candidates to bridge the gap between the macroscopic and the microscopic level of learning. Crucial insights into this topic are expected to be gained from simulations of neural systems, as these allow the simultaneous study of the multiple spatial and temporal scales that are involved in the problem. In particular, synaptic plasticity can be studied during the whole learning process, i.e., on a time scale of minutes to hours and across multiple brain areas. Implementing neuromodulated plasticity in large-scale network simulations where the neuromodulatory signal is dynamically generated by the network itself is challenging, because the network structure is commonly defined purely by the connectivity graph without explicit reference to the embedding of the nodes in physical space. Furthermore, the simulation of networks with realistic connectivity entails the use of distributed computing. A neuromodulated synapse must therefore be informed in an efficient way about the neuromodulatory signal, which is typically generated by a population of neurons located on different machines than either the pre- or post-synaptic neuron. Here, we develop a general framework to solve the problem of implementing neuromodulated plasticity in a time-driven distributed simulation, without reference to a particular implementation language, neuromodulator, or neuromodulated plasticity mechanism. We implement our framework in the simulator NEST and demonstrate excellent scaling up to 1024 processors for simulations of a recurrent network incorporating neuromodulated spike-timing dependent plasticity. PMID:21151370
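
    A hedged sketch of the class of plasticity rule described above (not the NEST implementation; parameters and spike statistics are illustrative): each synapse accumulates an eligibility trace from pre/post spike pairings via an STDP window, and the weight changes only when a global neuromodulatory signal d(t) arrives.

    ```python
    import numpy as np

    tau_e, a_plus, a_minus, lr = 0.5, 1.0, -1.2, 0.01
    dt_sim = 0.001

    def stdp(dt_post_minus_pre):             # classic exponential STDP window
        if dt_post_minus_pre > 0:
            return a_plus * np.exp(-dt_post_minus_pre / 0.02)
        return a_minus * np.exp(dt_post_minus_pre / 0.02)

    w, e = 0.5, 0.0                          # weight and eligibility trace
    last_pre, last_post = -np.inf, -np.inf
    rng = np.random.default_rng(6)

    for step in range(5000):
        t = step * dt_sim
        e *= np.exp(-dt_sim / tau_e)         # trace decays continuously
        if rng.random() < 0.02:              # presynaptic spike
            last_pre = t
            e += stdp(last_post - t)         # post-before-pre pairing
        if rng.random() < 0.02:              # postsynaptic spike
            last_post = t
            e += stdp(t - last_pre)          # pre-before-post pairing
        d = 1.0 if step % 1000 == 999 else 0.0   # sparse neuromodulator pulses
        w += lr * d * e                      # weight changes only under d(t)

    print("final weight:", w)
    ```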

  15. A multiphysics and multiscale software environment for modeling astrophysical systems

    NASA Astrophysics Data System (ADS)

    Portegies Zwart, Simon; McMillan, Steve; Harfst, Stefan; Groen, Derek; Fujii, Michiko; Nualláin, Breanndán Ó.; Glebbeek, Evert; Heggie, Douglas; Lombardi, James; Hut, Piet; Angelou, Vangelis; Banerjee, Sambaran; Belkus, Houria; Fragos, Tassos; Fregeau, John; Gaburov, Evghenii; Izzard, Rob; Jurić, Mario; Justham, Stephen; Sottoriva, Andrea; Teuben, Peter; van Bever, Joris; Yaron, Ofer; Zemp, Marcel

    2009-05-01

    We present MUSE, a software framework for combining existing computational tools for different astrophysical domains into a single multiphysics, multiscale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a "Noah's Ark" milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multiscale and multiphysics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe three examples calculated using MUSE: the merger of two galaxies, the merger of two evolving stars, and a hybrid N-body simulation. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.
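
    A hedged sketch of the coupling pattern such a framework specifies (this is not MUSE's actual interface; the class and method names are assumptions): each domain solver exposes the same minimal interface, and a top-level loop advances all modules over a shared macro-timestep, exchanging coupled quantities between domains.

    ```python
    class Module:
        def evolve(self, t_end): ...             # advance internal state to t_end
        def exchange(self, other): ...           # pull coupled quantities across

    class StellarDynamics(Module):
        def __init__(self):
            self.t, self.masses = 0.0, [1.0, 2.0]
        def evolve(self, t_end):
            self.t = t_end                       # N-body integration stub
        def exchange(self, other):
            self.masses = other.masses           # masses set by stellar evolution

    class StellarEvolution(Module):
        def __init__(self):
            self.t, self.masses = 0.0, [1.0, 2.0]
        def evolve(self, t_end):
            self.masses = [m * 0.999 for m in self.masses]   # mass-loss stub
            self.t = t_end
        def exchange(self, other):
            pass

    dynamics, evolution = StellarDynamics(), StellarEvolution()
    t, dt = 0.0, 1.0
    while t < 10.0:                              # coupled macro-timestep loop
        t += dt
        evolution.evolve(t)                      # update stellar masses first
        dynamics.exchange(evolution)             # push new masses into the N-body
        dynamics.evolve(t)
    print(dynamics.masses)
    ```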

  16. Status and future of MUSE

    NASA Astrophysics Data System (ADS)

    Harfst, S.; Portegies Zwart, S.; McMillan, S.

    2008-12-01

    We present MUSE, a software framework for combining existing computational tools from different astrophysical domains into a single multi-physics, multi-scale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a "Noah's Ark" milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multi-scale and multi-physics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe two examples calculated using MUSE: the merger of two galaxies and an N-body simulation with live stellar evolution. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.

  17. A simulation framework for the CMS Track Trigger electronics

    NASA Astrophysics Data System (ADS)

    Amstutz, C.; Magazzù, G.; Weber, M.; Palla, F.

    2015-03-01

    A simulation framework has been developed to test and characterize algorithms, architectures and hardware implementations of the vastly complex CMS Track Trigger for the high luminosity upgrade of the CMS experiment at the Large Hadron Collider in Geneva. High-level SystemC models of all system components have been developed to simulate a portion of the track trigger. The simulation of the system components together with input data from physics simulations allows evaluating figures of merit, like delays or bandwidths, under realistic conditions. The use of SystemC for high-level modelling allows co-simulation with models developed in Hardware Description Languages, e.g. VHDL or Verilog. Therefore, the simulation framework can also be used as a test bench for digital modules developed for the final system.

  18. Performance evaluation of CESM in simulating the dust cycle

    NASA Astrophysics Data System (ADS)

    Parajuli, S. P.; Yang, Z. L.; Kocurek, G.; Lawrence, D. M.

    2014-12-01

    Mineral dust in the atmosphere has implications for Earth's radiation budget, biogeochemical cycles, hydrological cycles, human health and visibility. Mineral dust is injected into the atmosphere during dust storms when the surface winds are sufficiently strong and the land surface conditions are favorable. Dust storms are very common in specific regions of the world including the Middle East and North Africa (MENA) region, which contains more than 50% of the global dust sources. In this work, we present simulations of the dust cycle within the framework of CESM 1.2.2 and evaluate how well the model captures the spatio-temporal characteristics of dust sources, transport and deposition at the global scale, especially in dust source regions. We conducted our simulations using two existing erodibility maps (geomorphic and topographic) and a new erodibility map, which is based on the correlation between observed wind and dust. We compare the simulated results with MODIS satellite data, MACC reanalysis data, and AERONET station data. Comparison with MODIS satellite data and MACC reanalysis data shows that all three erodibility maps generally reproduce the spatio-temporal characteristics of dust optical depth globally. However, comparison with AERONET station data shows that the simulated dust optical depth is generally overestimated for all erodibility maps. Results vary greatly by region and scale of observational data. Our results also show that the simulations forced by reanalysis meteorology capture the overall dust cycle more realistically compared to the simulations done using online meteorology.

  19. A versatile model for soft patchy particles with various patch arrangements.

    PubMed

    Li, Zhan-Wei; Zhu, You-Liang; Lu, Zhong-Yuan; Sun, Zhao-Yan

    2016-01-21

    We propose a simple and general mesoscale soft patchy particle model, which can felicitously describe the deformable and surface-anisotropic characteristics of soft patchy particles. This model can be used in dynamics simulations to investigate the aggregation behavior and mechanism of various types of soft patchy particles with tunable number, size, direction, and geometrical arrangement of the patches. To improve the computational efficiency of this mesoscale model in dynamics simulations, we give the simulation algorithm that fits the compute unified device architecture (CUDA) framework of NVIDIA graphics processing units (GPUs). The validation of the model and the performance of the simulations using GPUs are demonstrated by simulating several benchmark systems of soft patchy particles with 1 to 4 patches in a regular geometrical arrangement. Because of its simplicity and computational efficiency, the soft patchy particle model will provide a powerful tool to investigate the aggregation behavior of soft patchy particles, such as patchy micelles, patchy microgels, and patchy dendrimers, over larger spatial and temporal scales.

  20. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.
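
    A hedged sketch of the design/response-surface step this framework describes, with a synthetic stand-in for the UTCHEM simulator (the design variables, the surrogate recovery function, and all numbers are assumptions): Latin hypercube sampling over two variables, then a quadratic response surface fitted by least squares and interrogated for its stationary point.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 40

    # simple Latin hypercube in [0, 1]^2: stratified permutations plus jitter
    u = (rng.permutation(n).reshape(-1, 1) + rng.uniform(0, 1, (n, 1))) / n
    v = (rng.permutation(n).reshape(-1, 1) + rng.uniform(0, 1, (n, 1))) / n
    X = np.hstack([u, v])              # e.g. (surfactant slug size, salinity)

    def recovery(x):                   # hypothetical stand-in for UTCHEM runs
        return (0.6 - (x[:, 0] - 0.7) ** 2 - 0.5 * (x[:, 1] - 0.4) ** 2
                + 0.02 * rng.standard_normal(len(x)))

    y = recovery(X)
    A = np.column_stack([np.ones(n), X[:, 0], X[:, 1],
                         X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # quadratic response surface

    # stationary point of the fitted quadratic = candidate optimal design
    H = np.array([[2 * coef[3], coef[5]], [coef[5], 2 * coef[4]]])
    print("optimum near:", np.linalg.solve(H, -coef[1:3]))   # ~ (0.7, 0.4)
    ```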

  1. A Framework to Design and Optimize Chemical Flooding Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.

  2. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2004-11-01

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.

  3. Simulation of the Action of a Shock Wave on Titanium Alloy

    NASA Astrophysics Data System (ADS)

    Afanas'eva, S. A.; Belov, N. N.; Burkin, V. V.; Dudarev, E. F.; Ishchenko, A. N.; Rogaev, K. S.

    2017-01-01

    The laws and mechanism of fracture of coarse-grain and ultrafine-grain titanium under shock-wave loading have been investigated. A "SINUS-7" accelerator emitting a nanosecond relativistic high-current electron beam was used as the shock wave generator. To test high-velocity impact at velocities of the order of 2500 m/s, a ballistic installation of caliber 23 mm was used. The mathematical simulation of the high-velocity interaction was carried out with account for fracture, phase transitions, and the dependence of the strength characteristics of the materials on the internal energy, within the framework of continuum mechanics. For both granular structures the general laws and features of the fracture have been established.

  4. GEOS-5 Chemistry Transport Model User's Guide

    NASA Technical Reports Server (NTRS)

    Kouatchou, J.; Molod, A.; Nielsen, J. E.; Auer, B.; Putman, W.; Clune, T.

    2015-01-01

    The Goddard Earth Observing System version 5 (GEOS-5) General Circulation Model (GCM) makes use of the Earth System Modeling Framework (ESMF) to enable model configurations with many functions. One of the options of the GEOS-5 GCM is the GEOS-5 Chemistry Transport Model (GEOS-5 CTM), which is an offline simulation of chemistry and constituent transport driven by a specified meteorology and other model output fields. This document describes the basic components of the GEOS-5 CTM, and is a user's guide on how to obtain and run simulations on the NCCS Discover platform. In addition, we provide information on how to change the model configuration input files to meet users' needs.

  5. Decision Manifold Approximation for Physics-Based Simulations

    NASA Technical Reports Server (NTRS)

    Wong, Jay Ming; Samareh, Jamshid A.

    2016-01-01

    With the recent surge of success in big-data-driven deep learning, many frameworks focus on architecture design and the use of massive databases. However, in some scenarios massive sets of data may be difficult, and in some cases infeasible, to acquire. In this paper we discuss a trajectory-based framework that quickly learns the underlying decision manifold of binary simulation classifications while judiciously selecting exploratory target states to minimize the number of required simulations. Furthermore, we draw particular attention to the simulation prediction application, idealized to the case where failures in simulations can be predicted and avoided, providing machine intelligence to novice analysts. We demonstrate this framework on various forms of simulations and discuss its efficacy.
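
    A hedged sketch of the sampling idea described above (not the authors' method; the simulate() pass/fail function, pool sizes, and classifier choice are assumptions): run the expensive "simulation" only at the candidate state about which a probabilistic classifier of the decision boundary is currently most uncertain.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(8)

    def simulate(x):                          # hypothetical pass/fail simulation
        return int(x[0] ** 2 + 0.5 * x[1] > 0.6)

    pool = rng.uniform(-1, 1, (500, 2))       # candidate target states
    X = [p for p in pool[:30]]                # small seed set of runs
    y = [simulate(p) for p in X]

    for _ in range(40):                       # 40 extra simulations, not 500
        clf = LogisticRegression().fit(np.array(X), y)
        p = clf.predict_proba(pool)[:, 1]
        k = np.argmin(np.abs(p - 0.5))        # most uncertain candidate
        X.append(pool[k])
        y.append(simulate(pool[k]))

    acc = np.mean(clf.predict(pool) == [simulate(x) for x in pool])
    print("decision-boundary accuracy:", acc)
    ```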

  6. Generalized lattice Boltzmann equation with forcing term for computation of wall-bounded turbulent flows.

    PubMed

    Premnath, Kannan N; Pattison, Martin J; Banerjee, Sanjoy

    2009-02-01

    In this paper, we present a framework based on the generalized lattice Boltzmann equation (GLBE) using multiple relaxation times with forcing term for eddy capturing simulation of wall-bounded turbulent flows. Due to its flexibility in using disparate relaxation times, the GLBE is well suited to maintaining numerical stability on coarser grids and in obtaining improved solution fidelity of near-wall turbulent fluctuations. The subgrid scale (SGS) turbulence effects are represented by the standard Smagorinsky eddy viscosity model, which is modified by using the van Driest wall-damping function to account for reduction of turbulent length scales near walls. In order to be able to simulate a wider class of problems, we introduce forcing terms, which can represent the effects of general nonuniform forms of forces, in the natural moment space of the GLBE. Expressions for the strain rate tensor used in the SGS model are derived in terms of the nonequilibrium moments of the GLBE to include such forcing terms, which comprise a generalization of those presented in a recent work [Yu, Comput. Fluids 35, 957 (2006)]. Variable resolutions are introduced into this extended GLBE framework through a conservative multiblock approach. The approach, whose optimized implementation is also discussed, is assessed for two canonical flow problems bounded by walls, viz., fully developed turbulent channel flow at a shear or friction Reynolds number (Re) of 183.6 based on the channel half-width and three-dimensional (3D) shear-driven flows in a cubical cavity at a Re of 12 000 based on the side length of the cavity. Comparisons of detailed computed near-wall turbulent flow structure, given in terms of various turbulence statistics, with available data, including those from direct numerical simulations (DNS) and experiments showed good agreement. The GLBE approach also exhibited markedly better stability characteristics and avoided spurious near-wall turbulent fluctuations on coarser grids when compared with the single-relaxation-time (SRT)-based approach. Moreover, its implementation showed excellent parallel scalability on a large parallel cluster with over a thousand processors.
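
    For reference, the SGS closure named above in its standard form (standard notation assumed: grid spacing Δ, Smagorinsky constant C_s, wall distance y⁺ in viscous units, damping constant A⁺ ≈ 25):

    ```latex
    \begin{equation*}
      \nu_t = \left( C_s\,\Delta \right)^2
      \left( 1 - e^{-y^{+}/A^{+}} \right)^{2}
      \sqrt{\,2\,\bar{S}_{ij}\bar{S}_{ij}\,},
      \qquad
      \bar{S}_{ij} = \tfrac{1}{2}\left(
         \partial_j \bar{u}_i + \partial_i \bar{u}_j \right),
    \end{equation*}
    % the van Driest factor damps the eddy viscosity toward the wall, where
    % turbulent length scales shrink; in the GLBE the strain rate S_ij is
    % evaluated from nonequilibrium moments rather than velocity gradients.
    ```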

  7. Integrating software architectures for distributed simulations and simulation analysis communities.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael

    2005-10-01

    The one-year Software Architecture LDRD (No. 79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  8. Towards a hierarchical optimization modeling framework for ...

    EPA Pesticide Factsheets

    Background: Bilevel optimization has been recognized as a 2-player Stackelberg game where players are represented as leaders and followers and each pursues their own set of objectives. Hierarchical optimization problems, which are a generalization of bilevel, are especially difficult because the optimization is nested, meaning that the objectives of one level depend on solutions to the other levels. We introduce a hierarchical optimization framework for spatially targeting multiobjective green infrastructure (GI) incentive policies under uncertainties related to policy budget, compliance, and GI effectiveness. We demonstrate the utility of the framework using a hypothetical urban watershed, where the levels are characterized by multiple levels of policy makers (e.g., local, regional, national) and policy followers (e.g., landowners, communities), and objectives include minimization of policy cost, implementation cost, and risk; reduction of combined sewer overflow (CSO) events; and improvement in environmental benefits such as reduced nutrient run-off and water availability. Conclusions: While computationally expensive, this hierarchical optimization framework explicitly simulates the interaction between multiple levels of policy makers (e.g., local, regional, national) and policy followers (e.g., landowners, communities) and is especially useful for constructing and evaluating environmental and ecological policy. The framework is demonstrated using a hypothetical urban watershed.
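
    The generic bilevel structure being nested here, in standard form (notation assumed, not taken from the record): the leader chooses policy x, and each follower responds optimally with y*(x).

    ```latex
    \begin{equation*}
      \min_{x \in X} \; F\bigl(x,\, y^{*}(x)\bigr)
      \quad \text{s.t.} \quad
      y^{*}(x) \in \arg\min_{y \in Y(x)} f(x, y).
    \end{equation*}
    % Hierarchical (multi-level) optimization nests this structure again inside
    % the follower problem, which is what makes the computation so expensive.
    ```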

  9. A Computational Framework for Efficient Low Temperature Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Verma, Abhishek Kumar; Venkattraman, Ayyaswamy

    2016-10-01

    Over the past years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, metamaterials, etc. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework will allow us to enhance our understanding of multiscale plasma phenomena using high performance computing tools mainly based on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems comprising LTPs. Salient introductory features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. Performance of the solver is tested based on numerical results assessing the accuracy and efficiency of benchmarks for problems in microdischarge devices. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.

  10. Onyx-Advanced Aeropropulsion Simulation Framework Created

    NASA Technical Reports Server (NTRS)

    Reed, John A.

    2001-01-01

    The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible, Java-based (Sun Microsystems, Inc.), object-oriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.

  11. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is the key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach used for dynamic analysis of service business processes. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. Social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor for decision-making strategies, a flexible scheduling for activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.

  12. A multi-GPU real-time dose simulation software framework for lung radiotherapy.

    PubMed

    Santhanam, A P; Min, Y; Neelakkantan, H; Papp, N; Meeks, S L; Kupelian, P A

    2012-09-01

    Medical simulation frameworks facilitate both the preoperative and postoperative analysis of the patient's pathophysical condition. Of particular importance is the simulation of radiation dose delivery for real-time radiotherapy monitoring and retrospective analyses of the patient's treatment. In this paper, a software framework tailored for the development of simulation-based real-time radiation dose monitoring medical applications is discussed. A multi-GPU-based computational framework coupled with inter-process communication methods is introduced for simulating the radiation dose delivery on a deformable 3D volumetric lung model and its real-time visualization. The model deformation and the corresponding dose calculation are allocated among the GPUs in a task-specific manner and performed in a pipelined manner. Radiation dose calculations are computed on two different GPU hardware architectures. The integration of this computational framework with a front-end software layer and back-end patient database repository is also discussed. Real-time simulation of the dose delivered is achieved once every 120 ms using the proposed framework. With a linear increase in the number of GPU cores, the computational time of the simulation decreased linearly. The inter-process communication time also improved with an increase in the hardware memory. Variations in the delivered dose and computational speedup for variations in the data dimensions are investigated using D70 and D90 as well as gEUD as metrics for a set of 14 patients. Computational speed-up increased with an increase in the beam dimensions when compared with CPU-based commercial software, while the error in the dose calculation was <1%. Our analyses show that the framework applied to deformable lung model-based radiotherapy is an effective tool for performing both real-time and retrospective analyses.

  13. Hybrid stochastic simplifications for multiscale gene networks.

    PubMed

    Crudu, Alina; Debussche, Arnaud; Radulescu, Ovidiu

    2009-09-07

    Stochastic simulation of gene networks by Markov processes has important applications in molecular biology. The complexity of exact simulation algorithms scales with the number of discrete jumps to be performed. Approximate schemes reduce the computational time by reducing the number of simulated discrete events. Also, answering important questions about the relation between network topology and intrinsic noise generation and propagation should be based on general mathematical results, and these general results are difficult to obtain for exact models. We propose a unified framework for hybrid simplifications of Markov models of multiscale stochastic gene network dynamics. We discuss several possible hybrid simplifications, and provide algorithms to obtain them from pure jump processes. In hybrid simplifications, some components are discrete and evolve by jumps, while other components are continuous. Hybrid simplifications are obtained by partial Kramers-Moyal expansion [1-3], which is equivalent to the application of the central limit theorem to a sub-model. By averaging and variable aggregation we drastically reduce simulation time and eliminate non-critical reactions. Hybrid and averaged simplifications can be used for more effective simulation algorithms and for obtaining general design principles relating noise to topology and time scales. The simplified models reproduce with good accuracy the stochastic properties of the gene networks, including waiting times in intermittence phenomena, fluctuation amplitudes and stationary distributions. The methods are illustrated on several gene network examples. Hybrid simplifications can be used for onion-like (multi-layered) approaches to multi-scale biochemical systems, in which various descriptions are used at various scales. Sets of discrete and continuous variables are treated with different methods and are coupled together in a physically justified approach.
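
    A minimal sketch of the hybrid idea follows, assuming a two-state promoter that evolves by discrete jumps while an abundant protein follows a chemical Langevin (partial Kramers-Moyal) approximation; all rates are invented for illustration.

```python
# Hybrid jump-diffusion sketch: discrete promoter jumps + continuous protein.
import numpy as np

rng = np.random.default_rng(0)
k_on, k_off = 0.05, 0.02          # promoter switching rates (illustrative)
k_prod, k_deg = 10.0, 0.1         # protein production / degradation rates
dt, T = 0.01, 200.0

g, p, t, trace = 0, 0.0, 0.0, []  # promoter state (0/1), protein level
while t < T:
    # Continuous component: Euler-Maruyama step of the chemical Langevin
    # equation (the partial Kramers-Moyal / CLT approximation).
    prod, deg = k_prod * g, k_deg * p
    p = max(p + (prod - deg) * dt
              + np.sqrt((prod + deg) * dt) * rng.normal(), 0.0)
    # Discrete component: promoter jump with probability rate * dt.
    rate = k_on if g == 0 else k_off
    if rng.random() < rate * dt:
        g = 1 - g
    t += dt
    trace.append(p)
print(f"mean protein level: {np.mean(trace):.1f}")
```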

  14. C³ and combat simulation - a survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, S.A. Jr.

    1983-01-04

    This article looks at the overlap between C³ and combat simulation from the point of view of the developer of combat simulations and models. In this context, there are two different questions. The first is: how and to what extent should specific models of the C³ processes be incorporated in simulations of combat? Here the key point is the assessment of impact. In which types or levels of combat does C³ play a role sufficiently intricate and closely coupled with combat performance that it would significantly affect combat results? Conversely, when is C³ a known factor or modifier which can be simply accommodated without a specific detailed model being made for it? The second question is the inverse one. In the development of future C³ systems, what role should combat simulation play? Obviously, simulation of the operation of the hardware, software and other parts of the C³ system would be useful in its design and specification, but this is not combat simulation. When is it necessary to encase the C³ simulation model in a combat model which has enough detail to be considered a simulation itself? How should this outer combat model be scoped out as to the components needed? In order to build a background for answering these questions, a two-pronged approach is taken. First, a framework for C³ modeling is developed, in which the various types of modeling that can be done to include or encase C³ in a combat model are organized. This framework should be useful in describing the particular assumptions made in specific models in terms of what could be done in a more general way. Then a few specific models are described, concentrating on the C³ portion of the simulations, or what could be interpreted as the C³ assumptions.

  15. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    NASA Astrophysics Data System (ADS)

    Rider, W. J.

    2012-12-01

    The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was CSAU (code scaling, applicability and uncertainty) [Boyack], used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are incomplete and need to be extended, incorporating elements from one another as well as new elements related to how models are solved and how the model will be applied. I will describe this merged approach and how it should be applied. The problems in adoption are related to basic human nature: no one likes to be graded or told they are not sufficiently quality-oriented. Rather than being used in an adversarial role, I suggest the frameworks be viewed as collaborative tools, used to structure collaborations that assist modeling and simulation efforts in achieving high quality. The framework provides a comprehensive setting of modeling and simulation themes that should be explored in providing high quality. W. Oberkampf, M. Pilch, and T. Trucano, Predictive Capability Maturity Model for Computational Modeling and Simulation, SAND2007-5948, 2007. B. Boyack, Quantifying Reactor Safety Margins Part 1: An Overview of the Code Scaling, Applicability, and Uncertainty Evaluation Methodology, Nuc. Eng. Design, 119, pp. 1-15, 1990. National Aeronautics and Space Administration, Standard for Models and Simulations, NASA-STD-7009, 2008. Y. Ben-Haim and F. Hemez, Robustness, fidelity and prediction-looseness of models, Proc. R. Soc. A (2012) 468, 227-244.

  16. An artificial intelligence framework for compensating transgressions and its application to diet management.

    PubMed

    Anselma, Luca; Mazzei, Alessandro; De Michieli, Franco

    2017-04-01

    Today, there is considerable interest in personal healthcare. The pervasiveness of technology allows human behavior to be tracked precisely; however, when developing an intelligent assistant that exploits data acquired through such technologies, a critical issue has to be taken into account, namely that of supporting the user in the event of any transgression with respect to the optimal behavior. In this paper we present a reasoning framework based on Simple Temporal Problems that can be applied to a general class of problems, which we call cake&carrot problems, to support reasoning in the presence of human transgressions. The reasoning framework offers a number of facilities to ensure smart management of possible "wrong behaviors" by a user in reaching the goals defined by the problem. This paper describes the framework by means of the prototypical use case of the diet domain. Indeed, following a healthy diet can be a difficult task for both practical and psychological reasons, and dietary transgressions are hard to avoid. Therefore, the framework is tolerant of dietary transgressions and adapts the following meals to help users recover from such transgressions. Finally, through a simulation involving a real hospital menu, we show that the framework can effectively achieve good results in a realistic scenario. Copyright © 2017 Elsevier Inc. All rights reserved.
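
    The consistency core of a Simple Temporal Problem fits in a few lines: encode each bound on t_j - t_i as an edge of a distance graph and run Floyd-Warshall; a negative cycle means the schedule is infeasible. This is the textbook STP machinery only, not the paper's transgression-handling framework, and the meal-timing numbers are invented.

```python
# Minimal STP consistency check via Floyd-Warshall on the distance graph.
import itertools

INF = float("inf")

def stp_consistent(n, constraints):
    """constraints: dict (i, j) -> upper bound on t_j - t_i (minutes)."""
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for (i, j), ub in constraints.items():
        d[i][j] = min(d[i][j], ub)
    for k, i, j in itertools.product(range(n), repeat=3):  # k is outermost
        if d[i][k] + d[k][j] < d[i][j]:
            d[i][j] = d[i][k] + d[k][j]
    return all(d[i][i] >= 0 for i in range(n))  # negative cycle = infeasible

# Events: 0 = lunch, 1 = snack, 2 = dinner.
# 60 <= snack - lunch <= 240, 120 <= dinner - snack <= 300,
# dinner - lunch <= 360 (a lower bound a <= t_j - t_i becomes edge (j,i): -a).
c = {(0, 1): 240, (1, 0): -60, (1, 2): 300, (2, 1): -120, (0, 2): 360}
print(stp_consistent(3, c))  # True: a feasible meal schedule exists
```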

  17. Clinical simulation practise framework.

    PubMed

    Khalili, Hossein

    2015-02-01

    Historically, simulation has mainly been used to teach students hands-on skills in a relatively safe environment. With changes in the patient population, professional regulations and clinical environments, clinical simulation practise (CSP) must assist students to integrate and apply their theoretical knowledge and skills, together with their critical thinking, clinical judgement, prioritisation, problem solving, decision making, and teamwork skills, to provide holistic care and treatment to their patients. CSP holds great potential to drive a positive transformation in students' transition into the workplace by connecting and consolidating learning from classrooms to clinical settings, and by creating bridges between theory and practice. For CSP to be successful in filling the gap, the design and management of the simulation is crucial. In this article a new framework called 'Clinical simulation practise framework: A knowledge to action strategy in health professional education' is introduced, which aims to assist educators and curriculum developers in designing and managing their simulations. This CSP framework theorises that simulation, as an experiential educational tool, could improve students' competence, confidence and collaboration in performing professional practice in real settings if the CSP provides the following three dimensions: (1) a safe, positive, reflective and fun simulated learning environment; (2) challenging, but realistic, integrated simulated scenarios; and (3) interactive, inclusive, interprofessional patient-centred simulated practise. © 2015 John Wiley & Sons Ltd.

  18. Exploring the implication of climate process uncertainties within the Earth System Framework

    NASA Astrophysics Data System (ADS)

    Booth, B.; Lambert, F. H.; McNeal, D.; Harris, G.; Sexton, D.; Boulton, C.; Murphy, J.

    2011-12-01

    Uncertainties in the magnitude of future climate change have been a focus of a great deal of research. Much of the work with general circulation models has focused on the atmospheric response to changes in atmospheric composition, while other processes remain outside these frameworks. Here we introduce an ensemble of new simulations, based on an Earth System configuration of HadCM3C, designed to explore uncertainties in both physical (atmospheric, oceanic and aerosol physics) and carbon cycle processes, using perturbed-parameter approaches previously used to explore atmospheric uncertainty. Framed in the context of the climate response to future changes in emissions, the resultant future projections represent significantly broader uncertainty than existing concentration-driven GCM assessments. The systematic nature of the ensemble design enables interactions between components to be explored. For example, we show how metrics of physical processes (such as climate sensitivity) are also influenced by carbon cycle parameters. This work suggests that carbon cycle processes represent a contribution to uncertainty in future climate projections comparable to that of the more conventionally explored atmospheric feedbacks. The broad range of climate responses explored within these ensembles, rather than representing a reason for inaction, provides information on lower-likelihood but high-impact changes. For example, while the majority of these simulations suggest that future Amazon forest extent is resilient to the projected climate changes, a small number simulate dramatic forest dieback. This ensemble represents a framework to examine these risks, breaking them down into physical processes (such as ocean temperature drivers of rainfall change) and vegetation processes (where uncertainties point towards requirements for new observational constraints).

  19. Quantifying the effect of hydrogen on dislocation dynamics: A three-dimensional discrete dislocation dynamics framework

    NASA Astrophysics Data System (ADS)

    Gu, Yejun; El-Awady, Jaafar A.

    2018-03-01

    We present a new framework to quantify the effect of hydrogen on dislocations using large-scale three-dimensional (3D) discrete dislocation dynamics (DDD) simulations. In this model, the first-order elastic interaction energy associated with the hydrogen-induced volume change is accounted for. The three-dimensional stress tensor induced by the hydrogen concentration, which is in equilibrium with respect to the dislocation stress field, is derived using the Eshelby inclusion model, while hydrogen bulk diffusion is treated as a continuum process. This newly developed framework is utilized to quantify the effect of different hydrogen concentrations on the dynamics of a glide dislocation in the absence of an applied stress field, as well as on the spacing between dislocations in an array of parallel edge dislocations. A shielding effect is observed for materials having a large hydrogen diffusion coefficient: the shielding homogenizes the shrinkage process so that the glide loop maintains its circular shape, and it also decreases the dislocation separation distances in the array of parallel edge dislocations. On the other hand, for materials having a small hydrogen diffusion coefficient, the high hydrogen concentrations around the edge characters of the dislocations act to pin them, and higher stresses are then required to unpin the dislocations from the hydrogen clouds surrounding them. Finally, this new framework can open the door for further large-scale studies on the effect of hydrogen on the different aspects of dislocation-mediated plasticity in metals. With minor modifications of the current formulations, the framework can also be extended to account for general inclusion-induced stress fields in discrete dislocation dynamics simulations.

  20. Sequential-Optimization-Based Framework for Robust Modeling and Design of Heterogeneous Catalytic Systems

    DOE PAGES

    Rangarajan, Srinivas; Maravelias, Christos T.; Mavrikakis, Manos

    2017-11-09

    Here, we present a general optimization-based framework for (i) ab initio and experimental data driven mechanistic modeling and (ii) optimal catalyst design of heterogeneous catalytic systems. Both cases are formulated as a nonlinear optimization problem that is subject to a mean-field microkinetic model and thermodynamic consistency requirements as constraints, for which we seek sparse solutions through a ridge (L2 regularization) penalty. The solution procedure involves an iterative sequence of forward simulation of the differential algebraic equations pertaining to the microkinetic model using a numerical tool capable of handling stiff systems, sensitivity calculations using linear algebra, and gradient-based nonlinear optimization. A multistart approach is used to explore the solution space, and a hierarchical clustering procedure is implemented for statistically classifying potentially competing solutions. An example of methanol synthesis through hydrogenation of CO and CO2 on a Cu-based catalyst is used to illustrate the framework. The framework is fast, is robust, and can be used to comprehensively explore the model solution and design space of any heterogeneous catalytic system.
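
    A hedged sketch of the inner loop such a framework implies: fit rate constants of a toy stiff mechanism (A -> B -> C, all numbers invented) by forward simulation with a stiff integrator plus a ridge-penalized objective. The paper's actual microkinetic models, constraints, and multistart procedure are far richer.

```python
# Ridge-penalized kinetic parameter estimation with a stiff ODE solver.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

def rhs(t, y, k1, k2):
    a, b, c = y
    return [-k1 * a, k1 * a - k2 * b, k2 * b]

def forward(theta, t_eval):
    k1, k2 = np.exp(theta)                     # log-parameters keep rates > 0
    sol = solve_ivp(rhs, (0, t_eval[-1]), [1.0, 0.0, 0.0],
                    t_eval=t_eval, args=(k1, k2), method="BDF")  # stiff-safe
    return sol.y

t_obs = np.linspace(0, 5, 20)
rng = np.random.default_rng(1)
y_obs = forward(np.log([4.0, 0.5]), t_obs) + 0.01 * rng.normal(size=(3, 20))

lam = 1e-3                                     # ridge (L2) penalty weight
def objective(theta):
    resid = forward(theta, t_obs) - y_obs
    return np.sum(resid ** 2) + lam * np.sum(theta ** 2)

fit = minimize(objective, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
print("estimated rates:", np.exp(fit.x))       # close to [4.0, 0.5]
```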

  1. A Robust State Estimation Framework Considering Measurement Correlations and Imperfect Synchronization

    DOE PAGES

    Zhao, Junbo; Wang, Shaobu; Mili, Lamine; ...

    2018-01-08

    Here, this paper develops a robust power system state estimation framework with consideration of measurement correlations and imperfect synchronization. In the framework, correlations of SCADA and phasor measurement unit (PMU) measurements are calculated separately through the unscented transformation and a vector auto-regression (VAR) model. In particular, PMU measurements during the waiting period between two SCADA measurement scans are buffered to develop the VAR model, whose parameters are robustly estimated using the projection statistics approach. The latter takes into account the temporal and spatial correlations of PMU measurements and provides redundant measurements to suppress bad data and mitigate imperfect synchronization. In cases where the SCADA and PMU measurements are not time-synchronized, either the forecasted PMU measurements or the prior SCADA measurements from the last estimation run are leveraged to restore system observability. Then, a robust generalized maximum-likelihood (GM) estimator is extended to integrate measurement error correlations and to handle outliers in the SCADA and PMU measurements. Simulation results that stem from a comprehensive comparison with other alternatives under various conditions demonstrate the benefits of the proposed framework.
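
    The GM-estimation step can be illustrated with a generic robust linear sketch: iteratively reweighted least squares with Huber weights on standardized residuals. This is a stand-in for the paper's projection-statistics-based estimator; the measurement model and all numbers are invented.

```python
# Robust (GM-type) estimation for z = H x + e via Huber-weighted IRLS.
import numpy as np

def huber_weights(r, delta=1.5):
    w = np.ones_like(r)
    big = np.abs(r) > delta
    w[big] = delta / np.abs(r[big])     # downweight large residuals
    return w

def gm_estimate(H, z, sigma, iters=20):
    x = np.linalg.lstsq(H, z, rcond=None)[0]       # ordinary LS start
    for _ in range(iters):
        r = (z - H @ x) / sigma                    # standardized residuals
        W = np.diag(huber_weights(r) / sigma ** 2)
        x = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
    return x

rng = np.random.default_rng(0)
x_true = np.array([1.0, -2.0])
H = rng.normal(size=(30, 2))
sigma = 0.1 * np.ones(30)
z = H @ x_true + sigma * rng.normal(size=30)
z[3] += 5.0                                        # one gross bad datum
print(gm_estimate(H, z, sigma))                    # stays close to [1, -2]
```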

  2. Sequential state estimation of nonlinear/non-Gaussian systems with stochastic input for turbine degradation estimation

    NASA Astrophysics Data System (ADS)

    Hanachi, Houman; Liu, Jie; Banerjee, Avisekh; Chen, Ying

    2016-05-01

    Health state estimation of inaccessible components in complex systems necessitates effective state estimation techniques using the observable variables of the system. The task becomes much more complicated when the system is nonlinear/non-Gaussian and receives stochastic input. In this work, a novel sequential state estimation framework is developed, based on a particle filtering (PF) scheme, for state estimation of a general class of nonlinear dynamical systems with stochastic input. Performance of the developed framework is first validated by simulation on a bivariate non-stationary growth model (BNGM) as a benchmark. In the next step, three years of operating data from an industrial gas turbine engine (GTE) are utilized to verify the effectiveness of the developed framework. A comprehensive thermodynamic model of the GTE is therefore developed to formulate the relation between the observable parameters and the dominant degradation symptoms of the turbine, namely, loss of isentropic efficiency and increase of the mass flow. The results confirm the effectiveness of the developed framework for simultaneous estimation of multiple degradation symptoms in complex systems with noisy measured inputs.
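
    A minimal bootstrap particle filter conveys the PF scheme. The univariate nonstationary growth model below is the classic stand-in for the bivariate benchmark used in the paper, and all tuning values are illustrative.

```python
# Bootstrap particle filter on the univariate nonstationary growth model:
# x_k = x/2 + 25x/(1+x^2) + 8 cos(1.2k) + w,  y_k = x^2/20 + v.
import numpy as np

rng = np.random.default_rng(42)
N, T = 1000, 50
q, r = np.sqrt(10.0), 1.0                 # process / measurement noise std

def f(x, k):
    return x / 2 + 25 * x / (1 + x ** 2) + 8 * np.cos(1.2 * k)

# Ground-truth trajectory and noisy observations.
x, xs, ys = 0.1, [], []
for k in range(T):
    x = f(x, k) + q * rng.normal()
    xs.append(x); ys.append(x ** 2 / 20 + r * rng.normal())

# Filter: propagate, weight by likelihood, resample (multinomial).
p = rng.normal(0, 2, N)
est = []
for k in range(T):
    p = f(p, k) + q * rng.normal(size=N)
    w = np.exp(-0.5 * ((ys[k] - p ** 2 / 20) / r) ** 2)
    w /= w.sum()
    est.append(np.sum(w * p))             # posterior mean (rough summary:
    p = p[rng.choice(N, N, p=w)]          # the squared observation makes
                                          # the posterior bimodal)
print("RMSE:", np.sqrt(np.mean((np.array(est) - np.array(xs)) ** 2)))
```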

  3. A Robust State Estimation Framework Considering Measurement Correlations and Imperfect Synchronization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Junbo; Wang, Shaobu; Mili, Lamine

    Here, this paper develops a robust power system state estimation framework with consideration of measurement correlations and imperfect synchronization. In the framework, correlations of SCADA and phasor measurement unit (PMU) measurements are calculated separately through the unscented transformation and a vector auto-regression (VAR) model. In particular, PMU measurements during the waiting period between two SCADA measurement scans are buffered to develop the VAR model, whose parameters are robustly estimated using the projection statistics approach. The latter takes into account the temporal and spatial correlations of PMU measurements and provides redundant measurements to suppress bad data and mitigate imperfect synchronization. In cases where the SCADA and PMU measurements are not time-synchronized, either the forecasted PMU measurements or the prior SCADA measurements from the last estimation run are leveraged to restore system observability. Then, a robust generalized maximum-likelihood (GM) estimator is extended to integrate measurement error correlations and to handle outliers in the SCADA and PMU measurements. Simulation results that stem from a comprehensive comparison with other alternatives under various conditions demonstrate the benefits of the proposed framework.

  4. Sequential-Optimization-Based Framework for Robust Modeling and Design of Heterogeneous Catalytic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rangarajan, Srinivas; Maravelias, Christos T.; Mavrikakis, Manos

    Here, we present a general optimization-based framework for (i) ab initio and experimental data driven mechanistic modeling and (ii) optimal catalyst design of heterogeneous catalytic systems. Both cases are formulated as a nonlinear optimization problem that is subject to a mean-field microkinetic model and thermodynamic consistency requirements as constraints, for which we seek sparse solutions through a ridge (L2 regularization) penalty. The solution procedure involves an iterative sequence of forward simulation of the differential algebraic equations pertaining to the microkinetic model using a numerical tool capable of handling stiff systems, sensitivity calculations using linear algebra, and gradient-based nonlinear optimization. A multistart approach is used to explore the solution space, and a hierarchical clustering procedure is implemented for statistically classifying potentially competing solutions. An example of methanol synthesis through hydrogenation of CO and CO2 on a Cu-based catalyst is used to illustrate the framework. The framework is fast, is robust, and can be used to comprehensively explore the model solution and design space of any heterogeneous catalytic system.

  5. A Framework for Simulating Turbine-Based Combined-Cycle Inlet Mode-Transition

    NASA Technical Reports Server (NTRS)

    Le, Dzu K.; Vrnak, Daniel R.; Slater, John W.; Hessel, Emil O.

    2012-01-01

    A simulation framework based on the memory-mapped files (MMF) technique was created to operate multiple numerical processes in locked time-steps and send I/O data synchronously to one another to simulate system dynamics. This simulation scheme is currently used to study the complex interactions between inlet flow dynamics, variable-geometry actuation mechanisms, and flow controls in the transition from supersonic to hypersonic conditions and vice versa. A study of mode-transition control for a high-speed inlet wind-tunnel model with this MMF-based framework is presented to illustrate the scheme and demonstrate its usefulness in simulating supersonic and hypersonic inlet dynamics and controls, as well as other types of complex systems.
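
    The memory-mapped-file idea can be sketched in a few lines: two processes share one fixed-layout file, a producer advances a step counter, and a consumer spins until the step it needs is ready. The file name, layout, and one-way handshake are invented for this sketch; a real lock-step scheme would synchronize in both directions.

```python
# Toy memory-mapped-file exchange between two lock-stepped processes.
import mmap, os, struct, time, multiprocessing as mp

FILE = "exchange.bin"
SLOT = struct.calcsize("qd")          # int64 step counter + float64 payload

def worker(role):
    with open(FILE, "r+b") as fh:
        buf = mmap.mmap(fh.fileno(), SLOT)
        for step in range(1, 6):
            if role == "producer":
                buf.seek(0)
                buf.write(struct.pack("qd", step, step * 1.5))
                time.sleep(0.01)      # let the consumer catch up (sketch only)
            else:
                while struct.unpack_from("qd", buf, 0)[0] < step:
                    time.sleep(0.001)
                s, v = struct.unpack_from("qd", buf, 0)
                print(f"consumer saw step {s}, value {v}")

if __name__ == "__main__":
    with open(FILE, "wb") as fh:
        fh.write(b"\0" * SLOT)        # pre-size the shared file
    procs = [mp.Process(target=worker, args=(r,))
             for r in ("producer", "consumer")]
    for p in procs: p.start()
    for p in procs: p.join()
    os.remove(FILE)
```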

  6. Towards a Theoretical Framework for Educational Simulations.

    ERIC Educational Resources Information Center

    Winer, Laura R.; Vazquez-Abad, Jesus

    1981-01-01

    Discusses the need for a sustained and systematic effort toward establishing a theoretical framework for educational simulations, proposes the adaptation of models borrowed from the natural and applied sciences, and describes three simulations based on such a model adapted using Brunerian learning theory. Sixteen references are listed. (LLS)

  7. A Monte Carlo simulation framework for electron beam dose calculations using Varian phase space files for TrueBeam Linacs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodrigues, Anna; Yin, Fang-Fang; Wu, Qiuwen, E-mail: Qiuwen.Wu@Duke.edu

    2015-05-15

    Purpose: To develop a framework for accurate electron Monte Carlo dose calculation. In this study, comprehensive validations of vendor provided electron beam phase space files for Varian TrueBeam Linacs against measurement data are presented. Methods: In this framework, the Monte Carlo generated phase space files were provided by the vendor and used as input to the downstream plan-specific simulations including jaws, electron applicators, and water phantom computed in the EGSnrc environment. The phase space files were generated based on open field commissioning data. A subset of electron energies of 6, 9, 12, 16, and 20 MeV and open and collimated field sizes 3 × 3, 4 × 4, 5 × 5, 6 × 6, 10 × 10, 15 × 15, 20 × 20, and 25 × 25 cm² were evaluated. Measurements acquired with a CC13 cylindrical ionization chamber and electron diode detector and simulations from this framework were compared for a water phantom geometry. The evaluation metrics include percent depth dose, orthogonal and diagonal profiles at depths R100, R50, Rp, and Rp+ for standard and extended source-to-surface distances (SSD), as well as cone and cut-out output factors. Results: Agreement for the percent depth dose and orthogonal profiles between measurement and Monte Carlo was generally within 2% or 1 mm. The largest discrepancies were observed within depths of 5 mm from the phantom surface. Differences in field size, penumbra, and flatness for the orthogonal profiles at depths R100, R50, and Rp were within 1 mm, 1 mm, and 2%, respectively. Orthogonal profiles at SSDs of 100 and 120 cm showed the same level of agreement. Cone and cut-out output factors agreed well with maximum differences within 2.5% for 6 MeV and 1% for all other energies. Cone output factors at extended SSDs of 105, 110, 115, and 120 cm exhibited similar levels of agreement. Conclusions: We have presented a Monte Carlo simulation framework for electron beam dose calculations for Varian TrueBeam Linacs. Electron beam energies of 6 to 20 MeV for open and collimated field sizes from 3 × 3 to 25 × 25 cm² were studied and results were compared to the measurement data with excellent agreement. Application of this framework can thus be used as the platform for treatment planning of dynamic electron arc radiotherapy and other advanced dynamic techniques with electron beams.

  8. A Monte Carlo simulation framework for electron beam dose calculations using Varian phase space files for TrueBeam Linacs.

    PubMed

    Rodrigues, Anna; Sawkey, Daren; Yin, Fang-Fang; Wu, Qiuwen

    2015-05-01

    To develop a framework for accurate electron Monte Carlo dose calculation. In this study, comprehensive validations of vendor provided electron beam phase space files for Varian TrueBeam Linacs against measurement data are presented. In this framework, the Monte Carlo generated phase space files were provided by the vendor and used as input to the downstream plan-specific simulations including jaws, electron applicators, and water phantom computed in the EGSnrc environment. The phase space files were generated based on open field commissioning data. A subset of electron energies of 6, 9, 12, 16, and 20 MeV and open and collimated field sizes 3 × 3, 4 × 4, 5 × 5, 6 × 6, 10 × 10, 15 × 15, 20 × 20, and 25 × 25 cm² were evaluated. Measurements acquired with a CC13 cylindrical ionization chamber and electron diode detector and simulations from this framework were compared for a water phantom geometry. The evaluation metrics include percent depth dose, orthogonal and diagonal profiles at depths R100, R50, Rp, and Rp+ for standard and extended source-to-surface distances (SSD), as well as cone and cut-out output factors. Agreement for the percent depth dose and orthogonal profiles between measurement and Monte Carlo was generally within 2% or 1 mm. The largest discrepancies were observed within depths of 5 mm from the phantom surface. Differences in field size, penumbra, and flatness for the orthogonal profiles at depths R100, R50, and Rp were within 1 mm, 1 mm, and 2%, respectively. Orthogonal profiles at SSDs of 100 and 120 cm showed the same level of agreement. Cone and cut-out output factors agreed well with maximum differences within 2.5% for 6 MeV and 1% for all other energies. Cone output factors at extended SSDs of 105, 110, 115, and 120 cm exhibited similar levels of agreement. We have presented a Monte Carlo simulation framework for electron beam dose calculations for Varian TrueBeam Linacs. Electron beam energies of 6 to 20 MeV for open and collimated field sizes from 3 × 3 to 25 × 25 cm² were studied and results were compared to the measurement data with excellent agreement. Application of this framework can thus be used as the platform for treatment planning of dynamic electron arc radiotherapy and other advanced dynamic techniques with electron beams.

  9. Competency-Based Training and Simulation: Making a "Valid" Argument.

    PubMed

    Noureldin, Yasser A; Lee, Jason Y; McDougall, Elspeth M; Sweet, Robert M

    2018-02-01

    The use of simulation as an assessment tool is much more controversial than is its utility as an educational tool. However, without valid simulation-based assessment tools, the ability to objectively assess technical skill competencies in a competency-based medical education framework will remain challenging. The current literature in urologic simulation-based training and assessment uses a definition and framework of validity that is now outdated. This is probably due to the absence of awareness rather than an absence of comprehension. The following review article provides the urologic community an updated taxonomy on validity theory as it relates to simulation-based training and assessments and translates our simulation literature to date into this framework. While the old taxonomy considered validity as distinct subcategories and focused on the simulator itself, the modern taxonomy, for which we translate the literature evidence, considers validity as a unitary construct with a focus on interpretation of simulator data/scores.

  10. Effect of open metal sites on adsorption of polar and nonpolar molecules in metal-organic framework Cu-BTC.

    PubMed

    Karra, Jagadeswara R; Walton, Krista S

    2008-08-19

    Atomistic grand canonical Monte Carlo simulations were performed in this work to investigate the role of open copper sites of Cu-BTC in affecting the separation of carbon monoxide from binary mixtures containing methane, nitrogen, or hydrogen. Mixtures containing 5%, 50%, or 95% CO were examined. The simulations show that electrostatic interactions between the CO dipole and the partial charges on the metal-organic framework (MOF) atoms dominate the adsorption mechanism. The binary simulations show that Cu-BTC is quite selective for CO over hydrogen and nitrogen for all three mixture compositions at 298 K. The removal of CO from a 5% mixture with methane is slightly enhanced by the electrostatic interactions of CO with the copper sites. However, the pore space of Cu-BTC is large enough to accommodate both molecules at their pure-component loadings, and in general, Cu-BTC exhibits no significant selectivity for CO over methane for the equimolar and 95% mixtures. On the basis of the pure-component and low-concentration behavior of CO, the results indicate that MOFs with open metal sites have the potential for enhancing adsorption separations of molecules of differing polarities, but the pore size relative to the sorbate size will also play a significant role.
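
    For context, the textbook grand canonical Monte Carlo acceptance probabilities for trial insertion and deletion (standard forms from the simulation literature, not specific to this Cu-BTC study) are:

```latex
P_{\mathrm{ins}} = \min\left[1,\; \frac{V}{\Lambda^{3}(N+1)}\,
                   e^{\beta(\mu - \Delta U)}\right],
\qquad
P_{\mathrm{del}} = \min\left[1,\; \frac{\Lambda^{3} N}{V}\,
                   e^{-\beta(\mu + \Delta U)}\right]
```

    where N is the number of adsorbate molecules, V the simulation volume, Λ the thermal de Broglie wavelength, μ the imposed chemical potential, β = 1/k_BT, and ΔU the potential-energy change of the trial move; the electrostatic sorbate-framework interactions discussed above enter through ΔU.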

  11. dsmcFoam+: An OpenFOAM based direct simulation Monte Carlo solver

    NASA Astrophysics Data System (ADS)

    White, C.; Borg, M. K.; Scanlon, T. J.; Longshaw, S. M.; John, B.; Emerson, D. R.; Reese, J. M.

    2018-03-01

    dsmcFoam+ is a direct simulation Monte Carlo (DSMC) solver for rarefied gas dynamics, implemented within the OpenFOAM software framework and parallelised with MPI. It is open-source and released under the GNU General Public License in a publicly available software repository that includes detailed documentation and tutorial DSMC gas flow cases. This release of the code includes many features not found in standard dsmcFoam, such as molecular vibrational and electronic energy modes, chemical reactions, and subsonic pressure boundary conditions. Since dsmcFoam+ is designed entirely within OpenFOAM's C++ object-oriented framework, it benefits from a number of key features: the code emphasises extensibility and flexibility, so it is intended first and foremost as a research tool for DSMC, allowing new models and test cases to be developed and tested rapidly. Setting up a DSMC case is as straightforward as setting up any standard OpenFOAM case, as dsmcFoam+ relies upon the standard OpenFOAM dictionary-based directory structure. This ensures that useful pre- and post-processing capabilities provided by OpenFOAM remain available even though the fully Lagrangian nature of a DSMC simulation is not typical of most OpenFOAM applications. We show that dsmcFoam+ compares well to other well-known DSMC codes and to analytical solutions in terms of benchmark results.

  12. Developing and Implementing a Framework of Participatory Simulation for Mobile Learning Using Scaffolding

    ERIC Educational Resources Information Center

    Yin, Chengjiu; Song, Yanjie; Tabata, Yoshiyuki; Ogata, Hiroaki; Hwang, Gwo-Jen

    2013-01-01

    This paper proposes a conceptual framework, scaffolding participatory simulation for mobile learning (SPSML), used on mobile devices for helping students learn conceptual knowledge in the classroom. As the pedagogical design, the framework adopts an experiential learning model, which consists of five sequential but cyclic steps: the initial stage,…

  13. Physically Based Modeling and Simulation with Dynamic Spherical Volumetric Simplex Splines

    PubMed Central

    Tan, Yunhao; Hua, Jing; Qin, Hong

    2009-01-01

    In this paper, we present a novel computational modeling and simulation framework based on dynamic spherical volumetric simplex splines. The framework can handle the modeling and simulation of genus-zero objects with real physical properties. In this framework, we first develop an accurate and efficient algorithm to reconstruct the high-fidelity digital model of a real-world object with spherical volumetric simplex splines, which can simultaneously represent the geometric, material, and other properties of the object with accuracy. With the tight coupling of Lagrangian mechanics, the dynamic volumetric simplex splines representing the object can accurately simulate its physical behavior, because they unify the geometric and material properties in the simulation. The visualization can be computed directly from the object's geometric or physical representation based on the dynamic spherical volumetric simplex splines during simulation, without interpolation or resampling. We have applied the framework to the biomechanical simulation of brain deformations, such as brain shift during surgery and brain injury under blunt impact. We have compared our simulation results with the ground truth obtained through intra-operative magnetic resonance imaging and with real biomechanical experiments. The evaluations demonstrate the excellent performance of our new technique. PMID:20161636

  14. Verifiable fault tolerance in measurement-based quantum computation

    NASA Astrophysics Data System (ADS)

    Fujii, Keisuke; Hayashi, Masahito

    2017-09-01

    Quantum systems, in general, cannot be simulated efficiently by a classical computer, and hence are useful for solving certain mathematical problems and simulating quantum many-body systems. This also implies, unfortunately, that verification of the output of the quantum systems is not so trivial, since predicting the output is exponentially hard. As another problem, quantum systems are very sensitive to noise and thus need error correction. Here, we propose a framework for verification of the output of fault-tolerant quantum computation in a measurement-based model. In contrast to existing analyses of fault tolerance, we do not assume any noise model on the resource state; instead, an arbitrary resource state is tested using only single-qubit measurements to verify whether or not the output of measurement-based quantum computation on it is correct. Verifiability is achieved by constant-time repetition of the original measurement-based quantum computation in appropriate measurement bases. Since full characterization of quantum noise is exponentially hard for large-scale quantum computing systems, our framework provides an efficient way to practically verify the experimental quantum error correction.

  15. An Algorithm for Interactive Modeling of Space-Transportation Engine Simulations: A Constraint Satisfaction Approach

    NASA Technical Reports Server (NTRS)

    Mitra, Debasis; Thomas, Ajai; Hemminger, Joseph; Sakowski, Barbara

    2001-01-01

    In this research we have developed an algorithm for constraint processing that utilizes relational algebraic operators. Van Beek and others have previously investigated this type of constraint processing within a relational algebraic framework, producing some unique results. Apart from providing new theoretical angles, this approach also gives the opportunity to use existing efficient implementations of relational database management systems as the underlying data structures for any relevant algorithm. Our algorithm enhances that framework. The algorithm is quite general in its current form, and weak heuristics (like forward checking) developed within the constraint-satisfaction problem (CSP) area could also be plugged easily into this algorithm for further efficiency enhancements. The algorithm as developed here is targeted toward a component-oriented modeling problem that we are currently working on, namely, the problem of interactive modeling for batch simulation of engineering systems (IMBSES). However, it could be adapted to many other CSP problems as well. The research addresses the algorithm and many aspects of the IMBSES problem that we are currently handling.
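
    The relational flavor of such constraint processing can be shown with plain set operations: constraints are relations, a natural join combines them, and projections push the restriction back onto each constraint. This is a sketch of the general idea, not the paper's algorithm; the variables and constraints are invented.

```python
# Constraints as relations: x, y, z over {1,2,3}; x < y and y < z.
Rxy = {(x, y) for x in range(1, 4) for y in range(1, 4) if x < y}
Ryz = {(y, z) for y in range(1, 4) for z in range(1, 4) if y < z}

# Natural join on the shared variable y, then project back onto each
# constraint's scheme to propagate the restriction (a semijoin reduction).
join = {(x, y, z) for (x, y) in Rxy for (y2, z) in Ryz if y == y2}
Rxy_reduced = {(x, y) for (x, y, z) in join}
Ryz_reduced = {(y, z) for (x, y, z) in join}

print(sorted(join))          # [(1, 2, 3)] -- the only consistent assignment
print(sorted(Rxy_reduced), sorted(Ryz_reduced))
```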

  16. Multiscale Simulation Framework for Coupled Fluid Flow and Mechanical Deformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Thomas; Efendiev, Yalchin; Tchelepi, Hamdi

    2016-05-24

    Our work in this project is aimed at making fundamental advances in multiscale methods for flow and transport in highly heterogeneous porous media. The main thrust of this research is to develop a systematic multiscale analysis and efficient coarse-scale models that can capture global effects and extend existing multiscale approaches to problems with additional physics and uncertainties. A key emphasis is on problems without an apparent scale separation. Multiscale solution methods are currently under active investigation for the simulation of subsurface flow in heterogeneous formations. These procedures capture the effects of fine-scale permeability variations through the calculation of specialized coarse-scale basis functions. Most of the multiscale techniques presented to date employ localization approximations in the calculation of these basis functions. For some highly correlated (e.g., channelized) formations, however, global effects are important and these may need to be incorporated into the multiscale basis functions. Other challenging issues facing multiscale simulations are the extension of existing multiscale techniques to problems with additional physics, such as compressibility, capillary effects, etc. In our project, we explore the improvement of multiscale methods through the incorporation of additional (single-phase flow) information and the development of a general multiscale framework for flows in the presence of uncertainties, compressible flow and heterogeneous transport, and geomechanics. We have considered (1) adaptive local-global multiscale methods, (2) multiscale methods for the transport equation, (3) operator-based multiscale methods and solvers, (4) multiscale methods in the presence of uncertainties and applications, (5) multiscale finite element methods for high contrast porous media and their generalizations, and (6) multiscale methods for geomechanics.

  17. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis. Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2010-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II (POST2) simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures required to enable higher-mass robotic and human missions to Mars. The appendices to the original report are contained in this document.

  18. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis, Phase 2 Results

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2011-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL-Systems Analysis (SA) team, which is conducting studies of the technologies and architectures required to enable human and higher-mass robotic missions to Mars. The findings, observations, and recommendations from the NESC are provided in this report.

  19. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis. Volume 1

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2010-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II (POST2) simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures required to enable higher-mass robotic and human missions to Mars. The findings of the assessment are contained in this report.

  20. Comparison and Contrast of Two General Functional Regression Modeling Frameworks

    PubMed Central

    Morris, Jeffrey S.

    2017-01-01

    In this article, Greven and Scheipl describe an impressively general framework for performing functional regression that builds upon the generalized additive modeling framework. Over the past number of years, my collaborators and I have also been developing a general framework for functional regression, functional mixed models, which shares many similarities with this framework but has many differences as well. In this discussion, I compare and contrast these two frameworks to hopefully illuminate characteristics of each, highlighting their respective strengths and weaknesses and providing recommendations regarding the settings in which each approach might be preferable. PMID:28736502

  1. Comparison and Contrast of Two General Functional Regression Modeling Frameworks.

    PubMed

    Morris, Jeffrey S

    2017-02-01

    In this article, Greven and Scheipl describe an impressively general framework for performing functional regression that builds upon the generalized additive modeling framework. Over the past number of years, my collaborators and I have also been developing a general framework for functional regression, functional mixed models, which shares many similarities with this framework but has many differences as well. In this discussion, I compare and contrast these two frameworks to hopefully illuminate characteristics of each, highlighting their respective strengths and weaknesses and providing recommendations regarding the settings in which each approach might be preferable.

  2. Hierarchical control and performance evaluation of multi-vehicle autonomous systems

    NASA Astrophysics Data System (ADS)

    Balakirsky, Stephen; Scrapper, Chris; Messina, Elena

    2005-05-01

    This paper will describe how the Mobility Open Architecture Tools and Simulation (MOAST) framework can facilitate performance evaluations of RCS-compliant multi-vehicle autonomous systems. This framework provides an environment that allows simulated and real architectural components to function seamlessly together. By providing repeatable environmental conditions, this framework allows for the development of individual components as well as component performance metrics. MOAST is composed of high-fidelity and low-fidelity simulation systems, a detailed model of real-world terrain, actual hardware components, a central knowledge repository, and architectural glue to tie all of the components together. This paper will describe the framework's components in detail and provide an example that illustrates how the framework can be utilized to develop and evaluate a single architectural component through the use of repeatable trials and experimentation that includes both virtual and real components functioning together.

  3. On the performance of voltage stepping for the simulation of adaptive, nonlinear integrate-and-fire neuronal networks.

    PubMed

    Kaabi, Mohamed Ghaith; Tonnelier, Arnaud; Martinez, Dominique

    2011-05-01

    In traditional event-driven strategies, spike timings are analytically given or calculated with arbitrary precision (up to machine precision). Exact computation is possible only for simplified neuron models, mainly the leaky integrate-and-fire model. In a recent paper, Zheng, Tonnelier, and Martinez (2009) introduced an approximate event-driven strategy, named voltage stepping, that allows the generic simulation of nonlinear spiking neurons. Promising results were achieved in the simulation of single quadratic integrate-and-fire neurons. Here, we assess the performance of voltage stepping in network simulations by considering more complex neurons (quadratic integrate-and-fire neurons with adaptation) coupled with multiple synapses. To handle the discrete nature of synaptic interactions, we recast voltage stepping in a general framework, the discrete event system specification. The efficiency of the method is assessed through simulations and comparisons with a modified time-stepping scheme of the Runge-Kutta type. We demonstrated numerically that the original order of voltage stepping is preserved when simulating connected spiking neurons, independent of the network activity and connectivity.
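
    The voltage-stepping idea itself is easy to sketch for a quadratic integrate-and-fire neuron with constant input: discretize the voltage axis and advance by the analytically known level-crossing times instead of fixed time-steps. Parameters below are invented, and the paper's method additionally handles adaptation and synaptic coupling.

```python
# Voltage stepping for dv/dt = v**2 + I with constant I > 0 between events.
import numpy as np

def crossing_time(v1, v2, I):
    """Exact time for v to grow from v1 to v2 under dv/dt = v^2 + I."""
    s = np.sqrt(I)
    return (np.arctan(v2 / s) - np.arctan(v1 / s)) / s

I, v_reset, v_spike, dv = 1.0, -1.0, 10.0, 0.5   # illustrative parameters
t, v, spikes = 0.0, -1.0, []
while t < 20.0:
    v_next = min(v + dv, v_spike)
    t += crossing_time(v, v_next, I)   # jump straight to the next level
    if v_next >= v_spike:              # threshold level reached: spike event
        spikes.append(t)
        v = v_reset
    else:
        v = v_next
print(f"{len(spikes)} spikes; period ~ {spikes[1] - spikes[0]:.3f}")
```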

  4. Coupling Visualization, Simulation, and Deep Learning for Ensemble Steering of Complex Energy Models: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potter, Kristin C; Brunhart-Lupo, Nicholas J; Bush, Brian W

    We have developed a framework for the exploration, design, and planning of energy systems that combines interactive visualization with machine-learning-based approximations of simulations through a general-purpose dataflow API. Our system provides a visual interface allowing users to explore an ensemble of energy simulations representing a subset of the complex input parameter space, and to spawn new simulations to 'fill in' input regions corresponding to new energy system scenarios. Unfortunately, many energy simulations are far too slow to provide interactive responses. To support interactive feedback, we are developing reduced-form models via machine learning techniques, which provide statistically sound estimates of the full simulations at a fraction of the computational cost and which are used as proxies for the full-form models. Fast computation and an agile dataflow enhance engagement with energy simulations, and allow researchers to better allocate computational resources to capture informative relationships within the system, providing a low-cost method for validating and quality-checking large-scale modeling efforts.
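
    A minimal sketch of the reduced-form-model idea follows, with a throwaway analytic function standing in for an expensive energy simulation and a random-forest regressor as the machine-learned proxy; the actual system's models and dataflow API are, of course, far more elaborate.

```python
# Train a cheap regressor on an ensemble of simulation runs, then use it
# as an interactive proxy for the slow model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def expensive_simulation(x):            # stand-in for a slow energy model
    return np.sin(3 * x[0]) + 0.5 * x[1] ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))   # ensemble of input scenarios
y = np.array([expensive_simulation(x) for x in X])

proxy = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

x_new = np.array([[0.3, -0.4]])         # interactive query: no new simulation
print("proxy:", proxy.predict(x_new)[0],
      "truth:", expensive_simulation(x_new[0]))
```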

  5. Concurrent generation of multivariate mixed data with variables of dissimilar types.

    PubMed

    Amatya, Anup; Demirtas, Hakan

    2016-01-01

    Data sets originating from a wide range of research studies are composed of multiple variables that are correlated and of dissimilar types, primarily count, binary/ordinal and continuous attributes. The present paper builds on previous work on multivariate data generation and develops a framework for generating multivariate mixed data with a pre-specified correlation matrix. The generated data consist of components that are marginally count, binary, ordinal and continuous, where the count and continuous variables follow the generalized Poisson and normal distributions, respectively. The use of the generalized Poisson distribution provides a flexible mechanism that allows the under- and over-dispersed count variables generally encountered in practice. A step-by-step algorithm is provided and its performance is evaluated using simulated and real-data scenarios.
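
    The flavor of such a generator can be sketched NORTA-style: draw correlated normals, map them to uniforms, and invert each target margin. scipy's ordinary Poisson stands in here for the generalized Poisson distribution used in the paper, and the correlation matrix is invented.

```python
# Correlated mixed-type data via a Gaussian copula.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
R = np.array([[1.0, 0.5, 0.3],                 # latent correlation target
              [0.5, 1.0, 0.4],
              [0.3, 0.4, 1.0]])
z = rng.multivariate_normal(np.zeros(3), R, size=5000)
u = stats.norm.cdf(z)                          # dependent uniforms

count  = stats.poisson.ppf(u[:, 0], mu=4)      # count margin
binary = (u[:, 1] > 0.7).astype(int)           # Bernoulli(0.3) margin
cont   = stats.norm.ppf(u[:, 2], loc=10, scale=2)  # continuous margin

data = np.column_stack([count, binary, cont])
print(np.corrcoef(data, rowvar=False).round(2))  # correlations survive
```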

  6. Synchronization of a Class of Switched Neural Networks with Time-Varying Delays via Nonlinear Feedback Control.

    PubMed

    Wang, Leimin; Shen, Yi; Zhang, Guodong

    2016-10-01

    This paper is concerned with the synchronization problem for a class of switched neural networks (SNNs) with time-varying delays. First, a crucial new lemma that includes and extends the classical exponential stability theorem is constructed. Then, using the lemma, new algebraic criteria for ψ-type synchronization (synchronization with a general decay rate) of SNNs are established via the designed nonlinear feedback control. ψ-type synchronization, which is formulated in a general framework, is obtained by introducing a ψ-type function; it contains exponential synchronization, polynomial synchronization, and other kinds of synchronization as special cases. The results of this paper are general, and they complement and extend some previous results. Finally, numerical simulations are carried out to demonstrate the effectiveness of the obtained results.
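
    One common formalization, stated here only for orientation (the paper's exact definition may differ in detail): the drive and response networks are ψ-type synchronized if the error e(t) satisfies

```latex
\|e(t)\| \le M \,\psi(t)^{-\varepsilon}, \qquad t \ge t_0 ,
```

    for some M > 0 and decay exponent ε > 0; choosing ψ(t) = e^t recovers exponential synchronization, while ψ(t) = 1 + t gives polynomial synchronization.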

  7. KMgene: a unified R package for gene-based association analysis for complex traits.

    PubMed

    Yan, Qi; Fang, Zhou; Chen, Wei; Stegle, Oliver

    2018-02-09

    In this report, we introduce the R package KMgene for performing gene-based association tests for familial, multivariate or longitudinal traits using kernel machine (KM) regression under a generalized linear mixed model (GLMM) framework. Extensive simulations were performed to evaluate the validity of the approaches implemented in KMgene. Availability: http://cran.r-project.org/web/packages/KMgene. Contact: qi.yan@chp.edu or wei.chen@chp.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2018. Published by Oxford University Press.

  8. The lz(p)* Person-Fit Statistic in an Unfolding Model Context.

    PubMed

    Tendeiro, Jorge N

    2017-01-01

    Although person-fit analysis has a long-standing tradition within item response theory, it has been applied in combination with dominance response models almost exclusively. In this article, a popular log likelihood-based parametric person-fit statistic under the framework of the generalized graded unfolding model is used. Results from a simulation study indicate that the person-fit statistic performed relatively well in detecting midpoint response style patterns and not so well in detecting extreme response style patterns.
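
    For orientation, the classical dichotomous-model ancestor of this statistic standardizes the response log-likelihood; the article's lz(p)* is a polytomous, unfolding-model analogue with a correction for the estimated trait level:

```latex
l_0 = \sum_{i} \bigl[\,u_i \ln P_i + (1-u_i)\ln(1-P_i)\,\bigr], \qquad
l_z = \frac{l_0 - \mathrm{E}(l_0)}{\sqrt{\mathrm{Var}(l_0)}} ,
```

    with E(l_0) = Σ_i [P_i ln P_i + (1−P_i) ln(1−P_i)] and Var(l_0) = Σ_i P_i(1−P_i)[ln(P_i/(1−P_i))]², where P_i is the model probability of a correct (or endorsed) response to item i; large negative values of l_z flag misfitting patterns such as response styles.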

  9. PharmML in Action: an Interoperable Language for Modeling and Simulation

    PubMed Central

    Bizzotto, R; Smith, G; Yvon, F; Kristensen, NR; Swat, MJ

    2017-01-01

    PharmML is an XML-based exchange format created with a focus on the nonlinear mixed-effect (NLME) models used in pharmacometrics, but providing a very general framework that also allows describing mathematical and statistical models such as single-subject or nonlinear and multivariate regression models. This tutorial provides an overview of the structure of this language, brief suggestions on how to work with it, and use cases demonstrating its power and flexibility. PMID:28575551

  10. Simulation Framework to Estimate the Performance of CO2 and O2 Sensing from Space and Airborne Platforms for the ASCENDS Mission Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Plitau, Denis; Prasad, Narasimha S.

    2012-01-01

    The Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) mission recommended by the NRC Decadal Survey has a desired accuracy of 0.3% in carbon dioxide mixing ratio (XCO2) retrievals, requiring careful selection and optimization of the instrument parameters. NASA Langley Research Center (LaRC) is investigating the 1.57 micron carbon dioxide band as well as the 1.26-1.27 micron oxygen bands for our proposed ASCENDS mission requirements investigation. Simulation studies are underway for these bands to select optimum instrument parameters. The simulations are based on a multi-wavelength lidar modeling framework being developed at NASA LaRC to predict the performance of CO2 and O2 sensing from space and airborne platforms. The modeling framework consists of a lidar simulation module and a line-by-line calculation component with interchangeable lineshape routines to test the performance of alternative lineshape models in the simulations. As an option, the line-by-line radiative transfer model (LBLRTM) program may also be used for line-by-line calculations. The modeling framework is being used to perform error analysis, establish optimum measurement wavelengths, and identify the best lineshape models to be used in CO2 and O2 retrievals. Several additional programs for HITRAN database management and related simulations are planned for inclusion in the framework. A description of the modeling framework, with selected results of the simulation studies for CO2 and O2 sensing, is presented in this paper.
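
    The core differential-absorption relation underlying such retrievals is easy to demonstrate numerically: the log-ratio of off-line to on-line returns gives the differential optical depth, from which the column number density follows. The cross sections and path length below are invented round numbers, not instrument values.

```python
# Differential absorption: n = ln(P_off / P_on) / (2 L (sigma_on - sigma_off)).
import numpy as np

sigma_on, sigma_off = 5.0e-27, 1.0e-27   # absorption cross sections, m^2
L = 8000.0                               # one-way path length, m
n_true = 1.0e22                          # CO2 number density, molecules/m^3

# Two-way transmission at each wavelength (Beer-Lambert law).
P_on  = np.exp(-2 * L * sigma_on  * n_true)
P_off = np.exp(-2 * L * sigma_off * n_true)

daod = np.log(P_off / P_on)              # differential absorption optical depth
n_est = daod / (2 * L * (sigma_on - sigma_off))
print(f"retrieved / true = {n_est / n_true:.6f}")  # 1.0 in this noise-free toy
```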

  11. Gathering Validity Evidence for Surgical Simulation: A Systematic Review.

    PubMed

    Borgersen, Nanna Jo; Naur, Therese M H; Sørensen, Stine M D; Bjerrum, Flemming; Konge, Lars; Subhi, Yousif; Thomsen, Ann Sofia S

    2018-06-01

    To identify current trends in the use of validity frameworks in surgical simulation, to provide an overview of the evidence behind the assessment of technical skills in all surgical specialties, and to present recommendations and guidelines for future validity studies. Validity evidence for assessment tools used in the evaluation of surgical performance is of paramount importance to ensure valid and reliable assessment of skills. We systematically reviewed the literature by searching 5 databases (PubMed, EMBASE, Web of Science, PsycINFO, and the Cochrane Library) for studies published from January 1, 2008, to July 10, 2017. We included original studies evaluating simulation-based assessments of health professionals in surgical specialties and extracted data on surgical specialty, simulator modality, participant characteristics, and the validity framework used. Data were synthesized qualitatively. We identified 498 studies with a total of 18,312 participants. Publications involving validity assessments in surgical simulation more than doubled from 2008 to 2010 (∼30 studies/year) to 2014 to 2016 (∼70 to 90 studies/year). Only 6.6% of the studies used the recommended contemporary validity framework (Messick). The majority of studies used outdated frameworks such as face validity. Significant differences were identified across surgical specialties. The evaluated assessment tools were mostly inanimate or virtual reality simulation models. An increasing number of studies have gathered validity evidence for simulation-based assessments in surgical specialties, but the use of outdated frameworks remains common. To address the current practice, this paper presents guidelines on how to use the contemporary validity framework when designing validity studies.

  12. Gay-Berne and electrostatic multipole based coarse-grain potential in implicit solvent

    PubMed Central

    Wu, Johnny; Zhen, Xia; Shen, Hujun; Li, Guohui; Ren, Pengyu

    2011-01-01

    A general, transferable coarse-grain (CG) framework based on the Gay-Berne potential and electrostatic point multipole expansion is presented for polypeptide simulations. The solvent effect is described by the Generalized Kirkwood theory. The CG model is calibrated using the results of all-atom simulations of model compounds in solution. Instead of matching the overall effective forces produced by atomic models, the fundamental intermolecular forces such as electrostatic, repulsion-dispersion, and solvation are represented explicitly at a CG level. We demonstrate that the CG alanine dipeptide model is able to reproduce quantitatively the conformational energy of all-atom force fields in both gas and solution phases, including the electrostatic and solvation components. Replica exchange molecular dynamics and microsecond dynamic simulations of polyalanine of 5 and 12 residues reveal that the CG polyalanines fold into “alpha helix” and “beta sheet” structures. The 5-residue polyalanine displays a substantial increase in the “beta strand” fraction relative to the 12-residue polyalanine. The detailed conformational distribution is compared with those reported from recent all-atom simulations and experiments. The results suggest that the new coarse-graining approach presented in this study has the potential to offer both accuracy and efficiency for biomolecular modeling. PMID:22029338

  13. BioASF: a framework for automatically generating executable pathway models specified in BioPAX.

    PubMed

    Haydarlou, Reza; Jacobsen, Annika; Bonzanni, Nicola; Feenstra, K Anton; Abeln, Sanne; Heringa, Jaap

    2016-06-15

    Biological pathways play a key role in most cellular functions. To better understand these functions, diverse computational and cell biology researchers use biological pathway data for various analysis and modeling purposes. For specifying these biological pathways, a community of researchers has defined BioPAX and provided various tools for creating, validating and visualizing BioPAX models. However, a generic software framework for simulating BioPAX models is missing. Here, we attempt to fill this gap by introducing a generic simulation framework for BioPAX. The framework explicitly separates the execution model from the model structure as provided by BioPAX, with the advantage that the modelling process becomes more reproducible and intrinsically more modular; this ensures natural biological constraints are satisfied upon execution. The framework is based on the principles of discrete event systems and multi-agent systems, and is capable of automatically generating a hierarchical multi-agent system for a given BioPAX model. To demonstrate the applicability of the framework, we simulated two types of biological network models: a gene regulatory network modeling the haematopoietic stem cell regulators and a signal transduction network modeling the Wnt/β-catenin signaling pathway. We observed that the results of the simulations performed using our framework were entirely consistent with the simulation results reported by the researchers who developed the original models in a proprietary language. The framework, implemented in Java, is open source and its source code, documentation and tutorial are available at http://www.ibi.vu.nl/programs/BioASF CONTACT: j.heringa@vu.nl. © The Author 2016. Published by Oxford University Press.
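
    BioASF itself generates a hierarchical multi-agent system from a BioPAX model in Java; the short Python sketch below (all names hypothetical) only illustrates the underlying discrete-event, multi-agent principle of pathway entities reacting to scheduled events.

      import heapq

      class PathwayAgent:
          """Toy pathway entity: fires after a delay, then activates downstream agents."""
          def __init__(self, name, delay, downstream=()):
              self.name, self.delay, self.downstream = name, delay, list(downstream)

          def handle(self, t, queue):
              print(f"t={t:.1f}: {self.name} executes")
              for agent in self.downstream:  # schedule downstream activations
                  heapq.heappush(queue, (t + agent.delay, id(agent), agent))

      def run(sources, horizon=10.0):
          """Discrete-event loop: always process the earliest scheduled event next."""
          queue = [(a.delay, id(a), a) for a in sources]
          heapq.heapify(queue)
          while queue:
              t, _, agent = heapq.heappop(queue)
              if t > horizon:
                  break
              agent.handle(t, queue)

      target = PathwayAgent("transcribe_target", delay=2.0)
      run([PathwayAgent("activate_TF", delay=1.0, downstream=[target])])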

  14. Advanced EMT and Phasor-Domain Hybrid Simulation with Simulation Mode Switching Capability for Transmission and Distribution Systems

    DOE PAGES

    Huang, Qiuhua; Vittal, Vijay

    2018-05-09

    Conventional electromagnetic transient (EMT) and phasor-domain hybrid simulation approaches presently exist for transmission system level studies. Their simulation efficiency is generally constrained by the EMT simulation. With an increasing number of distributed energy resources and non-conventional loads being installed in distribution systems, it is imperative to extend the hybrid simulation application to include distribution systems and integrated transmission and distribution systems. Meanwhile, it is equally important to improve the simulation efficiency as the modeling scope and complexity of the detailed system in the EMT simulation increases. To meet both requirements, this paper introduces an advanced EMT and phasor-domain hybrid simulation approach. This approach has two main features: 1) a comprehensive phasor-domain modeling framework which supports positive-sequence, three-sequence, three-phase and mixed three-sequence/three-phase representations and 2) a robust and flexible simulation mode switching scheme. The developed scheme enables simulation switching from hybrid simulation mode back to pure phasor-domain dynamic simulation mode to achieve significantly improved simulation efficiency. The proposed method has been tested on integrated transmission and distribution systems. In conclusion, the results show that with the developed simulation switching feature, the total computational time is significantly reduced compared to running the hybrid simulation for the whole simulation period, while maintaining good simulation accuracy.
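
    The control flow of such mode switching is easy to sketch. The Python fragment below is a hypothetical illustration only: emt_step and phasor_step stand in for the two solvers, and the naive envelope check is not the paper's switching criterion.

      import numpy as np

      def run_with_mode_switch(steps, dt, emt_step, phasor_step, window=200, tol=1e-3):
          """Run in hybrid mode (detailed EMT subsystem + phasor network), then
          fall back to phasor-only simulation once the EMT waveform settles."""
          mode, history = "hybrid", []
          for _ in range(steps):
              if mode == "hybrid":
                  x = emt_step(dt)                # small-step EMT solve
                  phasor_step(dt, boundary=x)     # phasor solve with EMT boundary
                  history.append(x)
                  if len(history) > window and np.ptp(history[-window:]) < tol:
                      mode = "phasor"             # transient decayed: switch back
              else:
                  phasor_step(dt, boundary=None)  # pure phasor-domain dynamics
          return mode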

  16. Study on general design of dual-DMD based infrared two-band scene simulation system

    NASA Astrophysics Data System (ADS)

    Pan, Yue; Qiao, Yang; Xu, Xi-ping

    2017-02-01

    Mid-wave infrared (MWIR) and long-wave infrared (LWIR) two-band scene simulation systems are testing equipment for infrared two-band imaging seekers. Such a system must not only cover the required wavebands but also satisfy the essential requirement that its infrared radiation characteristics correspond to the real scene. Previous single digital micromirror device (DMD) based infrared scene simulation systems did not take the large radiance difference between targets and background into account and could not modulate the two-band light beams separately. Consequently, a single-DMD based infrared scene simulation system cannot accurately reproduce the thermal scene model built by the upper computer, and it is of limited practical use. To solve this problem, we design a dual-DMD based, dual-channel, co-aperture, compact infrared two-band scene simulation system. The operating principle of the system is introduced in detail, and the energy transfer process of the hardware-in-the-loop simulation experiment is analyzed. We also derive the equation for the signal-to-noise ratio of the infrared detector in the seeker, which directs the overall system design. The general design scheme of the system is given, including the creation of the infrared scene model, overall control, optical-mechanical structure design, and image registration. By analyzing and comparing past designs, we discuss the arrangement of the optical engine framework in the system. Following the working principle and overall design, we summarize the key techniques of the system.

  17. Combining Monte Carlo methods with coherent wave optics for the simulation of phase-sensitive X-ray imaging

    PubMed Central

    Peter, Silvia; Modregger, Peter; Fix, Michael K.; Volken, Werner; Frei, Daniel; Manser, Peter; Stampanoni, Marco

    2014-01-01

    Phase-sensitive X-ray imaging shows a high sensitivity towards electron density variations, making it well suited for imaging of soft tissue matter. However, there are still open questions about the details of the image formation process. Here, a framework for numerical simulations of phase-sensitive X-ray imaging is presented, which takes both particle- and wave-like properties of X-rays into consideration. A split approach is presented where we combine a Monte Carlo method (MC) based sample part with a wave optics simulation based propagation part, leading to a framework that takes both particle- and wave-like properties into account. The framework can be adapted to different phase-sensitive imaging methods and has been validated through comparisons with experiments for grating interferometry and propagation-based imaging. The validation of the framework shows that the combination of wave optics and MC has been successfully implemented and yields good agreement between measurements and simulations. This demonstrates that the physical processes relevant for developing a deeper understanding of scattering in the context of phase-sensitive imaging are modelled in a sufficiently accurate manner. The framework can be used for the simulation of phase-sensitive X-ray imaging, for instance for the simulation of grating interferometry or propagation-based imaging. PMID:24763652
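
    The wave-optics half of such a split scheme is essentially a free-space propagation step. A minimal sketch, assuming a square-sampled complex wavefield in the paraxial (Fresnel) regime (the Monte Carlo sample stage that would produce the field is omitted):

      import numpy as np

      def fresnel_propagate(field, wavelength, dx, z):
          """Propagate a complex 2-D wavefield a distance z using the
          angular-spectrum form of the Fresnel transfer function."""
          n = field.shape[0]
          fx = np.fft.fftfreq(n, d=dx)                  # spatial frequencies
          FX, FY = np.meshgrid(fx, fx)
          H = np.exp(1j * 2.0 * np.pi * z / wavelength) * \
              np.exp(-1j * np.pi * wavelength * z * (FX ** 2 + FY ** 2))
          return np.fft.ifft2(np.fft.fft2(field) * H)

    The detected intensity is then the squared modulus of the propagated field; grating interferometry and propagation-based imaging differ mainly in what sits between sample and detector.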

  18. PyPWA: A partial-wave/amplitude analysis software framework

    NASA Astrophysics Data System (ADS)

    Salgado, Carlos

    2016-05-01

    The PyPWA project aims to develop a software framework for partial-wave and amplitude analysis of data, providing the user with software tools to identify resonances from multi-particle final states in photoproduction. Most of the code is written in Python. The software is divided into two main branches: one general shell where amplitude parameters (or any parametric model) are estimated from the data. This branch also includes software to produce simulated data sets using the fitted amplitudes. A second branch contains a specific realization of the isobar model (with room to include Deck-type and other isobar model extensions) to perform PWA with an interface into the computer resources at Jefferson Lab. We are currently implementing parallelism and vectorization using Intel's Xeon Phi family of coprocessors.

  19. Multistate modelling extended by behavioural rules: An application to migration.

    PubMed

    Klabunde, Anna; Zinn, Sabine; Willekens, Frans; Leuchter, Matthias

    2017-10-01

    We propose to extend demographic multistate models by adding a behavioural element: behavioural rules explain intentions and thus transitions. Our framework is inspired by the Theory of Planned Behaviour. We exemplify our approach with a model of migration from Senegal to France. Model parameters are determined using empirical data where available; parameters for which no empirical correspondence exists are determined by calibration. Age- and period-specific migration rates are used for model validation. Our approach adds to the toolkit of demographic projection by allowing for shocks and social influence, which alter behaviour in non-linear ways, while remaining within the general framework of multistate modelling. Our simulations indicate that higher income growth in Senegal leads to higher emigration rates in the medium term, while a decrease in fertility yields lower emigration rates.

  20. Isca, v1.0: a framework for the global modelling of the atmospheres of Earth and other planets at varying levels of complexity

    NASA Astrophysics Data System (ADS)

    Vallis, Geoffrey K.; Colyer, Greg; Geen, Ruth; Gerber, Edwin; Jucker, Martin; Maher, Penelope; Paterson, Alexander; Pietschnig, Marianne; Penn, James; Thomson, Stephen I.

    2018-03-01

    Isca is a framework for the idealized modelling of the global circulation of planetary atmospheres at varying levels of complexity and realism. The framework is an outgrowth of models from the Geophysical Fluid Dynamics Laboratory in Princeton, USA, designed for Earth's atmosphere, but it may readily be extended into other planetary regimes. Various forcing and radiation options are available, from dry, time invariant, Newtonian thermal relaxation to moist dynamics with radiative transfer. Options are available in the dry thermal relaxation scheme to account for the effects of obliquity and eccentricity (and so seasonality), different atmospheric optical depths and a surface mixed layer. An idealized grey radiation scheme, a two-band scheme, and a multiband scheme are also available, all with simple moist effects and astronomically based solar forcing. At the complex end of the spectrum the framework provides a direct connection to comprehensive atmospheric general circulation models. For Earth modelling, options include an aquaplanet and configurable continental outlines and topography. Continents may be defined by changing albedo, heat capacity, and evaporative parameters and/or by using a simple bucket hydrology model. Oceanic Q fluxes may be added to reproduce specified sea surface temperatures, with arbitrary continental distributions. Planetary atmospheres may be configured by changing planetary size and mass, solar forcing, atmospheric mass, radiation, and other parameters. Examples are given of various Earth configurations as well as a giant planet simulation, a slowly rotating terrestrial planet simulation, and tidally locked and other orbitally resonant exoplanet simulations. The underlying model is written in Fortran and may largely be configured with Python scripts. Python scripts are also used to run the model on different architectures, to archive the output, and for diagnostics, graphics, and post-processing. All of these features are publicly available in a Git-based repository.

  1. Scaling and stochastic cascade properties of NEMO oceanic simulations and their potential value for GCM evaluation and downscaling

    NASA Astrophysics Data System (ADS)

    Verrier, Sébastien; Crépon, Michel; Thiria, Sylvie

    2014-09-01

    Spectral scaling properties have already been evidenced in oceanic numerical simulations and have been subject to several interpretations. They can be used to evaluate classical turbulence theories that predict scaling with specific exponents, and to evaluate the quality of GCM outputs from a statistical and multiscale point of view. However, a more complete framework based on multifractal cascades is able to generalize the classical but restrictive second-order spectral framework to other moment orders, providing an accurate description of the probability distributions of the fields at multiple scales. The predictions of this formalism still needed systematic verification in oceanic GCMs, while they have recently been confirmed for their atmospheric counterparts by several papers. The present paper is devoted to a systematic analysis of several oceanic fields produced by the NEMO oceanic GCM. Attention is focused on regional, idealized configurations that permit evaluation of the NEMO engine core from a scaling point of view, regardless of the limitations introduced by land masks. Based on classical multifractal analysis tools, multifractal properties were evidenced for several oceanic state variables (sea surface temperature and salinity, velocity components, etc.). While first-order structure functions estimated a different nonconservativity parameter H in two scaling ranges, the multi-order statistics of turbulent fluxes were scaling over almost the whole available range. This multifractal scaling was then parameterized with the help of the universal multifractal framework, providing parameters that are coherent with the existing empirical literature. Finally, we argue that knowledge of these properties may be useful for oceanographers. The framework seems very well suited for the statistical evaluation of OGCM outputs. Moreover, it also provides practical solutions for simulating subpixel variability stochastically for GCM downscaling purposes. As an independent perspective, the existence of multifractal properties in oceanic flows is also interesting for investigating scale dependencies in remote sensing inversion algorithms.
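
    The basic diagnostic behind such an analysis is how moment statistics scale with separation. A minimal sketch for a one-dimensional transect (the universal-multifractal parameter fits in the paper go beyond this):

      import numpy as np

      def structure_functions(field, qs, lags):
          """S_q(l) = <|f(x + l) - f(x)|^q>; scaling S_q(l) ~ l^zeta(q) with a
          nonlinear zeta(q) is the signature of multifractality."""
          return {q: [np.mean(np.abs(field[l:] - field[:-l]) ** q) for l in lags]
                  for q in qs}

      def scaling_exponents(sq, lags):
          """zeta(q) as the log-log slope of S_q against the lag."""
          return {q: np.polyfit(np.log(lags), np.log(vals), 1)[0]
                  for q, vals in sq.items()}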

  2. Characterizing ponds in watershed simulations and evaluating their influence on streamflowin a Mississippi Watershed

    USDA-ARS?s Scientific Manuscript database

    Small water bodies are common landscape features, but often are not simulated within a watershed modeling framework. The wetland modeling tool, AgWET, uses a GIS framework to characterize the features of ponds and wetlands so that they can be incorporated into watershed simulations using the Annuali...

  3. Enabling parallel simulation of large-scale HPC network systems

    DOE PAGES

    Mubarak, Misbah; Carothers, Christopher D.; Ross, Robert B.; ...

    2016-04-07

    Here, with the increasing complexity of today’s high-performance computing (HPC) architectures, simulation has become an indispensable tool for exploring the design space of HPC systems—in particular, networks. In order to make effective design decisions, simulations of these systems must possess the following properties: (1) have high accuracy and fidelity, (2) produce results in a timely manner, and (3) be able to analyze a broad range of network workloads. Most state-of-the-art HPC network simulation frameworks, however, are constrained in one or more of these areas. In this work, we present a simulation framework for modeling two important classes of networks used in today’s IBM and Cray supercomputers: torus and dragonfly networks. We use the Co-Design of Multi-layer Exascale Storage Architecture (CODES) simulation framework to simulate these network topologies at a flit-level detail using the Rensselaer Optimistic Simulation System (ROSS) for parallel discrete-event simulation. Our simulation framework meets all the requirements of a practical network simulation and can assist network designers in design space exploration. First, it uses validated and detailed flit-level network models to provide an accurate and high-fidelity network simulation. Second, instead of relying on serial time-stepped or traditional conservative discrete-event simulations that limit simulation scalability and efficiency, we use the optimistic event-scheduling capability of ROSS to achieve efficient and scalable HPC network simulations on today’s high-performance cluster systems. Third, our models give network designers a choice in simulating a broad range of network workloads, including HPC application workloads using detailed network traces, an ability that is rarely offered in parallel with high-fidelity network simulations.

  5. Architectural Framework for Addressing Legacy Waste from the Cold War - 13611

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Love, Gregory A.; Glazner, Christopher G.; Steckley, Sam

    We present an architectural framework for the use of a hybrid simulation model of enterprise-wide operations used to develop system-level insight into the U.S. Department of Energy's (DOE) environmental cleanup of legacy nuclear waste at the Savannah River Site. We use this framework for quickly exploring policy and architectural options, analyzing plans, addressing management challenges and developing mitigation strategies for the DOE Office of Environmental Management (EM). The socio-technical complexity of EM's mission compels the use of a qualitative approach to complement a more quantitative discrete event modeling effort. We use this model-based analysis to pinpoint pressure and leverage points and develop a shared conceptual understanding of the problem space and a platform for communication among stakeholders across the enterprise in a timely manner. This approach affords the opportunity to discuss problems using a unified conceptual perspective and is also general enough that it applies to a broad range of capital investment/production operations problems. (authors)

  6. A general-purpose framework to simulate musculoskeletal system of human body: using a motion tracking approach.

    PubMed

    Ehsani, Hossein; Rostami, Mostafa; Gudarzi, Mohammad

    2016-02-01

    Computation of the muscle force patterns that produce specified movements of muscle-actuated dynamic models is an important and challenging problem. The problem is underdetermined, so a proper optimization is required to calculate muscle forces. The purpose of this paper is to develop a general model for calculating all muscle activation and force patterns in an arbitrary human body movement. To this end, the forward-dynamics equations of a multibody system representing the skeletal system of the human body model are derived using the Lagrange-Euler formulation. Next, muscle contraction dynamics is added to this model, yielding the forward dynamics of an arbitrary musculoskeletal system. For optimization purposes, the obtained model is used in a computed muscle control algorithm, and a closed-loop system for tracking desired motions is derived. Finally, a popular sport exercise, the biceps curl, is simulated using this algorithm, and the validity of the obtained results is evaluated via EMG signals.
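
    One step of a computed-muscle-control-style loop can be sketched as follows; M, h and R are hypothetical callables for the mass matrix, Coriolis/gravity terms and muscle moment-arm matrix, and the least-squares-plus-clipping step is only a crude stand-in for the bounded optimization (with activation and contraction dynamics) that a real implementation solves.

      import numpy as np

      def cmc_step(q, qd, q_ref, qd_ref, qdd_ref, M, h, R, kp=100.0, kv=20.0):
          """PD-corrected desired acceleration -> required joint torques
          (inverse dynamics) -> distribution over muscles via tau = R(q) @ f."""
          qdd_des = qdd_ref + kv * (qd_ref - qd) + kp * (q_ref - q)
          tau = M(q) @ qdd_des + h(q, qd)          # required generalized forces
          f, *_ = np.linalg.lstsq(R(q), tau, rcond=None)
          return np.clip(f, 0.0, None), tau        # muscles can only pull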

  7. Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration

    NASA Technical Reports Server (NTRS)

    Lin, Risheng; Afjeh, Abdollah A.

    2003-01-01

    This paper discusses the detailed design of an XML databinding framework for aircraft engine simulation. The framework provides an object interface to access and use engine data, while at the same time preserving the meaning of the original data. The language-independent representation of engine component data enables users to exchange XML data over HTTP across disparate networks. The application of this framework is demonstrated via a web-based turbofan propulsion system simulation using the World Wide Web (WWW). A Java Servlet based web component architecture is used for rendering XML engine data into HTML format and dealing with input events from the user, which allows users to interact with simulation data from a web browser. The simulation data can also be saved to a local disk for archiving or to restart the simulation at a later time.

  8. iTOUGH2: A multiphysics simulation-optimization framework for analyzing subsurface systems

    NASA Astrophysics Data System (ADS)

    Finsterle, S.; Commer, M.; Edmiston, J. K.; Jung, Y.; Kowalsky, M. B.; Pau, G. S. H.; Wainwright, H. M.; Zhang, Y.

    2017-11-01

    iTOUGH2 is a simulation-optimization framework for the TOUGH suite of nonisothermal multiphase flow models and related simulators of geophysical, geochemical, and geomechanical processes. After appropriate parameterization of subsurface structures and their properties, iTOUGH2 runs simulations for multiple parameter sets and analyzes the resulting output for parameter estimation through automatic model calibration, local and global sensitivity analyses, data-worth analyses, and uncertainty propagation analyses. Development of iTOUGH2 is driven by scientific challenges and user needs, with new capabilities continually added to both the forward simulator and the optimization framework. This review article provides a summary description of methods and features implemented in iTOUGH2, and discusses the usefulness and limitations of an integrated simulation-optimization workflow in support of the characterization and analysis of complex multiphysics subsurface systems.

  9. Variance-Based Cluster Selection Criteria in a K-Means Framework for One-Mode Dissimilarity Data.

    PubMed

    Vera, J Fernando; Macías, Rodrigo

    2017-06-01

    One of the main problems in cluster analysis is that of determining the number of groups in the data. In general, the approach taken depends on the cluster method used. For K-means, some of the most widely employed criteria are formulated in terms of the decomposition of the total point scatter, regarding a two-mode data set of N points in p dimensions, which are optimally arranged into K classes. This paper addresses the formulation of criteria to determine the number of clusters, in the general situation in which the available information for clustering is a one-mode N × N dissimilarity matrix describing the objects. In this framework, p and the coordinates of points are usually unknown, and the application of criteria originally formulated for two-mode data sets is dependent on their possible reformulation in the one-mode situation. The decomposition of the variability of the clustered objects is proposed in terms of the corresponding block-shaped partition of the dissimilarity matrix. Within-block and between-block dispersion values for the partitioned dissimilarity matrix are derived, and variance-based criteria are subsequently formulated in order to determine the number of groups in the data. A Monte Carlo experiment was carried out to study the performance of the proposed criteria. For simulated clustered points in p dimensions, greater efficiency in recovering the number of clusters is obtained when the criteria are calculated from the related Euclidean distances instead of the known two-mode data set, in general, for unequal-sized clusters and for low dimensionality situations. For simulated dissimilarity data sets, the proposed criteria always outperform the results obtained when these criteria are calculated from their original formulation, using dissimilarities instead of distances.
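
    The reformulation rests on the identity that, for squared Euclidean dissimilarities, the within-cluster sum of squares of the (unknown) coordinates equals the sum of within-block dissimilarities divided by twice the cluster size. A minimal sketch of that building block (function name hypothetical):

      import numpy as np

      def within_dispersion(D, labels):
          """Within-cluster dispersion computed directly from a one-mode
          dissimilarity matrix D of squared Euclidean distances."""
          W = 0.0
          for k in np.unique(labels):
              idx = np.flatnonzero(labels == k)
              W += D[np.ix_(idx, idx)].sum() / (2.0 * len(idx))
          return W

    Variance-based criteria for choosing the number of clusters can then be evaluated over a range of K using only D.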

  10. Whole-body PET parametric imaging employing direct 4D nested reconstruction and a generalized non-linear Patlak model

    NASA Astrophysics Data System (ADS)

    Karakatsanis, Nicolas A.; Rahmim, Arman

    2014-03-01

    Graphical analysis is employed in the research setting to provide quantitative estimation of PET tracer kinetics from dynamic images at a single bed. Recently, we proposed a multi-bed dynamic acquisition framework enabling clinically feasible whole-body parametric PET imaging by employing post-reconstruction parameter estimation. In addition, by incorporating linear Patlak modeling within the system matrix, we enabled direct 4D reconstruction in order to effectively circumvent noise amplification in dynamic whole-body imaging. However, direct 4D Patlak reconstruction exhibits a relatively slow convergence due to the presence of non-sparse spatial correlations in temporal kinetic analysis. In addition, the standard Patlak model does not account for reversible uptake, thus underestimating the influx rate Ki. We have developed a novel whole-body PET parametric reconstruction framework in the STIR platform, a widely employed open-source reconstruction toolkit, a) enabling accelerated convergence of direct 4D multi-bed reconstruction, by employing a nested algorithm to decouple the temporal parameter estimation from the spatial image update process, and b) enhancing the quantitative performance particularly in regions with reversible uptake, by pursuing a non-linear generalized Patlak 4D nested reconstruction algorithm. A set of published kinetic parameters and the XCAT phantom were employed for the simulation of dynamic multi-bed acquisitions. Quantitative analysis on the Ki images demonstrated considerable acceleration in the convergence of the nested 4D whole-body Patlak algorithm. In addition, our simulated and patient whole-body data in the postreconstruction domain indicated the quantitative benefits of our extended generalized Patlak 4D nested reconstruction for tumor diagnosis and treatment response monitoring.
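
    For reference, the standard (irreversible) Patlak model is linear in its two parameters, which can be estimated by regression on the late, quasi-linear part of the plot; the generalized model in the abstract adds a net efflux term, making the estimation nonlinear. A minimal sketch of the linear case, with hypothetical names and an illustrative fit window:

      import numpy as np

      def patlak_fit(t, c_tissue, c_plasma, i_start):
          """Fit C_T(t)/C_p(t) = Ki * (int_0^t C_p dτ)/C_p(t) + V on samples
          with index >= i_start; the slope Ki is the influx rate constant."""
          seg = 0.5 * (c_plasma[1:] + c_plasma[:-1]) * np.diff(t)
          integral = np.concatenate(([0.0], np.cumsum(seg)))  # trapezoid rule
          x = integral / c_plasma                             # "stretched time"
          y = c_tissue / c_plasma
          Ki, V = np.polyfit(x[i_start:], y[i_start:], 1)
          return Ki, V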

  11. Imitative and Direct Learning as Interacting Factors in Life History Evolution.

    PubMed

    Bullinaria, John A

    2017-01-01

    The idea that lifetime learning can have a significant effect on life history evolution has recently been explored using a series of artificial life simulations. These involved populations of competing individuals evolving by natural selection to learn to perform well on simplified abstract tasks, with the learning consisting of identifying regularities in their environment. In reality, there is more to learning than that type of direct individual experience, because it often includes a substantial degree of social learning that involves various forms of imitation of what other individuals have learned before them. This article rectifies that omission by incorporating memes and imitative learning into revised versions of the previous approach. To do this reliably requires formulating and testing a general framework for meme-based simulations that will enable more complete investigations of learning as a factor in any life history evolution scenarios. It does that by simulating imitative information transfer in terms of memes being passed between individuals, and developing a process for merging that information with the (possibly inconsistent) information acquired by direct experience, leading to a consistent overall body of learning. The proposed framework is tested on a range of learning variations and a representative set of life history factors to confirm the robustness of the approach. The simulations presented illustrate the types of interactions and tradeoffs that can emerge, and indicate the kinds of species-specific models that could be developed with this approach in the future.

  12. A framework of knowledge creation processes in participatory simulation of hospital work systems.

    PubMed

    Andersen, Simone Nyholm; Broberg, Ole

    2017-04-01

    Participatory simulation (PS) is a method to involve workers in simulating and designing their own future work system. Existing PS studies have focused on analysing the outcome, and minimal attention has been devoted to the process of creating this outcome. In order to study this process, we suggest applying a knowledge creation perspective. The aim of this study was to develop a framework describing the process of how ergonomics knowledge is created in PS. Video recordings from three projects applying PS of hospital work systems constituted the foundation of process mining analysis. The analysis resulted in a framework revealing the sources of ergonomics knowledge creation as sequential relationships between the activities of simulation participants sharing work experiences; experimenting with scenarios; and reflecting on ergonomics consequences. We argue that this framework reveals the hidden steps of PS that are essential when planning and facilitating PS that aims at designing work systems. Practitioner Summary: When facilitating participatory simulation (PS) in work system design, achieving an understanding of the PS process is essential. By applying a knowledge creation perspective and process mining, we investigated the knowledge-creating activities constituting the PS process. The analysis resulted in a framework of the knowledge-creating process in PS.

  13. Quantum approach to classical statistical mechanics.

    PubMed

    Somma, R D; Batista, C D; Ortiz, G

    2007-07-20

    We present a new approach to study the thermodynamic properties of d-dimensional classical systems by reducing the problem to the computation of ground state properties of a d-dimensional quantum model. This classical-to-quantum mapping allows us to extend the scope of standard optimization methods by unifying them under a general framework. The quantum annealing method is naturally extended to simulate classical systems at finite temperatures. We derive the rates to assure convergence to the optimal thermodynamic state using the adiabatic theorem of quantum mechanics. For simulated and quantum annealing, we obtain the asymptotic rates T(t) ≈ pN/(k_B log t) and γ(t) ≈ (Nt)^(−c/N) for the temperature and magnetic field, respectively. Other annealing strategies are also discussed.

  14. Real gas CFD simulations of hydrogen/oxygen supercritical combustion

    NASA Astrophysics Data System (ADS)

    Pohl, S.; Jarczyk, M.; Pfitzner, M.; Rogg, B.

    2013-03-01

    A comprehensive numerical framework has been established to simulate reacting flows under conditions typically encountered in rocket combustion chambers. The model, implemented into the commercial CFD code ANSYS CFX, includes appropriate real gas relations based on the volume-corrected Peng-Robinson (PR) equation of state (EOS) for the flow field and a real gas extension of the laminar flamelet combustion model. The results indicate that the real gas relations have a considerably larger impact on the flow field than on the detailed flame structure. Generally, a realistic flame shape could be achieved with the real gas approach, compared to experimental data from the Mascotte test rig V03 operated at ONERA, when differential diffusion processes were considered only within the flame zone.
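
    For reference, the standard Peng-Robinson EOS reads (the volume-corrected variant used here additionally applies a translation shift to the molar volume v):

      \[
      p = \frac{RT}{v - b} - \frac{a\,\alpha(T)}{v^{2} + 2bv - b^{2}}, \qquad
      a = 0.45724\,\frac{R^{2}T_{c}^{2}}{p_{c}}, \qquad
      b = 0.07780\,\frac{RT_{c}}{p_{c}},
      \]
      \[
      \alpha(T) = \Bigl[1 + \kappa\bigl(1 - \sqrt{T/T_{c}}\bigr)\Bigr]^{2}, \qquad
      \kappa = 0.37464 + 1.54226\,\omega - 0.26992\,\omega^{2},
      \]

    where T_c and p_c are the critical temperature and pressure and ω is the acentric factor.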

  15. Perspective: Theory and simulation of hybrid halide perovskites

    PubMed Central

    Jung, Young-Kwang

    2017-01-01

    Organic-inorganic halide perovskites present a number of challenges for first-principles atomistic materials modeling. Such “plastic crystals” feature dynamic processes across multiple length and time scales. These include the following: (i) transport of slow ions and fast electrons; (ii) highly anharmonic lattice dynamics with short phonon lifetimes; (iii) local symmetry breaking of the average crystallographic space group; (iv) strong relativistic (spin-orbit coupling) effects on the electronic band structure; and (v) thermodynamic metastability and rapid chemical breakdown. These issues, which affect the operation of solar cells, are outlined in this perspective. We also discuss general guidelines for performing quantitative and predictive simulations of these materials, which are relevant to metal-organic frameworks and other hybrid semiconducting, dielectric and ferroelectric compounds. PMID:29166078

  16. Modeling mass transfer and reaction of dilute solutes in a ternary phase system by the lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Fu, Yu-Hang; Bai, Lin; Luo, Kai-Hong; Jin, Yong; Cheng, Yi

    2017-04-01

    In this work, we propose a general approach for modeling mass transfer and reaction of dilute solute(s) in incompressible three-phase flows by introducing a collision operator in the lattice Boltzmann (LB) method. An LB equation was used to simulate the solute dynamics among three different fluids, in which the newly expanded collision operator was used to depict the interface behavior of dilute solute(s). Multiscale analysis showed that the presented model recovers the macroscopic transport equations derived from the Maxwell-Stefan equation for dilute solutes in three-phase systems. Validated against the analytical equation of state and dynamic behavior of the solute, the model constitutes a generalized framework for simulating solute distributions in three-phase flows, including a compound soluble in one phase, a compound adsorbed on a single interface, a compound in two phases, and a solute soluble in all three phases. Moreover, numerical simulations of benchmark cases, such as phase decomposition, multilayered planar interfaces, and liquid lenses, were performed to test the stability and efficiency of the model. Finally, the multiphase mass transfer and reaction in Janus droplet transport in a straight microchannel were well reproduced.
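
    The paper's three-phase interface operator builds on the standard BGK collision-and-streaming step for a passive solute. A minimal single-phase sketch of that baseline (D2Q9 lattice, linear equilibrium, diffusivity (tau - 0.5)/3 in lattice units; the velocity field u is assumed small), not the paper's actual operator:

      import numpy as np

      # D2Q9 lattice velocities (cx, cy) and weights
      C = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
      W = np.array([4/9] + [1/9]*4 + [1/36]*4)

      def bgk_solute_step(g, u, tau):
          """One BGK collision + streaming step for a passive solute
          distribution g (shape 9 x ny x nx) advected by velocity u (ny x nx x 2)."""
          conc = g.sum(axis=0)                                # local concentration
          cu = np.einsum('qd,yxd->qyx', C, u)
          geq = W[:, None, None] * conc * (1.0 + 3.0 * cu)    # linear equilibrium
          g += -(g - geq) / tau                               # BGK relaxation
          for q in range(9):                                  # streaming
              g[q] = np.roll(g[q], shift=(C[q, 1], C[q, 0]), axis=(0, 1))
          return g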

  17. Modeling hydrodynamic self-propulsion with Stokesian Dynamics. Or teaching Stokesian Dynamics to swim

    NASA Astrophysics Data System (ADS)

    Swan, James W.; Brady, John F.; Moore, Rachel S.; ChE 174

    2011-07-01

    We develop a general framework for modeling the hydrodynamic self-propulsion (i.e., swimming) of bodies (e.g., microorganisms) at low Reynolds number via Stokesian Dynamics simulations. The swimming body is composed of many spherical particles constrained to form an assembly that deforms via relative motion of its constituent particles. The resistance tensor describing the hydrodynamic interactions among the individual particles maps directly onto that for the assembly. Specifying a particular swimming gait and imposing the condition that the swimming body is force- and torque-free determine the propulsive speed. The body's translational and rotational velocities computed via this methodology are identical in form to that from the classical theory for the swimming of arbitrary bodies at low Reynolds number. We illustrate the generality of the method through simulations of a wide array of swimming bodies: pushers and pullers, spinners, the Taylor/Purcell swimming toroid, Taylor's helical swimmer, Purcell's three-link swimmer, and an amoeba-like body undergoing large-scale deformation. An open source code is a part of the supplementary material and can be used to simulate the swimming of a body with arbitrary geometry and swimming gait.

  18. Using the ECD Framework to Support Evidentiary Reasoning in the Context of a Simulation Study for Detecting Learner Differences in Epistemic Games

    ERIC Educational Resources Information Center

    Sweet, Shauna J.; Rupp, Andre A.

    2012-01-01

    The "evidence-centered design" (ECD) framework is a powerful tool that supports careful and critical thinking about the identification and accumulation of evidence in assessment contexts. In this paper, we demonstrate how the ECD framework provides critical support for designing simulation studies to investigate statistical methods…

  19. Aerosol single-scattering albedo over the global oceans: Comparing PARASOL retrievals with AERONET, OMI, and AeroCom models estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lacagnina, Carlo; Hasekamp, Otto P.; Bian, Huisheng

    2015-09-27

    The aerosol Single Scattering Albedo (SSA) over the global oceans is evaluated based on polarimetric measurements by the PARASOL satellite. The retrieved values for SSA and Aerosol Optical Depth (AOD) agree well with the ground-based measurements of the AErosol RObotic NETwork (AERONET). The global coverage provided by the PARASOL observations represents a unique opportunity to evaluate SSA and AOD simulated by atmospheric transport model runs, as performed in the AeroCom framework. The SSA estimate provided by the AeroCom models is generally higher than the SSA retrieved from both PARASOL and AERONET. On the other hand, the mean simulated AOD is about right or slightly underestimated compared with observations. An overestimate of the SSA by the models suggests that they simulate an overly strong aerosol radiative cooling at the top of the atmosphere (TOA) and underestimate it at the surface. This implies that aerosols have a potentially stronger impact within the atmosphere than currently simulated.

  20. Equivalence of Brownian dynamics and dynamic Monte Carlo simulations in multicomponent colloidal suspensions.

    PubMed

    Cuetos, Alejandro; Patti, Alessandro

    2015-08-01

    We propose a simple but powerful theoretical framework to quantitatively compare Brownian dynamics (BD) and dynamic Monte Carlo (DMC) simulations of multicomponent colloidal suspensions. By extending our previous study focusing on monodisperse systems of rodlike colloids, here we generalize the formalism described there to multicomponent colloidal mixtures and validate it by investigating the dynamics in isotropic and liquid crystalline phases containing spherical and rodlike particles. In order to investigate the dynamics of multicomponent colloidal systems by DMC simulations, it is key to determine the elementary time step of each species and establish a unique timescale. This is crucial to consistently study the dynamics of colloidal particles with different geometry. By analyzing the mean-square displacement, the orientation autocorrelation functions, and the self part of the van Hove correlation functions, we show that DMC simulation is a very convenient and reliable technique to describe the stochastic dynamics of any multicomponent colloidal system. Our theoretical formalism can be easily extended to any colloidal system containing size and/or shape polydisperse particles.

  1. Simulations of Cavitating Cryogenic Inducers

    NASA Technical Reports Server (NTRS)

    Dorney, Dan (Technical Monitor); Hosangadi, Ashvin; Ahuja, Vineet; Ungewitter, Ronald J.

    2004-01-01

    Simulations of cavitating turbopump inducers at their design flow rate are presented. Results over a broad range of Nss numbers, extending from single-phase flow conditions through the critical head breakdown point, are discussed. The flow characteristics and performance of a subscale geometry designed for water testing are compared with the full-scale configuration that employs LOX. In particular, thermal depression effects arising from cavitation in cryogenic fluids are identified and their impact on the suction performance of the inducer is quantified. The simulations have been performed using the CRUNCH CFD® code, which has a generalized multi-element unstructured framework suitable for turbomachinery applications. An advanced multi-phase formulation for cryogenic fluids that models temperature depression and real fluid property variations is employed. The formulation has been extensively validated for both liquid nitrogen and liquid hydrogen by simulating the experiments of Hord on hydrofoils; excellent estimates of the leading edge temperature and pressure depression were obtained, while the comparisons in the cavity closure region were reasonable.

  2. Multi-dimensional high order essentially non-oscillatory finite difference methods in generalized coordinates

    NASA Technical Reports Server (NTRS)

    Shu, Chi-Wang

    1992-01-01

    The nonlinear stability of compact schemes for shock calculations is investigated. In recent years, compact schemes have been used in various numerical simulations, including direct numerical simulation of turbulence. However, to apply them to problems containing shocks, one has to resolve the problems of spurious numerical oscillation and nonlinear instability. A framework to apply nonlinear limiting to a local mean is introduced. The resulting scheme can be proven total variation (1D) or maximum norm (multi-D) stable and produces good numerical results in the test cases. The result is summarized in the preprint entitled 'Nonlinearly Stable Compact Schemes for Shock Calculations', which was submitted to the SIAM Journal on Numerical Analysis. Research was continued on issues related to two- and three-dimensional essentially non-oscillatory (ENO) schemes. The main research topics include: parallel implementation of ENO schemes on Connection Machines; boundary conditions; shock interaction with hydrogen bubbles, a preparation for the full combustion simulation; and direct numerical simulation of compressible sheared turbulence.
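
    The classic building block for such nonlinear limiting is a slope limiter applied to cell means. A minimal sketch of the minmod limiter (the generic TVD device, not the specific compact-scheme construction of this report):

      import numpy as np

      def minmod(a, b):
          """Pick the smaller-magnitude slope when the signs agree, else zero."""
          return np.where(a * b > 0.0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

      def limited_slopes(u):
          """TVD-limited slopes of cell means u_i; zeroing at local extrema is
          what suppresses spurious oscillations near shocks."""
          return minmod(u[1:-1] - u[:-2], u[2:] - u[1:-1])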

  3. Modernizing the ATLAS simulation infrastructure

    NASA Astrophysics Data System (ADS)

    Di Simone, A.; ATLAS Collaboration (Physikalisches Institut, Albert-Ludwigs-Universität Freiburg, 79104 Freiburg i. Br., Germany)

    2017-10-01

    The ATLAS Simulation infrastructure has been used to produce upwards of 50 billion proton-proton collision events for analyses ranging from detailed Standard Model measurements to searches for exotic new phenomena. In the last several years, the infrastructure has been heavily revised to allow intuitive multithreading and significantly improved maintainability. Such a massive update of a legacy code base requires careful choices about what pieces of code to completely rewrite and what to wrap or revise. The initialization of the complex geometry was generalized to allow new tools and geometry description languages, popular in some detector groups. The addition of multithreading requires Geant4-MT and GaudiHive, two frameworks with fundamentally different approaches to multithreading, to work together. It also required enforcing thread safety throughout a large code base, which required the redesign of several aspects of the simulation, including truth, the record of particle interactions with the detector during the simulation. These advances were possible thanks to close interactions with the Geant4 developers.

  4. Cognitive simulators for medical education and training.

    PubMed

    Kahol, Kanav; Vankipuram, Mithra; Smith, Marshall L

    2009-08-01

    Simulators for honing procedural skills (such as surgical skills and central venous catheter placement) have proven to be valuable tools for medical educators and students. While such simulations represent an effective paradigm in surgical education, there is an opportunity to add a layer of cognitive exercises to these basic simulations that can facilitate robust skill learning in residents. This paper describes a controlled methodology, inspired by neuropsychological assessment tasks and embodied cognition, to develop cognitive simulators for laparoscopic surgery. These simulators provide psychomotor skill training and offer the additional challenge of accomplishing cognitive tasks in realistic environments. A generic framework for design, development and evaluation of such simulators is described. The presented framework is generalizable and can be applied to different task domains. It is independent of the types of sensors, simulation environment and feedback mechanisms that the simulators use. A proof of concept of the framework is provided through developing a simulator that includes cognitive variations to a basic psychomotor task. The results of two pilot studies are presented that show the validity of the methodology in providing an effective evaluation and learning environments for surgeons.

  5. BacNet and Analog/Digital Interfaces of the Building Controls Virtual Testbed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nouidui, Thierry Stephane; Wetter, Michael; Li, Zhengwei

    2011-11-01

    This paper gives an overview of recent developments in the Building Controls Virtual Test Bed (BCVTB), a framework for co-simulation and hardware-in-the-loop. First, a general overview of the BCVTB is presented. Second, we describe the BACnet interface, a link which has been implemented to couple BACnet devices to the BCVTB. We present a case study where the interface was used to couple a whole building simulation program to a building control system to assess in real-time the performance of a real building. Third, we present the ADInterfaceMCC, an analog/digital interface that allows a USB-based analog/digital converter to be linked to the BCVTB. In a case study, we show how the link was used to couple the analog/digital converter to a building simulation model for local loop control.

  6. Coupled hydrodynamic and ecological simulation for prognosticating land reclamation impacts in river estuaries

    NASA Astrophysics Data System (ADS)

    Xu, Yan; Cai, Yanpeng; Sun, Tao; Yang, Zhifeng; Hao, Yan

    2018-03-01

    A multiphase finite-element hydrodynamic model and a phytoplankton simulation approach are coupled into a general modeling framework that can help quantify the impacts of land reclamation. Compared with previous studies, it offers the following improvements: (a) representation of physical currents and suitable growth areas for phytoplankton, and (b) an improved simulation method to describe the suitability of phytoplankton habitat in seawater. As a result of the reclamation, water velocity is 16.7% higher than in the original state without human disturbance. The associated filling engineering has shortened sediment settling paths, weakened the vortex flow, and reduced the capacity for material exchange. Additionally, coastal reclamation led to a decrease in the growth suitability index (GSI), reducing the stability of phytoplankton species by approximately 4-12%. The proposed GSI can be applied to the management of coastal reclamation to minimize ecological impacts and will be helpful for identifying suitable phytoplankton growth areas.

  7. Estimation of the lower and upper bounds on the probability of failure using subset simulation and random set theory

    NASA Astrophysics Data System (ADS)

    Alvarez, Diego A.; Uribe, Felipe; Hurtado, Jorge E.

    2018-02-01

    Random set theory is a general framework which comprises uncertainty in the form of probability boxes, possibility distributions, cumulative distribution functions, Dempster-Shafer structures or intervals; in addition, the dependence between the input variables can be expressed using copulas. In this paper, the lower and upper bounds on the probability of failure are calculated by means of random set theory. In order to accelerate the calculation, a well-known and efficient probability-based reliability method known as subset simulation is employed. This method is especially useful for finding small failure probabilities in both low- and high-dimensional spaces, disjoint failure domains and nonlinear limit state functions. The proposed methodology represents a drastic reduction of the computational labor implied by plain Monte Carlo simulation for problems defined with a mixture of representations for the input variables, while delivering similar results. Numerical examples illustrate the efficiency of the proposed approach.
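
    Subset simulation itself is compact to sketch: a small failure probability is expressed as a product of larger conditional probabilities, each estimated by MCMC. The sketch below is a simplified illustration that uses a plain Metropolis step in place of Au and Beck's component-wise "modified Metropolis"; lower and upper probability bounds in the random-set setting would come from running it on the corresponding limit-state functions.

      import numpy as np

      def subset_simulation(g, dim, n=500, p0=0.1, sigma=1.0, levels=20, seed=0):
          """Estimate pF = P[g(X) <= 0] for X ~ N(0, I) by subset simulation."""
          rng = np.random.default_rng(seed)
          x = rng.standard_normal((n, dim))
          gx = np.array([g(v) for v in x])
          pf = 1.0
          for _ in range(levels):
              level = np.quantile(gx, p0)           # next intermediate threshold
              if level <= 0.0:
                  break                             # failure domain reached
              pf *= p0
              xs = list(x[gx <= level]); gs = list(gx[gx <= level])
              while len(xs) < n:                    # regrow population by MCMC
                  i = rng.integers(len(xs))
                  y = xs[i] + sigma * rng.standard_normal(dim)
                  gy = g(y)
                  accept = rng.random() < np.exp(0.5 * (xs[i] @ xs[i] - y @ y))
                  if gy <= level and accept:        # stay inside {g <= level}
                      xs.append(y); gs.append(gy)
                  else:
                      xs.append(xs[i]); gs.append(gs[i])
              x, gx = np.array(xs), np.array(gs)
          return pf * np.mean(gx <= 0.0)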

  8. Software systems for modeling articulated figures

    NASA Technical Reports Server (NTRS)

    Phillips, Cary B.

    1989-01-01

    Research in computer animation and simulation of human task performance requires sophisticated geometric modeling and user interface tools. The software for a research environment should present the programmer with a powerful but flexible substrate of facilities for displaying and manipulating geometric objects, yet ensure that future tools have a consistent and friendly user interface. Jack is a system which provides a flexible and extensible programmer and user interface for displaying and manipulating complex geometric figures, particularly human figures in a 3D working environment. It is a basic software framework for high-performance Silicon Graphics IRIS workstations for modeling and manipulating geometric objects in a general but powerful way. It provides a consistent and user-friendly interface across various applications in computer animation and simulation of human task performance. Currently, Jack provides input and control for applications including lighting specification and image rendering, anthropometric modeling, figure positioning, inverse kinematics, dynamic simulation, and keyframe animation.

  9. Solar Corona Simulation Model With Positivity-preserving Property

    NASA Astrophysics Data System (ADS)

    Feng, X. S.

    2015-12-01

    Positivity preservation is one of the crucial problems in solar corona simulation. In numerical simulations of low plasma-β regions, keeping density and pressure positive is essential for obtaining physically sound solutions. In the present paper, we utilize the maximum-principle-preserving flux-limiting technique to develop a class of second-order positivity-preserving Godunov finite volume HLL methods for the solar wind plasma MHD equations. Built on the underlying first-order positivity-preserving Lax-Friedrichs building block, our schemes, under the constrained transport (CT) and generalized Lagrange multiplier (GLM) frameworks, achieve high-order accuracy, a discrete divergence-free condition, and positivity of the numerical solution simultaneously, without extra CFL constraints. Numerical results for four Carrington rotations during the declining, rising, minimum, and maximum solar activity phases are provided to demonstrate the performance of the proposed method in modeling small plasma-β regions with the positivity-preserving property.

  10. Modeling Real-Time Coordination of Distributed Expertise and Event Response in NASA Mission Control Center Operations

    NASA Astrophysics Data System (ADS)

    Onken, Jeffrey

    This dissertation introduces a multidisciplinary framework for enabling future research and analysis of alternatives for control centers for real-time operations of safety-critical systems. The multidisciplinary framework integrates functional and computational models that describe the dynamics of fundamental concepts from previously disparate engineering and psychology research disciplines, such as group performance and processes, supervisory control, situation awareness, events and delays, and expertise. The application in this dissertation is real-time operations within the NASA Mission Control Center in Houston, TX. This dissertation operationalizes the framework into a model and simulation, which simulates the functional and computational models in the framework according to user-configured scenarios for a NASA human-spaceflight mission. The model and simulation generates data according to the effectiveness of the mission-control team in supporting the completion of mission objectives and detecting, isolating, and recovering from anomalies. Accompanying the multidisciplinary framework is a proof of concept, which demonstrates the feasibility of such a framework. The proof of concept demonstrates that variability occurs where expected based on the models. The proof of concept also demonstrates that the data generated from the model and simulation are useful for analyzing and comparing MCC configuration alternatives, because an investigator can give a diverse set of scenarios to the simulation and compare the output in detail to inform decisions about the effect of MCC configurations on mission operations performance.

  11. Hybrid stochastic simplifications for multiscale gene networks

    PubMed Central

    Crudu, Alina; Debussche, Arnaud; Radulescu, Ovidiu

    2009-01-01

    Background: Stochastic simulation of gene networks by Markov processes has important applications in molecular biology. The complexity of exact simulation algorithms scales with the number of discrete jumps to be performed. Approximate schemes reduce the computational time by reducing the number of simulated discrete events. Also, answering important questions about the relation between network topology and intrinsic noise generation and propagation should be based on general mathematical results. These general results are difficult to obtain for exact models. Results: We propose a unified framework for hybrid simplifications of Markov models of multiscale stochastic gene network dynamics. We discuss several possible hybrid simplifications, and provide algorithms to obtain them from pure jump processes. In hybrid simplifications, some components are discrete and evolve by jumps, while other components are continuous. Hybrid simplifications are obtained by partial Kramers-Moyal expansion [1-3], which is equivalent to the application of the central limit theorem to a sub-model. By averaging and variable aggregation we drastically reduce simulation time and eliminate non-critical reactions. Hybrid and averaged simplifications can be used for more effective simulation algorithms and for obtaining general design principles relating noise to topology and time scales. The simplified models reproduce with good accuracy the stochastic properties of the gene networks, including waiting times in intermittence phenomena, fluctuation amplitudes and stationary distributions. The methods are illustrated on several gene network examples. Conclusion: Hybrid simplifications can be used for onion-like (multi-layered) approaches to multi-scale biochemical systems, in which various descriptions are used at various scales. Sets of discrete and continuous variables are treated with different methods and are coupled together in a physically justified approach. PMID:19735554
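    To illustrate the hybrid idea, here is a minimal sketch for a toy two-species network: a gene that toggles on/off is kept as a discrete jump component, while the abundant protein is propagated by a Langevin equation, as a partial Kramers-Moyal expansion would suggest. Rates and the network itself are illustrative, not from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy network: gene toggles ON/OFF (discrete jumps); protein number x is
    # large and is approximated by a Langevin equation (hybrid simplification).
    k_on, k_off = 0.05, 0.02        # gene switching rates (illustrative)
    k_prod, k_deg = 50.0, 0.1       # protein production/degradation rates

    def hybrid_simulate(t_end, dt=0.01):
        t, gene, x = 0.0, 1, 200.0
        out = []
        while t < t_end:
            # Discrete part: does the gene jump within this step?
            rate = k_on if gene == 0 else k_off
            if rng.random() < 1.0 - np.exp(-rate * dt):
                gene = 1 - gene
            # Continuous part: Euler-Maruyama step for the protein;
            # drift = production - degradation, diffusion from the CLT.
            drift = k_prod * gene - k_deg * x
            diff2 = k_prod * gene + k_deg * x      # sum of propensities
            x = max(x + drift * dt + np.sqrt(diff2 * dt) * rng.normal(), 0.0)
            t += dt
            out.append((t, gene, x))
        return np.array(out)

    traj = hybrid_simulate(t_end=500.0)
    ```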

  12. STAR-CCM+ Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pointer, William David

    2016-09-30

    The commercial Computational Fluid Dynamics (CFD) code STAR-CCM+ provides general purpose finite volume method solutions for fluid dynamics and energy transport. This document defines plans for verification and validation (V&V) of the base code and models implemented within the code by the Consortium for Advanced Simulation of Light water reactors (CASL). The software quality assurance activities described herein are part of the overall software life cycle defined in the CASL Software Quality Assurance (SQA) Plan [Sieger, 2015]. STAR-CCM+ serves as the principal foundation for development of an advanced predictive multi-phase boiling simulation capability within CASL. The CASL Thermal Hydraulics Methods (THM) team develops advanced closure models required to describe the subgrid-resolution behavior of secondary fluids or fluid phases in multiphase boiling flows within the Eulerian-Eulerian framework of the code. These include wall heat partitioning models that describe the formation of vapor on the surface and the forces that define bubble/droplet dynamic motion. The CASL models are implemented as user coding or field functions within the general framework of the code. This report defines procedures and requirements for V&V of the multi-phase CFD capability developed by CASL THM. Results of V&V evaluations will be documented in a separate STAR-CCM+ V&V assessment report. This report is expected to be a living document and will be updated as additional validation cases are identified and adopted as part of the CASL THM V&V suite.

  13. Relativistic interpretation of Newtonian simulations for cosmic structure formation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fidler, Christian; Tram, Thomas; Crittenden, Robert

    2016-09-01

    The standard numerical tools for studying non-linear collapse of matter are Newtonian N-body simulations. Previous work has shown that these simulations are in accordance with General Relativity (GR) up to first order in perturbation theory, provided that the effects from radiation can be neglected. In this paper we show that the present day matter density receives more than 1% corrections from radiation on large scales if Newtonian simulations are initialised before z = 50. We provide a relativistic framework in which unmodified Newtonian simulations are compatible with linear GR even in the presence of radiation. Our idea is to use GR perturbation theory to keep track of the evolution of relativistic species and the relativistic space-time consistent with the Newtonian trajectories computed in N-body simulations. If metric potentials are sufficiently small, they can be computed using a first-order Einstein–Boltzmann code such as CLASS. We make this idea rigorous by defining a class of GR gauges, the Newtonian motion gauges, which are defined such that matter particles follow Newtonian trajectories. We construct a simple example of a relativistic space-time within which unmodified Newtonian simulations can be interpreted.

  14. A Generic Guidance and Control Structure for Six-Degree-of-Freedom Conceptual Aircraft Design

    NASA Technical Reports Server (NTRS)

    Cotting, M. Christopher; Cox, Timothy H.

    2005-01-01

    A control system framework is presented for both real-time and batch six-degree-of-freedom simulation. This framework allows stabilization and control with multiple command options, from body rate control to waypoint guidance. Also, pilot commands can be used to operate the simulation in a pilot-in-the-loop environment. This control system framework is created by using direct vehicle state feedback with nonlinear dynamic inversion. A direct control allocation scheme is used to command aircraft effectors. Online B-matrix estimation is used in the control allocation algorithm for maximum algorithm flexibility. Primary uses for this framework include conceptual design and early preliminary design of aircraft, where vehicle models change rapidly and a knowledge of vehicle six-degree-of-freedom performance is required. A simulated airbreathing hypersonic vehicle and a simulated high performance fighter are controlled to demonstrate the flexibility and utility of the control system.

  15. Statistical framework for evaluation of climate model simulations by use of climate proxy data from the last millennium - Part 1: Theory

    NASA Astrophysics Data System (ADS)

    Sundberg, R.; Moberg, A.; Hind, A.

    2012-08-01

    A statistical framework for comparing the output of ensemble simulations from global climate models with networks of climate proxy and instrumental records has been developed, focusing on near-surface temperatures for the last millennium. This framework includes the formulation of a joint statistical model for proxy data, instrumental data and simulation data, which is used to optimize a quadratic distance measure for ranking climate model simulations. An essential underlying assumption is that the simulations and the proxy/instrumental series have a shared component of variability that is due to temporal changes in external forcing, such as volcanic aerosol load, solar irradiance or greenhouse gas concentrations. Two statistical tests have been formulated. Firstly, a preliminary test establishes whether a significant temporal correlation exists between instrumental/proxy and simulation data. Secondly, the distance measure is expressed in the form of a test statistic of whether a forced simulation is closer to the instrumental/proxy series than unforced simulations. The proposed framework allows any number of proxy locations to be used jointly, with different seasons, record lengths and statistical precision. The goal is to objectively rank several competing climate model simulations (e.g. with alternative model parameterizations or alternative forcing histories) by means of their goodness of fit to the unobservable true past climate variations, as estimated from noisy proxy data and instrumental observations.

  16. Feller processes: the next generation in modeling. Brownian motion, Lévy processes and beyond.

    PubMed

    Böttcher, Björn

    2010-12-03

    We present a simple construction method for Feller processes and a framework for the generation of sample paths of Feller processes. The construction is based on state space dependent mixing of Lévy processes. Brownian motion is one of the most frequently used continuous time Markov processes in applications. In recent years Lévy processes, of which Brownian motion is a special case, have also become increasingly popular. Lévy processes are spatially homogeneous, but empirical data often suggest the use of spatially inhomogeneous processes. Thus it seems necessary to go to the next level of generalization: Feller processes. These include Lévy processes, and in particular Brownian motion, as special cases but allow spatial inhomogeneities. Many properties of Feller processes are known, but proving their very existence is, in general, very technical. Moreover, an applicable framework for the generation of sample paths of a Feller process was missing. We explain, with practitioners in mind, how to overcome both of these obstacles. In particular, our simulation technique allows Monte Carlo methods to be applied to Feller processes.
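    A minimal sketch of the construction idea: at each small time step the increment is drawn from the Lévy process "frozen" at the current state, here an α-stable process whose stability index depends on position. The specific choice of α(x) and the step sizes are illustrative, not from the paper.

    ```python
    import numpy as np
    from scipy.stats import levy_stable

    rng = np.random.default_rng(1)

    def alpha_of(x):
        # State-dependent stability index in (1, 2]; illustrative choice.
        return 1.2 + 0.7 / (1.0 + x**2)

    def simulate_feller(x0=0.0, dt=1e-3, n_steps=5000):
        """Approximate sample path of a stable-like Feller process: each
        increment comes from the Lévy process frozen at the current state."""
        x = np.empty(n_steps + 1)
        x[0] = x0
        for k in range(n_steps):
            a = alpha_of(x[k])
            # An alpha-stable Lévy increment over time dt has scale dt**(1/alpha).
            x[k + 1] = x[k] + levy_stable.rvs(a, 0.0, scale=dt**(1.0 / a),
                                              random_state=rng)
        return x

    path = simulate_feller()
    ```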

  17. Feller Processes: The Next Generation in Modeling. Brownian Motion, Lévy Processes and Beyond

    PubMed Central

    Böttcher, Björn

    2010-01-01

    We present a simple construction method for Feller processes and a framework for the generation of sample paths of Feller processes. The construction is based on state space dependent mixing of Lévy processes. Brownian motion is one of the most frequently used continuous time Markov processes in applications. In recent years Lévy processes, of which Brownian motion is a special case, have also become increasingly popular. Lévy processes are spatially homogeneous, but empirical data often suggest the use of spatially inhomogeneous processes. Thus it seems necessary to go to the next level of generalization: Feller processes. These include Lévy processes, and in particular Brownian motion, as special cases but allow spatial inhomogeneities. Many properties of Feller processes are known, but proving their very existence is, in general, very technical. Moreover, an applicable framework for the generation of sample paths of a Feller process was missing. We explain, with practitioners in mind, how to overcome both of these obstacles. In particular, our simulation technique allows Monte Carlo methods to be applied to Feller processes. PMID:21151931

  18. Radiative-convective equilibrium model intercomparison project

    NASA Astrophysics Data System (ADS)

    Wing, Allison A.; Reed, Kevin A.; Satoh, Masaki; Stevens, Bjorn; Bony, Sandrine; Ohno, Tomoki

    2018-03-01

    RCEMIP, an intercomparison of multiple types of models configured in radiative-convective equilibrium (RCE), is proposed. RCE is an idealization of the climate system in which there is a balance between radiative cooling of the atmosphere and heating by convection. The scientific objectives of RCEMIP are threefold. First, clouds and climate sensitivity will be investigated in the RCE setting. This includes determining how cloud fraction changes with warming and the role of self-aggregation of convection in climate sensitivity. Second, RCEMIP will quantify the dependence of the degree of convective aggregation and tropical circulation regimes on temperature. Finally, by providing a common baseline, RCEMIP will allow the robustness of the RCE state across the spectrum of models to be assessed, which is essential for interpreting the results found regarding clouds, climate sensitivity, and aggregation, and more generally, for determining which features of tropical climate an RCE framework is useful for. A novel aspect and major advantage of RCEMIP is the accessibility of the RCE framework to a variety of models, including cloud-resolving models, general circulation models, global cloud-resolving models, single-column models, and large-eddy simulation models.

  19. PRIFIRA: General regularization using prior-conditioning for fast radio interferometric imaging

    NASA Astrophysics Data System (ADS)

    Naghibzadeh, Shahrzad; van der Veen, Alle-Jan

    2018-06-01

    Image formation in radio astronomy is a large-scale inverse problem that is inherently ill-posed. We present a general algorithmic framework based on a Bayesian-inspired regularized maximum likelihood formulation of the radio astronomical imaging problem with a focus on diffuse emission recovery from limited noisy correlation data. The algorithm is dubbed PRIor-conditioned Fast Iterative Radio Astronomy (PRIFIRA) and is based on a direct embodiment of the regularization operator into the system by right preconditioning. The resulting system is then solved using an iterative method based on projections onto Krylov subspaces. We motivate the use of a beamformed image (which includes the classical "dirty image") as an efficient prior-conditioner. Iterative reweighting schemes generalize the algorithmic framework and can account for different regularization operators that encourage sparsity of the solution. The performance of the proposed method is evaluated based on simulated one- and two-dimensional array arrangements as well as actual data from the core stations of the Low Frequency Array radio telescope antenna configuration, and compared to state-of-the-art imaging techniques. We show the generality of the proposed method in terms of regularization schemes while maintaining a competitive reconstruction quality with the current reconstruction techniques. Furthermore, we show that exploiting Krylov subspace methods together with the proper noise-based stopping criteria results in a great improvement in imaging efficiency.
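    A minimal sketch of the right prior-conditioning idea, not the authors' code: solve min_z ||A diag(w) z - b|| with a Krylov method and recover x = diag(w) z, where the weights w come from a beamformed/dirty image. The measurement operator, data, and prior here are tiny synthetic stand-ins.

    ```python
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, lsqr

    def prior_conditioned_solve(A, b, w_prior, iters=50):
        """Right prior-conditioned least squares via a Krylov solver."""
        m, n = A.shape
        W = np.asarray(w_prior)
        Aw = LinearOperator((m, n),
                            matvec=lambda z: A @ (W * z),
                            rmatvec=lambda y: W * (A.conj().T @ y))
        z = lsqr(Aw, b, iter_lim=iters)[0]
        return W * z

    # Tiny synthetic example (A, b, and the prior are illustrative):
    rng = np.random.default_rng(2)
    A = rng.normal(size=(120, 64))
    x_true = np.zeros(64); x_true[[5, 20, 40]] = [3.0, 1.5, 2.0]
    b = A @ x_true + 0.01 * rng.normal(size=120)
    w = np.abs(A.T @ b)              # stand-in for the beamformed-image prior
    x_hat = prior_conditioned_solve(A, b, w / w.max())
    ```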

  20. A generalized mixed effects model of abundance for mark-resight data when sampling is without replacement

    USGS Publications Warehouse

    McClintock, B.T.; White, Gary C.; Burnham, K.P.; Pryde, M.A.; Thomson, David L.; Cooch, Evan G.; Conroy, Michael J.

    2009-01-01

    In recent years, the mark-resight method for estimating abundance when the number of marked individuals is known has become increasingly popular. By using field-readable bands that may be resighted from a distance, these techniques can be applied to many species, and are particularly useful for relatively small, closed populations. However, due to the different assumptions and general rigidity of the available estimators, researchers must often commit to a particular model without rigorous quantitative justification for model selection based on the data. Here we introduce a nonlinear logit-normal mixed effects model addressing this need for a more generalized framework. Similar to models available for mark-recapture studies, the estimator allows a wide variety of sampling conditions to be parameterized efficiently under a robust sampling design. Resighting rates may be modeled simply or with more complexity by including fixed temporal and random individual heterogeneity effects. Using information theory, the model(s) best supported by the data may be selected from the candidate models proposed. Under this generalized framework, we hope the uncertainty associated with mark-resight model selection will be reduced substantially. We compare our model to other mark-resight abundance estimators when applied to mainland New Zealand robin (Petroica australis) data recently collected in Eglinton Valley, Fiordland National Park and summarize its performance in simulation experiments.
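    The data-generating idea behind the logit-normal mixed effects model can be sketched in a few lines: each marked individual's resighting probability combines a fixed occasion effect and a random individual effect on the logit scale, and sampling without replacement means each individual is seen at most once per occasion. All numbers are illustrative.

    ```python
    import numpy as np
    from scipy.special import expit

    rng = np.random.default_rng(3)

    n_marked = 60          # known number of marked individuals
    n_occasions = 5        # resighting occasions (without replacement)
    beta_t = rng.normal(0.0, 0.5, n_occasions)   # fixed temporal effects
    sigma = 0.8            # individual heterogeneity on the logit scale

    z = rng.normal(0.0, sigma, n_marked)         # random individual effects
    # Resighting probability for individual i on occasion j:
    p = expit(beta_t[None, :] + z[:, None])      # shape (n_marked, n_occasions)

    # Without replacement: at most one sighting per individual per occasion,
    # so each cell is a single Bernoulli trial.
    seen = rng.random((n_marked, n_occasions)) < p
    print("total resightings per occasion:", seen.sum(axis=0))
    ```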

  1. An Evaluation of the Generalized Intelligent Framework for Tutoring (GIFT) from a Researcher’s or Analyst’s Perspective

    DTIC Science & Technology

    2014-12-01

    An Evaluation of the Generalized Intelligent Framework for Tutoring (GIFT) from a Researcher’s or Analyst’s Perspective by Robert A...Generalized Intelligent Framework for Tutoring (GIFT) from a Researcher’s or Analyst’s Perspective Robert A Sottilare and Anne M Sinatra Human...2014 4. TITLE AND SUBTITLE An Evaluation of the Generalized Intelligent Framework for Tutoring (GIFT) from a Researcher’s or Analyst’s Perspective

  2. Component Framework for Loosely Coupled High Performance Integrated Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Elwasif, W. R.; Bernholdt, D. E.; Shet, A. G.; Batchelor, D. B.; Foley, S.

    2010-11-01

    We present the design and implementation of a component-based simulation framework for the execution of coupled time-dependent plasma modeling codes. The Integrated Plasma Simulator (IPS) provides a flexible, lightweight component model that streamlines the integration of stand-alone codes into coupled simulations. Stand-alone codes are adapted to the IPS component interface specification using a thin wrapping layer implemented in the Python programming language. The framework provides services for inter-component method invocation; configuration, task, and data management; asynchronous event management; simulation monitoring; and checkpoint/restart capabilities. Services are invoked, as needed, by the computational components to coordinate the execution of different aspects of coupled simulations on Massively Parallel Processing (MPP) machines. A common plasma state layer serves as the foundation for inter-component, file-based data exchange. The IPS design principles, implementation details, and execution model will be presented, along with an overview of several use cases.
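    The shape of such a thin wrapping layer can be sketched as follows. This is an illustrative mimic, not the actual IPS API: the class name, the services facade, and every method shown are hypothetical.

    ```python
    import subprocess

    class ComponentWrapper:
        """Hypothetical mimic of a thin IPS-style wrapper: adapts a stand-alone
        physics code to a common init/step/checkpoint interface."""

        def __init__(self, services, config):
            self.services = services      # framework services facade (hypothetical)
            self.config = config          # executable path, input files, etc.

        def init(self, timestamp):
            # Stage the stand-alone code's input files into the work area.
            self.services.stage_input_files(self.config["inputs"])

        def step(self, timestamp):
            # Read shared fields from the common plasma-state file, run the
            # stand-alone binary, then publish the updated fields back.
            state_file = self.services.get_plasma_state()
            subprocess.run([self.config["binary"], state_file], check=True)
            self.services.update_plasma_state(state_file)

        def checkpoint(self, timestamp):
            self.services.save_restart_files(timestamp)
    ```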

  3. A flexible framework for process-based hydraulic and water ...

    EPA Pesticide Factsheets

    Background: Models that allow for design considerations of green infrastructure (GI) practices to control stormwater runoff and associated contaminants have received considerable attention in recent years. While popular, GI models are generally relatively simplistic. However, GI model predictions are being relied upon by many municipalities and state/local agencies to make decisions about grey vs. green infrastructure improvement planning. Adding complexity to GI modeling frameworks may preclude their use in simpler urban planning situations. Therefore, the goal here was to develop a sophisticated yet flexible tool that could be used by design engineers and researchers to capture and explore the effect of design factors and properties of the media used on the performance of GI systems at a relatively small scale. We deemed it essential to have a flexible GI modeling tool that is capable of accurately simulating GI system components and specific biophysical processes affecting contaminants, such as reactions and particle-associated transport, while maintaining a high degree of flexibility to account for the myriad of GI alternatives. The mathematical framework for a stand-alone GI performance assessment tool has been developed and will be demonstrated. Framework Features: The process-based model framework developed here can be used to model a diverse range of GI practices, such as green roofs, retention ponds, bioretention, infiltration trenches, and permeable pavement.

  4. Probabilistic approach for earthquake scenarios in the Marmara region from dynamic rupture simulations

    NASA Astrophysics Data System (ADS)

    Aochi, Hideo

    2014-05-01

    The Marmara region (Turkey), along the North Anatolian fault, is known to have a high potential for large earthquakes in the coming decades. For the purpose of seismic hazard/risk evaluation, kinematic and dynamic source models have been proposed (e.g. Oglesby and Mai, GJI, 2012). In general, simulated earthquake scenarios depend on the underlying hypotheses and cannot be verified before the expected earthquake. We therefore introduce a probabilistic treatment of the initial/boundary conditions so that the simulated scenarios can be analyzed statistically. We prepare different fault geometry models, tectonic loadings and hypocenter locations. We keep the same simulation framework as for the dynamic rupture process of the adjacent 1999 Izmit earthquake (Aochi and Madariaga, BSSA, 2003), as the previous models were able to reproduce the seismological/geodetic aspects of that event. Irregularities in fault geometry play a significant role in controlling the rupture progress, and a relatively large change in geometry may act as a barrier. The variety of the simulated earthquake scenarios should be useful for estimating the variability of the expected ground motion.

  5. Relativistic N-body simulations with massive neutrinos

    NASA Astrophysics Data System (ADS)

    Adamek, Julian; Durrer, Ruth; Kunz, Martin

    2017-11-01

    Some of the dark matter in the Universe is made up of massive neutrinos. Their impact on the formation of large scale structure can be used to determine their absolute mass scale from cosmology, but to this end accurate numerical simulations have to be developed. Due to their relativistic nature, neutrinos pose additional challenges when one tries to include them in N-body simulations that are traditionally based on Newtonian physics. Here we present the first numerical study of massive neutrinos that uses a fully relativistic approach. Our N-body code, gevolution, is based on a weak-field formulation of general relativity that naturally provides a self-consistent framework for relativistic particle species. This allows us to model neutrinos from first principles, without invoking any ad-hoc recipes. Our simulation suite comprises some of the largest neutrino simulations performed to date. We study the effect of massive neutrinos on the nonlinear power spectra and the halo mass function, focusing on the interesting mass range between 0.06 eV and 0.3 eV and including a case for an inverted mass hierarchy.

  6. Learn, see, practice, prove, do, maintain: an evidence-based pedagogical framework for procedural skill training in medicine.

    PubMed

    Sawyer, Taylor; White, Marjorie; Zaveri, Pavan; Chang, Todd; Ades, Anne; French, Heather; Anderson, JoDee; Auerbach, Marc; Johnston, Lindsay; Kessler, David

    2015-08-01

    Acquisition of competency in procedural skills is a fundamental goal of medical training. In this Perspective, the authors propose an evidence-based pedagogical framework for procedural skill training. The framework was developed based on a review of the literature using a critical synthesis approach and builds on earlier models of procedural skill training in medicine. The authors begin by describing the fundamentals of procedural skill development. Then, a six-step pedagogical framework for procedural skills training is presented: Learn, See, Practice, Prove, Do, and Maintain. In this framework, procedural skill training begins with the learner acquiring requisite cognitive knowledge through didactic education (Learn) and observation of the procedure (See). The learner then progresses to the stage of psychomotor skill acquisition and is allowed to deliberately practice the procedure on a simulator (Practice). Simulation-based mastery learning is employed to allow the trainee to prove competency prior to performing the procedure on a patient (Prove). Once competency is demonstrated on a simulator, the trainee is allowed to perform the procedure on patients with direct supervision, until he or she can be entrusted to perform the procedure independently (Do). Maintenance of the skill is ensured through continued clinical practice, supplemented by simulation-based training as needed (Maintain). Evidence in support of each component of the framework is presented. Implementation of the proposed framework presents a paradigm shift in procedural skill training. However, the authors believe that adoption of the framework will improve procedural skill training and patient safety.

  7. A conceptual modeling framework for discrete event simulation using hierarchical control structures.

    PubMed

    Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D

    2015-08-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.

  8. The LSST metrics analysis framework (MAF)

    NASA Astrophysics Data System (ADS)

    Jones, R. L.; Yoachim, Peter; Chandrasekharan, Srinivasan; Connolly, Andrew J.; Cook, Kem H.; Ivezic, Željko; Krughoff, K. S.; Petry, Catherine; Ridgway, Stephen T.

    2014-07-01

    We describe the Metrics Analysis Framework (MAF), an open-source Python framework developed to provide a user-friendly, customizable, easily extensible set of tools for analyzing data sets. MAF is part of the Large Synoptic Survey Telescope (LSST) Simulations effort. Its initial goal is to provide a tool to evaluate LSST Operations Simulation (OpSim) simulated surveys to help understand the effects of telescope scheduling on survey performance; however, MAF can be applied to a much wider range of datasets. The building blocks of the framework are Metrics (algorithms to analyze a given quantity of data), Slicers (subdividing the overall data set into smaller data slices as relevant for each Metric), and Database classes (to access the dataset and read data into memory). We describe how these building blocks work together, and provide an example of using MAF to evaluate different dithering strategies. We also outline how users can write their own custom Metrics and use these within the framework.
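    The Metric/Slicer decomposition can be illustrated with a toy pair of classes. This is a mimic of the architecture only; the class names and call signatures are invented for illustration and are not the real MAF API.

    ```python
    import numpy as np

    class MeanSeeingMetric:
        """Toy 'Metric' in the MAF spirit: reduce one slice of visits to a number."""
        def run(self, data_slice):
            return np.mean(data_slice["seeing"])

    class SkyPatchSlicer:
        """Toy 'Slicer': subdivide the visit table by an integer sky-patch id."""
        def __init__(self, patch_ids):
            self.patch_ids = patch_ids
        def slices(self, data):
            for pid in np.unique(self.patch_ids):
                yield data[self.patch_ids == pid]

    # Tiny synthetic visit table:
    rng = np.random.default_rng(4)
    visits = np.zeros(1000, dtype=[("seeing", "f8")])
    visits["seeing"] = rng.gamma(4.0, 0.2, 1000)
    patches = rng.integers(0, 10, 1000)

    metric, slicer = MeanSeeingMetric(), SkyPatchSlicer(patches)
    values = [metric.run(s) for s in slicer.slices(visits)]
    ```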

  9. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  10. Next Generation Simulation Framework for Robotic and Human Space Missions

    NASA Technical Reports Server (NTRS)

    Cameron, Jonathan M.; Balaram, J.; Jain, Abhinandan; Kuo, Calvin; Lim, Christopher; Myint, Steven

    2012-01-01

    The Dartslab team at NASA's Jet Propulsion Laboratory (JPL) has a long history of developing physics-based simulations based on the Darts/Dshell simulation framework that have been used to simulate many planetary robotic missions, such as the Cassini spacecraft and the rovers that are currently driving on Mars. Recent collaboration efforts between the Dartslab team at JPL and the Mission Operations Directorate (MOD) at NASA Johnson Space Center (JSC) have led to significant enhancements to the Dartslab DSENDS (Dynamics Simulator for Entry, Descent and Surface landing) software framework. The new version of DSENDS is now being used for new planetary mission simulations at JPL. JSC is using DSENDS as the foundation for a suite of software known as COMPASS (Core Operations, Mission Planning, and Analysis Spacecraft Simulation) that is the basis for their new human space mission simulations and analysis. In this paper, we will describe the collaborative process with the JPL Dartslab and the JSC MOD team that resulted in the redesign and enhancement of the DSENDS software. We will outline the improvements in DSENDS that simplify creation of new high-fidelity robotic/spacecraft simulations. We will illustrate how DSENDS simulations are assembled and show results from several mission simulations.

  11. A Function-Behavior-State Approach to Designing Human Machine Interface for Nuclear Power Plant Operators

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Zhang, W. J.

    2005-02-01

    This paper presents an approach to human-machine interface design for control room operators of nuclear power plants. The first step in designing an interface for a particular application is to determine the information content that needs to be displayed. The design methodology for this step is called the interface design framework (the "framework"). Several frameworks have been proposed for applications at varying levels, including process plants. However, none is based on the design and manufacture of the plant system for which the interface is designed. This paper presents an interface design framework which originates from design theory and methodology for general technical systems. Specifically, the framework is based on a set of core concepts of a function-behavior-state model originally proposed by the artificial intelligence research community and widely applied in the design research community. Benefits of this new framework include the provision of a model-based fault diagnosis facility, and the seamless integration of the design (manufacture, maintenance) of plants and the design of human-machine interfaces. The missing linkage between design and operation of a plant was one of the causes of the Three Mile Island nuclear reactor incident. A simulated plant system is presented to explain how to apply this framework in designing an interface. The resulting human-machine interface is discussed; specifically, several fault diagnosis examples are elaborated to demonstrate how this interface could support operators' fault diagnosis in an unanticipated situation.

  12. A generalized Levene's scale test for variance heterogeneity in the presence of sample correlation and group uncertainty.

    PubMed

    Soave, David; Sun, Lei

    2017-09-01

    We generalize Levene's test for variance (scale) heterogeneity between k groups to more complex data, where there are sample correlation and group membership uncertainty. Following a two-stage regression framework, we show that least absolute deviation regression must be used in the stage 1 analysis to ensure a correct asymptotic χ²_{k-1}/(k-1) distribution of the generalized scale (gS) test statistic. We then show that the proposed gS test is independent of the generalized location test, under the joint null hypothesis of no mean and no variance heterogeneity. Consequently, we generalize the recently proposed joint location-scale (gJLS) test, valuable in settings where there is an interaction effect but one interacting variable is not available. We evaluate the proposed method via an extensive simulation study and two genetic association application studies. © 2017 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
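    A minimal sketch of the two-stage idea for the classic independent-sample case: stage 1 removes group locations by a least absolute deviation fit (equivalently, group medians here), and stage 2 tests the absolute residuals across groups, i.e. the Brown-Forsythe form of Levene's test. The correlated-sample and group-uncertainty generalizations of the paper are not shown; the data are synthetic.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)

    # Three groups with equal medians but unequal spread.
    y = np.concatenate([rng.normal(0, 1, 100),
                        rng.normal(0, 1, 100),
                        rng.normal(0, 2, 100)])
    g = np.repeat([0, 1, 2], 100)

    # Stage 1 (LAD): residuals about each group's median, not its mean --
    # the median (LAD) fit is what preserves the asymptotic chi^2_{k-1}/(k-1) result.
    medians = np.array([np.median(y[g == k]) for k in range(3)])
    d = np.abs(y - medians[g])

    # Stage 2: one-way ANOVA on the absolute residuals.
    F, p = stats.f_oneway(d[g == 0], d[g == 1], d[g == 2])
    print(f"scale test: F = {F:.2f}, p = {p:.4f}")
    ```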

  13. General mixture item response models with different item response structures: Exposition with an application to Likert scales.

    PubMed

    Tijmstra, Jesper; Bolsinova, Maria; Jeon, Minjeong

    2018-01-10

    This article proposes a general mixture item response theory (IRT) framework that allows for classes of persons to differ with respect to the type of processes underlying the item responses. Through the use of mixture models, nonnested IRT models with different structures can be estimated for different classes, and class membership can be estimated for each person in the sample. If researchers are able to provide competing measurement models, this mixture IRT framework may help them deal with some violations of measurement invariance. To illustrate this approach, we consider a two-class mixture model, where a person's responses to Likert-scale items containing a neutral middle category are either modeled using a generalized partial credit model, or through an IRTree model. In the first model, the middle category ("neither agree nor disagree") is taken to be qualitatively similar to the other categories, and is taken to provide information about the person's endorsement. In the second model, the middle category is taken to be qualitatively different and to reflect a nonresponse choice, which is modeled using an additional latent variable that captures a person's willingness to respond. The mixture model is studied using simulation studies and is applied to an empirical example.

  14. Using computer simulations to facilitate conceptual understanding of electromagnetic induction

    NASA Astrophysics Data System (ADS)

    Lee, Yu-Fen

    This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education, categorized by three different learning frameworks, along with studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of the reviewed literature, I proposed effective approaches to integrating computer simulations in physics education. These approaches are consistent with well-established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). The research-based approaches to integrating computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research-based computer simulations developed by the physics education research group at the University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. After receiving model reasoning online, students were asked to submit their revised answers electronically. Students in the TRAD group were not granted access to the CLCS material and followed their normal classroom routine. At the end of the study, both the CLCS and TRAD students took a post-test. Questions on the post-test were divided into "what" questions, "how" questions, and an open response question. Analysis of students' post-test performance showed mixed results. While the TRAD students scored higher on the "what" questions, the CLCS students scored higher on the "how" questions and the open response question. This result suggested that more TRAD students knew what kinds of conditions may or may not cause electromagnetic induction without understanding how electromagnetic induction works. Analysis of the CLCS students' learning also suggested that frequent disruption and technical trouble might pose threats to the effectiveness of the CLCS learning framework. Despite the mixed post-test results, the study revealed some limitations of the CLCS learning framework in promoting conceptual understanding in physics. Improvement can be made by providing students with the background knowledge necessary to understand model reasoning and by incorporating the CLCS learning framework with other learning frameworks to promote integration of various physics concepts. In addition, the reflective questions in the CLCS learning framework may be refined to better address students' difficulties. Limitations of the study, as well as suggestions for future research, are also presented.

  15. Enhancements to the SHARP Build System and NEK5000 Coupling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCaskey, Alex; Bennett, Andrew R.; Billings, Jay Jay

    The SHARP project for the Department of Energy's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program provides a multiphysics framework for coupled simulations of advanced nuclear reactor designs. It provides an overall coupling environment that utilizes custom interfaces to couple existing physics codes through a common spatial decomposition and unique solution transfer component. As of this writing, SHARP couples neutronics, thermal hydraulics, and structural mechanics using PROTEUS, Nek5000, and Diablo respectively. This report details two primary SHARP improvements regarding the Nek5000 and Diablo individual physics codes: (1) an improved Nek5000 coupling interface that lets SHARP achieve a vast increase in overall solution accuracy by manipulating the structure of the internal Nek5000 spatial mesh, and (2) the capability to seamlessly couple structural mechanics calculations into the framework through improvements to the SHARP build system. The Nek5000 coupling interface now uses a barycentric Lagrange interpolation method that takes the vertex-based power and density computed from the PROTEUS neutronics solver and maps it to the user-specified, general-order Nek5000 spectral element mesh. Before this work, SHARP handled this vertex-based solution transfer in an averaging-based manner. SHARP users can now achieve higher levels of accuracy by specifying any arbitrary Nek5000 spectral mesh order. This improvement takes the average percentage error between the PROTEUS power solution and the Nek5000 interpolated result down drastically from over 23% to just above 2%, and maintains the correct power profile. We have integrated Diablo into the SHARP build system to facilitate the future coupling of structural mechanics calculations into SHARP. Previously, simulations involving Diablo were done in an iterative manner, requiring a large amount of manual work, and were left only as a task for advanced users. This report details a new Diablo build system that was implemented using GNU Autotools, mirroring much of the current SHARP build system, and easing the use of structural mechanics calculations for end-users of the SHARP multiphysics framework. It lets users easily build and use Diablo as a stand-alone simulation, as well as fully couple it with the other SHARP physics modules. The top-level SHARP build system was modified to allow Diablo to hook in directly. New dependency handlers were implemented to let SHARP users easily build the framework with these new simulation capabilities. The remainder of this report describes this work in full, with a detailed discussion of the overall design philosophy of SHARP, the new solution interpolation method introduced, and the Diablo integration work. We conclude with a discussion of possible future SHARP improvements that will serve to increase solution accuracy and framework capability.
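    The vertex-to-spectral-mesh mapping can be sketched in one dimension with SciPy's barycentric Lagrange interpolator; the actual SHARP coupling operates on 3D spectral element meshes, and the data and node choices below are illustrative (Chebyshev-Lobatto points stand in for the Gauss-Lobatto-Legendre nodes of a spectral element).

    ```python
    import numpy as np
    from scipy.interpolate import BarycentricInterpolator

    # Vertex-based power profile from a neutronics solve (illustrative data).
    x_vertices = np.linspace(0.0, 1.0, 9)
    power = 1.0 + 0.3 * np.sin(np.pi * x_vertices)

    # Barycentric Lagrange interpolation onto the nodes of a higher-order
    # element, the way an arbitrary-order spectral mesh samples the field.
    interp = BarycentricInterpolator(x_vertices, power)
    x_nodes = 0.5 * (1.0 - np.cos(np.pi * np.linspace(0.0, 1.0, 17)))
    power_on_spectral_mesh = interp(x_nodes)
    ```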

  16. A proposal for a computer-based framework of support for public health in the management of biological incidents: the Czech Republic experience.

    PubMed

    Bures, Vladimír; Otcenásková, Tereza; Cech, Pavel; Antos, Karel

    2012-11-01

    Biological incidents jeopardising public health require decision-making with one dominant feature: complexity. Public health decision-makers therefore require appropriate support. Based on an analogy with business intelligence (BI) principles, a contextual analysis of the environment and available data resources, and conceptual modelling within systems and knowledge engineering, this paper proposes a general framework for computer-based decision support in the case of a biological incident. At the outset, an analysis of potential inputs to the framework is conducted, and several resources such as demographic information, strategic documents, environmental characteristics, agent descriptors and surveillance systems are considered. Consequently, three prototypes were developed, tested and evaluated by a group of experts. Their selection was based on the overall framework scheme. Subsequently, an ontology prototype linked with an inference engine, a multi-agent-based model focusing on the simulation of an environment, and expert-system prototypes were created. All prototypes proved to be usable support tools for decision-making in the field of public health. Nevertheless, the research revealed further issues and challenges that might be investigated by both public health focused researchers and practitioners.

  17. A diffusion model-free framework with echo time dependence for free-water elimination and brain tissue microstructure characterization.

    PubMed

    Molina-Romero, Miguel; Gómez, Pedro A; Sperl, Jonathan I; Czisch, Michael; Sämann, Philipp G; Jones, Derek K; Menzel, Marion I; Menze, Bjoern H

    2018-03-23

    The compartmental nature of brain tissue microstructure is typically studied by diffusion MRI, MR relaxometry or their correlation. Diffusion MRI relies on signal representations or biophysical models, while MR relaxometry and correlation studies are based on regularized inverse Laplace transforms (ILTs). Here we introduce a general framework for characterizing microstructure that does not depend on diffusion modeling and replaces ill-posed ILTs with blind source separation (BSS). This framework yields proton density, relaxation times, volume fractions, and signal disentanglement, allowing for separation of the free-water component. Diffusion experiments repeated for several different echo times contain entangled diffusion and relaxation compartmental information. These can be disentangled by BSS using a physically constrained nonnegative matrix factorization. Computer simulations and phantom studies, together with repeatability and reproducibility experiments, demonstrated that BSS is capable of estimating proton density, compartmental volume fractions and transverse relaxation times. In vivo results proved its potential to correct for free-water contamination and to estimate tissue parameters. Formulating the diffusion-relaxation dependence as a BSS problem introduces a new framework for studying microstructure compartmentalization, and a novel tool for free-water elimination. © 2018 International Society for Magnetic Resonance in Medicine.
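    A toy version of the separation idea: multi-echo signals are approximately nonnegative mixtures of compartment decay curves, so a nonnegativity-constrained factorization can pull apart a fast-decaying tissue source from a slow, free-water-like source. The sketch below uses scikit-learn's generic NMF as a stand-in for the authors' physically constrained factorization, on synthetic data with illustrative T2 values.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(6)

    # Synthetic multi-echo signals: rows = echo times, columns = voxels.
    TE = np.linspace(0.04, 0.2, 8)[:, None]        # echo times in seconds
    s_tissue = np.exp(-TE / 0.07)                  # tissue, T2 ~ 70 ms
    s_water  = np.exp(-TE / 0.50)                  # free water, long T2
    frac = rng.uniform(0.0, 0.6, (1, 200))         # free-water fraction per voxel
    X = s_tissue @ (1 - frac) + s_water @ frac
    X += 0.005 * rng.random(X.shape)               # noise, kept nonnegative

    # Blind source separation by nonnegative factorization X ~ W @ H:
    # W columns ~ compartment decay curves, H rows ~ voxel volume fractions.
    model = NMF(n_components=2, init="nndsvda", max_iter=2000)
    W = model.fit_transform(X)
    H = model.components_
    ```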

  18. A new framework for comprehensive, robust, and efficient global sensitivity analysis: 1. Theory

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin V.

    2016-01-01

    Computer simulation models are continually growing in complexity with increasingly more factors to be identified. Sensitivity Analysis (SA) provides an essential means for understanding the role and importance of these factors in producing model responses. However, conventional approaches to SA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we present a new and general sensitivity analysis framework (called VARS), based on an analogy to "variogram analysis," that provides an intuitive and comprehensive characterization of sensitivity across the full spectrum of scales in the factor space. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are special cases of VARS, and that their SA indices can be computed as by-products of the VARS framework. Synthetic functions that resemble actual model response surfaces are used to illustrate the concepts, and show VARS to be as much as two orders of magnitude more computationally efficient than the state-of-the-art Sobol approach. In a companion paper, we propose a practical implementation strategy, and demonstrate the effectiveness, efficiency, and reliability (robustness) of the VARS framework on real-data case studies.
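    The core object of the variogram-analogy can be sketched directly: for factor i, the directional variogram γ_i(h) = ½ E[(y(x + h·e_i) − y(x))²] is estimated from sampled pairs at lag h, and VARS characterizes sensitivity by how these curves behave across scales. The response surface and sampling below are illustrative; the framework's integration of the curves and its links to Morris and Sobol indices are not shown.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def model(x):
        # Toy response surface; x has shape (..., 2).
        return np.sin(3 * x[..., 0]) + 0.1 * x[..., 1] ** 2

    def directional_variogram(i, h, n_pairs=5000, dim=2):
        """gamma_i(h) = 0.5 * E[(y(x + h*e_i) - y(x))^2] over the unit cube."""
        x = rng.uniform(0, 1, (n_pairs, dim))
        x[:, i] = rng.uniform(0, 1 - h, n_pairs)   # leave room for the lag
        xh = x.copy()
        xh[:, i] += h
        return 0.5 * np.mean((model(xh) - model(x)) ** 2)

    # Variogram curves across a range of scales for each factor:
    lags = np.linspace(0.05, 0.5, 10)
    gamma = {i: [directional_variogram(i, h) for h in lags] for i in range(2)}
    ```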

  19. The strength and dislocation microstructure evolution in superalloy microcrystals

    NASA Astrophysics Data System (ADS)

    Hussein, Ahmed M.; Rao, Satish I.; Uchic, Michael D.; Parthasarathay, Triplicane A.; El-Awady, Jaafar A.

    2017-02-01

    In this work, the evolution of the dislocation microstructure in single crystal two-phase superalloy microcrystals under monotonic loading has been studied using the three-dimensional discrete dislocation dynamics (DDD) method. The DDD framework has been extended to properly handle the collective behavior of dislocations and their interactions with large collections of arbitrarily shaped precipitates. Few constraints are imposed on the initial distribution of the dislocations or the precipitates, and the extended DDD framework can support experimentally obtained precipitate geometries. Full tracking of the creation and destruction of anti-phase boundaries (APBs) is accounted for. The effects of the precipitate volume fraction, APB energy, precipitate size, and crystal size on the deformation of superalloy microcrystals have been quantified. Correlations between the precipitate microstructure and the dominant deformation features, such as dislocation looping versus precipitate shearing, are also discussed. It is shown that the mechanical strength is independent of the crystal size, increases linearly with increasing volume fraction, follows a near square-root relationship with the APB energy and an inverse square-root relationship with the precipitate size. Finally, simulations with initial dislocation pair sources show a flow strength that is about one half of that predicted by simulations starting with single dislocation sources. The method developed can be used, with minimal extensions, to simulate dislocation microstructure evolution in general multiphase materials.

  20. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  1. Run-to-Run Optimization Control Within Exact Inverse Framework for Scan Tracking.

    PubMed

    Yeoh, Ivan L; Reinhall, Per G; Berg, Martin C; Chizeck, Howard J; Seibel, Eric J

    2017-09-01

    A run-to-run optimization controller uses a reduced set of measurement parameters, in comparison to more general feedback controllers, to converge to the best control point for a repetitive process. A new run-to-run optimization controller is presented for the scanning fiber device used for image acquisition and display. This controller utilizes very sparse measurements to estimate a system energy measure and updates the input parameterizations iteratively within a feedforward, exact-inversion framework. Analysis, simulation, and experimental investigations of the scanning fiber device demonstrate improved scan accuracy over previous methods and automatic controller adaptation to changing operating temperature. A specific application example and quantitative error analyses are provided for a scanning fiber endoscope that maintains high image quality continuously across a 20 °C temperature rise without interruption of the 56 Hz video.

  2. A generalized framework unifying image registration and respiratory motion models and incorporating image reconstruction, for partial image data or full images

    NASA Astrophysics Data System (ADS)

    McClelland, Jamie R.; Modat, Marc; Arridge, Simon; Grimes, Helen; D'Souza, Derek; Thomas, David; O' Connell, Dylan; Low, Daniel A.; Kaza, Evangelia; Collins, David J.; Leach, Martin O.; Hawkes, David J.

    2017-06-01

    Surrogate-driven respiratory motion models relate the motion of the internal anatomy to easily acquired respiratory surrogate signals, such as the motion of the skin surface. They are usually built by first using image registration to determine the motion from a number of dynamic images, and then fitting a correspondence model relating the motion to the surrogate signals. In this paper we present a generalized framework that unifies the image registration and correspondence model fitting into a single optimization. This allows the use of ‘partial’ imaging data, such as individual slices, projections, or k-space data, where it would not be possible to determine the motion from an individual frame of data. Motion compensated image reconstruction can also be incorporated using an iterative approach, so that both the motion and a motion-free image can be estimated from the partial image data. The framework has been applied to real 4DCT, Cine CT, multi-slice CT, and multi-slice MR data, as well as simulated datasets from a computer phantom. This includes the use of a super-resolution reconstruction method for the multi-slice MR data. Good results were obtained for all datasets, including quantitative results for the 4DCT and phantom datasets where the ground truth motion was known or could be estimated.
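    In schematic form, the unified optimization fits the correspondence-model parameters directly against the partial data, rather than against precomputed registrations. The notation below is illustrative, not the paper's exact formulation:

    ```latex
    % R : correspondence-model parameters      s_t : surrogate signal at time t
    % M(s_t; R) : modelled motion              T(.) : warp of the reference image I_0
    % P_t : partial-data acquisition operator (slice selection, projection,
    %       or k-space sampling)               p_t : measured partial data
    \hat{R} \;=\; \arg\min_{R} \sum_{t}
      \Bigl\| \, P_t\!\bigl( I_0 \circ T\!\left( M(s_t;\, R) \right) \bigr) - p_t \, \Bigr\|^2
    ```

    With motion compensated image reconstruction, I_0 itself joins the unknowns and is alternately updated with R in an iterative scheme.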

  3. A generalized framework unifying image registration and respiratory motion models and incorporating image reconstruction, for partial image data or full images.

    PubMed

    McClelland, Jamie R; Modat, Marc; Arridge, Simon; Grimes, Helen; D'Souza, Derek; Thomas, David; Connell, Dylan O'; Low, Daniel A; Kaza, Evangelia; Collins, David J; Leach, Martin O; Hawkes, David J

    2017-06-07

    Surrogate-driven respiratory motion models relate the motion of the internal anatomy to easily acquired respiratory surrogate signals, such as the motion of the skin surface. They are usually built by first using image registration to determine the motion from a number of dynamic images, and then fitting a correspondence model relating the motion to the surrogate signals. In this paper we present a generalized framework that unifies the image registration and correspondence model fitting into a single optimization. This allows the use of 'partial' imaging data, such as individual slices, projections, or k-space data, where it would not be possible to determine the motion from an individual frame of data. Motion compensated image reconstruction can also be incorporated using an iterative approach, so that both the motion and a motion-free image can be estimated from the partial image data. The framework has been applied to real 4DCT, Cine CT, multi-slice CT, and multi-slice MR data, as well as simulated datasets from a computer phantom. This includes the use of a super-resolution reconstruction method for the multi-slice MR data. Good results were obtained for all datasets, including quantitative results for the 4DCT and phantom datasets where the ground truth motion was known or could be estimated.

  4. A generalized framework unifying image registration and respiratory motion models and incorporating image reconstruction, for partial image data or full images

    PubMed Central

    McClelland, Jamie R; Modat, Marc; Arridge, Simon; Grimes, Helen; D’Souza, Derek; Thomas, David; Connell, Dylan O’; Low, Daniel A; Kaza, Evangelia; Collins, David J; Leach, Martin O; Hawkes, David J

    2017-01-01

    Surrogate-driven respiratory motion models relate the motion of the internal anatomy to easily acquired respiratory surrogate signals, such as the motion of the skin surface. They are usually built by first using image registration to determine the motion from a number of dynamic images, and then fitting a correspondence model relating the motion to the surrogate signals. In this paper we present a generalized framework that unifies the image registration and correspondence model fitting into a single optimization. This allows the use of ‘partial’ imaging data, such as individual slices, projections, or k-space data, where it would not be possible to determine the motion from an individual frame of data. Motion compensated image reconstruction can also be incorporated using an iterative approach, so that both the motion and a motion-free image can be estimated from the partial image data. The framework has been applied to real 4DCT, Cine CT, multi-slice CT, and multi-slice MR data, as well as simulated datasets from a computer phantom. This includes the use of a super-resolution reconstruction method for the multi-slice MR data. Good results were obtained for all datasets, including quantitative results for the 4DCT and phantom datasets where the ground truth motion was known or could be estimated. PMID:28195833

  5. A graph-based computational framework for simulation and optimisation of coupled infrastructure networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek

    Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, a tool that facilitates parallel execution of multiple simulation runs and management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.

  6. A graph-based computational framework for simulation and optimisation of coupled infrastructure networks

    DOE PAGES

    Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek; ...

    2017-04-24

    Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, a tool that facilitates parallel execution of multiple simulation runs and management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.

  7. CyberMedVPS: visual programming for development of simulators.

    PubMed

    Morais, Aline M; Machado, Liliane S

    2011-01-01

    Computer applications based on Virtual Reality (VR) have been outstanding in medical training and teaching due to their ability to simulate realistic situations in which users can practice skills and decision making. Frameworks to develop such simulators are available, but their use demands knowledge of programming, which makes them hard for non-programmers to work with. To address this problem we present CyberMedVPS, a graphical module for the CyberMed framework that implements Visual Programming concepts to allow the development of simulators by non-programmer professionals of the medical field.

  8. General cognitive principles for learning structure in time and space.

    PubMed

    Goldstein, Michael H; Waterfall, Heidi R; Lotem, Arnon; Halpern, Joseph Y; Schwade, Jennifer A; Onnis, Luca; Edelman, Shimon

    2010-06-01

    How are hierarchically structured sequences of objects, events or actions learned from experience and represented in the brain? When several streams of regularities present themselves, which will be learned and which ignored? Can statistical regularities take effect on their own, or are additional factors such as behavioral outcomes expected to influence statistical learning? Answers to these questions are starting to emerge through a convergence of findings from naturalistic observations, behavioral experiments, neurobiological studies, and computational analyses and simulations. We propose that a small set of principles are at work in every situation that involves learning of structure from patterns of experience and outline a general framework that accounts for such learning. (c) 2010 Elsevier Ltd. All rights reserved.

  9. Spectral analysis for nonstationary and nonlinear systems: a discrete-time-model-based approach.

    PubMed

    He, Fei; Billings, Stephen A; Wei, Hua-Liang; Sarrigiannis, Ptolemaios G; Zhao, Yifan

    2013-08-01

    A new frequency-domain analysis framework for nonlinear time-varying systems is introduced based on parametric time-varying nonlinear autoregressive with exogenous input models. It is shown how the time-varying effects can be mapped to the generalized frequency response functions (FRFs) to track nonlinear features in frequency, such as intermodulation and energy transfer effects. A new mapping to the nonlinear output FRF is also introduced. A simulated example and the application to intracranial electroencephalogram data are used to illustrate the theoretical results.

  10. Line transect estimation of population size: the exponential case with grouped data

    USGS Publications Warehouse

    Anderson, D.R.; Burnham, K.P.; Crain, B.R.

    1979-01-01

    Gates, Marshall, and Olson (1968) investigated the line transect method of estimating grouse population densities in the case where sighting probabilities are exponential. This work is followed by a simulation study in Gates (1969). A general overview of line transect analysis is presented by Burnham and Anderson (1976). These articles all deal with the ungrouped data case. In the present article, an analysis of line transect data is formulated under the Gates framework of exponential sighting probabilities and in the context of grouped data.
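
    For intuition, a grouped-data maximum-likelihood fit under an exponential sighting curve g(x) = exp(-x/λ) can be sketched as below; distance classes, counts, and transect length are invented, and the density estimate uses the standard D = n / (2Lλ) form for this detection model:

```python
import math

# Hedged sketch of grouped-data ML estimation under an exponential
# detection curve g(x) = exp(-x/lam); bins and counts are made up.
cuts = [0.0, 10.0, 20.0, 40.0, float("inf")]   # distance class boundaries (m)
counts = [42, 25, 21, 6]                       # sightings per class

def loglik(lam):
    ll = 0.0
    for (a, b), n in zip(zip(cuts, cuts[1:]), counts):
        # multinomial cell probability: integral of the exponential pdf
        p = math.exp(-a / lam) - (0.0 if b == float("inf") else math.exp(-b / lam))
        ll += n * math.log(p)
    return ll

# Simple grid search for the MLE of lam.
lam_hat = max((l / 10 for l in range(10, 1000)), key=loglik)
n, L = sum(counts), 5000.0                     # transect length (m)
D_hat = n / (2 * L * lam_hat)                  # animals per square metre
print(round(lam_hat, 2), D_hat)
```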

  11. The kinetic origin of delayed yielding in metallic glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Y. F.; Liu, X. D.; Wang, S.

    2016-06-20

    Recent experiments showed that irreversible structural change or plasticity could occur in metallic glasses (MGs) even within the apparent elastic limit after a sufficiently long waiting time. To explain this phenomenon, a stochastic shear transformation model is developed based on a unified rate theory to predict delayed yielding in MGs, which is validated afterwards through extensive atomistic simulations carried out on different MGs. On a fundamental level, an analytic framework is established in this work that links time, stress, and temperature altogether into a general yielding criterion for MGs.

  12. Optimal placement of excitations and sensors for verification of large dynamical systems

    NASA Technical Reports Server (NTRS)

    Salama, M.; Rose, T.; Garba, J.

    1987-01-01

    The computationally difficult problem of the optimal placement of excitations and sensors to maximize the observed measurements is studied within the framework of combinatorial optimization, and is solved numerically using a variation of the simulated annealing heuristic algorithm. Results of numerical experiments including a square plate and a 960 degrees-of-freedom Control of Flexible Structure (COFS) truss structure, are presented. Though the algorithm produces suboptimal solutions, its generality and simplicity allow the treatment of complex dynamical systems which would otherwise be difficult to handle.
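
    A compact sketch of the simulated-annealing heuristic for a placement problem of this flavor (the observability score and problem sizes are invented stand-ins for the structural model):

```python
import math, random

# Illustrative simulated-annealing search for a sensor placement
# (objective and sizes are invented for the sketch).
random.seed(1)
n_dof, n_sensors = 60, 6
gain = [random.random() for _ in range(n_dof)]   # stand-in observability score

def objective(placement):
    return sum(gain[i] for i in placement)       # maximize observed measure

current = random.sample(range(n_dof), n_sensors)
best, T = list(current), 1.0
while T > 1e-3:
    cand = list(current)
    cand[random.randrange(n_sensors)] = random.randrange(n_dof)  # move one sensor
    if len(set(cand)) == n_sensors:              # keep placements distinct
        d = objective(cand) - objective(current)
        if d > 0 or random.random() < math.exp(d / T):
            current = cand                       # accept uphill or lucky downhill
    if objective(current) > objective(best):
        best = list(current)
    T *= 0.995                                   # geometric cooling schedule
print(sorted(best), round(objective(best), 3))
```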

  13. An epidemiological modeling and data integration framework.

    PubMed

    Pfeifer, B; Wurz, M; Hanser, F; Seger, M; Netzer, M; Osl, M; Modre-Osprian, R; Schreier, G; Baumgartner, C

    2010-01-01

    In this work, a cellular automaton software package for simulating different infectious diseases, storing the simulation results in a data warehouse system and analyzing the obtained results to generate prediction models as well as contingency plans, is proposed. The Brisbane H3N2 flu virus, which spread during the 2009 winter season, was used for simulation in the federal state of Tyrol, Austria. The simulation-modeling framework consists of an underlying cellular automaton. The cellular automaton model is parameterized by known disease parameters, and geographical as well as demographical conditions are included for simulating the spreading. The data generated by simulation are stored in the back room of the data warehouse using the Talend Open Studio software package, and subsequent statistical and data mining tasks are performed using the tool termed Knowledge Discovery in Database Designer (KD3). The obtained simulation results were used for generating prediction models for all nine federal states of Austria. The proposed framework provides a powerful and easy-to-handle interface for parameterizing and simulating different infectious diseases in order to generate prediction models and improve contingency plans for future events.
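
    The underlying cellular-automaton mechanism can be sketched in a few lines; the SIR-style states, neighborhood, and rates below are invented for illustration and are not the paper's calibrated parameters:

```python
import random

# Minimal cellular-automaton SIR sketch (parameters invented); each cell is
# S(usceptible), I(nfected) or R(ecovered), and infection spreads to the
# four nearest neighbours.
random.seed(0)
N, beta, gamma = 40, 0.3, 0.1
grid = [["S"] * N for _ in range(N)]
grid[N // 2][N // 2] = "I"                      # seed one infected cell

for step in range(100):
    nxt = [row[:] for row in grid]
    for i in range(N):
        for j in range(N):
            if grid[i][j] == "I":
                if random.random() < gamma:     # recovery
                    nxt[i][j] = "R"
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    a, b = i + di, j + dj
                    if 0 <= a < N and 0 <= b < N and grid[a][b] == "S":
                        if random.random() < beta:
                            nxt[a][b] = "I"     # transmission to neighbour
    grid = nxt
print(sum(row.count("I") for row in grid), "infected after 100 steps")
```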

  14. Verification and validation of a patient simulator for test and evaluation of a laser doppler vibrometer

    NASA Astrophysics Data System (ADS)

    Byrd, Kenneth A.; Yauger, Sunny

    2012-06-01

    In the medical community, patient simulators are used to educate and train nurses, medics and doctors in rendering different levels of treatment and care to various patient populations. Students have the opportunity to perform real-world medical procedures without putting any patients at risk. A new thrust for the U.S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) is the use of remote sensing technologies to detect human vital signs at standoff distances. This capability will provide medics with the ability to diagnose while under fire in addition to helping them to prioritize the care and evacuation of battlefield casualties. A potential alternative (or precursor) to human subject testing is the use of patient simulators. This substitution (or augmenting) provides a safe and cost-effective means to develop, test, and evaluate sensors without putting any human subjects at risk. In this paper, we present a generalized framework that can be used to accredit patient simulator technologies as human simulants for remote physiological monitoring (RPM). Results indicate that we were successful in using a commercial Laser Doppler Vibrometer (LDV) to exploit pulse and respiration signals from a SimMan 3G patient simulator at standoff (8 meters).

  15. The Virtual Quake earthquake simulator: a simulation-based forecast of the El Mayor-Cucapah region and evidence of predictability in simulated earthquake sequences

    NASA Astrophysics Data System (ADS)

    Yoder, Mark R.; Schultz, Kasey W.; Heien, Eric M.; Rundle, John B.; Turcotte, Donald L.; Parker, Jay W.; Donnellan, Andrea

    2015-12-01

    In this manuscript, we introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert-based forecasting metric, and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation versus quiescent type earthquake triggering. We show that VQ exhibits both behaviours separately for independent fault sections; some fault sections exhibit activation type triggering, while others are better characterized by quiescent type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA, and northern Baja California, Mexico.

  16. Formalization, implementation, and modeling of institutional controllers for distributed robotic systems.

    PubMed

    Pereira, José N; Silva, Porfírio; Lima, Pedro U; Martinoli, Alcherio

    2014-01-01

    The work described is part of a long term program of introducing institutional robotics, a novel framework for the coordination of robot teams that stems from institutional economics concepts. Under the framework, institutions are cumulative sets of persistent artificial modifications made to the environment or to the internal mechanisms of a subset of agents, thought to be functional for the collective order. In this article we introduce a formal model of institutional controllers based on Petri nets. We define executable Petri nets-an extension of Petri nets that takes into account robot actions and sensing-to design, program, and execute institutional controllers. We use a generalized stochastic Petri net view of the robot team controlled by the institutional controllers to model and analyze the stochastic performance of the resulting distributed robotic system. The ability of our formalism to replicate results obtained using other approaches is assessed through realistic simulations of up to 40 e-puck robots. In particular, we model a robot swarm and its institutional controller with the goal of maintaining wireless connectivity, and successfully compare our model predictions and simulation results with previously reported results, obtained by using finite state automaton models and controllers.
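
    The executable-Petri-net idea can be sketched with a token-marking dictionary and transitions that fire when their input places are marked (place and transition names invented; the paper's formalism additionally binds robot actions and sensing to transitions):

```python
import random

# Minimal executable-Petri-net sketch (robot actions abstracted to labels;
# all names invented). A transition is enabled when every input place
# holds enough tokens, and firing moves tokens from inputs to outputs.
places = {"idle": 1, "target_seen": 1, "moving": 0, "at_target": 0}
transitions = {
    "start_move": ({"idle": 1, "target_seen": 1}, {"moving": 1}),
    "arrive":     ({"moving": 1},                 {"at_target": 1}),
}

def enabled(t):
    pre, _ = transitions[t]
    return all(places[p] >= n for p, n in pre.items())

def fire(t):
    pre, post = transitions[t]
    for p, n in pre.items():
        places[p] -= n
    for p, n in post.items():
        places[p] += n

while any(enabled(t) for t in transitions):
    t = random.choice([t for t in transitions if enabled(t)])
    fire(t)
    print("fired", t, places)
```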

  17. An Information Theoretic Framework and Self-organizing Agent-based Sensor Network Architecture for Power Plant Condition Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loparo, Kenneth; Kolacinski, Richard; Threeanaew, Wanchat

    A central goal of the work was to enable both the extraction of all relevant information from sensor data, and the application of information gained from appropriate processing and fusion at the system level to operational control and decision-making at various levels of the control hierarchy through: 1. Exploiting the deep connection between information theory and the thermodynamic formalism, 2. Deployment using distributed intelligent agents with testing and validation in a hardware-in-the-loop simulation environment. Enterprise architectures are the organizing logic for key business processes and IT infrastructure and, while the generality of current definitions provides sufficient flexibility, the current architecture frameworks do not inherently provide the appropriate structure. Of particular concern is that existing architecture frameworks often do not make a distinction between "data" and "information." This work defines an enterprise architecture for health and condition monitoring of power plant equipment and further provides the appropriate foundation for addressing shortcomings in current architecture definition frameworks through the discovery of the information connectivity between the elements of a power generation plant. That is, to identify the correlative structure between available observation streams using informational measures. The principal focus here is on the implementation and testing of an emergent, agent-based algorithm, based on the foraging behavior of ants, for eliciting this structure, and on measures for characterizing differences between communication topologies. The elicitation algorithms are applied to data streams produced by a detailed numerical simulation of Alstom’s 1000 MW ultra-super-critical boiler and steam plant. The elicitation algorithm and topology characterization can be based on different informational metrics for detecting connectivity, e.g. mutual information and linear correlation.
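
    Eliciting correlative structure from observation streams via mutual information can be sketched as below; the streams are synthetic and the histogram estimator is a generic choice, not necessarily the project's:

```python
import numpy as np

# Hedged sketch: detect connectivity between two sensor streams by
# histogram-based mutual information (data and bin count invented).
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.7 * x + 0.3 * rng.normal(size=5000)   # correlated stream
z = rng.normal(size=5000)                   # independent stream

def mutual_info(a, b, bins=16):
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy /= pxy.sum()                        # joint distribution estimate
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0                            # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

print(mutual_info(x, y), mutual_info(x, z))  # first should be clearly larger
```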

  18. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David; Agarwal, Deborah A.; Sun, Xin

    2011-09-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  20. Semiclassical dynamics of spin density waves

    NASA Astrophysics Data System (ADS)

    Chern, Gia-Wei; Barros, Kipton; Wang, Zhentao; Suwa, Hidemaro; Batista, Cristian D.

    2018-01-01

    We present a theoretical framework for equilibrium and nonequilibrium dynamical simulation of quantum states with spin-density-wave (SDW) order. Within a semiclassical adiabatic approximation that retains electron degrees of freedom, we demonstrate that the SDW order parameter obeys a generalized Landau-Lifshitz equation. With the aid of an enhanced kernel polynomial method, our linear-scaling quantum Landau-Lifshitz dynamics (QLLD) method enables dynamical SDW simulations with N ≃ 10^5 lattice sites. Our real-space formulation can be used to compute dynamical responses, such as the dynamical structure factor, of complex and even inhomogeneous SDW configurations at zero or finite temperatures. Applying the QLLD to study the relaxation of a noncoplanar topological SDW under the excitation of a short pulse, we further demonstrate the crucial role of spatial correlations and fluctuations in the SDW dynamics.
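
    For intuition, a classical Landau-Lifshitz integration for a single spin in a fixed effective field is sketched below; in the paper's QLLD the effective field comes from the coupled electron problem, whereas here it is just a constant vector:

```python
import numpy as np

# Classical Landau-Lifshitz sketch for one spin in a fixed effective field
# (purely illustrative; not the paper's quantum LLD, where the field is
# computed self-consistently from the electrons).
alpha, dt = 0.1, 0.01
H = np.array([0.0, 0.0, 1.0])                    # effective field
S = np.array([1.0, 0.0, 0.0])                    # initial spin direction

for _ in range(5000):
    prec = -np.cross(S, H)                       # precession term
    damp = -alpha * np.cross(S, np.cross(S, H))  # damping term
    S = S + dt * (prec + damp)
    S /= np.linalg.norm(S)                       # renormalize |S| = 1
print(S)   # relaxes toward the field direction (0, 0, 1)
```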

  1. On simulating flow with multiple time scales using a method of averages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margolin, L.G.

    1997-12-31

    The author presents a new computational method based on averaging to efficiently simulate certain systems with multiple time scales. He first develops the method in a simple one-dimensional setting and employs linear stability analysis to demonstrate numerical stability. He then extends the method to multidimensional fluid flow. His method of averages does not depend on explicit splitting of the equations nor on modal decomposition. Rather he combines low order and high order algorithms in a generalized predictor-corrector framework. He illustrates the methodology in the context of a shallow fluid approximation to an ocean basin circulation. He finds that his new method reproduces the accuracy of a fully explicit second-order accurate scheme, while costing less than a first-order accurate scheme.

  2. Gradient calculations for dynamic recurrent neural networks: a survey.

    PubMed

    Pearlmutter, B A

    1995-01-01

    Surveys learning algorithms for recurrent neural networks with hidden units and puts the various techniques into a common framework. The author discusses fixed point learning algorithms, namely recurrent backpropagation and deterministic Boltzmann machines, and nonfixed point algorithms, namely backpropagation through time, Elman's history cutoff, and Jordan's output feedback architecture. Forward propagation, an on-line technique that uses adjoint equations, and variations thereof, are also discussed. In many cases, the unified presentation leads to generalizations of various sorts. The author discusses advantages and disadvantages of temporally continuous neural networks in contrast to clocked ones, and continues with some "tricks of the trade" for training, using, and simulating continuous time and recurrent neural networks. The author presents some simulations, and at the end, addresses issues of computational complexity and learning speed.
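
    Backpropagation through time, one of the surveyed algorithms, can be sketched for a tiny vanilla RNN with a loss at the final step (sizes and data invented):

```python
import numpy as np

# Minimal backpropagation-through-time (BPTT) sketch for a vanilla RNN with
# scalar inputs and a squared loss at the final step only.
rng = np.random.default_rng(0)
n, T = 4, 6
Wxh, Whh = rng.normal(size=n) * 0.1, rng.normal(size=(n, n)) * 0.1
Why, b = rng.normal(size=n) * 0.1, np.zeros(n)
x, target = rng.normal(size=T), 0.5

# Forward pass, storing hidden states for the backward sweep.
hs = [np.zeros(n)]
for t in range(T):
    hs.append(np.tanh(Wxh * x[t] + Whh @ hs[-1] + b))
y = Why @ hs[-1]

# Backward pass: unroll in time, accumulating parameter gradients.
dy = y - target                       # d(0.5*(y-target)^2)/dy
dWhy, dh = dy * hs[-1], dy * Why
dWxh, dWhh, db = np.zeros(n), np.zeros((n, n)), np.zeros(n)
for t in reversed(range(T)):
    da = dh * (1.0 - hs[t + 1] ** 2)  # backprop through tanh
    dWxh += da * x[t]
    dWhh += np.outer(da, hs[t])
    db += da
    dh = Whh.T @ da                   # pass gradient to earlier time step
print(dWxh, db)
```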

  3. Vapor-liquid equilibrium and equation of state of two-dimensional fluids from a discrete perturbation theory

    NASA Astrophysics Data System (ADS)

    Trejos, Víctor M.; Santos, Andrés; Gámez, Francisco

    2018-05-01

    The interest in the description of the properties of fluids of restricted dimensionality is growing for theoretical and practical reasons. In this work, we have firstly developed an analytical expression for the Helmholtz free energy of the two-dimensional square-well fluid in the Barker-Henderson framework. This equation of state is based on an approximate analytical radial distribution function for d-dimensional hard-sphere fluids (1 ≤ d ≤ 3) and is validated against existing and new simulation results. The so-obtained equation of state is implemented in a discrete perturbation theory able to account for general potential shapes. The prototypical Lennard-Jones and Yukawa fluids are tested in its two-dimensional version against available and new simulation data with semiquantitative agreement.

  4. A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care

    PubMed Central

    Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis

    2017-01-01

    This article describes a framework that has been developed to monetize the real value of simulation-based training in health care. A significant consideration has been given to the incorporation of the intangible and qualitative benefits, not only the tangible and quantitative benefits of simulation-based training in health care. The framework builds from three works: the value measurement methodology (VMM) used by several departments of the US Government, a methodology documented in several books by Dr Jack Phillips to monetize various training approaches, and a traditional return on investment methodology put forth by Frost and Sullivan, and Immersion Medical. All 3 source materials were adapted to create an integrated methodology that can be readily implemented. This article presents details on each of these methods and how they can be integrated and presents a framework that integrates the previous methods. In addition to that, it describes the concept and the application of the developed framework. As a test of the applicability of the framework, a real case study has been used to demonstrate the application of the framework. This case study provides real data related to the correlation between pediatric patient cardiopulmonary arrest (CPA) survival rates and simulation-based mock codes at the University of Michigan tertiary care academic medical center. It is important to point out that the proposed framework offers the capability to consider a wide range of benefits and values, but on the other hand, there are several limitations that have been discussed and need to be taken into consideration. PMID:28133988
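
    The core ROI arithmetic is simple once intangible benefits have been monetized; the figures below are invented placeholders, not the case-study numbers:

```python
# Toy return-on-investment calculation in the spirit of the framework
# (all monetary figures invented). Monetized intangible benefits enter
# alongside tangible ones.
tangible_benefits = 250_000      # e.g. avoided adverse-event costs per year
intangible_benefits = 80_000     # e.g. monetized staff-confidence gains
program_costs = 120_000          # simulator, staff time, facilities

net_benefits = tangible_benefits + intangible_benefits - program_costs
roi_percent = 100.0 * net_benefits / program_costs
print(f"ROI = {roi_percent:.0f}%")   # 100 * (330000 - 120000)/120000 = 175%
```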

  6. A general theoretical framework for interpreting patient-reported outcomes estimated from ordinally scaled item responses.

    PubMed

    Massof, Robert W

    2014-10-01

    A simple theoretical framework explains patient responses to items in rating scale questionnaires. Fixed latent variables position each patient and each item on the same linear scale. Item responses are governed by a set of fixed category thresholds, one for each ordinal response category. A patient's item responses are magnitude estimates of the difference between the patient variable and the patient's estimate of the item variable, relative to his/her personally defined response category thresholds. Differences between patients in their personal estimates of the item variable and in their personal choices of category thresholds are represented by random variables added to the corresponding fixed variables. Effects of intervention correspond to changes in the patient variable, the patient's response bias, and/or latent item variables for a subset of items. Intervention effects on patients' item responses were simulated by assuming the random variables are normally distributed with a constant scalar covariance matrix. Rasch analysis was used to estimate latent variables from the simulated responses. The simulations demonstrate that changes in the patient variable and changes in response bias produce indistinguishable effects on item responses and manifest as changes only in the estimated patient variable. Changes in a subset of item variables manifest as intervention-specific differential item functioning and as changes in the estimated person variable that equals the average of changes in the item variables. Simulations demonstrate that intervention-specific differential item functioning produces inefficiencies and inaccuracies in computer adaptive testing. © The Author(s) 2013 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  7. Development and implementation of an empirical frequency map for use in MD simulations of isotope-edited proteins, and, Development, implementation, and evaluation of an online student portal as a textbook replacement in an advanced general chemistry course

    NASA Astrophysics Data System (ADS)

    Shorb, Justin Matthew

    The first portion of this thesis describes an extension of work done in the Skinner group to develop an empirical frequency map for N-methylacetamide (NMA) in water. NMA is a peptide bond capped on either side by a methyl group and is therefore a common prototypical molecule used when studying complicated polypeptides and proteins. This amide bond is present along the backbone of every protein as it connects individual component amino acids. This amide bond also has a strong observable frequency in the IR due to the Amide-I mode (predominantly carbon-oxygen stretching motion). This project describes the simplification of the prior model for mapping the frequency of the Amide-I mode from the electric field due to the environment and develops a parallel implementation of this algorithm for use in larger biological systems, such as the trans-membrane portion of the tetrameric polypeptide bundle protein CD3zeta. The second portion of this thesis describes the development, implementation and evaluation of an online textbook within the context of a cohesive theoretical framework. The project begins by describing what is meant when discussing a digital textbook, including a survey of various types of digital media being used to deliver textbook-like content. This leads into the development of a theoretical framework based on constructivist pedagogical theory, hypertext learning theory, and chemistry visualization and representation frameworks. The implementation and design of ChemPaths, the general chemistry online text developed within the Chemistry Education Digital Library (ChemEd DL) is then described. The effectiveness of ChemPaths being used as a textbook replacement in an advanced general chemistry course is evaluated within the developed theoretical framework both qualitatively and quantitatively.

  8. In Vivo Investigation of the Effectiveness of a Hyper-viscoelastic Model in Simulating Brain Retraction

    NASA Astrophysics Data System (ADS)

    Li, Ping; Wang, Weiwei; Zhang, Chenxi; An, Yong; Song, Zhijian

    2016-07-01

    Intraoperative brain retraction leads to a misalignment between the intraoperative positions of the brain structures and their previous positions, as determined from preoperative images. In vitro swine brain sample uniaxial tests showed that the mechanical response of brain tissue to compression and extension could be described by the hyper-viscoelasticity theory. The brain retraction caused by the mechanical process is a combination of brain tissue compression and extension. In this paper, we first constructed a hyper-viscoelastic framework based on the extended finite element method (XFEM) to simulate intraoperative brain retraction. To explore its effectiveness, we then applied this framework to an in vivo brain retraction simulation. The simulation strictly followed the clinical scenario, in which seven swine were subjected to brain retraction. Our experimental results showed that the hyper-viscoelastic XFEM framework is capable of simulating intraoperative brain retraction and improving the navigation accuracy of an image-guided neurosurgery system (IGNS).

  9. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of the osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  10. A generic framework for individual-based modelling and physical-biological interaction

    PubMed Central

    2018-01-01

    The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences to the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian simulations in the marine environment. The purpose of the framework is to handle complex individual-level biological models of organisms, combined with realistic 3D oceanographic model of physics and biogeochemistry describing the environment of the organisms without assumptions about spatial or temporal scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessing ocean conditions, comparison of physical circulation models, model ensemble runs and, recently, posterior Eulerian simulations using the IBMlib framework. We present the code design ideas behind the longevity of the code, our implementation experiences, as well as code performance benchmarking. The framework may contribute substantially to progress in representing, understanding, predicting and eventually managing marine ecosystems. PMID:29351280
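
    The Lagrangian kernel of such a framework reduces to advecting particles through a velocity field; the sketch below uses an analytic solid-body rotation in place of a real circulation model:

```python
import numpy as np

# Bare-bones Lagrangian advection sketch (an analytic vortex stands in for
# the oceanographic model; IBMlib couples to real 3D circulation fields).
def velocity(p):                      # p = (x, y)
    return np.array([-p[1], p[0]])    # solid-body rotation

dt, steps = 0.01, 628                 # ~one full revolution
p = np.array([1.0, 0.0])
for _ in range(steps):
    k1 = velocity(p)                  # midpoint (RK2) integration
    k2 = velocity(p + 0.5 * dt * k1)
    p = p + dt * k2
print(p)                              # should return near (1, 0)
```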

  11. Building occupancy simulation and data assimilation using a graph-based agent-oriented model

    NASA Astrophysics Data System (ADS)

    Rai, Sanish; Hu, Xiaolin

    2018-07-01

    Building occupancy simulation and estimation simulates the dynamics of occupants and estimates their real-time spatial distribution in a building. It requires a simulation model and an algorithm for data assimilation that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. The agent-based models suffer high computation cost for simulating large numbers of occupants, and graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model which can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on Sequential Monte Carlo Methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are to provide an efficient model for building occupancy simulation that can accommodate large numbers of occupants and an effective data assimilation framework that can provide real-time estimations of building occupancy from sensor data.
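
    A bootstrap particle filter, the simplest Sequential Monte Carlo scheme, can be sketched for a scalar occupancy count (dynamics, sensor noise, and sizes invented; the paper assimilates real sensor data into the graph-based model):

```python
import numpy as np

# Sketch of sequential-Monte-Carlo data assimilation for a scalar occupancy
# count (dynamics, sensor model, and numbers all invented).
rng = np.random.default_rng(0)
n_particles, steps = 500, 20
true_occ, particles = 50.0, rng.uniform(0, 100, n_particles)

for t in range(steps):
    true_occ += rng.normal(0, 2)                    # occupants drift in/out
    obs = true_occ + rng.normal(0, 5)               # noisy sensor reading
    particles += rng.normal(0, 2, n_particles)      # propagate each particle
    w = np.exp(-0.5 * ((obs - particles) / 5) ** 2) # Gaussian likelihood
    w /= w.sum()
    idx = rng.choice(n_particles, n_particles, p=w) # resample by weight
    particles = particles[idx]
print(true_occ, particles.mean())                   # estimate tracks truth
```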

  12. Mirrored continuum and molecular scale simulations of the ignition of high-pressure phases of RDX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Kibaek; Stewart, D. Scott, E-mail: santc@illinois.edu, E-mail: dss@illinois.edu; Joshi, Kaushik

    2016-05-14

    We present a mirrored atomistic and continuum framework that is used to describe the ignition of energetic materials, and a high-pressure phase of RDX in particular. The continuum formulation uses meaningful averages of thermodynamic properties obtained from the atomistic simulation and a simplification of enormously complex reaction kinetics. In particular, components are identified based on molecular weight bin averages and our methodology assumes that both the averaged atomistic and continuum simulations are represented on the same time and length scales. The atomistic simulations of thermally initiated ignition of RDX are performed using reactive molecular dynamics (RMD). The continuum model is based on multi-component thermodynamics and uses a kinetics scheme that describes observed chemical changes of the averaged atomistic simulations. Thus the mirrored continuum simulations mimic the rapid change in pressure, temperature, and average molecular weight of species in the reactive mixture. This mirroring enables a new technique to simplify the chemistry obtained from reactive MD simulations while retaining the observed features and spatial and temporal scales from both the RMD and continuum model. The primary benefit of this approach is a potentially powerful, but familiar way to interpret the atomistic simulations and understand the chemical events and reaction rates. The approach is quite general and thus can provide a way to model chemistry based on atomistic simulations and extend the reach of those simulations.

  13. A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON

    PubMed Central

    King, James G.; Hines, Michael; Hill, Sean; Goodman, Philip H.; Markram, Henry; Schürmann, Felix

    2008-01-01

    As neuronal simulations approach larger scales with increasing levels of detail, the neurosimulator software represents only a part of a chain of tools ranging from setup, simulation, interaction with virtual environments to analysis and visualizations. Previously published approaches to abstracting simulator engines have not received widespread acceptance, which in part may be due to the fact that they tried to address the challenge of solving the model specification problem. Here, we present an approach that uses a neurosimulator, in this case NEURON, to describe and instantiate the network model in the simulator's native model language but then replaces the main integration loop with its own. Existing parallel network models are easily adapted to run in the presented framework. The presented approach is thus an extension to NEURON but uses a component-based architecture to allow for replaceable spike exchange components and pluggable components for monitoring, analysis, or control that can run in this framework alongside the simulation. PMID:19430597

  14. Toward a first-principles integrated simulation of tokamak edge plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, C S; Klasky, Scott A; Cummings, Julian

    2008-01-01

    Performance of ITER is anticipated to be highly sensitive to the edge plasma condition. The edge pedestal in ITER needs to be predicted from an integrated simulation of the necessary first-principles, multi-scale physics codes. The mission of the SciDAC Fusion Simulation Project (FSP) Prototype Center for Plasma Edge Simulation (CPES) is to deliver such a code integration framework by (1) building new kinetic codes XGC0 and XGC1, which can simulate the edge pedestal buildup; (2) using and improving the existing MHD codes ELITE, M3D-OMP, M3D-MPP and NIMROD, for study of large-scale edge instabilities called Edge Localized Modes (ELMs); and (3) integrating the codes into a framework using cutting-edge computer science technology. Collaborative effort among physics, computer science, and applied mathematics within CPES has created the first working version of the End-to-end Framework for Fusion Integrated Simulation (EFFIS), which can be used to study the pedestal-ELM cycles.

  15. Hierarchical Coarse-Graining Via a Generalized Yvon-Born Green Framework: Many-Body Correlations, Mappings, and Structural Accuracy

    NASA Astrophysics Data System (ADS)

    Rudzinski, Joseph F.

    Atomically-detailed molecular dynamics simulations have emerged as one of the most powerful theoretic tools for studying complex, condensed-phase systems. Despite their ability to provide incredible molecular insight, these simulations are insufficient for investigating complex biological processes, e.g., protein folding or molecular aggregation, on relevant length and time scales. The increasing scope and sophistication of atomically-detailed models has motivated the development of "hierarchical" approaches, which parameterize a low resolution, coarse-grained (CG) model based on simulations of an atomically-detailed model. The utility of hierarchical CG models depends on their ability to accurately incorporate the correct physics of the underlying model. One approach for ensuring this "consistency" between the models is to parameterize the CG model to reproduce the structural ensemble generated by the high resolution model. The many-body potential of mean force is the proper CG energy function for reproducing all structural distributions of the atomically-detailed model, at the CG level of resolution. However, this CG potential is a configuration-dependent free energy function that is generally too complicated to represent or simulate. The multiscale coarse-graining (MS-CG) method employs a generalized Yvon-Born-Green (g-YBG) relation to directly determine a variationally optimal approximation to the many-body potential of mean force. The MS-CG/g-YBG method provides a convenient and transparent framework for investigating the equilibrium structure of the system, at the CG level of resolution. In this work, we investigate the fundamental limitations and approximations of the MS-CG/g-YBG method. Throughout the work, we propose several theoretic constructs to directly relate the MS-CG/g-YBG method to other popular structure-based CG approaches. We investigate the physical interpretation of the MS-CG/g-YBG correlation matrix, the quantity responsible for disentangling the various contributions to the average force on a CG site. We then employ an iterative extension of the MS-CG/g-YBG method that improves the accuracy of a particular set of low order correlation functions relative to the original MS-CG/g-YBG model. We demonstrate that this method provides a powerful framework for identifying the precise source of error in an MS-CG/g-YBG model. We then propose a method for identifying an optimal CG representation, prior to the development of the CG model. We employ these techniques together to demonstrate that in the cases where the MS-CG/g-YBG method fails to determine an accurate model, a fundamental problem likely exists with the chosen CG representation or interaction set. Additionally, we explicitly demonstrate that while the iterative model successfully improves the accuracy of the low order structure, it does so by distorting the higher order structural correlations relative to the underlying model. Finally, we apply these methods to investigate the utility of the MS-CG/g-YBG method for developing models for systems with complex intramolecular structure. Overall, our results demonstrate the power of the g-YBG framework for developing accurate CG models and for investigating the driving forces of equilibrium structures for complex condensed-phase systems. This work also explicitly motivates future development of bottom-up CG methods and highlights some outstanding problems in the field.
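
    The variational force-matching step at the heart of MS-CG-style methods can be sketched as a linear least-squares fit of a basis-expanded pair force to reference forces; the reference force and basis below are invented for illustration:

```python
import numpy as np

# Toy force-matching (MS-CG-style) sketch: fit a CG pair force f(r),
# expanded in a small polynomial basis, to reference forces sampled from a
# fine-grained model (here a made-up analytic force plus noise).
rng = np.random.default_rng(0)
r = rng.uniform(0.8, 2.5, 400)                  # sampled pair distances
f_ref = 24 * (2 / r**13 - 1 / r**7) + 0.05 * rng.normal(size=r.size)

basis = np.column_stack([r**-13, r**-7, r**-1, np.ones_like(r)])
coef, *_ = np.linalg.lstsq(basis, f_ref, rcond=None)  # variational fit
print(coef)   # leading coefficients should be near 48 and -24
```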

  16. High-Performance Computer Modeling of the Cosmos-Iridium Collision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S; Cook, K; Fasenfest, B

    2009-08-28

    This paper describes the application of a new, integrated modeling and simulation framework, encompassing the space situational awareness (SSA) enterprise, to the recent Cosmos-Iridium collision. This framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel, high-performance computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the application of this framework to the recent collision of the Cosmos and Iridium satellites, including (1) detailed hydrodynamic modeling of the satellite collision and resulting debris generation, (2) orbital propagation of the simulated debris and analysis of the increased risk to other satellites, (3) calculation of the radar and optical signatures of the simulated debris and modeling of debris detection with space surveillance radar and optical systems, (4) determination of simulated debris orbits from modeled space surveillance observations and analysis of the resulting orbital accuracy, (5) comparison of these modeling and simulation results with Space Surveillance Network observations. We will also discuss the use of this integrated modeling and simulation framework to analyze the risks and consequences of future satellite collisions and to assess strategies for mitigating or avoiding future incidents, including the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.

  17. The Unified Behavior Framework for the Simulation of Autonomous Agents

    DTIC Science & Technology

    2015-03-01

    ... 1980s, researchers have designed a variety of robot control architectures intending to imbue robots with some degree of autonomy. ... The development of autonomy has ... room for research by utilizing methods like simulation and modeling that consume less time and fewer monetary resources. A recently developed reactive ...

  18. Introducing FNCS: Framework for Network Co-Simulation

    ScienceCinema

    None

    2018-06-07

    This video provides a basic overview of the PNNL Future Power Grid Initiative-developed Framework for Network Co-Simulation (FNCS). It discusses the increasing amounts of data coming from the power grid, and the need for a tool like FNCS that brings together data, transmission and distribution simulators. Included is a description of the FNCS architecture, and the advantages this new open source tool can bring to grid research and development efforts.

  20. Retrofitting Non-Cognitive-Diagnostic Reading Assessment under the Generalized DINA Model Framework

    ERIC Educational Resources Information Center

    Chen, Huilin; Chen, Jinsong

    2016-01-01

    Cognitive diagnosis models (CDMs) are psychometric models developed mainly to assess examinees' specific strengths and weaknesses in a set of skills or attributes within a domain. By adopting the Generalized-DINA model framework, the recently developed general modeling framework, we attempted to retrofit the PISA reading assessments, a…

  1. Time-reversal MUSIC imaging of extended targets.

    PubMed

    Marengo, Edwin A; Gruber, Fred K; Simonetti, Francesco

    2007-08-01

    This paper develops, within a general framework that is applicable to rather arbitrary electromagnetic and acoustic remote sensing systems, a theory of time-reversal "MUltiple Signal Classification" (MUSIC)-based imaging of extended (nonpoint-like) scatterers (targets). The general analysis applies to arbitrary remote sensing geometry and sheds light onto how the singular system of the scattering matrix relates to the geometrical and propagation characteristics of the entire transmitter-target-receiver system and how to use this effect for imaging. All the developments are derived within exact scattering theory which includes multiple scattering effects. The derived time-reversal MUSIC methods include both interior sampling, as well as exterior sampling (or enclosure) approaches. For presentation simplicity, particular attention is given to the time-harmonic case where the informational wave modes employed for target interrogation are purely spatial, but the corresponding generalization to broadband fields is also given. This paper includes computer simulations illustrating the derived theory and algorithms.
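
    For orientation, the classical narrowband MUSIC pseudospectrum for point sources on a uniform line array is sketched below (the paper generalizes this to extended scatterers and arbitrary sensing geometries; array size, angles, and noise level are invented):

```python
import numpy as np

# Textbook narrowband MUSIC sketch (uniform line array, two point sources)
# illustrating the noise-subspace idea behind the method.
rng = np.random.default_rng(0)
M, snapshots, d = 8, 200, 0.5              # sensors, samples, spacing (wavelengths)

def steer(theta):
    return np.exp(2j * np.pi * d * np.arange(M) * np.sin(theta))

angles = np.radians([-20.0, 25.0])
A = np.column_stack([steer(t) for t in angles])
S = rng.normal(size=(2, snapshots)) + 1j * rng.normal(size=(2, snapshots))
X = A @ S + 0.1 * (rng.normal(size=(M, snapshots)) + 1j * rng.normal(size=(M, snapshots)))

R = X @ X.conj().T / snapshots             # sample covariance
_, vecs = np.linalg.eigh(R)                # eigenvalues in ascending order
En = vecs[:, : M - 2]                      # noise subspace (2 sources assumed)
grid = np.radians(np.linspace(-90, 90, 721))
p = [1.0 / np.linalg.norm(En.conj().T @ steer(t)) ** 2 for t in grid]
print(np.degrees(grid[np.argmax(p)]))      # strongest peak, near -20 or 25 deg
```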

  2. Simulation training: a systematic review of simulation in arthroscopy and proposal of a new competency-based training framework.

    PubMed

    Tay, Charison; Khajuria, Ankur; Gupte, Chinmay

    2014-01-01

    Traditional orthopaedic training has followed an apprenticeship model whereby trainees enhance their skills by operating under guidance. However the introduction of limitations on training hours and shorter training programmes mean that alternative training strategies are required. To perform a literature review on simulation training in arthroscopy and devise a framework that structures different simulation techniques that could be used in arthroscopic training. A systematic search of Medline, Embase, Google Scholar and the Cochrane Databases were performed. Search terms included "virtual reality OR simulator OR simulation" and "arthroscopy OR arthroscopic". 14 studies evaluating simulators in knee, shoulder and hip arthroplasty were included. The majority of the studies demonstrated construct and transference validity but only one showed concurrent validity. More studies are required to assess its potential as a training and assessment tool, skills transference between simulators and to determine the extent of skills decay from prolonged delays in training. We also devised a "ladder of arthroscopic simulation" that provides a competency-based framework to implement different simulation strategies. The incorporation of simulation into an orthopaedic curriculum will depend on a coordinated approach between many bodies. But the successful integration of simulators in other areas of surgery supports a possible role for simulation in advancing orthopaedic education. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  3. A comprehensive simulation framework for imaging single particles and biomolecules at the European X-ray Free-Electron Laser

    NASA Astrophysics Data System (ADS)

    Yoon, Chun Hong; Yurkov, Mikhail V.; Schneidmiller, Evgeny A.; Samoylova, Liubov; Buzmakov, Alexey; Jurek, Zoltan; Ziaja, Beata; Santra, Robin; Loh, N. Duane; Tschentscher, Thomas; Mancuso, Adrian P.

    2016-04-01

    The advent of newer, brighter, and more coherent X-ray sources, such as X-ray Free-Electron Lasers (XFELs), represents a tremendous growth in the potential to apply coherent X-rays to determine the structure of materials from the micron-scale down to the Angstrom-scale. There is a significant need for a multi-physics simulation framework to perform source-to-detector simulations for a single particle imaging experiment, including (i) the multidimensional simulation of the X-ray source; (ii) simulation of the wave-optics propagation of the coherent XFEL beams; (iii) atomistic modelling of photon-material interactions; (iv) simulation of the time-dependent diffraction process, including incoherent scattering; (v) assembling noisy and incomplete diffraction intensities into a three-dimensional data set using the Expansion-Maximisation-Compression (EMC) algorithm and (vi) phase retrieval to obtain structural information. We demonstrate the framework by simulating a single-particle experiment for a nitrogenase iron protein using parameters of the SPB/SFX instrument of the European XFEL. This exercise demonstrably yields interpretable consequences for structure determination that are crucial yet currently unavailable for experiment design.

  4. GiPSi: a framework for open source/open architecture software development for organ-level surgical simulation.

    PubMed

    Cavuşoğlu, M Cenk; Göktekin, Tolga G; Tendick, Frank

    2006-04-01

    This paper presents the architectural details of an evolving open source/open architecture software framework for developing organ-level surgical simulations. Our goal is to facilitate shared development of reusable models, to accommodate heterogeneous models of computation, and to provide a framework for interfacing multiple heterogeneous models. The framework provides an application programming interface for interfacing dynamic models defined over spatial domains. It is specifically designed to be independent of the specifics of the modeling methods used, and therefore facilitates seamless integration of heterogeneous models and processes. Furthermore, each model has separate geometries for visualization, simulation, and interfacing, allowing the model developer to choose the most natural geometric representation for each case. Input/output interfaces for visualization and haptics for real-time interactive applications have also been provided.

  5. MOOSE: A parallel computational framework for coupled systems of nonlinear equations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Derek Gaston; Chris Newman; Glen Hansen

    Systems of coupled, nonlinear partial differential equations (PDEs) often arise in simulation of nuclear processes. MOOSE: Multiphysics Object Oriented Simulation Environment, a parallel computational framework targeted at the solution of such systems, is presented. As opposed to traditional data-flow oriented computational frameworks, MOOSE is instead founded on the mathematical principle of Jacobian-free Newton-Krylov (JFNK) solution methods. Utilizing the mathematical structure present in JFNK, physics expressions are modularized into "Kernels," allowing for rapid production of new simulation tools. In addition, systems are solved implicitly and fully coupled, employing physics-based preconditioning, which provides great flexibility even with large variance in time scales. A summary of the mathematics, an overview of the structure of MOOSE, and several representative solutions from applications built on the framework are presented.
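
    The JFNK idea is that the Krylov solver only ever needs Jacobian-vector products, which can be approximated by differencing the residual; a toy sketch with an invented two-equation residual:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

# Jacobian-free Newton-Krylov sketch: the Jacobian-vector product is
# approximated by a finite difference of the residual, so no Jacobian
# matrix is ever formed (toy 2-equation system, invented for illustration).
def F(u):
    return np.array([u[0]**2 + u[1] - 3.0, u[0] + u[1]**2 - 5.0])

u, eps = np.array([1.0, 1.0]), 1e-7
for it in range(20):
    r = F(u)
    if np.linalg.norm(r) < 1e-10:
        break
    # Matrix-free J*v ~ (F(u + eps*v) - F(u)) / eps
    Jv = LinearOperator((2, 2), matvec=lambda v: (F(u + eps * v) - F(u)) / eps)
    du, _ = gmres(Jv, -r)              # inner Krylov solve of J du = -r
    u = u + du
print(u, F(u))                         # converges to the root (1, 2)
```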

  6. Electronic Structure Calculations of Hydrogen Storage in Lithium-Decorated Metal-Graphyne Framework.

    PubMed

    Kumar, Sandeep; Dhilip Kumar, Thogluva Janardhanan

    2017-08-30

    Porous metal-graphyne framework (MGF) made up of graphyne linkers decorated with lithium has been investigated for hydrogen storage. Applying density functional theory spin-polarized generalized gradient approximation with the Perdew-Burke-Ernzerhof functional containing Grimme's diffusion parameter with a double numeric polarization basis set, the structural stability and physicochemical properties have been analyzed. Each linker binds two Li atoms over the surface of the graphyne linker forming MGF-Li8 by Dewar coordination. On saturation with hydrogen, each Li atom physisorbs three H2 molecules resulting in MGF-Li8-H24. H2 and Li interact by a charge polarization mechanism leading to elongation in the average H-H bond length indicating physisorption. Sorption energy decreases gradually from ≈0.4 to 0.20 eV on H2 loading. Molecular dynamics simulations and the computed sorption energy range indicate the high reversibility of H2 in the MGF-Li8 framework with a hydrogen storage capacity of 6.4 wt %. The calculated thermodynamic practical hydrogen storage at room temperature makes the Li-decorated MGF system a promising hydrogen storage material.
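
    As a consistency check on the quoted capacity, the gravimetric arithmetic is sketched below; the host formula mass is an assumed value back-solved to match ≈6.4 wt %, not a number taken from the paper:

```python
# Gravimetric-capacity check (illustrative numbers): 24 H2 per MGF-Li8 unit
# gives ~6.4 wt % only for a host formula mass near 700 u, so the value
# below is an assumption for the sketch, not data from the paper.
m_H2 = 2.016                      # molar mass of H2 (u)
n_H2 = 24                         # physisorbed H2 per framework unit
host_mass = 706.0                 # assumed MGF-Li8 formula mass (u)
wt_pct = 100 * n_H2 * m_H2 / (host_mass + n_H2 * m_H2)
print(f"{wt_pct:.1f} wt %")       # ~6.4
```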

  7. Leveraging constraints and biotelemetry data to pinpoint repetitively used spatial features

    USGS Publications Warehouse

    Brost, Brian M.; Hooten, Mevin B.; Small, Robert J.

    2016-01-01

    Satellite telemetry devices collect valuable information concerning the sites visited by animals, including the location of central places like dens, nests, rookeries, or haul‐outs. Existing methods for estimating the location of central places from telemetry data require user‐specified thresholds and ignore common nuances like measurement error. We present a fully model‐based approach for locating central places from telemetry data that accounts for multiple sources of uncertainty and uses all of the available locational data. Our general framework consists of an observation model to account for large telemetry measurement error and animal movement, and a highly flexible mixture model specified using a Dirichlet process to identify the location of central places. We also quantify temporal patterns in central place use by incorporating ancillary behavioral data into the model; however, our framework is also suitable when no such behavioral data exist. We apply the model to a simulated data set as proof of concept. We then illustrate our framework by analyzing an Argos satellite telemetry data set on harbor seals (Phoca vitulina) in the Gulf of Alaska, a species that exhibits fidelity to terrestrial haul‐out sites.

  8. A kernel regression approach to gene-gene interaction detection for case-control studies.

    PubMed

    Larson, Nicholas B; Schaid, Daniel J

    2013-11-01

    Gene-gene interactions are increasingly being addressed as a potentially important contributor to the variability of complex traits. Consequently, attention has moved beyond single-locus analysis of association to more complex genetic models. Although several single-marker approaches toward interaction analysis have been developed, such methods suffer from very high testing dimensionality and do not take advantage of existing information, notably the definition of genes as functional units. Here, we propose a comprehensive family of gene-level score tests for identifying genetic elements of disease risk, in particular pairwise gene-gene interactions. Using kernel machine methods, we devise score-based variance component tests under a generalized linear mixed model framework. We conducted simulations based upon coalescent genetic models to evaluate the performance of our approach under a variety of disease models. These simulations indicate that our methods are generally higher powered than alternative gene-level approaches and at worst competitive with exhaustive SNP-level (where SNP is single-nucleotide polymorphism) analyses. Furthermore, we observe that simulated epistatic effects resulted in significant marginal testing results for the involved genes regardless of whether or not true main effects were present. We detail the benefits of our methods and discuss potential genome-wide analysis strategies for gene-gene interaction analysis in a case-control study design. © 2013 WILEY PERIODICALS, INC.

  9. Reference interaction site model and optimized perturbation theories of colloidal dumbbells with increasing anisotropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munaò, Gianmarco, E-mail: gmunao@unime.it; Costa, Dino; Caccamo, Carlo

    We investigate thermodynamic properties of anisotropic colloidal dumbbells in the frameworks provided by the Reference Interaction Site Model (RISM) theory and an Optimized Perturbation Theory (OPT), the latter based on a fourth-order high-temperature perturbative expansion of the free energy, recently generalized to molecular fluids. Our model is constituted by two identical tangent hard spheres surrounded by square-well attractions with the same widths and progressively different depths. Gas-liquid coexistence curves are obtained by predicting pressures, free energies, and chemical potentials. In comparison with previous simulation results, RISM and OPT agree in reproducing the progressive reduction of the gas-liquid phase separation as the anisotropy of the interaction potential becomes more pronounced; in particular, the RISM theory provides reasonable predictions for all coexistence curves, bar the strong anisotropy regime, whereas OPT performs generally less well. Both theories predict a linear dependence of the critical temperature on the interaction strength, reproducing in this way the mean-field behavior observed in simulations; the critical density—which drops drastically as the anisotropy increases—turns out to be less accurate. Our results appear as a robust benchmark for further theoretical studies, in support of the simulation approach, of self-assembly in model colloidal systems.

  10. Accelerated path integral methods for atomistic simulations at ultra-low temperatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uhl, Felix, E-mail: felix.uhl@rub.de; Marx, Dominik; Ceriotti, Michele

    2016-08-07

    Path integral methods provide a rigorous and systematically convergent framework to include the quantum mechanical nature of atomic nuclei in the evaluation of the equilibrium properties of molecules, liquids, or solids at finite temperature. Such nuclear quantum effects are often significant for light nuclei already at room temperature, but become crucial at cryogenic temperatures such as those provided by superfluid helium as a solvent. Unfortunately, the cost of converged path integral simulations increases significantly upon lowering the temperature, so that the computational burden of simulating matter at typical superfluid helium temperatures becomes prohibitive. Here we investigate how accelerated path integral techniques based on colored-noise generalized Langevin equations, in particular the so-called path integral generalized Langevin equation thermostat (PIGLET) variant, perform in this extreme quantum regime, using as examples the quasi-rigid methane molecule and its highly fluxional protonated cousin, CH5+. We show that the PIGLET technique gives a speedup of two orders of magnitude in the evaluation of structural observables and quantum kinetic energy at ultralow temperatures. Moreover, we computed the spatial spread of the quantum nuclei in CH4 to illustrate the limits of using such colored-noise thermostats close to the many-body quantum ground state.
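
    To make the path-integral machinery concrete, here is a minimal primitive path-integral Monte Carlo sketch for a 1D harmonic oscillator (hbar = m = omega = 1), with the primitive energy estimator compared against the exact result. It illustrates plain bead-based PIMC, not the PIGLET thermostat itself, and all run parameters are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        beta, P, nsweep, step = 2.0, 32, 5000, 0.5   # inverse temperature, beads, MC settings

        def action(x):
            """Primitive discretized action for V(x) = x^2/2 with hbar = m = omega = 1."""
            dx = x - np.roll(x, -1)
            return 0.5 * P / beta * np.sum(dx**2) + beta / P * np.sum(0.5 * x**2)

        x = np.zeros(P)
        S = action(x)
        energies = []
        for sweep in range(nsweep):
            for j in range(P):                        # single-bead Metropolis moves
                xo = x[j]
                x[j] += step * rng.uniform(-1, 1)
                Sn = action(x)
                if rng.random() < np.exp(-(Sn - S)):
                    S = Sn
                else:
                    x[j] = xo
            if sweep > nsweep // 4:                   # discard equilibration
                dx = x - np.roll(x, -1)
                energies.append(P / (2 * beta) - 0.5 * P / beta**2 * np.sum(dx**2)
                                + np.mean(0.5 * x**2))

        print("PIMC energy:", np.mean(energies), " exact:", 0.5 / np.tanh(beta / 2))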

  11. A Metascalable Computing Framework for Large Spatiotemporal-Scale Atomistic Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nomura, K; Seymour, R; Wang, W

    2009-02-17

    A metascalable (or 'design once, scale on new architectures') parallel computing framework has been developed for large spatiotemporal-scale atomistic simulations of materials based on spatiotemporal data locality principles, which is expected to scale on emerging multipetaflops architectures. The framework consists of: (1) an embedded divide-and-conquer (EDC) algorithmic framework based on spatial locality to design linear-scaling algorithms for high-complexity problems; (2) a space-time-ensemble parallel (STEP) approach based on temporal locality to predict long-time dynamics, while introducing multiple parallelization axes; and (3) a tunable hierarchical cellular decomposition (HCD) parallelization framework to map these O(N) algorithms onto a multicore cluster based on a hybrid implementation combining message passing and critical-section-free multithreading. The EDC-STEP-HCD framework exposes maximal concurrency and data locality, thereby achieving: (1) inter-node parallel efficiency well over 0.95 for 218 billion-atom molecular-dynamics and 1.68 trillion electronic-degrees-of-freedom quantum-mechanical simulations on 212,992 IBM BlueGene/L processors (superscalability); (2) high intra-node, multithreading parallel efficiency (nanoscalability); and (3) nearly perfect time/ensemble parallel efficiency (eon-scalability). The spatiotemporal scale covered by MD simulation on a sustained petaflops computer per day (i.e. petaflops · day of computing) is estimated as NT = 2.14 (e.g. N = 2.14 million atoms for T = 1 microsecond).

  12. Building energy simulation in real time through an open standard interface

    DOE PAGES

    Pang, Xiufeng; Nouidui, Thierry S.; Wetter, Michael; ...

    2015-10-20

    Building energy models (BEMs) are typically used for design and code compliance for new buildings and in the renovation of existing buildings to predict energy use. The increasing adoption of BEM as standard practice in the building industry presents an opportunity to extend the use of BEMs into construction, commissioning and operation. In 2009, the authors developed a real-time simulation framework to execute an EnergyPlus model in real time to improve building operation. This paper reports an enhancement of that real-time energy simulation framework. The previous version only works with software tools that implement the custom co-simulation interface of the Building Controls Virtual Test Bed (BCVTB), such as EnergyPlus, Dymola and TRNSYS. The new version uses an open standard interface, the Functional Mockup Interface (FMI), to provide a generic interface to any application that supports the FMI protocol. In addition, the new version utilizes the Simple Measurement and Actuation Profile (sMAP) tool as the data acquisition system to acquire, store and present data. Lastly, this paper introduces the updated architecture of the real-time simulation framework using FMI and presents proof-of-concept demonstration results which validate the new framework.
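
    The master-algorithm pattern behind FMI co-simulation can be sketched with a mock slave object; the class and method names below (MockThermalZoneFMU, set_input, do_step, get_output) are illustrative stand-ins, not the real FMI API, and all model constants are assumed values.

        class MockThermalZoneFMU:
            """Toy first-order zone model: dT/dt = (T_out - T)/tau + q (stand-in 'FMU')."""
            def __init__(self, T0=20.0, tau=3600.0):
                self.T, self.tau, self.T_out, self.q = T0, tau, 10.0, 0.0

            def set_input(self, name, value):   # mimics setting a slave input variable
                setattr(self, name, value)

            def do_step(self, t, dt):           # mimics a do-step call: advance by dt
                self.T += dt * ((self.T_out - self.T) / self.tau + self.q)

            def get_output(self, name):         # mimics reading a slave output variable
                return getattr(self, name)

        fmu = MockThermalZoneFMU()
        t, dt = 0.0, 60.0
        while t < 3600.0:                       # one simulated hour, 1-minute steps
            heat_on = fmu.get_output("T") < 21.0   # crude thermostat in the master loop
            fmu.set_input("q", 0.005 if heat_on else 0.0)
            fmu.do_step(t, dt)
            t += dt
        print("zone temperature after 1 h:", round(fmu.get_output("T"), 2))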

  13. Rapid Prototyping of an Aircraft Model in an Object-Oriented Simulation

    NASA Technical Reports Server (NTRS)

    Kenney, P. Sean

    2003-01-01

    A team was created to participate in the Mars Scout Opportunity. Trade studies determined that an aircraft provided the best opportunity to complete the science objectives of the team. A high fidelity six degree of freedom flight simulation was required to provide credible evidence that the aircraft design fulfilled mission objectives and to support the aircraft design process by providing performance evaluations. The team created the simulation using the Langley Standard Real-Time Simulation in C++ (LaSRS++) application framework. A rapid prototyping approach was necessary because the team had only three months to both develop the aircraft simulation model and evaluate aircraft performance as the design and mission parameters matured. The design of LaSRS++ enabled rapid-prototyping in several ways. First, the framework allowed component models to be designed, implemented, unit-tested, and integrated quickly. Next, the framework provides a highly reusable infrastructure that allowed developers to maximize code reuse while concentrating on aircraft and mission specific features. Finally, the framework reduces risk by providing reusable components that allow developers to build a quality product with a compressed testing cycle that relies heavily on unit testing of new components.

  14. Conceptual Modeling Framework for E-Area PA HELP Infiltration Model Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyer, J. A.

    A conceptual modeling framework based on the proposed E-Area Low-Level Waste Facility (LLWF) closure cap design is presented for conducting Hydrologic Evaluation of Landfill Performance (HELP) model simulations of intact and subsided cap infiltration scenarios for the next E-Area Performance Assessment (PA).

  15. Characteristic-based and interface-sharpening algorithm for high-order simulations of immiscible compressible multi-material flows

    NASA Astrophysics Data System (ADS)

    He, Zhiwei; Tian, Baolin; Zhang, Yousheng; Gao, Fujie

    2017-03-01

    The present work focuses on the simulation of immiscible compressible multi-material flows with the Mie-Grüneisen-type equation of state governed by the non-conservative five-equation model [1]. Although low-order single-fluid schemes have already been adopted to provide some feasible results, applying high-order schemes (which introduce relatively little numerical dissipation) to these flows may yield results with severe numerical oscillations. Consequently, attempts to apply interface-sharpening techniques to counteract the progressively more severely smeared interfaces over longer simulation times may increase overshoots and, in some cases, lead to convergence to a non-physical solution. This study proposes a characteristic-based interface-sharpening algorithm for high-order simulations of such flows by deriving a pressure-equilibrium-consistent intermediate state (augmented with approximations of pressure derivatives) for local characteristic variable reconstruction and by constructing a general framework for interface sharpening. First, by imposing a weak form of the jump condition for the non-conservative five-equation model, we analytically derive an intermediate state with pressure derivatives treated as additional parameters of the linearization procedure. Based on this intermediate state, any well-established high-order reconstruction technique can be employed to provide the state at each cell edge. Second, by designing another state that differs only in the reconstructed values of the interface function at each cell edge, the advection term in the equation of the interface function is discretized twice using any common algorithm. The difference between the two discretizations is employed consistently for interface compression, yielding a general framework for interface sharpening. Coupled with the fifth-order improved accurate monotonicity-preserving scheme [2] for local characteristic variable reconstruction and the tangent of hyperbola for interface capturing scheme [3] for designing the other reconstructed values of the interface function, the present algorithm is examined using some typical tests, with the Mie-Grüneisen-type equation of state used for characterizing the materials of interest in both one- and two-dimensional spaces. The results of these tests verify the effectiveness of the present algorithm: essentially non-oscillatory and interface-sharpened results are obtained.

  16. Range Finding with a Plenoptic Camera

    DTIC Science & Technology

    2014-03-27

    [Abstract not available; the record contains only table-of-contents fragments: Experimental Results; Simulated Camera Analysis: Varying Lens Diameter; Simulated Camera Analysis: Varying Detector Size; Matching Framework; Simulated Camera Performance with SIFT.]

  17. Integrated Hydrogeological Model of the General Separations Area, Vol. 2, Rev. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FLACH, GREGORYK.

    1999-04-01

    The 15 mi² General Separations Area (GSA) contains more than 35 RCRA and CERCLA waste units, and is the focus of numerous ongoing and anticipated contaminant migration and remedial alternatives studies. To meet the analysis needs of GSA remediation programs, a groundwater flow model of the area based on the FACT code was developed. The model is consistent with detailed characterization and monitoring data through 1996. Model preprocessing has been automated so that future updates and modifications can be performed quickly and efficiently. Most remedial action scenarios can be explicitly simulated, including vertical recirculation wells, vertical barriers, surface caps, pumping wells at arbitrary locations, specified drawdown within well casings (instead of flowrate), and wetland impacts of remedial actions. The model has a fine-scale vertical mesh and heterogeneous conductivity field, and includes the vadose zone. Therefore, the model is well suited to support subsequent contaminant transport simulations. The model can provide a common framework for analyzing groundwater flow, contaminant migration, and remedial alternatives across Environmental Restoration programs within the GSA.

  18. Three-Dimensional Geometric Modeling of Membrane-bound Organelles in Ventricular Myocytes: Bridging the Gap between Microscopic Imaging and Mathematical Simulation

    PubMed Central

    Yu, Zeyun; Holst, Michael J.; Hayashi, Takeharu; Bajaj, Chandrajit L.; Ellisman, Mark H.; McCammon, J. Andrew; Hoshijima, Masahiko

    2009-01-01

    A general framework of image-based geometric processing is presented to bridge the gap between three-dimensional (3D) imaging that provides structural details of a biological system and mathematical simulation where high-quality surface or volumetric meshes are required. A 3D density map is processed in the order of image pre-processing (contrast enhancement and anisotropic filtering), feature extraction (boundary segmentation and skeletonization), and high-quality and realistic surface (triangular) and volumetric (tetrahedral) mesh generation. While the tool-chain described is applicable to general types of 3D imaging data, the performance is demonstrated specifically on membrane-bound organelles in ventricular myocytes that are imaged and reconstructed with electron microscopic (EM) tomography and two-photon microscopy (T-PM). Of particular interest in this study are two types of membrane-bound Ca2+-handling organelles, namely, transverse tubules (T-tubules) and junctional sarcoplasmic reticulum (jSR), both of which play an important role in regulating the excitation-contraction (E-C) coupling through dynamic Ca2+ mobilization in cardiomyocytes. PMID:18835449

  19. Three-dimensional geometric modeling of membrane-bound organelles in ventricular myocytes: bridging the gap between microscopic imaging and mathematical simulation.

    PubMed

    Yu, Zeyun; Holst, Michael J; Hayashi, Takeharu; Bajaj, Chandrajit L; Ellisman, Mark H; McCammon, J Andrew; Hoshijima, Masahiko

    2008-12-01

    A general framework of image-based geometric processing is presented to bridge the gap between three-dimensional (3D) imaging that provides structural details of a biological system and mathematical simulation where high-quality surface or volumetric meshes are required. A 3D density map is processed in the order of image pre-processing (contrast enhancement and anisotropic filtering), feature extraction (boundary segmentation and skeletonization), and high-quality and realistic surface (triangular) and volumetric (tetrahedral) mesh generation. While the tool-chain described is applicable to general types of 3D imaging data, the performance is demonstrated specifically on membrane-bound organelles in ventricular myocytes that are imaged and reconstructed with electron microscopic (EM) tomography and two-photon microscopy (T-PM). Of particular interest in this study are two types of membrane-bound Ca(2+)-handling organelles, namely, transverse tubules (T-tubules) and junctional sarcoplasmic reticulum (jSR), both of which play an important role in regulating the excitation-contraction (E-C) coupling through dynamic Ca(2+) mobilization in cardiomyocytes.

  20. On the Helix Propensity in Generalized Born Solvent Descriptions of Modeling the Dark Proteome

    PubMed Central

    Olson, Mark A.

    2017-01-01

    Intrinsically disordered proteins that populate the so-called “Dark Proteome” offer challenging benchmarks of atomistic simulation methods to accurately model conformational transitions on a multidimensional energy landscape. This work explores the application of parallel tempering with implicit solvent models as a computational framework to capture the conformational ensemble of an intrinsically disordered peptide derived from the Ebola virus protein VP35. A recent X-ray crystallographic study reported a protein-peptide interface where the VP35 peptide underwent a folding transition from a disordered form to a helix-β-turn-helix topological fold upon molecular association with the Ebola protein NP. An assessment is provided of the accuracy of two generalized Born solvent models (GBMV2 and GBSW2) using the CHARMM force field and applied with temperature-based replica exchange dynamics to calculate the disorder propensity of the peptide and its probability density of states in a continuum solvent. A further comparison is presented of applying an explicit/implicit solvent hybrid replica exchange simulation of the peptide to determine the effect of modeling water interactions at the all-atom resolution. PMID:28197405

  1. On the Helix Propensity in Generalized Born Solvent Descriptions of Modeling the Dark Proteome.

    PubMed

    Olson, Mark A

    2017-01-01

    Intrinsically disordered proteins that populate the so-called "Dark Proteome" offer challenging benchmarks of atomistic simulation methods to accurately model conformational transitions on a multidimensional energy landscape. This work explores the application of parallel tempering with implicit solvent models as a computational framework to capture the conformational ensemble of an intrinsically disordered peptide derived from the Ebola virus protein VP35. A recent X-ray crystallographic study reported a protein-peptide interface where the VP35 peptide underwent a folding transition from a disordered form to a helix-β-turn-helix topological fold upon molecular association with the Ebola protein NP. An assessment is provided of the accuracy of two generalized Born solvent models (GBMV2 and GBSW2) using the CHARMM force field and applied with temperature-based replica exchange dynamics to calculate the disorder propensity of the peptide and its probability density of states in a continuum solvent. A further comparison is presented of applying an explicit/implicit solvent hybrid replica exchange simulation of the peptide to determine the effect of modeling water interactions at the all-atom resolution.

  2. A partial differential equation-based general framework adapted to Rayleigh's, Rician's and Gaussian's distributed noise for restoration and enhancement of magnetic resonance image.

    PubMed

    Yadav, Ram Bharos; Srivastava, Subodh; Srivastava, Rajeev

    2016-01-01

    The proposed framework is obtained by casting the noise removal problem into a variational setting. The framework automatically identifies the various types of noise present in the magnetic resonance image and filters them by choosing an appropriate filter. This filter includes two terms: the first is a data likelihood term and the second is a prior function. The first term is obtained by minimizing the negative log-likelihood of the corresponding probability density function: Gaussian, Rayleigh, or Rician. Further, due to the ill-posedness of the likelihood term, a prior function is needed. This paper examines three partial differential equation based priors: a total variation based prior, an anisotropic diffusion based prior, and a complex diffusion (CD) based prior. A regularization parameter is used to balance the trade-off between the data fidelity term and the prior. The finite difference scheme is used for discretization of the proposed method. The performance analysis and a comparative study of the proposed method with other standard methods are presented for the BrainWeb dataset at varying noise levels, in terms of peak signal-to-noise ratio, mean square error, structure similarity index map, and correlation parameter. From the simulation results, it is observed that the proposed framework with the CD based prior performs better than the other priors considered.
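
    As an illustration of one of the PDE priors discussed, the sketch below runs explicit Perona-Malik anisotropic diffusion on a synthetic noisy image; the periodic boundary handling via np.roll and all parameter values are simplifying assumptions.

        import numpy as np

        def perona_malik(img, niter=50, kappa=0.2, lam=0.2):
            """Explicit Perona-Malik anisotropic diffusion (periodic boundaries)."""
            u = img.astype(float).copy()
            g = lambda d: np.exp(-(d / kappa) ** 2)   # edge-stopping function
            for _ in range(niter):
                dN = np.roll(u, -1, 0) - u            # differences toward 4 neighbours
                dS = np.roll(u, 1, 0) - u
                dE = np.roll(u, -1, 1) - u
                dW = np.roll(u, 1, 1) - u
                u += lam * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)
            return u

        rng = np.random.default_rng(0)
        clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0
        noisy = clean + rng.normal(0, 0.1, clean.shape)   # Gaussian-noise case
        den = perona_malik(noisy)
        print("MSE before:", np.mean((noisy - clean) ** 2),
              "after:", np.mean((den - clean) ** 2))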

  3. Object-oriented philosophy in designing adaptive finite-element package for 3D elliptic differential equations

    NASA Astrophysics Data System (ADS)

    Zhengyong, R.; Jingtian, T.; Changsheng, L.; Xiao, X.

    2007-12-01

    Although adaptive finite-element (AFE) analysis is receiving more and more attention in scientific and engineering fields, its efficient implementation remains a challenge because of the complexity of the procedures involved. In this paper, we present a clear C++ framework implementation to show the power of object-oriented philosophy (OOP) in designing such complex adaptive procedures. Using the modular features of an OOP language, the whole adaptive system is divided into several separate parts, such as mesh generation or refinement, the a posteriori error estimator, the adaptive strategy, and the final post-processing. After proper local designs of these separate modules, a connected framework for the adaptive procedure is finally formed. Based on the general elliptic differential equation, little effort is needed within the adaptive framework to carry out practical simulations. To show the preferable properties of OOP adaptive design, two numerical examples are tested. The first is a 3D direct-current resistivity problem, in which the power of the framework is efficiently shown, as only small additions are required. In the second, an induced polarization (IP) exploration case, a new adaptive procedure is easily added, which adequately shows the strong extensibility and reusability afforded by an OOP language. Finally, we believe that, based on this modular framework for adaptive implementation built with an OOP methodology, more advanced adaptive analysis systems will become available in the future.

  4. A generic analytical foot rollover model for predicting translational ankle kinematics in gait simulation studies.

    PubMed

    Ren, Lei; Howard, David; Ren, Luquan; Nester, Chris; Tian, Limei

    2010-01-19

    The objective of this paper is to develop an analytical framework for representing ankle-foot kinematics by modelling the foot as a rollover rocker, which can not only be used as a generic tool for general gait simulation but also allows for case-specific modelling if required. Previously, the rollover models used in gait simulation have often been based on specific functions that have usually been of a simple form. In contrast, the analytical model described here is in a general form such that the effective foot rollover shape can be represented by any polar function ρ = ρ(φ). Furthermore, a normalized generic foot rollover model has been established based on a normative foot rollover shape dataset of 12 normal healthy subjects. To evaluate model accuracy, the predicted ankle motions and the centre of pressure (CoP) were compared with measurement data for both subject-specific and general cases. The results demonstrate that the ankle joint motions in both vertical and horizontal directions (relative RMSE approximately 10%) and CoP (relative RMSE approximately 15% for most of the subjects) are accurately predicted over most of the stance phase (from 10% to 90% of stance). However, we found that the foot cannot be very accurately represented by a rollover model just after heel strike (HS) and just before toe off (TO), probably due to shear deformation of foot plantar tissues (ankle motion can occur without any foot rotation). The proposed foot rollover model can be used in both inverse and forward dynamics gait simulation studies and may also find applications in rehabilitation engineering. Copyright 2009 Elsevier Ltd. All rights reserved.
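
    A minimal numerical sketch of the special case ρ(φ) = R (a circular rocker): under pure rolling, the centre of pressure advances with the rolled arc length while the ankle, taken here at the rocker centre (a simplifying assumption), translates at constant height. The radius and stance angles are assumed values.

        import numpy as np

        R = 0.25                                 # effective rocker radius in metres (assumed)
        phi = np.linspace(-0.3, 0.5, 9)          # shank rotation over stance (rad)

        cop_x = R * phi                          # centre-of-pressure progression (arc length)
        ankle_x = R * phi                        # ankle translation for a circular rocker
        ankle_y = np.full_like(phi, R)           # constant ankle height above ground

        for p, cx, ax, ay in zip(phi, cop_x, ankle_x, ankle_y):
            print(f"phi={p:+.2f} rad  CoP x={cx:+.3f} m  ankle=({ax:+.3f}, {ay:.3f}) m")

    A general ρ(φ) would replace the constant R with a profile interpolated from measured rollover shapes, which is where the subject-specific fitting described above enters.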

  5. Surgical simulation: Current practices and future perspectives for technical skills training.

    PubMed

    Bjerrum, Flemming; Thomsen, Ann Sofia Skou; Nayahangan, Leizl Joy; Konge, Lars

    2018-06-17

    Simulation-based training (SBT) has become a standard component of modern surgical education, yet successful implementation of evidence-based training programs remains challenging. In this narrative review, we use Kern's framework for curriculum development to describe where we are now and what lies ahead for SBT within surgery, with a focus on technical skills in operative procedures. Despite principles for optimal SBT (proficiency-based, distributed, and deliberate practice) having been identified, massed training with fixed time intervals or a fixed number of repetitions is still used extensively, and simulators are generally underutilized. SBT should be part of surgical training curricula, including theoretical, technical, and non-technical skills, and be based on relevant needs assessments. Furthermore, training should follow evidence-based theoretical principles for optimal training, and the effect of training needs to be evaluated using relevant outcomes. There is a larger, still unrealized potential of surgical SBT, which may be realized in the near future as simulator technologies evolve, more evidence-based training programs are implemented, and cost-effectiveness and impact on patient safety are clearly demonstrated.

  6. Sampling enhancement for the quantum mechanical potential based molecular dynamics simulations: a general algorithm and its extension for free energy calculation on rugged energy surface.

    PubMed

    Li, Hongzhi; Yang, Wei

    2007-03-21

    An approach is developed in the replica exchange framework to enhance conformational sampling for quantum mechanical (QM) potential based molecular dynamics simulations. Importantly, with our enhanced sampling treatment, decent convergence of the electronic structure self-consistent-field calculation is robustly guaranteed, which is made possible in our replica exchange design by avoiding direct structure exchanges between the QM-related replicas and the activated (scaled by low scaling parameters or treated with high "effective temperatures") molecular mechanical (MM) replicas. Although the present approach represents one of the early efforts in enhanced sampling developments specifically for quantum mechanical potentials, QM-based simulations treated with the present technique can possess sampling efficiency similar to that of MM-based simulations treated with the Hamiltonian replica exchange method (HREM). In the present paper, by combining this sampling method with one of our recent developments (the dual-topology alchemical HREM approach), we also introduce a method for sampling-enhanced QM-based free energy calculations.
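
    The essential exchange step can be sketched with a toy Hamiltonian replica exchange on a scaled double-well potential: low-λ replicas cross the barrier easily, and swaps shuttle those configurations toward the λ = 1 replica. The potential, λ ladder, and move sizes are all assumed values, and this is an illustration of HREM generally, not the QM/MM scheme of the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        U = lambda x: 8.0 * (x**2 - 1.0) ** 2      # double-well potential (toy model)
        lams = np.array([1.0, 0.6, 0.3, 0.1])      # Hamiltonian scaling per replica
        beta, nsteps, step = 1.0, 50000, 0.3
        x = np.full(len(lams), -1.0)               # all replicas start in the left well

        right = 0
        for n in range(nsteps):
            for i, lam in enumerate(lams):         # local Metropolis move, target exp(-beta*lam*U)
                xn = x[i] + step * rng.uniform(-1, 1)
                if rng.random() < np.exp(-beta * lam * (U(xn) - U(x[i]))):
                    x[i] = xn
            i = rng.integers(len(lams) - 1)        # attempt a neighbour swap
            acc = np.exp(beta * (lams[i] - lams[i + 1]) * (U(x[i]) - U(x[i + 1])))
            if rng.random() < acc:
                x[i], x[i + 1] = x[i + 1], x[i]
            right += x[0] > 0                      # occupancy of the right well, lam=1 replica
        print("fraction of time the lam=1 replica spends in the right well:", right / nsteps)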

  7. Geometric integrator for simulations in the canonical ensemble

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tapias, Diego, E-mail: diego.tapias@nucleares.unam.mx; Sanders, David P., E-mail: dpsanders@ciencias.unam.mx; Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139

    2016-08-28

    We introduce a geometric integrator for molecular dynamics simulations of physical systems in the canonical ensemble that preserves the invariant distribution in equations arising from the density dynamics algorithm, with any possible type of thermostat. Our integrator thus constitutes a unified framework that allows the study and comparison of different thermostats and of their influence on the equilibrium and non-equilibrium (thermo-)dynamic properties of a system. To show the validity and the generality of the integrator, we implement it with a second-order, time-reversible method and apply it to the simulation of a Lennard-Jones system with three different thermostats, obtaining good conservation of the geometrical properties and recovering the expected thermodynamic results. Moreover, to show the advantage of our geometric integrator over a non-geometric one, we compare the results with those obtained by using the non-geometric Gear integrator, which is frequently used to perform simulations in the canonical ensemble. The non-geometric integrator induces a drift in the invariant quantity, while our integrator has no such drift, thus ensuring that the system is effectively sampling the correct ensemble.
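
    For orientation, a standard canonical-ensemble sampler is easy to sketch: the BAOAB splitting of Langevin dynamics for a 1D harmonic oscillator, whose configurational average ⟨q²⟩ should recover kT. This illustrates canonical sampling generally, not the density-dynamics integrator of the paper, and all parameters are assumed.

        import numpy as np

        rng = np.random.default_rng(0)
        kT, gamma, dt, nsteps = 1.0, 1.0, 0.05, 200000
        c1 = np.exp(-gamma * dt)                 # exact OU decay over one step
        c2 = np.sqrt(kT * (1.0 - c1**2))         # matching OU noise amplitude (m = 1)

        q, p, acc = 0.0, 0.0, 0.0
        for n in range(nsteps):
            p -= 0.5 * dt * q                    # B: half kick, force = -q for U = q^2/2
            q += 0.5 * dt * p                    # A: half drift
            p = c1 * p + c2 * rng.normal()       # O: exact Ornstein-Uhlenbeck update
            q += 0.5 * dt * p                    # A: half drift
            p -= 0.5 * dt * q                    # B: half kick
            acc += q * q
        print("<q^2> =", acc / nsteps, " (canonical value:", kT, ")")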

  8. A Generic Simulation Framework for Non-Entangled based Experimental Quantum Cryptography and Communication: Quantum Cryptography and Communication Simulator (QuCCs)

    NASA Astrophysics Data System (ADS)

    Buhari, Abudhahir; Zukarnain, Zuriati Ahmad; Khalid, Roszelinda; Zakir Dato', Wira Jaafar Ahmad

    2016-11-01

    The applications of quantum information science are moving toward bigger and better heights for next-generation technology. In particular, in the fields of quantum cryptography and quantum computation, the world has already witnessed various ground-breaking tangible products and promising results. Quantum cryptography is one of the mature fields of quantum mechanics, and products are already available in the market. The current state of quantum cryptography is still under active research in order to reach the maturity of digital cryptography. The complexity of quantum cryptography is higher due to the combination of hardware and software. The lack of an effective simulation tool to design and analyze quantum cryptography experiments delays progress in the field. In this paper, we propose a framework for an effective non-entanglement-based quantum cryptography simulation tool. We apply a hybrid simulation technique, i.e., discrete event, continuous event, and system dynamics. We also highlight the limitations of experiments based on a commercial photonic simulation tool. Finally, we discuss ideas for achieving a one-stop simulation package for quantum-based secure key distribution experiments. All the modules of the simulation framework are viewed from the computer science perspective.

  9. Tomographic imaging of non-local media based on space-fractional diffusion models

    NASA Astrophysics Data System (ADS)

    Buonocore, Salvatore; Semperlotti, Fabio

    2018-06-01

    We investigate a generalized tomographic imaging framework applicable to a class of inhomogeneous media characterized by non-local diffusive energy transport. Under these conditions, the transport mechanism is well described by fractional-order continuum models capable of capturing anomalous diffusion that would otherwise remain undetected when using traditional integer-order models. Although the underlying idea of the proposed framework is applicable to any transport mechanism, the case of fractional heat conduction is presented as a specific example to illustrate the methodology. Using numerical simulations, we show how complex inhomogeneous media involving non-local transport can be successfully imaged if fractional-order models are used. In particular, the results show that properly recognizing and accounting for the fractional character of the host medium not only allows increased resolution but, in the case of strong and spatially distributed non-locality, represents the only viable approach to achieving a successful reconstruction.
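
    On a periodic domain the space-fractional diffusion equation u_t = -D(-Δ)^(α/2)u diagonalizes in Fourier space with symbol |k|^α, which gives a compact forward-model sketch (grid size, D, and t are assumed values; α = 2 recovers ordinary diffusion).

        import numpy as np

        N, L, D, t = 256, 2 * np.pi, 0.1, 1.0
        x = np.linspace(0, L, N, endpoint=False)
        k = np.fft.fftfreq(N, d=L / N) * 2 * np.pi   # angular wavenumbers

        u0 = np.exp(-20 * (x - np.pi) ** 2)          # localized initial pulse

        for alpha in (2.0, 1.5, 1.2):
            # exact spectral propagator for u_t = -D (-Laplacian)^(alpha/2) u
            u_hat = np.fft.fft(u0) * np.exp(-D * np.abs(k) ** alpha * t)
            u = np.fft.ifft(u_hat).real
            print(f"alpha={alpha}: peak={u.max():.4f}  (heavier tails as alpha decreases)")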

  10. Bayesian Group Bridge for Bi-level Variable Selection.

    PubMed

    Mallick, Himel; Yi, Nengjun

    2017-06-01

    A Bayesian bi-level variable selection method (BAGB: Bayesian Analysis of Group Bridge) is developed for regularized regression and classification. This new development is motivated by grouped data, where generic variables can be divided into multiple groups, with variables in the same group being mechanistically related or statistically correlated. As an alternative to frequentist group variable selection methods, BAGB incorporates structural information among predictors through a group-wise shrinkage prior. Posterior computation proceeds via an efficient MCMC algorithm. In addition to the usual ease-of-interpretation of hierarchical linear models, the Bayesian formulation produces valid standard errors, a feature that is notably absent in the frequentist framework. Empirical evidence of the attractiveness of the method is illustrated by extensive Monte Carlo simulations and real data analysis. Finally, several extensions of this new approach are presented, providing a unified framework for bi-level variable selection in general models with flexible penalties.

  11. An Uncertainty Quantification Framework for Remote Sensing Retrievals

    NASA Astrophysics Data System (ADS)

    Braverman, A. J.; Hobbs, J.

    2017-12-01

    Remote sensing data sets produced by NASA and other space agencies are the result of complex algorithms that infer geophysical state from observed radiances using retrieval algorithms. The processing must keep up with the downlinked data flow, and this necessitates computational compromises that affect the accuracies of retrieved estimates. The algorithms are also limited by imperfect knowledge of physics and of ancillary inputs that are required. All of this contributes to uncertainties that are generally not rigorously quantified by stepping outside the assumptions that underlie the retrieval methodology. In this talk we discuss a practical framework for uncertainty quantification that can be applied to a variety of remote sensing retrieval algorithms. Ours is a statistical approach that uses Monte Carlo simulation to approximate the sampling distribution of the retrieved estimates. We will discuss the strengths and weaknesses of this approach, and provide a case-study example from the Orbiting Carbon Observatory 2 mission.
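
    A stripped-down version of the Monte Carlo idea, with a linear stand-in for the retrieval (the Jacobian, noise level, and state vector are invented placeholders, not OCO-2 values): repeatedly simulate observations, rerun the retrieval, and summarize the spread of the estimates.

        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.normal(size=(50, 3))             # forward-model Jacobian (assumed)
        x_true = np.array([400.0, 1.5, 0.3])     # "true" geophysical state (assumed)
        sigma = 0.5                              # radiance noise level (assumed)

        estimates = []
        for _ in range(2000):                    # Monte Carlo over synthetic observations
            y = A @ x_true + rng.normal(0, sigma, 50)
            x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares "retrieval"
            estimates.append(x_hat)
        estimates = np.array(estimates)
        print("bias:", estimates.mean(axis=0) - x_true)     # sampling-distribution summary
        print("std: ", estimates.std(axis=0))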

  12. Distributed attitude synchronization of formation flying via consensus-based virtual structure

    NASA Astrophysics Data System (ADS)

    Cong, Bing-Long; Liu, Xiang-Dong; Chen, Zhen

    2011-06-01

    This paper presents a general framework for synchronized multiple spacecraft rotations via a consensus-based virtual structure. In this framework, attitude control systems for the formation spacecraft and the virtual structure are designed separately. Both parametric uncertainty and external disturbance are taken into account. A time-varying sliding mode control (TVSMC) algorithm is designed to improve the robustness of the actual attitude control system. As for the virtual attitude control system, a behavioral consensus algorithm is presented to accomplish the attitude maneuver of the entire formation and guarantee a consistent attitude among the local virtual structure counterparts during the maneuver. A multiple virtual sub-structures (MVSSs) system is introduced to enhance the current virtual structure scheme when large numbers of spacecraft are involved in the formation. The attitude of each spacecraft is represented by modified Rodrigues parameters (MRPs) for their non-redundancy. Finally, a numerical simulation with three synchronization situations is employed to illustrate the effectiveness of the proposed strategy.

  13. Human mobility and epidemic invasion

    NASA Astrophysics Data System (ADS)

    Colizza, Vittoria

    2010-03-01

    The current H1N1 influenza pandemic is just the latest example of how human mobility helps drive infectious diseases. Travel has grown explosively in the last decades, contributing to an emerging complex pattern of traffic flows that unfolds at different scales, shaping the spread of epidemics. Restrictions on people's mobility are thus investigated to design possible containment measures. By considering a theoretical framework in terms of reaction-diffusion processes, it is possible to study the invasion dynamics of epidemics in a metapopulation system with heterogeneous mobility patterns. The system is found to exhibit a global invasion threshold that sets the critical mobility rate below which the epidemic is contained. The results provide a general framework for the understanding of the numerical evidence from detailed data-driven simulations that show the limited benefit provided by travel flows reduction in slowing down or containing an emerging epidemic.

  14. Summary of a Modeling and Simulation Framework for High-Fidelity Weapon Models in Joint Semi-Automated Forces (JSAF) and Other Mission-Simulation Software

    DTIC Science & Technology

    2008-05-01

    communicate with other weapon models in a mission-level simulation; (3) introduces the four configuration levels of the M&S framework; and (4) presents a cost-effective M&S laboratory plan. [The remainder of the record consists of table-of-contents fragments.]

  15. Chimaera simulation of complex states of flowing matter

    PubMed Central

    2016-01-01

    We discuss a unified mesoscale framework (chimaera) for the simulation of complex states of flowing matter across scales of motion. The chimaera framework can deal with each of the three macro–meso–micro levels through suitable ‘mutations’ of the basic mesoscale formulation. The idea is illustrated through selected simulations of complex micro- and nanoscale flows. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698031

  16. LAMMPS framework for dynamic bonding and an application modeling DNA

    NASA Astrophysics Data System (ADS)

    Svaneborg, Carsten

    2012-08-01

    We have extended the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) to support directional bonds and dynamic bonding. The framework supports stochastic formation of new bonds, breakage of existing bonds, and conversion between bond types. Bond formation can be controlled to limit the maximal functionality of a bead with respect to various bond types. Concomitant with the bond dynamics, angular and dihedral interactions are dynamically introduced between newly connected triplets and quartets of beads, where the interaction type is determined from the local pattern of bead and bond types. When breaking bonds, all angular and dihedral interactions involving broken bonds are removed. The framework allows chemical reactions to be modeled, and we use it to simulate a simplistic, coarse-grained DNA model. The resulting DNA dynamics illustrates the power of the present framework.

    Catalogue identifier: AEME_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEME_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public Licence
    No. of lines in distributed program, including test data, etc.: 2 243 491
    No. of bytes in distributed program, including test data, etc.: 771
    Distribution format: tar.gz
    Programming language: C++
    Computer: Single and multiple core servers
    Operating system: Linux/Unix/Windows
    Has the code been vectorized or parallelized?: Yes, parallelized using MPI directives.
    RAM: 1 Gb
    Classification: 16.11, 16.12
    Nature of problem: Simulating coarse-grained models capable of chemistry, e.g. DNA hybridization dynamics.
    Solution method: Extending LAMMPS to handle dynamic bonding and directional bonds.
    Unusual features: Allows bonds to be created and broken while angular and dihedral interactions are kept consistent.
    Additional comments: The distribution file for this program is approximately 36 Mbytes and therefore is not delivered directly when download or e-mail is requested; instead an HTML file giving details of how the program can be obtained is sent.
    Running time: Hours to days. The examples provided in the distribution take just seconds to run.
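
    The formation/breakage rules the framework implements can be mimicked in a few lines of Python (a toy illustration, not the LAMMPS extension itself; all rates, cutoffs, and the per-bead bond cap are assumed values).

        import numpy as np

        rng = np.random.default_rng(0)
        n, L, r_cut, p_form, p_break, max_bonds = 40, 10.0, 1.2, 0.5, 0.05, 2
        pos = rng.uniform(0, L, (n, 2))          # static bead positions in a periodic box
        bonds = set()
        degree = np.zeros(n, dtype=int)          # current bond count per bead

        def dist(i, j):
            d = np.abs(pos[i] - pos[j])
            d = np.minimum(d, L - d)             # minimum-image convention
            return np.hypot(*d)

        for step in range(100):
            # formation: nearby pairs bond stochastically, capped at max_bonds per bead
            for i in range(n):
                for j in range(i + 1, n):
                    if ((i, j) not in bonds and dist(i, j) < r_cut
                            and degree[i] < max_bonds and degree[j] < max_bonds
                            and rng.random() < p_form):
                        bonds.add((i, j)); degree[i] += 1; degree[j] += 1
            # breakage: each existing bond breaks with a small probability
            for b in list(bonds):
                if rng.random() < p_break:
                    bonds.discard(b); degree[b[0]] -= 1; degree[b[1]] -= 1
        print("bonds after 100 steps:", len(bonds))

    In the real framework these events would also add or remove the angular and dihedral interactions touching the affected bonds, as described above.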

  17. An assessment of geographical distribution of different plant functional types over North America simulated using the CLASS-CTEM modelling framework

    NASA Astrophysics Data System (ADS)

    Shrestha, Rudra K.; Arora, Vivek K.; Melton, Joe R.; Sushama, Laxmi

    2017-10-01

    The performance of the competition module of the CLASS-CTEM (Canadian Land Surface Scheme and Canadian Terrestrial Ecosystem Model) modelling framework is assessed at 1° spatial resolution over North America by comparing the simulated geographical distribution of its plant functional types (PFTs) with two observation-based estimates. The model successfully reproduces the broad geographical distribution of trees, grasses and bare ground although limitations remain. In particular, compared to the two observation-based estimates, the simulated fractional vegetation coverage is lower in the arid southwest North American region and higher in the Arctic region. The lower-than-observed simulated vegetation coverage in the southwest region is attributed to lack of representation of shrubs in the model and plausible errors in the observation-based data sets. The observation-based data indicate vegetation fractional coverage of more than 60 % in this arid region, despite only 200-300 mm of precipitation that the region receives annually, and observation-based leaf area index (LAI) values in the region are lower than one. The higher-than-observed vegetation fractional coverage in the Arctic is likely due to the lack of representation of moss and lichen PFTs and also likely because of inadequate representation of permafrost in the model as a result of which the C3 grass PFT performs overly well in the region. The model generally reproduces the broad spatial distribution and the total area covered by the two primary tree PFTs (needleleaf evergreen trees, NDL-EVG; and broadleaf cold deciduous trees, BDL-DCD-CLD) reasonably well. The simulated fractional coverage of tree PFTs increases after the 1960s in response to the CO2 fertilization effect and climate warming. Differences between observed and simulated PFT coverages highlight model limitations and suggest that the inclusion of shrubs, and moss and lichen PFTs, and an adequate representation of permafrost will help improve model performance.

  18. A Physics-driven Neural Networks-based Simulation System (PhyNNeSS) for multimodal interactive virtual environments involving nonlinear deformable objects

    PubMed Central

    De, Suvranu; Deo, Dhannanjay; Sankaranarayanan, Ganesh; Arikatla, Venkata S.

    2012-01-01

    Background: While an update rate of 30 Hz is considered adequate for real-time graphics, a much higher update rate of about 1 kHz is necessary for haptics. Physics-based modeling of deformable objects, especially when large nonlinear deformations and complex nonlinear material properties are involved, at these very high rates is one of the most challenging tasks in the development of real-time simulation systems. While some specialized solutions exist, there is no general solution for arbitrary nonlinearities.

    Methods: In this work we present PhyNNeSS - a Physics-driven Neural Networks-based Simulation System - to address this long-standing technical challenge. The first step is an off-line pre-computation step in which a database is generated by applying carefully prescribed displacements to each node of the finite element models of the deformable objects. In the next step, the data is condensed into a set of coefficients describing neurons of a Radial Basis Function network (RBFN). During real-time computation, these neural networks are used to reconstruct the deformation fields as well as the interaction forces.

    Results: We present realistic simulation examples from interactive surgical simulation with real-time force feedback. As an example, we have developed a deformable human stomach model and a Penrose-drain model used in the Fundamentals of Laparoscopic Surgery (FLS) training tool box.

    Conclusions: A unique computational modeling system has been developed that is capable of simulating the response of nonlinear deformable objects in real time. The method distinguishes itself from previous efforts in that a systematic physics-based pre-computational step allows training of neural networks which may be used in real-time simulations. We show, through careful error analysis, that the scheme is scalable, with the accuracy being controlled by the number of neurons used in the simulation. PhyNNeSS has been integrated into SoFMIS (Software Framework for Multimodal Interactive Simulation) for general use. PMID:22629108
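
    The core pre-compute-then-interpolate idea reduces to fitting a radial basis function network to a precomputed response database and evaluating it cheaply at run time. The 1D toy below (training data, centre count, and kernel width all assumed) uses one-shot least-squares training rather than the paper's pipeline.

        import numpy as np

        rng = np.random.default_rng(0)
        x_train = np.linspace(0, 1, 40)[:, None]            # prescribed displacements
        f_train = np.sin(3 * x_train) + 0.5 * x_train**3    # "precomputed" responses (toy)

        centers = np.linspace(0, 1, 15)[:, None]            # RBF neuron centres
        gamma = 50.0                                        # kernel width parameter

        def phi(x):                                         # Gaussian RBF feature matrix
            return np.exp(-gamma * (x - centers.T) ** 2)

        w, *_ = np.linalg.lstsq(phi(x_train), f_train, rcond=None)   # one-shot training

        x_test = np.array([[0.37]])                         # cheap run-time evaluation
        print("RBFN:", (phi(x_test) @ w).item(),
              " truth:", np.sin(3 * 0.37) + 0.5 * 0.37**3)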

  19. A mathematical model for Vertical Attitude Takeoff and Landing (VATOL) aircraft simulation. Volume 3: User's manual for VATOL simulation program

    NASA Technical Reports Server (NTRS)

    Fortenbaugh, R. L.

    1980-01-01

    Instructions for using Vertical Attitude Takeoff and Landing Aircraft Simulation (VATLAS), the digital simulation program for application to vertical attitude takeoff and landing (VATOL) aircraft developed for installation on the NASA Ames CDC 7600 computer system are described. The framework for VATLAS is the Off-Line Simulation (OLSIM) routine. The OLSIM routine provides a flexible framework and standardized modules which facilitate the development of off-line aircraft simulations. OLSIM runs under the control of VTOLTH, the main program, which calls the proper modules for executing user specified options. These options include trim, stability derivative calculation, time history generation, and various input-output options.

  20. A general observatory control software framework design for existing small and mid-size telescopes

    NASA Astrophysics Data System (ADS)

    Ge, Liang; Lu, Xiao-Meng; Jiang, Xiao-Jun

    2015-07-01

    A general framework for observatory control software would help to improve the efficiency of observation and operation of telescopes, and would also be advantageous for remote and joint observations. We describe a general framework for observatory control software that considers principles of flexibility and inheritance to meet the expectations of observers and technical personnel. This framework includes observation scheduling, device control and data storage. The design is based on a finite state machine that controls the whole process.
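
    A finite state machine of the kind described is straightforward to sketch as a transition table; the states and events below (IDLE, SLEWING, TRACKING, EXPOSING, etc.) are illustrative assumptions, not the paper's actual design.

        class ObservatoryFSM:
            """Minimal table-driven finite state machine for telescope control."""
            TRANSITIONS = {
                ("IDLE", "slew"): "SLEWING",
                ("SLEWING", "on_target"): "TRACKING",
                ("TRACKING", "expose"): "EXPOSING",
                ("EXPOSING", "readout_done"): "TRACKING",
                ("TRACKING", "stop"): "IDLE",
            }

            def __init__(self):
                self.state = "IDLE"

            def dispatch(self, event):
                key = (self.state, event)
                if key not in self.TRANSITIONS:   # illegal events are rejected, not ignored
                    raise RuntimeError(f"event '{event}' not allowed in state {self.state}")
                self.state = self.TRANSITIONS[key]
                return self.state

        fsm = ObservatoryFSM()
        for ev in ("slew", "on_target", "expose", "readout_done", "stop"):
            print(ev, "->", fsm.dispatch(ev))

    Keeping the transitions in a data table rather than in scattered conditionals is what makes such a design easy to extend for new devices or observing modes.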
