Sample records for extensive simulations based

  1. simulation of the DNA force-extension curve

    NASA Astrophysics Data System (ADS)

    Shinaberry, Gregory; Mikhaylov, Ivan; Balaeff, Alexander

    A molecular dynamics simulation study of the force-extension curve of double-stranded DNA is presented. Extended simulations of the DNA at multiple points along the force-extension curve are conducted with DNA end-to-end length constrained at each point. The calculated force-extension curve qualitatively reproduces the experimental one. The DNA conformational ensemble at each extension shows that the famous plateau of the force-extension curve results from B-DNA melting, whereas the formation of the earlier-predicted novel DNA conformation called 'zip-DNA' takes place at extensions past the plateau. An extensive analysis of the DNA conformational ensemble in terms of base configuration, backbone configuration, solvent interaction energy, etc., is conducted in order to elucidate the physical origin of DNA elasticity and the main interactions responsible for the shape of the force-extension curve.
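
    Below the overstretching plateau, single-molecule force-extension data for dsDNA are commonly fit with the Marko-Siggia worm-like-chain interpolation. The sketch below is not part of this study; it evaluates that standard formula, assuming a typical persistence length of 50 nm and kT ≈ 4.114 pN·nm at room temperature:

```python
def wlc_force(x_frac, persistence_nm=50.0, kT_pN_nm=4.114):
    """Marko-Siggia interpolation for the worm-like-chain force (pN)
    at fractional extension x_frac = x / L_contour, 0 <= x_frac < 1."""
    if not 0.0 <= x_frac < 1.0:
        raise ValueError("fractional extension must lie in [0, 1)")
    # F = (kT / P) * [ 1 / (4 (1 - x/L)^2) - 1/4 + x/L ]
    return (kT_pN_nm / persistence_nm) * (
        0.25 / (1.0 - x_frac) ** 2 - 0.25 + x_frac
    )
```

    The force diverges as the extension approaches the contour length, which is why the entropic-regime curve steepens sharply before plateau and overstretching behavior take over.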

  2. Python-based geometry preparation and simulation visualization toolkits for STEPS

    PubMed Central

    Chen, Weiliang; De Schutter, Erik

    2014-01-01

    STEPS is a stochastic reaction-diffusion simulation engine that implements a spatial extension of Gillespie's Stochastic Simulation Algorithm (SSA) in complex tetrahedral geometries. An extensive Python-based interface is provided to STEPS so that it can interact with the large number of scientific packages in Python. However, a gap existed between the interfaces of these packages and the STEPS user interface, where supporting toolkits could reduce the amount of scripting required for research projects. This paper introduces two new supporting toolkits that support geometry preparation and visualization for STEPS simulations. PMID:24782754
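
    STEPS builds on Gillespie's SSA; as a point of reference, here is a minimal, non-spatial SSA for a single decay reaction A → ∅ in plain Python. This illustrates only the underlying algorithm, not the STEPS API:

```python
import random

def ssa_decay(n0, k, t_end, rng):
    """Gillespie SSA for the single reaction A -> 0 with rate constant k.
    Returns the sampled trajectory as a list of (time, count) pairs."""
    t, n = 0.0, n0
    traj = [(t, n)]
    while n > 0:
        a0 = k * n                # total propensity of the one reaction
        t += rng.expovariate(a0)  # exponentially distributed waiting time
        if t > t_end:
            break
        n -= 1                    # fire the reaction: one A is consumed
        traj.append((t, n))
    return traj
```

    The spatial extension implemented by STEPS partitions the volume into tetrahedra and adds diffusion events between neighbouring elements to the same propensity bookkeeping.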

  3. An object oriented Python interface for atomistic simulations

    NASA Astrophysics Data System (ADS)

    Hynninen, T.; Himanen, L.; Parkkinen, V.; Musso, T.; Corander, J.; Foster, A. S.

    2016-01-01

    Programmable simulation environments allow one to monitor and control calculations efficiently and automatically before, during, and after runtime. An environment accessible directly from a programming language can be interfaced with powerful external analysis tools and extensions that enhance the functionality of the core program, and by incorporating a flexible object-based structure, such environments make building and analysing computational setups intuitive. In this work, we present a classical atomistic force field with an interface written in the Python language. The program is an extension for an existing object-based atomistic simulation environment.

  4. Real-time maritime scene simulation for ladar sensors

    NASA Astrophysics Data System (ADS)

    Christie, Chad L.; Gouthas, Efthimios; Swierkowski, Leszek; Williams, Owen M.

    2011-06-01

    Continuing interest exists in the development of cost-effective synthetic environments for testing Laser Detection and Ranging (ladar) sensors. In this paper we describe a PC-based system for real-time ladar scene simulation of ships and small boats in a dynamic maritime environment. In particular, we describe the techniques employed to generate range imagery accompanied by passive radiance imagery. Our ladar scene generation system is an evolutionary extension of the VIRSuite infrared scene simulation program and includes all previous features such as ocean wave simulation, the physically-realistic representation of boat and ship dynamics, wake generation and simulation of whitecaps, spray, wake trails and foam. A terrain simulation extension is also under development. In this paper we outline the development, capabilities and limitations of the VIRSuite extensions.

  5. Simulating dynamic and mixed-severity fire regimes: a process-based fire extension for LANDIS-II

    Treesearch

    Brian R. Sturtevant; Robert M. Scheller; Brian R. Miranda; Douglas Shinneman; Alexandra Syphard

    2009-01-01

    Fire regimes result from reciprocal interactions between vegetation and fire that may be further affected by other disturbances, including climate, landform, and terrain. In this paper, we describe fire and fuel extensions for the forest landscape simulation model, LANDIS-II, that allow dynamic interactions among fire, vegetation, climate, and landscape structure, and...

  6. An actuator extension transformation for a motion simulator and an inverse transformation applying Newton-Raphson's method

    NASA Technical Reports Server (NTRS)

    Dieudonne, J. E.

    1972-01-01

    A set of equations which transform position and angular orientation of the centroid of the payload platform of a six-degree-of-freedom motion simulator into extensions of the simulator's actuators has been derived and is based on a geometrical representation of the system. An iterative scheme, Newton-Raphson's method, has been successfully used in a real time environment in the calculation of the position and angular orientation of the centroid of the payload platform when the magnitude of the actuator extensions is known. Sufficient accuracy is obtained by using only one Newton-Raphson iteration per integration step of the real time environment.
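
    The inverse transformation described above can be illustrated with a simplified planar analogue: two actuators anchored at hypothetical base points, with Newton-Raphson used to recover the platform point from the actuator lengths. The geometry and function names here are invented for illustration; the actual simulator uses six actuators and a full 6-DOF geometric model:

```python
def platform_position(l1, l2, d=2.0, guess=(1.0, 1.0), iters=1):
    """Newton-Raphson recovery of a planar platform point (x, y) from two
    actuator lengths l1, l2, anchored at base points (0, 0) and (d, 0)."""
    x, y = guess
    for _ in range(iters):
        # residuals of the squared-length equations
        f1 = x * x + y * y - l1 * l1
        f2 = (x - d) ** 2 + y * y - l2 * l2
        # Jacobian of (f1, f2) with respect to (x, y)
        j11, j12 = 2 * x, 2 * y
        j21, j22 = 2 * (x - d), 2 * y
        # one Newton step: solve J * delta = f by Cramer's rule, subtract
        det = j11 * j22 - j12 * j21
        x -= (f1 * j22 - f2 * j12) / det
        y -= (j11 * f2 - j21 * f1) / det
    return x, y
```

    As the abstract notes, in a real-time loop a warm start from the previous integration step makes a single iteration per step sufficiently accurate; the `iters` parameter models that choice.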

  7. Estimating canopy bulk density and canopy base height for conifer stands in the interior Western United States using the Forest Vegetation Simulator Fire and Fuels Extension.

    Treesearch

    Seth Ex; Frederick Smith; Tara Keyser; Stephanie Rebain

    2017-01-01

    The Forest Vegetation Simulator Fire and Fuels Extension (FFE-FVS) is often used to estimate canopy bulk density (CBD) and canopy base height (CBH), which are key indicators of crown fire hazard for conifer stands in the Western United States. Estimated CBD from FFE-FVS is calculated as the maximum 4 m running mean bulk density of predefined 0.3 m thick canopy layers (...
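
    The CBD calculation described (maximum 4 m running mean over 0.3 m layers) can be sketched directly. This is an illustration of the stated definition, not FFE-FVS code, and the layer profile used below is hypothetical:

```python
def canopy_bulk_density(layer_bd, layer_m=0.3, window_m=4.0):
    """Estimate canopy bulk density (kg/m^3) as the maximum running mean
    of thin-layer bulk densities over a vertical window, in the style of
    FFE-FVS (0.3 m layers, 4 m window).  Returns 0.0 if the profile is
    shorter than one window."""
    n = max(1, round(window_m / layer_m))   # layers per window (~13)
    best = 0.0
    for i in range(len(layer_bd) - n + 1):
        best = max(best, sum(layer_bd[i:i + n]) / n)
    return best
```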

  8. Emissions from Open Burning of Simulated Military Waste from Forward Operating Bases

    EPA Science Inventory

    Emissions from open burning of simulated military waste from forward operating bases (FOBs) were extensively characterized as an initial step in assessing potential inhalation exposure of FOB personnel and future disposal alternatives. Emissions from two different burning scenar...

  9. Live tree carbon stock equivalence of fire and fuels extension to the Forest Vegetation Simulator and Forest Inventory and Analysis approaches

    Treesearch

    James E. Smith; Coeli M. Hoover

    2017-01-01

    The carbon reports in the Fire and Fuels Extension (FFE) to the Forest Vegetation Simulator (FVS) provide two alternate approaches to carbon estimates for live trees (Rebain 2010). These are (1) the FFE biomass algorithms, which are volume-based biomass equations, and (2) the Jenkins allometric equations (Jenkins and others 2003), which are diameter based. Here, we...

  10. A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON

    PubMed Central

    King, James G.; Hines, Michael; Hill, Sean; Goodman, Philip H.; Markram, Henry; Schürmann, Felix

    2008-01-01

    As neuronal simulations approach larger scales with increasing levels of detail, the neurosimulator software represents only a part of a chain of tools ranging from setup, simulation, interaction with virtual environments to analysis and visualizations. Previously published approaches to abstracting simulator engines have not received widespread acceptance, which in part may be due to the fact that they tried to address the challenge of solving the model specification problem. Here, we present an approach that uses a neurosimulator, in this case NEURON, to describe and instantiate the network model in the simulator's native model language but then replaces the main integration loop with its own. Existing parallel network models are easily adapted to run in the presented framework. The presented approach is thus an extension to NEURON but uses a component-based architecture to allow for replaceable spike exchange components and pluggable components for monitoring, analysis, or control that can run in this framework alongside the simulation. PMID:19430597

  11. Extending MAM5 Meta-Model and JaCalIVE Framework to Integrate Smart Devices from Real Environments.

    PubMed

    Rincon, J A; Poza-Lujan, Jose-Luis; Julian, V; Posadas-Yagüe, Juan-Luis; Carrascosa, C

    2016-01-01

    This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a real and a virtual world coupled, so that the virtual world not only reflects the real one, but also complements it. A new component called a smart resource artifact, which enables modelling and developing devices to access the real physical world, and a human-in-the-loop agent to place a human in the system have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system where agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations makes it possible to minimize the error between the simulated and the real system.

  12. Extending MAM5 Meta-Model and JaCalIVE Framework to Integrate Smart Devices from Real Environments

    PubMed Central

    2016-01-01

    This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a real and a virtual world coupled, so that the virtual world not only reflects the real one, but also complements it. A new component called a smart resource artifact, which enables modelling and developing devices to access the real physical world, and a human-in-the-loop agent to place a human in the system have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system where agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations makes it possible to minimize the error between the simulated and the real system. PMID:26926691

  13. LOOS: an extensible platform for the structural analysis of simulations.

    PubMed

    Romo, Tod D; Grossfield, Alan

    2009-01-01

    We have developed LOOS (Lightweight Object-Oriented Structure-analysis library) as an object-oriented library designed to facilitate the rapid development of tools for the structural analysis of simulations. LOOS supports the native file formats of most common simulation packages including AMBER, CHARMM, CNS, Gromacs, NAMD, Tinker, and X-PLOR. Encapsulation and polymorphism are used to simultaneously provide a stable interface to the programmer and make LOOS easily extensible. A rich atom selection language based on the C expression syntax is included as part of the library. LOOS enables students and casual programmer-scientists to rapidly write their own analytical tools in a compact and expressive manner resembling scripting. LOOS is written in C++ and makes extensive use of the Standard Template Library and Boost, and is freely available under the GNU General Public License (version 3). LOOS has been tested on Linux and MacOS X, but is written to be portable and should work on most Unix-based platforms.

  14. Signal treatments to reduce heavy vehicle crash-risk at metropolitan highway intersections.

    PubMed

    Archer, Jeffery; Young, William

    2009-05-01

    Heavy vehicle red-light running at intersections is a common safety problem that has severe consequences. This paper investigates alternative signal treatments that address this issue. A micro-simulation analysis approach was adopted as a precursor to a field trial. The simulation model emulated traffic conditions at a known problem intersection and provided a baseline measure to compare the effects of: an extension of amber time; an extension of green for heavy vehicles detected in the dilemma zone at the onset of amber; an extension of the all-red safety-clearance time based on the detection of vehicles considered likely to run the red light at two detector locations during amber; an extension of the all-red safety-clearance time based on the detection of potential red-light runners during amber or red; and a combination of the second and fourth alternatives. Results suggested safety improvements for all treatments. An extension of amber provided the best safety effect but is known to be prone to behavioural adaptation effects and wastes traffic movement time unnecessarily. A green extension for heavy vehicles detected in the dilemma zone and an all-red extension for potential red-light runners were deemed to provide a sustainable safety improvement and operational efficiency.
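
    The green-extension treatment for heavy vehicles detected in the dilemma zone can be sketched as a simple decision rule. The time thresholds and maximum extension below are hypothetical placeholders, not the values used in the study:

```python
def green_extension(speed_mps, dist_to_stopline_m, is_heavy,
                    at_amber_onset, max_extension_s=4.0):
    """Return a green-extension interval (s) for a heavy vehicle detected
    in the dilemma zone at the onset of amber.  Thresholds are illustrative."""
    if not (is_heavy and at_amber_onset):
        return 0.0
    time_to_stopline = dist_to_stopline_m / max(speed_mps, 0.1)
    # dilemma zone: too close to stop comfortably, too far to clear on amber
    in_dilemma_zone = 2.5 < time_to_stopline < 5.5
    if not in_dilemma_zone:
        return 0.0
    # extend green just long enough for the vehicle to clear, capped
    return min(time_to_stopline, max_extension_s)
```

    The all-red extension alternatives work analogously, but trigger on vehicles classified as likely red-light runners during amber or red rather than on detection at amber onset.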

  15. Physics based modeling of a series parallel battery pack for asymmetry analysis, predictive control and life extension

    NASA Astrophysics Data System (ADS)

    Ganesan, Nandhini; Basu, Suman; Hariharan, Krishnan S.; Kolake, Subramanya Mayya; Song, Taewon; Yeo, Taejung; Sohn, Dong Kee; Doo, Seokgwang

    2016-08-01

    Lithium-Ion batteries used for electric vehicle applications are subject to large currents and various operation conditions, making battery pack design and life extension a challenging problem. With increasing complexity, modeling and simulation can lead to insights that ensure optimal performance and life extension. In this manuscript, an electrochemical-thermal (ECT) coupled model for a 6 series × 5 parallel pack is developed for Li ion cells with NCA/C electrodes and validated against experimental data. Contribution of the cathode to overall degradation at various operating conditions is assessed. Pack asymmetry is analyzed from a design and an operational perspective. Design-based asymmetry leads to a new approach of obtaining the individual cell responses of the pack from an average ECT output. Operational asymmetry is demonstrated in terms of effects of thermal gradients on cycle life, and an efficient model predictive control technique is developed. The concept of a reconfigurable battery pack is studied using detailed simulations that can be used for effective monitoring and extension of battery pack life.
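
    Design-based asymmetry in a parallel group can be illustrated to first order with a purely resistive current-sharing sketch (equal terminal voltage across parallel cells). This is not the authors' electrochemical-thermal model, and the resistance values below are hypothetical:

```python
def parallel_currents(group_current, resistances):
    """Split a parallel group's current among cells inversely to their
    internal resistances (all cells see the same terminal voltage)."""
    conductances = [1.0 / r for r in resistances]
    g_total = sum(conductances)
    return [group_current * g / g_total for g in conductances]
```

    Even this first-order picture shows the asymmetry mechanism: a cell with higher internal resistance (e.g. aged or hotter) carries less current than its neighbours, so the cells in a group degrade at different rates.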

  16. A Three-Dimensional Eulerian Code for Simulation of High-Speed Multimaterial Interactions

    DTIC Science & Technology

    2011-08-01

    PDE-based extension. The extension process is done on only the host cells on a particular processor. After extension the parallel communication is ... condensation shocks, explosive debris transport, detonation in heterogeneous media and so on. In these flows complex interactions occur between the ... A.22] and Ω_ij is the spin tensor. The Jaumann derivative is used to ensure objectivity of the stress tensor with respect to rotation.

  17. Developing a Conceptual Architecture for a Generalized Agent-based Modeling Environment (GAME)

    DTIC Science & Technology

    2008-03-01

    4. REPAST (Java, Python, C#, Open Source) ... 5. MASON: Multi-Agent Modeling Language (Swarm Extension ... Python, C#, Open Source). Repast (Recursive Porous Agent Simulation Toolkit) was designed for building agent-based models and simulations in the ... Repast makes it easy for inexperienced users to build models by including a built-in simple model and providing interfaces through which menus and Python

  18. In Vivo Investigation of the Effectiveness of a Hyper-viscoelastic Model in Simulating Brain Retraction

    NASA Astrophysics Data System (ADS)

    Li, Ping; Wang, Weiwei; Zhang, Chenxi; An, Yong; Song, Zhijian

    2016-07-01

    Intraoperative brain retraction leads to a misalignment between the intraoperative positions of the brain structures and their previous positions, as determined from preoperative images. In vitro swine brain sample uniaxial tests showed that the mechanical response of brain tissue to compression and extension could be described by the hyper-viscoelasticity theory. The brain retraction caused by the mechanical process is a combination of brain tissue compression and extension. In this paper, we first constructed a hyper-viscoelastic framework based on the extended finite element method (XFEM) to simulate intraoperative brain retraction. To explore its effectiveness, we then applied this framework to an in vivo brain retraction simulation. The simulation strictly followed the clinical scenario, in which seven swine were subjected to brain retraction. Our experimental results showed that the hyper-viscoelastic XFEM framework is capable of simulating intraoperative brain retraction and improving the navigation accuracy of an image-guided neurosurgery system (IGNS).

  19. On-time reliability impacts of advanced traveler information services (ATIS). Volume II, Extensions and applications of the simulated yoked study concept

    DOT National Transportation Integrated Search

    2002-03-01

    In a simulated yoked study, estimates of roadway travel times are archived from web-based Advanced Traveler Information Systems (ATIS) and used to recreate hypothetical, retrospective paired driving trials between travelers with and without ATIS. Prev...

  20. Kinematic modeling of a double octahedral Variable Geometry Truss (VGT) as an extensible gimbal

    NASA Technical Reports Server (NTRS)

    Williams, Robert L., II

    1994-01-01

    This paper presents the complete forward and inverse kinematics solutions for control of the three degree-of-freedom (DOF) double octahedral variable geometry truss (VGT) module as an extensible gimbal. A VGT is a truss structure partially comprised of linearly actuated members. A VGT can be used as joints in a large, lightweight, high load-bearing manipulator for earth- and space-based remote operations, plus industrial applications. The results have been used to control the NASA VGT hardware as an extensible gimbal, demonstrating the capability of this device to be a joint in a VGT-based manipulator. This work is an integral part of a VGT-based manipulator design, simulation, and control tool.

  1. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    ERIC Educational Resources Information Center

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…
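
    A generic XML event-log layout of the kind the report proposes might look like the following sketch using Python's standard library. The element and attribute names here are invented for illustration, not the ETS schema:

```python
import xml.etree.ElementTree as ET

def build_log(events):
    """Serialize (time, actor, action) assessment events into a simple
    XML log.  Element/attribute names are illustrative only."""
    root = ET.Element("gsbaLog")
    for t, actor, action in events:
        ET.SubElement(root, "event",
                      time=f"{t:.3f}", actor=actor, action=action)
    return ET.tostring(root, encoding="unicode")

def count_actions(xml_text, action):
    """Parse a log and count events carrying the given action."""
    root = ET.fromstring(xml_text)
    return sum(1 for e in root.iter("event")
               if e.get("action") == action)
```

    The point of fixing such a schema is that downstream analysis tools can query any conforming log uniformly, instead of writing ad hoc parsers per game or simulation.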

  2. OpenMM 4: A Reusable, Extensible, Hardware Independent Library for High Performance Molecular Simulation.

    PubMed

    Eastman, Peter; Friedrichs, Mark S; Chodera, John D; Radmer, Randall J; Bruns, Christopher M; Ku, Joy P; Beauchamp, Kyle A; Lane, Thomas J; Wang, Lee-Ping; Shukla, Diwakar; Tye, Tony; Houston, Mike; Stich, Timo; Klein, Christoph; Shirts, Michael R; Pande, Vijay S

    2013-01-08

    OpenMM is a software toolkit for performing molecular simulations on a range of high performance computing architectures. It is based on a layered architecture: the lower layers function as a reusable library that can be invoked by any application, while the upper layers form a complete environment for running molecular simulations. The library API hides all hardware-specific dependencies and optimizations from the users and developers of simulation programs: they can be run without modification on any hardware on which the API has been implemented. The current implementations of OpenMM include support for graphics processing units using the OpenCL and CUDA frameworks. In addition, OpenMM was designed to be extensible, so new hardware architectures can be accommodated and new functionality (e.g., energy terms and integrators) can be easily added.

  3. OpenMM 4: A Reusable, Extensible, Hardware Independent Library for High Performance Molecular Simulation

    PubMed Central

    Eastman, Peter; Friedrichs, Mark S.; Chodera, John D.; Radmer, Randall J.; Bruns, Christopher M.; Ku, Joy P.; Beauchamp, Kyle A.; Lane, Thomas J.; Wang, Lee-Ping; Shukla, Diwakar; Tye, Tony; Houston, Mike; Stich, Timo; Klein, Christoph; Shirts, Michael R.; Pande, Vijay S.

    2012-01-01

    OpenMM is a software toolkit for performing molecular simulations on a range of high performance computing architectures. It is based on a layered architecture: the lower layers function as a reusable library that can be invoked by any application, while the upper layers form a complete environment for running molecular simulations. The library API hides all hardware-specific dependencies and optimizations from the users and developers of simulation programs: they can be run without modification on any hardware on which the API has been implemented. The current implementations of OpenMM include support for graphics processing units using the OpenCL and CUDA frameworks. In addition, OpenMM was designed to be extensible, so new hardware architectures can be accommodated and new functionality (e.g., energy terms and integrators) can be easily added. PMID:23316124

  4. Comparison of Nonlinear Random Response Using Equivalent Linearization and Numerical Simulation

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Muravyov, Alexander A.

    2000-01-01

    A recently developed finite-element-based equivalent linearization approach for the analysis of random vibrations of geometrically nonlinear multiple degree-of-freedom structures is validated. The validation is based on comparisons with results from a finite element based numerical simulation analysis using a numerical integration technique in physical coordinates. In particular, results for the case of a clamped-clamped beam are considered for an extensive load range to establish the limits of validity of the equivalent linearization approach.

  5. iCrowd: agent-based behavior modeling and crowd simulator

    NASA Astrophysics Data System (ADS)

    Kountouriotis, Vassilios I.; Paterakis, Manolis; Thomopoulos, Stelios C. A.

    2016-05-01

    Initially designed in the context of the TASS (Total Airport Security System) FP-7 project, the Crowd Simulation platform developed by the Integrated Systems Lab of the Institute of Informatics and Telecommunications at N.C.S.R. Demokritos, has evolved into a complete domain-independent agent-based behavior simulator with an emphasis on crowd behavior and building evacuation simulation. Under continuous development, it reflects an effort to implement a modern, multithreaded, data-oriented simulation engine employing latest state-of-the-art programming technologies and paradigms. It is based on an extensible architecture that separates core services from the individual layers of agent behavior, offering a concrete simulation kernel designed for high-performance and stability. Its primary goal is to deliver an abstract platform to facilitate implementation of several Agent-Based Simulation solutions with applicability in several domains of knowledge, such as: (i) Crowd behavior simulation during [in/out]door evacuation. (ii) Non-Player Character AI for Game-oriented applications and Gamification activities. (iii) Vessel traffic modeling and simulation for Maritime Security and Surveillance applications. (iv) Urban and Highway Traffic and Transportation Simulations. (v) Social Behavior Simulation and Modeling.

  6. A generalized weight-based particle-in-cell simulation scheme

    NASA Astrophysics Data System (ADS)

    Lee, W. W.; Jenkins, T. G.; Ethier, S.

    2011-03-01

    A generalized weight-based particle simulation scheme suitable for simulating magnetized plasmas, where the zeroth-order inhomogeneity is important, is presented. The scheme is an extension of the perturbative simulation schemes developed earlier for particle-in-cell (PIC) simulations. The new scheme is designed to simulate both the perturbed distribution (δf) and the full distribution (full-F) within the same code. The development is based on the concept of multiscale expansion, which separates the scale lengths of the background inhomogeneity from those associated with the perturbed distributions. The potential advantage for such an arrangement is to minimize the particle noise by using δf in the linear stage of the simulation, while retaining the flexibility of a full-F capability in the fully nonlinear stage of the development when signals associated with plasma turbulence are at a much higher level than those from the intrinsic particle noise.
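
    The noise advantage of the δf approach can be illustrated with a toy Monte Carlo estimate: sampling only a small perturbation on top of an analytically known background yields far less statistical noise than sampling the full distribution. The functions below are arbitrary test functions chosen for the illustration, not a plasma model:

```python
import math, random, statistics

def estimate(n_markers, rng, use_delta_f):
    """One Monte Carlo estimate of the integral of f = f0 + delta on [0, 1],
    where f0(x) = 1 + 0.5*sin(2*pi*x) is a known background (integral 1)
    and delta(x) = 0.02*x is a small perturbation (integral 0.01)."""
    xs = [rng.random() for _ in range(n_markers)]
    if use_delta_f:
        # sample only the perturbation; add the analytic background integral
        return 1.0 + statistics.fmean(0.02 * x for x in xs)
    # full-F: sample the entire distribution, background noise included
    return statistics.fmean(1.0 + 0.5 * math.sin(2 * math.pi * x) + 0.02 * x
                            for x in xs)

rng = random.Random(0)
full  = [estimate(200, rng, False) for _ in range(300)]
delta = [estimate(200, rng, True) for _ in range(300)]
# both estimators target 1.01, but the delta-f spread is far smaller
```

    The same logic motivates the scheme in the abstract: while the perturbation is small relative to the background, evolving δf weights keeps the particle noise at the scale of δf rather than of F, and the full-F mode can take over once the turbulent signal dominates the noise.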

  7. Monte Carlo Simulations and Generation of the SPI Response

    NASA Technical Reports Server (NTRS)

    Sturner, S. J.; Shrader, C. R.; Weidenspointner, G.; Teegarden, B. J.; Attie, D.; Diehl, R.; Ferguson, C.; Jean, P.; vonKienlin, A.

    2003-01-01

    In this paper we discuss the methods developed for the production of the INTEGRAL/SPI instrument response. The response files were produced using a suite of Monte Carlo simulation software developed at NASA/GSFC based on the GEANT-3 package available from CERN. The production of the INTEGRAL/SPI instrument response also required the development of a detailed computer mass model for SPI. We discuss our extensive investigations into methods to reduce both the computation time and storage requirements for the SPI response. We also discuss corrections to the simulated response based on our comparison of ground and inflight calibration data with MGEANT simulation.

  8. Monte Carlo Simulations and Generation of the SPI Response

    NASA Technical Reports Server (NTRS)

    Sturner, S. J.; Shrader, C. R.; Weidenspointner, G.; Teegarden, B. J.; Attie, D.; Cordier, B.; Diehl, R.; Ferguson, C.; Jean, P.; vonKienlin, A.

    2003-01-01

    In this paper we discuss the methods developed for the production of the INTEGRAL/SPI instrument response. The response files were produced using a suite of Monte Carlo simulation software developed at NASA/GSFC based on the GEANT-3 package available from CERN. The production of the INTEGRAL/SPI instrument response also required the development of a detailed computer mass model for SPI. We discuss our extensive investigations into methods to reduce both the computation time and storage requirements for the SPI response. We also discuss corrections to the simulated response based on our comparison of ground and in-flight calibration data with MGEANT simulations.

  9. The effect of resistance level and stability demands on recruitment patterns and internal loading of spine in dynamic flexion and extension using a simple trunk model.

    PubMed

    Zeinali-Davarani, Shahrokh; Shirazi-Adl, Aboulfazl; Dariush, Behzad; Hemami, Hooshang; Parnianpour, Mohamad

    2011-07-01

    The effects of external resistance on the recruitment of trunk muscles in sagittal movements and the coactivation mechanism to maintain spinal stability were investigated using a simple computational model of iso-resistive spine sagittal movements. Neural excitation of muscles was attained based on an inverse dynamics approach along with a stability-based optimisation. The trunk flexion and extension movements between 60° flexion and the upright posture against various resistance levels were simulated. Incorporation of the stability constraint in the optimisation algorithm required higher antagonistic activities for all resistance levels, mostly close to the upright position. Extension movements showed higher coactivation with higher resistance, whereas flexion movements demonstrated lower coactivation, indicating a greater stability demand in backward extension movements against higher resistance in the neighbourhood of the upright posture. Optimal extension profiles based on minimum jerk, work and power had distinct kinematic profiles which led to recruitment patterns with different timing and amplitude of activation.

  10. Urbanization and watershed sustainability: Collaborative simulation modeling of future development states

    NASA Astrophysics Data System (ADS)

    Randhir, Timothy O.; Raposa, Sarah

    2014-11-01

    Urbanization has a significant impact on water resources and requires a watershed-based approach to evaluate impacts of land use and urban development on watershed processes. This study uses a simulation with urban policy scenarios to model and develop transferable recommendations for municipalities and cities to guide urban decisions using watershed ecohydrologic principles. The watershed simulation model is used to evaluate intensive (policy in existing built regions) and extensive (policy outside existing built regions) urban development scenarios with and without implementation of Best Management Practices (BMPs). Water quantity and quality changes are simulated to assess the effectiveness of five urban development scenarios. It is observed that an optimal combination of intensive and extensive strategies can be used to sustain urban ecosystems. BMPs are found to be critical in reducing the stormwater and water quality impacts of urban development. Conservation zoning and incentives for voluntary adoption of BMPs can be used in sustaining urbanizing watersheds.

  11. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    The following reports are presented on this project: a first-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; a second-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.

  12. Charge plasma technique based dopingless accumulation mode junctionless cylindrical surrounding gate MOSFET: analog performance improvement

    NASA Astrophysics Data System (ADS)

    Trivedi, Nitin; Kumar, Manoj; Haldar, Subhasis; Deswal, S. S.; Gupta, Mridula; Gupta, R. S.

    2017-09-01

    A charge plasma technique based dopingless (DL) accumulation mode (AM) junctionless (JL) cylindrical surrounding gate (CSG) MOSFET has been proposed and extensively investigated. The proposed device has no physical junction at the source-to-channel or channel-to-drain interface. The complete silicon pillar is considered undoped. The high free-electron density, or induced N+ region, is obtained by keeping the work function of the source/drain metal contacts lower than the work function of undoped silicon. Thus, fabrication complexity is drastically reduced by curbing the requirement of high-temperature doping techniques. The electrical/analog characteristics of the proposed device have been extensively investigated using numerical simulation and are compared with a conventional junctionless cylindrical surrounding gate (JL-CSG) MOSFET of identical dimensions. The ATLAS-3D device simulator is used for the numerical simulations. The results show that the proposed device is more immune to short-channel effects than the conventional JL-CSG MOSFET and is suitable for faster switching applications due to its higher I_ON/I_OFF ratio.

  13. In Vivo Investigation of the Effectiveness of a Hyper-viscoelastic Model in Simulating Brain Retraction

    PubMed Central

    Li, Ping; Wang, Weiwei; Zhang, Chenxi; An, Yong; Song, Zhijian

    2016-01-01

    Intraoperative brain retraction leads to a misalignment between the intraoperative positions of the brain structures and their previous positions, as determined from preoperative images. In vitro swine brain sample uniaxial tests showed that the mechanical response of brain tissue to compression and extension could be described by the hyper-viscoelasticity theory. The brain retraction caused by the mechanical process is a combination of brain tissue compression and extension. In this paper, we first constructed a hyper-viscoelastic framework based on the extended finite element method (XFEM) to simulate intraoperative brain retraction. To explore its effectiveness, we then applied this framework to an in vivo brain retraction simulation. The simulation strictly followed the clinical scenario, in which seven swine were subjected to brain retraction. Our experimental results showed that the hyper-viscoelastic XFEM framework is capable of simulating intraoperative brain retraction and improving the navigation accuracy of an image-guided neurosurgery system (IGNS). PMID:27387301

  14. Simulator certification methods and the vertical motion simulator

    NASA Technical Reports Server (NTRS)

    Showalter, T. W.

    1981-01-01

    The vertical motion simulator (VMS) is designed to simulate a variety of experimental helicopter and STOL/VTOL aircraft, as well as other kinds of aircraft with special pitch and Z-axis characteristics. The VMS includes a large motion base with extensive vertical and lateral travel capabilities, a computer-generated-image visual system, and a high-speed CDC 7600 computer system, which performs the aero model calculations. Guidelines on how to measure and evaluate VMS performance were developed, and a survey of simulation users was conducted to ascertain how they evaluated and certified simulators for use. The results are presented.

  15. The Roland Maze Project school-based extensive air shower network

    NASA Astrophysics Data System (ADS)

    Feder, J.; Jȩdrzejczak, K.; Karczmarczyk, J.; Lewandowski, R.; Swarzyński, J.; Szabelska, B.; Szabelski, J.; Wibig, T.

    2006-01-01

    We plan to construct a large-area network of extensive air shower detectors placed on the roofs of high school buildings in the city of Łódź. Detection points will be connected by Internet to a central server, and their operation will be synchronized by GPS. The main scientific goal of the project is the study of ultra-high-energy cosmic rays. Using existing town infrastructure (Internet, power supply, etc.) will significantly reduce the cost of the experiment. Engaging high school students in the research program should significantly increase their knowledge of science and modern technologies, and can be a very efficient way of popularising science. We performed simulations of the projected network's capability to register extensive air showers and to reconstruct the energies of primary particles. Results of the simulations and the current status of the project's realisation are presented.

  16. Mesoscale Particle-Based Model of Electrophoresis

    DOE PAGES

    Giera, Brian; Zepeda-Ruiz, Luis A.; Pascall, Andrew J.; ...

    2015-07-31

    Here, we develop and evaluate a semi-empirical particle-based model of electrophoresis using extensive mesoscale simulations. We parameterize the model using only measurable quantities from a broad set of colloidal suspensions with properties that span the experimentally relevant regime. With sufficient sampling, simulated diffusivities and electrophoretic velocities match predictions of the ubiquitous Stokes-Einstein and Henry equations, respectively. This agreement holds for non-polar and aqueous solvents or ionic liquid colloidal suspensions under a wide range of applied electric fields.
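The two reference relations named above are standard: the Stokes-Einstein diffusivity and Henry's electrophoretic velocity. A minimal sketch of both, using Ohshima's closed-form approximation to Henry's function (an assumption for illustration; the abstract does not specify which form of the Henry equation was used):

```python
import math

def stokes_einstein_diffusivity(kT, eta, a):
    """Diffusion coefficient D = kT / (6*pi*eta*a) for a sphere of radius a."""
    return kT / (6.0 * math.pi * eta * a)

def henry_function(kappa_a):
    """Ohshima's closed-form approximation to Henry's function f(kappa*a).

    Interpolates between the Hueckel limit f -> 1 (kappa*a -> 0) and the
    Smoluchowski limit f -> 1.5 (kappa*a -> infinity)."""
    delta = 2.5 / (kappa_a * (1.0 + 2.0 * math.exp(-kappa_a)))
    return 1.0 + 0.5 / (1.0 + delta) ** 3

def henry_velocity(eps, zeta, E, eta, kappa_a):
    """Electrophoretic velocity v = 2*eps*zeta*E*f(kappa*a) / (3*eta)."""
    return 2.0 * eps * zeta * E * henry_function(kappa_a) / (3.0 * eta)
```

With sufficient sampling, the simulated diffusivities and drift velocities in the study are compared against these two expressions across the surveyed solvent regimes.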

  17. Mesoscale Particle-Based Model of Electrophoresis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giera, Brian; Zepeda-Ruiz, Luis A.; Pascall, Andrew J.

    Here, we develop and evaluate a semi-empirical particle-based model of electrophoresis using extensive mesoscale simulations. We parameterize the model using only measurable quantities from a broad set of colloidal suspensions with properties that span the experimentally relevant regime. With sufficient sampling, simulated diffusivities and electrophoretic velocities match predictions of the ubiquitous Stokes-Einstein and Henry equations, respectively. This agreement holds for non-polar and aqueous solvents or ionic liquid colloidal suspensions under a wide range of applied electric fields.

  18. PeneloPET, a Monte Carlo PET simulation tool based on PENELOPE: features and validation

    NASA Astrophysics Data System (ADS)

    España, S; Herraiz, J L; Vicente, E; Vaquero, J J; Desco, M; Udias, J M

    2009-03-01

    Monte Carlo simulations play an important role in positron emission tomography (PET) imaging, as an essential tool for the research and development of new scanners and for advanced image reconstruction. PeneloPET, a PET-dedicated Monte Carlo tool, is presented and validated in this work. PeneloPET is based on PENELOPE, a Monte Carlo code for the simulation of the transport in matter of electrons, positrons and photons, with energies from a few hundred eV to 1 GeV. PENELOPE is robust, fast and very accurate, but it may be unfriendly to people not acquainted with the FORTRAN programming language. PeneloPET is an easy-to-use application which allows comprehensive simulations of PET systems within PENELOPE. Complex and realistic simulations can be set by modifying a few simple input text files. Different levels of output data are available for analysis, from sinogram and lines-of-response (LORs) histogramming to fully detailed list mode. These data can be further exploited with the preferred programming language, including ROOT. PeneloPET simulates PET systems based on crystal array blocks coupled to photodetectors and allows the user to define radioactive sources, detectors, shielding and other parts of the scanner. The acquisition chain is simulated in high level detail; for instance, the electronic processing can include pile-up rejection mechanisms and time stamping of events, if desired. This paper describes PeneloPET and shows the results of extensive validations and comparisons of simulations against real measurements from commercial acquisition systems. PeneloPET is being extensively employed to improve the image quality of commercial PET systems and for the development of new ones.

  19. Tracking of Maneuvering Complex Extended Object with Coupled Motion Kinematics and Extension Dynamics Using Range Extent Measurements

    PubMed Central

    Sun, Lifan; Ji, Baofeng; Lan, Jian; He, Zishu; Pu, Jiexin

    2017-01-01

    The key to successful maneuvering complex extended object tracking (MCEOT) using range extent measurements provided by high-resolution sensors lies in accurate and effective modeling of both the extension dynamics and the centroid kinematics. During object maneuvers, the extension dynamics of an object with a complex shape is highly coupled with the centroid kinematics. However, this difficult but important problem is rarely considered and solved explicitly. This paper therefore proposes a general approach to modeling a maneuvering complex extended object based on the Minkowski sum, so that coupled turn maneuvers in both the centroid states and the extensions can be described accurately. The new model has a concise and unified form, in which the complex extension dynamics can be simply and jointly characterized by multiple simple sub-objects' extension dynamics via the Minkowski sum. The proposed maneuvering model fits range extent measurements very well due to its favorable properties. Based on this model, an MCEOT algorithm dealing with motion and extension maneuvers is also derived. Two different cases of turn maneuvers, with known and unknown turn rates, are specifically considered. The proposed algorithm, which jointly estimates the kinematic state and the object extension, can also be easily implemented. Simulation results demonstrate the effectiveness of the proposed modeling and tracking approaches. PMID:28937629

  20. Ground-based facilities for simulation of microgravity: organism-specific recommendations for their use, and recommended terminology.

    PubMed

    Herranz, Raul; Anken, Ralf; Boonstra, Johannes; Braun, Markus; Christianen, Peter C M; de Geest, Maarten; Hauslage, Jens; Hilbig, Reinhard; Hill, Richard J A; Lebert, Michael; Medina, F Javier; Vagt, Nicole; Ullrich, Oliver; van Loon, Jack J W A; Hemmersbach, Ruth

    2013-01-01

    Research in microgravity is indispensable to disclose the impact of gravity on biological processes and organisms. However, research in the near-Earth orbit is severely constrained by the limited number of flight opportunities. Ground-based simulators of microgravity are valuable tools for preparing spaceflight experiments, but they also facilitate stand-alone studies and thus provide additional and cost-efficient platforms for gravitational research. The various microgravity simulators that are frequently used by gravitational biologists are based on different physical principles. This comparative study gives an overview of the most frequently used microgravity simulators and demonstrates their individual capacities and limitations. The range of applicability of the various ground-based microgravity simulators for biological specimens was carefully evaluated by using organisms that have been studied extensively under the conditions of real microgravity in space. In addition, current heterogeneous terminology is discussed critically, and recommendations are given for appropriate selection of adequate simulators and consistent use of nomenclature.

  1. Modeling and Simulation of Shuttle Launch and Range Operations

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge; Thirumalainambi, Rajkumar

    2004-01-01

    The simulation and modeling test bed is based on a mockup of a space flight operations control suitable for experimenting with the physical, procedural, software, hardware, and psychological aspects of space flight operations. The test bed includes a weather expert system that advises on the effect of weather on launch operations. It also simulates a toxic gas dispersion model, the resulting human health risk, and a debris dispersion model with 3D visualization. Since all modeling and simulation is internet based, extensive research can be conducted before a particular launch, reducing the cost of launch and range safety operations. Each model has an independent decision-making module to derive the best decision for launch.

  2. Progress in the Simulation of Steady and Time-Dependent Flows with 3D Parallel Unstructured Cartesian Methods

    NASA Technical Reports Server (NTRS)

    Aftosmis, M. J.; Berger, M. J.; Murman, S. M.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    The proposed paper will present recent extensions in the development of an efficient Euler solver for adaptively refined Cartesian meshes with embedded boundaries. The paper will focus on extensions of the basic method to include solution adaptation, time-dependent flow simulation, and arbitrary rigid domain motion. The parallel multilevel method makes use of on-the-fly parallel domain decomposition to achieve extremely good scalability on large numbers of processors, and is coupled with an automatic coarse mesh generation algorithm for efficient processing by a multigrid smoother. Numerical results are presented demonstrating parallel speed-ups of up to 435 on 512 processors. Solution-based adaptation may be keyed off truncation error estimates using tau-extrapolation or a variety of feature-detection-based refinement parameters. The multigrid method is extended to time-dependent flows through the use of a dual-time approach. The extension to rigid domain motion uses an Arbitrary Lagrangian-Eulerian (ALE) formulation, and results will be presented for a variety of two- and three-dimensional example problems with both simple and complex geometry.

  3. Behavior of stem cells under outer-space microgravity and ground-based microgravity simulation.

    PubMed

    Zhang, Cui; Li, Liang; Chen, Jianling; Wang, Jinfu

    2015-06-01

    With rapid development of space engineering, research on life sciences in space is being conducted extensively, especially cellular and molecular studies on space medicine. Stem cells, undifferentiated cells that can differentiate into specialized cells, are considered a key resource for regenerative medicine. Research on stem cells under conditions of microgravity during a space flight or a ground-based simulation has generated several excellent findings. To help readers understand the effects of outer space and ground-based simulation conditions on stem cells, we reviewed recent studies on the effects of microgravity (as an obvious environmental factor in space) on morphology, proliferation, migration, and differentiation of stem cells. © 2015 International Federation for Cell Biology.

  4. Distributed Web-Based Expert System for Launch Operations

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar

    2005-01-01

    The simulation and modeling of launch operations is based on a representation of the organization of the operations suitable for experimenting with the physical, procedural, software, hardware, and psychological aspects of space flight operations. The virtual test bed includes a weather expert system that advises on the effect of weather on launch operations. It also simulates a toxic gas dispersion model and the resulting risk to human health. Since all modeling and simulation is internet based, extensive research can be conducted before a particular launch, reducing the cost of launch and range safety operations. Each model has an independent decision-making module to derive the best decision for launch.

  5. A dynamic motion simulator for future European docking systems

    NASA Technical Reports Server (NTRS)

    Brondino, G.; Marchal, PH.; Grimbert, D.; Noirault, P.

    1990-01-01

    Europe's first confrontation with docking in space will require extensive testing to verify design and performance and to qualify hardware. For this purpose, a Docking Dynamics Test Facility (DDTF) was developed. It allows reproduction on the ground of the same impact loads and relative motion dynamics that would occur in space during docking. It uses a 9-degree-of-freedom servo-motion system, controlled by a real-time computer, which simulates the docking spacecraft in a zero-g environment. The test technique involves an active loop based on six-axis force and torque detection, a mathematical simulation of the individual spacecraft dynamics, and the 9-degree-of-freedom servo motion, of which 3 DOFs allow extension of the kinematic range to 5 m. The configuration was checked out by closed-loop tests involving spacecraft control models and real sensor hardware. At present the test facility has an extensive configuration that allows evaluation of both proximity control and docking systems. It provides a versatile tool to verify system design, hardware items, and performance capabilities in the ongoing HERMES and COLUMBUS programs. The test system is described and its capabilities are summarized.

  6. Promoting Simulation Globally: Networking with Nursing Colleagues Across Five Continents.

    PubMed

    Alfes, Celeste M; Madigan, Elizabeth A

    Simulation education is gaining momentum internationally and may provide the opportunity to enhance clinical education while disseminating evidence-based practice standards for clinical simulation and learning. There is a need to develop a cohesive leadership group that fosters support, networking, and sharing of simulation resources globally. The Frances Payne Bolton School of Nursing at Case Western Reserve University has had the unique opportunity to establish academic exchange programs with schools of nursing across five continents. Although the joint and mutual simulation activities have been extensive, each international collaboration has also provided insight into the innovations developed by global partners.

  7. Study on photon transport problem based on the platform of molecular optical simulation environment.

    PubMed

    Peng, Kuan; Gao, Xinbo; Liang, Jimin; Qu, Xiaochao; Ren, Nunu; Chen, Xueli; Ma, Bin; Tian, Jie

    2010-01-01

    As an important molecular imaging modality, optical imaging has attracted increasing attention in recent years. Since physical experiments are usually complicated and expensive, research methods based on simulation platforms have attracted extensive attention. We developed a simulation platform named Molecular Optical Simulation Environment (MOSE) to simulate photon transport in both biological tissues and free space for optical imaging based on noncontact measurement. In this platform, the Monte Carlo (MC) method and the hybrid radiosity-radiance theorem are used to simulate photon transport in biological tissues and free space, respectively, so both contact and noncontact measurement modes of optical imaging can be simulated properly. In addition, a parallelization strategy for the MC method is employed to improve computational efficiency. In this paper, we study photon transport problems in both biological tissues and free space using MOSE. The results are compared with Tracepro, the simplified spherical harmonics method (SP(n)), and physical measurements to verify the performance of our method in terms of both accuracy and efficiency.
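The tissue-side Monte Carlo kernel described above rests on sampling photon free paths from the exponential distribution implied by the Beer-Lambert law, then deciding at each collision between scattering and absorption by the single-scattering albedo. A minimal illustrative sketch (function and parameter names are hypothetical, not MOSE's API):

```python
import math
import random

def simulate_photon_paths(mu_a, mu_s, n_photons, rng):
    """Track photons in an infinite homogeneous medium and return the mean
    path length travelled before absorption.

    mu_a, mu_s: absorption and scattering coefficients (1/length).
    Step lengths are drawn from s = -ln(U) / mu_t, the exponential
    free-path distribution; each collision scatters with probability
    albedo = mu_s / mu_t and absorbs otherwise."""
    mu_t = mu_a + mu_s
    albedo = mu_s / mu_t
    total = 0.0
    for _ in range(n_photons):
        path = 0.0
        while True:
            path += -math.log(rng.random()) / mu_t
            if rng.random() >= albedo:  # photon absorbed; terminate history
                break
        total += path
    return total / n_photons
```

Analytically, the mean path to absorption in this toy medium is 1/mu_a, which provides a quick sanity check on the sampler; a full code like MOSE adds direction sampling via a phase function, boundaries, and free-space propagation on top of this kernel.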

  8. Study on Photon Transport Problem Based on the Platform of Molecular Optical Simulation Environment

    PubMed Central

    Peng, Kuan; Gao, Xinbo; Liang, Jimin; Qu, Xiaochao; Ren, Nunu; Chen, Xueli; Ma, Bin; Tian, Jie

    2010-01-01

    As an important molecular imaging modality, optical imaging has attracted increasing attention in recent years. Since physical experiments are usually complicated and expensive, research methods based on simulation platforms have attracted extensive attention. We developed a simulation platform named Molecular Optical Simulation Environment (MOSE) to simulate photon transport in both biological tissues and free space for optical imaging based on noncontact measurement. In this platform, the Monte Carlo (MC) method and the hybrid radiosity-radiance theorem are used to simulate photon transport in biological tissues and free space, respectively, so both contact and noncontact measurement modes of optical imaging can be simulated properly. In addition, a parallelization strategy for the MC method is employed to improve computational efficiency. In this paper, we study photon transport problems in both biological tissues and free space using MOSE. The results are compared with Tracepro, the simplified spherical harmonics method (SP(n)), and physical measurements to verify the performance of our method in terms of both accuracy and efficiency. PMID:20445737

  9. Evaluation of a computational model to predict elbow range of motion

    PubMed Central

    Nishiwaki, Masao; Johnson, James A.; King, Graham J. W.; Athwal, George S.

    2014-01-01

    Computer models capable of predicting elbow flexion and extension range of motion (ROM) limits would be useful for assisting surgeons in improving the outcomes of surgical treatment of patients with elbow contractures. A simple and robust computer-based model was developed that predicts elbow joint ROM using bone geometries calculated from computed tomography image data. The model assumes a hinge-like flexion-extension axis, and that elbow passive ROM limits can be based on terminal bony impingement. The model was validated against experimental results with a cadaveric specimen, and was able to predict the flexion and extension limits of the intact joint to within 0° and 3°, respectively. The model was also able to predict the flexion and extension limits to within 1° and 2°, respectively, when simulated osteophytes were inserted into the joint. Future studies based on this approach will be used for the prediction of elbow flexion-extension ROM in patients with primary osteoarthritis to help identify motion-limiting hypertrophic osteophytes, and will eventually permit real-time computer-assisted navigated excisions. PMID:24841799

  10. Ground-Based Facilities for Simulation of Microgravity: Organism-Specific Recommendations for Their Use, and Recommended Terminology

    PubMed Central

    Anken, Ralf; Boonstra, Johannes; Braun, Markus; Christianen, Peter C.M.; de Geest, Maarten; Hauslage, Jens; Hilbig, Reinhard; Hill, Richard J.A.; Lebert, Michael; Medina, F. Javier; Vagt, Nicole; Ullrich, Oliver

    2013-01-01

    Research in microgravity is indispensable to disclose the impact of gravity on biological processes and organisms. However, research in the near-Earth orbit is severely constrained by the limited number of flight opportunities. Ground-based simulators of microgravity are valuable tools for preparing spaceflight experiments, but they also facilitate stand-alone studies and thus provide additional and cost-efficient platforms for gravitational research. The various microgravity simulators that are frequently used by gravitational biologists are based on different physical principles. This comparative study gives an overview of the most frequently used microgravity simulators and demonstrates their individual capacities and limitations. The range of applicability of the various ground-based microgravity simulators for biological specimens was carefully evaluated by using organisms that have been studied extensively under the conditions of real microgravity in space. In addition, current heterogeneous terminology is discussed critically, and recommendations are given for appropriate selection of adequate simulators and consistent use of nomenclature. Key Words: 2-D clinostat—3-D clinostat—Gravity—Magnetic levitation—Random positioning machine—Simulated microgravity—Space biology. Astrobiology 13, 1–17. PMID:23252378

  11. Simulated Tempering Distributed Replica Sampling, Virtual Replica Exchange, and Other Generalized-Ensemble Methods for Conformational Sampling.

    PubMed

    Rauscher, Sarah; Neale, Chris; Pomès, Régis

    2009-10-13

    Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.
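The temperature random walk described above is driven, in replica-exchange-style methods, by the standard Metropolis criterion for swapping configurations between two temperature levels. A minimal sketch of that rule (illustrative; not the authors' implementation):

```python
import math
import random

def swap_acceptance(beta_i, beta_j, E_i, E_j):
    """Metropolis probability of exchanging the configurations held at two
    inverse temperatures: min(1, exp[(beta_i - beta_j) * (E_i - E_j)])."""
    return min(1.0, math.exp((beta_i - beta_j) * (E_i - E_j)))

def attempt_swap(beta_i, beta_j, E_i, E_j, rng):
    """Return True if the proposed exchange is accepted."""
    return rng.random() < swap_acceptance(beta_i, beta_j, E_i, E_j)
```

A swap that moves the lower-energy configuration to the colder level is always accepted; unfavorable swaps are accepted with exponentially decaying probability, which is what lets replicas diffuse in temperature and cross energetic barriers.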

  12. The Fire and Fuels Extension to the Forest Vegetation Simulator

    Treesearch

    Elizabeth Reinhardt; Nicholas L. Crookston

    2003-01-01

    The Fire and Fuels Extension (FFE) to the Forest Vegetation Simulator (FVS) simulates fuel dynamics and potential fire behavior over time, in the context of stand development and management. Existing models of fire behavior and fire effects were added to FVS to form this extension. New submodels representing snag and fuel dynamics were created to complete the linkages...

  13. An extensive coronagraphic simulation applied to LBT

    NASA Astrophysics Data System (ADS)

    Vassallo, D.; Carolo, E.; Farinato, J.; Bergomi, M.; Bonavita, M.; Carlotti, A.; D'Orazi, V.; Greggio, D.; Magrin, D.; Mesa, D.; Pinna, E.; Puglisi, A.; Stangalini, M.; Verinaud, C.; Viotto, V.

    2016-08-01

    In this article we report the results of a comprehensive simulation program aimed at investigating the coronagraphic capabilities of SHARK-NIR, a camera selected to proceed to the final design phase at the Large Binocular Telescope. For this purpose, we developed a dedicated simulation tool based on physical optics propagation. The code propagates wavefronts through the SHARK optical train in an end-to-end fashion and can implement any kind of coronagraph. Detection limits can then be computed, exploring a wide range of Strehl values and observing conditions.

  14. Primitive chain network simulations for entangled DNA solutions

    NASA Astrophysics Data System (ADS)

    Masubuchi, Yuichi; Furuichi, Kenji; Horio, Kazushi; Uneyama, Takashi; Watanabe, Hiroshi; Ianniruberto, Giovanni; Greco, Francesco; Marrucci, Giuseppe

    2009-09-01

    Molecular theories for polymer rheology are based on conformational dynamics of the polymeric chain. Hence, measurements directly related to molecular conformations appear more appealing than indirect ones obtained from rheology. In this study, primitive chain network simulations are compared to experimental data of entangled DNA solutions [Teixeira et al., Macromolecules 40, 2461 (2007)]. In addition to rheological comparisons of both linear and nonlinear viscoelasticities, a molecular extension measure obtained by Teixeira et al. through fluorescent microscopy is compared to simulations, in terms of both averages and distributions. The influence of flow on conformational distributions has never been simulated for the case of entangled polymers, and how DNA molecular individualism extends to the entangled regime is not known. The linear viscoelastic response and the viscosity growth curve in the nonlinear regime are found in good agreement with data for various DNA concentrations. Conversely, the molecular extension measure shows significant departures, even under equilibrium conditions. The reason for such discrepancies remains unknown.

  15. A versatile petri net based architecture for modeling and simulation of complex biological processes.

    PubMed

    Nagasaki, Masao; Doi, Atsushi; Matsuno, Hiroshi; Miyano, Satoru

    2004-01-01

    Research on the modeling and simulation of complex biological systems is becoming increasingly important in Systems Biology. To this end, we previously developed the Hybrid Functional Petri Net (HFPN), an extension of the Petri net chosen for its intuitive graphical representation and its capability for mathematical analysis. However, in modeling metabolic, gene regulatory, and signal transduction pathways with this architecture, we realized that three further extensions of HFPN are necessary for modeling biological systems with a Petri net based architecture: (i) an entity should be extended to contain more than one value; (ii) an entity should be extended to handle other primitive types, e.g. boolean and string; and (iii) an entity should be extended to handle a more advanced type, called an object, that consists of variables and methods. To deal with this, we define a new enhanced Petri net called the hybrid functional Petri net with extension (HFPNe). To demonstrate the effectiveness of the enhancements, we model and simulate with HFPNe four biological processes that are difficult to represent with the previous architecture, HFPN.
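The token semantics that HFPN and HFPNe generalize can be illustrated with a minimal discrete Petri net kernel (class and names are hypothetical; HFPN additionally supports continuous places, functional arc speeds, and the extended entity types discussed above, none of which are shown here):

```python
class PetriNet:
    """Minimal discrete Petri net: places hold integer token counts, and a
    transition fires by consuming tokens along its input arcs and producing
    tokens along its output arcs."""

    def __init__(self, marking):
        self.marking = dict(marking)
        self.transitions = {}

    def add_transition(self, name, inputs, outputs):
        """inputs/outputs map place name -> arc weight."""
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= w for p, w in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, w in inputs.items():
            self.marking[p] -= w
        for p, w in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + w

# Example: an enzymatic binding step E + S -> ES as a single transition
net = PetriNet({"E": 1, "S": 2, "ES": 0})
net.add_transition("bind", inputs={"E": 1, "S": 1}, outputs={"ES": 1})
net.fire("bind")
```

After firing, the free enzyme is consumed, so the transition is no longer enabled; HFPNe's contribution is to let such entities carry structured values rather than bare token counts.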

  16. Framework for multi-resolution analyses of advanced traffic management strategies [summary].

    DOT National Transportation Integrated Search

    2017-01-01

    Transportation planning relies extensively on software that can simulate and predict travel behavior in response to alternative transportation networks. However, different software packages view traffic at different scales. Some programs are based on...

  17. Systematic Validation of Protein Force Fields against Experimental Data

    PubMed Central

    Eastwood, Michael P.; Dror, Ron O.; Shaw, David E.

    2012-01-01

    Molecular dynamics simulations provide a vehicle for capturing the structures, motions, and interactions of biological macromolecules in full atomic detail. The accuracy of such simulations, however, is critically dependent on the force field—the mathematical model used to approximate the atomic-level forces acting on the simulated molecular system. Here we present a systematic and extensive evaluation of eight different protein force fields based on comparisons of experimental data with molecular dynamics simulations that reach a previously inaccessible timescale. First, through extensive comparisons with experimental NMR data, we examined the force fields' abilities to describe the structure and fluctuations of folded proteins. Second, we quantified potential biases towards different secondary structure types by comparing experimental and simulation data for small peptides that preferentially populate either helical or sheet-like structures. Third, we tested the force fields' abilities to fold two small proteins—one α-helical, the other with β-sheet structure. The results suggest that force fields have improved over time, and that the most recent versions, while not perfect, provide an accurate description of many structural and dynamical properties of proteins. PMID:22384157

  18. Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    NASA Astrophysics Data System (ADS)

    Junghans, Christoph; Mniszewski, Susan; Voter, Arthur; Perez, Danny; Eidenbenz, Stephan

    2014-03-01

    We present an example of a new class of tools that we call application simulators, parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation (PDES). We demonstrate our approach with a TADSim application simulator that models the Temperature Accelerated Dynamics (TAD) method, which is an algorithmically complex member of the Accelerated Molecular Dynamics (AMD) family. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We further extend TADSim to model algorithm extensions to standard TAD, such as speculative spawning of the compute-bound stages of the algorithm, and predict performance improvements without having to implement such a method. Focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights into the TAD algorithm behavior and suggested extensions to the TAD method.
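At its core, a discrete event simulation of this kind pops timestamped events from a priority queue and lets each handler schedule follow-up events. A minimal sequential sketch (illustrative only; TADSim's engine is a parallel discrete event simulator and these names are hypothetical):

```python
import heapq

def run_events(initial_events, handlers, until):
    """Process (time, kind, payload) events in timestamp order.

    handlers maps an event kind to a function (time, payload) -> list of
    new events to schedule. Stops once the next event exceeds `until`."""
    queue = list(initial_events)
    heapq.heapify(queue)
    trace = []
    while queue:
        time, kind, payload = heapq.heappop(queue)
        if time > until:
            break
        trace.append((time, kind))
        for ev in handlers[kind](time, payload):
            heapq.heappush(queue, ev)
    return trace

# Example: one compute stage that repeatedly re-schedules itself,
# standing in for a recurring phase of the modeled algorithm
def stage(t, period):
    return [(t + period, "stage", period)]

trace = run_events([(0.0, "stage", 2.5)], {"stage": stage}, until=10.0)
```

Parameter scans then amount to re-running this loop with different event durations, which is why an application simulator can explore far more scenarios than the full simulation code.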

  19. Modeling AFM-induced PEVK extension and the reversible unfolding of Ig/FNIII domains in single and multiple titin molecules.

    PubMed Central

    Zhang, B; Evans, J S

    2001-01-01

    Molecular elasticity is associated with a select number of polypeptides and proteins, such as titin, Lustrin A, silk fibroin, and spider silk dragline protein. In the case of titin, the globular (Ig) and non-globular (PEVK) regions act as extensible springs under stretch; however, their unfolding behavior and force-extension characteristics are different. Using our time-dependent macroscopic method for simulating AFM-induced titin Ig domain unfolding and refolding, we simulate the extension and relaxation of hypothetical titin chains containing Ig domains and a PEVK region. Two different models are explored: 1) a series-linked WLC expression that treats the PEVK region as a distinct entropic spring, and 2) a summation of N single WLC expressions that simulates the extension and release of a discrete number of parallel titin chains containing constant or variable amounts of PEVK. In addition to these simulations, we also modeled the extension of a hypothetical PEVK domain using a linear Hooke's spring model to account for "enthalpic" contributions to PEVK elasticity. We find that the modified WLC simulations feature chain length compensation, Ig domain unfolding/refolding, and force-extension behavior that more closely approximate AFM, laser tweezer, and immunolocalization experimental data. In addition, our simulations reveal the following: 1) PEVK extension overlaps with the onset of Ig domain unfolding, and 2) variations in PEVK content within a titin chain ensemble lead to elastic diversity within that ensemble. PMID:11159428
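The single-chain building block of both models is the standard Marko-Siggia worm-like-chain interpolation formula; a minimal sketch (the room-temperature kT value and pN/nm units are the usual conventions, not values taken from the paper):

```python
def wlc_force(x, L, P, kT=4.114):
    """Marko-Siggia worm-like-chain interpolation formula.

    Returns the stretching force (pN) at end-to-end extension x for a chain
    of contour length L and persistence length P (both in nm); kT defaults
    to ~4.114 pN*nm at room temperature.  The paper's series-linked and
    summed variants combine several such terms."""
    z = x / L
    return (kT / P) * (0.25 / (1.0 - z) ** 2 - 0.25 + z)
```

The force is zero at zero extension and diverges as x approaches the contour length L, which is what produces the characteristic steep rise of single-molecule force-extension curves.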

  20. Process Simulation of Gas Metal Arc Welding Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murray, Paul E.

    2005-09-06

    ARCWELDER is a Windows-based application that simulates gas metal arc welding (GMAW) of steel and aluminum. The software simulates the welding process in an accurate and efficient manner, provides menu items for process parameter selection, and includes a graphical user interface with the option to animate the process. The user enters the base and electrode material, open circuit voltage, wire diameter, wire feed speed, welding speed, and standoff distance. The program computes the size and shape of a square-groove or V-groove weld in the flat position. The program also computes the current, arc voltage, arc length, electrode extension, transfer of droplets, heat input, filler metal deposition, base metal dilution, and centerline cooling rate, in English or SI units. The simulation may be used to select welding parameters that lead to desired operation conditions.

  1. Compensation based on linearized analysis for a six degree of freedom motion simulator

    NASA Technical Reports Server (NTRS)

    Parrish, R. V.; Dieudonne, J. E.; Martin, D. J., Jr.; Copeland, J. L.

    1973-01-01

    The inertial response characteristics of a synergistic, six-degree-of-freedom motion base are presented in terms of amplitude ratio and phase lag as functions of frequency data for the frequency range of interest (0 to 2 Hz) in real time, digital, flight simulators. The notch filters which smooth the digital-drive signals to continuous drive signals are presented, and appropriate compensation, based on the inertial response data, is suggested. The existence of an inverse transformation that converts actuator extensions into inertial positions makes it possible to gather the response data in the inertial axis system.
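The inverse transformation mentioned above inverts the closed-form pose-to-extension map of a six-actuator synergistic platform. A sketch of that forward map (pose in, leg lengths out), with hypothetical attachment coordinates and only three attachment pairs shown for brevity:

```python
import math

def leg_lengths(pose, base_pts, platform_pts):
    """Actuator lengths for a Stewart-type motion base, given the platform pose.

    pose = (x, y, z, roll, pitch, yaw).  The inertial-positions-from-
    extensions map discussed in the paper is the (numerical) inverse of
    this closed-form function.  Attachment coordinates are hypothetical."""
    x, y, z, r, p, w = pose
    cr, sr = math.cos(r), math.sin(r)
    cp, sp = math.cos(p), math.sin(p)
    cw, sw = math.cos(w), math.sin(w)
    # Z-Y-X (yaw-pitch-roll) rotation matrix
    R = [
        [cw * cp, cw * sp * sr - sw * cr, cw * sp * cr + sw * sr],
        [sw * cp, sw * sp * sr + cw * cr, sw * sp * cr - cw * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
    lengths = []
    for b, q in zip(base_pts, platform_pts):
        tip = [sum(R[i][j] * q[j] for j in range(3)) + (x, y, z)[i]
               for i in range(3)]
        lengths.append(math.dist(tip, b))
    return lengths

base_pts = [(1.0, 0.0, 0.0), (-1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
platform_pts = list(base_pts)
ext = leg_lengths((0.0, 0.0, 2.0, 0.0, 0.0, 0.0), base_pts, platform_pts)
```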

  2. Motion-base simulator results of advanced supersonic transport handling qualities with active controls

    NASA Technical Reports Server (NTRS)

    Feather, J. B.; Joshi, D. S.

    1981-01-01

    Handling qualities of the unaugmented advanced supersonic transport (AST) are deficient in the low-speed, landing approach regime. Consequently, improvement in handling with active control augmentation systems has been achieved using implicit model-following techniques. Extensive fixed-base simulator evaluations were used to validate these systems prior to tests with full motion and visual capabilities on a six-axis motion-base simulator (MBS). These tests compared the handling qualities of the unaugmented AST with several augmented configurations to ascertain the effectiveness of these systems. Cooper-Harper ratings, tracking errors, and control activity data from the MBS tests have been analyzed statistically. The results show the fully augmented AST handling qualities have been improved to an acceptable level.

  3. An overview of the fire and fuels extension to the forest vegetation simulator

    Treesearch

    Sarah J. Beukema; Elizabeth D. Reinhardt; Werner A. Kurz; Nicholas L. Crookston

    2000-01-01

    The Fire and Fuels Extension (FFE) to the Forest Vegetation Simulator (FVS) has been developed to assess the risk, behavior, and impact of fire in forest ecosystems. This extension to the widely-used stand-dynamics model FVS simulates the dynamics of snags and surface fuels as they are affected by stand management (of trees or fuels), live tree growth and mortality,...

  4. OSCAR a Matlab based optical FFT code

    NASA Astrophysics Data System (ADS)

    Degallaix, Jérôme

    2010-05-01

    Optical simulation software packages are essential tools for designing and commissioning laser interferometers. This article aims to introduce OSCAR, a Matlab-based FFT code, to the experimentalist community. OSCAR (Optical Simulation Containing Ansys Results) is used to simulate the steady state electric fields in optical cavities with realistic mirrors. The main advantage of OSCAR over other similar packages is the simplicity of its code, which requires only a short time to master. As a result, even for a beginner, it is relatively easy to modify OSCAR to suit other specific purposes. OSCAR includes an extensive manual and numerous detailed examples such as simulating thermal aberration, calculating cavity eigenmodes and diffraction loss, simulating flat beam cavities and three mirror ring cavities. An example is also provided showing how to run OSCAR on the GPU of modern graphic cards instead of the CPU, making the simulation up to 20 times faster.
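The FFT propagation step at the heart of such codes is the angular-spectrum method: transform the field, multiply by the free-space transfer function, and transform back. OSCAR itself is Matlab, so the following Python sketch is only illustrative:

```python
import numpy as np

def propagate(field, wavelength, dx, z):
    """Angular-spectrum propagation of a square 2-D complex field over z.

    Generic FFT propagation step of the kind FFT optical codes are built
    around; units are meters, dx is the grid spacing."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    # Longitudinal wavenumber; complex sqrt handles evanescent components.
    kz = 2j * np.pi * z * np.sqrt(
        (1.0 / wavelength**2 - FX**2 - FY**2).astype(complex))
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(kz))

# Propagate a Gaussian beam across a 6.4 mm window (illustrative numbers).
n, dx, wl = 64, 1e-4, 1.064e-6
x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x)
field = np.exp(-(X**2 + Y**2) / (2 * (8 * dx) ** 2)).astype(complex)
out = propagate(field, wl, dx, 0.1)
```

For purely propagating components the transfer function has unit magnitude, so total power is conserved, a useful sanity check for any FFT cavity code.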

  5. NASA Constellation Distributed Simulation Middleware Trade Study

    NASA Technical Reports Server (NTRS)

    Hasan, David; Bowman, James D.; Fisher, Nancy; Cutts, Dannie; Cures, Edwin Z.

    2008-01-01

    This paper presents the results of a trade study designed to assess three distributed simulation middleware technologies for support of the NASA Constellation Distributed Space Exploration Simulation (DSES) project and Test and Verification Distributed System Integration Laboratory (DSIL). The technologies are the High Level Architecture (HLA), the Test and Training Enabling Architecture (TENA), and an XML-based variant of Distributed Interactive Simulation (DIS-XML) coupled with the Extensible Messaging and Presence Protocol (XMPP). According to the criteria and weights determined in this study, HLA scores better than the other two for DSES as well as the DSIL.
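A criteria-and-weights trade study of this kind reduces to a weighted-sum scoring matrix. The sketch below uses entirely hypothetical criteria, weights, and scores; the study's actual values are not given in the abstract:

```python
def trade_score(weights, scores):
    """Weighted-sum trade-study matrix: per-option totals from criterion
    weights and raw per-criterion scores."""
    return {
        option: sum(weights[c] * s for c, s in crit.items())
        for option, crit in scores.items()
    }

# Hypothetical criteria and raw scores, for illustration only.
weights = {"maturity": 0.4, "performance": 0.35, "cost": 0.25}
scores = {
    "HLA":     {"maturity": 9, "performance": 7, "cost": 6},
    "TENA":    {"maturity": 7, "performance": 8, "cost": 5},
    "DIS-XML": {"maturity": 6, "performance": 6, "cost": 8},
}
totals = trade_score(weights, scores)
```

Under these made-up numbers HLA happens to score highest, but the ranking is entirely driven by the chosen weights, which is exactly why such studies document their weighting criteria.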

  6. Event-driven simulations of nonlinear integrate-and-fire neurons.

    PubMed

    Tonnelier, Arnaud; Belmabrouk, Hana; Martinez, Dominique

    2007-12-01

    Event-driven strategies have been used to simulate spiking neural networks exactly. Previous work is limited to linear integrate-and-fire neurons. In this note, we extend event-driven schemes to a class of nonlinear integrate-and-fire models. Results are presented for the quadratic integrate-and-fire model with instantaneous or exponential synaptic currents. Extensions to conductance-based currents and exponential integrate-and-fire neurons are discussed.
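For the quadratic integrate-and-fire model with constant input, the time to the next spike has a closed form, which is what makes exact event-driven simulation possible. A minimal sketch for tau*dv/dt = v**2 + I with I > 0, identifying the spike with the blow-up of v (a common convention for this model; the paper's exact scheme also handles synaptic currents):

```python
import math

def qif_next_spike(v0, I, tau=1.0):
    """Analytic time to the next spike of the quadratic integrate-and-fire
    model tau*dv/dt = v**2 + I, starting from v0, with constant I > 0.

    Integrating dv/(v**2 + I) = dt/tau gives
    v(t) = sqrt(I) * tan(sqrt(I)*t/tau + atan(v0/sqrt(I))),
    which diverges (spikes) when the argument reaches pi/2."""
    s = math.sqrt(I)
    return (tau / s) * (math.pi / 2 - math.atan(v0 / s))
```

An event-driven simulator uses such expressions to jump directly from one spike to the next, with no numerical integration error between events.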

  7. Control of Transitional and Turbulent Flows Using Plasma-Based Actuators

    DTIC Science & Technology

    2006-06-01

    by means of asymmetric dielectric-barrier-discharge (DBD) actuators is presented. The flow fields are simulated employing an extensively validated...effective use of DBD devices. As a consequence, meaningful computations require the use of three-dimensional large-eddy simulation approaches capable of...counter-flow DBD actuator is shown to provide an effective on-demand tripping device. This property is exploited for the suppression of laminar

  8. Evaluation Of Model Based Systems Engineering Processes For Integration Into Rapid Acquisition Programs

    DTIC Science & Technology

    2016-09-01

    Failure MTBCF Mean Time Between Critical Failure MIRV Multiple Independently-targetable Reentry Vehicle MK6LE MK6 Guidance System Life Extension...programs were the MK54 Lightweight Torpedo program, a Raytheon Radar program, and the Life Extension of the MK6 Guidance System (MK6LE) of the...activities throughout the later life-cycle phases. MBSE allowed the programs to manage the evolution of simulation capabilities, as well as to assess the

  9. Detecting DNA regulatory motifs by incorporating positional trends in information content

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kechris, Katherina J.; van Zwet, Erik; Bickel, Peter J.

    2004-05-04

    On the basis of the observation that conserved positions in transcription factor binding sites are often clustered together, we propose a simple extension to model-based motif discovery methods. We assign position-specific prior distributions to the frequency parameters of the model, penalizing deviations from a specified conservation profile. Examples with both simulated and real data show that this extension helps discover motifs as the data become noisier or when there is a competing false motif.
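The conservation-profile idea can be made concrete with the standard per-column information content of a motif model. The quadratic penalty below is only a stand-in for the paper's position-specific priors, not their actual formulation:

```python
import math

def position_ic(probs, pseudocount=0.0):
    """Information content (bits) of one motif column over A, C, G, T:
    IC = 2 + sum(p * log2 p).  Fully conserved -> 2 bits; uniform -> 0."""
    total = sum(p + pseudocount for p in probs)
    ps = [(p + pseudocount) / total for p in probs]
    return 2.0 + sum(p * math.log2(p) for p in ps if p > 0)

def profile_penalty(ic_values, target, lam=1.0):
    """Hypothetical quadratic penalty for deviating from a target
    conservation profile (a sketch of the paper's prior, not its form)."""
    return lam * sum((ic - t) ** 2 for ic, t in zip(ic_values, target))
```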

  10. Validation of the k-ω turbulence model for the thermal boundary layer profile of effusive cooled walls

    NASA Astrophysics Data System (ADS)

    Hink, R.

    2015-09-01

    The choice of materials for rocket chamber walls is limited by their thermal resistance. The thermal loads can be reduced substantially by blowing gases out through a porous surface. The k-ω-based turbulence models for computational fluid dynamics simulations are designed for smooth, non-permeable walls and have to be adjusted to account for the influence of injected fluids. Wilcox therefore proposed an extension of the k-ω turbulence model for the correct prediction of turbulent boundary layer velocity profiles. In this study, this extension is validated against experimental thermal boundary layer data from the Thermosciences Division of the Department of Mechanical Engineering at Stanford University. All simulations are performed with a finite-volume-based in-house code of the German Aerospace Center. Several simulations with different blowing settings were conducted and discussed in comparison with the results of the original model and with an additional roughness implementation. This study shows that velocity-profile corrections, rather than additional roughness corrections, are necessary to predict the correct thermal boundary layer profile of effusive cooled walls. Finally, this approach is applied to a two-dimensional simulation of an effusive cooled rocket chamber wall.

  11. Simulated Radioscapholunate Fusion Alters Carpal Kinematics While Preserving Dart-Thrower's Motion

    PubMed Central

    Calfee, Ryan P.; Leventhal, Evan L.; Wilkerson, Jim; Moore, Douglas C.; Akelman, Edward; Crisco, Joseph J.

    2014-01-01

    Purpose Midcarpal degeneration is well documented after radioscapholunate fusion. This study tested the hypothesis that radioscapholunate fusion alters the kinematic behavior of the remaining lunotriquetral and midcarpal joints, with specific focus on the dart-thrower's motion. Methods Simulated radioscapholunate fusions were performed on 6 cadaveric wrists in an anatomically neutral posture. Two 0.060-in. carbon fiber pins were placed from proximal to distal across the radiolunate and radioscaphoid joints, respectively. The wrists were passively positioned in a custom jig through a full range of motion along the orthogonal axes as well as oblique motions, with additional intermediate positions along the dart-thrower's path. Using a computed tomography–based markerless bone registration technique, each carpal bone's three-dimensional rotation was defined as a function of wrist flexion/extension from the pinned neutral position. Kinematic data were analyzed against data collected on the same wrist prior to fixation using hierarchical linear regression analysis and paired Student's t-tests. Results After simulated fusion, wrist motion was restricted to an average flexion-extension arc of 48°, reduced from 77°, and a radial-ulnar deviation arc of 19°, reduced from 33°. The remaining motion was maximally preserved along the dart-thrower's path from radial-extension toward ulnar-flexion. The simulated fusion significantly increased rotation through the scaphotrapezial, scaphocapitate, triquetrohamate, and lunotriquetral joints. For example, in the pinned wrist, the rotation of the hamate relative to the triquetrum increased 85%. Therefore, during every 10° of total wrist motion, the hamate rotated an average of nearly 8° relative to the triquetrum after pinning versus 4° in the normal state. Conclusions Simulated radioscapholunate fusion altered midcarpal and lunotriquetral kinematics. The increased rotations across these remaining joints provide one potential explanation for midcarpal degeneration after radioscapholunate fusion. Additionally, this fusion model confirms the dart-thrower's hypothesis, as wrist motion after simulated radioscapholunate fusion was primarily preserved from radial-extension toward ulnar-flexion. PMID:18406953

  12. X-33 Integrated Test Facility Extended Range Simulation

    NASA Technical Reports Server (NTRS)

    Sharma, Ashley

    1998-01-01

    In support of the X-33 single-stage-to-orbit program, NASA Dryden Flight Research Center was selected to provide continuous range communications of the X-33 vehicle from launch at Edwards Air Force Base, California, through landing at Malmstrom Air Force Base, Montana, or at Michael Army Air Field, Utah. An extensive real-time range simulation capability is being developed to ensure successful communications with the autonomous X-33 vehicle. This paper provides an overview of various levels of simulation, integration, and test being developed to support the X-33 extended range subsystems. These subsystems include the flight termination system, L-band command uplink subsystem, and S-band telemetry downlink subsystem.

  13. Influence of lumbar spine extension on vertical jump height during maximal squat jumping.

    PubMed

    Blache, Yoann; Monteil, Karine

    2014-01-01

    The purpose of this study was to determine the influence of lumbar spine extension and erector spinae muscle activation on vertical jump height during maximal squat jumping. Eight male athletes performed maximal squat jumps. Electromyograms of the erector spinae were recorded during these jumps. A simulation model of the musculoskeletal system was used to simulate maximal squat jumping with and without spine extension. The effect on vertical jump height of changing erector spinae strength was also tested through the simulated jumps. Concerning the participants' jumps, the kinematics indicated spine extension and erector spinae activation. Concerning the simulated jumps, vertical jump height was about 5.4 cm lower for the squat jump without trunk extension than for the normal squat jump. These results were explained by greater total muscle work during the normal squat jump, most notably the work of the erector spinae (+119.5 J). The erector spinae may contribute to spine extension during maximal squat jumping. The simulated jumps confirmed this hypothesis by showing that vertical jump height decreased when this muscle was not taken into consideration in the model. It is therefore concluded that the erector spinae should be considered a trunk extensor that helps to enhance total muscle work and consequently vertical jump height.

  14. Evaluation of dispersive mixing, extension rate and bubble size distribution using numerical simulation of a non-Newtonian fluid in a twin-screw mixer

    NASA Astrophysics Data System (ADS)

    Rathod, Maureen L.

    Initially, 3D FEM simulation of a simplified mixer was used to examine the effect of mixer configuration and operating conditions on dispersive mixing of a non-Newtonian fluid. Horizontal and vertical velocity magnitudes increased with increasing mixer speed, while maximum axial velocity and shear rate were greater with staggered paddles. In contrast, parallel paddles produced an area of efficient dispersive mixing between the center of the paddle and the barrel wall. This study was expanded to encompass the complete nine-paddle mixing section using power-law and Bird-Carreau fluid models. In the center of the mixer, simple shear flow was seen, corresponding with high shear rates. Efficient dispersive mixing appeared near the barrel wall at all flow rates and near the barrel center with parallel paddles. Areas of backflow, improving fluid retention time, occurred with staggered paddles. The Bird-Carreau fluid showed a greater influence of paddle motion under the same operating conditions due to the inelastic nature of the fluid. Shear-thinning behavior also resulted in a greater maximum shear rate as shearing became easier with decreasing fluid viscosity. Shear rate distributions are frequently calculated, but extension rate calculations had not been made in a complex geometry since Debbaut and Crochet (1988) defined extension rate as the ratio of the third to the second invariant of the strain rate tensor. Extension rate was assumed to be negligible in most studies, but here extension rate is shown to be significant. It is possible to calculate the maximum stable bubble diameter from the capillary number if the shear and extension rates in a flow field are known. Extension rate distributions were calculated for Newtonian and non-Newtonian fluids. High extension and shear rates were found in the intermeshing region. Extension is the major influence on the critical capillary number and maximum stable bubble diameter, but when extension rate values are low, shear rate has a larger impact. Examination of maximum stable bubble diameter through the mixer predicted areas of higher bubble dispersion based on flow type. This research has advanced simulation of non-Newtonian fluids and shown that direct calculation of extension rate is possible, demonstrating the effect of extension rate on bubble break-up.
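Both quantities can be sketched from a 3x3 velocity-gradient tensor. The shear-rate magnitude gamma_dot = sqrt(2 D:D) is standard; for the extension-rate measure the sketch uses one possible invariant ratio, tr(D^3)/tr(D^2), in the spirit of the Debbaut-Crochet definition cited above (the thesis's exact normalization may differ):

```python
def strain_rate_measures(grad_u):
    """Shear-rate magnitude and an extension-rate measure from a 3x3
    velocity-gradient tensor grad_u[i][j] = du_i/dx_j.

    D is the symmetric rate-of-strain tensor; tr2 = D:D and tr3 = tr(D^3)
    are its invariant contractions.  The invariant-ratio extension measure
    is an assumption standing in for the thesis's definition."""
    D = [[0.5 * (grad_u[i][j] + grad_u[j][i]) for j in range(3)]
         for i in range(3)]
    tr2 = sum(D[i][j] * D[j][i] for i in range(3) for j in range(3))
    tr3 = sum(D[i][j] * D[j][k] * D[k][i]
              for i in range(3) for j in range(3) for k in range(3))
    gamma_dot = (2.0 * tr2) ** 0.5
    ext_rate = tr3 / tr2 if tr2 else 0.0
    return gamma_dot, ext_rate
```

Simple shear gives a nonzero shear rate but a zero extension measure, while uniaxial extension gives a nonzero extension measure, which is the distinction the bubble break-up analysis relies on.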

  15. Agent-based model for the h-index - exact solution

    NASA Astrophysics Data System (ADS)

    Żogała-Siudem, Barbara; Siudem, Grzegorz; Cena, Anna; Gagolewski, Marek

    2016-01-01

    Hirsch's h-index is perhaps the most popular citation-based measure of scientific excellence. In 2013, Ionescu and Chopard proposed an agent-based model describing a process for generating publications and citations in an abstract scientific community [G. Ionescu, B. Chopard, Eur. Phys. J. B 86, 426 (2013)]. Within such a framework, one may simulate a scientist's activity, and - by extension - investigate the whole community of researchers. Even though the Ionescu and Chopard model predicts the h-index quite well, the authors provided a solution based solely on simulations. In this paper, we complete their results with exact, analytic formulas. What is more, by considering a simplified version of the Ionescu-Chopard model, we obtained a compact, easy-to-compute formula for the h-index. The derived approximate and exact solutions are investigated on simulated and real-world data sets.
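The h-index itself is straightforward to compute from a citation list, which is the quantity the model's analytic formulas predict; a minimal sketch:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cits = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cits, start=1):
        if c >= i:
            h = i
        else:
            break
    return h
```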

  16. Middleware Trade Study for NASA Domain

    NASA Technical Reports Server (NTRS)

    Bowman, Dan

    2007-01-01

    This presentation presents preliminary results of a trade study designed to assess three distributed simulation middleware technologies for support of the NASA Constellation Distributed Space Exploration Simulation (DSES) project and Test and Verification Distributed System Integration Laboratory (DSIL). The technologies are: the High Level Architecture (HLA), the Test and Training Enabling Architecture (TENA), and an XML-based variant of Distributed Interactive Simulation (DIS-XML) coupled with the Extensible Messaging and Presence Protocol (XMPP). According to the criteria and weights determined in this study, HLA scores better than the other two for DSES as well as the DSIL.

  17. Anonymity and Historical-Anonymity in Location-Based Services

    NASA Astrophysics Data System (ADS)

    Bettini, Claudio; Mascetti, Sergio; Wang, X. Sean; Freni, Dario; Jajodia, Sushil

    The problem of protecting users' privacy in Location-Based Services (LBS) has been extensively studied recently, and several defense techniques have been proposed. In this contribution, we first present a categorization of privacy attacks and related defenses. Then, we consider the class of defense techniques that aim at providing privacy through anonymity, in particular algorithms achieving “historical k-anonymity” in the case of an adversary obtaining a trace of requests recognized as being issued by the same (anonymous) user. Finally, we investigate the issues involved in the experimental evaluation of anonymity-based defense techniques; we show that user movement simulations based on mostly random movements can lead to overestimating the privacy protection in some cases and to overprotective techniques in others. The above results are obtained by comparison with a more realistic simulation using an agent-based simulator and a specific deployment scenario.

  18. Agent-based model for rural-urban migration: A dynamic consideration

    NASA Astrophysics Data System (ADS)

    Cai, Ning; Ma, Hai-Ying; Khan, M. Junaid

    2015-10-01

    This paper develops a dynamic agent-based model for rural-urban migration, based on previous relevant works. The model conforms to the typical dynamic linear multi-agent systems model studied extensively in systems science, in which the communication network is formulated as a digraph. Simulations reveal that consensus on certain variables can be harmful to overall stability and should be avoided.
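The underlying iteration in such linear multi-agent models is the standard consensus update x <- W x with a row-stochastic weight matrix defined on the digraph. A minimal sketch with illustrative weights (not the paper's model parameters):

```python
def consensus_step(x, W):
    """One synchronous update x <- W x with a row-stochastic weight matrix
    (the standard linear multi-agent consensus iteration)."""
    return [sum(wij * xj for wij, xj in zip(row, x)) for row in W]

# Three agents on a strongly connected directed cycle; each averages its
# own state with one in-neighbor.  Weights are illustrative only.
W = [
    [0.5, 0.5, 0.0],
    [0.0, 0.5, 0.5],
    [0.5, 0.0, 0.5],
]
x = [0.0, 3.0, 6.0]
for _ in range(200):
    x = consensus_step(x, W)
```

For a strongly connected digraph with positive self-weights the states converge to a common value (here the initial average, since this W happens to be doubly stochastic), which is the "consensus" whose system-level consequences the paper examines.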

  19. Validating Human Behavioral Models for Combat Simulations Using Techniques for the Evaluation of Human Performance

    DTIC Science & Technology

    2004-01-01

    Cognitive Task Analysis Abstract As Department of Defense (DoD) leaders rely more on modeling and simulation to provide information on which to base...capabilities and intent. Cognitive Task Analysis (CTA) Cognitive Task Analysis (CTA) is an extensive/detailed look at tasks and subtasks performed by a...Domain Analysis and Task Analysis: A Difference That Matters. In Cognitive Task Analysis, edited by J. M. Schraagen, S.

  20. A New Numerical Simulation technology of Multistage Fracturing in Horizontal Well

    NASA Astrophysics Data System (ADS)

    Cheng, Ning; Kang, Kaifeng; Li, Jianming; Liu, Tao; Ding, Kun

    2017-11-01

    Horizontal multi-stage fracturing is recognized as an effective development technology for unconventional oil resources. Geomechanics occupies a very important position in the numerical simulation of hydraulic fracturing; unlike conventional numerical simulation technology, the new approach accounts for geomechanical influences. The new numerical simulation of hydraulic fracturing can therefore more effectively optimize fracturing designs and evaluate post-fracturing production. The study presented in this paper is based on a three-dimensional stress and rock-physics parameter model, using the latest fluid-solid coupling numerical simulation technology to trace the extension process of the fracture and describe the change of the stress field during fracturing, and finally to predict production.

  1. Microsecond-Scale MD Simulations of HIV-1 DIS Kissing-Loop Complexes Predict Bulged-In Conformation of the Bulged Bases and Reveal Interesting Differences between Available Variants of the AMBER RNA Force Fields.

    PubMed

    Havrila, Marek; Zgarbová, Marie; Jurečka, Petr; Banáš, Pavel; Krepl, Miroslav; Otyepka, Michal; Šponer, Jiří

    2015-12-10

    We report an extensive set of explicit solvent molecular dynamics (MD) simulations (∼25 μs of accumulated simulation time) of the RNA kissing-loop complex of the HIV-1 virus dimerization initiation site. Despite many structural investigations by X-ray, NMR, and MD techniques, the position of the bulged purines of the kissing complex has not been unambiguously resolved. The X-ray structures consistently show bulged-out positions of the unpaired bases, while several NMR studies show bulged-in conformations. The NMR studies are, however, mutually inconsistent regarding the exact orientations of the bases. The earlier simulation studies predicted the bulged-out conformation; however, this finding could have been biased by the short simulation time scales. Our microsecond-long simulations reveal that all unpaired bases of the kissing-loop complex stay preferably in the interior of the kissing-loop complex. The MD results are discussed in the context of the available experimental data, and we suggest that both conformations are biochemically relevant. We also show that MD provides a quite satisfactory description of this RNA system, contrasting recent reports of unsatisfactory performance of the RNA force fields for smaller systems such as tetranucleotides and tetraloops. We explain this by the fact that the kissing complex is primarily stabilized by an extensive network of Watson-Crick interactions which are rather well described by the force fields. We tested several different sets of water/ion parameters but they all lead to consistent results. However, we demonstrate that a recently suggested modification of van der Waals interactions of the Cornell et al. force field deteriorates the description of the kissing complex by the loss of key stacking interactions stabilizing the interhelical junction and excessive hydrogen-bonding interactions.

  2. Methods for simulation-based analysis of fluid-structure interaction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barone, Matthew Franklin; Payne, Jeffrey L.

    2005-10-01

    Methods for analysis of fluid-structure interaction using high fidelity simulations are critically reviewed. First, a literature review of modern numerical techniques for simulation of aeroelastic phenomena is presented. The review focuses on methods contained within the arbitrary Lagrangian-Eulerian (ALE) framework for coupling computational fluid dynamics codes to computational structural mechanics codes. The review treats mesh movement algorithms, the role of the geometric conservation law, time advancement schemes, wetted surface interface strategies, and some representative applications. The complexity and computational expense of coupled Navier-Stokes/structural dynamics simulations points to the need for reduced order modeling to facilitate parametric analysis. The proper orthogonal decomposition (POD)/Galerkin projection approach for building a reduced order model (ROM) is presented, along with ideas for extension of the methodology to allow construction of ROMs based on data generated from ALE simulations.
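The POD step reduces to a singular value decomposition of a snapshot matrix; a minimal sketch (the Galerkin projection of the governing equations onto the resulting basis, which completes the ROM, is omitted):

```python
import numpy as np

def pod_basis(snapshots, r):
    """POD via SVD: left singular vectors of the snapshot matrix as a
    reduced basis.

    snapshots: (n_dof, n_snapshots) array of solution states.
    Returns the rank-r basis and the fraction of snapshot 'energy'
    (squared singular values) it captures."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    energy = np.sum(s[:r] ** 2) / np.sum(s ** 2)
    return U[:, :r], energy

# Illustrative rank-1 snapshot set: one spatial mode, varying amplitude.
X = np.outer(np.array([1.0, 2.0, 3.0, 4.0]), np.array([1.0, 2.0, 3.0]))
basis, energy = pod_basis(X, 1)
```

In practice r is chosen so the captured energy fraction exceeds a tolerance (e.g. 99.9%), trading ROM size against fidelity.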

  3. REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang

    2013-04-30

    Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can ‘mix and match’ mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.

  4. Multi-day activity scheduling reactions to planned activities and future events in a dynamic model of activity-travel behavior

    NASA Astrophysics Data System (ADS)

    Nijland, Linda; Arentze, Theo; Timmermans, Harry

    2014-01-01

    Modeling multi-day planning has received scarce attention in activity-based transport demand modeling so far. However, new dynamic activity-based approaches are currently being developed. The frequency and inflexibility of planned activities and events in the activity schedules of individuals indicate the importance of incorporating pre-planned activities in the new generation of dynamic travel demand models. Elaborating on and combining previous work on event-driven activity generation, the aim of this paper is to develop and illustrate an extension of a need-based model of activity generation that takes into account possible influences of pre-planned activities and events. This paper describes the theory and shows the results of simulations of the extension. The simulation was conducted for six different activities, and the parameter values used were consistent with an earlier estimation study. The results show that the model works well and that the influences of the parameters are consistent, logical, and have clear interpretations. These findings offer further evidence of face and construct validity for the suggested modeling approach.

  5. Surgical simulation: Current practices and future perspectives for technical skills training.

    PubMed

    Bjerrum, Flemming; Thomsen, Ann Sofia Skou; Nayahangan, Leizl Joy; Konge, Lars

    2018-06-17

    Simulation-based training (SBT) has become a standard component of modern surgical education, yet successful implementation of evidence-based training programs remains challenging. In this narrative review, we use Kern's framework for curriculum development to describe where we are now and what lies ahead for SBT within surgery, with a focus on technical skills in operative procedures. Despite principles for optimal SBT (proficiency-based, distributed, and deliberate practice) having been identified, massed training with fixed time intervals or a fixed number of repetitions is still being extensively used, and simulators are generally underutilized. SBT should be part of surgical training curricula, including theoretical, technical, and non-technical skills, and be based on relevant needs assessments. Furthermore, training should follow evidence-based theoretical principles for optimal training, and the effect of training needs to be evaluated using relevant outcomes. There is a larger, still-unrealized potential of surgical SBT, which may be realized in the near future as simulator technologies evolve, more evidence-based training programs are implemented, and cost-effectiveness and impact on patient safety are clearly demonstrated.

  6. A glacier runoff extension to the Precipitation Runoff Modeling System

    Treesearch

    A. E. Van Beusekom; R. J. Viger

    2016-01-01

    A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while...

  7. Smoldyn: particle-based simulation with rule-based modeling, improved molecular interaction and a library interface.

    PubMed

    Andrews, Steven S

    2017-03-01

    Smoldyn is a spatial and stochastic biochemical simulator. It treats each molecule of interest as an individual particle in continuous space, simulating molecular diffusion, molecule-membrane interactions and chemical reactions, all with good accuracy. This article presents several new features. Smoldyn now supports two types of rule-based modeling: a wildcard method, which is very convenient, and the BioNetGen package with extensions for spatial simulation, which is better for complicated models. Smoldyn also includes new algorithms for simulating the diffusion of surface-bound molecules and molecules with excluded volume. Both are exact in the limit of short time steps and reasonably good with longer steps. In addition, Smoldyn supports single-molecule tracking simulations. Finally, the Smoldyn source code can be accessed through a C/C++ language library interface. Smoldyn software, documentation, code, and examples are at http://www.smoldyn.org.
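    Smoldyn's core operation, per the abstract, is simulating the diffusion of individual particles. The standard over-damped Brownian update behind such particle-based simulators (a generic sketch, not Smoldyn's actual code; the diffusion coefficient, time step and particle count below are arbitrary) can be written as:

```python
import random

def brownian_step(positions, D, dt, rng=random.Random(42)):
    """Advance each 3D particle by a Gaussian displacement whose standard
    deviation per coordinate is sqrt(2*D*dt) (over-damped Brownian motion)."""
    s = (2.0 * D * dt) ** 0.5
    return [tuple(x + rng.gauss(0.0, s) for x in p) for p in positions]

pts = [(0.0, 0.0, 0.0)] * 1000
for _ in range(10):
    pts = brownian_step(pts, D=1.0, dt=0.01)

# After time t = 0.1, the mean squared displacement should be near 6*D*t = 0.6.
msd = sum(x * x + y * y + z * z for x, y, z in pts) / len(pts)
```

    A simulator like Smoldyn layers membrane interactions and reaction rules on top of exactly this kind of diffusion step.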

  8. System crash as dynamics of complex networks.

    PubMed

    Yu, Yi; Xiao, Gaoxi; Zhou, Jie; Wang, Yubo; Wang, Zhen; Kurths, Jürgen; Schellnhuber, Hans Joachim

    2016-10-18

    Complex systems, from animal herds to human nations, sometimes crash drastically. Although the growth and evolution of systems have been extensively studied, our understanding of how systems crash is still limited. It remains rather puzzling why some systems, appearing to be doomed to fail, manage to survive for a long time whereas some other systems, which seem to be too big or too strong to fail, crash rapidly. In this contribution, we propose a network-based system dynamics model, where individual actions based on the local information accessible in their respective system structures may lead to the "peculiar" dynamics of system crash mentioned above. Extensive simulations are carried out on synthetic and real-life networks, which further reveal the interesting system evolution leading to the final crash. Applications and possible extensions of the proposed model are discussed.
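    The paper's actual model is not specified in the abstract; one minimal toy version of "individual actions based on local information" that produces cascading crashes is k-core-style pruning, where a node abandons the network once too few of its neighbours remain (illustrative only, not the authors' model):

```python
def cascade(adjacency, k):
    """Iteratively remove nodes with fewer than k surviving neighbours
    (k-core pruning); returns the set of surviving nodes."""
    alive = set(adjacency)
    changed = True
    while changed:
        changed = False
        for node in list(alive):
            degree = sum(1 for nb in adjacency[node] if nb in alive)
            if degree < k:        # local rule: too few neighbours left, leave
                alive.discard(node)
                changed = True
    return alive

# A triangle with a pendant node: the pendant leaves first, the triangle survives.
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
survivors = cascade(graph, k=2)
```

    Raising k to 3 on the same graph triggers a complete cascade: every departure pushes another node below threshold, which is the "too strong to fail yet crashes rapidly" flavour of dynamics the paper explores.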

  9. Climate change and Arctic ecosystems: 2. Modeling, paleodata-model comparisons, and future projections

    USGS Publications Warehouse

    Kaplan, J.O.; Bigelow, N.H.; Prentice, I.C.; Harrison, S.P.; Bartlein, P.J.; Christensen, T.R.; Cramer, W.; Matveyeva, N.V.; McGuire, A.D.; Murray, D.F.; Razzhivin, V.Y.; Smith, B.; Walker, D.A.; Anderson, P.M.; Andreev, A.A.; Brubaker, L.B.; Edwards, M.E.; Lozhkin, A.V.

    2003-01-01

    Large variations in the composition, structure, and function of Arctic ecosystems are determined by climatic gradients, especially of growing-season warmth, soil moisture, and snow cover. A unified circumpolar classification recognizing five types of tundra was developed. The geographic distributions of vegetation types north of 55°N, including the position of the forest limit and the distributions of the tundra types, could be predicted from climatology using a small set of plant functional types embedded in the biogeochemistry-biogeography model BIOME4. Several palaeoclimate simulations for the last glacial maximum (LGM) and mid-Holocene were used to explore the possibility of simulating past vegetation patterns, which are independently known based on pollen data. The broad outlines of observed changes in vegetation were captured. LGM simulations showed the major reduction of forest, the great extension of graminoid and forb tundra, and the restriction of low- and high-shrub tundra (although not all models produced sufficiently dry conditions to mimic the full observed change). Mid-Holocene simulations reproduced the contrast between northward forest extension in western and central Siberia and stability of the forest limit in Beringia. Projection of the effect of a continued exponential increase in atmospheric CO2 concentration, based on a transient ocean-atmosphere simulation including sulfate aerosol effects, suggests a potential for larger changes in Arctic ecosystems during the 21st century than have occurred between mid-Holocene and present. Simulated physiological effects of the CO2 increase (to > 700 ppm) at high latitudes were slight compared with the effects of the change in climate.

  10. Poverty Simulations: Building Relationships among Extension, Schools, and the Community

    ERIC Educational Resources Information Center

    Franck, Karen L.; Barnes, Shelly; Harrison, Julie

    2016-01-01

    Poverty simulations can be effective experiential learning tools for educating community members about the impact of poverty on families. The project described here includes survey results from three simulations with community leaders and teachers. This project illustrated how such workshops can help Extension professionals extend their reach and…

  11. Using Predictability for Lexical Segmentation

    ERIC Educational Resources Information Center

    Çöltekin, Çagri

    2017-01-01

    This study investigates a strategy based on predictability of consecutive sub-lexical units in learning to segment a continuous speech stream into lexical units using computational modeling and simulations. Lexical segmentation is one of the early challenges during language acquisition, and it has been studied extensively through psycholinguistic…

  12. Tutoring electronic troubleshooting in a simulated maintenance work environment

    NASA Technical Reports Server (NTRS)

    Gott, Sherrie P.

    1987-01-01

    A series of intelligent tutoring systems, or intelligent maintenance simulators, is being developed based on expert and novice problem solving data. A graded series of authentic troubleshooting problems provides the curriculum, and adaptive instructional treatments foster active learning in trainees who engage in extensive fault isolation practice and thus in conditionalizing what they know. A proof-of-concept training study involving human tutoring was conducted as a precursor to the computer tutors to assess this integrated, problem-based approach to task analysis and instruction. Statistically significant improvements in apprentice technicians' troubleshooting efficiency were achieved after approximately six hours of training.

  13. Flight test experience and controlled impact of a large, four-engine, remotely piloted airplane

    NASA Technical Reports Server (NTRS)

    Kempel, R. W.; Horton, T. W.

    1985-01-01

    A controlled impact demonstration (CID) program using a large, four engine, remotely piloted transport airplane was conducted. Closed loop primary flight control was performed from a ground based cockpit and digital computer in conjunction with an up/down telemetry link. Uplink commands were received aboard the airplane and transferred through uplink interface systems to a highly modified Bendix PB-20D autopilot. Both proportional and discrete commands were generated by the ground pilot. Prior to flight tests, extensive simulation was conducted during the development of ground based digital control laws. The control laws included primary control, secondary control, and racetrack and final approach guidance. Extensive ground checks were performed on all remotely piloted systems. However, manned flight tests were the primary method of verification and validation of control law concepts developed from simulation. The design, development, and flight testing of control laws and the systems required to accomplish the remotely piloted mission are discussed.

  14. The Resource Usage Aware Backfilling

    NASA Astrophysics Data System (ADS)

    Guim, Francesc; Rodero, Ivan; Corbalan, Julita

    Job scheduling policies for HPC centers have been extensively studied in the last few years, especially backfilling-based policies. Almost all of these studies have been done using simulation tools, and all existing simulators use the runtime (either estimated or real) provided in the workload as the basis of their simulations. In our previous work we analyzed the impact of resource sharing (memory bandwidth) among running jobs on system performance by adding a new resource model to the Alvio simulator. Based on these studies we proposed the LessConsume and LessConsume Threshold resource selection policies, both oriented to reducing the saturation of shared resources and thereby increasing the performance of the system. The results showed that both resource selection policies can improve system performance by considering where jobs are finally allocated.
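    For readers unfamiliar with backfilling, the family of policies the abstract refers to can be illustrated by an EASY-backfilling sketch (hypothetical data structures, not the Alvio simulator's code, and simplified to ignore which specific processors the head job reserves): a waiting job may jump the queue only if it fits in the free processors and finishes before the head job's reserved start time:

```python
def backfill_candidates(queue, free_procs, now, reservation_start):
    """EASY-backfilling sketch: the head job holds a reservation starting at
    reservation_start; later jobs may start immediately only if they fit in
    the currently free processors and finish before that reservation."""
    chosen = []
    for job in queue[1:]:                       # head job keeps its reservation
        fits = job["procs"] <= free_procs
        done_in_time = now + job["runtime"] <= reservation_start
        if fits and done_in_time:
            chosen.append(job["id"])
            free_procs -= job["procs"]          # those processors are now busy
    return chosen

queue = [
    {"id": "A", "procs": 8, "runtime": 100},    # head job, waiting for procs
    {"id": "B", "procs": 2, "runtime": 10},
    {"id": "C", "procs": 4, "runtime": 50},     # too long to backfill
    {"id": "D", "procs": 2, "runtime": 30},
]
picked = backfill_candidates(queue, free_procs=4, now=0, reservation_start=30)
```

    The paper's point is that runtime-only simulators like this sketch ignore where jobs land and how they contend for shared resources such as memory bandwidth, which is what the Alvio resource model adds.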

  15. Throughput and delay analysis of IEEE 802.15.6-based CSMA/CA protocol.

    PubMed

    Ullah, Sana; Chen, Min; Kwak, Kyung Sup

    2012-12-01

    The IEEE 802.15.6 is a new communication standard on Wireless Body Area Network (WBAN) that focuses on a variety of medical, Consumer Electronics (CE) and entertainment applications. In this paper, the throughput and delay performance of the IEEE 802.15.6 is presented. Numerical formulas are derived to determine the maximum throughput and minimum delay limits of the IEEE 802.15.6 for an ideal channel with no transmission errors. These limits are derived for different frequency bands and data rates. Our analysis is validated by extensive simulations using a custom C++ simulator. Based on analytical and simulation results, useful conclusions are derived for network provisioning and packet size optimization for different applications.
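    The paper's numerical formulas are not reproduced in the abstract; a generic upper-bound calculation of the same flavour (ideal channel, one data frame plus an ACK per cycle; the timing values and overhead below are placeholders, not the standard's actual parameters) would be:

```python
def max_throughput(payload_bytes, data_rate_bps, overhead_bytes, t_sifs, t_ack):
    """Upper-bound throughput for an ideal stop-and-wait exchange:
    one data frame (payload + header overhead), a SIFS, an ACK, a SIFS.
    All timing parameters here are illustrative placeholders."""
    t_data = 8.0 * (payload_bytes + overhead_bytes) / data_rate_bps
    t_cycle = t_data + t_sifs + t_ack + t_sifs
    return 8.0 * payload_bytes / t_cycle        # useful bits per second

# Placeholder numbers loosely in the range of a narrowband PHY:
tp = max_throughput(payload_bytes=255, data_rate_bps=971_400,
                    overhead_bytes=7, t_sifs=75e-6, t_ack=120e-6)
```

    The fixed per-cycle overhead is why maximum throughput grows with payload size, which is the basis for the packet-size optimization conclusions the paper draws.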

  16. Qualitative modeling of normal blood coagulation and its pathological states using stochastic activity networks.

    PubMed

    Mounts, W M; Liebman, M N

    1997-07-01

    We have developed a method for representing biological pathways and simulating their behavior based on the use of stochastic activity networks (SANs). SANs, an extension of the original Petri net, have traditionally been used to model flow systems, including data-communications networks and manufacturing processes. We apply the methodology to the blood coagulation cascade, a biological flow system, and present the representation method as well as results of simulation studies based on published experimental data. In addition to describing the dynamic model, we also present the results of its utilization to perform simulations of clinical states, including hemophilias A and B, as well as sensitivity analysis of individual factors and their impact on thrombin production.
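    As a rough illustration of simulating a flow system with a stochastic Petri-net-style model (the SAN formalism adds gates and activity types beyond this, and the toy "activation cascade" below is invented, not the coagulation cascade):

```python
import random

def simulate_petri(marking, transitions, t_end, rng=random.Random(1)):
    """Gillespie-style simulation of a stochastic Petri net: among enabled
    transitions, one fires after an exponentially distributed delay, chosen
    with probability proportional to its rate constant; firing consumes
    input tokens and produces output tokens."""
    t = 0.0
    while t < t_end:
        enabled = [tr for tr in transitions
                   if all(marking[p] >= n for p, n in tr["in"].items())]
        if not enabled:
            break
        rates = [tr["rate"] for tr in enabled]
        t += rng.expovariate(sum(rates))
        if t >= t_end:
            break
        pick = rng.choices(enabled, weights=rates)[0]
        for p, n in pick["in"].items():
            marking[p] -= n
        for p, n in pick["out"].items():
            marking[p] = marking.get(p, 0) + n
    return marking

# Toy cascade: X activates to Xa, then Xa catalyzes Y -> Ya (Xa is conserved).
m = {"X": 10, "Xa": 0, "Y": 5, "Ya": 0}
net = [
    {"in": {"X": 1}, "out": {"Xa": 1}, "rate": 1.0},
    {"in": {"Xa": 1, "Y": 1}, "out": {"Xa": 1, "Ya": 1}, "rate": 2.0},
]
final = simulate_petri(m, net, t_end=100.0)
```

    Pathological states like the hemophilias can then be modeled by removing or down-rating a transition and observing the effect on downstream token production, analogous to the thrombin-output analyses in the paper.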

  17. A Continuous Labour Supply Model in Microsimulation: A Life-Cycle Modelling Approach with Heterogeneity and Uncertainty Extension

    PubMed Central

    Li, Jinjing; Sologon, Denisa Maria

    2014-01-01

    This paper advances a structural inter-temporal model of labour supply that is able to simulate the dynamics of labour supply in a continuous setting and addresses two main drawbacks of most existing models. The first limitation is the inability to incorporate individual heterogeneity, as every agent shares the same parameters of the utility function. The second is the strong assumption that individuals make decisions in a world of perfect certainty. Essentially, this paper offers an extension of marginal-utility-of-wealth-constant labour supply functions known as "Frisch functions" under certainty and uncertainty with homogeneous and heterogeneous preferences. The lifetime models based on the fixed effect vector decomposition yield the most stable simulation results, under both certain and uncertain future wage assumptions. Due to its improved accuracy and stability, this lifetime labour supply model is particularly suitable for enhancing the performance of life cycle simulation models, thus providing a better reference for policymaking.

  18. Sampling enhancement for the quantum mechanical potential based molecular dynamics simulations: a general algorithm and its extension for free energy calculation on rugged energy surface.

    PubMed

    Li, Hongzhi; Yang, Wei

    2007-03-21

    An approach is developed in the replica exchange framework to enhance conformational sampling for quantum mechanical (QM) potential based molecular dynamics simulations. Importantly, with our enhanced sampling treatment, decent convergence of the electronic structure self-consistent-field calculation is robustly guaranteed; this is made possible in our replica exchange design by avoiding direct structure exchanges between the QM-related replicas and the activated (scaled by low scaling parameters or treated with high "effective temperatures") molecular mechanical (MM) replicas. Although the present approach represents one of the early efforts in enhanced sampling development specifically for quantum mechanical potentials, QM-based simulations treated with the present technique can possess sampling efficiency similar to that of MM-based simulations treated with the Hamiltonian replica exchange method (HREM). In the present paper, by combining this sampling method with one of our recent developments (the dual-topology alchemical HREM approach), we also introduce a method for sampling-enhanced QM-based free energy calculations.
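    The replica exchange machinery referenced here rests on the standard Metropolis swap criterion; a minimal sketch of that acceptance test (generic textbook form, not the authors' QM/MM-specific exchange design) is:

```python
import math
import random

def accept_exchange(E_i, E_j, beta_i, beta_j, rng=random.Random(0)):
    """Metropolis criterion for swapping the configurations of replicas i
    and j: accept with probability min(1, exp(delta)), where
    delta = (beta_i - beta_j) * (E_i - E_j)."""
    delta = (beta_i - beta_j) * (E_i - E_j)
    return delta >= 0 or rng.random() < math.exp(delta)

# When the colder replica (larger beta) holds the higher-energy configuration,
# the swap is always accepted:
always = accept_exchange(E_i=-1.0, E_j=-5.0, beta_i=1.0, beta_j=0.5)
```

    The design choice discussed in the abstract amounts to restricting which replica pairs are ever offered this test, so that QM replicas never directly receive the distorted structures of the strongly activated MM replicas.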

  19. Hierarchical analytical and simulation modelling of human-machine systems with interference

    NASA Astrophysics Data System (ADS)

    Braginsky, M. Ya; Tarakanov, D. V.; Tsapko, S. G.; Tsapko, I. V.; Baglaeva, E. A.

    2017-01-01

    The article considers the principles of building an analytical and simulation model of the human operator and the industrial control system hardware and software. E-networks, an extension of Petri nets, are used as the mathematical apparatus. This approach allows simulating complex parallel distributed processes in human-machine systems. A structural and hierarchical approach is used as the building method for the mathematical model of the human operator. The upper level of the human operator model is represented by a logical dynamic model of decision making based on E-networks. The lower level reflects the psychophysiological characteristics of the human operator.

  20. A Collaborative Extensible User Environment for Simulation and Knowledge Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Lansing, Carina S.; Porter, Ellen A.

    2015-06-01

    In scientific simulation, scientists use measured data to create numerical models, execute simulations and analyze results from advanced simulators executing on high performance computing platforms. This process usually requires a team of scientists collaborating on data collection, model creation and analysis, and on authorship of publications and data. This paper shows that scientific teams can benefit from a user environment called Akuna that permits subsurface scientists in disparate locations to collaborate on numerical modeling and analysis projects. The Akuna user environment is built on the Velo framework, which provides both a rich client environment for conducting and analyzing simulations and a Web environment for data sharing and annotation. Akuna is an extensible toolset that integrates with Velo and is designed to support any type of simulator. This is achieved through data-driven user interface generation, use of a customizable knowledge management platform, and an extensible framework for simulation execution, monitoring and analysis. This paper describes how the customized Velo content management system and the Akuna toolset are used to integrate and enhance an effective collaborative research and application environment. The extensible architecture of Akuna is also described and demonstrated through the creation and execution of a 3D subsurface simulation.

  1. Building the ECON extension: Functionality and lessons learned

    Treesearch

    Fred C. Martin

    2008-01-01

    The functionality of the ECON extension to FVS is described with emphasis on the ability to dynamically interact with all elements of the FVS simulation process. Like other extensions, ECON is fully integrated within FVS. This integration allows: (1) analysis of multiple alternative tree-removal actions within a single simulation without altering “normal” stand...

  2. [Review on HSPF model for simulation of hydrology and water quality processes].

    PubMed

    Li, Zhao-fu; Liu, Hong-Yu; Li, Yan

    2012-07-01

    Hydrological Simulation Program-FORTRAN (HSPF), written in FORTRAN, is one of the best semi-distributed hydrology and water quality models; it was first developed based on the Stanford Watershed Model, and many studies on its application have been conducted. It can represent the contributions of sediment, nutrients, pesticides, conservatives and fecal coliforms from agricultural areas, continuously simulate water quantity and quality processes, as well as the effects of climate change and land use change on water quantity and quality. HSPF consists of three basic application components: PERLND (Pervious Land Segment), IMPLND (Impervious Land Segment), and RCHRES (free-flowing reach or mixed reservoirs). In general, HSPF has been applied extensively in the modeling of hydrology and water quality processes and in the analysis of climate change and land use change, but it has seen limited use in China. The main problems with HSPF include: (1) some algorithms and procedures still need revision; (2) due to the high standard for input data, the accuracy of the model is limited by spatial and attribute data; (3) the model is only applicable to the simulation of well-mixed rivers, reservoirs and one-dimensional water bodies, so it must be integrated with other models to solve more complex problems. At present, studies on HSPF model development are ongoing, including revision of the model platform, extension of model functions, method development for model calibration, and analysis of parameter sensitivity. With the accumulation of basic data and improvement of data sharing, the HSPF model will be applied more extensively in China.

  3. The use of real-time, hardware-in-the-loop simulation in the design and development of the new Hughes HS601 spacecraft attitude control system

    NASA Technical Reports Server (NTRS)

    Slafer, Loren I.

    1989-01-01

    Realtime simulation and hardware-in-the-loop testing are being used extensively in all phases of the design, development, and testing of the attitude control system (ACS) for the new Hughes HS601 satellite bus. Realtime, hardware-in-the-loop simulation, integrated with traditional analysis and pure simulation activities, is shown to provide a highly efficient and productive overall development program. Implementation of high fidelity simulations of the satellite dynamics and control system algorithms, capable of real-time execution (using Applied Dynamics International's System 100), provides a tool which is capable of being integrated with the critical flight microprocessor to create a mixed simulation test (MST). The MST creates a highly accurate, detailed simulated on-orbit test environment, capable of open and closed loop ACS testing, in which the ACS design can be validated. The MST is shown to provide a valuable extension of traditional test methods. A description of the MST configuration is presented, including the spacecraft dynamics simulation model, sensor and actuator emulators, and the test support system. Overall system performance parameters are presented. MST applications are discussed: supporting ACS design, developing on-orbit system performance predictions, flight software development and qualification testing (augmenting the traditional software-based testing), mission planning, and a cost-effective subsystem-level acceptance test. The MST is shown to provide an ideal tool with which the ACS designer can fly the spacecraft on the ground.

  4. Investigations of electromagnetic scattering by columnar ice crystals

    NASA Technical Reports Server (NTRS)

    Weil, H.; Senior, T. B. A.

    1976-01-01

    An integral equation approach was developed to determine the scattering and absorption of electromagnetic radiation by thin walled cylinders of arbitrary cross-section and refractive index. Based on this method, extensive numerical data was presented at infrared wavelengths for hollow hexagonal cross section cylinders which simulate columnar sheath ice crystals.

  5. PACE: A Browsable Graphical Interface.

    ERIC Educational Resources Information Center

    Beheshti, Jamshid; And Others

    1996-01-01

    Describes PACE (Public Access Catalogue Extension), an alternative interface designed to enhance online catalogs by simulating images of books and library shelves to help users browse through the catalog. Results of a test in a college library against a text-based online public access catalog, including student attitudes, are described.…

  6. Demonstration of a 3D vision algorithm for space applications

    NASA Technical Reports Server (NTRS)

    Defigueiredo, Rui J. P. (Editor)

    1987-01-01

    This paper reports an extension of the MIAG algorithm for recognition and motion parameter determination of general 3-D polyhedral objects based on model matching techniques and using movement invariants as features of object representation. Results of tests conducted on the algorithm under conditions simulating space conditions are presented.

  7. Chapter 2: Fire and Fuels Extension: Model description

    Treesearch

    Sarah J. Beukema; Elizabeth D. Reinhardt; Julee A. Greenough; Donald C. E. Robinson; Werner A. Kurz

    2003-01-01

    The Fire and Fuels Extension to the Forest Vegetation Simulator is a model that simulates fuel dynamics and potential fire behavior over time, in the context of stand development and management. Existing models are used to represent forest stand development (the Forest Vegetation Simulator, Wykoff and others 1982), fire behavior (Rothermel 1972, Van Wagner 1977, and...

  8. Life Prediction of Turbine Blade Nickel Base Superalloy Single Crystals.

    DTIC Science & Technology

    1986-08-01

    mechanical properties between single crystals and the DS version of Mar-M200. Soon it was recognized again through the mechanical property-structure ... property achievements demonstrated by screening and simulated engine tests. Single crystals are the results of extensive investigation on the mechanical ...behavior (especially fatigue and creep) of, and the structure-property correlations in, the equiaxed and directionally solidified (DS) nickel-base

  9. Versatile RED-based buffer management mechanism for the efficient support of internet traffic

    NASA Astrophysics Data System (ADS)

    Nelissen, Jordi; De Cnodder, Stefaan

    1999-11-01

    This paper presents an evaluation of various GFR (Guaranteed Frame Rate) implementation proposals. By means of extensive simulations performed in different network environments, we compare two ATM Forum example implementations, namely the `simple FIFO-based GFR.2 implementation' and the `per-VC threshold and scheduling implementation'. The lessons learned from this study are also applicable to non-ATM network technologies.
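    RED itself, on which the buffer-management mechanism in the title is based, computes a packet-drop probability from a smoothed average queue length; a sketch of the classic rule (standard textbook form, with illustrative threshold values) is:

```python
def red_drop_probability(avg_queue, min_th, max_th, max_p):
    """Classic RED drop rule: no drops below min_th, drop probability
    rising linearly from 0 to max_p between min_th and max_th, and a
    forced drop once the average queue reaches max_th."""
    if avg_queue < min_th:
        return 0.0
    if avg_queue >= max_th:
        return 1.0
    return max_p * (avg_queue - min_th) / (max_th - min_th)

# Halfway between the thresholds, the drop probability is half of max_p:
p_mid = red_drop_probability(avg_queue=30, min_th=20, max_th=40, max_p=0.1)
```

    In practice `avg_queue` is an exponentially weighted moving average of the instantaneous queue length, so transient bursts are absorbed while sustained congestion triggers early drops.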

  10. An extension of the OpenModelica compiler for using Modelica models in a discrete event simulation

    DOE PAGES

    Nutaro, James

    2014-11-03

    In this article, a new back-end and run-time system is described for the OpenModelica compiler. This new back-end transforms a Modelica model into a module for the adevs discrete event simulation package, thereby extending adevs to encompass complex, hybrid dynamical systems. The new run-time system that has been built within the adevs simulation package supports models with state-events and time-events and that comprise differential-algebraic systems with high index. Finally, although the procedure for effecting this transformation is based on adevs and the Discrete Event System Specification, it can be adapted to any discrete event simulation package.

  11. Swarm Intelligence for Urban Dynamics Modelling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghnemat, Rawan; Bertelle, Cyrille; Duchamp, Gerard H. E.

    2009-04-16

    In this paper, we propose swarm intelligence algorithms to deal with dynamical and spatial organization emergence. The goal is to model and simulate the development of spatial centers using multiple criteria. We combine a decentralized approach based on emergent clustering with spatial constraints or attractions. We propose an extension of the ant nest building algorithm with a multi-center, adaptive process. Typically, this model is suitable for analysing and simulating urban dynamics such as gentrification or the dynamics of cultural facilities in urban areas.
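    The ant nest building algorithm the authors extend belongs to the ant-clustering family, in which an ant's decision to pick up or drop an item depends on the local density of similar items; a sketch of the classic Deneubourg-style rules (the constants are illustrative, and the multi-center, adaptive extension is not shown) is:

```python
def pick_probability(local_density, k_pick=0.1):
    """Pick-up rule: an unladen ant takes an item with a probability that
    falls as the local density f (in [0, 1]) of similar items rises."""
    return (k_pick / (k_pick + local_density)) ** 2

def drop_probability(local_density, k_drop=0.3):
    """Drop rule: a laden ant deposits its item with a probability that
    rises with the local density of similar items."""
    return (local_density / (k_drop + local_density)) ** 2

# Isolated items are picked up readily and dropped reluctantly, so items
# accumulate into clusters -- the "emergent center" formation the paper uses.
p_pick_isolated = pick_probability(0.0)
p_drop_isolated = drop_probability(0.0)
```

    Spatial constraints or attraction fields, as in the paper, can then bias where these pick/drop events occur, steering which clusters become urban centers.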

  12. Simulating Self-Assembly with Simple Models

    NASA Astrophysics Data System (ADS)

    Rapaport, D. C.

    Results from recent molecular dynamics simulations of virus capsid self-assembly are described. The model is based on rigid trapezoidal particles designed to form polyhedral shells of size 60, together with an atomistic solvent. The underlying bonding process is fully reversible. More extensive computations are required than in previous work on icosahedral shells built from triangular particles, but the outcome is a high yield of closed shells. Intermediate clusters have a variety of forms, and bond counts provide a useful classification scheme.

  13. Swarm Intelligence for Urban Dynamics Modelling

    NASA Astrophysics Data System (ADS)

    Ghnemat, Rawan; Bertelle, Cyrille; Duchamp, Gérard H. E.

    2009-04-01

    In this paper, we propose swarm intelligence algorithms to deal with dynamical and spatial organization emergence. The goal is to model and simulate the development of spatial centers using multiple criteria. We combine a decentralized approach based on emergent clustering with spatial constraints or attractions. We propose an extension of the ant nest building algorithm with a multi-center, adaptive process. Typically, this model is suitable for analysing and simulating urban dynamics such as gentrification or the dynamics of cultural facilities in urban areas.

  14. Microcomputer based software for biodynamic simulation

    NASA Technical Reports Server (NTRS)

    Rangarajan, N.; Shams, T.

    1993-01-01

    This paper presents a description of a microcomputer based software package, called DYNAMAN, which has been developed to allow an analyst to simulate the dynamics of a system consisting of a number of mass segments linked by joints. One primary application is in predicting the motion of a human occupant in a vehicle under the influence of a variety of external forces, especially those generated during a crash event. Extensive use of a graphical user interface has been made to aid the user in setting up the input data for the simulation and in viewing the results from the simulation. Among its many applications, it has been successfully used in the prototype design of a moving seat that aids in occupant protection during a crash, by aircraft designers in evaluating occupant injury in airplane crashes, and by users in accident reconstruction for reconstructing the motion of the occupant and correlating the impacts with observed injuries.

  15. Modeling Common-Sense Decisions

    NASA Astrophysics Data System (ADS)

    Zak, Michail

    This paper presents a methodology for efficient synthesis of a dynamical model simulating a common-sense decision-making process. The approach is based upon an extension of physics' First Principles that includes the behavior of living systems. The new architecture consists of motor dynamics simulating the actual behavior of the object, and mental dynamics representing the evolution of the corresponding knowledge base and incorporating it in the form of information flows into the motor dynamics. The autonomy of the decision-making process is achieved by a feedback from mental to motor dynamics. This feedback replaces unavailable external information with an internal knowledge base stored in the mental model in the form of probability distributions.

  16. Design of a Modular Monolithic Implicit Solver for Multi-Physics Applications

    NASA Technical Reports Server (NTRS)

    Carton De Wiart, Corentin; Diosady, Laslo T.; Garai, Anirban; Burgess, Nicholas; Blonigan, Patrick; Ekelschot, Dirk; Murman, Scott M.

    2018-01-01

    The design of a modular multi-physics high-order space-time finite-element framework is presented, together with its extension to allow monolithic coupling of different physics. One of the main objectives of the framework is to perform efficient high-fidelity simulations of capsule/parachute systems. This problem requires simulating multiple physics including, but not limited to, the compressible Navier-Stokes equations, the dynamics of a moving body with mesh deformations and adaptation, the linear shell equations, non-reflective boundary conditions and wall modeling. The solver is based on high-order space-time finite-element methods. Continuous, discontinuous and C1-discontinuous Galerkin methods are implemented, allowing one to discretize various physical models. Tangent and adjoint sensitivity analyses are also targeted in order to conduct gradient-based optimization, error estimation, mesh adaptation, and flow control, adding another layer of complexity to the framework. The decisions made to tackle these challenges are presented. The discussion focuses first on the "single-physics" solver and later on its extension to the monolithic coupling of different physics. The implementation of different physics modules relevant to the capsule/parachute system is also presented. Finally, examples of coupled computations are presented, paving the way to the simulation of the full capsule/parachute system.

  17. Surgical stent planning: simulation parameter study for models based on DICOM standards.

    PubMed

    Scherer, S; Treichel, T; Ritter, N; Triebel, G; Drossel, W G; Burgert, O

    2011-05-01

    Endovascular Aneurysm Repair (EVAR) can be facilitated by a realistic simulation model of stent-vessel interaction; therefore, its numerical feasibility and integrability into the clinical environment were evaluated. The finite element method was used to determine the necessary simulation parameters for stent-vessel interaction in EVAR. Input variables and result data of the simulation model were examined for their standardization using DICOM supplements. The study identified four essential parameters for the stent-vessel simulation: blood pressure, intima constitution, plaque occurrence and the material properties of vessel and plaque. Output quantities such as the radial force of the stent and the contact pressure between stent and vessel can help the surgeon evaluate implant fixation and sealing. The model geometry can be saved with DICOM "Surface Segmentation" objects and the upcoming "Implant Templates" supplement, and simulation results can be stored using the "Structured Report". A standards-based general simulation model for optimizing stent-graft selection may be feasible. At present, there are limitations due to the specification of individual vessel material parameters and the simulation of the proximal fixation of stent-grafts with hooks. Simulation data with clinical relevance for documentation and presentation can be stored using existing or new DICOM extensions.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Thomas M.; Berndt, Markus; Baglietto, Emilio

    The purpose of this report is to document a multi-year plan for enhancing turbulence modeling in Hydra-TH for the Consortium for Advanced Simulation of Light Water Reactors (CASL) program. Hydra-TH is being developed to meet the high-fidelity, high-Reynolds-number, CFD-based thermal hydraulic simulation needs of the program. This work is being conducted within the thermal hydraulics methods (THM) focus area. This report is an extension of CASL THM milestone L3:THM.CFD.P10.02 [33] (March 2015) and picks up where it left off. It will also serve to meet the requirements of CASL THM level three milestone L3:THM.CFD.P11.04, scheduled for completion September 30, 2015. The objectives of this plan will be met by maturation of recently added turbulence models, strategic design and development of new models, and systematic and rigorous testing of existing and new models and model extensions. While multi-phase turbulent flow simulations are important to the program, only single-phase modeling is considered in this report. Large Eddy Simulation (LES) is also an important modeling methodology; however, at least in the first year, the focus is on steady-state Reynolds-Averaged Navier-Stokes (RANS) turbulence modeling.

  19. Appendices to the model description document for a computer program for the emulation/simulation of a space station environmental control and life support system

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    A Model Description Document for the Emulation/Simulation Computer Model was already published. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware, SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from testing. Slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is the creation of two system simulations using these models. The first system presented consists of one air and one water processing system; the second is a potential air revitalization system.

  20. Appendices to the user's manual for a computer program for the emulation/simulation of a space station environmental control and life support system

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    A User's Manual for the Emulation/Simulation Computer Model was published previously. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware, SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from testing. In addition, slight changes were made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is the creation of two system simulations using these models. The first system presented consists of one air and one water processing system; the second is a potential Space Station air revitalization system.

  1. Numerical Propulsion System Simulation Architecture

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia G.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground-based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates the addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.

  2. Intelligent fault management for the Space Station active thermal control system

    NASA Technical Reports Server (NTRS)

    Hill, Tim; Faltisco, Robert M.

    1992-01-01

    The Thermal Advanced Automation Project (TAAP) approach and architecture is described for automating the Space Station Freedom (SSF) Active Thermal Control System (ATCS). The baseline functionality and advanced automation techniques for Fault Detection, Isolation, and Recovery (FDIR) are compared and contrasted. Advanced automation techniques such as rule-based systems and model-based reasoning should be utilized to efficiently control, monitor, and diagnose this extremely complex physical system. TAAP is developing advanced FDIR software for use on the SSF thermal control system. The goal of TAAP is to join Knowledge-Based System (KBS) technology, using a combination of rules and model-based reasoning, with conventional monitoring and control software in order to maximize autonomy of the ATCS. TAAP's predecessor was NASA's Thermal Expert System (TEXSYS) project, which was the first large real-time expert system to use both extensive rules and model-based reasoning to control and perform FDIR on a large, complex physical system. TEXSYS showed that a method is needed for safely and inexpensively testing all possible faults of the ATCS, particularly those potentially damaging to the hardware, in order to develop a fully capable FDIR system. TAAP therefore includes the development of a high-fidelity simulation of the thermal control system. The simulation provides realistic, dynamic ATCS behavior and fault insertion capability for software testing without hardware-related risks or expense. In addition, thermal engineers will gain greater confidence in the KBS FDIR software than was possible prior to this kind of simulation testing. The TAAP KBS will initially be a ground-based extension of the baseline ATCS monitoring and control software and could be migrated on-board as additional computation resources are made available.

  3. Transient Side Load Analysis of Out-of-Round Film-Cooled Nozzle Extensions

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Lin, Jeff; Ruf, Joe; Guidos, Mike

    2012-01-01

    There was interest in understanding the impact of an out-of-round nozzle extension on the nozzle side load during transient startup operations. An out-of-round nozzle extension could be the result of asymmetric internal stresses, deformation induced by previous tests, or asymmetric loads induced by hardware attached to the nozzle. The objective of this study was therefore to computationally investigate the effect of an out-of-round nozzle extension on the nozzle side loads during an engine startup transient. The rocket engine studied encompasses a regeneratively cooled chamber and nozzle, along with a film-cooled nozzle extension. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, with transient inlet boundary flow properties derived from an engine system simulation. Six three-dimensional cases were performed, with the out-of-roundness achieved by three different degrees of ovalization elongated along the lateral y and z axes: one slightly out-of-round, one more out-of-round, and one significantly out-of-round. The results show that the separation-line jump was the primary source of the peak side loads. Compared with the peak side load of the perfectly round nozzle, the peak side loads increased for the slightly and more ovalized nozzle extensions, and either increased or decreased for the two significantly ovalized nozzle extensions. A theory is presented to explain these observations, based on the counteraction between the destabilizing effect of the exacerbated asymmetrical flow caused by a lower degree of ovalization and the stabilizing effect of the more symmetrical flow created by greater ovalization.

  4. Little by Little Does the Trick: Design and Construction of a Discrete Event Agent-Based Simulation Framework

    DTIC Science & Technology

    2007-12-01

    The framework prototypes an architectural design which is generalizable, reusable, and extensible, with an initial set of model elements and a Behavioral model. Finally, we build a small agent-based model using the component architecture to demonstrate the library's functionality.

  5. Enhanced TCAS 2/CDTI traffic Sensor digital simulation model and program description

    NASA Technical Reports Server (NTRS)

    Goka, T.

    1984-01-01

    Digital simulation models of enhanced TCAS 2/CDTI traffic sensors are developed, based on actual or projected operational and performance characteristics. Two enhanced Traffic (or Threat) Alert and Collision Avoidance Systems are considered. A digital simulation program is developed in FORTRAN. The program contains an executive with a semi-real-time batch processing capability, and can be interfaced with other modules with minimal requirements. Both the traffic sensor and CAS logic modules are validated by means of extensive simulation runs. Selected validation cases are discussed in detail, and the capabilities and limitations of the actual and simulated systems are noted. Although the TCAS systems are not specifically intended for Cockpit Display of Traffic Information (CDTI) applications, they are sufficiently general to allow implementation of CDTI functions within the real systems' constraints.

  6. pysimm: A Python Package for Simulation of Molecular Systems

    NASA Astrophysics Data System (ADS)

    Fortunato, Michael; Colina, Coray

    pysimm, short for python simulation interface for molecular modeling, is a python package designed to facilitate the structure generation and simulation of molecular systems through convenient and programmatic access to object-oriented representations of molecular system data. This poster presents core features of pysimm and design philosophies that highlight a generalized methodology for incorporating third-party software packages through API interfaces. The integration with the LAMMPS simulation package is explained to demonstrate this methodology. pysimm began as a back-end python library that powered a cloud-based application on nanohub.org for amorphous polymer simulation; its extension from a specific application library to a general-purpose simulation interface is explained. Additionally, this poster highlights the rapid development of new applications to construct polymer chains with control over chain morphology, such as molecular weight distribution and monomer composition.

  7. European consensus on a competency-based virtual reality training program for basic endoscopic surgical psychomotor skills.

    PubMed

    van Dongen, Koen W; Ahlberg, Gunnar; Bonavina, Luigi; Carter, Fiona J; Grantcharov, Teodor P; Hyltander, Anders; Schijven, Marlies P; Stefani, Alessandro; van der Zee, David C; Broeders, Ivo A M J

    2011-01-01

    Virtual reality (VR) simulators have been demonstrated to improve basic psychomotor skills in endoscopic surgery. The exercise configuration settings used for validation in studies published so far are default settings or are based on the personal choice of the tutors. The purpose of this study was to establish consensus on exercise configurations and on a validated training program for a virtual reality simulator, based on the experience of international experts, and to set criterion levels for a proficiency-based training program. A consensus meeting was held with eight European teams, all extensively experienced in using the VR simulator. Construct validity of the training program was tested by 20 experts and 60 novices. The data were analyzed using the t test for equality of means. Consensus was achieved on training designs, exercise configuration, and examination. Almost all exercises (7/8) showed construct validity. In total, 50 of 94 parameters (53%) showed a significant difference. A European, multicenter, validated training program was constructed according to the general consensus of a large international team with extended experience in virtual reality simulation. Therefore, a proficiency-based training program can be offered to training centers that use this simulator for training in basic psychomotor skills in endoscopic surgery.

  8. ReaxFF Study of the Oxidation of Softwood Lignin in View of Carbon Fiber Production

    DOE PAGES

    Beste, Ariana

    2014-10-06

    We investigate the oxidative, thermal conversion of softwood lignin by performing molecular dynamics simulations based on a reactive force field (ReaxFF). The lignin samples are constructed from coniferyl alcohol units, which are connected through linkages randomly selected from the natural distribution of linkages in softwood. The goal of this work is to simulate the oxidative stabilization step during carbon fiber production from a lignin precursor. We find that, under simulation conditions where stabilization reactions occur, the lignin fragments have already undergone extensive degradation. The 5-5 linkage shows the highest reactivity towards cyclization and dehydrogenation.

  9. Low Gravity Freefall Facilities

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Composite of Marshall Space Flight Center's Low-Gravity Free Fall Facilities. These facilities include a 100-meter drop tower and a 100-meter drop tube. The drop tower simulates in-flight microgravity conditions for up to 4.2 seconds for containerless processing experiments, immiscible fluids and materials research, pre-flight hardware design test and flight experiment simulation. The drop tube simulates in-flight microgravity conditions for up to 4.6 seconds and is used extensively for ground-based microgravity convection research in which extremely small samples are studied. The facility can provide deep undercooling for containerless processing experiments that require materials to remain in a liquid phase when cooled below the normal solidification temperature.

  10. Microgravity

    NASA Image and Video Library

    1981-03-30

    Composite of Marshall Space Flight Center's Low-Gravity Free Fall Facilities. These facilities include a 100-meter drop tower and a 100-meter drop tube. The drop tower simulates in-flight microgravity conditions for up to 4.2 seconds for containerless processing experiments, immiscible fluids and materials research, pre-flight hardware design test and flight experiment simulation. The drop tube simulates in-flight microgravity conditions for up to 4.6 seconds and is used extensively for ground-based microgravity convection research in which extremely small samples are studied. The facility can provide deep undercooling for containerless processing experiments that require materials to remain in a liquid phase when cooled below the normal solidification temperature.

  11. Energy simulation and optimization for a small commercial building through Modelica

    NASA Astrophysics Data System (ADS)

    Rivas, Bryan

    Small commercial buildings make up the majority of buildings in the United States. Energy consumed by these buildings is expected to increase drastically in the next few decades, with a large percentage of the energy consumed attributed to cooling systems. This work presents the simulation and optimization of a thermostat schedule to minimize energy consumption in a small commercial building test bed during the cooling season. The simulation is performed in the multi-engineering-domain Dymola environment, based on the open-source Modelica modeling language, and is optimized with the Java-based optimization program GenOpt. The simulation uses both physically based modeling, utilizing heat transfer principles for the building, and regression analysis for energy consumption. GenOpt is dynamically coupled to Dymola through various interface files. Very few studies have coupled GenOpt to a building simulation program, and even fewer have used Dymola for building simulation as extensively as the work presented here. The work demonstrates that Dymola is a viable alternative to other building simulation programs such as EnergyPlus and MATLAB. The model developed is used to simulate the energy consumption of the test bed, a commissioned real-world small commercial building, while maintaining indoor thermal comfort. Potential applications include smart or intelligent building systems, predictive simulation of small commercial buildings, and building diagnostics.

  12. Sample Size Estimation in Cluster Randomized Educational Trials: An Empirical Bayes Approach

    ERIC Educational Resources Information Center

    Rotondi, Michael A.; Donner, Allan

    2009-01-01

    The educational field has now accumulated an extensive literature reporting on values of the intraclass correlation coefficient, a parameter essential to determining the required size of a planned cluster randomized trial. We propose here a simple simulation-based approach including all relevant information that can facilitate this task. An…
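
    The role the intraclass correlation plays in sizing a cluster randomized trial can be sketched with the standard design-effect inflation (this is the textbook calculation, not the empirical Bayes procedure the authors propose; the example numbers are hypothetical):

```python
import math

def cluster_trial_size(n_individual, cluster_size, icc):
    """Inflate an individually randomized per-arm sample size by the design
    effect DEFF = 1 + (m - 1) * ICC, then convert to a number of clusters.
    m is the (assumed common) cluster size; ICC is the intraclass
    correlation coefficient."""
    deff = 1.0 + (cluster_size - 1.0) * icc
    n_total = n_individual * deff
    n_clusters = math.ceil(n_total / cluster_size)
    return deff, n_clusters

# e.g. 200 pupils per arm needed under individual randomization,
# classrooms of 25 pupils, ICC = 0.02
deff, k = cluster_trial_size(200, 25, 0.02)
```

    Here the design effect is 1.48, so the trial needs about 296 pupils, i.e. 12 classrooms, per arm; this is why an accurate ICC estimate (the paper's subject) matters before the trial is planned.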

  13. Accurate atomistic potentials and training sets for boron-nitride nanostructures

    NASA Astrophysics Data System (ADS)

    Tamblyn, Isaac

    Boron nitride nanotubes (BNNTs) exhibit exceptional structural, mechanical, and thermal properties. They are optically transparent and have high thermal stability, suggesting a wide range of opportunities for structural reinforcement of materials. Modeling can play an important role in determining the optimal approach to integrating nanotubes into a supporting matrix. Developing accurate, atomistic-scale models of such nanoscale interfaces embedded within composites is challenging, however, due to the mismatch of length scales involved. Typical nanotube diameters range from 5 to 50 nm, with lengths as large as a micron (i.e., a relevant length scale for structural reinforcement). Unlike for their carbon-based counterparts, well-tested and transferable interatomic force fields are not common for BNNTs. In light of this, we have developed an extensive training database of BN-rich materials, under conditions relevant for BNNT synthesis and composites, from extensive first-principles molecular dynamics simulations. Using these data, we have produced an artificial neural network potential capable of reproducing the accuracy of first-principles data at significantly reduced computational cost, allowing for accurate simulation at the much larger length scales needed for composite design.

  14. Flight test experience and controlled impact of a remotely piloted jet transport aircraft

    NASA Technical Reports Server (NTRS)

    Horton, Timothy W.; Kempel, Robert W.

    1988-01-01

    The Dryden Flight Research Facility of NASA Ames Research Center (Ames-Dryden) and the FAA conducted the controlled impact demonstration (CID) program using a large, four-engine, remotely piloted jet transport airplane. Closed-loop primary flight control was provided through the existing onboard PB-20D autopilot, which had been modified for the CID program. Uplink commands were sent from a ground-based cockpit and digital computer in conjunction with an up-down telemetry link; these commands were received aboard the airplane and transferred through uplink interface systems to the modified PB-20D autopilot. Both proportional and discrete commands were produced by the ground system. Prior to flight tests, extensive simulation was conducted during the development of the ground-based digital control laws. The control laws included primary control, secondary control, and racetrack and final approach guidance. Extensive ground checks were performed on all remotely piloted systems; however, piloted flight tests were the primary means of validating the control law concepts developed in simulation. The design, development, and flight testing of the control laws and systems required to accomplish the remotely piloted mission are discussed.

  15. Propagation based phase retrieval of simulated intensity measurements using artificial neural networks

    NASA Astrophysics Data System (ADS)

    Kemp, Z. D. C.

    2018-04-01

    Determining the phase of a wave from intensity measurements has many applications in fields such as electron microscopy, visible light optics, and medical imaging. Propagation based phase retrieval, where the phase is obtained from defocused images, has shown significant promise. There are, however, limitations in the accuracy of the retrieved phase arising from such methods. Sources of error include shot noise, image misalignment, and diffraction artifacts. We explore the use of artificial neural networks (ANNs) to improve the accuracy of propagation based phase retrieval algorithms applied to simulated intensity measurements. We employ a phase retrieval algorithm based on the transport-of-intensity equation to obtain the phase from simulated micrographs of procedurally generated specimens. We then train an ANN with pairs of retrieved and exact phases, and use the trained ANN to process a test set of retrieved phase maps. The total error in the phase is significantly reduced using this method. We also discuss a variety of potential extensions to this work.
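
    The transport-of-intensity step the abstract describes can be sketched for the simplest case of a uniform in-focus intensity, where the TIE reduces to a Poisson equation solvable in Fourier space (a minimal NumPy illustration assuming a square grid and periodic boundaries; the paper's retrieval algorithm and its ANN correction stage are more elaborate):

```python
import numpy as np

def tie_phase_uniform(dI_dz, I0, k, dx, eps=1e-6):
    """Solve the TIE -k dI/dz = div(I grad(phi)) for uniform in-focus
    intensity I0, where it reduces to laplacian(phi) = -(k/I0) dI/dz.
    The Laplacian is inverted in Fourier space; eps regularizes the
    unrecoverable DC / very-low-frequency components."""
    n = dI_dz.shape[0]
    q = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    qx, qy = np.meshgrid(q, q, indexing="ij")
    q2 = qx**2 + qy**2
    rhs = k * dI_dz / I0                       # = -laplacian(phi)
    phi_hat = np.fft.fft2(rhs) / (q2 + eps)    # laplacian <-> -q^2
    phi = np.real(np.fft.ifft2(phi_hat))
    return phi - phi.mean()                    # mean phase is undetermined
```

    Feeding this solver a longitudinal intensity derivative synthesized from a known phase recovers that phase up to its (physically undetermined) mean, which is exactly the kind of retrieved/exact pair the authors use to train their network.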

  16. An Agent-Based Modeling Framework and Application for the Generic Nuclear Fuel Cycle

    NASA Astrophysics Data System (ADS)

    Gidden, Matthew J.

    Key components of a novel methodology and implementation of an agent-based, dynamic nuclear fuel cycle simulator, Cyclus, are presented. The nuclear fuel cycle is a complex, physics-dependent supply chain. To date, existing dynamic simulators have not treated constrained fuel supply, time-dependent isotopic-quality-based demand, or fuel fungibility particularly well. Utilizing an agent-based methodology that incorporates sophisticated graph theory and operations research techniques can overcome these deficiencies. This work describes a simulation kernel and the agents that interact with it, highlighting the Dynamic Resource Exchange (DRE), the supply-demand framework at the heart of the kernel. The key agent-DRE interaction mechanisms are described, which enable complex entity interaction through the use of physics and socio-economic models. The translation of an exchange instance to a variant of the Multicommodity Transportation Problem, which can be solved feasibly or optimally, follows. An extensive investigation of solution performance and fidelity is then presented. Finally, recommendations for future users of Cyclus and the DRE are provided.
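
    The exchange-to-optimization translation can be illustrated with the single-commodity transportation core of that formulation (a sketch using scipy.optimize.linprog; the DRE's actual Multicommodity Transportation Problem variant adds commodities, preferences, and quality constraints not shown here):

```python
import numpy as np
from scipy.optimize import linprog

def solve_transport(cost, supply, demand):
    """Balanced transportation problem as an LP: minimize total shipping
    cost over flows x[i, j] from supplier i to consumer j, subject to
    each supplier shipping exactly its supply and each consumer
    receiving exactly its demand."""
    m, n = cost.shape
    A_eq, b_eq = [], []
    for i in range(m):                    # supply rows
        row = np.zeros(m * n)
        row[i * n:(i + 1) * n] = 1.0
        A_eq.append(row); b_eq.append(supply[i])
    for j in range(n):                    # demand rows
        row = np.zeros(m * n)
        row[j::n] = 1.0
        A_eq.append(row); b_eq.append(demand[j])
    res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=(0, None), method="highs")
    return res.x.reshape(m, n), res.fun
```

    In a DRE-like setting, the cost coefficients would come from the agents' physics and socio-economic preference models rather than being fixed constants.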

  17. TOWARDS ICE FORMATION CLOSURE IN MIXED-PHASE BOUNDARY LAYER CLOUDS DURING ISDAC

    NASA Astrophysics Data System (ADS)

    Avramov, A.; Ackerman, A. S.; Fridlind, A. M.; van Diedenhoven, B.; Korolev, A. V.

    2009-12-01

    Mixed-phase stratus clouds are ubiquitous in the Arctic during the winter and transition seasons. Despite their important role in various climate feedback mechanisms they are not well understood and are difficult to represent faithfully in cloud models. In particular, models of all types experience difficulties reproducing observed ice concentrations and liquid/ice water partitioning in these clouds. Previous studies have demonstrated that simulated ice concentrations and ice water content are critically dependent on ice nucleation modes and ice crystal habit assumed in simulations. In this study we use large-eddy simulations with size-resolved microphysics to determine whether uncertainties in ice nucleus concentrations, ice nucleation mechanisms, ice crystal habits and large-scale forcing are sufficient to account for the difference between simulated and observed quantities. We present results of simulations of two case studies based on observations taken during the recent Indirect and Semi-Direct Aerosol Campaign (ISDAC) on April 8 and 26, 2008. The model simulations are evaluated through extensive comparison with in-situ observations and ground-based remote sensing measurements.

  18. Comparative study on gene set and pathway topology-based enrichment methods.

    PubMed

    Bayerlová, Michaela; Jung, Klaus; Kramer, Frank; Klemm, Florian; Bleckmann, Annalen; Beißbarth, Tim

    2015-10-22

    Enrichment analysis is a popular approach to identify pathways or sets of genes which are significantly enriched in the context of differentially expressed genes. The traditional gene set enrichment approach considers a pathway as a simple gene list, disregarding any knowledge of gene or protein interactions. In contrast, the newer group of so-called pathway topology-based methods integrates the topological structure of a pathway into the analysis. We comparatively investigated gene set and pathway topology-based enrichment approaches, considering three gene set and four topological methods. These methods were compared in two extensive simulation studies and on a benchmark of 36 real datasets, providing the same pathway input data for all methods. In the benchmark data analysis, both types of methods showed a comparable ability to detect enriched pathways. The first simulation study was conducted with KEGG pathways, which showed considerable gene overlaps with each other. In this study with original KEGG pathways, none of the topology-based methods outperformed the gene set approach. Therefore, a second simulation study was performed on non-overlapping pathways created from unique gene IDs. Here, methods accounting for pathway topology reached higher accuracy than the gene set methods; however, their sensitivity was lower. We conducted one of the first comprehensive comparative works evaluating gene set against pathway topology-based enrichment methods. The topological methods showed better performance in the simulation scenarios with non-overlapping pathways; however, they were not conclusively better in the other scenarios. This suggests that the simple gene set approach might be sufficient to detect an enriched pathway under realistic circumstances. Nevertheless, more extensive studies and further benchmark data are needed to systematically evaluate these methods and to assess what gain and cost pathway topology information introduces into enrichment analysis. Both types of methods for enrichment analysis require further improvements in order to deal with the problem of pathway overlaps.
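
    The traditional gene set approach the study compares against can be sketched as a hypergeometric over-representation test (a generic illustration of the classic test, not any of the specific implementations benchmarked in the paper):

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """One-sided over-representation p-value for a gene set:
    N genes in the universe, K of them in the pathway, n differentially
    expressed genes, k of those falling in the pathway. Returns
    P(X >= k) under the hypergeometric null -- the probability of seeing
    at least k pathway genes among n random draws without replacement."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total
```

    Topology-based methods replace this pathway-as-gene-list null with scores that weight genes by their position and connections in the pathway graph, which is exactly the distinction the comparison above probes.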

  19. Numerical simulation of geomorphic, climatic and anthropogenic drivers of soil distribution on semi-arid hillslopes

    NASA Astrophysics Data System (ADS)

    Willgoose, G. R.; Cohen, S.; Svoray, T.; Sela, S.; Hancock, G. R.

    2010-12-01

    Numerical models are an important tool for studying landscape processes as they allow us to isolate specific processes and drivers and test various physics and spatio-temporal scenarios. Here we use a distributed, physically based soil evolution model (mARM4D) to describe the drivers and processes controlling soil-landscape evolution at a field site on the fringe between the Mediterranean and desert regions of Israel. This study is an initial effort in a larger project aimed at improving our understanding of the mechanisms and drivers that led to the extensive removal of soils from the loess-covered hillslopes of this region. This region is interesting as it lies between the Mediterranean climate region, in which widespread erosion from hillslopes was attributed to human activity during the Holocene, and the arid region, in which extensive removal of loess from hillslopes was shown to have been driven by climatic changes during the late Pleistocene. First we study the sediment transport mechanism of the soil-landscape evolution processes at our study site, simulating soil-landscape evolution with only one sediment transport process (fluvial or diffusive) at a time. We find that diffusive sediment transport is likely the dominant process at this site, as it results in soil distributions that better correspond to current observations. We then simulate several realistic climatic/anthropogenic scenarios (based on the literature) in order to quantify the sensitivity of the soil-landscape evolution process to temporal fluctuations. We find that this site is relatively insensitive to sharp, short-term changes (lasting several thousand years). This suggests that climate, rather than human activity, was the main driver for the extensive removal of loess from the hillslopes.

  20. NASA/ESA CV-990 spacelab simulation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Due to interest in the application of simplified techniques used to conduct airborne science missions at NASA's Ames Research Center, a joint NASA/ESA endeavor was established to conduct an extensive Spacelab simulation using the NASA CV-990 airborne laboratory. The scientific payload was selected to perform studies in upper atmospheric physics and infrared astronomy, with principal investigators from France, the Netherlands, England, and several groups from the United States. Communication links between the 'Spacelab' and a ground-based mission operations center were limited, consistent with Spacelab plans. The mission was successful and provided extensive data relevant to Spacelab objectives on the overall management of a complex international payload; experiment preparation, testing, and integration; training for proxy operation in space; data handling; multiexperimenter use of common experimenter facilities (telescopes); multiexperiment operation by experiment operators; selection criteria for Spacelab experiment operators; and schedule requirements to prepare for such a Spacelab mission.

  1. Power independent EMG based gesture recognition for robotics.

    PubMed

    Li, Ling; Looney, David; Park, Cheolsoo; Rehman, Naveed U; Mandic, Danilo P

    2011-01-01

    A novel method for detecting muscle contraction is presented and further developed for identifying four different gestures to facilitate a hand-gesture-controlled robot system. Detection is based on surface electromyogram (EMG) measurements of groups of arm muscles. Cross-channel information is preserved through simultaneous processing of the EMG channels using a recent multivariate extension of Empirical Mode Decomposition (EMD). Next, phase synchrony measures are employed to make the system robust to the different power levels caused by electrode placements and impedances. The multiple pairwise muscle synchronies are used as features of a discrete gesture space comprising four gestures (flexion, extension, pronation, supination). Simulations of real-time robot control illustrate the enhanced accuracy and robustness of the proposed methodology.
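
    The phase-synchrony feature at the core of such a method can be illustrated with the phase locking value (PLV) between two channels (a minimal sketch using an FFT-based Hilbert transform on even-length signals; the paper computes synchrony on multivariate-EMD components, which is not reproduced here):

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the frequency-domain Hilbert transform:
    zero the negative frequencies, double the positive ones.
    Assumes an even-length real input."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:n // 2] = 2.0
    h[n // 2] = 1.0
    return np.fft.ifft(X * h)

def phase_locking_value(x, y):
    """PLV in [0, 1]: 1 means a constant instantaneous phase difference
    between the two channels, near 0 means no consistent phase relation.
    Because only phase (not amplitude) enters, the measure is insensitive
    to per-channel power differences."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return np.abs(np.mean(np.exp(1j * dphi)))
```

    The power-independence noted in the abstract follows from discarding the amplitude of the analytic signal: scaling either channel leaves the PLV unchanged.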

  2. An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design

    NASA Technical Reports Server (NTRS)

    Lin, Risheng; Afjeh, Abdollah A.

    2003-01-01

    Crucial to efficient aircraft simulation-based design is a robust data modeling methodology for both recording information and transferring data readily and reliably. To meet this goal, data modeling issues involved in multidisciplinary aircraft design are first analyzed in this study. Next, an XML-based, extensible data object model for multidisciplinary aircraft design is constructed and implemented. Implementing the model through aircraft databinding allows design applications to access and manipulate any disciplinary data with a lightweight and easy-to-use API. In addition, the language-independent representation of aircraft disciplinary data in the model fosters interoperability amongst heterogeneous systems, thereby facilitating data sharing and exchange between various design tools and systems.
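
    The databinding idea — exposing disciplinary XML data through a lightweight API — can be sketched as follows. The element names and schema here are hypothetical, invented for illustration; the paper's actual data model is not shown in the abstract.

```python
import xml.etree.ElementTree as ET

# Hypothetical disciplinary record (not the paper's actual schema).
AIRCRAFT_XML = """
<aircraft name="demo">
  <aerodynamics><cruiseMach>0.78</cruiseMach></aerodynamics>
  <propulsion><thrust unit="kN">120.0</thrust></propulsion>
</aircraft>
"""

def _is_number(text):
    try:
        float(text)
        return True
    except (TypeError, ValueError):
        return False

class DisciplineBinding:
    """Lightweight databinding: expose one discipline's child elements as attributes."""
    def __init__(self, element):
        self._element = element

    def __getattr__(self, name):
        child = self._element.find(name)
        if child is None:
            raise AttributeError(name)
        return float(child.text) if _is_number(child.text) else child.text

root = ET.fromstring(AIRCRAFT_XML)
aero = DisciplineBinding(root.find("aerodynamics"))
prop = DisciplineBinding(root.find("propulsion"))
print(aero.cruiseMach, prop.thrust)  # → 0.78 120.0
```

The point of the binding layer is that a design tool reads `aero.cruiseMach` rather than walking the XML tree itself, so the underlying representation can evolve without breaking callers.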

  3. A Network Coding Based Hybrid ARQ Protocol for Underwater Acoustic Sensor Networks

    PubMed Central

    Wang, Hao; Wang, Shilian; Zhang, Eryang; Zou, Jianbin

    2016-01-01

    Underwater Acoustic Sensor Networks (UASNs) have attracted increasing interest in recent years due to their extensive commercial and military applications. However, the harsh underwater channel causes many challenges for the design of reliable underwater data transport protocol. In this paper, we propose an energy efficient data transport protocol based on network coding and hybrid automatic repeat request (NCHARQ) to ensure reliability, efficiency and availability in UASNs. Moreover, an adaptive window length estimation algorithm is designed to optimize the throughput and energy consumption tradeoff. The algorithm can adaptively change the code rate and can be insensitive to the environment change. Extensive simulations and analysis show that NCHARQ significantly reduces energy consumption with short end-to-end delay. PMID:27618044
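
    The abstract does not give NCHARQ's coding scheme, but the basic network-coding building block such hybrid ARQ protocols rest on can be sketched: XORing two packets into one coded retransmission, so a receiver that already holds one packet can recover the other without both being resent. The packet contents below are invented.

```python
def xor_bytes(a, b):
    """Bitwise XOR of two equal-length byte strings (the coding primitive)."""
    return bytes(x ^ y for x, y in zip(a, b))

p1 = b"sensor-reading-A"          # delivered successfully
p2 = b"sensor-reading-B"          # lost in the underwater channel
coded = xor_bytes(p1, p2)         # one coded retransmission instead of two
recovered = xor_bytes(coded, p1)  # receiver cancels the packet it already holds
print(recovered == p2)  # → True
```

Saving retransmissions this way is what drives the energy reduction: acoustic transmission dominates a UASN node's power budget, so one coded packet serving two losses halves that cost.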

  4. Online model checking approach based parameter estimation to a neuronal fate decision simulation model in Caenorhabditis elegans with hybrid functional Petri net with extension.

    PubMed

    Li, Chen; Nagasaki, Masao; Koh, Chuan Hock; Miyano, Satoru

    2011-05-01

    Mathematical modeling and simulation studies are playing an increasingly important role in helping researchers elucidate how living organisms function in cells. In systems biology, researchers typically tune many parameters manually to achieve simulation results that are consistent with biological knowledge, which severely limits the size and complexity of the simulation models built. To break this limitation, we propose a computational framework that automatically estimates kinetic parameters for a given network structure. We utilized an online (on-the-fly) model checking technique, which saves resources compared to the offline approach, with a quantitative modeling and simulation architecture named hybrid functional Petri net with extension (HFPNe). We demonstrate the applicability of this framework by analyzing the underlying model for the neuronal cell fate decision model (ASE fate model) in Caenorhabditis elegans. First, we built a quantitative ASE fate model containing 3327 components emulating nine genetic conditions. Then, using our efficient online model checker, MIRACH 1.0, together with parameter estimation, we performed 20 million simulation runs and located 57 parameter sets for 23 parameters in the model that are consistent with 45 biological rules extracted from published biological articles, without much manual intervention. To evaluate the robustness of these 57 parameter sets, we ran another 20 million simulation runs using different magnitudes of noise. Among these models, one proved the most reasonable and robust owing to its high stability against these stochastic noises. Our simulation results provide interesting biological findings that could be used for future wet-lab experiments.

  5. Extreme events in a vortex gas simulation of a turbulent half-jet

    NASA Astrophysics Data System (ADS)

    Suryanarayanan, Saikishan; Pathikonda, Gokul; Narasimha, Roddam

    2012-11-01

    Extensive simulations [arXiv:1008.2876v1 [physics.flu-dyn], BAPS.2010.DFD.LE.4] have shown that the temporally evolving vortex gas mixing layer has three regimes, including one with a universal spreading rate. The present study explores the development of spatially evolving mixing layers using a vortex gas model based on Basu et al. (1995, Appl. Math. Modelling). The effects of the velocity ratio (r) are analyzed via the most extensive simulations of this kind to date, involving up to 10,000 vortices and averaging over up to 1000 convective times. While the temporal limit is approached as r approaches unity, striking features such as extreme events involving coherent structures, bending, deviation of the convection velocity from the mean velocity, spatial feedback, and greater sensitivity to downstream and free-stream boundary conditions are observed in the half-jet (r = 0) limit. A detailed statistical analysis reveals possible causes for the large scatter across experiments, as opposed to the commonly adopted explanation of asymptotic dependence on initial conditions. Supported in part by contract no. Intel/RN/4288.

  6. Rank-preserving regression: a more robust rank regression model against outliers.

    PubMed

    Chen, Tian; Kowalski, Jeanne; Chen, Rui; Wu, Pan; Zhang, Hui; Feng, Changyong; Tu, Xin M

    2016-08-30

    Mean-based semi-parametric regression models such as the popular generalized estimating equations (GEE) are widely used to improve robustness of inference over parametric models. Unfortunately, such models are quite sensitive to outlying observations. The Wilcoxon-score-based rank regression (RR) provides estimates that are more robust to outliers than GEE. However, the RR and its extensions do not sufficiently address missing data arising in longitudinal studies. In this paper, we propose a new approach to address outliers under a different framework based on functional response models. This functional-response-model-based alternative not only addresses limitations of the RR and its extensions for longitudinal data but, with its rank-preserving property, provides even more robust estimates than these alternatives. The proposed approach is illustrated with both real and simulated data. Copyright © 2016 John Wiley & Sons, Ltd.

  7. SU-F-T-156: Monte Carlo Simulation Using TOPAS for Synchrotron Based Proton Discrete Spot Scanning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moskvin, V; Pirlepesov, F; Tsiamas, P

    Purpose: This study provides an overview of the design and commissioning of the Monte Carlo (MC) model of the spot-scanning proton therapy nozzle and its implementation for patient plan simulation. Methods: The Hitachi PROBEAT V scanning nozzle was simulated based on vendor specifications using the TOPAS extension of the Geant4 code. FLUKA MC simulation was also utilized to provide supporting data for the main simulation. Validation of the MC model was performed using vendor-provided data and measurements collected during acceptance/commissioning of the proton therapy machine. Actual patient plans using CT-based treatment geometry were simulated and compared to the dose distributions produced by the treatment planning system (Varian Eclipse 13.6) and to patient quality assurance measurements. In-house MATLAB scripts are used for converting DICOM data into TOPAS input files. Results: Comparison analysis of integrated depth doses (IDDs), therapeutic ranges (R90), and spot shapes/sizes at different distances from the isocenter indicates good agreement between MC and measurements. R90 agreement is within 0.15 mm across all energy tunes. IDD and spot shape/size differences are within the statistical error of the simulation (less than 1.5%). The MC-simulated data, validated with physical measurements, were used for the commissioning of the treatment planning system. Patient geometry simulations were conducted based on the Eclipse-produced DICOM plans. Conclusion: The treatment nozzle and standard option beam model were implemented in the TOPAS framework to simulate a highly conformal discrete spot-scanning proton beam system.

  8. The sixth generation robot in space

    NASA Technical Reports Server (NTRS)

    Butcher, A.; Das, A.; Reddy, Y. V.; Singh, H.

    1990-01-01

    The knowledge-based simulator developed in the artificial intelligence laboratory has become a working test bed for experimenting with intelligent reasoning architectures. With this simulator, small experiments have recently been done with the aim of simulating robot behavior that avoids colliding paths. An automatic extension of such experiments to intelligently planning robots in space demands advanced reasoning architectures. One such architecture for general-purpose problem solving is explored. The robot, seen as a knowledge base machine, goes through a predesigned abstraction mechanism for problem understanding and response generation. The three phases in one such abstraction scheme are: abstraction for representation, abstraction for evaluation, and abstraction for resolution. Such abstractions require multimodality, which in turn requires the use of intensional variables to deal with beliefs in the system. Abstraction mechanisms help in synthesizing possible propagating lattices for such beliefs. The machine controller enters into a sixth generation paradigm.

  9. Atomic oxygen effects on spacecraft materials: The state of the art of our knowledge

    NASA Technical Reports Server (NTRS)

    Koontz, Steven L.

    1989-01-01

    Extensive quantitative data are available in the flight materials exposure database, but only from limited exposures in a narrow range of orbital environments. More data are needed across a wider range of environments and over longer exposure times. Data on synergistic effects with other environmental factors, on polar-orbit and higher-altitude environments, and on real-time materials degradation are needed to understand degradation kinetics and mechanisms. Almost no laboratory data exist from high-fidelity simulations of the LEO environment. Simulation and test systems are under development, and the database is scanty. Theoretical understanding of hyperthermal atom-surface reactions in the LEO environment is not good enough to support development of reliable accelerated test methods. Laser-sustained-discharge atom beam sources are the most promising high-fidelity simulation and test systems at this time.

  10. BEAM DYNAMICS SIMULATIONS FOR A DC GUN BASED INJECTOR FOR PERL.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ZHOU,F.; BEN-ZVI,I.; WANG,X.J.

    2001-06-18

    The National Synchrotron Light Source (NSLS) at Brookhaven National Laboratory (BNL) is considering an upgrade based on the Photoinjected Energy Recovering Linac (PERL). Various injector schemes for this machine are being extensively investigated at BNL. One possible option is a photocathode DC gun. The schematic layout of a PERL DC-gun-based injector and its preliminary beam dynamics are presented in this paper. The transverse and longitudinal emittances of the photoelectron beam were optimized for a DC field of 500 kV.

  11. Boat, wake, and wave real-time simulation

    NASA Astrophysics Data System (ADS)

    Świerkowski, Leszek; Gouthas, Efthimios; Christie, Chad L.; Williams, Owen M.

    2009-05-01

    We describe the extension of our real-time scene generation software VIRSuite to include the dynamic simulation of small boats and their wakes within an ocean environment. Extensive use has been made of the programmability available in the current generation of GPUs. We have demonstrated that real-time simulation is feasible, even including such complexities as dynamical calculation of the boat motion, wake generation, and calculation of an FFT-generated sea state.

  12. A continuous analog of run length distributions reflecting accumulated fractionation events.

    PubMed

    Yu, Zhe; Sankoff, David

    2016-11-11

    We propose a new, continuous model of the fractionation process (duplicate gene deletion after polyploidization) on the real line. The aim is to infer how much DNA is deleted at a time, based on segment lengths for alternating deleted (invisible) and undeleted (visible) regions. After deriving a number of analytical results for "one-sided" fractionation, we undertake a series of simulations that help us identify the distribution of segment lengths as a gamma with shape and rate parameters evolving over time. This leads to an inference procedure based on observed length distributions for visible and invisible segments. We suggest extensions of this mathematical and simulation work to biologically realistic discrete models, including two-sided fractionation.
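
    The moment-based step of such an inference procedure can be sketched: fitting a gamma distribution's shape and rate to observed segment lengths via the method of moments (shape = mean²/variance, rate = mean/variance). This is a simplified illustration with synthetic data, not the paper's full procedure for jointly handling visible and invisible segments.

```python
import random

def fit_gamma_moments(lengths):
    """Method-of-moments gamma fit: shape = mean^2/variance, rate = mean/variance."""
    n = len(lengths)
    mean = sum(lengths) / n
    var = sum((x - mean) ** 2 for x in lengths) / n
    return mean * mean / var, mean / var

# Draw synthetic "visible segment" lengths from a known gamma, then recover it.
random.seed(0)
true_shape, true_rate = 2.0, 0.5
sample = [random.gammavariate(true_shape, 1.0 / true_rate) for _ in range(20000)]
shape_hat, rate_hat = fit_gamma_moments(sample)
print(shape_hat, rate_hat)  # close to 2.0 and 0.5
```

Tracking how the fitted shape and rate drift as more fractionation events accumulate is what lets the distribution's time evolution be read off from observed length data.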

  13. Enhancing the Behavioral Fidelity of Synthetic Entities with Human Behavior Models

    DTIC Science & Technology

    2004-05-05

    reflecting the soldier’s extensive training. A civilian’s behavior in the same situation will be determined more by emotions, such as fear, and goals...of intelligent behavior, from path-planning to emotional effects, data on the environment must be gathered from the simulation to serve as sensor...model of decision-making based on emotional utility. AI.Implant takes a composite behavior-based approach to individual and crowd navigation

  14. Improved estimation of ligand macromolecule binding affinities by linear response approach using a combination of multi-mode MD simulation and QM/MM methods

    NASA Astrophysics Data System (ADS)

    Khandelwal, Akash; Balaz, Stefan

    2007-01-01

    Structure-based predictions of binding affinities of ligands binding to proteins by coordination bonds with transition metals, covalent bonds, and bonds involving charge re-distributions are hindered by the absence of proper force fields. This shortcoming affects all methods which use force-field-based molecular simulation data on complex formation for affinity predictions. One of the most frequently used methods in this category is the Linear Response (LR) approach of Åqvist, correlating binding affinities with van der Waals and electrostatic energies, as extended by Jorgensen's inclusion of solvent-accessible surface areas. All these terms represent the differences, upon binding, in the ensemble averages of pertinent quantities, obtained from molecular dynamics (MD) or Monte Carlo simulations of the complex and of single components. Here we report a modification of the LR approach by: (1) the replacement of the two energy terms with the single-point QM/MM energy of the time-averaged complex structure from an MD simulation; and (2) a rigorous consideration of multiple modes (mm) of binding. The first extension alleviates the force-field-related problems, while the second deals with ligands exhibiting large-scale motions in the course of an MD simulation. The second modification results in a correlation equation that is nonlinear in the optimized coefficients, but does not increase their number. The application of the resulting mm QM/MM LR approach to the inhibition of zinc-dependent gelatinase B (matrix metalloproteinase 9) by 28 hydroxamate ligands indicates a significant improvement of descriptive and predictive abilities.

  15. A virtual source model for Monte Carlo simulation of helical tomotherapy.

    PubMed

    Yuan, Jiankui; Rong, Yi; Chen, Quan

    2015-01-08

    The purpose of this study was to present a Monte Carlo (MC) simulation method based on a virtual source, jaw, and MLC model to calculate dose in patient for helical tomotherapy without the need of calculating phase-space files (PSFs). Current studies on the tomotherapy MC simulation adopt a full MC model, which includes extensive modeling of radiation source, primary and secondary jaws, and multileaf collimator (MLC). In the full MC model, PSFs need to be created at different scoring planes to facilitate the patient dose calculations. In the present work, the virtual source model (VSM) we established was based on the gold standard beam data of a tomotherapy unit, which can be exported from the treatment planning station (TPS). The TPS-generated sinograms were extracted from the archived patient XML (eXtensible Markup Language) files. The fluence map for the MC sampling was created by incorporating the percentage leaf open time (LOT) with leaf filter, jaw penumbra, and leaf latency contained from sinogram files. The VSM was validated for various geometry setups and clinical situations involving heterogeneous media and delivery quality assurance (DQA) cases. An agreement of < 1% was obtained between the measured and simulated results for percent depth doses (PDDs) and open beam profiles for all three jaw settings in the VSM commissioning. The accuracy of the VSM leaf filter model was verified in comparing the measured and simulated results for a Picket Fence pattern. An agreement of < 2% was achieved between the presented VSM and a published full MC model for heterogeneous phantoms. For complex clinical head and neck (HN) cases, the VSM-based MC simulation of DQA plans agreed with the film measurement with 98% of planar dose pixels passing on the 2%/2 mm gamma criteria. For patient treatment plans, results showed comparable dose-volume histograms (DVHs) for planning target volumes (PTVs) and organs at risk (OARs). 
Deviations observed in this study were consistent with literature. The VSM-based MC simulation approach can be feasibly built from the gold standard beam model of a tomotherapy unit. The accuracy of the VSM was validated against measurements in homogeneous media, as well as published full MC model in heterogeneous media.

  16. A virtual source model for Monte Carlo simulation of helical tomotherapy

    PubMed Central

    Yuan, Jiankui; Rong, Yi

    2015-01-01

    The purpose of this study was to present a Monte Carlo (MC) simulation method based on a virtual source, jaw, and MLC model to calculate dose in patient for helical tomotherapy without the need of calculating phase‐space files (PSFs). Current studies on the tomotherapy MC simulation adopt a full MC model, which includes extensive modeling of radiation source, primary and secondary jaws, and multileaf collimator (MLC). In the full MC model, PSFs need to be created at different scoring planes to facilitate the patient dose calculations. In the present work, the virtual source model (VSM) we established was based on the gold standard beam data of a tomotherapy unit, which can be exported from the treatment planning station (TPS). The TPS‐generated sinograms were extracted from the archived patient XML (eXtensible Markup Language) files. The fluence map for the MC sampling was created by incorporating the percentage leaf open time (LOT) with leaf filter, jaw penumbra, and leaf latency contained from sinogram files. The VSM was validated for various geometry setups and clinical situations involving heterogeneous media and delivery quality assurance (DQA) cases. An agreement of <1% was obtained between the measured and simulated results for percent depth doses (PDDs) and open beam profiles for all three jaw settings in the VSM commissioning. The accuracy of the VSM leaf filter model was verified in comparing the measured and simulated results for a Picket Fence pattern. An agreement of <2% was achieved between the presented VSM and a published full MC model for heterogeneous phantoms. For complex clinical head and neck (HN) cases, the VSM‐based MC simulation of DQA plans agreed with the film measurement with 98% of planar dose pixels passing on the 2%/2 mm gamma criteria. For patient treatment plans, results showed comparable dose‐volume histograms (DVHs) for planning target volumes (PTVs) and organs at risk (OARs). 
Deviations observed in this study were consistent with literature. The VSM‐based MC simulation approach can be feasibly built from the gold standard beam model of a tomotherapy unit. The accuracy of the VSM was validated against measurements in homogeneous media, as well as published full MC model in heterogeneous media. PACS numbers: 87.53.‐j, 87.55.K‐ PMID:25679157

  17. Physical environment virtualization for human activities recognition

    NASA Astrophysics Data System (ADS)

    Poshtkar, Azin; Elangovan, Vinayak; Shirkhodaie, Amir; Chan, Alex; Hu, Shuowen

    2015-05-01

    Human activity recognition research relies heavily on extensive datasets to verify and validate the performance of activity recognition algorithms. However, obtaining real datasets is expensive and highly time-consuming. A physics-based virtual simulation can accelerate the development of context-based human activity recognition algorithms and techniques by generating relevant training and testing videos simulating diverse operational scenarios. In this paper, we discuss in detail the requisite capabilities of a virtual environment to serve as a test bed for evaluating and enhancing activity recognition algorithms. To demonstrate the numerous advantages of virtual environment development, a newly developed virtual environment simulation modeling (VESM) environment is presented here to generate calibrated multisource imagery datasets suitable for the development and testing of recognition algorithms for context-based human activities. The VESM environment serves as a versatile test bed to generate a vast amount of realistic data for the training and testing of sensor processing algorithms. To demonstrate the effectiveness of the VESM environment, we present various simulated scenarios and processed results to infer proper semantic annotations from the high-fidelity imagery data for human-vehicle activity recognition under different operational contexts.

  18. Verifying Diagnostic Software

    NASA Technical Reports Server (NTRS)

    Lindsey, Tony; Pecheur, Charles

    2004-01-01

    Livingstone PathFinder (LPF) is a simulation-based computer program for verifying autonomous diagnostic software. LPF is designed especially to be applied to NASA's Livingstone computer program, which implements a qualitative-model-based algorithm that diagnoses faults in a complex automated system (e.g., an exploratory robot, spacecraft, or aircraft). LPF forms a software test bed containing a Livingstone diagnosis engine, embedded in a simulated operating environment consisting of a simulator of the system to be diagnosed by Livingstone and a driver program that issues commands and faults according to a nondeterministic scenario provided by the user. LPF runs the test bed through all executions allowed by the scenario, checking for various selectable error conditions after each step. All components of the test bed are instrumented, so that execution can be single-stepped both backward and forward. The architecture of LPF is modular and includes generic interfaces to facilitate substitution of alternative versions of its different parts. Altogether, LPF provides a flexible, extensible framework for simulation-based analysis of diagnostic software; these characteristics also render it amenable to application to diagnostic programs other than Livingstone.

  19. Modeling and simulation for space medicine operations: preliminary requirements considered

    NASA Technical Reports Server (NTRS)

    Dawson, D. L.; Billica, R. D.; McDonald, P. V.

    2001-01-01

    The NASA Space Medicine program is now developing plans for more extensive use of high-fidelity medical simulation systems. The use of simulation is seen as a means to use the limited time available for astronaut medical training more effectively. Training systems should be adaptable for use in a variety of training environments, including classrooms or laboratories, space vehicle mockups, analog environments, and microgravity. Modeling and simulation can also provide the space medicine development program with a mechanism for evaluating other medical technologies under operationally realistic conditions. Systems and procedures need preflight verification with ground-based testing. Traditionally, component testing has been accomplished, but practical means for "human in the loop" verification of patient care systems have been lacking. Medical modeling and simulation technology offers a potential means to accomplish such validation work. Initial considerations in the development of functional requirements and design standards for simulation systems for space medicine are discussed.

  20. Requirements for Modeling and Simulation for Space Medicine Operations: Preliminary Considerations

    NASA Technical Reports Server (NTRS)

    Dawson, David L.; Billica, Roger D.; Logan, James; McDonald, P. Vernon

    2001-01-01

    The NASA Space Medicine program is now developing plans for more extensive use of high-fidelity medical simulation systems. The use of simulation is seen as a means to use the limited time available for astronaut medical training more effectively. Training systems should be adaptable for use in a variety of training environments, including classrooms or laboratories, space vehicle mockups, analog environments, and microgravity. Modeling and simulation can also provide the space medicine development program with a mechanism for evaluating other medical technologies under operationally realistic conditions. Systems and procedures need preflight verification with ground-based testing. Traditionally, component testing has been accomplished, but practical means for "human in the loop" verification of patient care systems have been lacking. Medical modeling and simulation technology offers a potential means to accomplish such validation work. Initial considerations in the development of functional requirements and design standards for simulation systems for space medicine are discussed.

  1. NMR diffusion simulation based on conditional random walk.

    PubMed

    Gudbjartsson, H; Patz, S

    1995-01-01

    The authors introduce here a new, very fast simulation method for free diffusion in a linear magnetic field gradient, which is an extension of the conventional Monte Carlo (MC) method or the convolution method described by Wong et al. (in 12th SMRM, New York, 1993, p. 10). In earlier NMR diffusion simulation methods, such as the finite difference (FD) method, the Monte Carlo method, and the deterministic convolution method, the outcome of the calculations depends on the simulation time step. In the authors' method, however, the results are independent of the time step (although in the convolution method the step size still has to be adequate for spins to diffuse to adjacent grid points). By always selecting the largest possible time step, the computation time can therefore be reduced. Finally, the authors point out that in simple geometric configurations their simulation algorithm can be used to reduce computation time in the simulation of restricted diffusion.
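
    For contrast with the step-size-independent method described, here is a sketch of a conventional fixed-step Monte Carlo of free diffusion in a constant linear gradient (not the authors' algorithm). Each spin random-walks in one dimension and accumulates phase; the ensemble-averaged signal should approach the analytic attenuation exp(-(γg)²DT³/3), and the agreement degrades as the time step grows, which is exactly the dependence the authors' method removes. All parameter values below are illustrative.

```python
import math
import random

def mc_diffusion_attenuation(D, gamma_g, T, n_steps, n_spins, seed=1):
    """Fixed-time-step Monte Carlo of free 1-D diffusion in a constant
    gradient: each spin accumulates phase gamma_g * x * dt along its walk;
    the signal is the ensemble average of cos(phase)."""
    random.seed(seed)
    dt = T / n_steps
    sigma = math.sqrt(2.0 * D * dt)   # RMS displacement per step
    total = 0.0
    for _ in range(n_spins):
        x, phase = 0.0, 0.0
        for _ in range(n_steps):
            x += random.gauss(0.0, sigma)
            phase += gamma_g * x * dt
        total += math.cos(phase)
    return total / n_spins

# Analytic result for free diffusion under a constant gradient of duration T.
D, gamma_g, T = 1.0, 2.0, 1.0
analytic = math.exp(-(gamma_g ** 2) * D * T ** 3 / 3.0)
simulated = mc_diffusion_attenuation(D, gamma_g, T, n_steps=200, n_spins=4000)
print(abs(simulated - analytic) < 0.05)  # → True
```

Shrinking `n_steps` biases the accumulated phase variance, so a conventional MC must verify convergence in `dt`; the conditional-random-walk formulation avoids that verification step entirely.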

  2. Simulation-based model checking approach to cell fate specification during Caenorhabditis elegans vulval development by hybrid functional Petri net with extension.

    PubMed

    Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru

    2009-04-27

    Model checking approaches were applied to biological pathway validations around 2003. Recently, Fisher et al. have proved the importance of model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete and state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be an indispensable part of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. Our key motivation of this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the use of model checking approach. A novel method of modeling and simulating biological systems with the use of model checking approach is proposed based on hybrid functional Petri net with extension (HFPNe) as the framework dealing with both discrete and continuous events. Firstly, we construct a quantitative VPC fate model with 1761 components by using HFPNe. Secondly, we employ two major biological fate determination rules - Rule I and Rule II - to VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully done by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. However, the understandings of hybrid lineages are hard to make on a discrete model because the hybrid lineage occurs when the system comes close to certain thresholds as discussed by Sternberg and Horvitz in 1986. 
    Our simulation results suggest that Rule I, which cannot be applied with qualitative model checking, is more reasonable than Rule II owing to its higher coverage of predicted fate patterns (except for the genotype of lin-15ko; lin-12ko double mutants). More insights are also suggested. The quantitative simulation-based model checking approach is a useful means of providing valuable biological insights and better understandings of biological systems and observation data that may be hard to capture with the qualitative one.

  3. CFD-based optimization in plastics extrusion

    NASA Astrophysics Data System (ADS)

    Eusterholz, Sebastian; Elgeti, Stefanie

    2018-05-01

    This paper presents novel ideas in the numerical design of mixing elements in single-screw extruders. The actual design process is reformulated as a shape optimization problem, given some functional but possibly inefficient initial design. Thereby, automatic optimization can be incorporated and the design process advanced beyond the simulation-supported, but still experience-based, approach. This paper proposes concepts to extend a method that has been developed and validated for die design to the design of mixing elements. For simplicity, it focuses on single-phase flows only. The developed method conducts forward simulations to predict the quasi-steady melt behavior in the relevant part of the extruder. The result of each simulation is used in a black-box optimization procedure based on an efficient low-order parameterization of the geometry. To minimize user interaction, an objective function is formulated that quantifies the product's quality based on the forward simulation. This paper covers two aspects: (1) it reviews the set-up of the optimization framework as discussed in [1], and (2) it details the necessary extensions for the optimization of mixing elements in single-screw extruders. It concludes with a presentation of first advances in the unsteady flow simulation of a metering and mixing section with the SSMUM [2] using the Carreau material model.
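
    The black-box loop described — a forward simulation evaluated inside a derivative-free optimizer over a low-order geometry parameterization — can be sketched as follows. The quadratic "quality" functional and the two geometry parameters are invented stand-ins for the CFD forward solve and the real parameterization; the optimizer here is plain random search, chosen only because it needs no gradients, like the paper's black-box procedure.

```python
import random

def forward_simulation(params):
    """Stand-in for the melt-flow forward solve. In practice this would run
    the CFD code on the candidate geometry; here a synthetic quality
    functional with a known optimum at (1.2, 0.4) is used instead."""
    a, b = params
    return (a - 1.2) ** 2 + 2.0 * (b - 0.4) ** 2

def random_search(objective, bounds, n_iter=2000, seed=3):
    """Derivative-free black-box optimization over the geometry parameters."""
    random.seed(seed)
    best_p, best_f = None, float("inf")
    for _ in range(n_iter):
        p = tuple(random.uniform(lo, hi) for lo, hi in bounds)
        f = objective(p)
        if f < best_f:
            best_p, best_f = p, f
    return best_p, best_f

best_params, best_value = random_search(forward_simulation,
                                        [(0.0, 2.0), (0.0, 1.0)])
print(best_value < 0.01)  # → True
```

Because each objective evaluation is a full forward simulation, the low-order parameterization matters: fewer parameters keep the number of expensive solves tractable.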

  4. Analysis of utilization of desert habitats with dynamic simulation

    USGS Publications Warehouse

    Williams, B.K.

    1986-01-01

    The effects of climate and herbivores on cool desert shrubs in north-western Utah were investigated with a dynamic simulation model. Cool desert shrublands are extensively managed as grazing lands, and are defoliated annually by domestic livestock. A primary production model was used to simulate harvest yields and shrub responses under a variety of climatic regimes and defoliation patterns. The model consists of six plant components, and it is based on equations of growth analysis. Plant responses were simulated under various combinations of 20 annual weather patterns and 14 defoliation strategies. Results of the simulations exhibit some unexpected linearities in model behavior, and emphasize the importance of both the pattern of climate and the level of plant vigor in determining optimal harvest strategies. Model behaviors are interpreted in terms of shrub morphology, physiology and ecology.

  5. Molecular Optical Simulation Environment (MOSE): A Platform for the Simulation of Light Propagation in Turbid Media

    PubMed Central

    Ren, Shenghan; Chen, Xueli; Wang, Hailong; Qu, Xiaochao; Wang, Ge; Liang, Jimin; Tian, Jie

    2013-01-01

    The study of light propagation in turbid media has attracted extensive attention in the field of biomedical optical molecular imaging. In this paper, we present a software platform for the simulation of light propagation in turbid media named the “Molecular Optical Simulation Environment (MOSE)”. Based on the gold standard of the Monte Carlo method, MOSE simulates light propagation both in tissues with complicated structures and through free-space. In particular, MOSE synthesizes realistic data for bioluminescence tomography (BLT), fluorescence molecular tomography (FMT), and diffuse optical tomography (DOT). The user-friendly interface and powerful visualization tools facilitate data analysis and system evaluation. As a major measure for resource sharing and reproducible research, MOSE aims to provide freeware for research and educational institutions, which can be downloaded at http://www.mosetm.net. PMID:23577215
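As a sketch of the Monte Carlo method that MOSE builds on (the coefficients, the unbounded medium, and the isotropic-scattering simplification here are illustrative, not MOSE's actual model): each photon takes exponentially distributed steps, deposits a fraction of its weight at every interaction, and is redirected at random until its weight falls below a cutoff.

```python
import math
import random

def mc_absorbed_fraction(n_photons=500, mu_a=0.1, mu_s=10.0, seed=1):
    """Toy Monte Carlo photon transport in a homogeneous turbid medium.
    Step lengths are drawn from an exponential with attenuation
    coefficient mu_t = mu_a + mu_s; at each event a fraction mu_a/mu_t
    of the photon weight is absorbed (implicit capture), and the photon
    is isotropically redirected. Returns mean absorbed weight."""
    rng = random.Random(seed)
    mu_t = mu_a + mu_s
    total_absorbed = 0.0
    for _ in range(n_photons):
        z, cos_theta, w = 0.0, 1.0, 1.0
        while w > 1e-4:                             # weight cutoff
            step = -math.log(1.0 - rng.random()) / mu_t  # free path
            z += cos_theta * step
            absorbed = w * mu_a / mu_t              # implicit absorption
            total_absorbed += absorbed
            w -= absorbed
            cos_theta = rng.uniform(-1.0, 1.0)      # isotropic scatter
    return total_absorbed / n_photons

frac = mc_absorbed_fraction()
```

In an unbounded medium essentially all launched weight is eventually absorbed, which makes the returned fraction (just below 1) a useful sanity check on the random walk.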

  6. A Homogenization Approach for Design and Simulation of Blast Resistant Composites

    NASA Astrophysics Data System (ADS)

    Sheyka, Michael

    Structural composites have been used in aerospace and structural engineering due to their high strength-to-weight ratio. Composite laminates have been successfully and extensively used in blast mitigation. This dissertation examines the use of the homogenization approach to design and simulate blast resistant composites. Three case studies are performed to examine the usefulness of different methods that may be used in designing and optimizing composite plates for blast resistance. The first case study utilizes a single degree of freedom system to simulate the blast and a reliability-based approach; it examines homogeneous plates, and the optimal stacking sequence and plate thicknesses are determined. The second and third case studies use the homogenization method to calculate the properties of a composite unit cell made of two different materials. The methods are integrated with dynamic simulation environments and advanced optimization algorithms. The second case study is 2-D and uses an implicit blast simulation, while the third case study is 3-D and simulates the blast explicitly. Both case studies 2 and 3 rely on multi-objective genetic algorithms for the optimization process, and Pareto optimal solutions are determined for each. Case study 3 is an integrative method for determining optimal stacking sequence, microstructure and plate thicknesses. The validity of the different methods, such as homogenization, reliability, explicit blast modeling and multi-objective genetic algorithms, is discussed. Possible extension of the methods to include strain rate effects and parallel computation is also examined.
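The Pareto-optimal selection at the core of the multi-objective genetic algorithms mentioned above reduces to a non-domination test. The objective pairs below are made-up (weight, blast deflection) values for illustration, not data from the dissertation.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (minimization convention)."""
    return all(x <= y for x, y in zip(a, b)) and \
           any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only non-dominated designs: the selection step that a
    multi-objective GA repeats generation after generation."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# hypothetical (weight, deflection) pairs for candidate laminates
designs = [(1.0, 5.0), (2.0, 3.0), (3.0, 1.0), (2.5, 4.0), (1.5, 4.5)]
front = pareto_front(designs)
```

Here (2.5, 4.0) is dominated by (2.0, 3.0) (lighter and stiffer), so it drops out, while the remaining designs trade one objective against the other and all survive.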

  7. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933
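The decoupled update pattern (a fast haptic loop and a slower visual loop sharing one clock) can be sketched as follows. The rates match those quoted above, but the scheduler itself is a simplified illustration, not the framework's actual implementation.

```python
def run_decoupled(sim_seconds=1.0, haptic_hz=1000, render_hz=30):
    """Sketch of decoupled simulation: each controller advances on its
    own fixed timestep from a shared simulated clock, so the haptic
    loop is never throttled by the slower visual loop."""
    haptic_dt, render_dt = 1.0 / haptic_hz, 1.0 / render_hz
    dt = min(haptic_dt, render_dt) / 2   # master clock granularity
    t = 0.0
    next_haptic = next_render = 0.0
    haptic_updates = render_updates = 0
    while t < sim_seconds:
        if t >= next_haptic:
            haptic_updates += 1          # force-feedback update
            next_haptic += haptic_dt
        if t >= next_render:
            render_updates += 1          # scene redraw
            next_render += render_dt
        t += dt
    return haptic_updates, render_updates

h, r = run_decoupled()
```

Over one simulated second the haptic controller fires roughly 1,000 times against roughly 30 redraws, which is the disparity that makes a single coupled loop unworkable for haptics.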

  8. Surgical model-view-controller simulation software framework for local and collaborative applications.

    PubMed

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.

  9. CDPOP Users Manual

    Treesearch

    E. L. Landguth; B. K. Hand; J. M. Glassy; S. A. Cushman; M. Jacobi; T. J. Julian

    2011-01-01

    The goal of this user manual is to explain the technical aspects of the current release of the CDPOP program. CDPOP v1.0 is a major extension of the CDPOP program (Landguth and Cushman 2010). CDPOP is an individual-based program that simulates the influences of landscape structure on emergence of spatial patterns in population genetic data as functions of individual-...

  10. A Prescriptive Model for Resource Allocation at the Intermediate Level Engine Facility.

    DTIC Science & Technology

    1981-06-01

    opinion was to use a linear relationship in the absence of any other method (4). This suggestion was based on his extensive knowledge in the area of...Business Publications, Inc., 1980. 16. Shannon, Robert E. Systems Simulation--the Art and Science. Englewood Cliffs NJ: Prentice Hall, Inc., 1975. 17

  11. State estimation of stochastic non-linear hybrid dynamic system using an interacting multiple model algorithm.

    PubMed

    Elenchezhiyan, M; Prakash, J

    2015-09-01

    In this work, state estimation schemes for non-linear hybrid dynamic systems subjected to stochastic state disturbances and random errors in measurements using interacting multiple-model (IMM) algorithms are formulated. In order to compute both discrete modes and continuous state estimates of a hybrid dynamic system, either an IMM extended Kalman filter (IMM-EKF) or an IMM-based derivative-free Kalman filter is proposed in this study. The efficacy of the proposed IMM-based state estimation schemes is demonstrated by conducting Monte-Carlo simulation studies on the two-tank hybrid system and switched non-isothermal continuous stirred tank reactor system. Extensive simulation studies reveal that the proposed IMM-based state estimation schemes are able to generate fairly accurate continuous state estimates and discrete modes. In the presence and absence of sensor bias, the simulation studies reveal that the proposed IMM unscented Kalman filter (IMM-UKF) based simultaneous state and parameter estimation scheme outperforms the multiple-model UKF (MM-UKF) based simultaneous state and parameter estimation scheme. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
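The heart of any IMM scheme is the mode-probability cycle: mix the prior mode probabilities through the Markov switching matrix, then reweight by each model's measurement likelihood. A minimal sketch with two generic modes (the transition probabilities and likelihoods are illustrative, not taken from the paper's two-tank or CSTR examples):

```python
def imm_mode_update(mu, trans, likelihood):
    """One mode-probability cycle of an interacting multiple-model
    estimator. mu: prior mode probabilities; trans[i][j]: probability
    of switching from mode i to mode j; likelihood[j]: measurement
    likelihood under model j's filter (EKF, UKF, ...)."""
    n = len(mu)
    # predicted mode probabilities: c_j = sum_i p_ij * mu_i
    c = [sum(trans[i][j] * mu[i] for i in range(n)) for j in range(n)]
    # posterior mode probabilities, normalized by the total evidence
    post = [likelihood[j] * c[j] for j in range(n)]
    s = sum(post)
    return [p / s for p in post]

# two-mode example: mode 0 = nominal, mode 1 = fault; sticky transitions
mu = imm_mode_update([0.9, 0.1],
                     [[0.95, 0.05],
                      [0.05, 0.95]],
                     likelihood=[0.2, 2.0])
```

Even with a strong prior on the nominal mode, a measurement far more likely under the fault model shifts the posterior toward the fault mode, which is how the IMM tracks discrete mode switches.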

  12. Capturing RNA Folding Free Energy with Coarse-Grained Molecular Dynamics Simulations

    PubMed Central

    Bell, David R.; Cheng, Sara Y.; Salazar, Heber; Ren, Pengyu

    2017-01-01

    We introduce a coarse-grained RNA model for molecular dynamics simulations, RACER (RnA CoarsE-gRained). RACER achieves accurate native structure prediction for a number of RNAs (average RMSD of 2.93 Å) and the sequence-specific variation of free energy is in excellent agreement with experimentally measured stabilities (R2 = 0.93). Using RACER, we identified hydrogen-bonding (or base pairing), base stacking, and electrostatic interactions as essential driving forces for RNA folding. Also, we found that separating pairing vs. stacking interactions allowed RACER to distinguish folded vs. unfolded states. In RACER, base pairing and stacking interactions each provide an approximate stability of 3–4 kcal/mol for an A-form helix. RACER was developed based on PDB structural statistics and experimental thermodynamic data. In contrast with previous work, RACER implements a novel effective vdW potential energy function, which led us to re-parameterize hydrogen bond and electrostatic potential energy functions. Further, RACER is validated and optimized using a simulated annealing protocol to generate potential energy vs. RMSD landscapes. Finally, RACER is tested using extensive equilibrium pulling simulations (0.86 ms total) on eleven RNA sequences (hairpins and duplexes). PMID:28393861

  13. Sub-half-micron contact window design with 3D photolithography simulator

    NASA Astrophysics Data System (ADS)

    Brainerd, Steve K.; Bernard, Douglas A.; Rey, Juan C.; Li, Jiangwei; Granik, Yuri; Boksha, Victor V.

    1997-07-01

    In state-of-the-art IC design and manufacturing, certain lithography layers have unique requirements. Latitudes and tolerances that apply to contacts and polysilicon gates are tight for such critical layers. Industry experts are discussing the most cost-effective ways to use feature-oriented equipment and materials already developed for these layers. Such requirements introduce new dimensions into the traditionally challenging task for the photolithography engineer when considering various combinations of multiple factors to optimize and control the process. In addition, he/she faces a rapidly increasing cost of experiments, limited time, and scarce access to equipment to conduct them. All the reasons presented above support simulation as an ideal method to satisfy these demands. However, lithography engineers may be easily dissatisfied with a simulation tool upon discovering disagreement between the simulation and experimental data. The problem is that several parameters used in photolithography simulation are very process specific. Calibration, i.e. matching experimental and simulation data using a specific set of procedures, allows one to effectively use the simulation tool. We present results of a simulation-based approach to optimize photolithography processes for sub-0.5 micron contact windows. Our approach consists of: (1) 3D simulation to explore different lithographic options, and (2) calibration to a range of process conditions with extensive use of specifically developed optimization techniques. The choice of a 3D simulator is essential because of the 3D nature of the contact window design problem. We use DEPICT 4.1. This program performs fast aerial image simulation as presented before. For 3D exposure the program uses an extension to three dimensions of the high numerical aperture model combined with Fast Fourier Transforms for maximum performance and accuracy. We use the Kim (U.C. Berkeley) model and the fast marching Level Set method, respectively, for the calculation of resist development rates and resist surface movement during the development process. Calibration efforts were aimed at matching experimental results on contact windows obtained after exposure of a binary mask. Additionally, simulation was applied to conduct quantitative analysis of PSM design capabilities, optical proximity correction, and stepper parameter optimization. Extensive experiments covered exposure (ASML 5500/100D stepper), pre- and post-exposure bake, and development (2.38% TMAH, puddle process) of JSR IX725D2G and TOK iP3500 photoresist films on 200 mm test wafers. 'Aquatar' was used as the top antireflective coating. SEM pictures of developed patterns were analyzed and compared with simulation results for different values of defocus, exposure energy, numerical aperture, and partial coherence.

  14. A control-oriented dynamic wind farm flow model: “WFSim”

    NASA Astrophysics Data System (ADS)

    Boersma, S.; Gebraad, P. M. O.; Vali, M.; Doekemeijer, B. M.; van Wingerden, J. W.

    2016-09-01

    In this paper, we present and extend the dynamic medium-fidelity control-oriented Wind Farm Simulator (WFSim) model. WFSim resolves flow fields in wind farms in a horizontal, two-dimensional plane. It is based on the spatially and temporally discretised two-dimensional Navier-Stokes equations and the continuity equation, and solves for a predefined grid and wind farm topology. The force exerted on the flow field by the turbines is modelled using actuator disk theory. Sparsity in the system matrices is exploited in WFSim, which enables a relatively fast flow field computation. The extensions to WFSim presented in this paper are the inclusion of a wake redirection model, a turbulence model, and a linearisation of the nonlinear WFSim model equations. The first is important because it allows us to carry out wake redirection control and simulate situations with an inflow that is misaligned with the rotor plane. The wake redirection model is validated against a theoretical wake centreline known from the literature. The second extension makes WFSim more realistic because it accounts for wake recovery. The amount of recovery is validated against the high-fidelity simulation model Simulator fOr Wind Farm Applications (SOWFA) for a two-turbine test case. Finally, the linearisation is important since it allows the application of more standard analysis, observer and control techniques.
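The sparsity WFSim exploits comes from the banded structure of discretised flow equations. As an illustration (pure Python, not WFSim code), a tridiagonal system of the kind a 1-D diffusion stencil produces can be solved in O(n) with the Thomas algorithm instead of O(n³) dense elimination:

```python
def thomas_solve(a, b, c, d):
    """Thomas algorithm for a tridiagonal system.
    a: sub-diagonal (a[0] unused), b: diagonal,
    c: super-diagonal (c[-1] unused, set to 0), d: right-hand side."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                 # forward sweep
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# steady 1-D diffusion -u'' = 0 with u(0)=0, u(1)=1 on 5 interior nodes:
# the exact solution is linear, u_i = i/6
n = 5
x = thomas_solve(a=[0.0] + [-1.0] * (n - 1),
                 b=[2.0] * n,
                 c=[-1.0] * (n - 1) + [0.0],
                 d=[0.0] * (n - 1) + [1.0])
```

Exploiting banded or otherwise sparse structure in exactly this way is what keeps a control-oriented solver fast enough to run inside an optimisation or estimation loop.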

  15. Simulation of orientational coherent effects via Geant4

    NASA Astrophysics Data System (ADS)

    Bagli, E.; Asai, M.; Brandt, D.; Dotti, A.; Guidi, V.; Verderi, M.; Wright, D.

    2017-10-01

    Beam manipulation of high- and very-high-energy particle beams is a hot topic in accelerator physics. Coherent effects of ultra-relativistic particles in bent crystals allow the steering of particle trajectories thanks to the strong electrical field generated between atomic planes. Recently, a collimation experiment with bent crystals was carried out at the CERN-LHC, paving the way to the usage of such technology in current and future accelerators. Geant4 is a widely used object-oriented toolkit for the Monte Carlo simulation of the interaction of particles with matter in high-energy physics. Moreover, its areas of application also include nuclear and accelerator physics, as well as studies in medical and space science. We present the first Geant4 extension for the simulation of orientational effects in straight and bent crystals for high-energy charged particles. The model allows the manipulation of particle trajectories by means of straight and bent crystals and the scaling of the cross sections of hadronic and electromagnetic processes for channeled particles. Based on such a model, an extension of the Geant4 toolkit has been developed. The code and the model have been validated by comparison with published experimental data regarding the deflection efficiency via channeling and the variation of the rate of inelastic nuclear interactions.

  16. Simulation of Constrained Musculoskeletal Systems in Task Space.

    PubMed

    Stanev, Dimitar; Moustakas, Konstantinos

    2018-02-01

    This paper proposes an operational task space formalization of constrained musculoskeletal systems, motivated by its promising results in the field of robotics. The change of representation requires different algorithms for solving the inverse and forward dynamics simulation in the task space domain. We propose an extension to the direct marker control and an adaptation of the computed muscle control algorithms for solving the inverse kinematics and muscle redundancy problems, respectively. Experimental evaluation demonstrates that this framework is not only successful in dealing with the inverse dynamics problem, but also provides an intuitive way of studying and designing simulations, facilitating assessment prior to any experimental data collection. The incorporation of constraints in the derivation unveils an important extension of this framework toward addressing systems that use absolute coordinates and topologies that contain closed kinematic chains. Task space projection reveals a more intuitive encoding of the motion planning problem, allows for better correspondence between observed and estimated variables, provides the means to effectively study the role of kinematic redundancy, and most importantly, offers an abstract point of view and control, which can be advantageous toward further integration with high level models of the precommand level. Task-based approaches could be adopted in the design of simulation related to the study of constrained musculoskeletal systems.

  17. Filtering in Hybrid Dynamic Bayesian Networks

    NASA Technical Reports Server (NTRS)

    Andersen, Morten Nonboe; Andersen, Rasmus Orum; Wheeler, Kevin

    2000-01-01

    We implement a 2-time slice dynamic Bayesian network (2T-DBN) framework and make a 1-D state estimation simulation, an extension of the experiment in (v.d. Merwe et al., 2000), and compare different filtering techniques. Furthermore, we demonstrate experimentally that inference in a complex hybrid DBN is possible by simulating fault detection in a watertank system, an extension of the experiment in (Koller & Lerner, 2000), using a hybrid 2T-DBN. In both experiments, we perform approximate inference using standard filtering techniques, Monte Carlo methods, and combinations of these. In the watertank simulation, we also demonstrate the use of 'non-strict' Rao-Blackwellisation. We show that the unscented Kalman filter (UKF) and the UKF in a particle filtering framework outperform the generic particle filter, the extended Kalman filter (EKF) and the EKF in a particle filtering framework with respect to accuracy in terms of estimation RMSE and sensitivity with respect to choice of network structure. In particular, we demonstrate the superiority of the UKF in a PF framework when our beliefs of how data was generated are wrong. Furthermore, we investigate the influence of data noise in the watertank simulation using the UKF and the UKF in a particle filtering framework, and show that the algorithms are more sensitive to changes in the measurement noise level than in the process noise level. Theory and implementation are based on (v.d. Merwe et al., 2000).
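The generic particle filter used as a baseline above can be sketched as a bootstrap filter: propagate particles through the process model, weight them by the Gaussian measurement likelihood, estimate, and resample. The random-walk state and noise levels below are illustrative, not the paper's watertank model.

```python
import math
import random

def bootstrap_pf(observations, n_particles=500, q=0.1, r=0.2, seed=2):
    """Bootstrap particle filter for a 1-D random-walk state observed
    directly with Gaussian noise. q: process noise std, r: measurement
    noise std. Returns the weighted-mean estimate at each step."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        # propagate: x_t = x_{t-1} + process noise
        particles = [x + rng.gauss(0.0, q) for x in particles]
        # weight by Gaussian likelihood of y given each particle
        w = [math.exp(-0.5 * ((y - x) / r) ** 2) for x in particles]
        s = sum(w)
        w = [wi / s for wi in w]
        estimates.append(sum(wi * x for wi, x in zip(w, particles)))
        # multinomial resampling to fight weight degeneracy
        particles = rng.choices(particles, weights=w, k=n_particles)
    return estimates

rng = random.Random(3)
obs = [1.0 + rng.gauss(0.0, 0.2) for _ in range(20)]
est = bootstrap_pf(obs)
```

The EKF/UKF-in-PF variants the paper compares differ only in the proposal step: instead of propagating through the raw process model, each particle is moved by a Kalman-style update before weighting.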

  18. Fixed gain and adaptive techniques for rotorcraft vibration control

    NASA Technical Reports Server (NTRS)

    Roy, R. H.; Saberi, H. A.; Walker, R. A.

    1985-01-01

    The results of an analysis effort performed to demonstrate the feasibility of employing approximate dynamical models and frequency-shaped cost functional control law design techniques for helicopter vibration suppression are presented. Both fixed-gain and adaptive control designs based on linear second-order dynamical models were implemented in a detailed Rotor Systems Research Aircraft (RSRA) simulation to validate these active vibration suppression control laws. Approximate models of fuselage flexibility were included in the RSRA simulation in order to more accurately characterize the structural dynamics. The results for both the fixed-gain and adaptive approaches are promising and provide a foundation for pursuing further validation in more extensive simulation studies and in wind tunnel and/or flight tests.

  19. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  20. Object oriented studies into artificial space debris

    NASA Technical Reports Server (NTRS)

    Adamson, J. M.; Marshall, G.

    1988-01-01

    A prototype simulation is being developed under contract to the Royal Aerospace Establishment (RAE), Farnborough, England, to assist in the discrimination of artificial space objects/debris. The methodology undertaken has been to link Object Oriented programming, intelligent knowledge based system (IKBS) techniques and advanced computer technology with numeric analysis to provide a graphical, symbolic simulation. The objective is to provide an additional layer of understanding on top of conventional classification methods. Use is being made of object and rule based knowledge representation, multiple reasoning, truth maintenance and uncertainty. Software tools being used include Knowledge Engineering Environment (KEE) and SymTactics for knowledge representation. Hooks are being developed within the SymTactics framework to incorporate mathematical models describing orbital motion and fragmentation. Penetration and structural analysis can also be incorporated. SymTactics is an Object Oriented discrete event simulation tool built as a domain specific extension to the KEE environment. The tool provides facilities for building, debugging and monitoring dynamic (military) simulations.

  1. The College of Anaesthetists of Ireland Simulation Training programme: a descriptive report and analysis of course participants' feedback.

    PubMed

    Cafferkey, Aine; Coyle, Elizabeth; Greaney, David; Harte, Sinead; Hayes, Niamh; Langdon, Miriam; Straub, Birgitt; Burlacu, Crina

    2018-03-20

    Simulation-based education is a modern training modality that allows healthcare professionals to develop knowledge and practice skills in a safe learning environment. The College of Anaesthetists of Ireland (CAI) was the first Irish postgraduate medical training body to introduce mandatory simulation training into its curriculum. Extensive quality assurance and improvement data have been collected on all simulation courses to date. The aim of this study was to describe the College of Anaesthetists of Ireland Simulation Training (CAST) programme and to report an analysis of course participants' feedback. A retrospective review of feedback forms from four simulation courses from March 2010 to August 2016 took place. Qualitative and quantitative data from 1069 participants who attended 112 courses were analysed. Feedback was overall very positive. Course content and delivery were deemed to be appropriate. Participants agreed that course participation would influence their future practice. A statistically significant difference (P < 0.001) between self-reported pre- and post-course confidence scores was observed in 19 out of 25 scenarios. The learning environment, learning method and debrief were highlighted as the aspects of the courses that participants liked most. The mandatory integration of CAST has been welcomed with widespread enthusiasm among specialist anaesthesia trainees. Intuitively, course participation instils confidence in trainees and better equips them to manage anaesthesia emergencies in the clinical setting. It remains to be seen whether translational outcomes result from this increase in confidence. Nevertheless, the findings of this extensive review have cemented the place of mandatory simulation training in specialist anaesthesia training in Ireland.

  2. Lightweight computational steering of very large scale molecular dynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beazley, D.M.; Lomdahl, P.S.

    1996-09-01

    We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages.

  3. Simulation of Radiation Damage to Neural Cells with the Geant4-DNA Toolkit

    NASA Astrophysics Data System (ADS)

    Bayarchimeg, Lkhagvaa; Batmunkh, Munkhbaatar; Belov, Oleg; Lkhagva, Oidov

    2018-02-01

    To help in understanding the physical and biological mechanisms underlying the effects of cosmic and therapeutic types of radiation on the central nervous system (CNS), we have developed an original neuron application based on the Geant4 Monte Carlo simulation toolkit, in particular on its biophysical extension Geant4-DNA. The applied simulation technique provides a tool for the simulation of physical, physico-chemical and chemical processes (e.g. production of water radiolysis species in the vicinity of neurons) in a realistic geometrical model of neural cells exposed to ionizing radiation. The present study evaluates the microscopic energy depositions and water radiolysis species yields within a detailed structure of a selected neuron, taking into account its soma, dendrites, axon and spines, following irradiation with carbon and iron ions.

  4. Application of the Ecosystem Assessment Model to Lake Norman: A cooling lake in North Carolina: Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porcella, D.B.; Bowie, G.L.; Campbell, C.L.

    The Ecosystem Assessment Model (EAM) of the Cooling Lake Assessment Methodology was applied to the extensive ecological field data collected at Lake Norman, North Carolina by Duke Power Company to evaluate its capability to simulate lake ecosystems and the ecological effects of steam electric power plants. The EAM provided simulations over a five-year verification period that behaved as expected based on a one-year calibration. Major state variables of interest to utilities and regulatory agencies are: temperature, dissolved oxygen, and fish community variables. In qualitative terms, temperature simulation was very accurate, dissolved oxygen simulation was accurate, and fish prediction was reasonably accurate. The need for more accurate fisheries data collected at monthly intervals and non-destructive sampling techniques was identified.

  5. Observers' focus of attention in the simulation of self-perception.

    PubMed

    Wegner, D M; Finstuen, K

    1977-01-01

    This research was designed to assess the effects of a manipulation of observers' focus of attention--from a focus on the actor to a focus on the actor's situation--upon observers' attributions of attitude to an actor in a simulation of a forced-compliance cognitive dissonance experiment. Observers induced through empathy instructions to focus attention on the actor's situation inferred less actor attitude positivity than did observers given no specific observational set. In addition, situation-focused observers inferred that the actor's attitude was directly related to reward magnitude, whereas actor-focused observers inferred that the actor's attitude was inversely related to reward magnitude. An extension of self-perception theory, offered as an interpretation of these and other results, suggested that motivational attributions made by actors and observers in dissonance and simulation studies are dependent on focus of attention. The attributions made by actor-focused observers simulate those of objectively self-aware actors and are based upon perceived intrinsic motivation; the attributions of situation-focused observers simulate those of subjectively self-aware actors and are based upon perceived extrinsic motivation.

  6. Simulations of stretching a flexible polyelectrolyte with varying charge separation

    DOE PAGES

    Stevens, Mark J.; Saleh, Omar A.

    2016-07-22

    We calculated the force-extension curves for a flexible polyelectrolyte chain with varying charge separations by performing Monte Carlo simulations of a 5000-bead chain using a screened Coulomb interaction. At all charge separations, the force-extension curves exhibit a Pincus-like scaling regime at intermediate forces and a logarithmic regime at large forces. As the charge separation increases, the Pincus regime shifts to a larger range of forces and the logarithmic regime starts at larger forces. We also found that the force-extension curve for the corresponding neutral chain has a logarithmic regime. Decreasing the bead diameter in the neutral chain simulations removed the logarithmic regime, and the force-extension curve tends to the freely jointed chain limit. In conclusion, this result shows that only excluded volume is required for the high-force logarithmic regime to occur.
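The screened Coulomb interaction used in such simulations is the Debye-Hückel (Yukawa) pair energy. The toy chain below is a heavy simplification (illustrative parameters, uniform stretching, nearest-neighbour repulsion plus a harmonic bond, and a grid minimisation in place of Monte Carlo sampling); it only illustrates that this potential yields a monotone force-extension relation.

```python
import math

def screened_coulomb(r, lb=0.7, kappa=1.0):
    """Debye-Hückel (Yukawa) pair energy in kT: u(r) = lb*exp(-kappa*r)/r,
    with Bjerrum length lb and inverse screening length kappa
    (illustrative values, lengths in bead diameters)."""
    return lb * math.exp(-kappa * r) / r

def chain_energy(z, force, n=20, k=50.0):
    """Toy energy of a uniformly stretched charged chain of n beads:
    nearest-neighbour screened repulsion plus harmonic bonds of rest
    length 1, minus the work done by the stretching force."""
    s = z / (n - 1)                      # bead spacing at extension z
    per_bond = screened_coulomb(s) + 0.5 * k * (s - 1.0) ** 2
    return (n - 1) * per_bond - force * z

def equilibrium_extension(force, n=20):
    """Minimise the toy energy over a grid of extensions (a stand-in
    for the Monte Carlo sampling used in the actual study)."""
    zs = [0.01 * i for i in range(50, 2500)]
    return min(zs, key=lambda z: chain_energy(z, force, n=n))

ext_low, ext_high = equilibrium_extension(0.5), equilibrium_extension(2.0)
```

Increasing the applied force shifts the energy minimum to larger extensions, the qualitative backbone of any force-extension curve; the scaling regimes discussed in the abstract require the full many-body chain.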

  7. A model for growth of a single fungal hypha based on well-mixed tanks in series: simulation of nutrient and vesicle transport in aerial reproductive hyphae.

    PubMed

    Balmant, Wellington; Sugai-Guérios, Maura Harumi; Coradin, Juliana Hey; Krieger, Nadia; Furigo Junior, Agenor; Mitchell, David Alexander

    2015-01-01

    Current models that describe the extension of fungal hyphae and development of a mycelium either do not describe the role of vesicles in hyphal extension or do not correctly describe the experimentally observed profile for distribution of vesicles along the hypha. The present work uses the n-tanks-in-series approach to develop a model for hyphal extension that describes the intracellular transport of nutrient to a sub-apical zone where vesicles are formed and then transported to the tip, where tip extension occurs. The model was calibrated using experimental data from the literature for the extension of reproductive aerial hyphae of three different fungi, and was able to describe different profiles involving acceleration and deceleration of the extension rate. A sensitivity analysis showed that the supply of nutrient to the sub-apical vesicle-producing zone is a key factor influencing the rate of extension of the hypha. Although this model was used to describe the extension of a single reproductive aerial hypha, the use of the n-tanks-in-series approach to representing the hypha means that the model has the flexibility to be extended to describe the growth of other types of hyphae and the branching of hyphae to form a complete mycelium.
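The n-tanks-in-series idea can be sketched with a forward-Euler cascade. The rate constants, the single-nutrient simplification, and the direct conversion of apical flux into extension below are illustrative choices, not the calibrated model from the paper.

```python
def simulate_hypha(n_tanks=10, dt=0.01, steps=5000,
                   feed=1.0, transfer=0.5, tip_yield=0.2):
    """Minimal n-tanks-in-series sketch of hyphal extension: nutrient
    enters tank 0, is passed down the cascade at a first-order transfer
    rate, and whatever reaches the last (apical) tank drives extension
    of the tip. Returns final tank contents and the extension history."""
    c = [0.0] * n_tanks          # nutrient content of each tank
    length = 0.0                 # accumulated tip extension
    history = []
    for _ in range(steps):
        flows = [transfer * ci for ci in c]   # outflow of each tank
        c[0] += dt * (feed - flows[0])
        for i in range(1, n_tanks):
            c[i] += dt * (flows[i - 1] - flows[i])
        length += dt * tip_yield * flows[-1]  # apical flux drives growth
        history.append(length)
    return c, history

conc, growth = simulate_hypha()
```

The transport delay through the cascade reproduces the qualitative behaviour the model is built for: extension accelerates only once nutrient has propagated to the apical tank, then settles to a constant rate set by the feed, which is why supply to the vesicle-producing zone controls the extension rate.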

  8. Mixed-field GCR Simulations for Radiobiological Research using Ground Based Accelerators

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Hee Y.; Rusek, Adam; Cucinotta, Francis

    Space radiation is comprised of a large number of particle types and energies, which have differential ionization power from high energy protons to high charge and energy (HZE) particles and secondary neutrons produced by galactic cosmic rays (GCR). Ground based accelerators such as the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL) are used to simulate space radiation for radiobiology research and dosimetry, electronics parts, and shielding testing using mono-energetic beams for single ion species. As a tool to support research on new risk assessment models, we have developed a stochastic model of heavy ion beams and space radiation effects, the GCR Event-based Risk Model computer code (GERMcode). For radiobiological research on mixed-field space radiation, a new GCR simulator at NSRL is proposed. The NSRL-GCR simulator, which implements the rapid switching mode and the higher energy beam extraction to 1.5 GeV/u, can integrate multiple ions into a single simulation to create GCR Z-spectrum in major energy bins. After considering the GCR environment and energy limitations of NSRL, a GCR reference field is proposed after extensive simulation studies using the GERMcode. The GCR reference field is shown to reproduce the Z and LET spectra of GCR behind shielding within 20 percent accuracy compared to simulated full GCR environments behind shielding. A major challenge for space radiobiology research is to consider chronic GCR exposure of up to 3 years in relation to simulations with cell and animal models of human risks. We discuss possible approaches to map important biological time scales in experimental models using ground-based simulation with extended exposure of up to a few weeks and fractionation approaches at a GCR simulator.

  9. Mixed-field GCR Simulations for Radiobiological Research Using Ground Based Accelerators

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Rusek, Adam; Cucinotta, Francis A.

    2014-01-01

    Space radiation is comprised of a large number of particle types and energies, which have differential ionization power from high energy protons to high charge and energy (HZE) particles and secondary neutrons produced by galactic cosmic rays (GCR). Ground based accelerators such as the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL) are used to simulate space radiation for radiobiology research and dosimetry, electronics parts, and shielding testing using mono-energetic beams for single ion species. As a tool to support research on new risk assessment models, we have developed a stochastic model of heavy ion beams and space radiation effects, the GCR Event-based Risk Model computer code (GERMcode). For radiobiological research on mixed-field space radiation, a new GCR simulator at NSRL is proposed. The NSRL-GCR simulator, which implements the rapid switching mode and the higher energy beam extraction to 1.5 GeV/u, can integrate multiple ions into a single simulation to create GCR Z-spectrum in major energy bins. After considering the GCR environment and energy limitations of NSRL, a GCR reference field is proposed after extensive simulation studies using the GERMcode. The GCR reference field is shown to reproduce the Z and LET spectra of GCR behind shielding within 20% accuracy compared to simulated full GCR environments behind shielding. A major challenge for space radiobiology research is to consider chronic GCR exposure of up to 3 years in relation to simulations with cell and animal models of human risks. We discuss possible approaches to map important biological time scales in experimental models using ground-based simulation with extended exposure of up to a few weeks and fractionation approaches at a GCR simulator.

  10. An Aircraft Lifecycle Approach for the Cost-Benefit Analysis of Prognostics and Condition-Based Maintenance-Based on Discrete-Event Simulation

    DTIC Science & Technology

    2014-10-02

    MPD. This manufacturer documentation contains maintenance tasks with specification of intervals and required man-hours that are to be carried out...failures, without consideration of false alarms and missed failures (see also section 4.1.3). The task redundancy rate is the percentage of preventive...Prognostics and Health Management ROI return on investment RUL remaining useful life TCG task code group SB Service Bulletin XML Extensible Markup

  11. Cutting the Wires: Modularization of Cellular Networks for Experimental Design

    PubMed Central

    Lang, Moritz; Summers, Sean; Stelling, Jörg

    2014-01-01

    Understanding naturally evolved cellular networks requires the consecutive identification and revision of the interactions between relevant molecular species. In this process, initially often simplified and incomplete networks are extended by integrating new reactions or whole subnetworks to increase consistency between model predictions and new measurement data. However, increased consistency with experimental data alone is not sufficient to show the existence of biomolecular interactions, because the interplay of different potential extensions might lead to overall similar dynamics. Here, we present a graph-based modularization approach to facilitate the design of experiments targeted at independently validating the existence of several potential network extensions. Our method is based on selecting the outputs to measure during an experiment, such that each potential network extension becomes virtually insulated from all others during data analysis. Each output defines a module that only depends on one hypothetical network extension, and all other outputs act as virtual inputs to achieve insulation. Given appropriate experimental time-series measurements of the outputs, our modules can be analyzed, simulated, and compared to the experimental data separately. Our approach exemplifies the close relationship between structural systems identification and modularization, an interplay that promises development of related approaches in the future. PMID:24411264

  12. Dissipative particle dynamics simulations of polymer chains: scaling laws and shearing response compared to DNA experiments.

    PubMed

    Symeonidis, Vasileios; Em Karniadakis, George; Caswell, Bruce

    2005-08-12

    Dissipative particle dynamics simulations of several bead-spring representations of polymer chains in dilute solution are used to demonstrate the correct static scaling laws for the radius of gyration. Shear flow results for the wormlike chain simulating single DNA molecules compare well with average extensions from experiments, irrespective of the number of beads. However, coarse graining with more than a few beads degrades the agreement of the autocorrelation of the extension.
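
    For reference, the wormlike-chain force-extension relation commonly fitted to single-DNA experiments is the Marko-Siggia interpolation; a minimal sketch, with a typical 50 nm DNA persistence length assumed rather than a value from this study:

```python
def wlc_force(x, persistence_length=50.0, kT=4.11):
    """Marko-Siggia interpolation for the wormlike chain:
    f = (kT/p) * [1/(4(1-x)^2) - 1/4 + x], with x = extension/contour length.
    With kT in pN*nm and persistence length in nm, the force is in pN."""
    if not 0.0 <= x < 1.0:
        raise ValueError("relative extension must be in [0, 1)")
    return (kT / persistence_length) * (0.25 / (1.0 - x) ** 2 - 0.25 + x)

f_half = wlc_force(0.5)   # force at half the contour length
```

The steep divergence as x approaches 1 is what distinguishes the wormlike chain from the freely jointed chain at large forces.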

  13. The research of hourglass worm dynamic balancing simulation based on SolidWorks motion

    NASA Astrophysics Data System (ADS)

    Wang, Zhuangzhuang; Yang, Jie; Liu, Pingyi; Zhao, Junpeng

    2018-02-01

    The hourglass worm is extensively used in industry due to its heavy-load capacity and large reduction ratio. Varying amounts of unbalanced mass distribution appear in the design of a single-head worm. With machines developing towards higher speed and precision, the vibration and shock caused by the unbalanced mass distribution of rotating parts must be considered, so the balance grade of these parts must meet higher requirements. A method based on theoretical analysis and simulation in SolidWorks Motion is presented in this paper: a virtual dynamic balance simulation test of the hourglass worm was carried out during product design, so as to ensure that the hourglass worm meets the dynamic balance requirements in the design process. This can effectively support the structural design of the hourglass worm and provides a design approach for similar products.

  14. Simulation investigation of the effect of the NASA Ames 80-by 120-foot wind tunnel exhaust flow on light aircraft operating in the Moffett field traffic pattern

    NASA Technical Reports Server (NTRS)

    Streeter, Barry G.

    1986-01-01

    A preliminary study of the exhaust flow from the Ames Research Center 80 by 120 Foot Wind Tunnel indicated that the flow might pose a hazard to low-flying light aircraft operating in the Moffett Field traffic pattern. A more extensive evaluation of the potential hazard was undertaken using a fixed-base, piloted simulation of a light, twin-engine, general-aviation aircraft. The simulated aircraft was flown through a model of the wind tunnel exhaust by pilots of varying experience levels to develop a data base of aircraft and pilot reactions. It is shown that a light aircraft would be subjected to a severe disturbance which, depending upon entry condition and pilot reaction, could result in a low-altitude stall or cause damage to the aircraft tail structure.

  15. Analysis of in-trail following dynamics of CDTI-equipped aircraft. [Cockpit Displays of Traffic Information

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.; Goka, T.

    1982-01-01

    In connection with the necessity to provide greater terminal area capacity, attention is given to approaches in which the required increase in capacity will be obtained by making use of more automation and by involving the pilot to a larger degree in the air traffic control (ATC) process. It was recommended that NASA should make extensive use of its research aircraft and cockpit simulators to assist the FAA in examining the capabilities and limitations of cockpit displays of traffic information (CDTI). A program was organized which utilizes FAA ATC (ground-based) simulators and NASA aircraft and associated cockpit simulators in a research project which explores applications of the CDTI system. The present investigation is concerned with several questions related to the CDTI-based terminal area traffic tactical control concepts. Attention is given to longitudinal separation criteria, a longitudinal following model, longitudinal capture, combined longitudinal/vertical control, and lateral control.

  16. 3D numerical simulations of multiphase continental rifting

    NASA Astrophysics Data System (ADS)

    Naliboff, J.; Glerum, A.; Brune, S.

    2017-12-01

    Observations of rifted margin architecture suggest continental breakup occurs through multiple phases of extension with distinct styles of deformation. The initial rifting stages are often characterized by slow extension rates and distributed normal faulting in the upper crust decoupled from deformation in the lower crust and mantle lithosphere. Further rifting marks a transition to higher extension rates and coupling between the crust and mantle lithosphere, with deformation typically focused along large-scale detachment faults. Significantly, recent detailed reconstructions and high-resolution 2D numerical simulations suggest that rather than remaining focused on a single long-lived detachment fault, deformation in this phase may progress toward lithospheric breakup through a complex process of fault interaction and development. The numerical simulations also suggest that an initial phase of distributed normal faulting can play a key role in the development of these complex fault networks and the resulting finite deformation patterns. Motivated by these findings, we will present 3D numerical simulations of continental rifting that examine the role of temporal increases in extension velocity on rifted margin structure. The numerical simulations are developed with the massively parallel finite-element code ASPECT. While originally designed to model mantle convection using advanced solvers and adaptive mesh refinement techniques, ASPECT has been extended to model visco-plastic deformation that combines a Drucker-Prager yield criterion with non-linear dislocation and diffusion creep. To promote deformation localization, the internal friction angle and cohesion weaken as a function of accumulated plastic strain.
Rather than prescribing a single zone of weakness to initiate deformation, an initial random perturbation of the plastic strain field combined with rapid strain weakening produces distributed normal faulting at relatively slow rates of extension in both 2D and 3D simulations. Our presentation will focus on both the numerical assumptions required to produce these results and variations in 3D rifted margin architecture arising from a transition from slow to rapid rates of extension.
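
    A Drucker-Prager yield stress with linear strain weakening of friction angle and cohesion, in the spirit of the localization scheme described above, can be sketched as follows (all numerical values are illustrative assumptions, not the model's parameters):

```python
import math

def yield_stress(pressure, plastic_strain, phi0=30.0, phi1=15.0,
                 c0=20e6, c1=4e6, strain_ref=0.5):
    """Drucker-Prager yield stress sigma_y = P*sin(phi) + C*cos(phi), with the
    friction angle phi and cohesion C weakened linearly over accumulated
    plastic strain up to a reference strain (illustrative values, in Pa)."""
    w = min(plastic_strain / strain_ref, 1.0)       # weakening factor in [0, 1]
    phi = math.radians(phi0 + w * (phi1 - phi0))
    c = c0 + w * (c1 - c0)
    return pressure * math.sin(phi) + c * math.cos(phi)

fresh = yield_stress(100e6, 0.0)       # unweakened material
weakened = yield_stress(100e6, 1.0)    # fully strain-weakened material
```

Seeding the plastic strain field with random noise, as described above, scatters the weakening factor spatially, so faults nucleate where the yield stress happens to be lowest.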

  17. Human factors evaluations of Free Flight: issues solved and issues remaining.

    PubMed

    Ruigrok, Rob C J; Hoekstra, Jacco M

    2007-07-01

    The Dutch National Aerospace Laboratory (NLR) has conducted extensive human-in-the-loop simulation experiments in NLR's Research Flight Simulator (RFS), focussed on human factors evaluation of Free Flight. Eight years of research, in co-operation with partners in the United States and Europe, has shown that Free Flight has the potential to increase airspace capacity by at least a factor of 3. Expected traffic loads and conflict rates for the year 2020 appear to be no major problem for professional airline crews participating in flight simulation experiments. Flight efficiency is significantly improved by user-preferred routings, including cruise climbs, while pilot workload is only slightly increased compared to today's reference. Detailed results from three projects and six human-in-the-loop experiments in NLR's Research Flight Simulator are reported. The main focus of these results is on human factors issues and particularly workload, measured both subjectively and objectively. An extensive discussion is included on many human factors issues resolved during the experiments, but also open issues are identified. An intent-based Conflict Detection and Resolution (CD&R) system provides "benefits" in terms of reduced pilot workload, but also "costs" in terms of complexity, need for priority rules, potential compatibility problems between different brands of Flight Management Systems and large bandwidth. Moreover, the intent-based system is not effective at solving multi-aircraft conflicts. A state-based CD&R system also provides "benefits" and "costs". Benefits compared to the full intent-based system are simplicity, low bandwidth requirements, easy to retrofit (no requirements to change avionics infrastructure) and the ability to solve multi-aircraft conflicts in parallel. 
    The "costs" involve a somewhat higher pilot workload in similar circumstances, a smaller look-ahead time, which results in less efficient resolution manoeuvres, and occasional false/nuisance alerts due to missing intent information. The optimal CD&R system (in terms of costs versus benefits) has been suggested to be state-based CD&R with the addition of the intended or target flight level. This combination of state-based CD&R with a limited amount of intent provides "the best of both worlds". Studying this CD&R system is still an open issue.
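
    A state-based CD&R check of the kind discussed above can be sketched as a closest-point-of-approach test on straight-line extrapolated tracks; the separation minimum and look-ahead time below are illustrative assumptions:

```python
import numpy as np

def cpa_conflict(p1, v1, p2, v2, sep_nm=5.0, look_ahead=300.0):
    """State-based conflict detection: extrapolate both tracks linearly and
    check whether the closest point of approach within the look-ahead
    window (seconds) violates the separation minimum (nautical miles)."""
    dp = np.asarray(p2, float) - np.asarray(p1, float)   # relative position
    dv = np.asarray(v2, float) - np.asarray(v1, float)   # relative velocity
    dv2 = dv.dot(dv)
    # Time of closest approach, clamped to [0, look_ahead].
    t_cpa = 0.0 if dv2 == 0 else max(0.0, min(-dp.dot(dv) / dv2, look_ahead))
    miss = np.linalg.norm(dp + t_cpa * dv)
    return miss < sep_nm, t_cpa, miss

# Head-on geometry: 20 nm apart, closing at 0.2 nm/s -> CPA at t = 100 s.
conflict, t, d = cpa_conflict([0, 0], [0.1, 0], [20, 0], [-0.1, 0])
```

Because only current position and velocity enter the test, a turn or level-off planned but not yet flown is invisible to it; this is exactly the missing-intent limitation the abstract weighs against the method's simplicity.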

  18. Computer Simulations Reveal Substrate Specificity of Glycosidic Bond Cleavage in Native and Mutant Human Purine Nucleoside Phosphorylase.

    PubMed

    Isaksen, Geir Villy; Hopmann, Kathrin Helen; Åqvist, Johan; Brandsdal, Bjørn Olav

    2016-04-12

    Purine nucleoside phosphorylase (PNP) catalyzes the reversible phosphorolysis of purine ribonucleosides and 2'-deoxyribonucleosides, yielding the purine base and (2'-deoxy)ribose 1-phosphate as products. While this enzyme has been extensively studied, several questions with respect to the catalytic mechanism have remained largely unanswered. The role of the phosphate and key amino acid residues in the catalytic reaction, as well as the purine ring protonation state, is elucidated using density functional theory calculations and extensive empirical valence bond (EVB) simulations. Free energy surfaces for adenosine, inosine, and guanosine are fitted to ab initio data and yield quantitative agreement with experimental data when the surfaces are used to model the corresponding enzymatic reactions. The cognate substrates, the 6-oxopurines inosine and guanosine, interact with PNP through extensive hydrogen bonding, but the substrate specificity is found to be a direct result of the electrostatic preorganization energy along the reaction coordinate. Asn243 has previously been identified as a key residue providing substrate specificity. Mutation of Asn243 to Asp has dramatic effects on the substrate specificity, making 6-amino- and 6-oxopurines equally good as substrates. The principal effect of this particular mutation is the change in the electrostatic preorganization energy between the native enzyme and the Asn243Asp mutant, clearly favoring adenosine over inosine and guanosine. Thus, the EVB simulations show that this particular mutation affects the electrostatic preorganization of the active site, which in turn can explain the substrate specificity.
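
    The core of a two-state EVB description is diagonalization of a 2x2 Hamiltonian whose off-diagonal coupling mixes the valence-bond states; a minimal sketch with illustrative energies (not values from the study):

```python
import numpy as np

def evb_ground_state(e1, e2, h12):
    """Ground-state energy of a two-state EVB Hamiltonian: the lowest
    eigenvalue of [[e1, h12], [h12, e2]], where e1 and e2 are the diabatic
    state energies and h12 the off-diagonal coupling."""
    h = np.array([[e1, h12], [h12, e2]])
    return np.linalg.eigvalsh(h)[0]   # eigvalsh returns ascending eigenvalues

# Resonance mixing lowers the energy below min(e1, e2) at the crossing point,
# which is how the coupling reduces the activation barrier along the
# reaction coordinate.
e_cross = evb_ground_state(10.0, 10.0, -2.0)
```

Sampling e1 and e2 along a molecular dynamics trajectory and mapping the energy gap e1 - e2 as the reaction coordinate is the standard route to EVB free energy profiles.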

  19. Active and hibernating turbulence in drag-reducing plane Couette flows

    NASA Astrophysics Data System (ADS)

    Pereira, Anselmo S.; Mompean, Gilmar; Thais, Laurent; Soares, Edson J.; Thompson, Roney L.

    2017-08-01

    In this paper we analyze the active and hibernating turbulence in drag-reducing plane Couette flows using direct numerical simulations of viscoelastic fluids described by the finitely extensible nonlinear elastic model with the Peterlin approximation (FENE-P). The polymer-turbulence interactions are studied from an energetic standpoint for a range of Weissenberg numbers (from 2 up to 30), fixing the Reynolds number based on the plate velocities at 4000, the viscosity ratio at 0.9, and the maximum polymer molecule extensibility at 100. The qualitative picture that emerges from this investigation is a cyclic mechanism of energy exchange between the polymers and the turbulence that drives the flow through an oscillatory behavior.
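
    The finitely extensible spring at the heart of FENE-type models can be sketched as follows, in dimensionless form and taking b = 100 to match the maximum extensibility quoted above (the functional form is the standard FENE spring law, not this paper's full FENE-P solver):

```python
import numpy as np

def fene_force(r_vec, b=100.0):
    """FENE spring force (dimensionless): F = -r / (1 - |r|^2 / b),
    diverging as the extension approaches the maximum extensibility sqrt(b).
    This divergence is what caps the polymer stretch in FENE-P simulations."""
    r2 = np.dot(r_vec, r_vec)
    if r2 >= b:
        raise ValueError("extension exceeds maximum extensibility")
    return -np.asarray(r_vec, float) / (1.0 - r2 / b)

f_small = fene_force(np.array([1.0, 0.0, 0.0]))   # near-Hookean regime
f_large = fene_force(np.array([9.0, 0.0, 0.0]))   # near maximum extension
```

At small extension the force is nearly linear (Hookean), while near sqrt(b) it stiffens sharply, which is the mechanism behind the bounded elastic energy exchanged with the turbulence.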

  20. Probing the folded state and mechanical unfolding pathways of T4 lysozyme using all-atom and coarse-grained molecular simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Wenjun, E-mail: wjzheng@buffalo.edu; Glenn, Paul

    2015-01-21

    The Bacteriophage T4 Lysozyme (T4L) is a prototype modular protein comprised of an N-terminal domain and a C-terminal domain, which has been extensively studied to understand the folding/unfolding mechanism of modular proteins. To offer detailed structural and dynamic insights into the folded-state stability and the mechanical unfolding behaviors of T4L, we have performed extensive equilibrium and steered molecular dynamics simulations of both the wild-type (WT) and a circular permutation (CP) variant of T4L using all-atom and coarse-grained force fields. Our all-atom and coarse-grained simulations of the folded state have consistently found greater stability of the C-domain than the N-domain in isolation, which is in agreement with past thermostability studies of T4L. While the all-atom simulation cannot fully explain the mechanical unfolding behaviors of the WT and the CP variant observed in an optical tweezers study, the coarse-grained simulations based on the Go model or a modified elastic network model (mENM) are in qualitative agreement with the experimental finding of greater unfolding cooperativity in the WT than the CP variant. Interestingly, the two coarse-grained models predict different structural mechanisms for the observed change in cooperativity between the WT and the CP variant: while the Go model predicts minor modification of the unfolding pathways by circular permutation (i.e., preserving the general order that the N-domain unfolds before the C-domain), the mENM predicts a dramatic change in unfolding pathways (e.g., different order of N/C-domain unfolding in the WT and the CP variant). Based on our simulations, we have analyzed the limitations of and the key differences between these models and offered testable predictions for future experiments to resolve the structural mechanism for cooperative folding/unfolding of T4L.

  1. SiMon: Simulation Monitor for Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but extensive numerical campaigns often break down at an earlier stage, interrupted by unexpected events in the software or the hardware. In those cases, the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive simulation processes. Our method is light-weight, fully automates the entire workflow management, operates concurrently across multiple platforms, and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running a simulation becomes analogous to growing crops. With the development of SiMon we relax the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.

  2. Radicals and Reservoirs in the GMI Chemistry and Transport Model: Comparison to Measurements

    NASA Technical Reports Server (NTRS)

    Douglass, Anne R.; Stolarski, Richard S.; Strahan, Susan E.; Connell, Peter S.

    2004-01-01

    We have used a three-dimensional chemistry and transport model (CTM), developed under the Global Modeling Initiative (GMI), to carry out two simulations of the composition of the stratosphere under changing halogen loading for 1995 through 2030. The two simulations differ only in that one uses meteorological fields from a general circulation model while the other uses meteorological fields from a data assimilation system. A single year's winds and temperatures are repeated for each 36-year simulation. We compare results from these two simulations with an extensive collection of data from satellite and ground-based measurements for 1993-2000. Comparisons of simulated fields with observations of radical and reservoir species for some of the major ozone-destroying compounds are of similar quality for both simulations. Differences in the upper stratosphere, caused by transport of total reactive nitrogen and methane, impact the balance among the ozone loss processes and the sensitivity of the two simulations to the change in composition.

  3. Digital Sound Encryption with Logistic Map and Number Theoretic Transform

    NASA Astrophysics Data System (ADS)

    Satria, Yudi; Gabe Rizky, P. H.; Suryadi, MT

    2018-03-01

    Digital sound security has limits on encryption in the frequency domain. A Number Theoretic Transform based on the field GF(2^521 − 1) improves on and solves that problem. The algorithm for this sound encryption is based on a combination of a chaos function and the Number Theoretic Transform. The chaos function used in this paper is the Logistic Map. The trials and simulations were conducted using five different digital sound files (WAV format) as test data, each simulated at least 100 times. The resulting key stream is random, as verified by 15 NIST randomness tests. The key space formed is very large, more than 10^469. The processing speed of the encryption algorithm is only slightly affected by the Number Theoretic Transform.
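
    The logistic-map keystream part of such a scheme (without the Number Theoretic Transform stage) can be sketched as a simple XOR stream cipher; the parameters r and x0 below are illustrative, not the paper's:

```python
def logistic_keystream(x0, r=3.99, n=16, skip=100):
    """Generate a byte keystream from the logistic map x -> r*x*(1-x),
    discarding an initial transient before quantizing iterates to bytes."""
    x = x0
    for _ in range(skip):
        x = r * x * (1.0 - x)
    stream = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        stream.append(int(x * 256) % 256)   # quantize iterate to one byte
    return bytes(stream)

def xor_cipher(data, key_x0):
    """XOR stream cipher: the same call encrypts and decrypts."""
    ks = logistic_keystream(key_x0, n=len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

plain = b"wave sample data"
cipher = xor_cipher(plain, 0.3141)
restored = xor_cipher(cipher, 0.3141)
```

The sensitivity of the map to x0 is what gives the scheme its large key space: keys differing in the last decimal place diverge to unrelated keystreams within a few iterations.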

  4. Photonic simulation of entanglement growth and engineering after a spin chain quench.

    PubMed

    Pitsios, Ioannis; Banchi, Leonardo; Rab, Adil S; Bentivegna, Marco; Caprara, Debora; Crespi, Andrea; Spagnolo, Nicolò; Bose, Sougato; Mataloni, Paolo; Osellame, Roberto; Sciarrino, Fabio

    2017-11-17

    The time evolution of quantum many-body systems is one of the most important processes for benchmarking quantum simulators. The most curious feature of such dynamics is the growth of quantum entanglement to an amount proportional to the system size (volume law) even when interactions are local. This phenomenon has great ramifications for fundamental aspects, while its optimisation clearly has an impact on technology (e.g., for on-chip quantum networking). Here we use an integrated photonic chip with a circuit-based approach to simulate the dynamics of a spin chain and maximise the entanglement generation. The resulting entanglement is certified by constructing a second chip, which measures the entanglement between multiple distant pairs of simulated spins, as well as the block entanglement entropy. This is the first photonic simulation and optimisation of the extensive growth of entanglement in a spin chain, and opens up the use of photonic circuits for optimising quantum devices.

  5. Guidance law simulation studies for complex approaches using the Microwave Landing System (MLS)

    NASA Technical Reports Server (NTRS)

    Feather, J. B.

    1986-01-01

    This report documents results of MLS guidance algorithm development conducted by DAC for NASA under the Advanced Transport Operating Systems (ATOPS) Technology Studies program (NAS1-18028). The study consisted of evaluating guidance laws for vertical and lateral path control, as well as speed control, by simulating an MLS approach to Washington National Airport. This work is an extension and generalization of a previous ATOPS contract (NAS1-16202) completed by DAC in 1985. The Washington river approach was represented by six waypoints and one glideslope change and consisted of an eleven-nautical-mile approach path. Tracking performance was generated for 10 cases representing several different conditions, which included MLS noise, steady wind, turbulence, and windshear. Results of this simulation phase are suitable for use in future fixed-base simulator evaluations employing actual hardware (autopilot and a performance management system), as well as crew procedures and information requirements for MLS.

  6. Diffusion control for a tempered anomalous diffusion system using fractional-order PI controllers.

    PubMed

    Juan Chen; Zhuang, Bo; Chen, YangQuan; Cui, Baotong

    2017-05-09

    This paper is concerned with the diffusion control problem of a tempered anomalous diffusion system based on fractional-order PI controllers. The contribution of this paper is to introduce fractional-order PI controllers into the tempered anomalous diffusion system for mobile actuator motion and spraying control. For the proposed control force, convergence analysis of the system described by the mobile actuator dynamical equations is presented based on Lyapunov stability arguments. Moreover, a new Centroidal Voronoi Tessellation (CVT) algorithm based on fractional-order PI controllers, henceforth called the FOPI-based CVT algorithm, is provided together with a modified simulation platform called Fractional-Order Diffusion Mobile Actuator-Sensor 2-Dimension Fractional-Order Proportional Integral (FO-Diff-MAS2D-FOPI). Finally, extensive numerical simulations of the tempered anomalous diffusion process are presented to verify the effectiveness of the proposed fractional-order PI controllers.
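
    The fractional integral inside a fractional-order PI controller is commonly discretized with Grünwald-Letnikov weights; a minimal sketch of that discretization (the gains, order, and step size are illustrative assumptions, and this is not the paper's FOPI-CVT implementation):

```python
def gl_fractional_integral(signal, alpha, dt):
    """Gruenwald-Letnikov approximation of the fractional integral of order
    alpha: I^alpha f(t_n) ~= dt^alpha * sum_k w_k * f(t_{n-k}), using the
    recursive weights w_0 = 1, w_k = w_{k-1} * (1 - (1 - alpha)/k),
    i.e. the series coefficients of (1 - z)^(-alpha)."""
    n = len(signal)
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (1.0 - (1.0 - alpha) / k))
    return (dt ** alpha) * sum(w[k] * signal[n - 1 - k] for k in range(n))

def fopi_control(errors, kp=1.0, ki=0.5, alpha=0.9, dt=0.01):
    """Fractional-order PI law: u = Kp * e + Ki * I^alpha e."""
    return kp * errors[-1] + ki * gl_fractional_integral(errors, alpha, dt)

u = fopi_control([1.0] * 200)
```

Setting alpha = 1 recovers the ordinary Riemann-sum integral (all weights equal 1), so the classical PI controller is the limiting case of this sketch.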

  7. An elementary singularity-free Rotational Brownian Dynamics algorithm for anisotropic particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilie, Ioana M.; Briels, Wim J.; MESA+ Institute for Nanotechnology, University of Twente, P.O. Box 217, 7500 AE Enschede

    2015-03-21

    Brownian Dynamics is the designated technique to simulate the collective dynamics of colloidal particles suspended in a solution, e.g., the self-assembly of patchy particles. Simulating the rotational dynamics of anisotropic particles by a first-order Langevin equation, however, gives rise to a number of complications, ranging from singularities when using a set of three rotational coordinates to subtle metric and drift corrections. Here, we derive and numerically validate a quaternion-based Rotational Brownian Dynamics algorithm that handles these complications in a simple and elegant way. The extension to hydrodynamic interactions is also discussed.
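
    A quaternion-based rotational Brownian step of the general kind described (a simplified sketch, not the authors' exact algorithm with its metric and drift corrections) can be written by composing the orientation with a small random rotation and renormalizing:

```python
import numpy as np

def quat_mult(q, p):
    """Hamilton product of quaternions stored as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = p
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def rotational_bd_step(q, d_rot, dt, rng):
    """One singularity-free rotational Brownian step: draw a random rotation
    vector with variance 2*D_rot*dt per axis, convert it to a unit
    quaternion, and compose it with the current orientation."""
    dphi = rng.normal(0.0, np.sqrt(2.0 * d_rot * dt), size=3)
    angle = np.linalg.norm(dphi)
    axis = dphi / angle if angle > 0 else np.array([1.0, 0.0, 0.0])
    dq = np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))
    q_new = quat_mult(q, dq)
    return q_new / np.linalg.norm(q_new)   # renormalize to stay on the unit sphere

rng = np.random.default_rng(1)
q = np.array([1.0, 0.0, 0.0, 0.0])       # identity orientation
for _ in range(1000):
    q = rotational_bd_step(q, d_rot=1.0, dt=1e-3, rng=rng)
```

Unlike three-angle parameterizations, the unit quaternion never hits a coordinate singularity, which is the point the abstract makes.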

  8. Cylindrical gate all around Schottky barrier MOSFET with insulated shallow extensions at source/drain for removal of ambipolarity: a novel approach

    NASA Astrophysics Data System (ADS)

    Kumar, Manoj; Pratap, Yogesh; Haldar, Subhasis; Gupta, Mridula; Gupta, R. S.

    2017-12-01

    In this paper a TCAD-based simulation of a novel insulated shallow extension (ISE) cylindrical gate-all-around (CGAA) Schottky barrier (SB) MOSFET is reported, which eliminates the undesirable ambipolar behavior (bias-dependent OFF-state leakage current) of the conventional SB-CGAA MOSFET by blocking metal-induced gap states as well as unwanted charge sharing between the source/channel and drain/channel regions. The novel structure offers a low barrier height at the source and hence a high ON-state current. The I_ON/I_OFF of the ISE-CGAA-SB-MOSFET increases 1177-fold, with a steeper subthreshold slope (~60 mV/decade). However, a slight reduction in peak cut-off frequency is observed; to further improve the cut-off frequency, a dual-metal-gate architecture has been employed, and a comparative assessment of single metal gate, dual metal gate, single metal gate with ISE, and dual metal gate with ISE is presented. The improved performance of the Schottky barrier CGAA MOSFET with the incorporation of ISE makes it an attractive candidate for CMOS digital circuit design. The numerical simulation is performed using the ATLAS-3D device simulator.

  9. Direct Numerical Simulations of Aerofoils with Serrated Trailing-Edge Extensions

    NASA Astrophysics Data System (ADS)

    Shahab, Muhammad Farrukh; Omidyeganeh, Mohammad; Pinelli, Alfredo

    2017-11-01

    Owl-feather-inspired technology motivates engineers to develop quieter wings. Direct numerical simulations of a NACA-4412 aerofoil with retrofitted flat-plate, serrated sawtooth-shaped, and porous (serrations with filaments) extensions have been performed to study the effects of these modifications on the hydrodynamic characteristics of the turbulent wake and their upstream influence on the interacting boundary layer. A chord-based Reynolds number of 100,000 and an angle of attack of 5° have been chosen for all simulations; moreover, the surface boundary layers are tripped using a volume forcing method. This contribution presents a detailed statistical analysis of the mean and fluctuating behaviour of the flow, and the key differences in the flow topologies are highlighted. The preliminary analysis of results identifies a system of counter-rotating streamwise vortices for the case of sawtooth-shaped serrations. The presence of the latter is generally considered responsible for increased parasitic higher-frequency noise for serrated aerofoils. To palliate the effect of the aforementioned system of streamwise vortices, a filamentous layer occupying the voids of the serrations has been added, which is expected to improve the aeroacoustic performance of the system.

  10. NEVESIM: event-driven neural simulation framework with a Python interface.

    PubMed

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.
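The decoupling described above rests on a simple mechanism: neurons interact only through timestamped spike events kept in a priority queue, instead of being advanced on a fixed clock. A minimal Python sketch with leak-free threshold neurons (a toy illustration of event-driven simulation in general, not NEVESIM's actual API) could look like:

```python
import heapq

def simulate(weights, delays, initial_spikes, threshold=1.0, t_max=100.0):
    """Event-driven simulation of a network of simple threshold neurons.

    weights[i] : list of (target, weight) synapses of neuron i
    delays[i]  : matching list of axonal delays for those synapses
    initial_spikes : list of (time, neuron) seed events
    Returns the list of (time, neuron) spikes in temporal order.
    """
    potential = {}            # membrane potential per neuron (no leak, for brevity)
    events = list(initial_spikes)
    heapq.heapify(events)     # priority queue ordered by event time
    spikes = []
    while events:
        t, n = heapq.heappop(events)
        if t > t_max:
            break
        spikes.append((t, n))
        # deliver the spike to every postsynaptic target after its axonal delay
        for (tgt, w), d in zip(weights.get(n, []), delays.get(n, [])):
            potential[tgt] = potential.get(tgt, 0.0) + w
            if potential[tgt] >= threshold:
                potential[tgt] = 0.0          # reset after firing
                heapq.heappush(events, (t + d, tgt))
    return spikes
```

For a feed-forward chain 0 → 1 → 2 with unit weights and unit delays, a seed spike at t = 0 propagates down the chain one time unit per hop; the heap guarantees causal ordering even in recurrent networks.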

  11. NEVESIM: event-driven neural simulation framework with a Python interface

    PubMed Central

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291

  12. Cortical hypometabolism and hypoperfusion in Parkinson's disease is extensive: probably even at early disease stages.

    PubMed

    Borghammer, Per; Chakravarty, Mallar; Jonsdottir, Kristjana Yr; Sato, Noriko; Matsuda, Hiroshi; Ito, Kengo; Arahata, Yutaka; Kato, Takashi; Gjedde, Albert

    2010-05-01

    Recent cerebral blood flow (CBF) and glucose consumption (CMRglc) studies of Parkinson's disease (PD) revealed conflicting results. Using simulated data, we previously demonstrated that the often-reported subcortical hypermetabolism in PD could be explained as an artifact of biased global mean (GM) normalization, and that low-magnitude, extensive cortical hypometabolism is best detected by alternative data-driven normalization methods. Thus, we hypothesized that PD is characterized by extensive cortical hypometabolism but no concurrent widespread subcortical hypermetabolism, and tested this hypothesis in three independent samples of PD patients. We compared SPECT CBF images of 32 early-stage and 33 late-stage PD patients with those of 60 matched controls. We also compared PET FDG images from 23 late-stage PD patients with those of 13 controls. Three different normalization methods were compared: (1) GM normalization, (2) cerebellum normalization, (3) reference cluster normalization (Yakushev et al.). We employed standard voxel-based statistics (fMRIstat) and principal component analysis (SSM). Additionally, we performed a meta-analysis of all quantitative CBF and CMRglc studies in the literature to investigate whether the global mean (GM) values in PD are decreased. Voxel-based analysis with GM normalization and the SSM method performed similarly, i.e., both detected decreases in small cortical clusters and concomitant increases in extensive subcortical regions. Cerebellum normalization revealed more widespread cortical decreases but no subcortical increase. In all comparisons, the Yakushev method detected nearly identical patterns of very extensive cortical hypometabolism. Lastly, the meta-analyses demonstrated that global CBF and CMRglc values are decreased in PD. Based on the results, we conclude that PD most likely has widespread cortical hypometabolism, even at early disease stages. In contrast, extensive subcortical hypermetabolism is probably not a feature of PD.

  13. Validation of a Wave Data Assimilation System Based on SWAN

    NASA Astrophysics Data System (ADS)

    Flampourisi, Stylianos; Veeramony, Jayaram; Orzech, Mark D.; Ngodock, Hans E.

    2013-04-01

    SWAN is one of the most broadly used models for wave prediction in the nearshore, with known and extensively studied limitations due to the physics and/or the numerical implementation. In order to improve the performance of the model, a 4DVAR data assimilation system based on a tangent linear code and the corresponding adjoint of the numerical SWAN model has been developed at NRL (Orzech et al., 2013), implementing the methodology of Bennett (2002). The assimilation system takes into account the nonlinear triad and quadruplet interactions, depth-limited breaking, wind forcing, bottom friction and white-capping. Using a conjugate gradient method, the assimilation system minimizes a quadratic penalty functional (which represents the overall error of the simulation) and generates the correction of the forward simulation in the spatial, temporal and spectral domains. The weights given to the output of the adjoint are obtained by calculating the covariance over an ensemble of forward simulations, following Evensen (2009). This presentation focuses on the extension of the system to a weakly constrained data assimilation system and on its extensive validation using wave spectra for forcing, assimilation and validation from FRF Duck, North Carolina, during August 2011. During this period, at the 17 m waverider buoy location, the wind speed reached 35 m/s (due to Hurricane Irene), the significant wave height varied from 0.5 m to 6 m, and the peak period ranged between 5 s and 18 s. In general, this study shows significant improvement of the integrated spectral properties, but the main benefit of assimilating the wave spectra (and not only their integrated properties) is that accurate simulation of wave systems separated in frequency and direction is possible even nearshore, where nonlinear phenomena are dominant.
The system is ready to be used for more precise reanalysis of the wave climate and climate variability, and for the determination of coastal hazards at regional or local scales, wherever wave data are available. References: Orzech, M. D., J. Veeramony, and H. E. Ngodock, 2013: A variational assimilation system for nearshore wave modeling. J. Atm. & Oc. Tech., in press.
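The quadratic penalty functional central to such variational systems has the generic form J(x) = (x - x_b)' B⁻¹ (x - x_b) + (Hx - y)' R⁻¹ (Hx - y): a background misfit weighted by the background error covariance plus an observation misfit weighted by the observation error covariance. A scalar toy version, minimized here by plain gradient descent rather than the conjugate gradient solver the actual system uses, illustrates how the two error weights balance:

```python
def assimilate(xb, y, B, R, H=1.0, steps=200, lr=0.05):
    """Minimize the scalar penalty J(x) = (x - xb)**2 / B + (H*x - y)**2 / R,
    a stand-in for the variational cost functional: background term plus
    observation misfit, each weighted by its error variance."""
    x = xb
    for _ in range(steps):
        # gradient of J with respect to the analysis state x
        grad = 2.0 * (x - xb) / B + 2.0 * H * (H * x - y) / R
        x -= lr * grad
    return x
```

With equal error variances the analysis lands halfway between background and observation; shrinking B (a trusted background) pulls the analysis toward x_b, exactly the behavior the covariance weights are meant to encode.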

  14. A federated design for a neurobiological simulation engine: the CBI federated software architecture.

    PubMed

    Cornelis, Hugo; Coop, Allan D; Bower, James M

    2012-01-01

    Simulator interoperability and extensibility has become a growing requirement in computational biology. To address this, we have developed a federated software architecture. It is federated by its union of independent disparate systems under a single cohesive view, provides interoperability through its capability to communicate, execute programs, or transfer data among different independent applications, and supports extensibility by enabling simulator expansion or enhancement without the need for major changes to system infrastructure. Historically, simulator interoperability has relied on development of declarative markup languages such as the neuron modeling language NeuroML, while simulator extension typically occurred through modification of existing functionality. The software architecture we describe here allows for both these approaches. However, it is designed to support alternative paradigms of interoperability and extensibility through the provision of logical relationships and defined application programming interfaces. They allow any appropriately configured component or software application to be incorporated into a simulator. The architecture defines independent functional modules that run stand-alone. They are arranged in logical layers that naturally correspond to the occurrence of high-level data (biological concepts) versus low-level data (numerical values) and distinguish data from control functions. The modular nature of the architecture and its independence from a given technology facilitates communication about similar concepts and functions for both users and developers. It provides several advantages for multiple independent contributions to software development. 
Importantly, these include: (1) Reduction in complexity of individual simulator components when compared to the complexity of a complete simulator, (2) Documentation of individual components in terms of their inputs and outputs, (3) Easy removal or replacement of unnecessary or obsoleted components, (4) Stand-alone testing of components, and (5) Clear delineation of the development scope of new components.

  15. A Federated Design for a Neurobiological Simulation Engine: The CBI Federated Software Architecture

    PubMed Central

    Cornelis, Hugo; Coop, Allan D.; Bower, James M.

    2012-01-01

    Simulator interoperability and extensibility has become a growing requirement in computational biology. To address this, we have developed a federated software architecture. It is federated by its union of independent disparate systems under a single cohesive view, provides interoperability through its capability to communicate, execute programs, or transfer data among different independent applications, and supports extensibility by enabling simulator expansion or enhancement without the need for major changes to system infrastructure. Historically, simulator interoperability has relied on development of declarative markup languages such as the neuron modeling language NeuroML, while simulator extension typically occurred through modification of existing functionality. The software architecture we describe here allows for both these approaches. However, it is designed to support alternative paradigms of interoperability and extensibility through the provision of logical relationships and defined application programming interfaces. They allow any appropriately configured component or software application to be incorporated into a simulator. The architecture defines independent functional modules that run stand-alone. They are arranged in logical layers that naturally correspond to the occurrence of high-level data (biological concepts) versus low-level data (numerical values) and distinguish data from control functions. The modular nature of the architecture and its independence from a given technology facilitates communication about similar concepts and functions for both users and developers. It provides several advantages for multiple independent contributions to software development. 
Importantly, these include: (1) Reduction in complexity of individual simulator components when compared to the complexity of a complete simulator, (2) Documentation of individual components in terms of their inputs and outputs, (3) Easy removal or replacement of unnecessary or obsoleted components, (4) Stand-alone testing of components, and (5) Clear delineation of the development scope of new components. PMID:22242154

  16. Routing Algorithm based on Minimum Spanning Tree and Minimum Cost Flow for Hybrid Wireless-optical Broadband Access Network

    NASA Astrophysics Data System (ADS)

    Le, Zichun; Suo, Kaihua; Fu, Minglei; Jiang, Ling; Dong, Wen

    2012-03-01

    In order to minimize the average end-to-end delay for data transport in a hybrid wireless-optical broadband access network, a novel routing algorithm named MSTMCF (minimum spanning tree and minimum cost flow) is devised. The routing problem is described as a minimum spanning tree and minimum cost flow model, and the corresponding algorithm procedures are given. To verify the effectiveness of the MSTMCF algorithm, extensive simulations based on OWNS have been performed under different types of traffic sources.
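The abstract does not reproduce the MSTMCF procedures themselves, but the spanning-tree stage can be illustrated with the standard Kruskal algorithm over a weighted topology graph (the min-cost-flow stage is omitted for brevity):

```python
def kruskal_mst(n, edges):
    """Minimum spanning tree via Kruskal's algorithm with union-find.
    edges: list of (weight, u, v) tuples over nodes 0..n-1.
    Returns (total_weight, chosen_edges)."""
    parent = list(range(n))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving keeps trees shallow
            a = parent[a]
        return a

    total, chosen = 0, []
    for w, u, v in sorted(edges):           # consider edges in weight order
        ru, rv = find(u), find(v)
        if ru != rv:                        # edge joins two distinct components
            parent[ru] = rv
            total += w
            chosen.append((u, v))
    return total, chosen
```

In a routing context, edge weights would encode per-link delay; the resulting tree gives a cheap connected backbone on which flow assignment can then be optimized.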

  17. KMgene: a unified R package for gene-based association analysis for complex traits.

    PubMed

    Yan, Qi; Fang, Zhou; Chen, Wei; Stegle, Oliver

    2018-02-09

    In this report, we introduce an R package, KMgene, for performing gene-based association tests for familial, multivariate or longitudinal traits using kernel machine (KM) regression under a generalized linear mixed model (GLMM) framework. Extensive simulations were performed to evaluate the validity of the approaches implemented in KMgene. Availability: http://cran.r-project.org/web/packages/KMgene. Contact: qi.yan@chp.edu or wei.chen@chp.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2018. Published by Oxford University Press.
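Kernel machine association tests of this family are built around a variance-component score statistic Q = r' K r, where r are residuals under the null model and K is a kernel of genetic similarity between subjects. A toy Python version with a linear kernel K = G G' and an intercept-only null model (an illustration of the general idea, not KMgene's implementation, which handles GLMM null models and significance evaluation) is:

```python
def linear_kernel_score(y, G):
    """Score statistic Q = r' K r with the linear kernel K = G G',
    where G is the n-by-m genotype matrix for a gene's m variants.
    Null model here is intercept-only, so r is y minus its mean."""
    n = len(y)
    mu = sum(y) / n
    r = [yi - mu for yi in y]               # residuals under the null
    Q = 0.0
    for i in range(n):
        for j in range(n):
            # K[i][j]: genotype similarity between subjects i and j
            kij = sum(G[i][k] * G[j][k] for k in range(len(G[0])))
            Q += r[i] * kij * r[j]
    return Q
```

Q grows when genotype similarity aligns with phenotype residual similarity; real tests compare Q against its null distribution (a mixture of chi-squares), which is the part packages like KMgene automate.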

  18. A Mobile IPv6 based Distributed Mobility Management Mechanism of Mobile Internet

    NASA Astrophysics Data System (ADS)

    Yan, Shi; Jiayin, Cheng; Shanzhi, Chen

    A flatter architecture is one of the trends in the mobile Internet. Traditional centralized mobility management mechanisms face challenges such as scalability and UE reachability. A MIPv6-based distributed mobility management mechanism is proposed in this paper. Important network entities and signaling procedures are defined. UE reachability is addressed through an extension to DNS servers. Simulation results show that the proposed approach overcomes the scalability problem of the centralized scheme.

  19. Methods for Reachability-based Hybrid Controller Design

    DTIC Science & Technology

    2012-05-10

    approaches for airport runways (Teo and Tomlin, 2003). The results of the reachability calculations were validated in extensive simulations as well as... UAV flight experiments (Jang and Tomlin, 2005; Teo, 2005). While the focus of these previous applications lies largely in safety verification, the work... B([15, 0], a0) × [−π, π]) \ V, ∀qi ∈ Q, where a0 = 30 m is the protected radius (chosen based upon published data of the wingspan of a Boeing KC-135

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chremos, Alexandros, E-mail: achremos@imperial.ac.uk; Nikoubashman, Arash, E-mail: arashn@princeton.edu; Panagiotopoulos, Athanassios Z.

    In this contribution, we develop a coarse-graining methodology for mapping specific block copolymer systems to bead-spring particle-based models. We map the constituent Kuhn segments to Lennard-Jones particles, and establish a semi-empirical correlation between the experimentally determined Flory-Huggins parameter χ and the interaction of the model potential. For these purposes, we have performed an extensive set of isobaric–isothermal Monte Carlo simulations of binary mixtures of Lennard-Jones particles with the same size but with asymmetric energetic parameters. The phase behavior of these monomeric mixtures is then extended to chains with finite sizes through theoretical considerations. Such a top-down coarse-graining approach is important from a computational point of view, since many characteristic features of block copolymer systems are on time and length scales which are still inaccessible through fully atomistic simulations. We demonstrate the applicability of our method for generating parameters by reproducing the morphology diagram of a specific diblock copolymer, namely, poly(styrene-b-methyl methacrylate), which has been extensively studied in experiments.
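The correlation between χ and the energetic asymmetry of the model potential has a textbook mean-field analogue: on a lattice with coordination number z, χ = z Δw / (kB T) with Δw = (εAA + εBB)/2 − εAB for attractive contact energies of order −ε. A sketch of that lattice formula (not the paper's fitted semi-empirical correlation) is:

```python
def flory_huggins_chi(eps_AA, eps_BB, eps_AB, T, z=6.0, kB=1.0):
    """Mean-field lattice estimate of the Flory-Huggins parameter from
    pairwise interaction strengths: chi = z * dw / (kB * T), where
    dw = (eps_AA + eps_BB) / 2 - eps_AB measures how much weaker the
    cross-species attraction is than the like-species average."""
    dw = 0.5 * (eps_AA + eps_BB) - eps_AB
    return z * dw / (kB * T)
```

A weaker A-B attraction than the like-species average gives χ > 0 (demixing tendency), which is the qualitative trend a simulation-based correlation like the one above would quantify.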

  1. Influence of Scattering on Ballistic Nanotransistor Design

    NASA Technical Reports Server (NTRS)

    Anantram, M. P.; Svizhenko, Alexei; Biegel, Bryan, A. (Technical Monitor)

    2002-01-01

    Importance of this work: (1) This is the first work to model electron-phonon scattering within a quantum mechanical approach to nanotransistors. The simulations use the non-equilibrium Green's function method. (2) A simple equation which captures the importance of scattering as a function of the spatial location from source to drain is presented. This equation helps interpret the numerical simulations. (3) We show that the resistance per unit length on the source side is much larger than on the drain side, making scattering on the source side of the device much more important than scattering on the drain side. Numerical estimates of ballisticity for 10 nm channel length devices in the presence of electron-phonon scattering are given. Based on these calculations, we propose that to achieve a larger on-current in nanotransistors, it is crucial to keep the highly doped source extension region extremely small, even if this is at the cost of making the highly doped drain extension region longer.

  2. Driving-forces model on individual behavior in scenarios considering moving threat agents

    NASA Astrophysics Data System (ADS)

    Li, Shuying; Zhuang, Jun; Shen, Shifei; Wang, Jia

    2017-09-01

    The individual behavior model is a contributory factor to the accuracy of agent-based simulation in different scenarios. However, few studies have considered moving threat agents, which often occur in terrorist attacks carried out by attackers with close-range weapons (e.g., sword, stick). At the same time, many existing behavior models lack validation against cases or experiments. This paper builds a new individual behavior model based on seven behavioral hypotheses. The driving-forces model is an extension of the classical social force model to scenarios that include moving threat agents. An experiment was conducted to validate the key components of the model. The model was then compared with an advanced Elliptical Specification II social force model by calculating the fitting errors between the simulated and experimental trajectories, and was applied to simulate a specific circumstance. Our results show that the driving-forces model reduced the fitting error by an average of 33.9% and the standard deviation by an average of 44.5%, which indicates the accuracy and stability of the model in the studied situation. The new driving-forces model could be used to simulate individual behavior when analyzing the risk of specific scenarios with agent-based simulation methods, such as risk analysis of close-range terrorist attacks in public places.
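The flavor of such a model can be sketched as a classical Helbing-style social force update plus an extra repulsive term pointing away from the moving threat agent's current position. All force terms and constants below are illustrative choices, not the paper's calibrated driving-forces model:

```python
import math

def step(pos, vel, goal, threat, dt=0.1, tau=0.5, v0=1.5, A=2.0, B=1.0):
    """One explicit Euler update of a social-force-style pedestrian model
    with an added repulsion from a (possibly moving) threat agent.
    tau: relaxation time; v0: desired speed; A, B: repulsion strength/range."""
    # driving term: relax velocity toward desired speed along the goal direction
    dx, dy = goal[0] - pos[0], goal[1] - pos[1]
    dist = math.hypot(dx, dy) or 1e-9
    fx = (v0 * dx / dist - vel[0]) / tau
    fy = (v0 * dy / dist - vel[1]) / tau
    # exponential repulsion away from the threat agent's current position
    tx, ty = pos[0] - threat[0], pos[1] - threat[1]
    tdist = math.hypot(tx, ty) or 1e-9
    rep = A * math.exp(-tdist / B)
    fx += rep * tx / tdist
    fy += rep * ty / tdist
    vel = (vel[0] + fx * dt, vel[1] + fy * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel
```

A pedestrian heading toward a goal on its right while a threat stands above it drifts both forward and away from the threat; in an agent-based risk analysis the threat position would itself be updated each step.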

  3. The Particle Accelerator Simulation Code PyORBIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorlov, Timofey V; Holmes, Jeffrey A; Cousineau, Sarah M

    2015-01-01

    The particle accelerator simulation code PyORBIT is presented. The structure, implementation, history, parallel and simulation capabilities, and future development of the code are discussed. The PyORBIT code is a new implementation and extension of algorithms of the original ORBIT code that was developed for the Spallation Neutron Source accelerator at the Oak Ridge National Laboratory. The PyORBIT code has a two-level structure. The upper level uses the Python programming language to control the flow of intensive calculations performed by the lower-level code implemented in the C++ language. The parallel capabilities are based on MPI communications. PyORBIT is an open-source code accessible to the public through the Google Open Source Projects Hosting service.

  4. The impact of demographic change on tax revenue.

    PubMed

    Goudswaard, K; Van De Kar, H

    1994-09-01

    "This paper [simulates] the impact of demographic change on direct tax revenue for the Netherlands using extensive survey data and population projections. Projected demographic development in the Netherlands fits in well with the OECD mainstream. The analysis thus has a more general relevance. The simulations indicate a 27 percent rise in tax revenue until 2010 because of population growth and a relatively older labor force. After 2030, revenue falls as a consequence of a declining population and a rapidly rising share of the elderly. The authors also simulated a variant in which labor-force participation rates are set on the substantially higher OECD average. In this case, the increase in tax revenue almost doubles as compared to the base variant." excerpt

  5. Possibilities of the ErgoScope high fidelity work simulator in skill assessment, skill development and vocational aptitude tests of physically disabled persons.

    PubMed

    Izsó, Lajos; Székely, Ildikó; Dános, László

    2015-01-01

    The aim of this paper - based on the extensive experiences of the authors gained by using one particular work simulator - is to present some promising possibilities of the application of this (and any other similar) work simulator in the field of skill assessment, skill development and vocational aptitude tests of physically disabled persons. During skill assessment and development, as parts of the therapy, the focus is on the disabled functions. During vocational aptitude tests, however, the focus is already on the functions that remained intact and therefore can be the basis of returning to work. Some factual examples are provided to realize the proposed possibilities in practice.

  6. Skylab

    NASA Image and Video Library

    1973-05-01

    This photograph was taken in the Marshall Space Flight Center (MSFC) Neutral Buoyancy Simulator (NBS) during the testing of an emergency procedure to deploy a twin-pole sunshade to protect the orbiting workshop from overheating due to the loss of its thermal shield. The spacecraft suffered damage to its sunshield during its launch on May 14, 1973. This photograph shows the base plate used to hold the twin-pole in place, the bag to hold the fabric sail, and the lines that were used to draw the sail into place. Extensive testing and many hours of practice in simulators, such as the NBS, helped prepare the Skylab crewmen for extravehicular performance in the weightless environment. This huge water tank simulated the weightless environment that the astronauts would encounter in space.

  7. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.

  8. About the increase of the large transverse momentum processes fraction in hA interactions at energies 5×10^14 - 10^16 eV according to the data on E.A.S. hadrons

    NASA Technical Reports Server (NTRS)

    Danilova, T. V.; Dubovy, A. G.; Erlykin, A. D.; Nesterova, N. M.; Chubenko, A. P.

    1985-01-01

    The lateral distributions of extensive air shower (EAS) hadrons obtained at the Tien-Shan array are compared with simulations. The simulation data have been treated in the same way as the experimental data, including the recording method. The comparison shows that the experimental hadron lateral distributions are wider than the simulated ones. On the basis of this result, the conclusion is drawn that the fraction of processes with large transverse momentum p⊥ increases in hadron-air interactions at energies 5×10^14 - 10^16 eV compared with accelerator data on p-p interactions at lower energies.

  9. Hybrid thermal link-wise artificial compressibility method

    NASA Astrophysics Data System (ADS)

    Obrecht, Christian; Kuznik, Frédéric

    2015-10-01

    Thermal flow prediction is a subject of interest from both scientific and engineering points of view. Our motivation is to develop an accurate, easy-to-implement and highly scalable method for the simulation of convective flows. To this end, we present an extension of the link-wise artificial compressibility method (LW-ACM) for thermal simulation of weakly compressible flows. The novel hybrid formulation uses second-order finite difference operators for the energy equation based on the same stencils as the LW-ACM. For validation purposes, the differentially heated cubic cavity was simulated. The simulations remained stable for Rayleigh numbers up to Ra = 10^8. The Nusselt numbers at the isothermal walls and the dynamic quantities are in good agreement with reference values from the literature. Our results show that the hybrid thermal LW-ACM is an effective and easy-to-use solution for convective flows.
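The finite-difference ingredient can be illustrated in one dimension: an explicit update of the energy (heat) equation using the second-order central difference stencil (Ti-1 - 2 Ti + Ti+1) / dx². This is a generic sketch of such an operator, not the LW-ACM stencil itself:

```python
def diffuse(T, alpha=0.1, dt=0.1, dx=1.0, steps=1):
    """Explicit 1-D heat equation update T_t = alpha * T_xx using the
    second-order central difference for T_xx; boundary values are held
    fixed (Dirichlet). Stability requires r = alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx**2
    for _ in range(steps):
        # build the new field from the old one in a single pass
        T = [T[0]] + [T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
                      for i in range(1, len(T) - 1)] + [T[-1]]
    return T
```

A temperature spike spreads symmetrically to its neighbours while (away from the boundaries) the total heat is conserved, the two sanity checks any energy-equation discretization must pass.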

  10. The virtual morphology and the main movements of the human neck simulations used for car crash studies

    NASA Astrophysics Data System (ADS)

    Ciunel, St.; Tica, B.

    2016-08-01

    The paper presents studies made on a biomechanical system composed of the neck, head and thorax bones. The models were defined in a CAD environment that includes the Adams algorithm for dynamic simulations. The virtual models and the entire morphology were obtained starting from CT images of a living human subject. The main movements analyzed were: axial rotation (left-right), lateral bending (left-right) and flexion-extension. After simulation, the entire biomechanical behavior was obtained in the form of data tables and diagrams. The virtual model composed of neck and head can be included in a more complex system (such as a car) and subjected to several impact simulations (virtual crash tests). Our research team also built the main components of a testing device for the neck-head system of car crash dummies using anatomical data.

  11. An empirical potential for simulating vacancy clusters in tungsten.

    PubMed

    Mason, D R; Nguyen-Manh, D; Becquart, C S

    2017-12-20

    We present an empirical interatomic potential for tungsten, particularly well suited for simulations of vacancy-type defects. We compare energies and structures of vacancy clusters generated with the empirical potential with an extensive new database of values computed using density functional theory, and show that the new potential predicts low-energy defect structures and formation energies with high accuracy. A significant difference from other popular embedded-atom empirical potentials for tungsten is the correct prediction of surface energies. Interstitial properties and short-range pairwise behaviour remain similar to the Ackland-Thetford potential on which it is based, making this potential well suited to simulations of microstructural evolution following irradiation damage cascades. Using atomistic kinetic Monte Carlo simulations, we predict vacancy cluster dissociation in the range 1100-1300 K, the temperature range generally associated with stage IV recovery.
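Atomistic kinetic Monte Carlo of the kind mentioned above typically draws events from Arrhenius rates nu = nu0 * exp(-Ea / (kB * T)) using the rejection-free (BKL) algorithm: pick an event with probability proportional to its rate, then advance the clock by an exponentially distributed residence time. A generic single-step sketch (not the authors' code) is:

```python
import math
import random

def kmc_step(barriers, T, nu0=1e13, kB=8.617e-5, rng=random):
    """One rejection-free kinetic Monte Carlo step.
    barriers: migration/dissociation barriers Ea in eV; T in kelvin;
    nu0: attempt frequency in Hz; kB in eV/K.
    Returns (index of chosen event, elapsed time in seconds)."""
    rates = [nu0 * math.exp(-Ea / (kB * T)) for Ea in barriers]
    total = sum(rates)
    # select the event whose cumulative-rate bracket contains u
    u = rng.random() * total
    acc, chosen = 0.0, len(rates) - 1
    for i, r in enumerate(rates):
        acc += r
        if u < acc:
            chosen = i
            break
    dt = -math.log(rng.random()) / total    # Poisson waiting time
    return chosen, dt
```

At 1200 K a 0.5 eV barrier outpaces a 2.0 eV barrier by about six orders of magnitude, so low-barrier migration events dominate until a rare high-barrier dissociation fires, which is how such simulations reach the experimentally relevant 1100-1300 K dissociation window.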

  12. Partially coherent X-ray wavefront propagation simulations including grazing-incidence focusing optics.

    PubMed

    Canestrari, Niccolo; Chubar, Oleg; Reininger, Ruben

    2014-09-01

    X-ray beamlines in modern synchrotron radiation sources make extensive use of grazing-incidence reflective optics, in particular Kirkpatrick-Baez elliptical mirror systems. These systems can focus the incoming X-rays down to nanometer-scale spot sizes while maintaining relatively large acceptance apertures and high flux in the focused radiation spots. In low-emittance storage rings and in free-electron lasers such systems are used with partially or even nearly fully coherent X-ray beams and often target diffraction-limited resolution. Therefore, their accurate simulation and modeling has to be performed within the framework of wave optics. Here the implementation and benchmarking of a wave-optics method for the simulation of grazing-incidence mirrors based on the local stationary-phase approximation or, in other words, the local propagation of the radiation electric field along geometrical rays, is described. The proposed method is CPU-efficient and fully compatible with the numerical methods of Fourier optics. It has been implemented in the Synchrotron Radiation Workshop (SRW) computer code and extensively tested against the geometrical ray-tracing code SHADOW. The test simulations have been performed for cases without and with diffraction at mirror apertures, including cases where the grazing-incidence mirrors can be hardly approximated by ideal lenses. Good agreement between the SRW and SHADOW simulation results is observed in the cases without diffraction. The differences between the simulation results obtained by the two codes in diffraction-dominated cases for illumination with fully or partially coherent radiation are analyzed and interpreted. The application of the new method for the simulation of wavefront propagation through a high-resolution X-ray microspectroscopy beamline at the National Synchrotron Light Source II (Brookhaven National Laboratory, USA) is demonstrated.

  13. Simulation of rotor blade element turbulence

    NASA Technical Reports Server (NTRS)

    Mcfarland, R. E.; Duisenberg, Ken

    1995-01-01

    A piloted, motion-based simulation of Sikorsky's Black Hawk helicopter was used as a platform for the investigation of rotorcraft responses to vertical turbulence. By using an innovative temporal and geometrical distribution algorithm that preserved the statistical characteristics of the turbulence over the rotor disc, stochastic velocity components were applied at each of twenty blade-element stations. This model was implemented on NASA Ames' Vertical Motion Simulator (VMS), and ten test pilots were used to establish that the model created realistic cues. The objectives of this research included the establishment of a simulation-technology basis for future investigation into real-time turbulence modeling. This goal was achieved; our extensive additions to the rotor model added less than a 10 percent computational overhead. Using a VAX 9000 computer, the entire simulation required a cycle time of less than 12 msec. Pilot opinion during this simulation was generally quite favorable. For low-speed flight the consensus was that SORBET (an acronym for the title) was better than the conventional body-fixed model used for comparison purposes, which was judged too violent (like a washboard). For high-speed flight the pilots could not identify differences between the models. These opinions were something of a surprise because only the vertical turbulence component on the rotor system was implemented in SORBET. Because of the finite-element distribution of the inputs, induced outputs were observed in all translational and rotational axes. Extensive post-simulation spectral analyses of the SORBET model suggest that proper rotorcraft turbulence modeling requires that vertical atmospheric disturbances not be superimposed at the vehicle center of gravity but, rather, be input into the rotor system, where the rotor-to-body transfer function severely attenuates high-frequency rotorcraft responses.

  14. Computational studies of horizontal axis wind turbines

    NASA Astrophysics Data System (ADS)

    Xu, Guanpeng

    A numerical technique has been developed for efficiently simulating fully three-dimensional viscous fluid flow around horizontal axis wind turbines (HAWT) using a zonal approach. The flow field is viewed as a combination of viscous regions, inviscid regions, and vortices. The method solves the costly unsteady Reynolds-averaged Navier-Stokes (RANS) equations only in the viscous region around the turbine blades. It solves the full potential equation in the inviscid region, where the flow is irrotational and isentropic. The tip vortices are simulated using a Lagrangian approach, thus removing the need to accurately resolve them on a fine grid. The hybrid method is shown to provide good results with modest CPU resources. A full Navier-Stokes based methodology has also been developed for modeling wind turbines at high wind conditions where extensive stall may occur. An overset grid based version that can model rotor-tower interactions has been developed. Finally, a blade element theory based methodology has been developed for constructing improved tip loss models and stall delay models. The effects of turbulence are simulated using a zero-equation eddy viscosity model or a one-equation Spalart-Allmaras model. Two transition models, one based on Eppler's criterion and the other based on Michel's criterion, have been developed and tested. The hybrid method has been extensively validated for axial wind conditions for three rotors---NREL Phase II, Phase III, and Phase VI configurations. A limited set of calculations has been done for rotors operating under yaw conditions. Preliminary simulations have also been carried out to assess the effects of the tower wake on the rotor. In most of these cases, satisfactory agreement has been obtained with measurements. Using the numerical results from the present methodologies as a guide, Prandtl's tip loss model and Corrigan's stall delay model were correlated with the present calculations.
An improved tip loss model has been obtained. A correction to Corrigan's stall delay model has also been developed. Incorporation of these corrections is shown to considerably improve power predictions, even when a very simple aerodynamic theory---blade element method with annular inflow---is used.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Xin, E-mail: xinshih86029@gmail.com; Zhao, Xiangmo; Hui, Fei

    Clock synchronization in wireless sensor networks (WSNs) has been studied extensively in recent years, and many protocols have been proposed from the standpoint of statistical signal processing, which is an effective way to optimize accuracy. However, the accuracy derived from statistical data can be improved mainly through additional packet exchange, which greatly consumes the limited power resources. In this paper, a reliable clock estimation using linear weighted fusion based on pairwise broadcast synchronization is proposed to optimize sync accuracy without expending additional sync packets. As a contribution, a linear weighted fusion scheme for multiple clock deviations is constructed with the collaborative sensing of clock timestamps, and the fusion weight is defined by the covariance of sync errors for the different clock deviations. Extensive simulation results show that the proposed approach achieves better performance in terms of sync overhead and sync accuracy.
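    The record does not reproduce the fusion formula itself; a minimal, illustrative sketch of one standard linear weighted fusion, inverse-variance weighting of several clock-deviation estimates (function name and all numbers below are hypothetical, not taken from the paper), is:

```python
import numpy as np

def fuse_clock_deviations(estimates, error_variances):
    """Fuse several estimates of the same clock deviation by linear
    weighted fusion. Weights are inversely proportional to each
    estimate's sync-error variance (a minimum-variance choice), so
    more reliable estimates contribute more to the fused value."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(error_variances, dtype=float)
    weights = (1.0 / variances) / np.sum(1.0 / variances)  # normalized to sum to 1
    fused = float(np.dot(weights, estimates))
    fused_variance = float(1.0 / np.sum(1.0 / variances))  # variance of the fused estimate
    return fused, fused_variance

# Three hypothetical estimates of one clock offset (seconds) and their variances.
fused, var = fuse_clock_deviations([1.2e-3, 1.5e-3, 1.1e-3], [1e-8, 4e-8, 2e-8])
```

    Under this weighting the fused variance is never larger than the smallest input variance, which is one sense in which fusion can improve accuracy without extra packet exchange.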

  16. Design and implementation of laser target simulator in hardware-in-the-loop simulation system based on LabWindows/CVI and RTX

    NASA Astrophysics Data System (ADS)

    Tong, Qiujie; Wang, Qianqian; Li, Xiaoyang; Shan, Bin; Cui, Xuntai; Li, Chenyu; Peng, Zhong

    2016-11-01

    To satisfy real-time and generality requirements, a laser target simulator for a semi-physical (hardware-in-the-loop) simulation system based on an RTX + LabWindows/CVI platform is proposed in this paper. Compared with the upper-lower computer architecture used in most current real-time simulation platforms, this system offers better maintainability and portability. The system runs on Windows, using the RTX real-time extension subsystem, combined with a reflective memory network, to guarantee real-time performance for tasks such as computing the simulation model, transmitting simulation data, and maintaining real-time communication. These real-time tasks run under the RTSS process. A graphical interface compiled with LabWindows/CVI handles the non-real-time tasks of the simulation, such as man-machine interaction and the display and storage of simulation data, which run under the Win32 process. Through the design of RTX shared memory and a task scheduling algorithm, data interaction between the real-time RTSS process and the non-real-time Win32 process is accomplished. The experimental results show that the system achieves strong real-time performance, high stability, high simulation accuracy, and good human-computer interaction.

  17. A systematic molecular dynamics study of nearest-neighbor effects on base pair and base pair step conformations and fluctuations in B-DNA

    PubMed Central

    Lavery, Richard; Zakrzewska, Krystyna; Beveridge, David; Bishop, Thomas C.; Case, David A.; Cheatham, Thomas; Dixit, Surjit; Jayaram, B.; Lankas, Filip; Laughton, Charles; Maddocks, John H.; Michon, Alexis; Osman, Roman; Orozco, Modesto; Perez, Alberto; Singh, Tanya; Spackova, Nada; Sponer, Jiri

    2010-01-01

    It is well recognized that base sequence exerts a significant influence on the properties of DNA and plays a significant role in protein–DNA interactions vital for cellular processes. Understanding and predicting base sequence effects requires an extensive structural and dynamic dataset which is currently unavailable from experiment. A consortium of laboratories was consequently formed to obtain this information using molecular simulations. This article describes results providing information not only on all 10 unique base pair steps, but also on all possible nearest-neighbor effects on these steps. These results are derived from simulations of 50–100 ns on 39 different DNA oligomers in explicit solvent and using a physiological salt concentration. We demonstrate that the simulations are converged in terms of helical and backbone parameters. The results show that nearest-neighbor effects on base pair steps are very significant, implying that dinucleotide models are insufficient for predicting sequence-dependent behavior. Flanking base sequences can notably lead to base pair step parameters in dynamic equilibrium between two conformational sub-states. Although this study only provides limited data on next-nearest-neighbor effects, we suggest that such effects should be analyzed before attempting to predict the sequence-dependent behavior of DNA. PMID:19850719

  18. Climate Envelope Modeling and Dispersal Simulations Show Little Risk of Range Extension of the Shipworm, Teredo navalis (L.), in the Baltic Sea

    PubMed Central

    Appelqvist, Christin; Al-Hamdani, Zyad K.; Jonsson, Per R.; Havenhand, Jon N.

    2015-01-01

    The shipworm, Teredo navalis, is absent from most of the Baltic Sea. In the last 20 years, increased frequency of T. navalis has been reported along the southern Baltic Sea coasts of Denmark, Germany, and Sweden, indicating possible range-extensions into previously unoccupied areas. We evaluated the effects of historical and projected near-future changes in salinity, temperature, and oxygen on the risk of spread of T. navalis in the Baltic. Specifically, we developed a simple, GIS-based, mechanistic climate envelope model to predict the spatial distribution of favourable conditions for adult reproduction and larval metamorphosis of T. navalis, based on published environmental tolerances to these factors. In addition, we used a high-resolution three-dimensional hydrographic model to simulate the probability of spread of T. navalis larvae within the study area. Climate envelope modeling showed that projected near-future climate change is not likely to change the overall distribution of T. navalis in the region, but will prolong the breeding season and increase the risk of shipworm establishment at the margins of the current range. Dispersal simulations indicated that the majority of larvae were philopatric, but those that spread over a wider area typically spread to areas unfavourable for their survival. Overall, therefore, we found no substantive evidence for climate-change related shifts in the distribution of T. navalis in the Baltic Sea, and no evidence for increased risk of spread in the near-future. PMID:25768305

  19. Onyx-Advanced Aeropropulsion Simulation Framework Created

    NASA Technical Reports Server (NTRS)

    Reed, John A.

    2001-01-01

    The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible Java-based (Sun Micro-systems, Inc.), objectoriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.

  20. Physical and digital simulations for IVA robotics

    NASA Technical Reports Server (NTRS)

    Hinman, Elaine; Workman, Gary L.

    1992-01-01

    Space-based materials processing experiments can be enhanced through the use of IVA robotic systems. A program to determine requirements for the implementation of robotic systems in a microgravity environment and to develop some preliminary concepts for acceleration control of small, lightweight arms has been initiated with the development of physical and digital simulation capabilities. The physical simulation facilities incorporate a robotic workcell containing a Zymark Zymate II robot instrumented for acceleration measurements, which is able to perform materials transfer functions while flying on NASA's KC-135 aircraft during parabolic maneuvers to simulate reduced gravity. Measurements of accelerations occurring during the reduced gravity periods will be used to characterize impacts of robotic accelerations in a microgravity environment in space. Digital simulations are being performed with TREETOPS, a NASA-developed software package used for the dynamic analysis of systems with a tree topology. Extensive use of both simulation tools will enable the design of robotic systems with enhanced acceleration control for use in the space manufacturing environment.

  1. Track structure modeling in liquid water: A review of the Geant4-DNA very low energy extension of the Geant4 Monte Carlo simulation toolkit.

    PubMed

    Bernal, M A; Bordage, M C; Brown, J M C; Davídková, M; Delage, E; El Bitar, Z; Enger, S A; Francis, Z; Guatelli, S; Ivanchenko, V N; Karamitros, M; Kyriakou, I; Maigne, L; Meylan, S; Murakami, K; Okada, S; Payno, H; Perrot, Y; Petrovic, I; Pham, Q T; Ristic-Fira, A; Sasaki, T; Štěpán, V; Tran, H N; Villagrasa, C; Incerti, S

    2015-12-01

    Understanding the fundamental mechanisms involved in the induction of biological damage by ionizing radiation remains a major challenge of today's radiobiology research. The Monte Carlo simulation of physical, physicochemical and chemical processes involved may provide a powerful tool for the simulation of early damage induction. The Geant4-DNA extension of the general purpose Monte Carlo Geant4 simulation toolkit aims to provide the scientific community with an open source access platform for the mechanistic simulation of such early damage. This paper presents the most recent review of the Geant4-DNA extension, as available to Geant4 users since June 2015 (release 10.2 Beta). In particular, the review includes the description of new physical models for the description of electron elastic and inelastic interactions in liquid water, as well as new examples dedicated to the simulation of physicochemical and chemical stages of water radiolysis. Several implementations of geometrical models of biological targets are presented as well, and the list of Geant4-DNA examples is described. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  2. Payload crew training complex simulation engineer's handbook

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1984-01-01

    The Simulation Engineer's Handbook is a guide for new engineers assigned to Experiment Simulation and a reference for engineers previously assigned. The experiment simulation process, development of experiment simulator requirements, development of experiment simulator hardware and software, and the verification of experiment simulators are discussed. The training required for experiment simulation is extensive and is only referenced in the handbook.

  3. Profile of capillary bridges between two vertically stacked cylindrical fibers under gravitational effect

    NASA Astrophysics Data System (ADS)

    Sun, Xiaohang; Lee, Hoon Joo; Michielsen, Stephen; Wilusz, Eugene

    2018-05-01

    Although profiles of axisymmetric capillary bridges between two cylindrical fibers have been extensively studied, little research has been reported on capillary bridges under external forces such as gravity. This is because external forces add significant complications to the Laplace-Young equation, making it difficult to predict drop profiles analytically. In this paper, simulations of capillary bridges between two vertically stacked cylindrical fibers, with the gravitational effect taken into consideration, are studied. The asymmetrical structure of capillary bridges, which is hard to predict analytically, was studied via a numerical approach based on Surface Evolver (SE). The axial and circumferential spreading of liquids on two identical fibers in the presence of gravitational effects is predicted to determine when the gravitational effects are significant and when they can be neglected. The effects of liquid volume, equilibrium contact angle, the distance between the two fibers, and the fiber radii were also examined. The simulation results were verified by comparison with experimental measurements. Based on SE simulations, curves representing the spreading of capillary bridges along the two cylindrical fibers were obtained. The gravitational effect was scaled based on the difference between the spreading on the upper and lower fibers.

  4. Sensitivity Observing System Experiment (SOSE)-a new effective NWP-based tool in designing the global observing system

    NASA Astrophysics Data System (ADS)

    Marseille, Gert-Jan; Stoffelen, Ad; Barkmeijer, Jan

    2008-03-01

    Lacking an established methodology to test the potential impact of prospective extensions to the global observing system (GOS) in real atmospheric cases, we developed such a method, called the Sensitivity Observing System Experiment (SOSE). For example, since the GOS is non-uniform, it is of interest to investigate the benefit of complementary observing systems filling its gaps. In a SOSE, adjoint sensitivity structures are used to define a pseudo-true atmospheric state for the simulation of the prospective observing system. Next, the synthetic observations are used together with real observations from the existing GOS in a state-of-the-art Numerical Weather Prediction (NWP) model to assess the potential added value of the new observing system. Unlike full observing system simulation experiments (OSSE), SOSE can be applied to real extreme events that were badly forecast operationally and only requires the simulation of the new instrument. As such, SOSE is an effective tool, for example, to define observation requirements for extensions to the GOS. These observation requirements may serve as input for the design of an operational network of prospective observing systems. In a companion paper we use SOSE to simulate potential future spaceborne Doppler Wind Lidar (DWL) scenarios and assess their capability to sample meteorologically sensitive areas not well captured by the current GOS, in particular over the Northern Hemisphere oceans.

  5. Density Deconvolution With EPI Splines

    DTIC Science & Technology

    2015-09-01

    effects of various substances on test subjects [11], [12]. Whereas in geophysics, a shot may be fired into the ground, in pharmacokinetics, a signal is...be significant, including medicine, bioinformatics, chemistry, astronomy, and econometrics, as well as an extensive review of kernel-based methods...demonstrate the effectiveness of our model in simulations motivated by test instances in [32]. We consider an additive measurement model scenario where

  6. ECHO: A Computer Based Test for the Measurement of Individualistic, Cooperative, Defensive, and Aggressive Models of Behavior. Occasional Paper No. 30.

    ERIC Educational Resources Information Center

    Krus, David J.; And Others

    This paper describes a test which attempts to measure a group of personality traits by analyzing the actual behavior of the participant in a computer-simulated game. ECHO evolved from an extension and computerization of Horstein and Deutsch's allocation game. The computerized version of ECHO requires subjects to make decisions about the allocation…

  7. Time domain simulations of preliminary breakdown pulses in natural lightning.

    PubMed

    Carlson, B E; Liang, C; Bitzer, P; Christian, H

    2015-06-16

    Lightning discharge is a complicated process with relevant physical scales spanning many orders of magnitude. In an effort to understand the electrodynamics of lightning and connect physical properties of the channel to observed behavior, we construct a simulation of charge and current flow on a narrow conducting channel embedded in three-dimensional space with the time domain electric field integral equation, the method of moments, and the thin-wire approximation. The method includes approximate treatment of resistance evolution due to lightning channel heating and the corona sheath of charge surrounding the lightning channel. Focusing our attention on preliminary breakdown in natural lightning by simulating stepwise channel extension with a simplified geometry, our simulation reproduces the broad features observed in data collected with the Huntsville Alabama Marx Meter Array. Some deviations in pulse shape details are evident, suggesting future work focusing on the detailed properties of the stepping mechanism. Key points: preliminary breakdown pulses can be reproduced by simulated channel extension; channel heating and corona sheath formation are crucial to proper pulse shape; extension processes and channel orientation significantly affect observations.

  8. Time domain simulations of preliminary breakdown pulses in natural lightning

    PubMed Central

    Carlson, B E; Liang, C; Bitzer, P; Christian, H

    2015-01-01

    Lightning discharge is a complicated process with relevant physical scales spanning many orders of magnitude. In an effort to understand the electrodynamics of lightning and connect physical properties of the channel to observed behavior, we construct a simulation of charge and current flow on a narrow conducting channel embedded in three-dimensional space with the time domain electric field integral equation, the method of moments, and the thin-wire approximation. The method includes approximate treatment of resistance evolution due to lightning channel heating and the corona sheath of charge surrounding the lightning channel. Focusing our attention on preliminary breakdown in natural lightning by simulating stepwise channel extension with a simplified geometry, our simulation reproduces the broad features observed in data collected with the Huntsville Alabama Marx Meter Array. Some deviations in pulse shape details are evident, suggesting future work focusing on the detailed properties of the stepping mechanism. Key points: preliminary breakdown pulses can be reproduced by simulated channel extension; channel heating and corona sheath formation are crucial to proper pulse shape; extension processes and channel orientation significantly affect observations. PMID:26664815

  9. A focused ultrasound treatment system for moving targets (part I): generic system design and in-silico first-stage evaluation.

    PubMed

    Schwenke, Michael; Strehlow, Jan; Demedts, Daniel; Haase, Sabrina; Barrios Romero, Diego; Rothlübbers, Sven; von Dresky, Caroline; Zidowitz, Stephan; Georgii, Joachim; Mihcin, Senay; Bezzi, Mario; Tanner, Christine; Sat, Giora; Levy, Yoav; Jenne, Jürgen; Günther, Matthias; Melzer, Andreas; Preusser, Tobias

    2017-01-01

    Focused ultrasound (FUS) is entering clinical routine as a treatment option. Currently, no clinically available FUS treatment system features automated respiratory motion compensation. The required quality standards make developing such a system challenging. A novel FUS treatment system with motion compensation is described, developed with the goal of clinical use. The system comprises a clinically available MR device and FUS transducer system. The controller is very generic and could use any suitable MR or FUS device. MR image sequences (echo planar imaging) are acquired for both motion observation and thermometry. Based on anatomical feature tracking, motion predictions are estimated to compensate for processing delays. FUS control parameters are computed repeatedly and sent to the hardware to steer the focus to the (estimated) target position. All involved calculations produce individually known errors, yet their impact on therapy outcome is unclear. This is solved by defining an intuitive quality measure that compares the achieved temperature to the static scenario, resulting in an overall efficiency with respect to temperature rise. To allow for extensive testing of the system over wide ranges of parameters and algorithmic choices, we replace the actual MR and FUS devices by a virtual system. It emulates the hardware and, using numerical simulations of FUS during motion, predicts the local temperature rise in the tissue resulting from the controls it receives. With a clinically available monitoring image rate of 6.67 Hz and 20 FUS control updates per second, normal respiratory motion is estimated to be compensable with an efficiency of about 80%. This reduces to about 70% for motion scaled by 1.5. Extensive testing (6347 simulated sonications) over wide ranges of parameters shows that the main source of error is the temporal motion prediction. A history-based motion prediction method performs better than a simple linear extrapolator.
The estimated efficiency of the new treatment system is already suited for clinical applications. The simulation-based in-silico testing as a first-stage validation reduces the efforts of real-world testing. Due to the extensible modular design, the described approach might lead to faster translations from research to clinical practice.
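    The linear extrapolator that serves as the baseline predictor in this record can be sketched generically as follows (function and variable names are invented for illustration; the paper's better-performing history-based predictor is not shown):

```python
def predict_linear(positions, timestamps, t_future):
    """Baseline motion predictor: estimate velocity from the last two
    observed target positions and extrapolate linearly to t_future,
    bridging the processing delay between imaging and FUS steering."""
    t0, x0 = timestamps[-2], positions[-2]
    t1, x1 = timestamps[-1], positions[-1]
    velocity = (x1 - x0) / (t1 - t0)
    return x1 + velocity * (t_future - t1)

# Observations at a 6.67 Hz imaging rate (0.15 s apart); predict 50 ms ahead.
p = predict_linear([10.0, 10.6], [0.00, 0.15], 0.20)
```

    Linear extrapolation overshoots at the turning points of periodic respiratory motion, which is one plausible reason a history-based method does better there.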

  10. Analytic calculation of radio emission from parametrized extensive air showers: A tool to extract shower parameters

    NASA Astrophysics Data System (ADS)

    Scholten, O.; Trinh, T. N. G.; de Vries, K. D.; Hare, B. M.

    2018-01-01

    The radio intensity and polarization footprint of a cosmic-ray induced extensive air shower is determined by the time-dependent structure of the current distribution residing in the plasma cloud at the shower front. In turn, the time dependence of the integrated charge-current distribution in the plasma cloud, the longitudinal shower structure, is determined by interesting physics which one would like to extract, such as the location and multiplicity of the primary cosmic-ray collision or the values of electric fields in the atmosphere during thunderstorms. To extract the structure of a shower from its footprint requires solving a complicated inverse problem. For this purpose we have developed a code that semianalytically calculates the radio footprint of an extensive air shower given an arbitrary longitudinal structure. This code can be used in an optimization procedure to extract the optimal longitudinal shower structure given a radio footprint. On the basis of air-shower universality we propose a simple parametrization of the structure of the plasma cloud. This parametrization is based on the results of Monte Carlo shower simulations. Deriving the parametrization also teaches which aspects of the plasma cloud are important for understanding the features seen in the radio-emission footprint. The calculated radio footprints are compared with microscopic CoREAS simulations.

  11. A Model for Growth of a Single Fungal Hypha Based on Well-Mixed Tanks in Series: Simulation of Nutrient and Vesicle Transport in Aerial Reproductive Hyphae

    PubMed Central

    Balmant, Wellington; Sugai-Guérios, Maura Harumi; Coradin, Juliana Hey; Krieger, Nadia; Furigo Junior, Agenor; Mitchell, David Alexander

    2015-01-01

    Current models that describe the extension of fungal hyphae and development of a mycelium either do not describe the role of vesicles in hyphal extension or do not correctly describe the experimentally observed profile for distribution of vesicles along the hypha. The present work uses the n-tanks-in-series approach to develop a model for hyphal extension that describes the intracellular transport of nutrient to a sub-apical zone where vesicles are formed and then transported to the tip, where tip extension occurs. The model was calibrated using experimental data from the literature for the extension of reproductive aerial hyphae of three different fungi, and was able to describe different profiles involving acceleration and deceleration of the extension rate. A sensitivity analysis showed that the supply of nutrient to the sub-apical vesicle-producing zone is a key factor influencing the rate of extension of the hypha. Although this model was used to describe the extension of a single reproductive aerial hypha, the use of the n-tanks-in-series approach to representing the hypha means that the model has the flexibility to be extended to describe the growth of other types of hyphae and the branching of hyphae to form a complete mycelium. PMID:25785863
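    The n-tanks-in-series idea can be illustrated with a toy transport chain (this is a schematic sketch only; the calibrated model equations and parameters are in the cited paper, and everything below is arbitrary):

```python
def simulate_tanks(n_tanks=10, k=1.0, inflow=1.0, dt=0.01, steps=1000):
    """Toy n-tanks-in-series transport chain: material enters tank 0 at a
    constant rate and each tank passes its contents to the next at a
    first-order rate k, mimicking intracellular transport toward the
    hyphal tip. Returns the amount in each tank after `steps` Euler steps."""
    c = [0.0] * n_tanks
    for _ in range(steps):
        flux_in = inflow
        for i in range(n_tanks):
            flux_out = k * c[i]       # outflow based on the pre-update level
            c[i] += dt * (flux_in - flux_out)
            flux_in = flux_out        # becomes the inflow of the next tank
        # material leaving the last tank is consumed at the tip
    return c

profile = simulate_tanks()
```

    At steady state every tank holds inflow/k; at finite times the profile decreases monotonically toward the tip, showing how a tanks-in-series chain naturally produces a gradient along the hypha.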

  12. N-body simulation for self-gravitating collisional systems with a new SIMD instruction set extension to the x86 architecture, Advanced Vector eXtensions

    NASA Astrophysics Data System (ADS)

    Tanikawa, Ataru; Yoshikawa, Kohji; Okamoto, Takashi; Nitadori, Keigo

    2012-02-01

    We present a high-performance N-body code for self-gravitating collisional systems accelerated with the aid of a new SIMD instruction set extension of the x86 architecture: Advanced Vector eXtensions (AVX), an enhanced version of the Streaming SIMD Extensions (SSE). With one processor core of an Intel Core i7-2600 processor (8 MB cache, 3.40 GHz) based on the Sandy Bridge micro-architecture, we implemented a fourth-order Hermite scheme with an individual timestep scheme (Makino and Aarseth, 1992), and achieved a performance of ~20 giga floating point operations per second (GFLOPS) at double-precision accuracy, which is two times and five times higher than that of the previously developed code implemented with the SSE instructions (Nitadori et al., 2006b), and that of a code implemented without any explicit use of SIMD instructions on the same processor core, respectively. We have parallelized the code using the so-called NINJA scheme (Nitadori et al., 2006a), and achieved ~90 GFLOPS for a system containing more than N = 8192 particles with 8 MPI processes on four cores. We expect to achieve about 10 tera FLOPS (TFLOPS) for a self-gravitating collisional system with N ~ 10^5 on massively parallel systems with at most 800 cores with the Sandy Bridge micro-architecture. This performance will be comparable to that of Graphics Processing Unit (GPU) cluster systems, such as one with about 200 Tesla C1070 GPUs (Spurzem et al., 2010). This paper offers an alternative to collisional N-body simulations with GRAPEs and GPUs.
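    The fourth-order Hermite predictor-corrector scheme of Makino and Aarseth (1992) referenced above can be sketched in plain Python; this is only the bare shared-timestep scheme, nothing like the individual-timestep AVX/MPI kernels of the paper, and the softening and demo parameters are illustrative:

```python
import numpy as np

def acc_jerk(pos, vel, mass, eps2=1e-6):
    """Softened pairwise gravitational accelerations and jerks (da/dt), G = 1."""
    acc = np.zeros_like(pos)
    jerk = np.zeros_like(pos)
    for i in range(len(mass)):
        for j in range(len(mass)):
            if i == j:
                continue
            dr = pos[j] - pos[i]
            dv = vel[j] - vel[i]
            r2 = dr @ dr + eps2
            r3 = r2 * np.sqrt(r2)
            acc[i] += mass[j] * dr / r3
            jerk[i] += mass[j] * (dv / r3 - 3.0 * (dr @ dv) * dr / (r3 * r2))
    return acc, jerk

def hermite_step(pos, vel, mass, dt):
    """One shared-timestep fourth-order Hermite predictor-corrector step."""
    a0, j0 = acc_jerk(pos, vel, mass)
    # Predict with a Taylor expansion up to the jerk term.
    pos_p = pos + vel * dt + a0 * dt**2 / 2 + j0 * dt**3 / 6
    vel_p = vel + a0 * dt + j0 * dt**2 / 2
    a1, j1 = acc_jerk(pos_p, vel_p, mass)
    # Correct using forces evaluated at the predicted state.
    vel_c = vel + (a0 + a1) * dt / 2 + (j0 - j1) * dt**2 / 12
    pos_c = pos + (vel + vel_c) * dt / 2 + (a0 - a1) * dt**2 / 12
    return pos_c, vel_c

# Demo: equal-mass circular binary (G = 1), evolved for 100 steps.
pos = np.array([[0.5, 0.0, 0.0], [-0.5, 0.0, 0.0]])
vel = np.array([[0.0, 0.70710678, 0.0], [0.0, -0.70710678, 0.0]])
mass = np.array([1.0, 1.0])
for _ in range(100):
    pos, vel = hermite_step(pos, vel, mass, 0.01)
```

    Because the corrector reuses force evaluations at the predicted state, the scheme needs only two force passes per step, which is what makes it attractive for SIMD acceleration.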

  13. Transient Three-Dimensional Side Load Analysis of a Film Cooled Nozzle

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Guidos, Mike

    2008-01-01

    Transient three-dimensional numerical investigations of the side load physics for an engine encompassing a film cooled nozzle extension and a regeneratively cooled thrust chamber were performed. The objectives of this study are to identify the three-dimensional side load physics and to compute the associated aerodynamic side load using an anchored computational methodology. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation and a transient inlet history based on an engine system simulation. Ultimately, the computational results will be provided to the nozzle designers for estimating the effect of the peak side load on the nozzle structure. Computations simulating engine startup at ambient pressures corresponding to sea level and three high altitudes were performed. In addition, computations for both engine startup and shutdown transients were also performed for a stub nozzle operating at sea level. For the engine with the full nozzle extension, the computational results show that, during startup at sea level, the peak side load occurs when the lambda shock steps into the turbine exhaust flow, while the side load caused by the transition from free-shock separation to restricted-shock separation comes second; the side loads decrease rapidly and progressively as the ambient pressure decreases. For the stub nozzle operating at sea level, the computed side loads during both startup and shutdown become very small due to the much reduced flow area.

  14. Fail Safe, High Temperature Magnetic Bearings

    NASA Technical Reports Server (NTRS)

    Minihan, Thomas; Palazzolo, Alan; Kim, Yeonkyu; Lei, Shu-Liang; Kenny, Andrew; Na, Uhn Joo; Tucker, Randy; Preuss, Jason; Hunt, Andrew; Carter, Bart; et al.

    2002-01-01

    This paper contributes to the magnetic bearing literature in two distinct areas: high temperature and redundant actuation. Design considerations and test results are given for the first published combined 538 C (1000 F) high speed rotating test performance of a magnetic bearing. Secondly, a significant extension of the flux isolation based, redundant actuator control algorithm is proposed to eliminate the prior deficiency of changing position stiffness after failure. The benefit of the novel extension was not experimentally demonstrated due to a high active stiffness requirement. In addition, test results are given for actuator failure tests at 399 C (750 F), 12,500 rpm. Finally, simulation results are presented confirming the experimental data and validating the redundant control algorithm.

  15. Cutting the wires: modularization of cellular networks for experimental design.

    PubMed

    Lang, Moritz; Summers, Sean; Stelling, Jörg

    2014-01-07

    Understanding naturally evolved cellular networks requires the consecutive identification and revision of the interactions between relevant molecular species. In this process, initially often simplified and incomplete networks are extended by integrating new reactions or whole subnetworks to increase consistency between model predictions and new measurement data. However, increased consistency with experimental data alone is not sufficient to show the existence of biomolecular interactions, because the interplay of different potential extensions might lead to overall similar dynamics. Here, we present a graph-based modularization approach to facilitate the design of experiments targeted at independently validating the existence of several potential network extensions. Our method is based on selecting the outputs to measure during an experiment, such that each potential network extension becomes virtually insulated from all others during data analysis. Each output defines a module that only depends on one hypothetical network extension, and all other outputs act as virtual inputs to achieve insulation. Given appropriate experimental time-series measurements of the outputs, our modules can be analyzed, simulated, and compared to the experimental data separately. Our approach exemplifies the close relationship between structural systems identification and modularization, an interplay that promises development of related approaches in the future. Copyright © 2014 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  16. PhysiCell: An open source physics-based cell simulator for 3-D multicellular systems.

    PubMed

    Ghaffarizadeh, Ahmadreza; Heiland, Randy; Friedman, Samuel H; Mumenthaler, Shannon M; Macklin, Paul

    2018-02-01

    Many multicellular systems problems can only be understood by studying how cells move, grow, divide, interact, and die. Tissue-scale dynamics emerge from systems of many interacting cells as they respond to and influence their microenvironment. The ideal "virtual laboratory" for such multicellular systems simulates both the biochemical microenvironment (the "stage") and many mechanically and biochemically interacting cells (the "players" upon the stage). PhysiCell, a physics-based multicellular simulator, is an open source agent-based simulator that provides both the stage and the players for studying many interacting cells in dynamic tissue microenvironments. It builds upon a multi-substrate biotransport solver to link cell phenotype to multiple diffusing substrates and signaling factors. It includes biologically-driven sub-models for cell cycling, apoptosis, necrosis, solid and fluid volume changes, mechanics, and motility "out of the box." The C++ code has minimal dependencies, making it simple to maintain and deploy across platforms. PhysiCell has been parallelized with OpenMP, and its performance scales linearly with the number of cells. Simulations up to 10^5-10^6 cells are feasible on quad-core desktop workstations; larger simulations are attainable on single HPC compute nodes. We demonstrate PhysiCell by simulating the impact of necrotic core biomechanics, 3-D geometry, and stochasticity on the dynamics of hanging drop tumor spheroids and ductal carcinoma in situ (DCIS) of the breast. We demonstrate stochastic motility, chemical and contact-based interaction of multiple cell types, and the extensibility of PhysiCell with examples in synthetic multicellular systems (a "cellular cargo delivery" system, with application to anti-cancer treatments), cancer heterogeneity, and cancer immunology. PhysiCell is a powerful multicellular systems simulator that will be continually improved with new capabilities and performance improvements. It also represents a significant independent code base for replicating results from other simulation platforms. The PhysiCell source code, examples, documentation, and support are available under the BSD license at http://PhysiCell.MathCancer.org and http://PhysiCell.sf.net.

  17. A serious game for learning ultrasound-guided needle placement skills.

    PubMed

    Chan, Wing-Yin; Qin, Jing; Chui, Yim-Pan; Heng, Pheng-Ann

    2012-11-01

    Ultrasound-guided needle placement is a key step in many radiological intervention procedures such as biopsy, local anesthesia and fluid drainage. To help train future intervention radiologists, we develop a serious game to teach the skills involved. We introduce novel techniques for realistic simulation and integrate game elements for active and effective learning. This game is designed in the context of needle placement training based on some essential characteristics of serious games. Training scenarios are interactively generated via a block-based construction scheme. A novel example-based texture synthesis technique is proposed to simulate corresponding ultrasound images. Game levels are defined based on the difficulty of the generated scenarios. Interactive recommendation of desirable insertion paths is provided during the training as an adaptation mechanism. We also develop a fast physics-based approach to reproduce the shadowing effect of needles in ultrasound images. Game elements such as time-attack tasks, hints and performance evaluation tools are also integrated in our system. Extensive experiments are performed to validate its feasibility for training.

  18. Finite element model focused on stress distribution in the levator ani muscle during vaginal delivery.

    PubMed

    Krofta, Ladislav; Havelková, Linda; Urbánková, Iva; Krčmář, Michal; Hynčík, Luděk; Feyereisl, Jaroslav

    2017-02-01

    During vaginal delivery, the levator ani muscle (LAM) undergoes severe deformation. This stress can lead to stretch-related LAM injuries. The objective of this study was to develop a sophisticated MRI-based model to simulate changes in the LAM during vaginal delivery. A 3D finite element model of the female pelvic floor and fetal head was developed. The model geometry was based on MRI data from a nulliparous woman and 1-day-old neonate. Material parameters were estimated using uniaxial test data from the literature and by least-square minimization method. The boundary conditions reflected all anatomical constraints and supports. A simulation of vaginal delivery with regard to the cardinal movements of labor was then performed. The mean stress values in the iliococcygeus portion of the LAM during fetal head extension were 4.91-7.93 MPa. The highest stress values were induced in the pubovisceral and puborectal LAM portions (mean 27.46 MPa) at the outset of fetal head extension. The last LAM subdivision engaged in the changes in stress was the posteromedial section of the puborectal muscle. The mean stress values were 16.89 MPa at the end of fetal head extension. The LAM was elongated by nearly 2.5 times from its initial resting position. The cardinal movements of labor significantly affect the subsequent heterogeneous stress distribution in the LAM. The absolute stress values were highest in portions of the muscle that arise from the pubic bone. These areas are at the highest risk for muscle injuries with long-term complications.

  19. Dark matter as a ghost free conformal extension of Einstein theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barvinsky, A.O., E-mail: barvin@td.lpi.ru

    We discuss ghost free models of the recently suggested mimetic dark matter theory. This theory is shown to be a conformal extension of Einstein general relativity. Dark matter originates from gauging out its local Weyl invariance as an extra degree of freedom which describes a potential flow of the pressureless perfect fluid. For a positive energy density of this fluid the theory is free of ghost instabilities, which gives strong preference to stable configurations with a positive scalar curvature and trace of the matter stress tensor. Instabilities caused by caustics of the geodesic flow, inherent in this model, serve as a motivation for an alternative conformal extension of Einstein theory, based on the generalized Proca vector field. A potential part of this field modifies the inflationary stage in cosmology, whereas its rotational part at the post inflationary epoch might simulate rotating flows of dark matter.

  20. Modelling phosphorus (P), sulfur (S) and iron (Fe) interactions for dynamic simulations of anaerobic digestion processes.

    PubMed

    Flores-Alsina, Xavier; Solon, Kimberly; Kazadi Mbamba, Christian; Tait, Stephan; Gernaey, Krist V; Jeppsson, Ulf; Batstone, Damien J

    2016-05-15

    This paper proposes a series of extensions to functionally upgrade the IWA Anaerobic Digestion Model No. 1 (ADM1) to allow for plant-wide phosphorus (P) simulation. The close interplay between the P, sulfur (S) and iron (Fe) cycles requires a substantial (and unavoidable) increase in model complexity due to the involved three-phase physico-chemical and biological transformations. The ADM1 version, implemented in the plant-wide context provided by the Benchmark Simulation Model No. 2 (BSM2), is used as the basic platform (A0). Three different model extensions (A1, A2, A3) are implemented, simulated and evaluated. The first extension (A1) considers P transformations by accounting for the kinetic decay of polyphosphates (XPP) and potential uptake of volatile fatty acids (VFA) to produce polyhydroxyalkanoates (XPHA) by phosphorus accumulating organisms (XPAO). Two variant extensions (A2,1/A2,2) describe biological production of sulfides (SIS) by means of sulfate reducing bacteria (XSRB) utilising hydrogen only (autolithotrophically) or hydrogen plus organic acids (heterorganotrophically) as electron sources, respectively. These two approaches also consider a potential hydrogen sulfide (H2S) inhibition effect and stripping of H2S to the gas phase. The third extension (A3) accounts for chemical iron(III) (Fe3+) reduction to iron(II) (Fe2+) using hydrogen (H2) and sulfides (SIS) as electron donors. A set of pre/post interfaces between the Activated Sludge Model No. 2d (ASM2d) and ADM1 are furthermore proposed in order to allow for plant-wide (model-based) analysis and study of the interactions between the water and sludge lines. Simulation (A1 - A3) results show that the ratio between soluble/particulate P compounds strongly depends on the pH and cationic load, which determines the capacity to form (or not) precipitation products. Implementations A1 and A2,1/A2,2 lead to a reduction in the predicted methane/biogas production (and potential energy recovery) compared to reference ADM1 predictions (A0). This reduction is attributed to two factors: (1) loss of electron equivalents due to sulfate reduction by XSRB and storage of XPHA by XPAO; and (2) decrease of acetoclastic and hydrogenotrophic methanogenesis due to H2S inhibition. Model A3 shows the potential for iron to remove free SIS (and consequently inhibition) and instead promote iron sulfide (XFeS) precipitation. It also reduces the quantities of struvite and calcium phosphate that are formed due to its higher affinity for phosphate anions. This study provides a detailed analysis of the different model assumptions, the effect that operational/design conditions have on the model predictions and the practical implications of the proposed model extensions in view of plant-wide modelling/development of resource recovery strategies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Computational Nanomechanics of Carbon Nanotubes and Composites

    NASA Technical Reports Server (NTRS)

    Srivastava, Deepak; Wei, Chenyu; Cho, Kyeongjae; Biegel, Bryan (Technical Monitor)

    2002-01-01

    Nanomechanics of individual carbon and boron-nitride nanotubes and their application as reinforcing fibers in polymer composites have been reviewed through an interplay of theoretical modeling, computer simulations and experimental observations. The emphasis in this work is on elucidating the multiple length scales of the problems involved, and the different simulation techniques that are needed to address specific characteristics of individual nanotubes and nanotube polymer-matrix interfaces. Classical molecular dynamics simulations are shown to be sufficient to describe generic behavior such as strength and stiffness modulus, but are inadequate to describe the elastic limit and the nature of plastic buckling at large strains. Quantum molecular dynamics simulations are shown to bring out explicitly atomic-nature-dependent behavior of these nanoscale material objects that is not accessible either via continuum mechanics based descriptions or through classical molecular dynamics based simulations. As examples, we discuss the local plastic collapse of carbon nanotubes under axial compression and the anisotropic plastic buckling of boron-nitride nanotubes. Dependence of the yield strain on the strain rate is addressed through temperature dependent simulations; a transition-state-theory based model of the yield strain as a function of strain rate and simulation temperature is presented; and in all cases extensive comparisons are made with experimental observations. Mechanical properties of nanotube-polymer composite materials are simulated with diverse nanotube-polymer interface structures (with van der Waals interaction). The atomistic mechanisms of interface toughening for optimal load transfer through recycling, high-thermal-expansion and diffusion-coefficient composite formation above the glass transition temperature, and enhancement of Young's modulus on addition of nanotubes to polymer are discussed and compared with experimental observations.

  2. A note on the kappa statistic for clustered dichotomous data.

    PubMed

    Zhou, Ming; Yang, Zhao

    2014-06-30

    The kappa statistic is widely used to assess the agreement between two raters. Motivated by a simulation-based cluster bootstrap method to calculate the variance of the kappa statistic for clustered physician-patients dichotomous data, we investigate its special correlation structure and develop a new simple and efficient data generation algorithm. For the clustered physician-patients dichotomous data, based on the delta method and its special covariance structure, we propose a semi-parametric variance estimator for the kappa statistic. An extensive Monte Carlo simulation study is performed to evaluate the performance of the new proposal and five existing methods with respect to the empirical coverage probability, root-mean-square error, and average width of the 95% confidence interval for the kappa statistic. The variance estimator ignoring the dependence within a cluster is generally inappropriate, and the variance estimators from the new proposal, bootstrap-based methods, and the sampling-based delta method perform reasonably well for at least a moderately large number of clusters (e.g., K ⩾ 50). The new proposal and the sampling-based delta method provide convenient tools for efficient computations and non-simulation-based alternatives to the existing bootstrap-based methods. Moreover, the new proposal has acceptable performance even when the number of clusters is as small as K = 25. To illustrate the practical application of all the methods, one psychiatric research data set and two simulated clustered physician-patients dichotomous data sets are analyzed. Copyright © 2014 John Wiley & Sons, Ltd.
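    The cluster bootstrap baseline that motivates the paper's semi-parametric proposal can be sketched compactly: compute the kappa statistic on the pooled ratings, then resample whole clusters (not individual patients) so the within-cluster correlation survives in every replicate. The sketch below is a toy illustration of that idea only; the function names and data are invented, and it is not the authors' estimator.

```python
import random

def cohens_kappa(pairs):
    # pairs: list of (rater1, rater2) dichotomous ratings coded 0/1
    n = len(pairs)
    po = sum(a == b for a, b in pairs) / n   # observed agreement
    p1 = sum(a for a, _ in pairs) / n        # rater-1 marginal rate
    p2 = sum(b for _, b in pairs) / n        # rater-2 marginal rate
    pe = p1 * p2 + (1 - p1) * (1 - p2)       # chance agreement
    return (po - pe) / (1 - pe)

def cluster_bootstrap_var(clusters, n_boot=2000, seed=0):
    # clusters: list of clusters, each a list of (rater1, rater2) pairs;
    # resampling whole clusters preserves within-cluster correlation
    rng = random.Random(seed)
    kappas = []
    for _ in range(n_boot):
        sample = [rng.choice(clusters) for _ in clusters]
        pooled = [p for c in sample for p in c]
        kappas.append(cohens_kappa(pooled))
    m = sum(kappas) / n_boot
    return sum((k - m) ** 2 for k in kappas) / (n_boot - 1)
```

    In practice one would report the bootstrap standard error (the square root of this variance) alongside the point estimate of kappa.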

  3. On Market-Based Coordination of Thermostatically Controlled Loads With User Preference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Sen; Zhang, Wei; Lian, Jianming

    2014-12-15

    This paper presents a market-based control framework to coordinate a group of autonomous Thermostatically Controlled Loads (TCL) to achieve the system-level objectives with pricing incentives. The problem is formulated as maximizing the social welfare subject to feeder power constraint. It allows the coordinator to affect the aggregated power of a group of dynamical systems, and creates an interactive market where the users and the coordinator cooperatively determine the optimal energy allocation and energy price. The optimal pricing strategy is derived, which maximizes social welfare while respecting the feeder power constraint. The bidding strategy is also designed to compute the optimal price in real time (e.g., every 5 minutes) based on local device information. The coordination framework is validated with realistic simulations in GridLab-D. Extensive simulation results demonstrate that the proposed approach effectively maximizes the social welfare and decreases power congestion at key times.

  4. Integrated layout based Monte-Carlo simulation for design arc optimization

    NASA Astrophysics Data System (ADS)

    Shao, Dongbing; Clevenger, Larry; Zhuang, Lei; Liebmann, Lars; Wong, Robert; Culp, James

    2016-03-01

    Design rules are created considering a wafer fail mechanism with the relevant design levels under various design cases, and the values are set to cover the worst scenario. Because of this simplification and generalization, design rules hinder, rather than help, dense device scaling. As an example, SRAM designs always need extensive ground rule waivers. Furthermore, dense design also often involves a "design arc", a collection of design rules the sum of which equals the critical pitch defined by the technology. In a design arc, a single rule change can lead to a chain reaction of other rule violations. In this talk we present a methodology using Layout Based Monte-Carlo Simulation (LBMCS) with integrated multiple ground rule checks. We apply this methodology to the SRAM word line contact, and the result is a layout that has balanced wafer fail risks based on Process Assumptions (PAs). This work was performed at the IBM Microelectronics Div., Semiconductor Research and Development Center, Hopewell Junction, NY 12533.
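    The core loop of such a layout-based Monte-Carlo check can be illustrated with a toy model: draw overlay and critical-dimension (CD) perturbations from the process assumptions, derive the resulting space and enclosure for a contact, and count samples that violate any of the jointly checked ground rules. All names, distributions, and rule values below are invented for illustration; this is not the actual IBM methodology.

```python
import random

def lbmcs_fail_rate(nominal_space, nominal_enclosure, overlay_sigma, cd_sigma,
                    min_space=8.0, min_enclosure=2.0, trials=10000, seed=0):
    """Toy layout-based Monte-Carlo sketch: perturb drawn dimensions with
    Gaussian process assumptions and count samples violating any ground rule."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(trials):
        overlay = rng.gauss(0.0, overlay_sigma)   # level-to-level misalignment
        cd_bias = rng.gauss(0.0, cd_sigma)        # critical-dimension variation
        space = nominal_space - abs(overlay) - cd_bias
        enclosure = nominal_enclosure - abs(overlay) + cd_bias / 2
        # rules are checked jointly: one violated rule fails the whole sample
        if space < min_space or enclosure < min_enclosure:
            fails += 1
    return fails / trials
```

    A layout with generous margins should show a lower estimated fail rate than one drawn at the edge of the design arc, which is exactly the risk-balancing comparison the methodology enables.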

  5. Autonomous facial recognition system inspired by human visual system based logarithmical image visualization technique

    NASA Astrophysics Data System (ADS)

    Wan, Qianwen; Panetta, Karen; Agaian, Sos

    2017-05-01

    Autonomous facial recognition systems are widely used in real-life applications, such as homeland border security, law enforcement identification and authentication, and video-based surveillance analysis. Issues like low image quality, non-uniform illumination, and variations in pose and facial expression can impair the performance of recognition systems. To address the non-uniform illumination challenge, we present a novel robust autonomous facial recognition system inspired by the human visual system, based on the so-called logarithmical image visualization technique. In this paper, the proposed method, for the first time, utilizes the logarithmical image visualization technique coupled with the local binary pattern to perform discriminative feature extraction for facial recognition. The Yale database, the Yale-B database and the ATT database are used for computer simulation accuracy and efficiency testing. The extensive computer simulations demonstrate the method's efficiency, accuracy, and robustness of illumination invariance for facial recognition.
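    The feature-extraction pipeline described above pairs a logarithmic intensity mapping with the classic 8-neighbour local binary pattern (LBP). One helpful property, sketched below with a plain log1p stand-in for the paper's visualization technique, is that LBP compares each neighbour only to its center pixel, so any monotonic intensity transform (including a global illumination scaling) leaves the LBP histogram unchanged.

```python
import math

def log_visualize(image):
    # logarithmic intensity compression: a simplified stand-in for the
    # human-visual-system-inspired visualization technique
    return [[math.log1p(p) for p in row] for row in image]

def lbp_code(img, r, c):
    # classic 8-neighbour local binary pattern around pixel (r, c)
    center = img[r][c]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc] >= center:
            code |= 1 << bit
    return code

def lbp_histogram(image):
    # normalized 256-bin histogram of LBP codes over interior pixels
    img = log_visualize(image)
    hist = [0] * 256
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            hist[lbp_code(img, r, c)] += 1
    total = sum(hist) or 1
    return [h / total for h in hist]
```

    Because the histogram depends only on local intensity orderings, scaling every pixel by a constant (a crude model of uniform illumination change) yields exactly the same feature vector.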

  6. Dual-drive Mach-Zehnder modulator-based reconfigurable and transparent spectral conversion for dense wavelength division multiplexing transmissions

    NASA Astrophysics Data System (ADS)

    Mao, Mingzhi; Qian, Chen; Cao, Bingyao; Zhang, Qianwu; Song, Yingxiong; Wang, Min

    2017-09-01

    A digital-signal-processing-enabled dual-drive Mach-Zehnder modulator (DD-MZM)-based spectral converter is proposed and extensively investigated to realize dynamically reconfigurable and highly transparent spectral conversion. As another important innovation of the paper, to optimize the converter performance, the optimum operation conditions of the proposed converter are deduced, statistically simulated, and experimentally verified. The converter performance under these optimum conditions is verified by detailed numerical simulations and experiments in an intensity-modulation, direct-detection network in terms of frequency-detuning-range-dependent conversion efficiency, strict operation transparency to user signal characteristics, the impact of parasitic components on conversion performance, and nearly distortion-free converted waveforms. It is also found that the converter is highly robust to variations in input signal power, optical signal-to-noise ratio, extinction ratio, and driving signal frequency.

  7. Simulation and Experimental Studies of a 2.45GHz Magnetron Source for an SRF Cavity with Field Amplitude and Phase Controls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Haipeng; Plawski, Tomasz E.; Rimmer, Robert A.

    2016-06-01

    Phase lock to an SRF cavity by using an injection signal through the output waveguide of a magnetron has been demonstrated [1, 3]. Amplitude control using magnetic field trimming and anode voltage modulation has been studied using MATLAB/Simulink simulations [2]. Based on these, we are planning to use an FPGA based digital LLRF system, which allows applying various types of control algorithms in order to achieve the required accelerating field stability. Since the 1497 MHz magnetron is still in the design stage, proof-of-principle measurements of a commercial 2450 MHz magnetron are carried out to characterize the anode I-V curve, output power (the tube electronic efficiency), frequency dependence on the anode current (frequency pushing) and the Rieke diagram (frequency pulling by the reactive load). Based on early Simulink simulation, experimental data and an extension of the Adler equation governing injection phase stability by Chen's model, the specifications of the new LLRF control chassis for both the 2450 and 1497 MHz systems are presented in this paper.

  8. Complete wetting of graphene by biological lipids

    NASA Astrophysics Data System (ADS)

    Luan, Binquan; Huynh, Tien; Zhou, Ruhong

    2016-03-01

    Graphene nanosheets have been demonstrated to extract large amounts of lipid molecules directly out of the cell membrane of bacteria and thus cause serious damage to the cell's integrity. This interesting phenomenon, however, is so far not well understood theoretically. Here through extensive molecular dynamics simulations and theoretical analyses, we show that this phenomenon can be categorized as a complete wetting of graphene by membrane lipids in water. A wetting-based theory was utilized to associate the free energy change during the microscopic extraction of a lipid with the spreading parameter for the macroscopic wetting. With a customized thermodynamic cycle for detailed energetics, we show that the dispersive adhesion between graphene and lipids plays a dominant role during this extraction as manifested by the curved graphene. Our simulation results suggest that biological lipids can completely wet the concave, flat or even convex (with a small curvature) surface of a graphene sheet. Electronic supplementary information (ESI) available: The movie showing the simulation trajectory for the extraction of lipids from the membrane. See DOI: 10.1039/C6NR00202A

  9. Mesoscopic simulations of shock-to-detonation transition in reactive liquid high explosive

    NASA Astrophysics Data System (ADS)

    Maillet, J. B.; Bourasseau, E.; Desbiens, N.; Vallverdu, G.; Stoltz, G.

    2011-12-01

    An extension of the model described in a previous work (see Maillet J. B. et al., EPL, 78 (2007) 68001) based on Dissipative Particle Dynamics is presented and applied to a liquid high explosive (HE), with thermodynamic properties mimicking those of liquid nitromethane. Large scale nonequilibrium simulations of the reacting liquid HE with a model kinetics under sustained shock conditions allow a better understanding of the shock-to-detonation transition in homogeneous explosives. Moreover, the propagation of the reactive wave appears discontinuous, since ignition points in the shocked material can be activated by the compressive waves emitted from the onset of chemical reactions.

  10. A formal language for the specification and verification of synchronous and asynchronous circuits

    NASA Technical Reports Server (NTRS)

    Russinoff, David M.

    1993-01-01

    A formal hardware description language for the intended application of verifiable asynchronous communication is described. The language is developed within the logical framework of the Nqthm system of Boyer and Moore and is based on the event-driven behavioral model of VHDL, including the basic VHDL signal propagation mechanisms, the notion of simulation deltas, and the VHDL simulation cycle. A core subset of the language corresponds closely with a subset of VHDL and is adequate for the realistic gate-level modeling of both combinational and sequential circuits. Various extensions to this subset provide means for convenient expression of behavioral circuit specifications.
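    The event-driven semantics mentioned above (simulation deltas iterated until no signal changes within the current time step) can be sketched as a tiny evaluator: each delta cycle recomputes every driven signal from the previous snapshot, applies all changes simultaneously, and stops at quiescence. This is an illustrative toy in the spirit of the VHDL simulation cycle, not the paper's Nqthm formalization; the driver representation is invented.

```python
def simulate(drivers, signals, max_deltas=100):
    """Run delta cycles until quiescence: every driven signal is recomputed
    from the previous snapshot, and all updates take effect at once."""
    for _ in range(max_deltas):
        new = {name: fn(signals) for name, fn in drivers.items()}
        changed = {k: v for k, v in new.items() if signals.get(k) != v}
        if not changed:
            return signals          # quiescent: no more delta cycles needed
        signals = {**signals, **changed}
    raise RuntimeError("no quiescence (possible zero-delay oscillation)")
```

    For a chain of two zero-delay inverters, b = not a and c = not b, the evaluator needs two delta cycles to propagate a change on a through to c, mirroring how simulation deltas order zero-delay events.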

  11. Cellerator: extending a computer algebra system to include biochemical arrows for signal transduction simulations

    NASA Technical Reports Server (NTRS)

    Shapiro, Bruce E.; Levchenko, Andre; Meyerowitz, Elliot M.; Wold, Barbara J.; Mjolsness, Eric D.

    2003-01-01

    Cellerator describes single and multi-cellular signal transduction networks (STN) with a compact, optionally palette-driven, arrow-based notation to represent biochemical reactions and transcriptional activation. Multi-compartment systems are represented as graphs with STNs embedded in each node. Interactions include mass-action, enzymatic, allosteric and connectionist models. Reactions are translated into differential equations and can be solved numerically to generate predictive time courses or output as systems of equations that can be read by other programs. Cellerator simulations are fully extensible and portable to any operating system that supports Mathematica, and can be indefinitely nested within larger data structures to produce highly scalable models.
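    The arrow-to-equation translation that Cellerator performs can be illustrated for the simplest case, mass-action kinetics: each reaction contributes a rate equal to its rate constant times the product of its reactant concentrations, which is subtracted from every reactant's derivative and added to every product's. The sketch below is a minimal stand-in (plain Python rather than Mathematica) with invented data structures, not Cellerator's actual notation.

```python
def mass_action_odes(reactions, conc):
    """Translate reactions like (['A', 'B'], ['C'], k) — i.e. A + B -> C with
    rate constant k — into the mass-action right-hand side d[X]/dt."""
    deriv = {s: 0.0 for s in conc}
    for lhs, rhs, k in reactions:
        rate = k
        for s in lhs:
            rate *= conc[s]        # mass action: k times product of reactants
        for s in lhs:
            deriv[s] -= rate       # reactants are consumed
        for s in rhs:
            deriv[s] += rate       # products are produced
    return deriv

def euler_step(reactions, conc, dt):
    # one explicit Euler step of the generated ODE system
    d = mass_action_odes(reactions, conc)
    return {s: conc[s] + dt * d[s] for s in conc}
```

    Chaining such steps produces the predictive time courses mentioned above; a real translation would emit the symbolic equations for a stiff ODE solver instead of stepping them explicitly.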

  12. Limits of quantitation - Yet another suggestion

    NASA Astrophysics Data System (ADS)

    Carlson, Jill; Wysoczanski, Artur; Voigtman, Edward

    2014-06-01

    The work presented herein suggests that the limit of quantitation concept may be rendered substantially less ambiguous and ultimately more useful as a figure of merit by basing it upon the significant figure and relative measurement error ideas due to Coleman, Auses and Gram, coupled with the correct instantiation of Currie's detection limit methodology. Simple theoretical results are presented for a linear, univariate chemical measurement system with homoscedastic Gaussian noise, and these are tested against both Monte Carlo computer simulations and laser-excited molecular fluorescence experimental results. Good agreement among experiment, theory and simulation is obtained and an easy extension to linearly heteroscedastic Gaussian noise is also outlined.
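    The relative-measurement-error view of the quantitation limit lends itself to a quick Monte Carlo check: for a linear, blank-corrected response y = b·x with homoscedastic Gaussian noise of standard deviation sigma, back-calculated concentrations at x have relative standard deviation sigma/(b·x), so the concentration at which this equals a target RME is x_Q = sigma/(RME·b). The parameter values below are invented for illustration, and this simplified setup is not the authors' exact formulation.

```python
import random
import statistics

def loq_simulation(slope, sigma, rme_target=0.10, n_meas=20000, seed=0):
    """Monte Carlo check of an RME-based limit of quantitation:
    repeated measurements at x_Q = sigma / (rme_target * slope) should show
    a relative standard deviation close to rme_target."""
    rng = random.Random(seed)
    x_q = sigma / (rme_target * slope)    # theoretical quantitation limit
    # simulate blank-corrected responses at x_q and back-calculate concentration
    xhat = [(slope * x_q + rng.gauss(0.0, sigma)) / slope
            for _ in range(n_meas)]
    rsd = statistics.stdev(xhat) / statistics.mean(xhat)
    return x_q, rsd
```

    Agreement between the empirical relative standard deviation and the target RME is the kind of theory-versus-simulation consistency the paper reports.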

  13. Design and implementation of an internet-based electrical engineering laboratory.

    PubMed

    He, Zhenlei; Shen, Zhangbiao; Zhu, Shanan

    2014-09-01

    This paper describes an internet-based electrical engineering laboratory (IEE-Lab) with virtual and physical experiments at Zhejiang University. In order to synthesize the advantages of both experiment styles, the IEE-Lab adopts a Client/Server/Application framework and combines the virtual and physical experiments. The design and workflow of the IEE-Lab are introduced. The analog electronics experiment is taken as an example to show the Flex plug-in design, data communication based on XML (Extensible Markup Language), experiment simulation modeled in Modelica, and the design of control terminals. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  14. Simulating pre-galactic metal enrichment for JWST deep-field observations

    NASA Astrophysics Data System (ADS)

    Jaacks, Jason

    2017-08-01

    We propose to create a new suite of mesoscale cosmological volume simulations with custom built sub-grid physics in which we independently track the contribution from Population III and Population II star formation to the total metals in the interstellar medium (ISM) of the first galaxies, and in the diffuse IGM at an epoch prior to reionization. These simulations will fill a gap in our simulation knowledge about chemical enrichment in the pre-reionization universe, which is a crucial need given the impending observational push into this epoch with near-future ground and space-based telescopes. This project is the natural extension of our successful Cycle 24 theory proposal (HST-AR-14569.001-A; PI Jaacks) in which we developed a new Pop III star formation sub-grid model which is currently being utilized to study the baseline metal enrichment of pre-reionization systems.

  15. Comparisons between GRNTRN simulations and beam measurements of proton lateral broadening distributions

    NASA Astrophysics Data System (ADS)

    Mertens, Christopher; Moyers, Michael; Walker, Steven; Tweed, John

    Recent developments in NASA's High Charge and Energy Transport (HZETRN) code have included lateral broadening of primary ion beams due to small-angle multiple Coulomb scattering, and coupling of the ion-nuclear scattering interactions with energy loss and straggling. The new version of HZETRN based on Green function methods, GRNTRN, is suitable for modeling transport with both space environment and laboratory boundary conditions. Multiple scattering processes are a necessary extension to GRNTRN in order to accurately model ion beam experiments, to simulate the physical and biological-effective radiation dose, and to develop new methods and strategies for light ion radiation therapy. In this paper we compare GRNTRN simulations of proton lateral scattering distributions with beam measurements taken at Loma Linda Medical University. The simulated and measured lateral proton distributions will be compared for a 250 MeV proton beam on aluminum, polyethylene, polystyrene, bone, iron, and lead target materials.

  16. Simulation of nanoparticle-mediated near-infrared thermal therapy using GATE

    PubMed Central

    Cuplov, Vesna; Pain, Frédéric; Jan, Sébastien

    2017-01-01

    Application of nanotechnology for biomedicine in cancer therapy allows for direct delivery of anticancer agents to tumors. An example of such therapies is the nanoparticle-mediated near-infrared hyperthermia treatment. In order to investigate the influence of nanoparticle properties on the spatial distribution of heat in the tumor and healthy tissues, accurate simulations are required. The Geant4 Application for Emission Tomography (GATE) open-source simulation platform, based on the Geant4 toolkit, is widely used by the research community involved in molecular imaging, radiotherapy and optical imaging. We present an extension of GATE that can model nanoparticle-mediated hyperthermal therapy as well as simple heat diffusion in biological tissues. This new feature of GATE combined with optical imaging allows for the simulation of a theranostic scenario in which the patient is injected with theranostic nanosystems that can simultaneously deliver therapeutic (i.e. hyperthermia therapy) and imaging agents (i.e. fluorescence imaging). PMID:28663855
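    The heat-diffusion capability described above can be caricatured in one dimension: an explicit finite-difference step of the diffusion equation plus a localized deposition term standing in for the optically heated nanoparticles. This is not GATE's Geant4-based implementation; the grid size, stability ratio, and source term below are invented for the sketch.

```python
def heat_diffusion_1d(n=50, steps=500, alpha=0.1, dx=1.0, dt=1.0,
                      source_idx=25, q=0.5):
    """Explicit finite-difference sketch of 1-D heat diffusion with a
    localized (nanoparticle-like) heat source; the scheme is stable when
    r = alpha * dt / dx**2 <= 0.5."""
    T = [0.0] * n                 # temperature rise above baseline tissue
    r = alpha * dt / dx ** 2
    for _ in range(steps):
        new = T[:]
        for i in range(1, n - 1):
            new[i] = T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
        new[source_idx] += q * dt  # heat deposited at the source voxel
        T = new                    # boundary voxels held at baseline (0)
    return T
```

    The resulting profile peaks at the source and decays monotonically into the surrounding "tissue", the spatial heat distribution whose dependence on nanoparticle properties the extension is designed to study.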

  17. Imposing a Lagrangian Particle Framework on an Eulerian Hydrodynamics Infrastructure in Flash

    NASA Technical Reports Server (NTRS)

    Dubey, A.; Daley, C.; ZuHone, J.; Ricker, P. M.; Weide, K.; Graziani, C.

    2012-01-01

    In many astrophysical simulations, both Eulerian and Lagrangian quantities are of interest. For example, in a galaxy cluster merger simulation, the intracluster gas can have Eulerian discretization, while dark matter can be modeled using particles. FLASH, a component-based scientific simulation code, superimposes a Lagrangian framework atop an adaptive mesh refinement Eulerian framework to enable such simulations. The discretization of the field variables is Eulerian, while the Lagrangian entities occur in many different forms including tracer particles, massive particles, charged particles in particle-in-cell mode, and Lagrangian markers to model fluid-structure interactions. These widely varying roles for Lagrangian entities are possible because of the highly modular, flexible, and extensible architecture of the Lagrangian framework. In this paper, we describe the Lagrangian framework in FLASH in the context of two very different applications, Type Ia supernovae and galaxy cluster mergers, which use the Lagrangian entities in fundamentally different ways.

  18. Lightweight Object Oriented Structure analysis: Tools for building Tools to Analyze Molecular Dynamics Simulations

    PubMed Central

    Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan

    2014-01-01

    LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784

  19. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    PubMed

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.

  20. Imposing a Lagrangian Particle Framework on an Eulerian Hydrodynamics Infrastructure in FLASH

    NASA Astrophysics Data System (ADS)

    Dubey, A.; Daley, C.; ZuHone, J.; Ricker, P. M.; Weide, K.; Graziani, C.

    2012-08-01

    In many astrophysical simulations, both Eulerian and Lagrangian quantities are of interest. For example, in a galaxy cluster merger simulation, the intracluster gas can have Eulerian discretization, while dark matter can be modeled using particles. FLASH, a component-based scientific simulation code, superimposes a Lagrangian framework atop an adaptive mesh refinement Eulerian framework to enable such simulations. The discretization of the field variables is Eulerian, while the Lagrangian entities occur in many different forms including tracer particles, massive particles, charged particles in particle-in-cell mode, and Lagrangian markers to model fluid-structure interactions. These widely varying roles for Lagrangian entities are possible because of the highly modular, flexible, and extensible architecture of the Lagrangian framework. In this paper, we describe the Lagrangian framework in FLASH in the context of two very different applications, Type Ia supernovae and galaxy cluster mergers, which use the Lagrangian entities in fundamentally different ways.

  1. Spatially resolved photodiode response for simulating precise interferometers.

    PubMed

    Fernández Barranco, Germán; Tröbs, Michael; Müller, Vitali; Gerberding, Oliver; Seifert, Frank; Heinzel, Gerhard

    2016-08-20

    Quadrant photodiodes (QPDs) are used in laser interferometry systems to simultaneously detect longitudinal displacement of test masses and angular misalignment between the two interfering beams. The latter is achieved by means of the differential wavefront sensing (DWS) technique, which provides ultra-high precision for measuring angular displacements. We have developed a setup to obtain the spatially resolved response of QPDs that, together with an extension of the simulation software IfoCAD, allows us to use the measured response in simulations and accurately predict the desired longitudinal and DWS phase observables. Three different commercial off-the-shelf QPD candidates for space-based interferometry were characterized. The measured response of one QPD was used in optical simulations. Nonuniformities in the response of the device and crosstalk between segments do not introduce significant variations in the longitudinal and DWS measurands with respect to the standard case when a uniform QPD without crosstalk is used.
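    The DWS observable is essentially the difference between the interference phases detected on opposing halves of the QPD. A simplified numerical sketch is given below; it assumes a uniform diode response with no segment gaps or crosstalk and two identical Gaussian beams — precisely the idealizations the measured responses in the paper go beyond:

```python
import numpy as np

def dws_signal(tilt_rad, w=1e-3, wavelength=1.064e-6, n=400):
    """Differential wavefront sensing phase between the left and right
    halves of an idealized quadrant photodiode, for two identical
    Gaussian beams with a small relative tilt about the vertical axis."""
    k = 2 * np.pi / wavelength
    x = np.linspace(-3 * w, 3 * w, n)
    X, Y = np.meshgrid(x, x)
    weight = np.exp(-2 * (X**2 + Y**2) / w**2)      # Gaussian overlap weight
    field = weight * np.exp(1j * k * tilt_rad * X)  # tilted-beam interference term
    left = field[:, x < 0].sum()                    # complex overlap, left half
    right = field[:, x >= 0].sum()                  # complex overlap, right half
    return np.angle(left) - np.angle(right)
```

    For small tilts the DWS phase is linear in the misalignment angle, which is what makes it such a precise angular readout.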

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Digital instrumentation and control system techniques are being introduced in newly constructed research reactors and in the life extension of older research reactors. Digital systems are easy to change and optimize, but a validated process for them is required. Also, to reduce project risk and cost, we have to make sure that the configuration and control functions are correct before the commissioning phase of a research reactor. For this purpose, simulators have been widely used in developing control systems in the automotive and aerospace industries. In the literature, however, very few reports can be found regarding tests of the control system of a research reactor with a simulator. Therefore, this paper proposes a simulation platform to verify the performance of the RRS (Reactor Regulating System) for a research reactor. The simulation platform consists of a reactor simulation model and an interface module. It was applied to the I and C upgrade project of a TRIGA reactor, and many problems of the RRS configuration were found and solved. This proved that dynamic performance testing based on a simulator enables significant time saving and improves economics and quality for the RRS in the system test phase. (authors)

  3. Benchmarking of Advanced Control Strategies for a Simulated Hydroelectric System

    NASA Astrophysics Data System (ADS)

    Finotti, S.; Simani, S.; Alvisi, S.; Venturini, M.

    2017-01-01

    This paper analyses and develops the design of advanced control strategies for a typical hydroelectric plant during unsteady conditions, performed in the Matlab and Simulink environments. The hydraulic system consists of a high water head and a long penstock with upstream and downstream surge tanks, and is equipped with a Francis turbine. The nonlinear characteristics of hydraulic turbine and the inelastic water hammer effects were considered to calculate and simulate the hydraulic transients. With reference to the control solutions addressed in this work, the proposed methodologies rely on data-driven and model-based approaches applied to the system under monitoring. Extensive simulations and comparisons serve to determine the best solution for the development of the most effective, robust and reliable control tool when applied to the considered hydraulic system.

  4. Primary proton and helium spectra around the knee observed by the Tibet air-shower experiment

    NASA Astrophysics Data System (ADS)

    Jing, Huang; Tibet ASγ Collaboration

    A hybrid experiment was carried out to study the cosmic-ray primary composition in the 'knee' energy region. The experimental set-up consists of the Tibet-II air shower array (AS), the emulsion chamber (EC) and the burst detector (BD), which are operated simultaneously and provide information on the primary species. The experiment was carried out at Yangbajing (4,300 m a.s.l., 606 g/cm2) in Tibet during the period from 1996 through 1999. We have already reported the primary proton flux around the knee region based on the simulation code COSMOS. In this paper, we present the primary proton and helium spectra around the knee region. We also extensively examine the simulation codes COSMOS ad-hoc and CORSIKA with the interaction models QGSJET01, DPMJET 2.55, SIBYLL 2.1, VENUS 4.125, HDPM, and NEXUS 2. Based on these calculations, we briefly discuss the systematic errors in our experimental results due to the Monte Carlo simulation.

  5. Full-envelope aerodynamic modeling of the Harrier aircraft

    NASA Technical Reports Server (NTRS)

    Mcnally, B. David

    1986-01-01

    A project to identify a full-envelope model of the YAV-8B Harrier using flight-test and parameter identification techniques is described. As part of the research in advanced control and display concepts for V/STOL aircraft, a full-envelope aerodynamic model of the Harrier is identified, using mathematical model structures and parameter identification methods. A global-polynomial model structure is also used as a basis for the identification of the YAV-8B aerodynamic model. State estimation methods are used to ensure flight data consistency prior to parameter identification. Equation-error methods are used to identify model parameters. A fixed-base simulator is used extensively to develop flight test procedures and to validate parameter identification software. Using simple flight maneuvers, a simulated data set was created covering the YAV-8B flight envelope from about 0.3 to 0.7 Mach and about -5 to 15 deg angle of attack. A singular value decomposition implementation of the equation-error approach produced good parameter estimates based on this simulated data set.
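    The equation-error approach reduces each aerodynamic-coefficient fit to a linear least-squares problem, which the abstract says is solved through a singular value decomposition. A generic sketch of that solver (with a hypothetical regressor matrix, not the YAV-8B model structure) is:

```python
import numpy as np

def equation_error_svd(X, y, rcond=1e-10):
    """Least-squares estimate of theta minimizing ||X @ theta - y|| via
    singular value decomposition.  Singular values below rcond * s_max
    are truncated, which guards against ill-conditioned regressors."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    keep = s > rcond * s[0]
    return Vt.T[:, keep] @ ((U[:, keep].T @ y) / s[keep])
```

    The truncation step is what makes the SVD route more robust than solving the normal equations directly when some regressors are nearly collinear.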

  6. Employing multi-GPU power for molecular dynamics simulation: an extension of GALAMOST

    NASA Astrophysics Data System (ADS)

    Zhu, You-Liang; Pan, Deng; Li, Zhan-Wei; Liu, Hong; Qian, Hu-Jun; Zhao, Yang; Lu, Zhong-Yuan; Sun, Zhao-Yan

    2018-04-01

    We describe the algorithm for employing multi-GPU power, on the basis of Message Passing Interface (MPI) domain decomposition, in a molecular dynamics code, GALAMOST, which is designed for the coarse-grained simulation of soft matter. The multi-GPU version is developed from our previous single-GPU version. In multi-GPU runs, each GPU takes charge of one domain and runs the single-GPU code path. The communication between neighbouring domains follows an algorithm similar to that of the CPU-based code LAMMPS, but is optimised specifically for GPUs. We employ a memory-saving design which enlarges the maximum system size attainable on the same device. An optimisation algorithm is employed to prolong the update period of the neighbour list. We demonstrate good performance of multi-GPU runs on the simulation of a Lennard-Jones liquid, a dissipative particle dynamics liquid, a polymer-nanoparticle composite, and two-patch particles on a workstation. Good scaling across many nodes of a cluster is presented for two-patch particles.
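    In an MPI domain decomposition, each rank owns one spatial domain and must receive "halo" (ghost) copies of the boundary data of its neighbours before each force computation. A serial toy version of that exchange pattern for a 1-D periodic decomposition — a plain-Python stand-in, not GALAMOST's actual communication code — might look like:

```python
def halo_exchange(domains, width=1):
    """For each domain in a 1-D periodic decomposition, gather copies of
    the `width` outermost cells of its left and right neighbours -- the
    data an MPI rank would receive as ghost/halo cells."""
    n = len(domains)
    halos = []
    for i in range(n):
        left_ghost = domains[(i - 1) % n][-width:]   # right edge of left neighbour
        right_ghost = domains[(i + 1) % n][:width]   # left edge of right neighbour
        halos.append((left_ghost, right_ghost))
    return halos
```

    In the real code each pair of ghost regions corresponds to an MPI send/receive between neighbouring ranks, overlapped with GPU computation where possible.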

  7. Neutralizer Hollow Cathode Simulations and Comparisons with Ground Test Data

    NASA Technical Reports Server (NTRS)

    Mikellides, Ioannis G.; Snyder, John S.; Goebel, Dan M.; Katz, Ira; Herman, Daniel A.

    2009-01-01

    The fidelity of electric propulsion physics-based models depends largely on the validity of their predictions over a range of operating conditions and geometries. In general, increased complexity of the physics requires more extensive comparisons with laboratory data to identify the region(s) that lie outside the validity of the model assumptions and to quantify the uncertainties within its range of application. This paper presents numerical simulations of neutralizer hollow cathodes at various operating conditions and orifice sizes. The simulations were performed using a two-dimensional axisymmetric model that solves numerically a relatively extensive system of conservation laws for the partially ionized gas in these devices. A summary of the comparisons between simulation results and Langmuir probe measurements is provided. The model has also been employed to provide insight into recent ground test observations of the neutralizer cathode in NEXT. It is found that a likely cause of the observed keeper voltage drop is cathode orifice erosion. However, due to the small magnitude of this change, approximately 0.5 V (less than 5% of the beginning-of-life value) over 10 khrs, and in light of the large uncertainties of the cathode material sputtering yield at low ion energies, other causes cannot be excluded. Preliminary simulations to understand transition to plume mode suggest that in the range of 3-5 sccm the existing 2-D model reproduces fairly well the rise of the keeper voltage in the NEXT neutralizer as observed in the laboratory. At lower flow rates the simulation produces oscillations in the keeper current and voltage that require prohibitively small time-steps to resolve with the existing algorithms.

  8. Hippocampome.org: a knowledge base of neuron types in the rodent hippocampus.

    PubMed

    Wheeler, Diek W; White, Charise M; Rees, Christopher L; Komendantov, Alexander O; Hamilton, David J; Ascoli, Giorgio A

    2015-09-24

    Hippocampome.org is a comprehensive knowledge base of neuron types in the rodent hippocampal formation (dentate gyrus, CA3, CA2, CA1, subiculum, and entorhinal cortex). Although the hippocampal literature is remarkably information-rich, neuron properties are often reported with incompletely defined and notoriously inconsistent terminology, creating a formidable challenge for data integration. Our extensive literature mining and data reconciliation identified 122 neuron types based on neurotransmitter, axonal and dendritic patterns, synaptic specificity, electrophysiology, and molecular biomarkers. All ∼3700 annotated properties are individually supported by specific evidence (∼14,000 pieces) in peer-reviewed publications. Systematic analysis of this unprecedented amount of machine-readable information reveals novel correlations among neuron types and properties, the potential connectivity of the full hippocampal circuitry, and outstanding knowledge gaps. User-friendly browsing and online querying of Hippocampome.org may aid design and interpretation of both experiments and simulations. This powerful, simple, and extensible neuron classification endeavor is unique in its detail, utility, and completeness.

  9. Extensible Probabilistic Repository Technology (XPRT)

    DTIC Science & Technology

    2004-10-01

    projects, such as Centaurus, Evidence Data Base (EDB), etc., others were fabricated, such as INS and FED, while others contain data from the open...Google Web Report Unlimited SOAP API News BBC News Unlimited WEB RSS 1.0 Centaurus Person Demographics 204,402 people from 240 countries...objects of the domain ontology map to the various simulated data-sources. For example, the PersonDemographics are stored in the Centaurus database, while

  10. Three-Dimensional Data Registration Based on Human Perception

    DTIC Science & Technology

    2006-01-01

    sets. The new algorithm was tested extensively on simulated sensor images in several scenarios key to successful application to autonomous ground...that humans perceive visual images, an assumption of stationarity can be applied to the data sets, with to compensate for any new data...proximity to each other that an assumption of, or preference for, stationarity would require corresponding data in the data sets that is not new

  11. Numerical Experiments Investigating the Source of Explosion S-Waves

    DTIC Science & Technology

    2007-09-01

    simulations in this study are based on the well-recorded 1993 Nonproliferation Experiment (NPE) (chemical kiloton). A regional 3-dimensional model...1-kiloton chemical explosion at the NTS. NPE details and research reports can be found in Denny and Stull (1994). Figure 3 shows the extensive...T., D. Helmberger, and G. Engen (1985). Evidence for tectonic release from underground nuclear explosions in long period S waves, Bull. Seismol. Soc

  12. Agent Based Intelligence in a Tetrahedral Rover

    NASA Technical Reports Server (NTRS)

    Phelps, Peter; Truszkowski, Walt

    2007-01-01

    A tetrahedron is a 4-node 6-strut pyramid structure which is being used by the NASA - Goddard Space Flight Center as the basic building block for a new approach to robotic motion. The struts are extendable; it is by the sequence of activities: strut-extension, changing the center of gravity and falling that the tetrahedron "moves". Currently, strut-extension is handled by human remote control. There is an effort underway to make the movement of the tetrahedron autonomous, driven by an attempt to achieve a goal. The approach being taken is to associate an intelligent agent with each node. Thus, the autonomous tetrahedron is realized as a constrained multi-agent system, where the constraints arise from the fact that between any two agents there is an extendible strut. The hypothesis of this work is that, by proper composition of such automated tetrahedra, robotic structures of various levels of complexity can be developed which will support more complex dynamic motions. This is the basis of the new approach to robotic motion which is under investigation. A Java-based simulator for the single tetrahedron, realized as a constrained multi-agent system, has been developed and evaluated. This paper reports on this project and presents a discussion of the structure and dynamics of the simulator.

  13. NITPICK: peak identification for mass spectrometry data

    PubMed Central

    Renard, Bernhard Y; Kirchner, Marc; Steen, Hanno; Steen, Judith AJ; Hamprecht, Fred A

    2008-01-01

    Background The reliable extraction of features from mass spectra is a fundamental step in the automated analysis of proteomic mass spectrometry (MS) experiments. Results This contribution proposes a sparse template regression approach to peak picking called NITPICK. NITPICK is a Non-greedy, Iterative Template-based peak PICKer that deconvolves complex overlapping isotope distributions in multicomponent mass spectra. NITPICK is based on fractional averagine, a novel extension to Senko's well-known averagine model, and on a modified version of sparse, non-negative least angle regression, for which a suitable, statistically motivated early stopping criterion has been derived. The strength of NITPICK is the deconvolution of overlapping mixture mass spectra. Conclusion Extensive comparative evaluation has been carried out and results are provided for simulated and real-world data sets. NITPICK outperforms pepex, to date the only alternate, publicly available, non-greedy feature extraction routine. NITPICK is available as a software package for the R programming language and can be downloaded from . PMID:18755032
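    The core computation in template-based peak picking is a non-negative regression of the spectrum onto candidate isotope-pattern templates. A minimal sketch of that idea, using plain cyclic coordinate descent on synthetic Gaussian templates (an illustration of non-negative least squares, not NITPICK's sparse least-angle-regression algorithm), is:

```python
import numpy as np

def nnls_cd(A, b, n_iter=500):
    """Non-negative least squares min ||A x - b||, x >= 0, by cyclic
    coordinate descent: each coefficient is set to its unconstrained
    optimum given the others, then clipped at zero."""
    AtA = A.T @ A
    Atb = A.T @ b
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        for j in range(len(x)):
            r = Atb[j] - AtA[j] @ x + AtA[j, j] * x[j]
            x[j] = max(0.0, r / AtA[j, j])
    return x

# Synthetic spectrum: Gaussian "isotope pattern" templates at four m/z positions
mz = np.arange(100.0)
centers = [20.0, 40.0, 60.0, 80.0]
A = np.column_stack([np.exp(-0.5 * ((mz - c) / 3.0)**2) for c in centers])
b = 2.0 * A[:, 1] + 5.0 * A[:, 2]      # mixture of templates 1 and 2 only
```

    The non-negativity constraint is what keeps the deconvolution physically meaningful: a template cannot contribute a negative abundance.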

  14. Variance change point detection for fractional Brownian motion based on the likelihood ratio test

    NASA Astrophysics Data System (ADS)

    Kucharczyk, Daniel; Wyłomańska, Agnieszka; Sikora, Grzegorz

    2018-01-01

    Fractional Brownian motion is one of the main stochastic processes used for describing the long-range dependence phenomenon in self-similar processes. It appears that for many real time series, characteristics of the data change significantly over time. Such behaviour can be observed in many applications, including physical and biological experiments. In this paper, we present a new technique for critical change point detection for cases where the data under consideration are driven by fractional Brownian motion with a time-changed diffusion coefficient. The proposed methodology is based on the likelihood ratio approach and represents an extension of a similar methodology used for Brownian motion, a process with independent increments. Here, we also propose a statistical test for the significance of the estimated critical point. In addition, an extensive simulation study is provided to assess the performance of the proposed method.
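    For the simpler case of independent zero-mean Gaussian increments (the Brownian-motion baseline that the paper extends to fractional Brownian motion), the likelihood-ratio scan for a single variance change point can be sketched as:

```python
import numpy as np

def variance_changepoint(x):
    """Likelihood-ratio scan for one variance change point in a zero-mean
    Gaussian sequence.  Returns (k_hat, stat): the change is estimated to
    occur after index k_hat, and stat is the maximized log-likelihood
    ratio against the no-change model."""
    n = len(x)
    s_all = np.mean(x**2)                  # pooled variance estimate
    best_k, best_stat = None, -np.inf
    for k in range(2, n - 2):
        s1 = np.mean(x[:k]**2)             # variance before candidate point
        s2 = np.mean(x[k:]**2)             # variance after candidate point
        if s1 <= 0 or s2 <= 0:
            continue
        stat = n*np.log(s_all) - k*np.log(s1) - (n - k)*np.log(s2)
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat
```

    The paper's contribution is the analogous statistic under long-range-dependent (fBm) increments, together with a significance test for the estimated point.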

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Mark J.; Saleh, Omar A.

    We calculated the force-extension curves for a flexible polyelectrolyte chain with varying charge separations by performing Monte Carlo simulations of a 5000-bead chain using a screened Coulomb interaction. At all charge separations, the force-extension curves exhibit a Pincus-like scaling regime at intermediate forces and a logarithmic regime at large forces. As the charge separation increases, the Pincus regime shifts to a larger range of forces and the logarithmic regime starts at larger forces. We also found that the force-extension curve for the corresponding neutral chain has a logarithmic regime. Decreasing the diameter of the beads in the neutral chain simulations removed the logarithmic regime, and the force-extension curve tends to the freely jointed chain limit. In conclusion, this result shows that only excluded volume is required for the high-force logarithmic regime to occur.
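    The freely jointed chain limit mentioned at the end has a closed-form force-extension relation, the Langevin function. A small numeric sketch (the generic polymer-physics formula, with an assumed Kuhn length) is:

```python
import math

def fjc_extension(force_pN, kuhn_nm, T_K=300.0):
    """Relative extension x/L of an ideal freely jointed chain under
    force f: x/L = coth(f*b/kT) - kT/(f*b), the Langevin function,
    where b is the Kuhn (segment) length."""
    kT = 0.0138065 * T_K          # Boltzmann constant in pN*nm per kelvin
    u = force_pN * kuhn_nm / kT   # dimensionless force f*b/kT
    return 1.0 / math.tanh(u) - 1.0 / u
```

    At low force the extension is linear (x/L ≈ u/3, entropic elasticity); at high force it saturates toward full extension as 1 - 1/u, with no logarithmic regime — consistent with the abstract's finding that excluded volume is needed for that regime.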

  16. Alborz-I array: A simulation on performance and properties of the array around the knee of the cosmic ray spectrum

    NASA Astrophysics Data System (ADS)

    Abdollahi, Soheila; Bahmanabadi, Mahmud; Pezeshkian, Yousef; Mortazavi Moghaddam, Saba

    2016-03-01

    The first phase of the Alborz Observatory Array (Alborz-I) consists of 20 plastic scintillation detectors, each with a surface area of 0.25 m2, spread over an area of 40 × 40 m2, built to study extensive air showers around the knee at the Sharif University of Technology campus. The first stage of the project, including construction and operation of a prototype system, has now been completed, and the electronics that will be used in the array instrument have been tested under field conditions. In order to achieve a realistic estimate of the array performance, a large number of simulated CORSIKA showers have been used. In the present work, theoretical results obtained in the study of different array layouts and trigger conditions are described. Using Monte Carlo simulations of showers, the rate of detected events per day and the trigger probability functions, i.e., the probability for an extensive air shower to trigger a ground-based array as a function of the shower core distance to the center of the array, are presented for energies above 1 TeV and zenith angles up to 60°. Moreover, the angular resolution of the Alborz-I array is obtained.
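    A trigger probability function of this kind can be estimated by Monte Carlo. The sketch below uses a toy exponential lateral distribution with assumed parameters (detector layout, attenuation length, trigger multiplicity) rather than the NKG profile of a real CORSIKA shower:

```python
import numpy as np

def trigger_probability(r_core, detectors, mean_density,
                        r0=30.0, n_min=4, n_events=2000, seed=0):
    """Monte Carlo estimate of the probability that at least n_min
    detectors register a particle when the shower core lies r_core
    metres from the array centre, at a random azimuth.  Particle counts
    per detector are Poisson with a mean falling off exponentially with
    distance from the core (a toy lateral-distribution model)."""
    rng = np.random.default_rng(seed)
    det = np.asarray(detectors, dtype=float)        # (N, 2) positions [m]
    phi = rng.uniform(0.0, 2.0 * np.pi, n_events)   # random core azimuth
    core = r_core * np.stack([np.cos(phi), np.sin(phi)], axis=1)
    d = np.linalg.norm(det[None, :, :] - core[:, None, :], axis=2)
    hits = rng.poisson(mean_density * np.exp(-d / r0))
    return float(np.mean((hits > 0).sum(axis=1) >= n_min))

# A 4 x 5 grid of detectors spanning roughly 40 m x 40 m
layout = [(x, y) for x in (-15.0, -5.0, 5.0, 15.0)
                 for y in (-20.0, -10.0, 0.0, 10.0, 20.0)]
```

    As expected, the trigger probability is near unity for cores landing inside the array and falls toward zero at large core distances; the crossover region determines the effective collecting area.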

  17. Molecular dynamics simulations reveal the conformational dynamics of Arabidopsis thaliana BRI1 and BAK1 receptor-like kinases.

    PubMed

    Moffett, Alexander S; Bender, Kyle W; Huber, Steven C; Shukla, Diwakar

    2017-07-28

    The structural motifs responsible for activation and regulation of eukaryotic protein kinases in animals have been studied extensively in recent years, and a coherent picture of their activation mechanisms has begun to emerge. In contrast, non-animal eukaryotic protein kinases are not as well understood from a structural perspective, representing a large knowledge gap. To this end, we investigated the conformational dynamics of two key Arabidopsis thaliana receptor-like kinases, brassinosteroid-insensitive 1 (BRI1) and BRI1-associated kinase 1 (BAK1), through extensive molecular dynamics simulations of their fully phosphorylated kinase domains. Molecular dynamics simulations calculate the motion of each atom in a protein based on classical approximations of interatomic forces, giving researchers insight into protein function at unparalleled spatial and temporal resolutions. We found that in an otherwise "active" BAK1 the αC helix is highly disordered, a hallmark of deactivation, whereas the BRI1 αC helix is moderately disordered and displays swinging behavior similar to numerous animal kinases. An analysis of all known sequences in the A. thaliana kinome found that αC helix disorder may be a common feature of plant kinases. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  18. Extension of PENELOPE to protons: simulation of nuclear reactions and benchmark with Geant4.

    PubMed

    Sterpin, E; Sorriaux, J; Vynckier, S

    2013-11-01

    Describing the implementation of nuclear reactions in the extension of the Monte Carlo code (MC) PENELOPE to protons (PENH) and benchmarking with Geant4. PENH is based on mixed-simulation mechanics for both elastic and inelastic electromagnetic collisions (EM). The adopted differential cross sections for EM elastic collisions are calculated using the eikonal approximation with the Dirac-Hartree-Fock-Slater atomic potential. Cross sections for EM inelastic collisions are computed within the relativistic Born approximation, using the Sternheimer-Liljequist model of the generalized oscillator strength. Nuclear elastic and inelastic collisions were simulated using explicitly the scattering analysis interactive dialin database for (1)H and ICRU 63 data for (12)C, (14)N, (16)O, (31)P, and (40)Ca. Secondary protons, alphas, and deuterons were all simulated as protons, with the energy adapted to ensure consistent range. Prompt gamma emission can also be simulated upon user request. Simulations were performed in a water phantom with nuclear interactions switched off or on and integral depth-dose distributions were compared. Binary-cascade and precompound models were used for Geant4. Initial energies of 100 and 250 MeV were considered. For cases with no nuclear interactions simulated, additional simulations in a water phantom with tight resolution (1 mm in all directions) were performed with FLUKA. Finally, integral depth-dose distributions for a 250 MeV energy were computed with Geant4 and PENH in a homogeneous phantom with, first, ICRU striated muscle and, second, ICRU compact bone. For simulations with EM collisions only, integral depth-dose distributions were within 1%/1 mm for doses higher than 10% of the Bragg-peak dose. For central-axis depth-dose and lateral profiles in a phantom with tight resolution, there are significant deviations between Geant4 and PENH (up to 60%/1 cm for depth-dose distributions). 
The agreement is much better with FLUKA, with deviations within 3%/3 mm. When nuclear interactions were turned on, agreement (within 6% before the Bragg-peak) between PENH and Geant4 was consistent with uncertainties on nuclear models and cross sections, whatever the material simulated (water, muscle, or bone). A detailed and flexible description of nuclear reactions has been implemented in the PENH extension of PENELOPE to protons, which utilizes a mixed-simulation scheme for both elastic and inelastic EM collisions, analogous to the well-established algorithm for electrons/positrons. PENH is compatible with all current main programs that use PENELOPE as the MC engine. The nuclear model of PENH is realistic enough to give dose distributions in fair agreement with those computed by Geant4.
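    Agreement figures like "1%/1 mm" combine a dose-difference tolerance with a distance-to-agreement. A simplified 1-D pass-rate check in that spirit (an illustrative criterion, not the authors' comparison code; tolerances are taken relative to the reference maximum by assumption) is:

```python
import numpy as np

def dd_dta_pass_rate(z_eval, d_eval, z_ref, d_ref, dd=0.01, dta=1.0):
    """Fraction of evaluated points agreeing with the reference curve
    within a dose difference dd (fraction of the reference maximum) at
    some reference point no farther than dta [mm] away."""
    tol = dd * d_ref.max()
    passed = 0
    for z, d in zip(z_eval, d_eval):
        near = np.abs(z_ref - z) <= dta            # points within the DTA window
        if near.any() and (np.abs(d_ref[near] - d) <= tol).any():
            passed += 1
    return passed / len(z_eval)
```

    A curve identical to the reference passes everywhere, while a few-millimetre shift of a steep Bragg-peak region fails the 1%/1 mm criterion.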

  19. Extension of PENELOPE to protons: Simulation of nuclear reactions and benchmark with Geant4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sterpin, E.; Sorriaux, J.; Vynckier, S.

    2013-11-15

    Purpose: Describing the implementation of nuclear reactions in the extension of the Monte Carlo code (MC) PENELOPE to protons (PENH) and benchmarking with Geant4. Methods: PENH is based on mixed-simulation mechanics for both elastic and inelastic electromagnetic collisions (EM). The adopted differential cross sections for EM elastic collisions are calculated using the eikonal approximation with the Dirac–Hartree–Fock–Slater atomic potential. Cross sections for EM inelastic collisions are computed within the relativistic Born approximation, using the Sternheimer–Liljequist model of the generalized oscillator strength. Nuclear elastic and inelastic collisions were simulated using explicitly the scattering analysis interactive dialin database for (1)H and ICRU 63 data for (12)C, (14)N, (16)O, (31)P, and (40)Ca. Secondary protons, alphas, and deuterons were all simulated as protons, with the energy adapted to ensure consistent range. Prompt gamma emission can also be simulated upon user request. Simulations were performed in a water phantom with nuclear interactions switched off or on and integral depth–dose distributions were compared. Binary-cascade and precompound models were used for Geant4. Initial energies of 100 and 250 MeV were considered. For cases with no nuclear interactions simulated, additional simulations in a water phantom with tight resolution (1 mm in all directions) were performed with FLUKA. Finally, integral depth–dose distributions for a 250 MeV energy were computed with Geant4 and PENH in a homogeneous phantom with, first, ICRU striated muscle and, second, ICRU compact bone. Results: For simulations with EM collisions only, integral depth–dose distributions were within 1%/1 mm for doses higher than 10% of the Bragg-peak dose. For central-axis depth–dose and lateral profiles in a phantom with tight resolution, there are significant deviations between Geant4 and PENH (up to 60%/1 cm for depth–dose distributions).
The agreement is much better with FLUKA, with deviations within 3%/3 mm. When nuclear interactions were turned on, agreement (within 6% before the Bragg-peak) between PENH and Geant4 was consistent with uncertainties on nuclear models and cross sections, whatever the material simulated (water, muscle, or bone). Conclusions: A detailed and flexible description of nuclear reactions has been implemented in the PENH extension of PENELOPE to protons, which utilizes a mixed-simulation scheme for both elastic and inelastic EM collisions, analogous to the well-established algorithm for electrons/positrons. PENH is compatible with all current main programs that use PENELOPE as the MC engine. The nuclear model of PENH is realistic enough to give dose distributions in fair agreement with those computed by Geant4.

  20. A prototype knowledge-based simulation support system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, T.R.; Roberts, S.D.

    1987-04-01

    As a preliminary step toward the goal of an intelligent automated system for simulation modeling support, we explore the feasibility of the overall concept by generating and testing a prototypical framework. A prototype knowledge-based computer system was developed to support a senior-level course in industrial engineering so that the overall feasibility of an expert simulation support system could be studied in a controlled and observable setting. The system behavior mimics the diagnostic (intelligent) process performed by the course instructor and teaching assistants, finding logical errors in INSIGHT simulation models and recommending appropriate corrective measures. The system was programmed in a non-procedural language (PROLOG) and designed to run interactively with students working on course homework and projects. The knowledge-based structure supports intelligent behavior, providing its users with access to an evolving accumulation of expert diagnostic knowledge. The non-procedural approach facilitates the maintenance of the system and helps merge the roles of expert and knowledge engineer by allowing new knowledge to be easily incorporated without regard to the existing flow of control. The background, features and design of the system are described and preliminary results are reported. Initial success is judged to demonstrate the utility of the reported approach and support the ultimate goal of an intelligent modeling system which can support simulation modelers outside the classroom environment. Finally, future extensions are suggested.

  1. Parallel computing method for simulating hydrological processesof large rivers under climate change

    NASA Astrophysics Data System (ADS)

    Wang, H.; Chen, Y.

    2016-12-01

Climate change is one of the most widely recognized global environmental problems. It has altered the distribution of watershed hydrological processes in time and space, especially in the world's large rivers. Watershed hydrological process simulation based on physically based distributed hydrological models can give better results than lumped models. However, such simulation involves a large amount of computation, especially for large rivers, and thus requires huge computing resources that may not be steadily available to researchers, or only at high expense; this has seriously restricted research and application. Existing parallel methods mostly parallelize in the space and time dimensions, calculating the natural features of the distributed hydrological model in order, by grid (unit, basin) from upstream to downstream. This article proposes a high-performance computing method for hydrological process simulation with a high speedup ratio and parallel efficiency. It combines the runoff characteristics in time and space of a distributed hydrological model with distributed data storage, an in-memory database, distributed computing, and parallel computing based on computing power units. The method has strong adaptability and extensibility: it makes full use of computing and storage resources under the condition of limited computing resources, and computing efficiency improves linearly as computing resources increase. This method can satisfy the parallel computing requirements of hydrological process simulation in small, medium and large rivers.
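
The upstream-to-downstream dependency scheduling described above can be sketched in a few lines: all sub-basins whose upstream inputs are already computed are simulated concurrently, wave by wave. The river network, the `simulate` stand-in, and the runoff numbers below are invented for illustration; they are not the authors' model.

```python
from concurrent.futures import ThreadPoolExecutor

# Basin -> its direct upstream basins (a small invented drainage tree).
upstream = {"A": [], "B": [], "C": ["A", "B"], "D": [], "E": ["C", "D"]}
local_runoff = {"A": 1.0, "B": 2.0, "C": 0.5, "D": 1.5, "E": 0.25}

def simulate(basin, inflows):
    # Stand-in for a real hydrological model of one computing unit.
    return local_runoff[basin] + sum(inflows)

def run_parallel(upstream):
    done, outflow = set(), {}
    with ThreadPoolExecutor() as pool:
        while len(done) < len(upstream):
            # Every unit whose upstream inputs are ready runs concurrently.
            ready = [b for b in upstream
                     if b not in done and all(u in done for u in upstream[b])]
            flows = pool.map(
                lambda b: simulate(b, [outflow[u] for u in upstream[b]]), ready)
            for b, q in zip(ready, flows):
                outflow[b] = q
                done.add(b)
    return outflow

outflow = run_parallel(upstream)
print(outflow["E"])  # -> 5.25 (0.25 + (0.5 + 1.0 + 2.0) + 1.5)
```

A real implementation would replace `simulate` with the distributed model of one computing unit and the dict with the basin topology, but the wave-by-wave scheduling is the same.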

  2. Improving SWAT for simulating water and carbon fluxes of forest ecosystems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Qichun; Zhang, Xuesong

    2016-11-01

As a widely used watershed model for assessing impacts of anthropogenic and natural disturbances on water quantity and quality, the Soil and Water Assessment Tool (SWAT) has not been extensively tested in simulating water and carbon fluxes of forest ecosystems. Here, we examine SWAT simulations of evapotranspiration (ET), net primary productivity (NPP), net ecosystem exchange (NEE), and plant biomass at ten AmeriFlux forest sites across the U.S. We identify unrealistic radiation use efficiency (Bio_E), large leaf to biomass fraction (Bio_LEAF), and missing phosphorus supply from parent material weathering as the primary causes for the inadequate performance of the default SWAT model in simulating forest dynamics. By further revising the relevant parameters and processes, SWAT’s performance is substantially improved. Based on the comparison between the improved SWAT simulations and flux tower observations, we discuss future research directions for further enhancing model parameterization and representation of water and carbon cycling for forests.

  3. Survey of computer programs for prediction of crash response and of its experimental validation

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.

    1976-01-01

The author seeks to critically assess the potentialities of the mathematical and hybrid simulators which predict post-impact response of transportation vehicles. A strictly rigorous numerical analysis of a complex phenomenon like crash may leave a lot to be desired with regard to the fidelity of mathematical simulation. Hybrid simulations, on the other hand, which exploit experimentally observed features of deformations, appear to hold a lot of promise. MARC, ANSYS, NONSAP, DYCAST, ACTION, WHAM II and KRASH are among some of the simulators examined for their capabilities with regard to prediction of post-impact response of vehicles. A review of these simulators reveals that much more by way of an analysis capability may be desirable than what is currently available. NASA's crashworthiness testing program in conjunction with similar programs of various other agencies, besides generating a large data base, will be equally useful in the validation of new mathematical concepts of nonlinear analysis and in the successful extension of other techniques in crashworthiness.

  4. Implementing ADM1 for plant-wide benchmark simulations in Matlab/Simulink.

    PubMed

    Rosen, C; Vrecko, D; Gernaey, K V; Pons, M N; Jeppsson, U

    2006-01-01

    The IWA Anaerobic Digestion Model No.1 (ADM1) was presented in 2002 and is expected to represent the state-of-the-art model within this field in the future. Due to its complexity the implementation of the model is not a simple task and several computational aspects need to be considered, in particular if the ADM1 is to be included in dynamic simulations of plant-wide or even integrated systems. In this paper, the experiences gained from a Matlab/Simulink implementation of ADM1 into the extended COST/IWA Benchmark Simulation Model (BSM2) are presented. Aspects related to system stiffness, model interfacing with the ASM family, mass balances, acid-base equilibrium and algebraic solvers for pH and other troublesome state variables, numerical solvers and simulation time are discussed. The main conclusion is that if implemented properly, the ADM1 will also produce high-quality results in dynamic plant-wide simulations including noise, discrete sub-systems, etc. without imposing any major restrictions due to extensive computational efforts.
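
The system stiffness mentioned above is worth illustrating. The toy below is not ADM1; it only shows why solver choice matters when a fast (acid-base-like) rate coexists with a slow (biological) time scale: at a step size chosen for the slow dynamics, explicit Euler diverges on the fast mode while implicit Euler decays stably.

```python
# A fast reaction mode (rate K_FAST) makes the system stiff when stepped
# at a dt sized for the slow dynamics: dy/dt = -K_FAST * y. Illustrative values.
K_FAST = 1.0e4   # fast, acid-base-like rate constant
dt = 1.0e-3      # comfortable for slow biology, far too large for K_FAST

y_exp = y_imp = 1.0
for _ in range(100):
    y_exp = y_exp + dt * (-K_FAST * y_exp)   # explicit Euler: |1 - dt*K| = 9
    y_imp = y_imp / (1.0 + dt * K_FAST)      # implicit Euler: 1/(1 + dt*K) = 1/11

print(abs(y_exp) > 1e10, y_imp < 1e-6)  # -> True True (explicit diverges)
```

This is why the paper's recommendations center on stiff (implicit) solvers and on handling pH and other fast state variables algebraically rather than integrating them.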

  5. Simulation of beta radiator handling procedures in nuclear medicine by means of a movable hand phantom.

    PubMed

    Blunck, Ch; Becker, F; Urban, M

    2011-03-01

    In nuclear medicine therapies, people working with beta radiators such as (90)Y may be exposed to non-negligible partial body doses. For radiation protection, it is important to know the characteristics of the radiation field and possible dose exposures at relevant positions in the working area. Besides extensive measurements, simulations can provide these data. For this purpose, a movable hand phantom for Monte Carlo simulations was developed. Specific beta radiator handling scenarios can be modelled interactively with forward kinematics or automatically with an inverse kinematics procedure. As a first investigation, the dose distribution on a medical doctor's hand injecting a (90)Y solution was measured and simulated with the phantom. Modelling was done with the interactive method based on five consecutive frames from a video recorded during the injection. Owing to the use of only one camera, not each detail of the radiation scenario is visible in the video. In spite of systematic uncertainties, the measured and simulated dose values are in good agreement.

6. Development of the simulation system "IMPACT" for analysis of nuclear power plant severe accidents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naitoh, Masanori; Ujita, Hiroshi; Nagumo, Hiroichi

    1997-07-01

The Nuclear Power Engineering Corporation (NUPEC) has initiated a long-term program to develop the simulation system "IMPACT" for analysis of hypothetical severe accidents in nuclear power plants. IMPACT employs advanced methods of physical modeling and numerical computation, and can simulate a wide spectrum of scenarios ranging from normal operation to hypothetical, beyond-design-basis-accident events. Designed as a large-scale system of interconnected, hierarchical modules, IMPACT's distinguishing features include mechanistic models based on first principles and high-speed simulation on parallel processing computers. The present plan is a ten-year program starting from 1993, consisting of the initial one year of preparatory work followed by three technical phases: Phase-1 for development of a prototype system; Phase-2 for completion of the simulation system, incorporating new achievements from basic studies; and Phase-3 for refinement through extensive verification and validation against test results and available real plant data.

  7. Improving Security for SCADA Sensor Networks with Reputation Systems and Self-Organizing Maps.

    PubMed

    Moya, José M; Araujo, Alvaro; Banković, Zorana; de Goyeneche, Juan-Mariano; Vallejo, Juan Carlos; Malagón, Pedro; Villanueva, Daniel; Fraga, David; Romero, Elena; Blesa, Javier

    2009-01-01

    The reliable operation of modern infrastructures depends on computerized systems and Supervisory Control and Data Acquisition (SCADA) systems, which are also based on the data obtained from sensor networks. The inherent limitations of the sensor devices make them extremely vulnerable to cyberwarfare/cyberterrorism attacks. In this paper, we propose a reputation system enhanced with distributed agents, based on unsupervised learning algorithms (self-organizing maps), in order to achieve fault tolerance and enhanced resistance to previously unknown attacks. This approach has been extensively simulated and compared with previous proposals.
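
The core idea, flagging inputs that fit a map trained on normal behavior poorly, can be sketched with a tiny winner-take-all SOM (the neighborhood update is omitted for brevity, so this is closer to online vector quantization). Feature dimensions, map size, learning rates, and the "attack" vector are all invented for illustration; this is not the authors' system.

```python
import numpy as np

rng = np.random.default_rng(1)

def train_som(data, n_units=16, epochs=20, lr0=0.5):
    # Initialize units from random samples, then pull the best matching
    # unit (BMU) toward each input with a decaying learning rate.
    W = data[rng.choice(len(data), n_units)].copy()
    for e in range(epochs):
        lr = lr0 * (1.0 - e / epochs)
        for x in data:
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))
            W[bmu] += lr * (x - W[bmu])
    return W

normal = rng.normal(0.0, 1.0, size=(300, 3))   # "normal" sensor features
W = train_som(normal)

def quantization_error(x):
    # Distance from an input to its nearest map unit.
    return np.sqrt(((W - x) ** 2).sum(axis=1)).min()

# Flag inputs whose quantization error exceeds the 99th percentile
# of errors observed on normal data.
threshold = np.quantile([quantization_error(x) for x in normal], 0.99)
attack = np.array([8.0, 8.0, 8.0])             # invented out-of-profile input
print(quantization_error(attack) > threshold)  # -> True
```

Because the detector models only normal behavior, it needs no examples of attacks, which is what gives the reputation system its claimed resistance to previously unknown attacks.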

  8. Note: Design of FPGA based system identification module with application to atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Ghosal, Sayan; Pradhan, Sourav; Salapaka, Murti

    2018-05-01

The science of system identification is widely utilized in modeling input-output relationships of diverse systems. In this article, we report field programmable gate array (FPGA) based implementation of a real-time system identification algorithm which employs forgetting factors and bias compensation techniques. The FPGA module is employed to estimate the mechanical properties of surfaces of materials at the nano-scale with an atomic force microscope (AFM). The FPGA module is user-friendly and can be interfaced with commercially available AFMs. Extensive simulation and experimental results validate the design.
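
A forgetting-factor recursive least-squares (RLS) update of the kind such a module implements can be sketched as follows; the parameter names and the toy two-parameter system are ours, not the article's, and bias compensation is omitted.

```python
import numpy as np

def rls_step(theta, P, phi, y, lam=0.98):
    # One RLS update: lam < 1 discounts old data (the forgetting factor).
    phi = phi.reshape(-1, 1)
    K = P @ phi / (lam + (phi.T @ P @ phi).item())      # gain vector
    theta = theta + (K * (y - (phi.T @ theta).item())).ravel()
    P = (P - K @ phi.T @ P) / lam                       # covariance update
    return theta, P

# Identify y = 2*u1 - 1*u2 from noiseless input/output data.
rng = np.random.default_rng(0)
theta, P = np.zeros(2), 100.0 * np.eye(2)
for _ in range(200):
    phi = rng.standard_normal(2)
    y = 2.0 * phi[0] - 1.0 * phi[1]
    theta, P = rls_step(theta, P, phi, y)

print(np.round(theta, 3))  # converges close to [2, -1]
```

The per-sample update uses only a handful of multiply-accumulates, which is what makes this family of estimators a natural fit for real-time FPGA implementation.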

  9. Improving Security for SCADA Sensor Networks with Reputation Systems and Self-Organizing Maps

    PubMed Central

    Moya, José M.; Araujo, Álvaro; Banković, Zorana; de Goyeneche, Juan-Mariano; Vallejo, Juan Carlos; Malagón, Pedro; Villanueva, Daniel; Fraga, David; Romero, Elena; Blesa, Javier

    2009-01-01

    The reliable operation of modern infrastructures depends on computerized systems and Supervisory Control and Data Acquisition (SCADA) systems, which are also based on the data obtained from sensor networks. The inherent limitations of the sensor devices make them extremely vulnerable to cyberwarfare/cyberterrorism attacks. In this paper, we propose a reputation system enhanced with distributed agents, based on unsupervised learning algorithms (self-organizing maps), in order to achieve fault tolerance and enhanced resistance to previously unknown attacks. This approach has been extensively simulated and compared with previous proposals. PMID:22291569

  10. Power and sample-size estimation for microbiome studies using pairwise distances and PERMANOVA.

    PubMed

    Kelly, Brendan J; Gross, Robert; Bittinger, Kyle; Sherrill-Mix, Scott; Lewis, James D; Collman, Ronald G; Bushman, Frederic D; Li, Hongzhe

    2015-08-01

The variation in community composition between microbiome samples, termed beta diversity, can be measured by pairwise distance based on either presence-absence or quantitative species abundance data. PERMANOVA, a permutation-based extension of multivariate analysis of variance to a matrix of pairwise distances, partitions within-group and between-group distances to permit assessment of the effect of an exposure or intervention (grouping factor) upon the sampled microbiome. Within-group distance and exposure/intervention effect size must be accurately modeled to estimate statistical power for a microbiome study that will be analyzed with pairwise distances and PERMANOVA. We present a framework for PERMANOVA power estimation tailored to marker-gene microbiome studies that will be analyzed by pairwise distances, which includes: (i) a novel method for distance matrix simulation that permits modeling of within-group pairwise distances according to pre-specified population parameters; (ii) a method to incorporate effects of different sizes within the simulated distance matrix; (iii) a simulation-based method for estimating PERMANOVA power from simulated distance matrices; and (iv) an R statistical software package that implements the above. Matrices of pairwise distances can be efficiently simulated to satisfy the triangle inequality and incorporate group-level effects, which are quantified by the adjusted coefficient of determination, omega-squared (ω²). From simulated distance matrices, available PERMANOVA power or necessary sample size can be estimated for a planned microbiome study.
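
The simulation-based power estimation idea can be sketched with a Euclidean stand-in. This is not the authors' R package: the test statistic below is a simple between-minus-within distance contrast rather than the PERMANOVA pseudo-F, and the data model is a plain Gaussian shift, but the loop (simulate, permute, count rejections) is the same.

```python
import numpy as np

rng = np.random.default_rng(2)

def stat(d, labels):
    # Mean between-group distance minus mean within-group distance.
    iu = np.triu_indices(len(labels), 1)
    same = (labels[:, None] == labels[None, :])[iu]
    return d[iu][~same].mean() - d[iu][same].mean()

def perm_p(x, labels, n_perm=99):
    d = np.sqrt(((x[:, None, :] - x[None, :, :]) ** 2).sum(-1))
    obs = stat(d, labels)
    hits = sum(stat(d, rng.permutation(labels)) >= obs for _ in range(n_perm))
    return (1 + hits) / (1 + n_perm)

def power(effect, n=15, n_sim=50, alpha=0.05):
    # Fraction of simulated studies whose permutation p-value rejects.
    labels = np.repeat([0, 1], n)
    rejections = 0
    for _ in range(n_sim):
        x = rng.normal(0.0, 1.0, size=(2 * n, 3))
        x[labels == 1, 0] += effect            # injected group-level effect
        rejections += perm_p(x, labels) <= alpha
    return rejections / n_sim

p_effect, p_null = power(2.0), power(0.0)
print(p_effect > p_null)  # strong effect detected; null rejects near alpha
```

Increasing `n` or `effect` and re-running gives the power curve from which a necessary sample size can be read off, which is the workflow the paper automates.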

  11. Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB.

    PubMed

    Lee, Leng-Feng; Umberger, Brian R

    2016-01-01

    Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. 
This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility.

  12. Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB

    PubMed Central

    Lee, Leng-Feng

    2016-01-01

    Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1–2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. 
This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility. PMID:26835184

  13. Force-momentum-based self-guided Langevin dynamics: A rapid sampling method that approaches the canonical ensemble

    NASA Astrophysics Data System (ADS)

    Wu, Xiongwu; Brooks, Bernard R.

    2011-11-01

The self-guided Langevin dynamics (SGLD) is a method to accelerate conformational searching. This method is unique in the way that it selectively enhances and suppresses molecular motions based on their frequency to accelerate conformational searching without modifying energy surfaces or raising temperatures. It has been applied to studies of many long time scale events, such as protein folding. Recent progress in the understanding of the conformational distribution in SGLD simulations makes SGLD also an accurate method for quantitative studies. The SGLD partition function provides a way to convert the SGLD conformational distribution to the canonical ensemble distribution and to calculate ensemble average properties through reweighting. Based on the SGLD partition function, this work presents a force-momentum-based self-guided Langevin dynamics (SGLDfp) simulation method to directly sample the canonical ensemble. This method includes interaction forces in its guiding force to compensate for the perturbation caused by the momentum-based guiding force so that it can approximately sample the canonical ensemble. Using several example systems, we demonstrate that SGLDfp simulations can approximately maintain the canonical ensemble distribution and significantly accelerate conformational searching. With optimal parameters, SGLDfp and SGLD simulations can cross energy barriers of more than 15 kT and 20 kT, respectively, at rates similar to those at which LD simulations cross energy barriers of 10 kT. The SGLDfp method is size extensive and works well for large systems. For studies where preserving accessible conformational space is critical, such as free energy calculations and protein folding studies, SGLDfp is an efficient approach to search and sample the conformational space.
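
A one-dimensional caricature of the guiding-force idea (not the CHARMM SGLD/SGLDfp implementation) looks like this: an extra force proportional to a low-pass average of the momentum reduces the effective friction on slow motions, amplifying low-frequency fluctuations, while plain Langevin dynamics keeps the mean squared momentum near kT. All parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def force(x):                         # double-well potential U(x) = (x^2 - 1)^2
    return -4.0 * x * (x * x - 1.0)

def run(guide=0.0, steps=20000, dt=1e-3, gamma=5.0, kT=0.2):
    x, p, p_avg, psq = -1.0, 0.0, 0.0, 0.0
    for _ in range(steps):
        p_avg += 0.1 * (p - p_avg)    # low-pass (guiding) momentum average
        noise = np.sqrt(2.0 * gamma * kT * dt) * rng.standard_normal()
        p += dt * (force(x) - gamma * p + guide * p_avg) + noise
        x += dt * p
        psq += p * p
    return x, psq / steps             # final position, mean squared momentum

x_ld, T_ld = run(guide=0.0)           # plain Langevin: <p^2> stays near kT
x_sg, T_sg = run(guide=3.0)           # guided: slow motions are amplified
print(round(T_ld, 2), T_sg > T_ld)
```

The enhanced momentum fluctuations in the guided run are exactly the perturbation of the canonical ensemble that SGLDfp's compensating interaction-force term is designed to remove.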

  14. Simulation for Teaching Orthopaedic Residents in a Competency-based Curriculum: Do the Benefits Justify the Increased Costs?

    PubMed

    Nousiainen, Markku T; McQueen, Sydney A; Ferguson, Peter; Alman, Benjamin; Kraemer, William; Safir, Oleg; Reznick, Richard; Sonnadara, Ranil

    2016-04-01

    Although simulation-based training is becoming widespread in surgical education and research supports its use, one major limitation is cost. Until now, little has been published on the costs of simulation in residency training. At the University of Toronto, a novel competency-based curriculum in orthopaedic surgery has been implemented for training selected residents, which makes extensive use of simulation. Despite the benefits of this intensive approach to simulation, there is a need to consider its financial implications and demands on faculty time. This study presents a cost and faculty work-hours analysis of implementing simulation as a teaching and evaluation tool in the University of Toronto's novel competency-based curriculum program compared with the historic costs of using simulation in the residency training program. All invoices for simulation training were reviewed to determine the financial costs before and after implementation of the competency-based curriculum. Invoice items included costs for cadavers, artificial models, skills laboratory labor, associated materials, and standardized patients. Costs related to the surgical skills laboratory rental fees and orthopaedic implants were waived as a result of special arrangements with the skills laboratory and implant vendors. Although faculty time was not reimbursed, faculty hours dedicated to simulation were also evaluated. The academic year of 2008 to 2009 was chosen to represent an academic year that preceded the introduction of the competency-based curriculum. During this year, 12 residents used simulation for teaching. The academic year of 2010 to 2011 was chosen to represent an academic year when the competency-based curriculum training program was functioning parallel but separate from the regular stream of training. In this year, six residents used simulation for teaching and assessment. 
The academic year of 2012 to 2013 was chosen to represent an academic year when simulation was used equally among the competency-based curriculum and regular stream residents for teaching (60 residents) and among 14 competency-based curriculum residents and 21 regular stream residents for assessment. The total costs of using simulation to teach and assess all residents in the competency-based curriculum and regular stream programs (academic year 2012-2013) (CDN 155,750, USD 158,050) were approximately 15 times higher than the cost of using simulation to teach residents before the implementation of the competency-based curriculum (academic year 2008-2009) (CDN 10,090, USD 11,140). The number of hours spent teaching and assessing trainees increased from 96 to 317 hours during this period, representing a threefold increase. Although the financial costs and time demands on faculty in running the simulation program in the new competency-based curriculum at the University of Toronto have been substantial, augmented learner and trainer satisfaction has been accompanied by direct evidence of improved and more efficient learning outcomes. The higher costs and demands on faculty time associated with implementing simulation for teaching and assessment must be considered when it is used to enhance surgical training.

  15. Integrated techno-economic and environmental analysis of butadiene production from biomass.

    PubMed

    Farzad, Somayeh; Mandegari, Mohsen Ali; Görgens, Johann F

    2017-09-01

In this study, lignocellulose biorefineries annexed to a typical sugar mill were investigated to produce either ethanol (EtOH) or 1,3-butadiene (BD), utilizing bagasse and trash as feedstock. Aspen simulations of the scenarios were developed and evaluated in terms of economic and environmental performance. The minimum selling prices (MSPs) for bio-based BD and EtOH production were 2.9-3.3 and 1.26-1.38-fold higher than market prices, respectively. Based on the sensitivity analysis results, capital investment, Internal Rate of Return and extension of annual operating time had the greatest impact on the MSP. Monte Carlo simulation demonstrated that EtOH and BD production could be profitable if the average ten-year historical price increases by 1.05 and 1.9-fold, respectively. The fossil-based route was found inferior to the bio-based pathway across all investigated environmental impact categories, due to burdens associated with oil extraction.

  16. A New Improved and Extended Version of the Multicell Bacterial Simulator gro.

    PubMed

    Gutiérrez, Martín; Gregorio-Godoy, Paula; Pérez Del Pulgar, Guillermo; Muñoz, Luis E; Sáez, Sandra; Rodríguez-Patón, Alfonso

    2017-08-18

gro is a cell programming language developed in Klavins Lab for simulating colony growth and cell-cell communication. It is used as a synthetic biology prototyping tool for simulating multicellular biocircuits and microbial consortia. In this work, we present several extensions made to gro that improve the performance of the simulator, make it easier to use, and provide new functionalities. The new version of gro is between 1 and 2 orders of magnitude faster than the original version. It is able to grow microbial colonies with up to 10^5 cells in less than 10 min. A new library, CellEngine, accelerates the resolution of spatial physical interactions between growing and dividing cells by implementing a new shoving algorithm. A genetic library, CellPro, based on Probabilistic Timed Automata, simulates gene expression dynamics using simplified and easy to compute digital proteins. We also propose a more convenient language specification layer, ProSpec, based on the idea that proteins drive cell behavior. CellNutrient, another library, implements Monod-based growth and nutrient uptake functionalities. The intercellular signaling management was improved and extended in a library called CellSignals. Finally, bacterial conjugation, another local cell-cell communication process, was added to the simulator. To show the versatility and potential outreach of this version of gro, we provide studies and novel examples ranging from synthetic biology to evolutionary microbiology. We believe that the upgrades implemented for gro have made it into a powerful and fast prototyping tool capable of simulating a large variety of systems and synthetic biology designs.

  17. Effect of ski simulator training on kinematic and muscle activation of the lower extremities

    PubMed Central

    Moon, Jeheon; Koo, Dohoon; Kim, Kitae; Shin, Insik; Kim, Hyeyoung; Kim, Jinhae

    2015-01-01

    [Purpose] This study aimed to verify the effectiveness of an augmented reality-based ski simulator through analyzing the changes in movement patterns as well as the engagement of major muscles of the lower body. [Subjects] Seven subjects participated in the study. All were national team-level athletes studying at “K” Sports University in Korea who exhibited comparable performance levels and had no record of injuries in the preceding 6 months (Age 23.4 ± 3.8 years; Height 172.6 ± 12.1 cm; Weight 72.3 ± 16.2 kg; Experience 12.3 ± 4.8 years). [Methods] A reality-based ski simulator developed by a Korean manufacturer was used for the study. Three digital video cameras and a wireless electromyography system were used to perform 3-dimensional motion analysis and measure muscle activation level. [Results] Left hip angulation was found to increase as the frequency of the turns increased. Electromyography data revealed that the activation level of the quadriceps group’s extension muscles and the biceps femoris group’s flexing muscles had a crossing pattern. [Conclusion] Sustained training using an augmented reality-based ski simulator resulted in movements that extended the lower body joints, which is thought to contribute to increasing muscle fatigue. PMID:26357449

18. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    USGS Publications Warehouse

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
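
The conditional (elastic-rebound) probability at the heart of such renewal models can be sketched directly. A lognormal renewal distribution parameterized by mean recurrence and aperiodicity is used here for illustration (UCERF-style calculations typically use the BPT distribution); the fault parameters are made up.

```python
import math

def lognormal_cdf(t, mu, alpha):
    # Lognormal with mean mu and coefficient of variation (aperiodicity) alpha.
    sigma2 = math.log(1.0 + alpha * alpha)
    m = math.log(mu) - 0.5 * sigma2
    return 0.5 * (1.0 + math.erf((math.log(t) - m) / math.sqrt(2.0 * sigma2)))

def cond_prob(T, dT, mu, alpha):
    # P(rupture in (T, T + dT] | no rupture in the T years since the last one)
    F = lognormal_cdf
    return (F(T + dT, mu, alpha) - F(T, mu, alpha)) / (1.0 - F(T, mu, alpha))

# For a quasi-periodic fault (alpha < 1), hazard grows as the open interval ages:
p_early = cond_prob(50.0, 30.0, mu=150.0, alpha=0.5)
p_late = cond_prob(200.0, 30.0, mu=150.0, alpha=0.5)
print(p_early < p_late)  # -> True
```

The methodology in the paper applies this kind of conditional calculation with along-fault averages of mu and magnitude-dependent alpha, rather than at a single point as in this sketch.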

  19. Richardson-Lucy/maximum likelihood image restoration algorithm for fluorescence microscopy: further testing.

    PubMed

    Holmes, T J; Liu, Y H

    1989-11-15

A maximum likelihood based iterative algorithm adapted from nuclear medicine imaging for noncoherent optical imaging was presented in a previous publication with some initial computer-simulation testing. This algorithm is identical in form to that previously derived in a different way by W. H. Richardson "Bayesian-Based Iterative Method of Image Restoration," J. Opt. Soc. Am. 62, 55-59 (1972) and L. B. Lucy "An Iterative Technique for the Rectification of Observed Distributions," Astron. J. 79, 745-765 (1974). Foreseen applications include superresolution and 3-D fluorescence microscopy. This paper presents further simulation testing of this algorithm and a preliminary experiment with a defocused camera. The simulations show quantified resolution improvement as a function of iteration number, and they show qualitatively the trend in limitations on restored resolution when noise is present in the data. Also shown are results of a simulation in restoring missing-cone information for 3-D imaging. Conclusions are in support of the feasibility of using these methods with real systems, while computational cost and timing estimates indicate that it should be realistic to implement these methods. It is suggested in the Appendix that future extensions to the maximum likelihood based derivation of this algorithm will address some of the limitations that are experienced with the nonextended form of the algorithm presented here.
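
The Richardson-Lucy iteration itself is compact enough to sketch in one dimension; the PSF and test object below are synthetic toys, not the paper's microscope data. Each iteration multiplies the current estimate by the adjoint-blurred ratio of data to re-blurred estimate, which preserves nonnegativity.

```python
import numpy as np

def blur(x, psf):
    return np.convolve(x, psf, mode="same")

def richardson_lucy(d, psf, n_iter=500, eps=1e-12):
    o = np.full_like(d, d.mean())        # flat, nonnegative starting estimate
    psf_flip = psf[::-1]                 # adjoint blur uses the flipped PSF
    for _ in range(n_iter):
        ratio = d / (blur(o, psf) + eps)
        o = o * blur(ratio, psf_flip)    # multiplicative RL update
    return o

psf = np.array([0.25, 0.5, 0.25])        # toy point-spread function
truth = np.zeros(32)
truth[10], truth[20] = 1.0, 0.5          # two point sources
data = blur(truth, psf)                  # noiseless blurred observation

est = richardson_lucy(data, psf)
print(int(np.argmax(est)))  # -> 10 (the brighter source location)
```

On noiseless data the iteration progressively re-sharpens the point sources; with noisy data, as the abstract notes, the iteration count must be limited because late iterations amplify noise.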

  20. Characterizing groundwater/surface-water interactions in the interior of Jianghan Plain, central China

    NASA Astrophysics Data System (ADS)

    Du, Yao; Ma, Teng; Deng, Yamin; Shen, Shuai; Lu, Zongjie

    2018-06-01

Quantifying groundwater/surface-water interactions is essential for managing water resources and revealing contaminant fate. Little attention has thus far been paid to the exchange between streams and aquifers through an extensive aquitard. In this study, hydrogeologic calculation and tritium modeling were jointly applied to characterize such interactions through an extensive aquitard in the interior of Jianghan Plain, an alluvial plain of the Yangtze River, China. One groundwater simulation suggested that the lateral distance of influence from the river was about 1,000 m; vertical flow in the aquitard followed by lateral flow in the aquifer contributed significantly more (~90%) to the aquifer head change near the river than lateral bank storage in the aquitard followed by infiltration. The hydrogeologic calculation produced vertical fluxes of the order 0.01 m/day both near and farther from the river, suggesting that similar shorter-lived (half-monthly) vertical fluxes occur between the river and aquitard near the river, and between the surface end members and aquitard farther from the river. Tritium simulation based on the OTIS model produced an average groundwater residence time of about 15 years near the river and a resulting vertical flux of the order 0.001 m/day. Another tritium simulation based on a dispersion model produced a vertical flux of the order 0.0001 m/day away from the river, coupled with an average residence time of around 90 years. These results suggest an order of magnitude difference for the longer-lived (decadal) vertical fluxes between surface waters and the aquifer near and away from the river.

  1. Characterizing groundwater/surface-water interactions in the interior of Jianghan Plain, central China

    NASA Astrophysics Data System (ADS)

    Du, Yao; Ma, Teng; Deng, Yamin; Shen, Shuai; Lu, Zongjie

    2018-01-01

    Quantifying groundwater/surface-water interactions is essential for managing water resources and revealing contaminant fate. Exchange between streams and aquifers through an extensive aquitard has thus far received little attention. In this study, hydrogeologic calculation and tritium modeling were jointly applied to characterize such interactions through an extensive aquitard in the interior of Jianghan Plain, an alluvial plain of the Yangtze River, China. One groundwater simulation suggested that the lateral distance of influence from the river was about 1,000 m; vertical flow in the aquitard followed by lateral flow in the aquifer contributed significantly more (~90%) to the aquifer head change near the river than lateral bank storage in the aquitard followed by infiltration. The hydrogeologic calculation produced vertical fluxes of the order 0.01 m/day both near and farther from the river, suggesting that similar shorter-lived (half-monthly) vertical fluxes occur between the river and aquitard near the river, and between the surface end members and aquitard farther from the river. Tritium simulation based on the OTIS model produced an average groundwater residence time of about 15 years near the river and a resulting vertical flux of the order 0.001 m/day. Another tritium simulation based on a dispersion model produced a vertical flux of the order 0.0001 m/day away from the river, coupled with an average residence time of around 90 years. These results suggest an order of magnitude difference for the longer-lived (decadal) vertical fluxes between surface waters and the aquifer near and away from the river.

  2. Extensible Adaptable Simulation Systems: Supporting Multiple Fidelity Simulations in a Common Environment

    NASA Technical Reports Server (NTRS)

    McLaughlin, Brian J.; Barrett, Larry K.

    2012-01-01

    Common practice in the development of simulation systems is meeting all user requirements within a single instantiation. The Joint Polar Satellite System (JPSS) presents a unique challenge to establish a simulation environment that meets the needs of a diverse user community while also spanning a multi-mission environment over decades of operation. In response, the JPSS Flight Vehicle Test Suite (FVTS) is architected with an extensible infrastructure that supports the operation of multiple observatory simulations for a single mission and multiple missions within a common system perimeter. For the JPSS-1 satellite, multiple fidelity flight observatory simulations are necessary to support the distinct user communities consisting of the Common Ground System development team, the Common Ground System Integration & Test team, and the Mission Rehearsal Team/Mission Operations Team. These key requirements present several challenges to FVTS development. First, the FVTS must ensure all critical user requirements are satisfied by at least one fidelity instance of the observatory simulation. Second, the FVTS must allow for tailoring of the system instances to function in diverse operational environments, from the high-security operations environment at the NOAA Satellite Operations Facility (NSOF) to the ground system factory floor. Finally, the FVTS must provide the ability to execute sustaining engineering activities on a subset of the system without impacting system availability to parallel users. The FVTS approach of allowing for multiple fidelity copies of observatory simulations represents a unique concept in simulator capability development and corresponds to the JPSS Ground System goals of establishing a capability that is flexible, extensible, and adaptable.

  3. Point cloud modeling using the homogeneous transformation for non-cooperative pose estimation

    NASA Astrophysics Data System (ADS)

    Lim, Tae W.

    2015-06-01

    A modeling process to simulate point cloud range data that a lidar (light detection and ranging) sensor produces is presented in this paper in order to support the development of non-cooperative pose (relative attitude and position) estimation approaches which will help improve proximity operation capabilities between two adjacent vehicles. The algorithms in the modeling process were based on the homogeneous transformation, which has been employed extensively in robotics and computer graphics, as well as in recently developed pose estimation algorithms. Using a flash lidar in a laboratory testing environment, point cloud data of a test article was simulated and compared against the measured point cloud data. The simulated and measured data sets match closely, validating the modeling process. The modeling capability enables close examination of the characteristics of point cloud images of an object as it undergoes various translational and rotational motions. Relevant characteristics that will be crucial in non-cooperative pose estimation were identified such as shift, shadowing, perspective projection, jagged edges, and differential point cloud density. These characteristics will have to be considered in developing effective non-cooperative pose estimation algorithms. The modeling capability will allow extensive non-cooperative pose estimation performance simulations prior to field testing, saving development cost and providing performance metrics of the pose estimation concepts and algorithms under evaluation. The modeling process also provides "truth" pose of the test objects with respect to the sensor frame so that the pose estimation error can be quantified.
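
The homogeneous-transformation step underlying this modeling process can be sketched as follows: a 4x4 matrix combining a rotation and a translation is applied to a point cloud expressed in homogeneous coordinates. The yaw angle, translation, and two-point "cloud" below are illustrative values, not the paper's test article or sensor geometry.

```python
import numpy as np

def homogeneous_transform(points, yaw_rad, translation):
    """Apply T = [[R, t], [0 0 0 1]] to an (N, 3) point cloud.

    R here is a rotation about the z axis; t is a 3-vector translation.
    """
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    T = np.array([
        [c,  -s,  0.0, translation[0]],
        [s,   c,  0.0, translation[1]],
        [0.0, 0.0, 1.0, translation[2]],
        [0.0, 0.0, 0.0, 1.0],
    ])
    # Append a homogeneous coordinate of 1 to every point, transform, strip it.
    homog = np.hstack([points, np.ones((points.shape[0], 1))])
    return (T @ homog.T).T[:, :3]

cloud = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
moved = homogeneous_transform(cloud, yaw_rad=np.pi / 2,
                              translation=(1.0, 2.0, 0.0))
```

Chaining such matrices (sensor-to-body, body-to-target, etc.) is what makes the formalism convenient for simulating a lidar view of an object under arbitrary translational and rotational motion.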

  4. The molecular kink paradigm for rubber elasticity: Numerical simulations of explicit polyisoprene networks at low to moderate tensile strains

    NASA Astrophysics Data System (ADS)

    Hanson, David E.

    2011-08-01

    Based on recent molecular dynamics and ab initio simulations of small isoprene molecules, we propose a new ansatz for rubber elasticity. We envision a network chain as a series of independent molecular kinks, each comprised of a small number of backbone units, and the strain as being imposed along the contour of the chain. We treat chain extension in three distinct force regimes: (Ia) near zero strain, where we assume that the chain is extended within a well defined tube, with all of the kinks participating simultaneously as entropic elastic springs, (II) when the chain becomes sensibly straight, giving rise to a purely enthalpic stretching force (until bond rupture occurs) and, (Ib) a linear entropic regime, between regimes Ia and II, in which a force limit is imposed by tube deformation. In this intermediate regime, the molecular kinks are assumed to be gradually straightened until the chain becomes a series of straight segments between entanglements. We assume that there exists a tube deformation tension limit that is inversely proportional to the chain path tortuosity. Here we report the results of numerical simulations of explicit three-dimensional, periodic, polyisoprene networks, using these extension-only force models. At low strain, crosslink nodes are moved affinely, up to an arbitrary node force limit. Above this limit, non-affine motion of the nodes is allowed to relax unbalanced chain forces. Our simulation results are in good agreement with tensile stress vs. strain experiments.
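
The three-regime, extension-only force law described above can be sketched as a simple piecewise function. The spring constants and tube-deformation force limit below are placeholder values for illustration, not the fitted polyisoprene parameters.

```python
def chain_force(extension, contour_length, k_entropic=1.0,
                f_tube_limit=5.0, k_enthalpic=100.0):
    """Extension-only chain force in three regimes (illustrative constants).

    Ia: small strain, kinks act as entropic springs (linear in extension).
    Ib: intermediate strain, force capped by the tube-deformation limit.
    II: chain sensibly straight, steep enthalpic bond stretching.
    """
    if extension <= 0.0:
        return 0.0                      # chains carry no compressive load
    f = k_entropic * extension          # regime Ia: entropic spring
    if extension < contour_length:
        return min(f, f_tube_limit)     # regime Ib: tube limit engages
    # regime II: enthalpic stretching beyond full kink straightening
    return f_tube_limit + k_enthalpic * (extension - contour_length)
```

In a network simulation such a per-chain law would be evaluated along each chain contour, with node positions relaxed when the resulting forces are unbalanced, as in the affine/non-affine scheme the abstract describes.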

  5. The molecular kink paradigm for rubber elasticity: numerical simulations of explicit polyisoprene networks at low to moderate tensile strains.

    PubMed

    Hanson, David E

    2011-08-07

    Based on recent molecular dynamics and ab initio simulations of small isoprene molecules, we propose a new ansatz for rubber elasticity. We envision a network chain as a series of independent molecular kinks, each comprised of a small number of backbone units, and the strain as being imposed along the contour of the chain. We treat chain extension in three distinct force regimes: (Ia) near zero strain, where we assume that the chain is extended within a well defined tube, with all of the kinks participating simultaneously as entropic elastic springs, (II) when the chain becomes sensibly straight, giving rise to a purely enthalpic stretching force (until bond rupture occurs) and, (Ib) a linear entropic regime, between regimes Ia and II, in which a force limit is imposed by tube deformation. In this intermediate regime, the molecular kinks are assumed to be gradually straightened until the chain becomes a series of straight segments between entanglements. We assume that there exists a tube deformation tension limit that is inversely proportional to the chain path tortuosity. Here we report the results of numerical simulations of explicit three-dimensional, periodic, polyisoprene networks, using these extension-only force models. At low strain, crosslink nodes are moved affinely, up to an arbitrary node force limit. Above this limit, non-affine motion of the nodes is allowed to relax unbalanced chain forces. Our simulation results are in good agreement with tensile stress vs. strain experiments.

  6. Distal radius osteotomy with volar locking plates based on computer simulation.

    PubMed

    Miyake, Junichi; Murase, Tsuyoshi; Moritomo, Hisao; Sugamoto, Kazuomi; Yoshikawa, Hideki

    2011-06-01

    Corrective osteotomy using dorsal plates and structural bone graft usually has been used for treating symptomatic distal radius malunions. However, the procedure is technically demanding and requires an extensive dorsal approach. Residual deformity is a relatively frequent complication of this technique. We evaluated the clinical applicability of a three-dimensional osteotomy using computer-aided design and manufacturing techniques with volar locking plates for distal radius malunions. Ten patients with metaphyseal radius malunions were treated. Corrective osteotomy was simulated with the help of three-dimensional bone surface models created using CT data. We simulated the most appropriate screw holes in the deformed radius using computer-aided design data of a locking plate. During surgery, using a custom-made surgical template, we predrilled the screw holes as simulated. After osteotomy, plate fixation using predrilled screw holes enabled automatic reduction of the distal radial fragment. Autogenous iliac cancellous bone was grafted after plate fixation. The median volar tilt, radial inclination, and ulnar variance improved from -20°, 13°, and 6 mm, respectively, before surgery to 12°, 24°, and 1 mm, respectively, after surgery. The median wrist flexion improved from 33° before surgery to 60° after surgery. The median wrist extension was 70° before surgery and 65° after surgery. All patients experienced wrist pain before surgery, which disappeared or decreased after surgery. Surgeons can operate precisely and easily using this advanced technique. It is a new treatment option for malunion of distal radius fractures.

  7. SKIRT: The design of a suite of input models for Monte Carlo radiative transfer simulations

    NASA Astrophysics Data System (ADS)

    Baes, M.; Camps, P.

    2015-09-01

    The Monte Carlo method is the most popular technique to perform radiative transfer simulations in a general 3D geometry. The algorithms behind and acceleration techniques for Monte Carlo radiative transfer are discussed extensively in the literature, and many different Monte Carlo codes are publicly available. By contrast, the design of a suite of components that can be used for the distribution of sources and sinks in radiative transfer codes has received very little attention. The availability of such models, with different degrees of complexity, has many benefits. For example, they can serve as toy models to test new physical ingredients, or as parameterised models for inverse radiative transfer fitting. For 3D Monte Carlo codes, this requires algorithms to efficiently generate random positions from 3D density distributions. We describe the design of a flexible suite of components for the Monte Carlo radiative transfer code SKIRT. The design is based on a combination of basic building blocks (which can be either analytical toy models or numerical models defined on grids or a set of particles) and the extensive use of decorators that combine and alter these building blocks to more complex structures. For a number of decorators, e.g. those that add spiral structure or clumpiness, we provide a detailed description of the algorithms that can be used to generate random positions. Advantages of this decorator-based design include code transparency, the avoidance of code duplication, and an increase in code maintainability. Moreover, since decorators can be chained without problems, very complex models can easily be constructed out of simple building blocks. Finally, based on a number of test simulations, we demonstrate that our design using customised random position generators is superior to a simpler design based on a generic black-box random position generator.
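
The decorator idea can be illustrated with a toy random-position generator: a basic building block exposes `random_position`, and a decorator wraps any such component to alter its geometry without touching its internals. The class names and the rejection-sampling scheme below are assumptions for illustration, not SKIRT's actual C++ components.

```python
import random

class UniformSphere:
    """Basic building block: uniform density inside a sphere of given radius."""
    def __init__(self, radius):
        self.radius = radius

    def random_position(self, rng):
        # Rejection sampling inside the bounding cube.
        while True:
            p = tuple(rng.uniform(-self.radius, self.radius) for _ in range(3))
            if sum(c * c for c in p) <= self.radius ** 2:
                return p

class Shifted:
    """Decorator: translate any component; chainable with other decorators."""
    def __init__(self, component, offset):
        self.component, self.offset = component, offset

    def random_position(self, rng):
        p = self.component.random_position(rng)
        return tuple(c + o for c, o in zip(p, self.offset))

rng = random.Random(42)
model = Shifted(UniformSphere(radius=1.0), offset=(10.0, 0.0, 0.0))
pos = model.random_position(rng)
```

Because `Shifted` only needs the `random_position` interface, decorators of this kind (shift, clumpiness, spiral perturbation, ...) compose freely, which is the maintainability argument the abstract makes.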

  8. Evaluating State Options for Reducing Medicaid Churning

    PubMed Central

    Swartz, Katherine; Short, Pamela Farley; Graefe, Deborah R.; Uberoi, Namrata

    2015-01-01

    Medicaid churning - the constant exit and re-entry of beneficiaries as their eligibility changes - has long been a problem for both Medicaid administrators and recipients. Churning will continue under the Affordable Care Act, because despite new federal rules, Medicaid eligibility will continue to be based on current monthly income. We developed a longitudinal simulation model to evaluate four policy options for modifying or extending Medicaid eligibility to reduce churning. The simulations suggest that two options, extending Medicaid eligibility either to the end of a calendar year or for twelve months after enrollment, would be far more effective in reducing churning than the other options of a three-month extension or eligibility based on projected annual income. States should consider implementation of the option that best balances costs, including both administration and services, with improved health of Medicaid enrollees. PMID:26153313

  9. Glyph-based analysis of multimodal directional distributions in vector field ensembles

    NASA Astrophysics Data System (ADS)

    Jarema, Mihaela; Demir, Ismail; Kehrer, Johannes; Westermann, Rüdiger

    2015-04-01

    Ensemble simulations are increasingly often performed in the geosciences in order to study the uncertainty and variability of model predictions. Describing ensemble data by mean and standard deviation can be misleading in case of multimodal distributions. We present first results of a glyph-based visualization of multimodal directional distributions in 2D and 3D vector ensemble data. Directional information on the circle/sphere is modeled using mixtures of probability density functions (pdfs), which enables us to characterize the distributions with relatively few parameters. The resulting mixture models are represented by 2D and 3D lobular glyphs showing direction, spread and strength of each principal mode of the distributions. A 3D extension of our approach is realized by means of an efficient GPU rendering technique. We demonstrate our method in the context of ensemble weather simulations.
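
Modeling 2D directional data as a mixture of von Mises probability density functions can be sketched as follows. The weights, mean directions, and concentrations below are invented for illustration, not fitted to any ensemble.

```python
import numpy as np
from scipy.special import i0  # modified Bessel function of order 0

def von_mises_pdf(theta, mu, kappa):
    """von Mises pdf on the circle: exp(kappa*cos(theta-mu)) / (2*pi*I0(kappa))."""
    return np.exp(kappa * np.cos(theta - mu)) / (2.0 * np.pi * i0(kappa))

def mixture_pdf(theta, weights, mus, kappas):
    """Weighted sum of von Mises components; weights should sum to 1."""
    return sum(w * von_mises_pdf(theta, m, k)
               for w, m, k in zip(weights, mus, kappas))

theta = np.linspace(-np.pi, np.pi, 721)
pdf = mixture_pdf(theta, weights=[0.6, 0.4],
                  mus=[0.0, np.pi / 2], kappas=[8.0, 8.0])
```

Each component contributes one mode; its mean direction, concentration, and weight map directly onto the direction, spread, and strength of one lobe of the glyph, which is why few parameters suffice to characterize a multimodal directional distribution.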

  10. Investigating the performances of a 1 MV high pulsed power linear transformer driver: from beam dynamics to x radiation

    NASA Astrophysics Data System (ADS)

    Maisonny, R.; Ribière, M.; Toury, M.; Plewa, J. M.; Caron, M.; Auriel, G.; d'Almeida, T.

    2016-12-01

    The performance of a 1 MV pulsed high-power linear transformer driver accelerator was extensively investigated using a numerical approach that combines electromagnetic and Monte Carlo simulations. Particle-in-cell calculations were employed to examine the beam dynamics throughout the magnetically insulated transmission line which governs the coupling between the generator and the electron diode. Based on the information provided by the study of the beam dynamics, and using Monte Carlo methods, the main properties of the resulting x radiation were predicted. Good agreement was found between these simulations and experimental results. This work provides a detailed understanding of mechanisms affecting the performance of this type of high-current, high-voltage pulsed accelerator, which is very promising for a growing number of applications.

  11. Simulation of fluid-structure interaction in micropumps by coupling of two commercial finite element programs

    NASA Astrophysics Data System (ADS)

    Klein, Andreas; Gerlach, Gerald

    1998-09-01

    This paper deals with the simulation of the fluid-structure interaction phenomena in micropumps. The proposed solution approach is based on external coupling of two different solvers, which are considered here as 'black boxes'. Therefore, no specific intervention is necessary into the program code, and solvers can be exchanged arbitrarily. For the realization of the external iteration loop, two algorithms are considered: the relaxation-based Gauss-Seidel method and the computationally more expensive Newton method. It is demonstrated using a simplified test case that for rather weak coupling, the Gauss-Seidel method is sufficient. However, by simply changing the considered fluid from air to water, the two physical domains become strongly coupled, and the Gauss-Seidel method fails to converge in this case. The Newton iteration scheme must be used instead.
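
The relaxation-based Gauss-Seidel loop can be sketched with two stand-in scalar "solvers": a structural deflection responding to fluid pressure, and a fluid pressure responding to the wall position. The linear response functions and relaxation factor below are illustrative, not the behavior of the commercial finite element programs.

```python
def solve_structure(pressure):
    """Black-box stand-in: wall deflection for a given fluid load."""
    return 0.3 * pressure

def solve_fluid(deflection):
    """Black-box stand-in: fluid pressure for a given wall position."""
    return 1.0 - 0.5 * deflection

def gauss_seidel_coupling(relaxation=0.8, tol=1e-10, max_iter=200):
    """Fixed-point iteration between the two solvers with under-relaxation."""
    deflection = 0.0
    for iteration in range(1, max_iter + 1):
        pressure = solve_fluid(deflection)          # solver 1 uses newest data
        new_deflection = solve_structure(pressure)  # solver 2 uses newest data
        # Under-relaxed update; small factors stabilize stronger coupling.
        update = relaxation * (new_deflection - deflection)
        deflection += update
        if abs(update) < tol:
            return deflection, iteration
    raise RuntimeError("coupling loop failed to converge")

deflection, iterations = gauss_seidel_coupling()
```

When the product of the two solvers' sensitivities approaches or exceeds one (the strong-coupling case, e.g. water instead of air), this fixed-point iteration diverges for any practical relaxation factor, which is the regime where the abstract's Newton scheme becomes necessary.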

  12. Markov state models of protein misfolding

    NASA Astrophysics Data System (ADS)

    Sirur, Anshul; De Sancho, David; Best, Robert B.

    2016-02-01

    Markov state models (MSMs) are an extremely useful tool for understanding the conformational dynamics of macromolecules and for analyzing MD simulations in a quantitative fashion. They have been extensively used for peptide and protein folding, for small molecule binding, and for the study of native ensemble dynamics. Here, we adapt the MSM methodology to gain insight into the dynamics of misfolded states. To overcome possible flaws in root-mean-square deviation (RMSD)-based metrics, we introduce a novel discretization approach, based on coarse-grained contact maps. In addition, we extend the MSM methodology to include "sink" states in order to account for the irreversibility (on simulation time scales) of processes like protein misfolding. We apply this method to analyze the mechanism of misfolding of tandem repeats of titin domains, and how it is influenced by confinement in a chaperonin-like cavity.
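
The "sink" idea can be illustrated with a toy three-state transition matrix in which the misfolded state is made absorbing, so repeated application of the row-stochastic matrix accumulates probability in the sink. The states and rates below are invented for illustration, not the titin MSM.

```python
import numpy as np

# Row-stochastic transition matrix over one lag time; the misfolded state
# is a sink: it has no outgoing transitions on simulation time scales.
states = ["unfolded", "native", "misfolded"]
T = np.array([
    [0.90, 0.08, 0.02],   # unfolded  -> unfolded / native / misfolded
    [0.05, 0.94, 0.01],   # native    -> ...
    [0.00, 0.00, 1.00],   # misfolded -> misfolded only (absorbing)
])

p = np.array([1.0, 0.0, 0.0])   # start fully in the unfolded state
for _ in range(500):            # propagate the ensemble 500 lag times
    p = p @ T
```

Because the sink is reachable from both transient states, the misfolded population grows monotonically toward one, capturing the irreversibility that an ordinary reversible MSM cannot represent.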

  13. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    NASA Astrophysics Data System (ADS)

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.

    2017-02-01

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, widely used in multiphase flows and still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.

  14. Electron distribution functions in electric field environments

    NASA Technical Reports Server (NTRS)

    Rudolph, Terence H.

    1991-01-01

    The amount of current carried by an electric discharge in its early stages of growth is strongly dependent on its geometrical shape. Discharges with a large number of branches, each funnelling current to a common stem, tend to carry more current than those with fewer branches. The fractal character of typical discharges was simulated using stochastic models based on solutions of the Laplace equation. Extension of these models requires the use of electron distribution functions to describe the behavior of electrons in the undisturbed medium ahead of the discharge. These electrons, interacting with the electric field, determine the propagation of branches in the discharge and the way in which further branching occurs. The first phase in the extension of the referenced models, the calculation of simple electron distribution functions in an air/electric field medium, is discussed. Two techniques are investigated: (1) the solution of the Boltzmann equation in homogeneous, steady state environments, and (2) the use of Monte Carlo simulations. Distribution functions calculated from both techniques are illustrated. Advantages and disadvantages of each technique are discussed.

  15. DAKOTA Design Analysis Kit for Optimization and Terascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Dalbey, Keith R.; Eldred, Michael S.

    2010-02-24

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes (computational models) and iterative analysis methods. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, it provides a flexible and extensible problem-solving environment for design and analysis of computational models on high performance computers. A user provides a set of DAKOTA commands in an input file and launches DAKOTA. DAKOTA invokes instances of the computational models, collects their results, and performs systems analyses. DAKOTA contains algorithms for optimization with gradient- and nongradient-based methods; uncertainty quantification with sampling, reliability, polynomial chaos, stochastic collocation, and epistemic methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as hybrid optimization, surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. Services for parallel computing, simulation interfacing, approximation modeling, fault tolerance, restart, and graphics are also included.

  16. The folding transition state of Protein L is extensive with non-native interactions (and not small and polarized)

    PubMed Central

    Yoo, Tae Yeon; Adhikari, Aashish; Xia, Zhen; Huynh, Tien; Freed, Karl F.; Zhou, Ruhong; Sosnick, Tobin R.

    2012-01-01

    Progress in understanding protein folding relies heavily upon an interplay between experiment and theory. In particular, readily interpretable experimental data are required that can be meaningfully compared to simulations. According to standard mutational φ analysis, the transition state for Protein L contains only a single hairpin. However, we demonstrate here using ψ analysis with engineered metal ion binding sites that the transition state is extensive, containing the entire four-stranded β sheet. Underreporting of the structural content of the transition state by φ analysis also occurs for acyl phosphatase [1], ubiquitin [2] and BdpA [3]. The carboxy terminal hairpin in the transition state of Protein L is found to be non-native, a significant result that agrees with our PDB-based backbone sampling and all-atom simulations. The non-native character partially explains the failure of accepted experimental and native-centric computational approaches to adequately describe the transition state. Hence, caution is required even when an apparent agreement exists between experiment and theory, thus highlighting the importance of having alternative methods for characterizing transition states. PMID:22522126

  17. New insights into the human body iron metabolism analyzed by a Petri net based approach.

    PubMed

    Sackmann, Andrea; Formanowicz, Dorota; Formanowicz, Piotr; Blazewicz, Jacek

    2009-04-01

    Iron homeostasis is one of the most important biochemical processes in the human body. Despite this fact, the process is not fully understood, and until recently only rough descriptions of parts of the process could be found in the literature. Here, an extension of the recently published formal model of the main part of the process is presented. This extension consists of including all known mechanisms of hepcidin regulation. Hepcidin is a hormone synthesized in the liver which is mainly responsible for an inhibition of iron absorption in the small intestine during an inflammatory process. The model is expressed in the language of Petri net theory, which allows for its relatively easy analysis and simulation.
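
The token-flow semantics of such a Petri net can be sketched minimally: places hold tokens, and a transition fires only when all of its input places are sufficiently marked, consuming input tokens and producing output tokens. The places and two transitions below (an inflammation signal driving hepcidin synthesis, which then blocks iron absorption) are a toy reduction for illustration, not the published net.

```python
def enabled(marking, inputs):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking.get(place, 0) >= need for place, need in inputs.items())

def fire(marking, inputs, outputs):
    """Fire a transition: consume input tokens, produce output tokens.
    Returns the marking unchanged if the transition is not enabled."""
    if not enabled(marking, inputs):
        return marking
    new = dict(marking)
    for place, need in inputs.items():
        new[place] -= need
    for place, made in outputs.items():
        new[place] = new.get(place, 0) + made
    return new

marking = {"inflammation": 1, "hepcidin": 0, "iron_absorption": 1}
# Transition 1: inflammation signal triggers hepcidin synthesis.
marking = fire(marking, {"inflammation": 1}, {"hepcidin": 1})
# Transition 2: hepcidin inhibits absorption (hepcidin acts catalytically here).
marking = fire(marking, {"hepcidin": 1, "iron_absorption": 1}, {"hepcidin": 1})
```

Analysis techniques such as invariant computation and reachability checking operate on exactly this firing rule, which is what makes the Petri net formulation convenient for simulating the regulation mechanisms.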

  18. Predicting a future lifetime through Box-Cox transformation.

    PubMed

    Yang, Z

    1999-09-01

    In predicting a future lifetime based on a sample of past lifetimes, the Box-Cox transformation method provides a simple and unified procedure that is shown in this article to meet or often outperform the corresponding frequentist solution in terms of coverage probability and average length of prediction intervals. Kullback-Leibler information and a second-order asymptotic expansion are used to justify the Box-Cox procedure. Extensive Monte Carlo simulations are also performed to evaluate the small-sample behavior of the procedure. Popular lifetime distributions, such as the Weibull, inverse Gaussian, and Birnbaum-Saunders, serve as illustrative examples. One important advantage of the Box-Cox procedure lies in its easy extension to linear model predictions, where exact frequentist solutions are often not available.
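
The basic procedure can be sketched as: transform the past lifetimes toward normality with a Box-Cox transformation, form a normal prediction interval on the transformed scale, and invert it. The Weibull sample and the plain z-interval below (omitting the second-order corrections the article discusses) are illustrative simplifications, not the article's exact procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lifetimes = rng.weibull(1.5, size=200) * 1000.0     # past lifetimes, e.g. hours

# Box-Cox transform with maximum-likelihood estimate of lambda.
transformed, lmbda = stats.boxcox(lifetimes)
mean, sd = transformed.mean(), transformed.std(ddof=1)

# 95% normal interval on the transformed scale.
z = stats.norm.ppf(0.975)
lo_t, hi_t = mean - z * sd, mean + z * sd

def inv_boxcox(y, lmbda):
    """Invert y = (x**lmbda - 1) / lmbda (assumes lmbda != 0)."""
    return (y * lmbda + 1.0) ** (1.0 / lmbda)

# Prediction interval for a future lifetime, back on the original scale.
lo, hi = inv_boxcox(lo_t, lmbda), inv_boxcox(hi_t, lmbda)
```

The same recipe extends naturally to linear model predictions by replacing the sample mean with a fitted regression prediction on the transformed scale, which is the extension advantage the abstract highlights.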

  19. Correlation of high energy muons with primary composition in extensive air shower

    NASA Technical Reports Server (NTRS)

    Chou, C.; Higashi, S.; Hiraoka, N.; Ozaki, S.; Sato, T.; Suwada, T.; Takahasi, T.; Umeda, H.

    1985-01-01

    An experimental investigation of high energy muons above 200 GeV in extensive air showers has been made to study high energy interactions and the primary composition of cosmic rays with energies in the range 10^14 to 10^15 eV. The muon energies are estimated from the burst sizes initiated by the muons in the rock, which are measured by four layers of proportional counters, each of area 5 x 2.6 sq m, placed 30 m.w.e. deep in the Funasaka tunnel vertically below the air shower array. These results are compared with Monte Carlo simulations based on the scaling model and the fireball model for two primary compositions, all proton and mixed.

  20. State variable theories based on Hart's formulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korhonen, M.A.; Hannula, S.P.; Li, C.Y.

    In this paper a review of the development of a state variable theory for nonelastic deformation is given. The physical and phenomenological basis of the theory and the constitutive equations describing macroplastic, microplastic, anelastic and grain boundary sliding enhanced deformation are presented. The experimental and analytical evaluation of different parameters in the constitutive equations is described in detail, followed by a review of the extensive experimental work on different materials. The technological aspects of the state variable approach are highlighted by examples of the simulative and predictive capabilities of the theory. Finally, a discussion of the general capabilities, limitations and future developments of the theory, and particularly the possible extensions to cover an even wider range of deformation or deformation-related phenomena, is presented.

  1. Neuronvisio: A Graphical User Interface with 3D Capabilities for NEURON.

    PubMed

    Mattioni, Michele; Cohen, Uri; Le Novère, Nicolas

    2012-01-01

    The NEURON simulation environment is a commonly used tool to perform electrical simulation of neurons and neuronal networks. The NEURON User Interface, based on the now discontinued InterViews library, provides some limited facilities to explore models and to plot their simulation results. Other limitations include the inability to generate a three-dimensional visualization and the lack of a standard means of saving simulation results or storing the model geometry with them. Neuronvisio (http://neuronvisio.org) aims to address these deficiencies through a set of well designed python APIs and provides an improved UI, allowing users to explore and interact with the model. Neuronvisio also facilitates access to previously published models, allowing users to browse, download, and locally run NEURON models stored in ModelDB. Neuronvisio uses the matplotlib library to plot simulation results and uses the HDF standard format to store simulation results. Neuronvisio can be viewed as an extension of NEURON, facilitating typical user workflows such as model browsing, selection, download, compilation, and simulation. The 3D viewer simplifies the exploration of complex model structure, while matplotlib permits the plotting of high-quality graphs. The newly introduced ability of saving numerical results allows users to perform additional analysis on their previous simulations.

  2. Biomechanical testing simulation of a cadaver spine specimen: development and evaluation study.

    PubMed

    Ahn, Hyung Soo; DiAngelo, Denis J

    2007-05-15

    This article describes a computer model of the cadaver cervical spine specimen and virtual biomechanical testing. To develop a graphics-oriented, multibody model of a cadaver cervical spine and to build a virtual laboratory simulator for the biomechanical testing using physics-based dynamic simulation techniques. Physics-based computer simulations apply the laws of physics to solid bodies with defined material properties. This technique can be used to create a virtual simulator for the biomechanical testing of a human cadaver spine. An accurate virtual model and simulation would complement tissue-based in vitro studies by providing a consistent test bed with minimal variability and by reducing cost. The geometry of cervical vertebrae was created from computed tomography images. Joints linking adjacent vertebrae were modeled as a triple-joint complex, comprised of intervertebral disc joints in the anterior region, 2 facet joints in the posterior region, and the surrounding ligament structure. A virtual laboratory simulation of an in vitro testing protocol was performed to evaluate the model responses during flexion, extension, and lateral bending. For kinematic evaluation, the rotation of motion segment unit, coupling behaviors, and 3-dimensional helical axes of motion were analyzed. The simulation results were in correlation with the findings of in vitro tests and published data. For kinetic evaluation, the forces of the intervertebral discs and facet joints of each segment were determined and visually animated. This methodology produced a realistic visualization of in vitro experiment, and allowed for the analyses of the kinematics and kinetics of the cadaver cervical spine. With graphical illustrations and animation features, this modeling technique has provided vivid and intuitive information.

  3. Physical Processes and Applications of the Monte Carlo Radiative Energy Deposition (MRED) Code

    NASA Astrophysics Data System (ADS)

    Reed, Robert A.; Weller, Robert A.; Mendenhall, Marcus H.; Fleetwood, Daniel M.; Warren, Kevin M.; Sierawski, Brian D.; King, Michael P.; Schrimpf, Ronald D.; Auden, Elizabeth C.

    2015-08-01

MRED is a Python-language scriptable computer application that simulates radiation transport. It is the computational engine for the on-line tool CRÈME-MC. MRED is based on C++ code from Geant4 with additional Fortran components to simulate electron transport and nuclear reactions with high precision. We provide a detailed description of the structure of MRED and the implementation of the simulation of physical processes used to simulate radiation effects in electronic devices and circuits. Extensive discussion and references are provided that illustrate the validation of models used to implement specific simulations of relevant physical processes. Several applications of MRED are summarized that demonstrate its ability to predict and describe basic physical phenomena associated with irradiation of electronic circuits and devices. These include effects from single-particle radiation (including both direct ionization and indirect ionization effects), dose enhancement effects, and displacement damage effects. MRED simulations have also helped to identify new single event upset mechanisms not previously observed by experiment, but since confirmed, including upsets due to muons and energetic electrons.

  4. Enhanced sampling simulations to construct free-energy landscape of protein-partner substrate interaction.

    PubMed

    Ikebe, Jinzen; Umezawa, Koji; Higo, Junichi

    2016-03-01

    Molecular dynamics (MD) simulations using all-atom and explicit solvent models provide valuable information on the detailed behavior of protein-partner substrate binding at the atomic level. As the power of computational resources increase, MD simulations are being used more widely and easily. However, it is still difficult to investigate the thermodynamic properties of protein-partner substrate binding and protein folding with conventional MD simulations. Enhanced sampling methods have been developed to sample conformations that reflect equilibrium conditions in a more efficient manner than conventional MD simulations, thereby allowing the construction of accurate free-energy landscapes. In this review, we discuss these enhanced sampling methods using a series of case-by-case examples. In particular, we review enhanced sampling methods conforming to trivial trajectory parallelization, virtual-system coupled multicanonical MD, and adaptive lambda square dynamics. These methods have been recently developed based on the existing method of multicanonical MD simulation. Their applications are reviewed with an emphasis on describing their practical implementation. In our concluding remarks we explore extensions of the enhanced sampling methods that may allow for even more efficient sampling.

  5. Electromagnetic panel deployment and retraction using the geomagnetic field in LEO satellite missions

    NASA Astrophysics Data System (ADS)

    Inamori, Takaya; Sugawara, Yoshiki; Satou, Yasutaka

    2015-12-01

Increasingly, spacecraft are installed with large-area structures that are extended and deployed post-launch. These extensible structures have been applied in several missions for power generation, thermal radiation, and solar propulsion. Here, we propose a deployment and retraction method using the electromagnetic force generated when the geomagnetic field interacts with electric current flowing on extensible panels. The panels are installed on a satellite in low Earth orbit. Specifically, electrical wires placed on the extensible panels generate magnetic moments, which interact with the geomagnetic field. The resulting repulsive and attractive forces enable panel deployment and retraction. In the proposed method, a satellite realizes structural deployment using simple electrical wires. Furthermore, the satellite can achieve not only deployment but also retraction, for avoiding damage from space debris and for agile attitude maneuvers. Moreover, because the proposed method realizes quasi-static deployment and retraction of the panels by electromagnetic forces, low impulsive force is exerted on the fragile panels. The electrical wires can also be used to detect the panel deployment and retraction and to generate a large magnetic moment for attitude control. The proposed method was assessed in numerical simulations based on multibody dynamics. Simulation results show that a small cubic satellite with a wire current of 25 AT deployed 4 panels (20 cm × 20 cm) in 500 s and retracted 4 panels in 100 s.
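    The driving physics summarized above reduces to the torque on a current loop in a magnetic field, τ = m × B, with magnetic moment magnitude m = (N·I)·A. The sketch below is a hedged illustration, not the authors' model: the panel orientation and the 30 µT field value are representative assumptions, while the 25 ampere-turns and the 20 cm × 20 cm panel size come from the abstract.

    ```python
    # Hedged illustration (not the authors' simulation): torque on a
    # current-carrying panel in the geomagnetic field, tau = m x B,
    # with magnetic moment m = (N*I) * A along the panel normal.
    def cross(a, b):
        """3-D vector cross product for plain tuples."""
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    ampere_turns = 25.0                  # N*I, the 25 AT quoted in the abstract
    area = 0.20 * 0.20                   # one 20 cm x 20 cm panel, in m^2
    m = (0.0, 0.0, ampere_turns * area)  # moment along assumed panel normal, A*m^2
    B = (3.0e-5, 0.0, 0.0)               # representative LEO field, ~30 uT (assumed)

    tau = cross(m, B)                    # torque available for deployment, N*m
    print(tau)
    ```

    The resulting torque is of order 10⁻⁵ N·m, which is why the abstract's deployment and retraction times are hundreds of seconds: the electromagnetic forcing is deliberately quasi-static.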

  6. SU-G-BRC-10: Feasibility of a Web-Based Monte Carlo Simulation Tool for Dynamic Electron Arc Radiotherapy (DEAR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodrigues, A; Wu, Q; Sawkey, D

    Purpose: DEAR is a radiation therapy technique utilizing synchronized motion of gantry and couch during delivery to optimize dose distribution homogeneity and penumbra for treatment of superficial disease. Dose calculation for DEAR is not yet supported by commercial TPSs. The purpose of this study is to demonstrate the feasibility of using a web-based Monte Carlo (MC) simulation tool (VirtuaLinac) to calculate dose distributions for a DEAR delivery. Methods: MC simulations were run through VirtuaLinac, which is based on the GEANT4 platform. VirtuaLinac utilizes detailed linac head geometry and material models, validated phase space files, and a voxelized phantom. The input was expanded to include an XML file for simulation of varying mechanical axes as a function of MU. A DEAR XML plan was generated, used in the MC simulation, and delivered on a TrueBeam in Developer Mode. Radiographic film wrapped on a cylindrical phantom (12.5 cm radius) measured dose at a depth of 1.5 cm, which was compared to the simulation results. Results: A DEAR plan was simulated using an energy of 6 MeV and a 3×10 cm² cut-out in a 15×15 cm² applicator for a delivery of a 90° arc. The resulting data were found to provide qualitative and quantitative evidence that the simulation platform could be used as the basis for DEAR dose calculations. The resulting unwrapped 2D dose distributions agreed well in the cross-plane direction along the arc, with field sizes of 18.4 and 18.2 cm and penumbrae of 1.9 and 2.0 cm for measurements and simulations, respectively. Conclusion: Preliminary feasibility of a DEAR delivery using a web-based MC simulation platform has been demonstrated. This tool will benefit treatment planning for DEAR as a benchmark for developing other model-based algorithms, allowing efficient optimization of trajectories, and quality assurance of plans without the need for extensive measurements.

  7. High speed data transmission coaxial-cable in the space communication system

    NASA Astrophysics Data System (ADS)

    Su, Haohang; Huang, Jing

    2018-01-01

    An effective method is demonstrated based on the scattering parameters of a high-speed 8-core coaxial cable measured with a vector network analyzer, and a semi-physical simulation is performed to obtain the eye diagram at different data transmission rates. The results can be applied to analyze the attenuation and distortion of signals passing through the coaxial cable at high frequency, and can inform the electromagnetic-compatibility design of high-speed data transmission systems.
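    The eye-diagram idea in this abstract can be sketched with a toy baseband link. This is a hedged illustration, not the authors' measured setup: the channel here is an assumed first-order low-pass filter (a crude stand-in for cable roll-off), and the bit rate, filter coefficient, and decision rule are all illustrative choices.

    ```python
    import random

    # Hedged sketch (assumptions ours): model the cable as a first-order
    # low-pass filter, then sample each received bit at its center; the
    # vertical eye opening shrinks as the channel bandwidth (alpha) drops.
    random.seed(1)
    spb = 20                              # samples per bit (assumed)
    bits = [random.randint(0, 1) for _ in range(200)]
    tx = [float(b) for b in bits for _ in range(spb)]

    alpha = 0.25                          # low-pass coefficient ~ channel bandwidth
    rx, y = [], 0.0
    for x in tx:
        y += alpha * (x - y)              # cable-like exponential roll-off
        rx.append(y)

    # Sample each bit at its center and measure the vertical eye opening:
    # the gap between the lowest received "1" and the highest received "0".
    centers = [rx[i * spb + spb // 2] for i in range(len(bits))]
    ones = [c for c, b in zip(centers, bits) if b == 1]
    zeros = [c for c, b in zip(centers, bits) if b == 0]
    opening = min(ones) - max(zeros)
    print(round(opening, 3))              # positive opening -> decodable eye
    ```

    Raising the data rate (fewer samples per bit) or lowering alpha closes the eye, which is exactly the degradation the S-parameter-based simulation in the abstract is meant to quantify.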

  8. Influence of Crown Biomass Estimators and Distribution on Canopy Fuel Characteristics in Ponderosa Pine Stands of the Black Hills

    Treesearch

    Tara Keyser; Frederick Smith

    2009-01-01

    Two determinants of crown fire hazard are canopy bulk density (CBD) and canopy base height (CBH). The Fire and Fuels Extension to the Forest Vegetation Simulator (FFE-FVS) is a model that predicts CBD and CBH. Currently, FFE-FVS accounts for neither geographic variation in tree allometries nor the nonuniform distribution of crown mass when one is estimating CBH and CBD...

  9. Simultaneous confidence bands for Cox regression from semiparametric random censorship.

    PubMed

    Mondal, Shoubhik; Subramanian, Sundarraman

    2016-01-01

    Cox regression is combined with semiparametric random censorship models to construct simultaneous confidence bands (SCBs) for subject-specific survival curves. Simulation results are presented to compare the performance of the proposed SCBs with SCBs based only on the standard Cox model. The new SCBs provide correct empirical coverage and are more informative. The proposed SCBs are illustrated with two real examples. An extension to handle missing censoring indicators is also outlined.

  10. A functional-structural model of rice linking quantitative genetic information with morphological development and physiological processes.

    PubMed

    Xu, Lifeng; Henke, Michael; Zhu, Jun; Kurth, Winfried; Buck-Sorlin, Gerhard

    2011-04-01

    Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype-phenotype model, we present here a three-dimensional functional-structural plant model (FSPM) of rice, in which some model parameters are controlled by functions describing the effect of main-effect and epistatic QTLs. The model simulates the growth and development of rice based on selected ecophysiological processes, such as photosynthesis (source process) and organ formation, growth and extension (sink processes). It was devised using GroIMP, an interactive modelling platform based on the Relational Growth Grammar formalism (RGG). RGG rules describe the course of organ initiation and extension resulting in final morphology. The link between the phenotype (as represented by the simulated rice plant) and the QTL genotype was implemented via a data interface between the rice FSPM and the QTLNetwork software, which computes predictions of QTLs from map data and measured trait data. Using plant height and grain yield, it is shown how QTL information for a given trait can be used in an FSPM, computing and visualizing the phenotypes of different lines of a mapping population. Furthermore, we demonstrate how modification of a particular trait feeds back on the entire plant phenotype via the physiological processes considered. We linked a rice FSPM to a quantitative genetic model, thereby employing QTL information to refine model parameters and visualizing the dynamics of development of the entire phenotype as a result of ecophysiological processes, including the trait(s) for which genetic information is available. Possibilities for further extension of the model, for example for the purposes of ideotype breeding, are discussed.

  11. A functional–structural model of rice linking quantitative genetic information with morphological development and physiological processes

    PubMed Central

    Xu, Lifeng; Henke, Michael; Zhu, Jun; Kurth, Winfried; Buck-Sorlin, Gerhard

    2011-01-01

    Background and Aims Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype–phenotype model, we present here a three-dimensional functional–structural plant model (FSPM) of rice, in which some model parameters are controlled by functions describing the effect of main-effect and epistatic QTLs. Methods The model simulates the growth and development of rice based on selected ecophysiological processes, such as photosynthesis (source process) and organ formation, growth and extension (sink processes). It was devised using GroIMP, an interactive modelling platform based on the Relational Growth Grammar formalism (RGG). RGG rules describe the course of organ initiation and extension resulting in final morphology. The link between the phenotype (as represented by the simulated rice plant) and the QTL genotype was implemented via a data interface between the rice FSPM and the QTLNetwork software, which computes predictions of QTLs from map data and measured trait data. Key Results Using plant height and grain yield, it is shown how QTL information for a given trait can be used in an FSPM, computing and visualizing the phenotypes of different lines of a mapping population. Furthermore, we demonstrate how modification of a particular trait feeds back on the entire plant phenotype via the physiological processes considered. Conclusions We linked a rice FSPM to a quantitative genetic model, thereby employing QTL information to refine model parameters and visualizing the dynamics of development of the entire phenotype as a result of ecophysiological processes, including the trait(s) for which genetic information is available. Possibilities for further extension of the model, for example for the purposes of ideotype breeding, are discussed. PMID:21247905

  12. The effects of chain length, embedded polar groups, pressure, and pore shape on structure and retention in reversed-phase liquid chromatography: molecular-level insights from Monte Carlo simulations.

    PubMed

    Rafferty, Jake L; Siepmann, J Ilja; Schure, Mark R

    2009-03-20

    Particle-based simulations using the configurational-bias and Gibbs ensemble Monte Carlo techniques are carried out to probe the effects of various chromatographic parameters on bonded-phase chain conformation, solvent penetration, and retention in reversed-phase liquid chromatography (RPLC). Specifically, we investigate the effects due to the length of the bonded-phase chains (C18, C8, and C1), the inclusion of embedded polar groups (amide and ether) near the base of the bonded-phase chains, the column pressure (1, 400, and 1000 atm), and the pore shape (planar slit pore versus cylindrical pore with a 60 Å diameter). These simulations utilize a bonded-phase coverage of 2.9 μmol/m² and a mobile phase containing methanol at a mole fraction of 33% (about 50% by volume). The simulations show that chain length, embedded polar groups, and pore shape significantly alter structural and retentive properties of the model RPLC system, whereas the column pressure has a relatively small effect. The simulation results are extensively compared to retention measurements. A molecular view of the RPLC retention mechanism emerges that is more complex than can be inferred from thermodynamic measurements.

  13. Data-Driven Software Framework for Web-Based ISS Telescience

    NASA Technical Reports Server (NTRS)

    Tso, Kam S.

    2005-01-01

    Software that enables authorized users to monitor and control scientific payloads aboard the International Space Station (ISS) from diverse terrestrial locations equipped with Internet connections is undergoing development. This software reflects a data-driven approach to distributed operations. A Web-based software framework leverages prior developments in Java and Extensible Markup Language (XML) to create portable code and portable data, to which one can gain access via Web-browser software on almost any common computer. Open-source software is used extensively to minimize cost; the framework also accommodates enterprise-class server software to satisfy needs for high performance and security. To accommodate the diversity of ISS experiments and users, the framework emphasizes openness and extensibility. Users can take advantage of available viewer software to create their own client programs according to their particular preferences, and can upload these programs for custom processing of data, generation of views, and planning of experiments. The same software system, possibly augmented with a subset of data and additional software tools, could be used for public outreach by enabling public users to replay telescience experiments, conduct their experiments with simulated payloads, and create their own client programs and other custom software.

  14. Non-destructive inspection in industrial equipment using robotic mobile manipulation

    NASA Astrophysics Data System (ADS)

    Maurtua, Iñaki; Susperregi, Loreto; Ansuategui, Ander; Fernández, Ane; Ibarguren, Aitor; Molina, Jorge; Tubio, Carlos; Villasante, Cristobal; Felsch, Torsten; Pérez, Carmen; Rodriguez, Jorge R.; Ghrissi, Meftah

    2016-05-01

    The MAINBOT project has developed service-robot-based applications to autonomously execute inspection tasks in extensive industrial plants, in equipment that is arranged horizontally (using ground robots) or vertically (using climbing robots). The industrial objective has been to provide a means to help measure several physical parameters at multiple points by autonomous robots, able to navigate and climb structures while handling non-destructive testing sensors. MAINBOT has validated the solutions in two solar thermal plants (cylindrical-parabolic collectors and central tower) that are very demanding from a mobile manipulation point of view, mainly due to their extension (e.g., a 50 MW thermal solar plant with 400 hectares, 400,000 mirrors, 180 km of absorber tubes, and a 140 m high tower), the variability of conditions (outdoor, day-night), safety requirements, etc. Once the technology was validated in simulation, the system was deployed in real setups and different validation tests were carried out. In this paper, two of the achievements related to the ground mobile inspection system are presented: (1) autonomous navigation: localization and planning algorithms to manage navigation over huge extensions, and (2) non-destructive inspection operations: thermography-based detection algorithms to provide automatic inspection abilities to the robots.

  15. Using Monte-Carlo Simulations to Study the Disk Structure in Cygnus X-1

    NASA Technical Reports Server (NTRS)

    Yao, Y.; Zhang, S. N.; Zhang, X. L.; Feng, Y. X.

    2002-01-01

    As the first dynamically determined black hole X-ray binary system, Cygnus X-1 has been studied extensively. However, its broad-band spectrum in the hard state observed with BeppoSAX is still not well understood. Besides the soft excess described by the multi-color disk model (MCD), the power-law component, and a broad excess feature above 10 keV (disk reflection component), there is also an additional soft component around 1 keV whose origin is currently unknown. We propose that the additional soft component is due to thermal Comptonization of soft disk photons by the warm plasma cloud just above the disk, i.e., a warm layer. We use a Monte-Carlo technique to simulate this Compton scattering process and build several table models based on our simulation results.

  16. Wildfire simulation using LES with synthetic-velocity SGS models

    NASA Astrophysics Data System (ADS)

    McDonough, J. M.; Tang, Tingting

    2016-11-01

    Wildland fires are becoming more prevalent and intense worldwide as climate change leads to warmer, drier conditions, and large-eddy simulation (LES) is receiving increasing attention for fire spread predictions as computing power continues to improve. We report results from wildfire simulations over general terrain employing implicit LES for solution of the incompressible Navier-Stokes (N.-S.) and thermal energy equations with the Boussinesq approximation, altered with Darcy, Forchheimer, and Brinkman extensions to represent forested regions as porous media with varying (in both space and time) porosity and permeability. We focus on subgrid-scale (SGS) behaviors computed with a synthetic-velocity model, a discrete dynamical system based on the poor man's N.-S. equations, and investigate the ability of this model to produce fire whirls (tornadoes of fire) at the (unresolved) SGS level.
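    The "poor man's Navier-Stokes" approach replaces an expensive SGS computation with a cheap discrete dynamical system of coupled logistic-type maps. The sketch below is an illustrative stand-in only: the general map structure follows published PMNS forms, but the coefficients and initial conditions here are arbitrary choices of ours, not values from this work.

    ```python
    # Illustrative stand-in (coefficients assumed, not the paper's values):
    # a coupled logistic-type map with the general structure of the
    # "poor man's Navier-Stokes" equations, iterated to generate bounded,
    # irregular fluctuations that can serve as a synthetic SGS velocity.
    def pmns_step(a, b, beta1=3.8, beta2=3.8, gamma1=0.1, gamma2=0.1):
        a_next = beta1 * a * (1.0 - a) - gamma1 * a * b
        b_next = beta2 * b * (1.0 - b) - gamma2 * a * b
        return a_next, b_next

    a, b = 0.5, 0.4
    series = []
    for _ in range(2000):
        a, b = pmns_step(a, b)
        series.append(a)

    mean = sum(series) / len(series)
    fluct = [s - mean for s in series]    # zero-mean synthetic SGS fluctuation
    print(min(series), max(series))       # trajectory stays bounded in (0, 1)
    ```

    The appeal of such a model is cost: one map iteration per grid cell per time step, versus solving additional transport equations, while still producing chaotic small-scale behavior for the resolved LES to feel.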

  17. Measuring technical and mathematical investigation of multiple reignitions at the switching of a motor using vacuum circuit breakers

    NASA Astrophysics Data System (ADS)

    Luxa, Andreas

    The necessary conditions in the switching system and vacuum circuit breaker for the occurrence of multiple re-ignitions and the accompanying effects were examined. The shape of the occurring voltages was determined in relation to other types of overvoltage. A phenomenological model of the arc, based on an extension of the Mayr arc equation, was used with the simulation program NETOMAC to compute the switching transients. Factors which affect the arc parameters were analyzed. The results were statistically verified by 3000 three-phase switching tests on 3 standard vacuum circuit breakers under realistic system conditions; the occurring overvoltage level was measured. Dimensioning criteria for motor simulation circuits in power plants were formulated on the basis of a theoretical equivalence analysis and experimental studies. The simulation model allows a sufficiently accurate estimation of all effects.

  18. A service life extension (SLEP) approach to operating aging aircraft beyond their original design lives

    NASA Astrophysics Data System (ADS)

    Pentz, Alan Carter

    With today's uncertain funding climate (including sequestration and continuing budget resolutions), decision makers face severe budgetary challenges to maintain dominance through all aspects of the Department of Defense (DoD). To meet war-fighting capabilities, the DoD continues to extend aircraft programs beyond their design service lives by up to ten years, and occasionally much more. The budget requires a new approach to traditional extension strategies (i.e., reuse, reset, and reclamation) for structural hardware. While extending service life without careful controls can present a safety concern, future operations planning does not consider how much risk is present when operating within sound structural principles. Traditional structural hardware extension methods drive increased costs. Decision makers often overlook the inherent damage tolerance and fatigue capability of structural components and rely on simple time- and flight-based cycle accumulation when determining aircraft retirement lives. This study demonstrates that decision makers should consider risk in addition to the current extension strategies. Through an evaluation of eight military aircraft programs and the application and simulation of F-18 turbine engine usage data, this dissertation shows that insight into actual aircraft mission data, consideration of fatigue capability, and service extension length are key factors to consider. Aircraft structural components, as well as many critical safety components and system designs, have a predefined level of conservatism and inherent damage tolerance. The methods applied in this study would apply to extensions of other critical structures such as bridges. Understanding how much damage tolerance is built into the design compared to the original design usage requirements presents the opportunity to manage systems based on risk. The study presents the sensitivity of these factors and recommends avenues for further research.

  19. Investigation in Simulated Vertical Descent of the Characteristics of a Cargo-Dropping Device having Extensible Rotating Blades

    NASA Technical Reports Server (NTRS)

    Stone, Ralph W., Jr.; Hultz, Burton E.

    1949-01-01

    The characteristics of a cargo-dropping device having extensible rotating blades as load-carrying surfaces have been studied in simulated vertical descent in the Langley 20-foot free-spinning tunnel. The investigation included tests to determine the variation in vertical sinking speed with load. A study of the blade characteristics and of the test results indicated a method of dynamically balancing the blades to permit proper functioning of the device.

  20. Thermal control/oxidation resistant coatings for titanium-based alloys

    NASA Technical Reports Server (NTRS)

    Clark, Ronald K.; Wallace, Terryl A.; Cunnington, George R.; Wiedemann, Karl E.

    1992-01-01

    Extensive research and development efforts have been expended toward development of thermal control and environmental protection coatings for NASP and generic hypersonic vehicle applications. The objective of the coatings development activities summarized here was to develop light-weight coatings for protecting advanced titanium alloys from oxidation in hypersonic vehicle applications. A number of new coating concepts have been evaluated. Coated samples were exposed to static oxidation tests at temperatures up to 1000 C using a thermogravimetric apparatus. Samples were also exposed to simulated hypersonic flight conditions for up to 10 hr to determine their thermal and chemical stability and catalytic efficiency. The emittance of samples was determined before and after exposure to simulated hypersonic flight conditions.

  1. Flame-Generated Vorticity Production in Premixed Flame-Vortex Interactions

    NASA Technical Reports Server (NTRS)

    Patnaik, G.; Kailasanath, K.

    2003-01-01

    In this study, we use detailed time-dependent, multi-dimensional numerical simulations to investigate the relative importance of the processes leading to flame-generated vorticity (FGV) in flame-vortex interactions in normal gravity and microgravity, and to determine whether the production of vorticity in flames in gravity is the same as that in zero gravity except for the contribution of the gravity term. The numerical simulations will be performed using the computational model developed at NRL, FLAME3D. FLAME3D is a parallel, multi-dimensional (either two- or three-dimensional) flame model based on FLIC2D, which has been used extensively to study the structure and stability of premixed hydrogen and methane flames.
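    For reference, the competing source terms can be read off the vorticity transport equation for variable-density flow; this is a standard textbook form reproduced here, not an equation taken from the paper:

    ```latex
    \frac{D\boldsymbol{\omega}}{Dt}
      = (\boldsymbol{\omega}\cdot\nabla)\mathbf{u}
      \;-\; \boldsymbol{\omega}\,(\nabla\cdot\mathbf{u})
      \;+\; \frac{\nabla\rho \times \nabla p}{\rho^{2}}
      \;+\; \nu\,\nabla^{2}\boldsymbol{\omega}
    ```

    The baroclinic term, (∇ρ × ∇p)/ρ², is the principal FGV source across the flame's density gradient. In normal gravity the hydrostatic contribution to ∇p adds a buoyant component to this term that vanishes in microgravity, which is precisely the comparison the simulations are designed to isolate.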

  2. Advanced power analysis methodology targeted to the optimization of a digital pixel readout chip design and its critical serial powering system

    NASA Astrophysics Data System (ADS)

    Marconi, S.; Orfanelli, S.; Karagounis, M.; Hemperek, T.; Christiansen, J.; Placidi, P.

    2017-02-01

    A dedicated power analysis methodology, based on modern digital design tools and integrated with the VEPIX53 simulation framework developed within RD53 collaboration, is being used to guide vital choices for the design and optimization of the next generation ATLAS and CMS pixel chips and their critical serial powering circuit (shunt-LDO). Power consumption is studied at different stages of the design flow under different operating conditions. Significant effort is put into extensive investigations of dynamic power variations in relation with the decoupling seen by the powering network. Shunt-LDO simulations are also reported to prove the reliability at the system level.

  3. A method for obtaining a statistically stationary turbulent free shear flow

    NASA Technical Reports Server (NTRS)

    Timson, Stephen F.; Lele, S. K.; Moser, R. D.

    1994-01-01

    The long-term goal of the current research is the study of Large-Eddy Simulation (LES) as a tool for aeroacoustics. New algorithms and developments in computer hardware are making possible a new generation of tools for aeroacoustic predictions, which rely on the physics of the flow rather than empirical knowledge. LES, in conjunction with an acoustic analogy, holds the promise of predicting the statistics of noise radiated to the far-field of a turbulent flow. LES's predictive ability will be tested through extensive comparison of acoustic predictions based on a Direct Numerical Simulation (DNS) and an LES of the same flow, as well as a priori testing of DNS results. The method presented here is aimed at allowing simulation of a turbulent flow field that is both simple and amenable to acoustic predictions. A free shear flow that is homogeneous in both the streamwise and spanwise directions and statistically stationary will be simulated using equations based on the Navier-Stokes equations with a small number of added terms. Studying a free shear flow eliminates the need to consider flow-surface interactions as an acoustic source. The homogeneous directions and the flow's statistically stationary nature greatly simplify the application of an acoustic analogy.

  4. Flight Dynamic Model Exchange using XML

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2002-01-01

    The AIAA Modeling and Simulation Technical Committee has worked for several years to develop a standard by which the information needed to develop physics-based models of aircraft can be specified. The purpose of this standard is to provide a well-defined set of information, definitions, data tables and axis systems so that cooperating organizations can transfer a model from one simulation facility to another with maximum efficiency. This paper proposes using an application of the eXtensible Markup Language (XML) to implement the AIAA simulation standard. The motivation and justification for using a standard such as XML is discussed. Necessary data elements to be supported are outlined. An example of an aerodynamic model as an XML file is given. This example includes definition of independent and dependent variables for function tables, definition of key variables used to define the model, and axis systems used. The final steps necessary for implementation of the standard are presented. Software to take an XML-defined model and import/export it to/from a given simulation facility is discussed, but not demonstrated. That would be the next step in final implementation of standards for physics-based aircraft dynamic models.
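    The kind of XML-encoded function table the abstract describes can be sketched as follows. This is an illustrative fragment only: the element names (`aeroModel`, `functionTable`, `breakpoints`, `values`) are hypothetical, not the AIAA standard's actual schema, and the lift data are made up.

    ```python
    import xml.etree.ElementTree as ET

    # Illustrative only: a made-up XML fragment for an aerodynamic function
    # table (element names and data are ours, not the AIAA standard itself).
    MODEL = """
    <aeroModel name="exampleLift">
      <variable id="alpha" units="deg" role="independent"/>
      <variable id="CL" units="nd" role="dependent"/>
      <functionTable dependent="CL" independent="alpha">
        <breakpoints>-4 0 4 8</breakpoints>
        <values>-0.2 0.1 0.4 0.7</values>
      </functionTable>
    </aeroModel>
    """

    root = ET.fromstring(MODEL)
    table = root.find("functionTable")
    alpha = [float(x) for x in table.find("breakpoints").text.split()]
    cl = [float(x) for x in table.find("values").text.split()]

    def lookup(a):
        """Linear interpolation in the imported function table."""
        for (a0, a1), (c0, c1) in zip(zip(alpha, alpha[1:]), zip(cl, cl[1:])):
            if a0 <= a <= a1:
                return c0 + (c1 - c0) * (a - a0) / (a1 - a0)
        raise ValueError("alpha outside table range")

    print(lookup(2.0))   # interpolated CL at alpha = 2 deg
    ```

    Because the table, its breakpoints, and the variable definitions all travel in one self-describing file, the receiving simulation facility only needs a generic importer like this rather than facility-specific conversion code, which is the efficiency argument the standard makes.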

  5. From conscious thought to automatic action: A simulation account of action planning.

    PubMed

    Martiny-Huenger, Torsten; Martiny, Sarah E; Parks-Stamm, Elizabeth J; Pfeiffer, Elisa; Gollwitzer, Peter M

    2017-10-01

    We provide a theoretical framework and empirical evidence for how verbally planning an action creates direct perception-action links and behavioral automaticity. We argue that planning actions in an if (situation)-then (action) format induces sensorimotor simulations (i.e., activity patterns reenacting the event in the sensory and motor brain areas) of the anticipated situation and the intended action. Due to their temporal overlap, these activity patterns become linked. Whenever the previously simulated situation is encountered, the previously simulated action is partially reactivated through spreading activation and thus more likely to be executed. In 4 experiments (N = 363), we investigated the relation between specific if-then action plans worded to activate simulations of elbow flexion versus extension movements and actual elbow flexion versus extension movements in a subsequent, ostensibly unrelated categorization task. As expected, linking a critical stimulus to intended actions that implied elbow flexion movements (e.g., grabbing it for consumption) subsequently facilitated elbow flexion movements upon encountering the critical stimulus. However, linking a critical stimulus to actions that implied elbow extension movements (e.g., pointing at it) subsequently facilitated elbow extension movements upon encountering the critical stimulus. Thus, minor differences (i.e., exchanging the words "point at" with "grab") in verbally formulated action plans (i.e., conscious thought) had systematic consequences on subsequent actions. The question of how conscious thought can induce stimulus-triggered action is illuminated by the provided theoretical framework and the respective empirical evidence, facilitating the understanding of behavioral automaticity and human agency. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Helios: a Multi-Purpose LIDAR Simulation Framework for Research, Planning and Training of Laser Scanning Operations with Airborne, Ground-Based Mobile and Stationary Platforms

    NASA Astrophysics Data System (ADS)

    Bechtold, S.; Höfle, B.

    2016-06-01

    In many technical domains of modern society, there is a growing demand for fast, precise and automatic acquisition of digital 3D models of a wide variety of physical objects and environments. Laser scanning is a popular and widely used technology to meet this demand, but it is also expensive and complex to use to its full potential. In some scenarios, however, operating a real laser scanner could be replaced by a computer simulation to save time and cost. This includes scenarios like teaching and training of laser scanning, development of new scanner hardware and scanning methods, or generation of artificial scan data sets to support the development of point cloud processing and analysis algorithms. To test the feasibility of this idea, we have developed a highly flexible laser scanning simulation framework named Heidelberg LiDAR Operations Simulator (HELIOS). HELIOS is implemented as a Java library and split up into a core component and multiple extension modules. Extensible Markup Language (XML) is used to define scanner, platform and scene models and to configure the behaviour of modules. Modules were developed and implemented for (1) loading of simulation assets and configuration (i.e. 3D scene models, scanner definitions, survey descriptions etc.), (2) playback of XML survey descriptions, (3) TLS survey planning (i.e. automatic computation of recommended scanning positions) and (4) interactive real-time 3D visualization of simulated surveys. As a proof of concept, we show the results of two experiments: first, a survey planning test in a scene that was specifically created to evaluate the quality of the survey planning algorithm; second, a simulated TLS scan of a crop field in a precision farming scenario. The results show that HELIOS fulfills its design goals.

  7. Oceanic dispersion of Fukushima-derived Cs-137 in the coastal, offshore, and open oceans simulated by multiple oceanic general circulation models

    NASA Astrophysics Data System (ADS)

    Kawamura, H.; Furuno, A.; Kobayashi, T.; In, T.; Nakayama, T.; Ishikawa, Y.; Miyazawa, Y.; Usui, N.

    2017-12-01

    To understand the concentration and amount of Fukushima-derived Cs-137 in the ocean, this study simulates the oceanic dispersion of Cs-137 using the oceanic dispersion model SEA-GEARN-FDM, developed at the Japan Atomic Energy Agency (JAEA), and multiple oceanic general circulation models. The Cs-137 deposition amounts at the sea surface were used as one source term in the oceanic dispersion simulations; these were estimated by atmospheric dispersion simulations with the Worldwide version of the System for Prediction of Environmental Emergency Dose Information version II (WSPEEDI-II), also developed at JAEA. The direct release from the Fukushima Daiichi Nuclear Power Plant into the ocean, based on in situ Cs-137 measurements, was used as the other source term. The simulated air Cs-137 concentrations qualitatively replicated those measured around the North Pacific. The accumulated Cs-137 ground deposition amount in the eastern Japanese Islands was consistent with that estimated by aircraft measurements. The oceanic dispersion simulations reproduced the measured Cs-137 concentrations relatively well in the coastal and offshore oceans during the first few months after the Fukushima disaster, and in the open ocean during the first year post-disaster. It was suggested that Cs-137 dispersed along the coast in the north-south direction during the first few months post-disaster, and was subsequently dispersed offshore by the Kuroshio Current and Kuroshio Extension. Mesoscale eddies associated with the Kuroshio Current and Kuroshio Extension played an important role in the dilution of Cs-137. The Cs-137 amounts were quantified in the coastal, offshore, and open oceans during the first year post-disaster. It was demonstrated that Cs-137 actively dispersed from the coastal and offshore oceans to the open ocean, and from the surface layer to the deeper layer in the North Pacific.

  8. Swarm Counter-Asymmetric-Threat (CAT) 6-DOF Dynamics Simulation

    DTIC Science & Technology

    2005-07-01

    NAWCWD TP 8593. Swarm Counter-Asymmetric-Threat (CAT) 6-DOF Dynamics Simulation, by James Bobinchak and Gary Hewer, Weapons and Energetics...mathematical models used in the swarm counter-asymmetric-threat (CAT) simulation and the results of extensive Monte Carlo simulations. The swarm CAT ...

  9. A Modular and Extensible Architecture Integrating Sensors, Dynamic Displays of Anatomy and Physiology, and Automated Instruction for Innovations in Clinical Education

    ERIC Educational Resources Information Center

    Nelson, Douglas Allen, Jr.

    2017-01-01

    Adoption of simulation in healthcare education has increased tremendously over the past two decades. However, the resources necessary to perform simulation are immense. Simulators are large capital investments and require specialized training for both instructors and simulation support staff to develop curriculum using the simulator and to use the…

  10. Improving the Acquisition of Basic Technical Surgical Skills with VR-Based Simulation Coupled with Computer-Based Video Instruction.

    PubMed

    Rojas, David; Kapralos, Bill; Dubrowski, Adam

    2016-01-01

    Next to practice, feedback is the most important variable in skill acquisition. Feedback can vary in content and in the way it is delivered. Health professions education research has extensively examined the effects of different feedback methodologies. In this paper we compared two types of knowledge of performance (KP) feedback: video-based KP feedback and computer-generated KP feedback. Results of this study showed that computer-generated performance feedback is more effective than video-based performance feedback. The combination of the two feedback methodologies provides trainees with a better understanding.

  11. Investigation of protein folding by coarse-grained molecular dynamics with the UNRES force field.

    PubMed

    Maisuradze, Gia G; Senet, Patrick; Czaplewski, Cezary; Liwo, Adam; Scheraga, Harold A

    2010-04-08

    Coarse-grained molecular dynamics simulations offer a dramatic extension of the time-scale of simulations compared to all-atom approaches. In this article, we describe the use of the physics-based united-residue (UNRES) force field, developed in our laboratory, in protein-structure simulations. We demonstrate that this force field offers about a 4000-fold extension of the simulation time scale; this feature arises both from averaging out the fast-moving degrees of freedom and from the reduced cost of energy and force calculations compared to all-atom approaches with explicit solvent. With massively parallel computers, microsecond folding simulation times of proteins containing about 1000 residues can be obtained in days. A straightforward application of canonical UNRES/MD simulations, demonstrated with the example of the N-terminal part of the B-domain of staphylococcal protein A (PDB code: 1BDD, a three-alpha-helix bundle), discerns the folding mechanism and determines kinetic parameters by parallel simulations of several hundred or more trajectories. Use of generalized-ensemble techniques, of which the multiplexed replica exchange method proved to be the most effective, enables us to compute thermodynamics of folding and carry out fully physics-based prediction of protein structure, in which the predicted structure is determined as a mean over the most populated ensemble below the folding-transition temperature. By using principal component analysis of the UNRES folding trajectories of the formin-binding protein WW domain (PDB code: 1E0L; a three-stranded antiparallel beta-sheet) and 1BDD, we identified representative structures along the folding pathways and demonstrated that only a few (low-indexed) principal components can capture the main structural features of a protein-folding trajectory; the potentials of mean force calculated along these essential modes exhibit multiple minima, as opposed to those along the remaining modes, which are unimodal. In addition, a comparison between the structures that are representative of the minima in the free-energy profile along the essential collective coordinates of protein folding (computed by principal component analysis) and the free-energy profile projected along the virtual-bond dihedral angles gamma of the backbone revealed the key residues involved in the transitions between the different basins of the folding free-energy profile, in agreement with existing experimental data for 1E0L.
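
    The principal component analysis used here to extract essential folding modes is, generically, an eigendecomposition of the coordinate covariance; below is a sketch on synthetic trajectory data (the trajectory itself is fabricated for illustration, not UNRES output):

```python
import numpy as np

def trajectory_pca(frames):
    """PCA of a trajectory: frames is (n_frames, n_coords).

    Returns eigenvalues (descending), the corresponding modes, and the
    projections of each frame onto the modes; projections onto the first
    few modes are the 'essential' collective coordinates.
    """
    centred = frames - frames.mean(axis=0)
    cov = centred.T @ centred / (len(frames) - 1)
    evals, evecs = np.linalg.eigh(cov)
    order = np.argsort(evals)[::-1]
    return evals[order], evecs[:, order], centred @ evecs[:, order]

rng = np.random.default_rng(1)
# synthetic trajectory: one slow large-amplitude mode plus small noise
slow = np.sin(np.linspace(0.0, 4.0 * np.pi, 500))
frames = np.outer(slow, rng.standard_normal(30)) + 0.05 * rng.standard_normal((500, 30))
evals, modes, proj = trajectory_pca(frames)
```

    On such data a single low-indexed component dominates the spectrum, which is the signature the authors exploit when only a few principal components capture the folding trajectory.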

  12. A novel coupling of noise reduction algorithms for particle flow simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zimoń, M.J., E-mail: malgorzata.zimon@stfc.ac.uk; James Weir Fluids Lab, Mechanical and Aerospace Engineering Department, The University of Strathclyde, Glasgow G1 1XJ; Reese, J.M.

    2016-09-15

    Proper orthogonal decomposition (POD) and its extension based on time-windows have been shown to greatly improve the effectiveness of recovering smooth ensemble solutions from noisy particle data. However, to successfully de-noise any molecular system, a large number of measurements still need to be provided. In order to achieve a better efficiency in processing time-dependent fields, we have combined POD with a well-established signal processing technique, wavelet-based thresholding. In this novel hybrid procedure, the wavelet filtering is applied within the POD domain and referred to as WAVinPOD. The algorithm exhibits promising results when applied to both synthetically generated signals and particle data. In this work, we compare the performance of our new approach with standard POD or wavelet analysis in extracting smooth profiles from noisy velocity and density fields. Numerical examples include molecular dynamics and dissipative particle dynamics simulations of unsteady force- and shear-driven liquid flows, as well as a phase separation phenomenon. Simulation results confirm that WAVinPOD preserves the dimensionality reduction obtained using POD, while improving its filtering properties through the sparse representation of data in a wavelet basis. This paper shows that WAVinPOD outperforms the other estimators for both synthetically generated signals and particle-based measurements, achieving a higher signal-to-noise ratio from a smaller number of samples. The new filtering methodology offers significant computational savings, particularly for multi-scale applications seeking to couple continuum information with atomistic models. It is the first time that a rigorous analysis has compared de-noising techniques for particle-based fluid simulations.
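
    The general idea of filtering inside the POD domain can be illustrated with a toy sketch (this is not the authors' WAVinPOD: plain soft-thresholding of the POD temporal coefficients stands in for the wavelet shrinkage, and the synthetic fields are hypothetical):

```python
import numpy as np

def pod_domain_denoise(snapshots, rank, thresh):
    """Project a noisy snapshot ensemble onto its leading POD modes and
    shrink the temporal coefficients (soft threshold) before reconstructing.

    snapshots: (n_space, n_time) array, one noisy field per column.
    """
    U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
    coeffs = s[:rank, None] * Vt[:rank]            # coefficients in the POD basis
    # soft-threshold the coefficients (stand-in for wavelet shrinkage)
    coeffs = np.sign(coeffs) * np.maximum(np.abs(coeffs) - thresh, 0.0)
    return U[:, :rank] @ coeffs

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 64)[:, None]
t = np.linspace(0.0, 1.0, 100)[None, :]
clean = np.sin(2 * np.pi * x) * np.cos(2 * np.pi * t) + 0.5 * np.cos(np.pi * x) * t
noisy = clean + 0.3 * rng.standard_normal(clean.shape)
denoised = pod_domain_denoise(noisy, rank=2, thresh=0.1)
```

    The truncation preserves the low-rank structure of the ensemble while the thresholding removes the noise that leaks into the retained coefficients, which is the division of labour the WAVinPOD procedure formalises with a wavelet basis.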

  13. Fusion of Optimized Indicators from Advanced Driver Assistance Systems (ADAS) for Driver Drowsiness Detection

    PubMed Central

    Daza, Iván G.; Bergasa, Luis M.; Bronte, Sebastián; Yebes, J. Javier; Almazán, Javier; Arroyo, Roberto

    2014-01-01

    This paper presents a non-intrusive approach for monitoring driver drowsiness using the fusion of several optimized indicators based on driver physical and driving performance measures, obtained from ADAS (Advanced Driver Assistance Systems) in simulated conditions. The paper is focused on real-time drowsiness detection technology rather than on long-term sleep/awake regulation prediction technology. We have developed our own vision system in order to obtain robust and optimized driver indicators that can be used in simulators and, in the future, in real environments. These indicators are principally based on driver physical and driving performance skills. The fusion of several indicators proposed in the literature is evaluated using a neural network and a stochastic optimization method to obtain the best combination. We propose a new method for ground-truth generation based on a supervised Karolinska Sleepiness Scale (KSS). An extensive evaluation of indicators, derived from trials on a third-generation simulator with several test subjects during different driving sessions, was performed. The main conclusions about the performance of single indicators and of their best combinations are included, along with future work derived from this study. PMID:24412904

  14. Background simulations for the wide field imager aboard the ATHENA X-ray Observatory

    NASA Astrophysics Data System (ADS)

    Hauf, Steffen; Kuster, Markus; Hoffmann, Dieter H. H.; Lang, Philipp-Michael; Neff, Stephan; Pia, Maria Grazia; Strüder, Lothar

    2012-09-01

    The ATHENA X-ray observatory was a European Space Agency project for an L-class mission. ATHENA was to be based upon a simplified IXO design, with the number of instruments and the focal length of the Wolter optics being reduced. One of the two instruments, the Wide Field Imager (WFI), was to be a DEPFET-based focal plane pixel detector, allowing high time- and spatial-resolution spectroscopy in the energy range between 0.1 and 15 keV. In order to fulfill the mission goals a high sensitivity is essential, especially to study faint and extended sources; thus a detailed understanding of the detector background induced by cosmic ray particles is crucial. During mission design, extensive Monte Carlo simulations are generally used to estimate the detector background in order to optimize shielding components and software rejection algorithms. The Geant4 toolkit is frequently the tool of choice for this purpose. Alongside validation of the simulation environment with XMM-Newton EPIC-pn and Space Shuttle STS-53 data, we present estimates for the ATHENA WFI cosmic-ray-induced background, including long-term activation, which demonstrate that DEPFET-technology based detectors are able to achieve the required sensitivity.

  15. Exact and efficient simulation of concordant computation

    NASA Astrophysics Data System (ADS)

    Cable, Hugo; Browne, Daniel E.

    2015-11-01

    Concordant computation is a circuit-based model of quantum computation for mixed states, which assumes that all correlations within the register are discord-free (i.e. the correlations are essentially classical) at every step of the computation. The question of whether concordant computation always admits efficient simulation by a classical computer was first considered by Eastin in arXiv:quant-ph/1006.4402v1, where an answer in the affirmative was given for circuits consisting only of one- and two-qubit gates. Building on this work, we develop the theory of classical simulation of concordant computation. We present a new framework for understanding such computations, argue that a larger class of concordant computations admits efficient simulation, and provide alternative proofs for the main results of arXiv:quant-ph/1006.4402v1 with an emphasis on the exactness of simulation, which is crucial for this model. We include a detailed analysis of the arithmetic complexity of solving equations in the simulation, as well as extensions to larger gates and qudits. We explore the limitations of our approach, and discuss the challenges faced in developing efficient classical simulation algorithms for all concordant computations.

  16. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.

    PubMed

    Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar

    2015-09-04

    The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.
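
    As a purely illustrative sketch of what such a file contains (element and attribute names follow the SED-ML Level 1 Version 2 specification as best recalled here, and the model source, timings and algorithm are hypothetical), a minimal document can be assembled with Python's standard library:

```python
import xml.etree.ElementTree as ET

NS = "http://sed-ml.org/sed-ml/level1/version2"
ET.register_namespace("", NS)

root = ET.Element(f"{{{NS}}}sedML", {"level": "1", "version": "2"})

models = ET.SubElement(root, f"{{{NS}}}listOfModels")
ET.SubElement(models, f"{{{NS}}}model", {
    "id": "model1",
    "language": "urn:sedml:language:sbml",
    "source": "model.xml",  # hypothetical SBML file
})

sims = ET.SubElement(root, f"{{{NS}}}listOfSimulations")
sim = ET.SubElement(sims, f"{{{NS}}}uniformTimeCourse", {
    "id": "sim1", "initialTime": "0",
    "outputStartTime": "0", "outputEndTime": "100", "numberOfPoints": "1000",
})
ET.SubElement(sim, f"{{{NS}}}algorithm", {"kisaoID": "KISAO:0000019"})

tasks = ET.SubElement(root, f"{{{NS}}}listOfTasks")
task = ET.SubElement(tasks, f"{{{NS}}}task", {
    "id": "task1", "modelReference": "model1", "simulationReference": "sim1",
})

document = ET.tostring(root, encoding="unicode")
```

    The repeated and chained procedures introduced in Level 1 Version 2 would be expressed with the repeatedTask construct; consult the specification for the normative schema, including the data-generator and output sections omitted here.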

  18. Urban water-quality modelling: implementing an extension to Multi-Hydro platform for real case studies

    NASA Astrophysics Data System (ADS)

    Hong, Yi; Giangola-Murzyn, Agathe; Bonhomme, Celine; Chebbo, Ghassan; Schertzer, Daniel

    2015-04-01

    During the last few years, the physically based and fully distributed numerical platform Multi-Hydro (MH) has been developed to simulate hydrological behaviours in urban/peri-urban areas (El-Tabach et al., 2009; Gires et al., 2013; Giangola-Murzyn et al., 2014). This hydro-dynamical platform is open-access and has a modular structure, which is designed to be easily scalable and transportable, in order to simulate the dynamics and complex interactions of the water cycle processes in urban or peri-urban environments (surface hydrology, urban groundwater infrastructures and infiltration). Each hydrological module relies on existing and widely validated open source models, such as the TREX model (Velleux, 2005) for the surface module, the SWMM model (Rossman, 2010) for the drainage module and the VS2DT model (Lappala et al., 1987) for the soil module. In our recent studies, an extension of MH has been set up by connecting the already available water-quality computational components among the different modules, to introduce pollutant transport modelling into the hydro-dynamical platform. In the two-dimensional surface module, the concentration of particles in the flow is expressed by a sediment advection equation, the settling of suspended particles is calculated with a simplified settling velocity formula, and the pollutant wash-off from a given land-use is represented as a mass rate of particle removal from the bottom boundary over time, based on transport capacity, which is computed by a modified form of the Universal Soil Loss Equation (USLE). Considering that the USLE was originally conceived to predict soil losses caused by runoff in agricultural areas, several adaptations were needed to use it for urban areas, such as alterations of the USLE parameters according to different criteria, the definition of an appropriate initial dust thickness corresponding to various land-uses, etc.
Concerning the drainage module, water quality routing within pipes assumes that the conduit behaves as a continuously stirred tank reactor. This extension of Multi-Hydro was tested on two peri-urban catchments located near Paris: Villecresnes (France, 0.7 km²) and Le Perreux-sur-Marne (France, 0.2 km²). As the Villecresnes catchment had been analyzed within several European projects (FP7 SMARTeST, KIC-Climate BlueGreenDream, Interreg RainGain), the robustness of the new extension of MH was first tested on this basin by comparing the water quantity simulation outcomes with results already obtained in previous works. Benefiting from the large datasets collected in the framework of the ANR (French National Agency for Research) Trafipollu project, the water quality modelling performance of the extension was then illustrated on the catchment of Le Perreux-sur-Marne.
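
    For reference, the classical USLE that the surface module modifies is a simple product of empirical factors; a sketch follows, with purely illustrative factor values (the function name and numbers are ours, not calibrated values from the study):

```python
def usle_soil_loss(R, K, LS, C, P):
    """Classical Universal Soil Loss Equation: A = R * K * LS * C * P.

    R: rainfall erosivity, K: soil erodibility, LS: slope length/steepness
    factor, C: cover management factor, P: support practice factor.
    Returns A, the average soil loss per unit area.
    """
    return R * K * LS * C * P

# illustrative values only
loss = usle_soil_loss(R=120.0, K=0.3, LS=1.2, C=0.1, P=1.0)
```

    The urban adaptations described above amount to re-interpreting C and the erodibility terms per land-use class and bounding the removable mass by the available dust thickness.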

  19. Efficient simulations of the aqueous bio-interface of graphitic nanostructures with a polarisable model

    NASA Astrophysics Data System (ADS)

    Hughes, Zak E.; Tomásio, Susana M.; Walsh, Tiffany R.

    2014-04-01

    To fully harness the enormous potential offered by interfaces between graphitic nanostructures and biomolecules, detailed connections between adsorbed conformations and adsorption behaviour are needed. To elucidate these links, a key approach, in partnership with experimental techniques, is molecular simulation. For this, a force-field (FF) that can appropriately capture the relevant physics and chemistry of these complex bio-interfaces, while allowing extensive conformational sampling, and also supporting inter-operability with known biological FFs, is a pivotal requirement. Here, we present and apply such a force-field, GRAPPA, designed to work with the CHARMM FF. GRAPPA is an efficiently implemented polarisable force-field, informed by extensive plane-wave DFT calculations using the revPBE-vdW-DF functional. GRAPPA adequately recovers the spatial and orientational structuring of the aqueous interface of graphene and carbon nanotubes, compared with more sophisticated approaches. We apply GRAPPA to determine the free energy of adsorption for a range of amino acids, identifying Trp, Tyr and Arg to have the strongest binding affinity and Asp to be a weak binder. The GRAPPA FF can be readily incorporated into mainstream simulation packages, and will enable large-scale polarisable biointerfacial simulations at graphitic interfaces, that will aid the development of biomolecule-mediated, solution-based graphene processing and self-assembly strategies.

  20. Power and sample-size estimation for microbiome studies using pairwise distances and PERMANOVA

    PubMed Central

    Kelly, Brendan J.; Gross, Robert; Bittinger, Kyle; Sherrill-Mix, Scott; Lewis, James D.; Collman, Ronald G.; Bushman, Frederic D.; Li, Hongzhe

    2015-01-01

    Motivation: The variation in community composition between microbiome samples, termed beta diversity, can be measured by pairwise distance based on either presence–absence or quantitative species abundance data. PERMANOVA, a permutation-based extension of multivariate analysis of variance to a matrix of pairwise distances, partitions within-group and between-group distances to permit assessment of the effect of an exposure or intervention (grouping factor) upon the sampled microbiome. Within-group distance and exposure/intervention effect size must be accurately modeled to estimate statistical power for a microbiome study that will be analyzed with pairwise distances and PERMANOVA. Results: We present a framework for PERMANOVA power estimation tailored to marker-gene microbiome studies that will be analyzed by pairwise distances, which includes: (i) a novel method for distance matrix simulation that permits modeling of within-group pairwise distances according to pre-specified population parameters; (ii) a method to incorporate effects of different sizes within the simulated distance matrix; (iii) a simulation-based method for estimating PERMANOVA power from simulated distance matrices; and (iv) an R statistical software package that implements the above. Matrices of pairwise distances can be efficiently simulated to satisfy the triangle inequality and incorporate group-level effects, which are quantified by the adjusted coefficient of determination, omega-squared (ω²). From simulated distance matrices, available PERMANOVA power or necessary sample size can be estimated for a planned microbiome study. Availability and implementation: http://github.com/brendankelly/micropower. Contact: brendank@mail.med.upenn.edu or hongzhe@upenn.edu PMID:25819674
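
    The PERMANOVA step itself is straightforward to sketch: below is a generic one-factor pseudo-F with a label-permutation p-value (this is not the authors' micropower package, and the toy distances are hypothetical):

```python
import itertools
import random

def permanova_pvalue(dist, labels, n_perm=199, seed=0):
    """One-factor PERMANOVA: pseudo-F on a pairwise distance matrix
    plus a label-permutation p-value (Anderson's partitioning of
    within-group and total sums of squared distances)."""
    n = len(labels)
    groups = set(labels)

    def pseudo_f(lab):
        ss_total = sum(dist[i][j] ** 2
                       for i, j in itertools.combinations(range(n), 2)) / n
        ss_within = 0.0
        for g in groups:
            idx = [i for i in range(n) if lab[i] == g]
            ss_within += sum(dist[i][j] ** 2
                             for i, j in itertools.combinations(idx, 2)) / len(idx)
        df_between, df_within = len(groups) - 1, n - len(groups)
        return ((ss_total - ss_within) / df_between) / (ss_within / df_within)

    f_obs = pseudo_f(labels)
    rng = random.Random(seed)
    hits = sum(pseudo_f(rng.sample(labels, n)) >= f_obs for _ in range(n_perm))
    return f_obs, (1 + hits) / (1 + n_perm)
```

    Power then follows by repeating this test over many distance matrices simulated at a fixed effect size and counting the fraction of p-values below the chosen significance threshold, which is what the framework above packages up.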

  1. Dissolution curve comparisons through the F(2) parameter, a Bayesian extension of the f(2) statistic.

    PubMed

    Novick, Steven; Shen, Yan; Yang, Harry; Peterson, John; LeBlond, Dave; Altan, Stan

    2015-01-01

    Dissolution (or in vitro release) studies constitute an important aspect of pharmaceutical drug development. One important use of such studies is justifying a biowaiver for post-approval changes, which requires establishing equivalence between the new and old product. We propose a statistically rigorous modeling approach for this purpose based on the estimation of what we refer to as the F2 parameter, an extension of the commonly used f2 statistic. A Bayesian test procedure is proposed in relation to a set of composite hypotheses that capture the similarity requirement on the absolute mean differences between test and reference dissolution profiles. Several examples are provided to illustrate the application. Results of our simulation study comparing the performance of f2 and the proposed method show that our Bayesian approach is comparable to, or in many cases superior to, the f2 statistic as a decision rule. Further useful extensions of the method, such as the use of continuous-time dissolution modeling, are considered.
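
    For context, the classical f2 statistic that the F2 parameter extends has a simple closed form, f2 = 50·log10(100/√(1 + MSD)), with MSD the mean squared difference between the profiles at matched time points; a minimal sketch (function name and dissolution percentages are illustrative):

```python
import math

def f2_similarity(ref, test):
    """Classical f2 similarity factor between two dissolution profiles.

    ref, test: mean percent dissolved at matched time points.
    f2 >= 50 is the usual regulatory similarity criterion.
    """
    if len(ref) != len(test) or not ref:
        raise ValueError("profiles must be non-empty and equally long")
    msd = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

# identical profiles give the maximum value of 100
print(f2_similarity([15, 40, 70, 90], [15, 40, 70, 90]))  # -> 100.0
```

    Note that f2 is a point statistic of the observed means; the Bayesian F2 parameter described above instead treats the underlying mean profiles as model quantities with posterior uncertainty.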

  2. A Comparative Study on Multifactor Dimensionality Reduction Methods for Detecting Gene-Gene Interactions with the Survival Phenotype

    PubMed Central

    Lee, Seungyeoun; Kim, Yongkang; Kwon, Min-Seok; Park, Taesung

    2015-01-01

    Genome-wide association studies (GWAS) have extensively analyzed single-SNP effects on a wide variety of common and complex diseases and found many genetic variants associated with diseases. However, a large portion of the genetic variants is still left unexplained. This missing heritability problem might be due to the analytical strategy that limits analyses to single SNPs. One possible approach to the missing heritability problem is to identify multi-SNP effects or gene-gene interactions. The multifactor dimensionality reduction (MDR) method has been widely used to detect gene-gene interactions based on constructive induction, classifying high-dimensional genotype combinations into a one-dimensional variable with two attributes, high risk and low risk, for case-control studies. Many modifications of MDR have been proposed, and the method has also been extended to the survival phenotype. In this study, we propose several extensions of MDR for the survival phenotype and compare the proposed extensions with earlier MDR methods through comprehensive simulation studies. PMID:26339630
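
    The constructive-induction step at the heart of MDR, pooling multi-locus genotype cells into high- and low-risk groups by comparing each cell's case:control ratio with the overall ratio, can be sketched for the binary case-control setting (toy data; the survival extensions proposed in the paper replace this ratio with survival statistics):

```python
from collections import Counter

def mdr_classify_cells(genotypes, status):
    """Collapse multi-SNP genotype combinations into high/low risk.

    genotypes: one tuple of genotype codes per subject.
    status: 1 for case, 0 for control.
    A cell is 'high' if its case:control ratio exceeds the overall ratio.
    """
    cases = Counter(g for g, s in zip(genotypes, status) if s == 1)
    controls = Counter(g for g, s in zip(genotypes, status) if s == 0)
    overall = sum(status) / (len(status) - sum(status))
    labels = {}
    for cell in set(cases) | set(controls):
        ratio = cases[cell] / max(controls[cell], 1)
        labels[cell] = "high" if ratio > overall else "low"
    return labels

# toy two-SNP data: genotype combination per subject, case/control status
geno = [(0, 1), (0, 1), (0, 1), (2, 2), (2, 2), (2, 2)]
stat = [1, 1, 0, 0, 0, 1]
labels = mdr_classify_cells(geno, stat)
```

    In full MDR this reduction is wrapped in cross-validation over candidate SNP subsets, with the best-classifying subset selected by prediction accuracy.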

  3. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Lemaréchal, Yannick; Bert, Julien; Falconnet, Claire; Després, Philippe; Valeri, Antoine; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris

    2015-07-01

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism, which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4-based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient-specific images and analytically modeled ¹²⁵I seeds used in LDR brachytherapy. In addition, dose scoring based on a track length estimator, including uncertainty calculations, was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400 × 250 × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms per 10⁶ simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform allows envisaging the use of Monte Carlo simulation-based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications.
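
    In its one-dimensional point-source form, the TG43 formalism mentioned above reduces to Ḋ(r) = S_K · Λ · (r₀/r)² · g(r); a sketch follows, with a made-up radial dose function table (real g(r) and Λ values are seed-model specific, and the function name is ours):

```python
def tg43_point_dose_rate(s_k, big_lambda, r, g_table, r0=1.0):
    """1D (point-source) TG43 dose rate: D(r) = S_K * Lambda * (r0/r)**2 * g(r).

    s_k: air-kerma strength, big_lambda: dose-rate constant,
    g_table: sorted (radius_cm, g) pairs, linearly interpolated.
    """
    (r_lo, g_lo), *rest = g_table
    for r_hi, g_hi in rest:
        if r <= r_hi:
            w = (r - r_lo) / (r_hi - r_lo)
            g = g_lo + w * (g_hi - g_lo)
            break
        r_lo, g_lo = r_hi, g_hi
    else:
        g = g_lo  # hold the last tabulated value beyond the table
    return s_k * big_lambda * (r0 / r) ** 2 * g

# illustrative radial dose function, normalised to 1 at r0 = 1 cm
g_table = [(0.5, 1.05), (1.0, 1.0), (2.0, 0.85), (5.0, 0.45)]
```

    The full 2D formalism adds geometry and anisotropy functions; the point of the platform above is precisely to replace these water-based tabulations with patient-specific Monte Carlo transport.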

  4. Impact of model structure on flow simulation and hydrological realism: from a lumped to a semi-distributed approach

    NASA Astrophysics Data System (ADS)

    Garavaglia, Federico; Le Lay, Matthieu; Gottardi, Fréderic; Garçon, Rémy; Gailhard, Joël; Paquet, Emmanuel; Mathevet, Thibault

    2017-08-01

    Model intercomparison experiments are widely used to investigate and improve hydrological model performance. However, a study based only on runoff simulation is not sufficient to discriminate between different model structures. Hence, there is a need to improve hydrological models for specific streamflow signatures (e.g., low and high flow) and multi-variable predictions (e.g., soil moisture, snow and groundwater). This study assesses the impact of model structure on flow simulation and hydrological realism using three versions of a hydrological model called MORDOR: the historical lumped structure and a revisited formulation available in both lumped and semi-distributed structures. In particular, the main goal of this paper is to investigate the relative impact of model equations and spatial discretization on flow simulation, snowpack representation and evapotranspiration estimation. Comparison of the models is based on an extensive dataset composed of 50 catchments located in French mountainous regions. The evaluation framework is founded on a multi-criterion split-sample strategy. All models were calibrated using an automatic optimization method based on an efficient genetic algorithm. The evaluation framework is enriched by the assessment of snow and evapotranspiration modeling against in situ and satellite data. The results showed that the new model formulations perform significantly better than the initial one in terms of the various streamflow signatures, snow and evapotranspiration predictions. The semi-distributed approach provides better calibration-validation performance for the snow cover area, snow water equivalent and runoff simulation, especially for nival catchments.
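
    Multi-criterion evaluation of flow simulations of the kind described above typically rests on goodness-of-fit scores such as the Nash-Sutcliffe efficiency; a minimal sketch of this standard criterion (not necessarily the exact MORDOR criterion set):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - sum((obs-sim)^2) / sum((obs-mean(obs))^2).
    1 is a perfect fit; 0 means no better than the observed mean flow."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([1.0, 3.0, 2.0, 4.0])
print(nse(obs, obs))                     # 1.0 for a perfect simulation
print(nse(obs, np.full(4, obs.mean())))  # 0.0 for the mean benchmark
```

    In a split-sample strategy, such scores are computed separately on the calibration and validation periods to expose overfitting of the model structure.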

  5. Closed Environment Module - Modularization and extension of the Virtual Habitat

    NASA Astrophysics Data System (ADS)

    Plötner, Peter; Czupalla, Markus; Zhukov, Anton

    2013-12-01

    The Virtual Habitat (V-HAB) is a Life Support System (LSS) simulation created to perform dynamic simulations of LSSs for future human spaceflight missions. It allows testing the robustness of LSSs by means of computer simulations, e.g. of worst-case scenarios.

  6. Real-time simulation of thermal shadows with EMIT

    NASA Astrophysics Data System (ADS)

    Klein, Andreas; Oberhofer, Stefan; Schätz, Peter; Nischwitz, Alfred; Obermeier, Paul

    2016-05-01

    Modern missile systems use infrared imaging for tracking or target detection algorithms. The development and validation processes for these missile systems need high-fidelity simulations capable of stimulating the sensors in real time with infrared image sequences from a synthetic 3D environment. The Extensible Multispectral Image Generation Toolset (EMIT) is a modular software library developed at MBDA Germany for the generation of physics-based infrared images in real time. EMIT is able to render radiance images in full 32-bit floating point precision using state-of-the-art computer graphics cards and advanced shader programs. An important function of an infrared image generation toolset is the simulation of thermal shadows, as these may cause matching errors in tracking algorithms. However, in real-time settings such as hardware-in-the-loop (HWIL) simulations of infrared seekers, thermal shadows are often neglected or precomputed, as they require a thermal balance calculation in four dimensions (3D geometry plus time, extending up to several hours into the past). In this paper we present the novel real-time thermal simulation of EMIT. Our thermal simulation is capable of simulating thermal effects in real-time environments, such as thermal shadows resulting from the occlusion of direct and indirect irradiance. We conclude with the practical use of EMIT in a missile HWIL simulation.

  7. User interface for ground-water modeling: ArcView extension

    USGS Publications Warehouse

    Tsou, Ming‐shu; Whittemore, Donald O.

    2001-01-01

    Numerical simulation for ground-water modeling often involves handling large input and output data sets. A geographic information system (GIS) provides an integrated platform to manage, analyze, and display disparate data and can greatly facilitate modeling efforts in data compilation, model calibration, and display of model parameters and results. Furthermore, GIS can be used to generate information for decision making through spatial overlay and processing of model results. ArcView is the most widely used Windows-based GIS software and provides a robust, user-friendly interface to facilitate data handling and display. An extension is an add-on program that provides additional specialized functions to ArcView. An ArcView interface for the ground-water flow and transport models MODFLOW and MT3D was built as an extension to facilitate modeling. The extension includes preprocessing of spatially distributed (point, line, and polygon) data for model input and postprocessing of model output. An object database is used to link user dialogs and model input files. The ArcView interface utilizes the capabilities of the 3D Analyst extension. Models can be calibrated automatically through the ArcView interface by external linking to programs such as PEST. The efficient pre- and postprocessing capabilities and calibration link were demonstrated for ground-water modeling in southwest Kansas.

  8. Simulations For Investigating the Contrast Mechanism of Biological Cells with High Frequency Scanning Acoustic Microscopy

    NASA Astrophysics Data System (ADS)

    Juntarapaso, Yada

    Scanning acoustic microscopy (SAM) is one of the most powerful techniques for nondestructive evaluation and a promising tool for characterizing the elastic properties of biological tissues and cells. Exploring single cells is important because single-cell biomechanics is connected to human cancer. SAM has been accepted and extensively utilized for acoustic cellular and tissue imaging, including measurements of the mechanical and elastic properties of biological specimens. SAM offers superb advantages: it is non-invasive, it can measure the mechanical properties of biological cells or tissues, and fixation or chemical staining is not necessary. The first objective of this research is to develop a program for simulating the images and contrast mechanism obtained by high-frequency SAM. Computer simulation algorithms based on MATLAB® were built for simulating the images and contrast mechanisms. The mechanical properties of HeLa and MCF-7 cells were computed from measurements of the output signal amplitude as a function of the distance from the focal plane of the acoustic lens, known as V(z). Algorithms for simulating V(z) responses involved calculating the reflectance function and were created based on ray theory and wave theory. The second objective is to design transducer arrays for SAM. Theoretical simulations of the high-frequency ultrasound array designs, based on the Field II© program, were performed to enhance image resolution and volumetric imaging capabilities. Phased-array beamforming and dynamic apodization and focusing were employed in the simulations. The new transducer array design will be state-of-the-art in improving the performance of SAM by electronic scanning, potentially providing 4-D images of the specimen.

  9. Thermal Stabilization of Dihydrofolate Reductase Using Monte Carlo Unfolding Simulations and Its Functional Consequences

    PubMed Central

    Whitney, Anna; Shakhnovich, Eugene I.

    2015-01-01

    Design of proteins with desired thermal properties is important for scientific and biotechnological applications. Here we developed a theoretical approach to predict the effect of mutations on protein stability from non-equilibrium unfolding simulations. We establish a relative measure based on apparent simulated melting temperatures that is independent of simulation length and, under certain assumptions, proportional to equilibrium stability, and we justify this theoretical development with extensive simulations and experimental data. Using our new method based on all-atom Monte Carlo unfolding simulations, we carried out a saturating mutagenesis of Dihydrofolate Reductase (DHFR), a key target of antibiotics and chemotherapeutic drugs. The method predicted more than 500 stabilizing mutations, several of which were selected for detailed computational and experimental analysis. We find a highly significant correlation of r = 0.65–0.68 between predicted and experimentally determined melting temperatures and unfolding denaturant concentrations for WT DHFR and 42 mutants. The correlation between the energy of the native state and the experimental denaturation temperature was much weaker, indicating the important role of entropy in protein stability. The most stabilizing point mutation was D27F, which is located in the active site of the protein, rendering it inactive. However, for the remaining mutations outside the active site, we observed a weak yet statistically significant positive correlation between thermal stability and catalytic activity, indicating the lack of a stability-activity tradeoff for DHFR. By combining stabilizing mutations predicted by our method, we created a highly stable, catalytically active E. coli DHFR mutant with a measured denaturation temperature 7.2 °C higher than WT. Prediction results for DHFR and several other proteins indicate that computational approaches based on unfolding simulations are useful as a general technique to discover stabilizing mutations. PMID:25905910

  10. New Approaches to Robust Confidence Intervals for Location: A Simulation Study.

    DTIC Science & Technology

    1984-06-01

    obtain a denominator for the test statistic. Those statistics based on location estimates derived from Hampel’s redescending influence function or v...defined an influence function for a test in terms of the behavior of its P-values when the data are sampled from a model distribution modified by point...proposal could be used for interval estimation as well as hypothesis testing, the extension is immediate. Once an influence function has been defined

  11. Using Reputation Systems and Non-Deterministic Routing to Secure Wireless Sensor Networks

    PubMed Central

    Moya, José M.; Vallejo, Juan Carlos; Fraga, David; Araujo, Álvaro; Villanueva, Daniel; de Goyeneche, Juan-Mariano

    2009-01-01

    Security in wireless sensor networks is difficult to achieve because of the resource limitations of the sensor nodes. We propose a trust-based decision framework for wireless sensor networks coupled with a non-deterministic routing protocol. Both provide a mechanism to effectively detect and confine common attacks, and, unlike previous approaches, allow bad reputation feedback to the network. This approach has been extensively simulated, obtaining good results, even for unrealistically complex attack scenarios. PMID:22412345

  12. Aspects of intelligent electronic device based switchgear control training model application

    NASA Astrophysics Data System (ADS)

    Bogdanov, Dimitar; Popov, Ivaylo

    2018-02-01

    The design of protection and control equipment for electrical power sector applications has seen extensive advances over the last several decades. Modern technologies offer a wide range of multifunctional, flexible applications, making the protection and control of facilities more sophisticated. At the same time, this technological advance makes simulators, training models, and laboratory tutorial equipment necessary for the adequate training of students and field specialists.

  13. Research in digital adaptive flight controllers

    NASA Technical Reports Server (NTRS)

    Kaufman, H.

    1976-01-01

    A design study of adaptive control logic suitable for implementation in modern airborne digital flight computers was conducted. Both explicit controllers, which directly utilize parameter identification, and implicit controllers, which do not require identification, were considered. Extensive analytical and simulation efforts resulted in the recommendation of two explicit digital adaptive flight controllers. Weighted least squares estimation procedures were interfaced with control logic developed using either optimal regulator theory or single-stage performance indices.
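
    Explicit adaptive controllers of the kind recommended above identify plant parameters online; a minimal sketch of one standard approach, exponentially weighted recursive least squares (an illustration of the technique class, not the report's exact algorithm):

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One exponentially weighted recursive least-squares step: update
    the parameter estimate theta (n x 1) and covariance P (n x n) from
    regressor phi (n,) and measurement y. lam < 1 discounts old data,
    letting the estimate track slowly varying flight dynamics."""
    phi = phi.reshape(-1, 1)
    err = y - (phi.T @ theta).item()                 # prediction error
    K = P @ phi / (lam + (phi.T @ P @ phi).item())   # gain vector
    theta = theta + K * err
    P = (P - K @ phi.T @ P) / lam
    return theta, P

# Identify a noiseless toy plant y = 2*u1 + 3*u2 from random regressors.
rng = np.random.default_rng(0)
theta, P = np.zeros((2, 1)), 1e6 * np.eye(2)
for _ in range(200):
    phi = rng.standard_normal(2)
    theta, P = rls_update(theta, P, phi, 2.0 * phi[0] + 3.0 * phi[1], lam=1.0)
```

    In an explicit scheme the identified theta would then be fed to the regulator design at each step; the implicit schemes mentioned above skip this identification stage entirely.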

  14. OpenSoC Fabric

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-08-21

    Recent advancements in technology scaling have shown a trend towards greater integration, with large-scale chips containing thousands of processors connected to memories and other I/O devices using non-trivial network topologies. Software simulation proves insufficient to study the tradeoffs in such complex systems due to slow execution time, whereas hardware RTL development is too time-consuming. We present OpenSoC Fabric, an on-chip network generation infrastructure which aims to provide a parameterizable and powerful on-chip network generator for evaluating future high performance computing architectures based on SoC technology. OpenSoC Fabric leverages a new hardware DSL, Chisel, which contains powerful abstractions provided by its base language, Scala, and generates both software (C++) and hardware (Verilog) models from a single code base. The OpenSoC Fabric infrastructure is modeled after existing state-of-the-art simulators, offers a large and powerful collection of configuration options, and follows object-oriented design and functional programming to make functionality extension as easy as possible.

  15. Extension of a Kolmogorov Atmospheric Turbulence Model for Time-Based Simulation Implementation

    NASA Technical Reports Server (NTRS)

    McMinn, John D.

    1997-01-01

    The development of any super/hypersonic aircraft requires the interaction of a wide variety of technical disciplines to maximize vehicle performance. For flight and engine control system design and development on this class of vehicle, realistic mathematical simulation models of atmospheric turbulence, including winds and the varying thermodynamic properties of the atmosphere, are needed. A model which has been tentatively selected by a government/industry group of flight and engine/inlet controls representatives working on the High Speed Civil Transport is one based on the Kolmogorov spectrum function. This report compares the Dryden and Kolmogorov turbulence forms, and describes enhancements that add functionality to the selected Kolmogorov model. These added features are: an altitude variation of the eddy dissipation rate based on Dryden data, the mapping of the eddy dissipation rate database onto a regular latitude and longitude grid, a method to account for flight at large vehicle attitude angles, and a procedure for transitioning smoothly across turbulence segments.

  16. Harmonic analysis of electrified railway based on improved HHT

    NASA Astrophysics Data System (ADS)

    Wang, Feng

    2018-04-01

    In this paper, the causes and harms of harmonics in current electric locomotive electrical systems are first studied and analyzed. Based on the characteristics of the harmonics in the electrical system, the Hilbert-Huang transform (HHT) method is introduced. Building on an in-depth analysis of the empirical mode decomposition method and the Hilbert transform method, the causes of and solutions to the endpoint effect and the modal aliasing problem in the HHT method are explored. To mitigate the endpoint effect of the HHT, this paper uses a point-symmetric extension method to extend the collected data; to address the modal aliasing problem, it preprocesses the signal with a high-frequency auxiliary harmonic method and gives an empirical formula for the high-frequency auxiliary harmonic. Finally, combining the suppression of the HHT endpoint effect and the modal aliasing problem, an improved HHT method is proposed and simulated in MATLAB. The simulation results show that the improved HHT is effective for the electric locomotive power supply system.
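
    The point-symmetric extension used to suppress the endpoint effect can be sketched as an odd reflection of the signal about each endpoint, so that the envelope splines of the EMD are anchored beyond the data ends (a generic form of the idea, not the paper's exact formula):

```python
import numpy as np

def point_symmetric_extend(x, n_ext):
    """Extend a signal at both ends by point symmetry about its
    endpoints (x[-k] -> 2*x[0] - x[k], and mirrored at the right end),
    a common pre-processing step against the EMD endpoint effect."""
    x = np.asarray(x, float)
    left = 2.0 * x[0] - x[1:n_ext + 1][::-1]
    right = 2.0 * x[-1] - x[-n_ext - 1:-1][::-1]
    return np.concatenate([left, x, right])

print(point_symmetric_extend([0.0, 1.0, 2.0, 3.0], 2))
# a linear ramp stays linear: [-2. -1.  0.  1.  2.  3.  4.  5.]
```

    Because the extension preserves local trends, the cubic-spline envelopes computed during sifting no longer swing wildly at the boundaries; the padding is discarded after each sifting step.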

  17. Exploration of an oculometer-based model of pilot workload

    NASA Technical Reports Server (NTRS)

    Krebs, M. J.; Wingert, J. W.; Cunningham, T.

    1977-01-01

    Potential relationships between eye behavior and pilot workload are discussed. A Honeywell Mark IIA oculometer was used to obtain the eye data in a fixed-base transport aircraft simulation facility. The data were analyzed to determine those parameters of eye behavior that were related to changes in the level of task difficulty of the simulated manual approach and landing on instruments. A number of trends and relationships between eye variables and pilot ratings were found. A preliminary equation was written based on the results of a stepwise linear regression. High variability in time spent on various instruments was related to differences in scanning strategy among pilots. A more detailed analysis of individual runs by individual pilots was performed to investigate the source of this variability more closely. Results indicated a high degree of intra-pilot variability in instrument scanning. No consistent workload-related trends were found. Pupil diameter, which had demonstrated a strong relationship to task difficulty, was extensively re-examined.

  18. Output-feedback control of combined sewer networks through receding horizon control with moving horizon estimation

    NASA Astrophysics Data System (ADS)

    Joseph-Duran, Bernat; Ocampo-Martinez, Carlos; Cembrano, Gabriela

    2015-10-01

    An output-feedback control strategy for pollution mitigation in combined sewer networks is presented. The proposed strategy provides the means to apply model-based predictive control to large-scale sewer networks, despite the lack of measurements at most of the network sewers. In previous works, the authors presented a hybrid linear control-oriented model for sewer networks together with the formulation of Optimal Control Problems (OCP) and State Estimation Problems (SEP). By iteratively solving these problems, preliminary Receding Horizon Control with Moving Horizon Estimation (RHC/MHE) results, based on flow measurements, were also obtained. In this work, the RHC/MHE algorithm has been extended to take into account both flow and water level measurements, and the resulting control loop has been extensively simulated to assess the system performance under different measurement availability scenarios and rain events. All simulations have been carried out using a detailed, physically based model of a real case-study network as virtual reality.

  19. Computational Model of Population Dynamics Based on the Cell Cycle and Local Interactions

    NASA Astrophysics Data System (ADS)

    Oprisan, Sorinel Adrian; Oprisan, Ana

    2005-03-01

    Our study bridges cellular (mesoscopic) level interactions and global population (macroscopic) dynamics of carcinoma. The morphological differences and transitions between well and smoothly defined benign tumors and tentacular malignant tumors suggest a theoretical analysis of tumor invasion based on the development of mathematical models exhibiting bifurcations of spatial patterns in the density of tumor cells. Our computational model views the most representative and clinically relevant features of oncogenesis as a fight between two distinct sub-systems: the immune system of the host and the neoplastic system. We implemented the neoplastic sub-system using a three-stage cell cycle: active, dormant, and necrotic. The second sub-system consists of cytotoxic active (effector) cells (EC), with a very broad phenotype ranging from NK cells to CTL cells, macrophages, etc. Based on extensive numerical simulations, we correlated the fractal dimensions for carcinoma, which could be obtained from tumor imaging, with the malignant stage. Our computational model was also able to simulate the effects of surgical, chemotherapeutic, and radiotherapeutic treatments.
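
    The fractal dimensions referred to above are commonly estimated from imaging by box counting: count the boxes of side s that intersect the tumor mask and fit log N(s) against log s. A generic sketch of that standard procedure, not the authors' code:

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8)):
    """Estimate the box-counting dimension D of a binary 2D mask from
    N(s) ~ s^(-D): tile the mask with boxes of side s, count occupied
    boxes, and fit a line in log-log space."""
    mask = np.asarray(mask, bool)
    counts = []
    for s in sizes:
        h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
        boxes = mask[:h, :w].reshape(h // s, s, w // s, s).any(axis=(1, 3))
        counts.append(boxes.sum())
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

print(box_counting_dimension(np.ones((16, 16))))  # filled square: D = 2
```

    Smooth benign boundaries give D near 1 for the contour (or 2 for the filled area), while tentacular malignant margins push the boundary dimension toward non-integer values.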

  20. Nanostructure, hydrogen bonding and rheology in choline chloride deep eutectic solvents as a function of the hydrogen bond donor.

    PubMed

    Stefanovic, Ryan; Ludwig, Michael; Webber, Grant B; Atkin, Rob; Page, Alister J

    2017-01-25

    Deep eutectic solvents (DESs) are mixtures of a salt and a molecular hydrogen bond donor, which form a eutectic liquid with a depressed melting point. Quantum mechanical molecular dynamics (QM/MD) simulations have been used to probe the 1:2 choline chloride-urea (ChCl:U), choline chloride-ethylene glycol (ChCl:EG) and choline chloride-glycerol (ChCl:Gly) DESs. DES nanostructure and the interactions between the ions are used to rationalise differences in DES eutectic point temperatures and viscosity. Simulations show that the structure of the bulk hydrogen bond donor is largely preserved for hydroxyl-based hydrogen bond donors (ChCl:Gly and ChCl:EG), resulting in a smaller melting point depression. By contrast, ChCl:U exhibits a well-established hydrogen bond network between the salt and hydrogen bond donor, leading to a larger melting point depression. This extensive hydrogen bond network in ChCl:U also leads to substantially higher viscosity, compared to ChCl:EG and ChCl:Gly. Of the two hydroxyl-based DESs, ChCl:Gly also exhibits a higher viscosity than ChCl:EG. This is attributed to the over-saturation of hydrogen bond donor groups in the ChCl:Gly bulk, which leads to more extensive hydrogen bond donor self-interaction and hence higher cohesive forces within the bulk liquid.

  1. Analytical Model for Estimating Terrestrial Cosmic Ray Fluxes Nearly Anytime and Anywhere in the World: Extension of PARMA/EXPACS.

    PubMed

    Sato, Tatsuhiko

    2015-01-01

    By extending our previously established model, here we present a new model called "PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0," which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth's atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R² values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research.

  2. Analytical Model for Estimating Terrestrial Cosmic Ray Fluxes Nearly Anytime and Anywhere in the World: Extension of PARMA/EXPACS

    PubMed Central

    Sato, Tatsuhiko

    2015-01-01

    By extending our previously established model, here we present a new model called “PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0,” which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth’s atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R² values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research. PMID:26674183

  3. Growing Actin Networks Form Lamellipodium and Lamellum by Self-Assembly

    PubMed Central

    Huber, Florian; Käs, Josef; Stuhrmann, Björn

    2008-01-01

    Many different cell types are able to migrate by formation of a thin actin-based cytoskeletal extension. Recently, it became evident that this extension consists of two distinct substructures, designated lamellipodium and lamellum, which differ significantly in their kinetic and kinematic properties as well as their biochemical composition. We developed a stochastic two-dimensional computer simulation that includes chemical reaction kinetics, G-actin diffusion, and filament transport to investigate the formation of growing actin networks in migrating cells. Model parameters were chosen based on experimental data or theoretical considerations. In this work, we demonstrate the system's ability to form two distinct networks by self-organization. We found a characteristic transition in mean filament length as well as a distinct maximum in depolymerization flux, both within the first 1–2 μm. The separation into two distinct substructures was found to be extremely robust with respect to initial conditions and variation of model parameters. We quantitatively investigated the complex interplay between ADF/cofilin and tropomyosin and propose a plausible mechanism that leads to spatial separation of, respectively, ADF/cofilin- or tropomyosin-dominated compartments. Tropomyosin was found to play an important role in stabilizing the lamellar actin network. Furthermore, the influence of filament severing and annealing on the network properties is explored, and simulation data are compared to existing experimental data. PMID:18708450
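
    Stochastic reaction kinetics of the kind this simulation builds on are classically sampled with Gillespie's algorithm; below is a minimal sketch for a toy two-pool actin scheme (the rates and the reaction set are placeholders for illustration, not the paper's model):

```python
import math
import random

def actin_ssa(g, f, k_on=0.05, k_off=0.01, t_end=200.0, seed=1):
    """Minimal Gillespie SSA for a toy scheme: polymerization
    G -> F with propensity k_on*G, depolymerization F -> G with
    propensity k_off*F. Returns the final (G, F) monomer counts."""
    rng = random.Random(seed)
    t = 0.0
    while True:
        a1, a2 = k_on * g, k_off * f          # reaction propensities
        a0 = a1 + a2
        if a0 == 0.0:
            break
        t += -math.log(1.0 - rng.random()) / a0   # exponential waiting time
        if t > t_end:
            break
        if rng.random() * a0 < a1:            # pick reaction by propensity
            g, f = g - 1, f + 1
        else:
            g, f = g + 1, f - 1
    return g, f

g, f = actin_ssa(100, 0)
print(g + f)  # total monomer number is conserved by construction: 100
```

    A spatial simulation like the one described above additionally partitions such reactions over a grid and couples them to G-actin diffusion and filament transport between compartments.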

  4. Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery.

    PubMed

    Sakamoto, Takuto

    2016-01-01

    Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level.

  5. Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery

    PubMed Central

    Sakamoto, Takuto

    2016-01-01

    Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level. PMID:26963526

  6. Numerical simulations of electrohydrodynamic evolution of thin polymer films

    NASA Astrophysics Data System (ADS)

    Borglum, Joshua Christopher

    Recently developed needleless electrospinning and electrolithography are two successful techniques that have been utilized extensively for low-cost, scalable, and continuous nano-fabrication. A rational understanding of the electrohydrodynamic principles underlying these nano-manufacturing methods is crucial to the fabrication of continuous nanofibers and patterned thin films. This research project formulates robust, high-efficiency finite-difference Fourier spectral methods to simulate the electrohydrodynamic evolution of thin polymer films. Two thin-film models were considered and refined: the first was based on reduced lubrication theory; the second further took into account the effects of solvent drying and dewetting of the substrate. A Fast Fourier Transform (FFT)-based spectral method was integrated into the finite-difference algorithms to solve the governing nonlinear partial differential equations quickly and accurately. The present methods have been used to examine the dependence of the evolving surface features of the thin films upon the model parameters. The present study can be used for fast, controllable nanofabrication.

  7. A Comprehensive Fluid Dynamic-Diffusion Model of Blood Microcirculation with Focus on Sickle Cell Disease

    NASA Astrophysics Data System (ADS)

    Le Floch, Francois; Harris, Wesley L.

    2009-11-01

    A novel methodology has been developed to address sickle cell disease, based on highly descriptive mathematical models for blood flow in the capillaries. Our investigations focus on the coupling between oxygen delivery and red blood cell dynamics, which is crucial to understanding sickle cell crises and is unique to this blood disease. The main part of our work is an extensive study of blood dynamics through simulations of red cells deforming within the capillary vessels, and relies on the use of a large mathematical system of equations describing oxygen transfer, blood plasma dynamics and red cell membrane mechanics. This model is expected to lead to the development of new research strategies for sickle cell disease. Our simulation model could be used not only to assess current researched remedies, but also to spur innovative research initiatives, based on our study of the physical properties coupled in sickle cell disease.

  8. Cooperative Monocular-Based SLAM for Multi-UAV Systems in GPS-Denied Environments †

    PubMed Central

    Guerra, Edmundo

    2018-01-01

    This work presents a cooperative monocular-based SLAM approach for multi-UAV systems that can operate in GPS-denied environments. The main contribution of the work is to show that, using visual information obtained from monocular cameras mounted onboard aerial vehicles flying in formation, the observability properties of the whole system are improved. This improvement is especially notable when compared with other related visual SLAM configurations. In order to improve the observability properties, some measurements of the relative distance between the UAVs are included in the system. These relative distances are also obtained from visual information. The proposed approach is theoretically validated by means of a nonlinear observability analysis. Furthermore, an extensive set of computer simulations is presented in order to validate the proposed approach. The numerical simulation results show that the proposed system is able to provide a good position and orientation estimation of the aerial vehicles flying in formation. PMID:29701722

  9. Mesoscale Particle-Based Model of Electrophoretic Deposition

    DOE PAGES

    Giera, Brian; Zepeda-Ruiz, Luis A.; Pascall, Andrew J.; ...

    2016-12-20

    In this paper, we present and evaluate a semiempirical particle-based model of electrophoretic deposition using extensive mesoscale simulations. We analyze particle configurations in order to observe how colloids accumulate at the electrode and arrange into deposits. In agreement with existing continuum models, the thickness of the deposit increases linearly in time during deposition. The resulting colloidal deposits exhibit a transition between highly ordered and bulk disordered regions that can give rise to an appreciable density gradient under certain simulated conditions. The overall volume fraction increases and falls within a narrow range as the driving force due to the electric field increases and repulsive intercolloidal interactions decrease. We postulate that ordering and stacking within the initial layer(s) dramatically impact the microstructure of the deposits. Finally, we find a combination of parameters, i.e., electric field and suspension properties, whose interplay enhances colloidal ordering beyond the commonly known approach of only reducing the driving force.

  10. Cooperative Monocular-Based SLAM for Multi-UAV Systems in GPS-Denied Environments.

    PubMed

    Trujillo, Juan-Carlos; Munguia, Rodrigo; Guerra, Edmundo; Grau, Antoni

    2018-04-26

    This work presents a cooperative monocular-based SLAM approach for multi-UAV systems that can operate in GPS-denied environments. The main contribution of the work is to show that, using visual information obtained from monocular cameras mounted onboard aerial vehicles flying in formation, the observability properties of the whole system are improved. This improvement is especially evident when compared with other related visual SLAM configurations. To improve the observability properties, measurements of the relative distances between the UAVs, also obtained from visual information, are included in the system. The proposed approach is theoretically validated by means of a nonlinear observability analysis. Furthermore, an extensive set of computer simulations is presented to validate the proposed approach. The numerical simulation results show that the proposed system is able to provide good position and orientation estimates of the aerial vehicles flying in formation.

  11. Experimental, Numerical, and Analytical Slosh Dynamics of Water and Liquid Nitrogen in a Spherical Tank

    NASA Technical Reports Server (NTRS)

    Storey, Jedediah Morse

    2016-01-01

    Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and improving performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many experimental and numerical studies of water slosh have been conducted. However, slosh data for cryogenic liquids is lacking. Water and cryogenic liquid nitrogen are used in various ground-based tests with a spherical tank to characterize damping, slosh mode frequencies, and slosh forces. A single ring baffle is installed in the tank for some of the tests. Analytical models for slosh modes, slosh forces, and baffle damping are constructed based on prior work. Select experiments are simulated using a commercial CFD software, and the numerical results are compared to the analytical and experimental results for the purposes of validation and methodology-improvement.

  12. Protocol Support for a New Satellite-Based Airspace Communication Network

    NASA Technical Reports Server (NTRS)

    Shang, Yadong; Hadjitheodosiou, Michael; Baras, John

    2004-01-01

    We recommend suitable transport protocols for an aeronautical network supporting Internet and data services via satellite. We study the characteristics of an aeronautical satellite hybrid network and focus on the problems that dramatically degrade transport protocol performance. We discuss various extensions to standard TCP that alleviate some of these performance problems. Through simulation, we identify those TCP implementations that can be expected to perform well. Based on the observation that it is difficult for an end-to-end solution to solve these problems effectively, we propose a new TCP-splitting protocol, termed the Aeronautical Transport Control Protocol (AeroTCP). The main idea of this protocol is to use a fixed window for flow control and a single duplicated acknowledgement (ACK) for fast recovery. Our simulation results show that AeroTCP can maintain higher utilization of the satellite link than end-to-end TCP, especially in high-BER environments.
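
    The abstract describes AeroTCP's fixed-window flow control only at a high level. As an illustration (not the actual protocol), a toy discrete-time model of a fixed sending window over a long-delay link might look like the sketch below; the slot-based timing and all names are invented for this sketch.

```python
from collections import deque

def fixed_window_transfer(n_packets, window, rtt_slots=4):
    """Toy discrete-time model of fixed-window flow control: each time
    slot, send while fewer than `window` packets are in flight; each
    ACK returns rtt_slots later. Returns the number of slots needed to
    deliver all packets."""
    next_seq, acked, slot = 0, 0, 0
    in_flight = deque()  # (sequence number, slot when its ACK arrives)
    while acked < n_packets:
        # process ACKs that have arrived by this slot
        while in_flight and in_flight[0][1] <= slot:
            in_flight.popleft()
            acked += 1
        # send as long as the fixed window allows
        while next_seq < n_packets and len(in_flight) < window:
            in_flight.append((next_seq, slot + rtt_slots))
            next_seq += 1
        slot += 1
    return slot
```

    In this model throughput is roughly window/RTT, which is why a fixed window sized to the satellite link's bandwidth-delay product keeps the link utilized where an adaptive end-to-end window would back off.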

  13. Effects of the Extended Water Retention Curve on Coupled Heat and Water Transport in the Vadose Zone

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Mohanty, B.

    2017-12-01

    Understanding and appropriately simulating coupled heat and water transfer in the shallow subsurface is of vital significance for accurate prediction of soil evaporation, which would improve the coupling between land surface and atmosphere. The theory of Philip and de Vries (1957) and its extensions (de Vries, 1958; Milly, 1982), although physically incomplete, are still used successfully to describe coupled heat and water movement in field soils. However, adsorptive water retention, which this theory and its extensions ignore when characterizing soil hydraulic parameters, was shown in a recent synthetic analysis (Mohanty and Yang, 2013) to be non-negligible for soil moisture and evaporation flux calculations in dry field soils. In this study, we comprehensively investigate the effects of the full-range water retention curve on coupled heat and water transport simulations, focusing on soil moisture content, temperature, and soil evaporative flux, based on analyses of two synthetic soils (sand and loam) and two field sites (Riverside, California and Audubon, Arizona). The synthetic sand and loam modeling showed that neglecting adsorptive water retention yields larger simulated soil water contents and lower evaporative fluxes than the full-range water retention curve model. The simulated temperature did not differ significantly with or without adsorptive water retention. The evaporation underestimate when neglecting adsorptive water retention is mainly caused by underprediction of the isothermal hydraulic conductivity. These synthetic findings were further corroborated by the experimental results from the Audubon, Arizona field site. The results from the Riverside, California site showed that the soil surface can become very dry even when the profile below the drying front is not, which further justifies employing the full-range water retention function even in scenarios that are not uniformly dry.

  14. Estimation of sum-to-one constrained parameters with non-Gaussian extensions of ensemble-based Kalman filters: application to a 1D ocean biogeochemical model

    NASA Astrophysics Data System (ADS)

    Simon, E.; Bertino, L.; Samuelsen, A.

    2011-12-01

    Combined state-parameter estimation in ocean biogeochemical models with ensemble-based Kalman filters is a challenging task due to the non-linearity of the models, the positiveness constraints that apply to the variables and parameters, and the resulting non-Gaussian distributions of the variables. Furthermore, these models are sensitive to numerous parameters that are poorly known. Previous work [1] demonstrated that Gaussian anamorphosis extensions of ensemble-based Kalman filters are relevant tools for combined state-parameter estimation in such a non-Gaussian framework. In this study, we focus on the estimation of the grazing preference parameters of zooplankton species. These parameters model the diet of zooplankton species among phytoplankton species and detritus; they are positive and sum to one. Because the sum-to-one constraint cannot be handled directly by ensemble-based Kalman filters, a reformulation of the parameterization is proposed. We investigate two types of changes of variables for the estimation of sum-to-one constrained parameters. The first, based on Gelman [2], leads to the estimation of normally distributed parameters. The second, based on the representation of the unit sphere in spherical coordinates, leads to the estimation of parameters with bounded (triangular or uniform) distributions. These formulations are illustrated and discussed in the framework of twin experiments realized in the 1D coupled model GOTM-NORWECOM with Gaussian anamorphosis extensions of the deterministic ensemble Kalman filter (DEnKF). [1] Simon E., Bertino L.: Gaussian anamorphosis extension of the DEnKF for combined state and parameter estimation: application to a 1D ocean ecosystem model. Journal of Marine Systems, 2011. doi:10.1016/j.jmarsys.2011.07.007 [2] Gelman A.: Method of Moments Using Monte Carlo Simulation. Journal of Computational and Graphical Statistics, 4(1), 36-54, 1995.
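
    The abstract only names the two reparameterizations; the sketch below illustrates the general idea of mapping unconstrained or angular variables onto the unit simplex so that a Kalman-type filter can update them freely. The softmax variant is a common stand-in for Gelman-style transforms, and the spherical version follows the unit-sphere construction; both are illustrative, not the paper's exact formulas.

```python
import numpy as np

def softmax(z):
    """Map unconstrained reals z to strictly positive weights summing to one."""
    e = np.exp(z - z.max())          # subtract max for numerical stability
    return e / e.sum()

def spherical_to_simplex(theta):
    """Map n-1 angles to n simplex weights: write a unit vector in
    spherical coordinates and square its components, which sum to one."""
    n = len(theta) + 1
    x = np.ones(n)
    for i, t in enumerate(theta):
        x[i] *= np.cos(t)
        x[i + 1:] *= np.sin(t)
    return x ** 2

# Example: three grazing preferences from two unconstrained parameterizations
p = softmax(np.array([0.2, -1.0, 0.5]))
q = spherical_to_simplex(np.array([0.7, 1.1]))
```

    In either case the filter estimates the unconstrained variables (z or theta), and the sum-to-one preferences are recovered deterministically by the forward map.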

  15. Scalable Entity-Based Modeling of Population-Based Systems, Final LDRD Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cleary, A J; Smith, S G; Vassilevska, T K

    2005-01-27

    The goal of this project has been to develop tools, capabilities and expertise in the modeling of complex population-based systems via scalable entity-based modeling (EBM). Our initial focal application domain has been the dynamics of large populations exposed to disease-causing agents, a topic of interest to the Department of Homeland Security in the context of bioterrorism. In the academic community, discrete simulation technology based on individual entities has shown initial success, but the technology has not been scaled to the problem sizes or computational resources of LLNL. Our developmental emphasis has been on the extension of this technology to parallel computers and maturation of the technology from an academic to a lab setting.

  16. Spacecraft applications of advanced global positioning system technology

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This is the final report on the Texas Instruments Incorporated (TI) simulations study of Spacecraft Application of Advanced Global Positioning System (GPS) Technology. This work was conducted for the NASA Johnson Space Center (JSC) under contract NAS9-17781. GPS, in addition to its baselined capability as a highly accurate spacecraft navigation system, can provide traffic control, attitude control, structural control, and uniform time base. In Phase 1 of this program, another contractor investigated the potential of GPS in these four areas and compared GPS to other techniques. This contract was for the Phase 2 effort, to study the performance of GPS for these spacecraft applications through computer simulations. TI had previously developed simulation programs for GPS differential navigation and attitude measurement. These programs were adapted for these specific spacecraft applications. In addition, TI has extensive expertise in the design and production of advanced GPS receivers, including space-qualified GPS receivers. We have drawn on this background to augment the simulation results in the system level overview, which is Section 2 of this report.

  17. Test of Hadronic Interaction Models with the KASCADE Hadron Calorimeter

    NASA Astrophysics Data System (ADS)

    Milke, J.; KASCADE Collaboration

    The interpretation of extensive air shower (EAS) measurements often requires comparison with EAS simulations based on high-energy hadronic interaction models. These interaction models have to extrapolate into kinematical regions and energy ranges beyond the limits of present accelerators. Therefore, it is necessary to test whether these models are able to describe the EAS development in a consistent way. By measuring the hadronic, electromagnetic, and muonic parts of an EAS simultaneously, the KASCADE experiment is well suited to checking the models. For the EAS simulations the program CORSIKA, with several hadronic event generators implemented, is used. Different hadronic observables, e.g. hadron number, energy spectrum, and lateral distribution, are investigated, as well as their correlations with the electromagnetic and muonic shower sizes. By comparing measurements and simulations, the consistency of the description of the EAS development is checked. First results with the new interaction model NEXUS and version II.5 of the model DPMJET, recently included in CORSIKA, are presented and compared with QGSJET simulations.

  18. (Invited) Comprehensive Assessment of Oxide Memristors As Post-CMOS Memory and Logic Devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, X.; Mamaluy, D.; Cyr, E. C.

    As CMOS technology approaches the end of its scaling, oxide-based memristors have become one of the leading candidates for post-CMOS memory and logic devices. To facilitate the understanding of physical switching mechanisms and accelerate experimental development of memristors, we have developed a three-dimensional fully-coupled electrical and thermal transport model, which captures all the important processes that drive memristive switching and is applicable to simulating a wide range of memristors. The model is applied to simulate the RESET and SET switching in a 3D filamentary TaOx memristor. Extensive simulations show that the switching dynamics of the bipolar device is determined by thermally-activated, field-dominant processes: with Joule heating, the raised temperature enables the movement of oxygen vacancies, and field drift dominates the overall motion of vacancies. Simulated current-voltage hysteresis and device resistance profiles as a function of time and voltage during RESET and SET switching show good agreement with experimental measurements.

  19. (Invited) Comprehensive Assessment of Oxide Memristors As Post-CMOS Memory and Logic Devices

    DOE PAGES

    Gao, X.; Mamaluy, D.; Cyr, E. C.; ...

    2016-05-10

    As CMOS technology approaches the end of its scaling, oxide-based memristors have become one of the leading candidates for post-CMOS memory and logic devices. To facilitate the understanding of physical switching mechanisms and accelerate experimental development of memristors, we have developed a three-dimensional fully-coupled electrical and thermal transport model, which captures all the important processes that drive memristive switching and is applicable to simulating a wide range of memristors. The model is applied to simulate the RESET and SET switching in a 3D filamentary TaOx memristor. Extensive simulations show that the switching dynamics of the bipolar device is determined by thermally-activated, field-dominant processes: with Joule heating, the raised temperature enables the movement of oxygen vacancies, and field drift dominates the overall motion of vacancies. Simulated current-voltage hysteresis and device resistance profiles as a function of time and voltage during RESET and SET switching show good agreement with experimental measurements.

  20. A parallel simulated annealing algorithm for standard cell placement on a hypercube computer

    NASA Technical Reports Server (NTRS)

    Jones, Mark Howard

    1987-01-01

    A parallel version of a simulated annealing algorithm is presented which is targeted to run on a hypercube computer. A strategy for mapping the cells in a two-dimensional area of a chip onto processors in an n-dimensional hypercube is proposed such that both small- and large-distance moves can be applied. Two types of moves are allowed: cell exchanges and cell displacements. The computation of the cost function in parallel among all the processors in the hypercube is described, along with a distributed data structure that needs to be stored in the hypercube to support parallel cost evaluation. A novel tree broadcasting strategy is used extensively in the algorithm for updating cell locations in the parallel environment. Studies of the performance of the algorithm on example industrial circuits show that it is faster and gives better final placement results than uniprocessor simulated annealing algorithms. An improved uniprocessor algorithm is also proposed, based on insights gained from parallelizing the simulated annealing algorithm.
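
    The abstract names the move set (cell exchanges and displacements) and the parallel cost evaluation but gives no code. A minimal serial version of the underlying Metropolis loop, restricted to exchange moves and with invented parameter names, is sketched below; the paper's contribution is distributing the cost evaluation and cell updates across hypercube nodes, which this sketch does not attempt.

```python
import math, random

def anneal_placement(cost, positions, T0=10.0, alpha=0.95, sweeps=100):
    """Serial simulated annealing over cell-exchange moves.
    `cost` maps a placement (a list of cell labels by slot) to a scalar."""
    T = T0
    cur = cost(positions)
    best, best_pos = cur, positions[:]
    for _ in range(sweeps):
        for _ in range(len(positions)):
            i, j = random.sample(range(len(positions)), 2)
            positions[i], positions[j] = positions[j], positions[i]  # exchange move
            new = cost(positions)
            # Metropolis criterion: always accept improvements; accept a
            # worse placement with probability exp(-dC/T)
            if new <= cur or random.random() < math.exp(-(new - cur) / T):
                cur = new
                if cur < best:
                    best, best_pos = cur, positions[:]
            else:
                positions[i], positions[j] = positions[j], positions[i]  # undo
        T *= alpha  # geometric cooling schedule
    return best_pos, best

# Toy placement problem: each cell "wants" the slot matching its label
random.seed(0)
start = list(range(8))[::-1]
wirelength = lambda p: sum(abs(c - s) for s, c in enumerate(p))
placed, score = anneal_placement(wirelength, start[:])
```

    The cooling schedule and move mix are the knobs the paper tunes; the acceptance rule itself is unchanged between serial and parallel versions.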

  1. Parallel Discrete Molecular Dynamics Simulation With Speculation and In-Order Commitment*†

    PubMed Central

    Khan, Md. Ashfaquzzaman; Herbordt, Martin C.

    2011-01-01

    Discrete molecular dynamics simulation (DMD) uses simplified and discretized models enabling simulations to advance by event rather than by timestep. DMD is an instance of discrete event simulation and so is difficult to scale: even in this multi-core era, all reported DMD codes are serial. In this paper we discuss the inherent difficulties of scaling DMD and present our method of parallelizing DMD through event-based decomposition. Our method is microarchitecture inspired: speculative processing of events exposes parallelism, while in-order commitment ensures correctness. We analyze the potential of this parallelization method for shared-memory multiprocessors. Achieving scalability required extensive experimentation with scheduling and synchronization methods to mitigate serialization. The speed-up achieved for a variety of system sizes and complexities is nearly 6× on an 8-core and over 9× on a 12-core processor. We present and verify analytical models that account for the achieved performance as a function of available concurrency and architectural limitations. PMID:21822327

  2. Parallel Discrete Molecular Dynamics Simulation With Speculation and In-Order Commitment.

    PubMed

    Khan, Md Ashfaquzzaman; Herbordt, Martin C

    2011-07-20

    Discrete molecular dynamics simulation (DMD) uses simplified and discretized models enabling simulations to advance by event rather than by timestep. DMD is an instance of discrete event simulation and so is difficult to scale: even in this multi-core era, all reported DMD codes are serial. In this paper we discuss the inherent difficulties of scaling DMD and present our method of parallelizing DMD through event-based decomposition. Our method is microarchitecture inspired: speculative processing of events exposes parallelism, while in-order commitment ensures correctness. We analyze the potential of this parallelization method for shared-memory multiprocessors. Achieving scalability required extensive experimentation with scheduling and synchronization methods to mitigate serialization. The speed-up achieved for a variety of system sizes and complexities is nearly 6× on an 8-core and over 9× on a 12-core processor. We present and verify analytical models that account for the achieved performance as a function of available concurrency and architectural limitations.

  3. A standard library for modeling satellite orbits on a microcomputer

    NASA Astrophysics Data System (ADS)

    Beutel, Kenneth L.

    1988-03-01

    Introductory students of astrodynamics and the space environment are required to have a fundamental understanding of the kinematic behavior of satellite orbits. This thesis develops a standard library that contains the basic formulas for modeling earth orbiting satellites. This library is used as a basis for implementing a satellite motion simulator that can be used to demonstrate orbital phenomena in the classroom. Surveyed are the equations of orbital elements, coordinate systems and analytic formulas, which are made into a standard method for modeling earth orbiting satellites. The standard library is written in the C programming language and is designed to be highly portable between a variety of computer environments. The simulation draws heavily on the standards established by the library to produce a graphics-based orbit simulation program written for the Apple Macintosh computer. The simulation demonstrates the utility of the standard library functions but, because of its extensive use of the Macintosh user interface, is not portable to other operating systems.
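
    A core routine in any such library is a solver for Kepler's equation. The thesis library is written in C; the short Python sketch below (function names are mine) shows the standard Newton iteration such a library would contain.

```python
import math

def solve_kepler(M, e, tol=1e-12):
    """Eccentric anomaly E from mean anomaly M (radians) and eccentricity
    e (elliptical case, 0 <= e < 1), by Newton's method on Kepler's
    equation M = E - e*sin(E)."""
    E = M if e < 0.8 else math.pi  # common starting guess
    for _ in range(50):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

def true_anomaly(E, e):
    """True anomaly from eccentric anomaly via the half-angle formula."""
    return 2.0 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                            math.sqrt(1 - e) * math.cos(E / 2))
```

    Given the orbital elements, chaining these two functions converts a time-of-flight (through the mean anomaly) into an angular position on the orbit, which is the kinematic behavior the simulator demonstrates.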

  4. ARC-1969-AC-42137

    NASA Image and Video Library

    1969-02-05

    Height-Control Test Apparatus (HICONTA) Simulator mounted to the exterior of the 40x80ft W.T. Building N-221B and provided extensive vertical motion simulating airplanes, helicopter and V/STOL aircraft

  5. DRACO development for 3D simulations

    NASA Astrophysics Data System (ADS)

    Fatenejad, Milad; Moses, Gregory

    2006-10-01

    The DRACO (r-z) lagrangian radiation-hydrodynamics laser fusion simulation code is being extended to model 3D hydrodynamics in (x-y-z) coordinates with hexahedral cells on a structured grid. The equation of motion is solved with a lagrangian update with optional rezoning. The fluid equations are solved using an explicit scheme based on (Schulz, 1964) while the SALE-3D algorithm (Amsden, 1981) is used as a template for computing cell volumes and other quantities. A second order rezoner has been added which uses linear interpolation of the underlying continuous functions to preserve accuracy (Van Leer, 1976). Artificial restoring force terms and smoothing algorithms are used to avoid grid distortion in high aspect ratio cells. These include alternate node couplers along with a rotational restoring force based on the Tensor Code (Maenchen, 1964). Electron and ion thermal conduction is modeled using an extension of Kershaw's method (Kershaw, 1981) to 3D geometry. Test problem simulations will be presented to demonstrate the applicability of this new version of DRACO to the study of fluid instabilities in three dimensions.

  6. Revealing electronic open quantum systems with subsystem TDDFT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishtal, Alisa, E-mail: alisa.krishtal@rutgers.edu; Pavanello, Michele, E-mail: m.pavanello@rutgers.edu

    2016-03-28

    Open quantum systems (OQSs) are perhaps the most realistic systems one can approach through simulations. In recent years, describing OQSs with Density Functional Theory (DFT) has been a prominent avenue of research with most approaches based on a density matrix partitioning in conjunction with an ad-hoc description of system-bath interactions. We propose a different theoretical approach to OQSs based on partitioning of the electron density. Employing the machinery of subsystem DFT (and its time-dependent extension), we provide a novel way of isolating and analyzing the various terms contributing to the coupling between the system and the surrounding bath. To illustrate the theory, we provide numerical simulations on a toy system (a molecular dimer) and on a condensed phase system (solvated excimer). The simulations show that non-Markovian dynamics in the electronic system-bath interactions are important in chemical applications. For instance, we show that the superexchange mechanism of transport in donor-bridge-acceptor systems is a non-Markovian interaction between the donor-acceptor (OQS) and the bridge (bath) which is fully characterized by real-time subsystem time-dependent DFT.

  7. Revealing electronic open quantum systems with subsystem TDDFT.

    PubMed

    Krishtal, Alisa; Pavanello, Michele

    2016-03-28

    Open quantum systems (OQSs) are perhaps the most realistic systems one can approach through simulations. In recent years, describing OQSs with Density Functional Theory (DFT) has been a prominent avenue of research with most approaches based on a density matrix partitioning in conjunction with an ad-hoc description of system-bath interactions. We propose a different theoretical approach to OQSs based on partitioning of the electron density. Employing the machinery of subsystem DFT (and its time-dependent extension), we provide a novel way of isolating and analyzing the various terms contributing to the coupling between the system and the surrounding bath. To illustrate the theory, we provide numerical simulations on a toy system (a molecular dimer) and on a condensed phase system (solvated excimer). The simulations show that non-Markovian dynamics in the electronic system-bath interactions are important in chemical applications. For instance, we show that the superexchange mechanism of transport in donor-bridge-acceptor systems is a non-Markovian interaction between the donor-acceptor (OQS) and the bridge (bath) which is fully characterized by real-time subsystem time-dependent DFT.

  8. Revealing electronic open quantum systems with subsystem TDDFT

    NASA Astrophysics Data System (ADS)

    Krishtal, Alisa; Pavanello, Michele

    2016-03-01

    Open quantum systems (OQSs) are perhaps the most realistic systems one can approach through simulations. In recent years, describing OQSs with Density Functional Theory (DFT) has been a prominent avenue of research with most approaches based on a density matrix partitioning in conjunction with an ad-hoc description of system-bath interactions. We propose a different theoretical approach to OQSs based on partitioning of the electron density. Employing the machinery of subsystem DFT (and its time-dependent extension), we provide a novel way of isolating and analyzing the various terms contributing to the coupling between the system and the surrounding bath. To illustrate the theory, we provide numerical simulations on a toy system (a molecular dimer) and on a condensed phase system (solvated excimer). The simulations show that non-Markovian dynamics in the electronic system-bath interactions are important in chemical applications. For instance, we show that the superexchange mechanism of transport in donor-bridge-acceptor systems is a non-Markovian interaction between the donor-acceptor (OQS) and the bridge (bath) which is fully characterized by real-time subsystem time-dependent DFT.

  9. Emissions from open burning of simulated military waste from forward operating bases.

    PubMed

    Aurell, Johanna; Gullett, Brian K; Yamamoto, Dirk

    2012-10-16

    Emissions from open burning of simulated military waste from forward operating bases (FOBs) were extensively characterized as an initial step in assessing potential inhalation exposure of FOB personnel and future disposal alternatives. Emissions from two different burning scenarios, so-called "burn piles/pits" and an air curtain burner ("burn box"), were compared using simulated FOB waste from municipal and commercial sources. A comprehensive array of emissions was quantified, including CO2, PM2.5, volatile organic compounds (VOCs), polyaromatic hydrocarbons (PAHs), polychlorinated dibenzodioxins and -furans (PCDDs/PCDFs), polybrominated dibenzodioxins and -furans (PBDDs/PBDFs), and metals. In general, smoldering conditions in the burn box and the burn pile led to similar emissions. However, when the burn box underwent periodic waste charging to maintain sustained combustion, PM2.5, VOC, and PAH emissions dropped considerably compared to smoldering conditions and the overall burn pile results. The PCDD/PCDF and PBDD/PBDF emission factors for the burn piles were 50 times higher than those from the burn box, likely due to the dominance of smoldering combustion in the burn piles.

  10. Transport link scanner: simulating geographic transport network expansion through individual investments

    NASA Astrophysics Data System (ADS)

    Jacobs-Crisioni, C.; Koopmans, C. C.

    2016-07-01

    This paper introduces a GIS-based model that simulates the geographic expansion of transport networks by several decision-makers with varying objectives. The model progressively adds extensions to a growing network by choosing the most attractive investments from a limited choice set. Attractiveness is defined as a function of variables in which revenue and broader societal benefits may play a role and can be based on empirically underpinned parameters that may differ according to private or public interests. The choice set is selected from an exhaustive set of links and presumably contains those investment options that best meet a private operator's objectives by balancing the revenue of additional fares against construction costs. The investment options consist of geographically plausible routes with potential detours. These routes are generated using a fine-meshed regularly latticed network and shortest-path finding methods. Additionally, two indicators of the geographic accuracy of the simulated networks are introduced. A historical case study is presented to demonstrate the model's first results. These results show that the modelled networks reproduce relevant results of the historically built network with reasonable accuracy.
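
    The candidate routes in such a model come from shortest-path searches on the fine regular lattice. The sketch below shows the standard Dijkstra computation that a route generator of this kind would rest on; the graph encoding and names are illustrative, not the paper's implementation.

```python
import heapq

def dijkstra(edges, src):
    """Shortest-path distances from `src` on a weighted graph given as
    {node: [(neighbor, weight), ...]} -- the building block for
    generating plausible route options on a regular lattice."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry, already relaxed via a shorter path
        for v, w in edges.get(u, ()):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Tiny example: a triangle where the direct 0-2 edge is costlier
# than the detour through node 1
dist = dijkstra({0: [(1, 1.0), (2, 4.0)],
                 1: [(0, 1.0), (2, 1.0)],
                 2: [(0, 4.0), (1, 1.0)]}, src=0)
```

    On a lattice, edge weights would encode construction cost or travel time per link, and detour routes fall out of repeated searches with perturbed weights.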

  11. Flight Simulator and Training Human Factors Validation

    NASA Technical Reports Server (NTRS)

    Glaser, Scott T.; Leland, Richard

    2009-01-01

    Loss of control has been identified as the leading cause of aircraft accidents in recent years. Efforts have been made to better equip pilots to deal with these types of events, commonly referred to as upsets. A major challenge in these endeavors has been recreating the motion environments found in flight as the majority of upsets take place well beyond the normal operating envelope of large aircraft. The Environmental Tectonics Corporation has developed a simulator motion base, called GYROLAB, that is capable of recreating the sustained accelerations, or G-forces, and motions of flight. A two part research study was accomplished that coupled NASA's Generic Transport Model with a GYROLAB device. The goal of the study was to characterize physiological effects of the upset environment and to demonstrate that a sustained motion based simulator can be an effective means for upset recovery training. Two groups of 25 Air Transport Pilots participated in the study. The results showed reliable signs of pilot arousal at specific stages of similar upsets. Further validation also demonstrated that sustained motion technology was successful in improving pilot performance during recovery following an extensive training program using GYROLAB technology.

  12. Quantitative risk management in gas injection project: a case study from Oman oil and gas industry

    NASA Astrophysics Data System (ADS)

    Khadem, Mohammad Miftaur Rahman Khan; Piya, Sujan; Shamsuzzoha, Ahm

    2017-09-01

    The purpose of this research was to study the recognition, application and quantification of the risks associated with managing projects. In this research, the management of risks in an oil and gas project is studied and implemented within a case company in Oman. First, qualitative data related to risks in the project were identified through field visits and extensive interviews. These data were then translated into numerical values based on expert opinion. Further, the numerical data were used as input to a Monte Carlo simulation. RiskyProject Professional™ software was used to simulate the system based on the identified risks. The simulation predicted a delay of about 2 years in the worst case, with no chance of meeting the project's on-stream date, and an 8% chance of exceeding the total estimated budget. The result of the numerical analysis from the proposed model is validated by comparing it with the result of the qualitative analysis, which was obtained through discussions with various project managers of the company.
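
    The underlying Monte Carlo idea can be sketched in a few lines. The task durations and the triangular distribution below are illustrative choices, not the case study's data or RiskyProject's internals.

```python
import random

def simulate_schedule(tasks, n=20000, seed=1):
    """Monte Carlo over a serial chain of tasks, each given as
    (optimistic, most_likely, pessimistic) durations in months and
    sampled from a triangular distribution -- a common choice in
    quantitative schedule risk analysis."""
    random.seed(seed)
    totals = sorted(
        sum(random.triangular(opt, pess, mode) for opt, mode, pess in tasks)
        for _ in range(n)
    )
    return {"p50": totals[n // 2],        # median completion time
            "p90": totals[int(0.9 * n)],  # 90th-percentile completion time
            "worst": totals[-1]}

# Three hypothetical serial work packages (months)
risk = simulate_schedule([(6, 9, 18), (3, 4, 10), (2, 3, 8)])
```

    Reading the empirical distribution at a percentile (rather than summing most-likely values) is what turns expert estimates into statements like "8% chance of exceeding the budget."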

  13. A rapid solvent accessible surface area estimator for coarse grained molecular simulations.

    PubMed

    Wei, Shuai; Brooks, Charles L; Frank, Aaron T

    2017-06-05

    The rapid and accurate calculation of solvent accessible surface area (SASA) is extremely useful in the energetic analysis of biomolecules. For example, SASA models can be used to estimate the transfer free energy associated with biophysical processes and, when combined with coarse-grained simulations, can be particularly useful for accounting for solvation effects within the framework of implicit solvent models. In such cases, a fast and accurate, residue-wise SASA predictor is highly desirable. Here, we develop a predictive model that estimates SASAs based on Cα-only protein structures. Through an extensive comparison between this method and a comparable method, POPS-R, we demonstrate that our new method, Protein-Cα Solvent Accessibility (PCASA), shows better performance, especially for unfolded conformations of proteins. We anticipate that this model will be quite useful for the efficient inclusion of SASA-based solvent free energy estimations in coarse-grained protein folding simulations. PCASA is made freely available to the academic community at https://github.com/atfrank/PCASA. © 2017 Wiley Periodicals, Inc.
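    The abstract does not specify PCASA's fitted model, but a common family of Cα-only SASA estimators maps each residue's neighbor count to a burial level. The sketch below illustrates that idea only; the sigmoid form, cutoff, and constants are invented placeholders, not PCASA's parameters.

    ```python
    import math

    def neighbor_count(ca_coords, i, cutoff=10.0):
        """Count other Cα atoms within `cutoff` angstroms of residue i."""
        n = 0
        for j, p in enumerate(ca_coords):
            if j != i and math.dist(ca_coords[i], p) <= cutoff:
                n += 1
        return n

    def estimate_sasa(ca_coords, max_sasa=200.0, n_half=12.0):
        """Residue-wise SASA proxy: the more Cα neighbors a residue has,
        the more buried it is.  The functional form and constants are
        illustrative, not PCASA's fitted model."""
        return [max_sasa / (1.0 + (neighbor_count(ca_coords, i) / n_half) ** 2)
                for i in range(len(ca_coords))]

    # A residue at the center of a small cluster is more buried than an outlier.
    coords = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (-2.0, 0.0, 0.0),
              (0.0, 2.0, 0.0), (0.0, -2.0, 0.0), (50.0, 0.0, 0.0)]
    sasa = estimate_sasa(coords)
    print(sasa[0] < sasa[-1])  # center residue is more buried -> True
    ```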

  14. Final report: Prototyping a combustion corridor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutland, Christopher J.; Leach, Joshua

    2001-12-15

    The Combustion Corridor is a concept in which researchers in combustion and thermal sciences have unimpeded access to large volumes of remote computational results. This will enable remote, collaborative analysis and visualization of state-of-the-art combustion science results. The Engine Research Center (ERC) at the University of Wisconsin - Madison partnered with Lawrence Berkeley National Laboratory, Argonne National Laboratory, Sandia National Laboratory, and several other universities to build and test the first stages of a combustion corridor. The ERC served two important functions in this partnership. First, we work extensively with combustion simulations, so we were able to provide real-world research data sets for testing the Corridor concepts. Second, the ERC was part of an extension of the high-bandwidth DOE National Laboratory connections to universities.

  15. Towards an internal model in pilot training.

    PubMed

    Braune, R J; Trollip, S R

    1982-10-01

    Optimal decision making requires information-seeking behavior that reflects comprehension of the overall system dynamics. Research in the area of human monitors in man-machine systems supports the notion of an internal model with built-in expectancies. It is doubtful that the current approach to pilot training develops this internal model in the most efficient way. This is crucial, since the pilot's role is shifting toward that of a systems manager and decision maker. An extension of the behavioral framework of pilot training might help to better prepare the pilot for the increasingly complex flight environment. This extension is based on schema theory, a theoretical model that evolved out of psychological research. The technological advances in aircraft simulators and in-flight performance measurement devices allow investigation of the still-unresolved issues.

  16. Modelling erosion on a daily basis, an adaptation of the MMF approach

    NASA Astrophysics Data System (ADS)

    Shrestha, Dhruba Pikha; Jetten, Victor G.

    2018-02-01

    The negative impact of soil erosion on ecosystem services and food security is well known. At the same time, the total precipitation an area receives can vary from year to year, including extreme rains. To assess annual erosion rates, empirical models have been used extensively across all climatic regions. While these models are simple to operate and do not require a lot of input data, they do not take the effect of extreme rain into account. Physically based models, on the other hand, can simulate erosion processes during a storm, including particle detachment, transport, and deposition of sediments, but they are not applicable for assessing annual soil loss rates. Moreover, storm-event data may not be available everywhere, prohibiting their extensive use.

  17. Assessing uncertainties in crop and pasture ensemble model simulations of productivity and N2O emissions

    USDA-ARS?s Scientific Manuscript database

    Simulation models are extensively used to predict agricultural productivity and greenhouse gas (GHG) emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multisp...

  18. Dual Interlocked Logic for Single-Event Transient Mitigation

    DTIC Science & Technology

    2017-03-01

    SPICE simulation and fault-injection analysis. Exemplar SPICE simulations have been performed in a 32nm partially-depleted silicon-on-insulator...in this work. The model has been validated at the 32nm SOI technology node with extensive heavy-ion data [7]. For the SPICE simulations, three

  19. Scaling Relations for Intercalation Induced Damage in Electrodes

    DOE PAGES

    Chen, Chien-Fan; Barai, Pallab; Smith, Kandler; ...

    2016-04-02

    Mechanical degradation, owing to intercalation induced stress and microcrack formation, is a key contributor to the electrode performance decay in lithium-ion batteries (LIBs). The stress generation and formation of microcracks are caused by the solid state diffusion of lithium in the active particles. In this work, scaling relations are constructed for diffusion induced damage in intercalation electrodes based on an extensive set of numerical experiments with a particle-level description of microcrack formation under disparate operating and cycling conditions, such as temperature, particle size, C-rate, and drive cycle. The microcrack formation and evolution in active particles is simulated based on a stochastic methodology. A reduced order scaling law is constructed based on an extensive set of data from the numerical experiments. The scaling relations include combinatorial constructs of concentration gradient, cumulative strain energy, and microcrack formation. Lastly, the reduced order relations are further employed to study the influence of mechanical degradation on cell performance and validated against the high order model for the case of damage evolution during variable current vehicle drive cycle profiles.

  20. Hippocampome.org: a knowledge base of neuron types in the rodent hippocampus

    PubMed Central

    Wheeler, Diek W; White, Charise M; Rees, Christopher L; Komendantov, Alexander O; Hamilton, David J; Ascoli, Giorgio A

    2015-01-01

    Hippocampome.org is a comprehensive knowledge base of neuron types in the rodent hippocampal formation (dentate gyrus, CA3, CA2, CA1, subiculum, and entorhinal cortex). Although the hippocampal literature is remarkably information-rich, neuron properties are often reported with incompletely defined and notoriously inconsistent terminology, creating a formidable challenge for data integration. Our extensive literature mining and data reconciliation identified 122 neuron types based on neurotransmitter, axonal and dendritic patterns, synaptic specificity, electrophysiology, and molecular biomarkers. All ∼3700 annotated properties are individually supported by specific evidence (∼14,000 pieces) in peer-reviewed publications. Systematic analysis of this unprecedented amount of machine-readable information reveals novel correlations among neuron types and properties, the potential connectivity of the full hippocampal circuitry, and outstanding knowledge gaps. User-friendly browsing and online querying of Hippocampome.org may aid design and interpretation of both experiments and simulations. This powerful, simple, and extensible neuron classification endeavor is unique in its detail, utility, and completeness. DOI: http://dx.doi.org/10.7554/eLife.09960.001 PMID:26402459

  1. Lattice Boltzmann Methods to Address Fundamental Boiling and Two-Phase Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uddin, Rizwan

    2012-01-01

    This report presents the progress made during the fourth (no-cost extension) year of this three-year grant aimed at the development of a consistent Lattice Boltzmann formulation for boiling and two-phase flows. During the first year, a consistent LBM formulation for the simulation of a two-phase water-steam system was developed. Results of initial model validation in a range of thermodynamic conditions typical for Boiling Water Reactors (BWRs) were shown. Progress was made on several fronts during the second year, most importantly the simulation of the coalescence of two bubbles including surface tension effects. Work during the third year focused on the development of a new lattice Boltzmann model, called the artificial interface lattice Boltzmann model (AILB model), for the simulation of two-phase dynamics. The model is based on the principle of free energy minimization and invokes the Gibbs-Duhem equation in the formulation of the non-ideal forcing function. This was reported in detail in the last progress report. Part of the efforts during the last (no-cost extension) year were focused on developing a parallel capability for the 2D as well as the 3D codes developed in this project. This will be reported in the final report. Here we report the work carried out on testing the AILB model under conditions including thermal effects. A simplified thermal LB model, based on the thermal energy distribution approach, was developed. The simplifications are made by neglecting the viscous heat dissipation and the work done by pressure in the original thermal energy distribution model. Details of the model are presented here, followed by a discussion of the boundary conditions, and then results for some two-phase thermal problems.

  2. Physically-Based Modelling and Real-Time Simulation of Fluids.

    NASA Astrophysics Data System (ADS)

    Chen, Jim Xiong

    1995-01-01

    Simulating physically realistic complex fluid behaviors presents an extremely challenging problem for computer graphics researchers. Such behaviors include the effects of driving boats through water, blending differently colored fluids, rain falling and flowing on a terrain, fluids interacting in a Distributed Interactive Simulation (DIS), etc. Such capabilities are useful in computer art, advertising, education, entertainment, and training. We present a new method for physically-based modeling and real-time simulation of fluids in computer graphics and dynamic virtual environments. By solving the 2D Navier-Stokes equations using a CFD method, we map the surface into 3D using the corresponding pressures in the fluid flow field. This achieves realistic real-time fluid surface behaviors by employing the physical governing laws of fluids while avoiding extensive 3D fluid dynamics computations. To complement the surface behaviors, we calculate fluid volume and external boundary changes separately to achieve full 3D general fluid flow. To simulate physical activities in a DIS, we introduce a mechanism which uses a uniform time scale proportional to the clock time and variable time-slicing to synchronize physical models such as fluids in the networked environment. Our approach can simulate many different fluid behaviors by changing the internal or external boundary conditions. It can model different kinds of fluids by varying the Reynolds number. It can simulate objects moving or floating in fluids. It can also produce synchronized general fluid flows in a DIS. Our model can serve as a testbed to simulate many other fluid phenomena which have never been successfully modeled previously.

  3. An Auto-Calibrating Knee Flexion-Extension Axis Estimator Using Principal Component Analysis with Inertial Sensors.

    PubMed

    McGrath, Timothy; Fineman, Richard; Stirling, Leia

    2018-06-08

    Inertial measurement units (IMUs) have been demonstrated to reliably measure human joint angles, an essential quantity in the study of biomechanics. However, most previous literature proposed IMU-based joint angle measurement systems that required manual alignment or prescribed calibration motions. This paper presents a simple, physically intuitive method for IMU-based measurement of the knee flexion/extension angle in gait without requiring alignment or discrete calibration, based on computationally efficient and easy-to-implement Principal Component Analysis (PCA). The method is compared against an optical motion capture knee flexion/extension angle modeled through OpenSim. The method is evaluated using both measured and simulated IMU data in an observational study (n = 15), with an absolute root-mean-square error (RMSE) of 9.24° and a zero-mean RMSE of 3.49°. Variation in error across subjects was found, revealed by a larger subject population than previous literature considers. Finally, the paper presents an explanatory model of RMSE as a function of IMU mounting location. The observational data suggest that the RMSE of the method is a function of thigh IMU perturbation and axis estimation quality. However, the effect size of these parameters is small in comparison to potential gains from improved IMU orientation estimation. Results also highlight the need to set relevant datums from which to interpret joint angles, for both truth references and estimated data.
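    The core idea, PCA picking out a hinge axis from IMU data, can be sketched compactly: during gait the knee's angular velocity lies mostly along the flexion/extension axis, so the first principal component of the gyroscope samples approximates that axis. This is a minimal sketch of the principle under that assumption; the paper's full pipeline (orientation estimation, per-segment axes, angle reconstruction) is not reproduced here, and the synthetic data below are invented.

    ```python
    import numpy as np

    def estimate_hinge_axis(gyro):
        """Estimate the dominant rotation axis from angular-velocity samples
        (N x 3) via PCA: the eigenvector of the covariance matrix with the
        largest eigenvalue is taken as the flexion/extension axis."""
        g = gyro - gyro.mean(axis=0)
        cov = g.T @ g / len(g)
        w, v = np.linalg.eigh(cov)
        axis = v[:, np.argmax(w)]
        return axis / np.linalg.norm(axis)

    # Synthetic gait-like data: rotation mostly about a known axis, plus noise.
    rng = np.random.default_rng(0)
    true_axis = np.array([0.0, 1.0, 0.0])
    rates = np.sin(np.linspace(0, 20, 500))          # flexion/extension pattern
    gyro = np.outer(rates, true_axis) + 0.05 * rng.normal(size=(500, 3))
    est = estimate_hinge_axis(gyro)
    print(np.abs(est @ true_axis))  # close to 1
    ```

    Note the sign ambiguity inherent to PCA: the estimated axis may point either way along the hinge, which is why the alignment is checked with an absolute value.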

  4. Fun During Knee Rehabilitation: Feasibility and Acceptability Testing of a New Android-Based Training Device.

    PubMed

    Weber-Spickschen, Thomas Sanjay; Colcuc, Christian; Hanke, Alexander; Clausen, Jan-Dierk; James, Paul Abraham; Horstmann, Hauke

    2017-01-01

    The initial goals of rehabilitation after knee injuries and operations are to achieve full knee extension and to activate the quadriceps muscle. In addition to regular physiotherapy, an Android-based knee training device is designed to help patients achieve these goals and improve compliance in the early rehabilitation period. This knee training device combines the fun of a computer game with muscular training or rehabilitation. Our aim was to test the feasibility and acceptability of this new device. 50 volunteer subjects enrolled to test the computer-game-aided device. The first game was a high-striker game, which recorded maximum knee extension power. The second game involved controlling quadriceps muscular power to simulate flying an aeroplane, in order to record accuracy of muscle activation. The subjects evaluated the device by completing a simple questionnaire. No technical problem was encountered during the usage of this device. No subjects complained of any discomfort after using this device. Measurements including maximum knee extension power, knee muscle activation, and control were recorded successfully. Subjects rated their experience with the device as either excellent or very good and agreed that the device can motivate and monitor the progress of knee rehabilitation training. To the best of our knowledge, this is the first Android-based tool available to fast-track knee rehabilitation training. All subjects gave very positive feedback on this computer-game-aided knee device.

  5. Radiation dominated acoustophoresis driven by surface acoustic waves.

    PubMed

    Guo, Jinhong; Kang, Yuejun; Ai, Ye

    2015-10-01

    Acoustophoresis-based particle manipulation in microfluidics has gained increasing attention in recent years. Despite the fact that experimental studies have been extensively performed to demonstrate this technique for various microfluidic applications, numerical simulation of acoustophoresis driven by surface acoustic waves (SAWs) has remained largely unexplored. In this work, a numerical model taking into account the acoustic-piezoelectric interaction was developed to simulate the generation of a standing surface acoustic wave (SSAW) field and predict the acoustic pressure field in the liquid. Acoustic radiation dominated particle tracing was performed to simulate acoustophoresis of particles of different sizes undergoing a SSAW field. A microfluidic device composed of two interdigital transducers (IDTs) for SAW generation and a microfluidic channel was fabricated for experimental validation. Numerical simulations captured well the focusing of particles to the pressure nodes observed experimentally. Further comparison of particle trajectories demonstrated considerable quantitative agreement between numerical simulations and experimental results, with the applied voltage as a fitting parameter. Particle switching was also demonstrated using the fabricated device, which could be further developed into an active particle sorting device. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Local rules simulation of the kinetics of virus capsid self-assembly.

    PubMed

    Schwartz, R; Shor, P W; Prevelige, P E; Berger, B

    1998-12-01

    A computer model is described for studying the kinetics of the self-assembly of icosahedral viral capsids. Solution of this problem is crucial to an understanding of the viral life cycle, which currently cannot be adequately addressed through laboratory techniques. The abstract simulation model employed to address this is based on the local rules theory (Proc. Natl. Acad. Sci. USA 91:7732-7736). It is shown that the principle of local rules, generalized with a model of kinetics and other extensions, can be used to simulate complicated problems in self-assembly. This approach allows for a computationally tractable molecular dynamics-like simulation of coat protein interactions while retaining many relevant features of capsid self-assembly. Three simple simulation experiments are presented to illustrate the use of this model. These show the dependence of growth and malformation rates on the energetics of binding interactions, the tolerance of errors in binding positions, and the concentration of subunits. These experiments demonstrate a tradeoff within the model between growth rate and fidelity of assembly for the three parameters. A detailed discussion of the computational model is also provided.

  7. Experimental and Computational Study of Sonic and Supersonic Jet Plumes

    NASA Technical Reports Server (NTRS)

    Venkatapathy, E.; Naughton, J. W.; Fletcher, D. G.; Edwards, Thomas A. (Technical Monitor)

    1994-01-01

    The study of sonic and supersonic jet plumes is relevant to understanding such phenomena as jet noise, plume signatures, and rocket base heating and radiation. Jet plumes are simple to simulate and yet have complex flow structures such as Mach disks, triple points, shear layers, barrel shocks, shock-shear-layer interactions, etc. Experimental and computational simulations of sonic and supersonic jet plumes have been performed for under- and over-expanded, axisymmetric plume conditions. The computational simulations compare very well with the experimental schlieren pictures. Experimental data such as temperature measurements with hot-wire probes are yet to be taken and will be compared with computed values. Extensive analysis of the computational simulations presents a clear picture of how the complex flow structure develops and the conditions under which self-similar flow structures evolve. From the computations, the plume structure can be further classified into many sub-groups. In the proposed paper, detailed results from the experimental and computational simulations for single, axisymmetric, under- and over-expanded, sonic and supersonic plumes will be compared and the fluid dynamic aspects of the flow structures will be discussed.

  8. Sonic and Supersonic Jet Plumes

    NASA Technical Reports Server (NTRS)

    Venkatapathy, E.; Naughton, J. W.; Fletcher, D. G.; Edwards, Thomas A. (Technical Monitor)

    1994-01-01

    The study of sonic and supersonic jet plumes is relevant to understanding such phenomena as jet noise, plume signatures, and rocket base heating and radiation. Jet plumes are simple to simulate and yet have complex flow structures such as Mach disks, triple points, shear layers, barrel shocks, shock-shear-layer interactions, etc. Experimental and computational simulations of sonic and supersonic jet plumes have been performed for under- and over-expanded, axisymmetric plume conditions. The computational simulations compare very well with the experimental schlieren pictures. Experimental data such as temperature measurements with hot-wire probes are yet to be taken and will be compared with computed values. Extensive analysis of the computational simulations presents a clear picture of how the complex flow structure develops and the conditions under which self-similar flow structures evolve. From the computations, the plume structure can be further classified into many sub-groups. In the proposed paper, detailed results from the experimental and computational simulations for single, axisymmetric, under- and over-expanded, sonic and supersonic plumes will be compared and the fluid dynamic aspects of the flow structures will be discussed.

  9. Simplified ISCCP cloud regimes for evaluating cloudiness in CMIP5 models

    NASA Astrophysics Data System (ADS)

    Jin, Daeho; Oreopoulos, Lazaros; Lee, Dongmin

    2017-01-01

    We take advantage of ISCCP simulator data available for many models that participated in CMIP5 in order to introduce a framework for comparing model cloud output with corresponding ISCCP observations based on the cloud regime (CR) concept. Simplified global CRs are employed, derived from the co-variations of three variables, namely cloud optical thickness, cloud top pressure, and cloud fraction (τ, pc, CF). Following evaluation criteria established in a companion paper of ours (Jin et al. 2016), we assess model cloud simulation performance based on how well the simplified CRs are simulated in terms of similarity of centroids, global values and map correlations of relative frequency of occurrence, and long-term total cloud amounts. Mirroring prior results, modeled clouds tend to be too optically thick and not as extensive as in observations. CRs with high-altitude clouds from storm activity are not as well simulated here compared to the previous study, but other regimes containing near-overcast low clouds show improvement. Models that performed well in the companion paper against CRs defined by joint τ-pc histograms distinguish themselves again here, but improvements for previously underperforming models are also seen. Averaging across models does not yield a drastically better picture, except for cloud geographical locations. Cloud evaluation with simplified regimes thus seems more forgiving than that using histogram-based CRs, while still strict enough to reveal model weaknesses.
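    Cloud regimes of this kind are typically obtained by clustering grid-cell samples of the three variables. The sketch below illustrates that step with a plain k-means on standardized (τ, pc, CF) triplets; the synthetic data and the choice of k are invented for illustration, and the paper's actual regime derivation may differ in detail.

    ```python
    import numpy as np

    def kmeans(x, k, iters=50, seed=0):
        """Plain k-means; x is (N, d), returns (centroids, labels)."""
        rng = np.random.default_rng(seed)
        c = x[rng.choice(len(x), k, replace=False)]
        for _ in range(iters):
            # Assign each sample to its nearest centroid, then recenter.
            labels = np.argmin(((x[:, None, :] - c[None, :, :]) ** 2).sum(-1), axis=1)
            for j in range(k):
                if np.any(labels == j):
                    c[j] = x[labels == j].mean(axis=0)
        return c, labels

    # Illustrative grid-cell samples of (tau, p_c in hPa, CF); values invented
    # to mimic a low-cloud regime and a deep-convective regime.
    rng = np.random.default_rng(1)
    low_cloud = rng.normal([5.0, 800.0, 0.9], [1.0, 30.0, 0.05], size=(200, 3))
    deep_conv = rng.normal([40.0, 300.0, 0.95], [5.0, 40.0, 0.03], size=(200, 3))
    data = np.vstack([low_cloud, deep_conv])
    z = (data - data.mean(0)) / data.std(0)   # standardize before clustering
    centroids, labels = kmeans(z, k=2)
    ```

    The evaluation criteria in the abstract (centroid similarity, frequency-of-occurrence maps) then compare such centroids and labels between a model and the observations.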

  10. A Phenomenological Model and Validation of Shortening Induced Force Depression during Muscle Contractions

    PubMed Central

    McGowan, C.P.; Neptune, R.R.; Herzog, W.

    2009-01-01

    History dependent effects on muscle force development following active changes in length have been measured in a number of experimental studies. However, few muscle models have included these properties or examined their impact on force and power output in dynamic cyclic movements. The goal of this study was to develop and validate a modified Hill-type muscle model that includes shortening induced force depression and assess its influence on locomotor performance. The magnitude of force depression was defined by empirical relationships based on muscle mechanical work. To validate the model, simulations incorporating force depression were developed to emulate single muscle in situ and whole muscle group leg extension experiments. There was excellent agreement between simulation and experimental values, with in situ force patterns closely matching the experimental data (average RMS error < 1.5 N) and force depression in the simulated leg extension exercise being similar in magnitude to experimental values (6.0% vs 6.5%, respectively). To examine the influence of force depression on locomotor performance, simulations of maximum power pedaling with and without force depression were generated. Force depression decreased maximum crank power by 20% – 40%, depending on the relationship between force depression and muscle work used. These results indicate that force depression has the potential to substantially influence muscle power output in dynamic cyclic movements. However, to fully understand the impact of this phenomenon on human movement, more research is needed to characterize the relationship between force depression and mechanical work in large muscles with different morphologies. PMID:19879585
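    The study's key modeling choice, force depression defined by an empirical relationship on muscle mechanical work, can be sketched as a scaling of the Hill-type force. The linear form and every constant below are illustrative assumptions; the paper's fitted relationship may differ in form and magnitude.

    ```python
    def force_depression(work, k=0.05):
        """Fractional force depression after active shortening, assumed
        proportional to the mechanical work done during shortening.
        The constant k is illustrative, not the study's fitted value."""
        return k * work

    def muscle_force(f_isometric, shortening_work):
        """Hill-type isometric force scaled down by shortening-induced
        depression (capped at 50%, an illustrative bound)."""
        depression = min(force_depression(shortening_work), 0.5)
        return f_isometric * (1.0 - depression)

    # A muscle producing 100 N isometrically after 1.2 J of shortening work:
    print(muscle_force(100.0, 1.2))  # 94.0 with these illustrative constants
    ```

    With such a model, the depression term accumulates over each shortening phase of a pedaling cycle, which is how it ends up reducing maximum crank power in the simulations.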

  11. TADSim: Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    DOE PAGES

    Mniszewski, Susan M.; Junghans, Christoph; Voter, Arthur F.; ...

    2015-04-16

    Next-generation high-performance computing will require more scalable and flexible performance prediction tools to evaluate software-hardware co-design choices relevant to scientific applications and hardware architectures. Here, we present a new class of tools called application simulators: parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation. Parameterized choices for the algorithmic method and hardware options provide a rich space for design exploration and allow us to quickly find well-performing software-hardware combinations. We demonstrate our approach with a TADSim simulator that models the temperature-accelerated dynamics (TAD) method, an algorithmically complex and parameter-rich member of the accelerated molecular dynamics (AMD) family of molecular dynamics methods. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We accomplish this by identifying the time-intensive elements, quantifying algorithm steps in terms of those elements, abstracting them out, and replacing them by the passage of time. We use TADSim to quickly characterize the runtime performance and algorithmic behavior of the otherwise long-running simulation code. We extend TADSim to model algorithm extensions, such as speculative spawning of the compute-bound stages, and predict performance improvements without having to implement such a method. Validation against the actual TAD code shows close agreement for the evolution of an example physical system, a silver surface. Finally, focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights and suggested extensions.

  12. Evaluation of total knee mechanics using a crouching simulator with a synthetic knee substitute.

    PubMed

    Lowry, Michael; Rosenbaum, Heather; Walker, Peter S

    2016-05-01

    Mechanical evaluation of total knees is frequently required for aspects such as wear, strength, kinematics, contact areas, and force transmission. In order to carry out such tests, we developed a crouching simulator, based on the Oxford-type machine, with novel features including a synthetic knee with ligaments. The instrumentation and data processing methods enabled the determination of contact area locations and interface forces and moments for a full flexion-extension cycle. To demonstrate the use of the simulator, we carried out a comparison of two different total knee designs, cruciate retaining and substituting. The first part of the study describes the simulator design and the methodology for testing the knees without requiring cadaveric knee specimens. The degrees of freedom of the anatomic hip and ankle joints were reproduced. Flexion-extension was obtained by changing quadriceps length, while variable hamstring forces were applied using springs. The knee joint was represented by three-dimensional printed blocks onto which the total knee components were fixed. Pretensioned elastomeric bands of realistic stiffnesses passed through holes in the block at anatomical locations to represent ligaments. Motion capture of the knees during flexion, together with laser scanning and computer modeling, was used to reconstruct contact areas on the bearing surfaces. A method was also developed for measuring tibial component interface forces and moments as a comparative assessment of fixation. The method involved interposing Tekscan pads at locations on the interface. Overall, the crouching machine and the methodology could be used for many different mechanical measurements of total knee designs, adapted especially for comparative or parametric studies. © IMechE 2016.

  13. Sequence dependency of canonical base pair opening in the DNA double helix

    PubMed Central

    Villa, Alessandra

    2017-01-01

    The flipping-out of a DNA base from the double helical structure is a key step of many cellular processes, such as DNA replication, modification and repair. Base pair opening is the first step of base flipping and the exact mechanism is still not well understood. We investigate sequence effects on base pair opening using extensive classical molecular dynamics simulations targeting the opening of 11 different canonical base pairs in two DNA sequences. Two popular biomolecular force fields are applied. To enhance sampling and calculate free energies, we bias the simulation along a simple distance coordinate using a newly developed adaptive sampling algorithm. The simulation is guided back and forth along the coordinate, allowing for multiple opening pathways. We compare the calculated free energies with those from an NMR study and check assumptions of the model used for interpreting the NMR data. Our results further show that the neighboring sequence is an important factor for the opening free energy, but also indicates that other sequence effects may play a role. All base pairs are observed to have a propensity for opening toward the major groove. The preferred opening base is cytosine for GC base pairs, while for AT there is sequence dependent competition between the two bases. For AT opening, we identify two non-canonical base pair interactions contributing to a local minimum in the free energy profile. For both AT and CG we observe long-lived interactions with water and with sodium ions at specific sites on the open base pair. PMID:28369121
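    The free-energy profiles along the opening coordinate rest on a standard relation: for an unbiased (or properly reweighted) ensemble, F(x) = -kT ln P(x). The sketch below shows that final step from a histogram of coordinate samples; the study's adaptive biasing and its reweighting are deliberately omitted, and the two-state toy data are invented.

    ```python
    import math
    from collections import Counter

    def free_energy_profile(samples, bin_width=0.1, kT=2.494):
        """F(x) = -kT ln P(x) from samples of a reaction coordinate,
        shifted so the minimum is at zero.  kT defaults to ~kJ/mol at
        300 K.  Reweighting of an adaptive bias is omitted for brevity."""
        counts = Counter(round(x / bin_width) for x in samples)
        total = sum(counts.values())
        f = {b * bin_width: -kT * math.log(n / total)
             for b, n in counts.items()}
        fmin = min(f.values())
        return {x: v - fmin for x, v in f.items()}

    # Toy two-state coordinate: closed (x≈0) sampled 9x more than open (x≈1).
    prof = free_energy_profile([0.0] * 90 + [1.0] * 10)
    print(prof[1.0])  # kT*ln(9), about 5.48 kJ/mol
    ```

    In practice the guided back-and-forth sampling described in the abstract exists precisely because unbiased simulations would almost never visit the high-free-energy open states that this estimator needs to populate.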

  14. Simulation of laminate composites degradation using mesoscopic non-local damage model and non-local layered shell element

    NASA Astrophysics Data System (ADS)

    Germain, Norbert; Besson, Jacques; Feyel, Frédéric

    2007-07-01

    Simulating damage and failure of laminate composite structures often fails when using the standard finite element procedure. The difficulties arise from an uncontrolled mesh dependence caused by damage localization and from an increase in computational costs. One solution to the first problem, widely used to predict the failure of metallic materials, consists of using non-local damage constitutive equations. The second difficulty can then be addressed using specific finite element formulations, such as shell elements, which decrease the number of degrees of freedom. The main contribution of this paper consists of extending these techniques to layered materials such as polymer matrix composites. An extension of the non-local implicit gradient formulation, accounting for anisotropy and stratification, and an original layered shell element, based on a new partition of unity, are proposed. Finally, the efficiency of the resulting numerical scheme is studied by comparing simulations with experimental results.

  15. Efficient kinetic method for fluid simulation beyond the Navier-Stokes equation.

    PubMed

    Zhang, Raoyang; Shan, Xiaowen; Chen, Hudong

    2006-10-01

    We present a further theoretical extension to the kinetic-theory-based formulation of the lattice Boltzmann method of Shan [J. Fluid Mech. 550, 413 (2006)]. In addition to the higher-order projection of the equilibrium distribution function and a sufficiently accurate Gauss-Hermite quadrature in the original formulation, a regularization procedure is introduced in this paper. This procedure ensures a consistent order of accuracy control over the nonequilibrium contributions in the Galerkin sense. Using this formulation, we construct a specific lattice Boltzmann model that accurately incorporates up to third-order hydrodynamic moments. Numerical evidence demonstrates that the extended model overcomes some major defects existing in conventionally known lattice Boltzmann models, so that fluid flows at finite Knudsen number Kn can be more quantitatively simulated. Results from force-driven Poiseuille flow simulations predict the Knudsen's minimum and the asymptotic behavior of flow flux at large Kn.
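    The role of a "sufficiently accurate" Gauss-Hermite quadrature can be seen directly: an n-point rule integrates polynomials of degree up to 2n-1 exactly against the Gaussian weight, which is what allows a small discrete velocity set to reproduce the hydrodynamic moments exactly. A sketch using NumPy's physicists' Hermite quadrature (illustrative of the principle, not the paper's specific lattice):

```python
import numpy as np

# A 3-point Gauss-Hermite rule is exact for polynomial degree <= 5 against
# the weight exp(-x**2); in lattice Boltzmann this exactness is what lets a
# small velocity set recover low-order hydrodynamic moments without error.
nodes, weights = np.polynomial.hermite.hermgauss(3)

def gh_moment(p):
    """Integral of x**p * exp(-x**2) over the real line via the 3-point rule."""
    return float(np.sum(weights * nodes ** p))
```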

  16. Comparison of Chain Conformation of Poly(vinyl alcohol) in Solutions and Melts from Quantum Chemistry Based Molecular Dynamics Simulations

    NASA Technical Reports Server (NTRS)

    Jaffe, Richard; Han, Jie; Matsuda, Tsunetoshi; Yoon, Do; Langhoff, Stephen R. (Technical Monitor)

    1997-01-01

    Conformations of 2,4-dihydroxypentane (DHP), a model molecule for poly(vinyl alcohol), have been studied by quantum chemistry (QC) calculations and molecular dynamics (MD) simulations. QC calculations at the 6-311G MP2 level show the meso tt conformer to be lowest in energy followed by the racemic tg, due to an intramolecular hydrogen bond between the hydroxy groups. The Dreiding force field has been modified to reproduce the QC conformer energies for DHP. MD simulations using this force field have been carried out for DHP molecules in the gas phase, melt, and CHCl3 and water solutions. Extensive intramolecular hydrogen bonding is observed for the gas phase and CHCl3 solution, but not for the melt or aqueous solution. Such a condensed phase effect due to intermolecular interactions results in a drastic change in chain conformations, in agreement with experiments.

  17. A system of IAC neural networks as the basis for self-organization in a sociological dynamical system simulation.

    PubMed

    Duong, D V; Reilly, K D

    1995-10-01

    This sociological simulation uses the ideas of semiotics and symbolic interactionism to demonstrate how an appropriately developed associative memory in the minds of individuals on the microlevel can self-organize into macrolevel dissipative structures of societies such as racial cultural/economic classes, status symbols and fads. The associative memory used is based on an extension of the IAC neural network (the Interactive Activation and Competition network). Several IAC networks act together to form a society by virtue of their human-like properties of intuition and creativity. These properties give them the ability to create and understand signs, which lead to the macrolevel structures of society. This system is implemented in hierarchical object oriented container classes which facilitate change in deep structure. Graphs of general trends and an historical account of a simulation run of this dynamical system are presented.
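    The IAC network underlying the model uses the shunting activation update of McClelland and Rumelhart, in which excitation is scaled by the distance to the activation ceiling and inhibition by the distance to the floor. A minimal sketch of one update step (parameter values are standard illustrative defaults, not necessarily those of the paper):

```python
def iac_update(a, net, amax=1.0, amin=-0.2, rest=-0.1, decay=0.1):
    """One Interactive Activation and Competition (IAC) activation step:
    positive net input pushes activation toward amax, negative toward amin,
    while decay pulls it back toward the resting level."""
    if net > 0:
        da = (amax - a) * net - decay * (a - rest)
    else:
        da = (a - amin) * net - decay * (a - rest)
    return a + da
```

    Because the drive terms shrink as activation approaches either bound, repeated updates keep activations inside [amin, amax] without explicit clipping.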

  18. Potential evapotranspiration and the likelihood of future drought

    NASA Technical Reports Server (NTRS)

    Rind, D.; Hansen, J.; Goldberg, R.; Rosenzweig, C.; Ruedy, R.

    1990-01-01

    The possibility that the greenhouse warming predicted by the GISS general-circulation model and other GCMs could lead to severe droughts is investigated by means of numerical simulations, with a focus on the role of potential evapotranspiration E(P). The relationships between precipitation (P), E(P), soil moisture, and vegetation changes in GCMs are discussed; the empirically derived Palmer drought-intensity index and a new supply-demand drought index (SDDI) based on changes in P - E(P) are described; and simulation results for the period 1960-2060 are presented in extensive tables, graphs, and computer-generated color maps. Simulations with both drought indices predict increasing drought frequency for the U.S., with effects already apparent in the 1990s and a 50-percent frequency of severe droughts by the 2050s. Analyses of arid periods during the Mesozoic and Cenozoic are shown to support the use of the SDDI in GCM drought prediction.
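    An index of this kind can be sketched as the P - E(P) anomaly standardized against a baseline climatology of the same quantity; this is a simplified reading of the supply-demand approach, and the paper's exact normalization may differ:

```python
import numpy as np

def sddi(p, ep, p_base, ep_base):
    """Supply-demand drought index sketch: the P - Ep anomaly standardized
    by the mean and standard deviation of P - Ep over a baseline period.
    Negative values indicate demand exceeding supply relative to baseline."""
    d = np.asarray(p, float) - np.asarray(ep, float)
    d_base = np.asarray(p_base, float) - np.asarray(ep_base, float)
    return (d - d_base.mean()) / d_base.std()
```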

  19. Faunus: An object oriented framework for molecular simulation

    PubMed Central

    Lund, Mikael; Trulsson, Martin; Persson, Björn

    2008-01-01

    Background We present a C++ class library for Monte Carlo simulation of molecular systems, including proteins in solution. The design is generic and highly modular, enabling multiple developers to easily implement additional features. The statistical mechanical methods are documented by extensive use of code comments that – subsequently – are collected to automatically build a web-based manual. Results We show how an object oriented design can be used to create an intuitively appealing coding framework for molecular simulation. This is exemplified in a minimalistic C++ program that can calculate protein protonation states. We further discuss performance issues related to high level coding abstraction. Conclusion C++ and the Standard Template Library (STL) provide a high-performance platform for generic molecular modeling. Automatic generation of code documentation from inline comments has proven particularly useful in that no separate manual needs to be maintained. PMID:18241331

  20. Modeling and Simulation of a Nuclear Fuel Element Test Section

    NASA Technical Reports Server (NTRS)

    Moran, Robert P.; Emrich, William

    2011-01-01

    "The Nuclear Thermal Rocket Element Environmental Simulator" test section closely simulates the internal operating conditions of a thermal nuclear rocket. The purpose of testing is to determine the ideal fuel rod characteristics for optimum thermal heat transfer to their hydrogen cooling/working fluid while still maintaining fuel rod structural integrity. Working fluid exhaust temperatures of up to 5,000 degrees Fahrenheit can be encountered. The exhaust gas is rendered inert and massively reduced in temperature for analysis using a combination of water cooling channels and cool N2 gas injectors in the H2-N2 mixer portion of the test section. An extensive thermal fluid analysis was performed in support of the engineering design of the H2-N2 mixer in order to determine the maximum "mass flow rate"-"operating temperature" curve of the fuel elements' hydrogen exhaust gas based on the test facility's available cooling N2 mass flow rate as the limiting factor.

  1. Extension of a Kinetic Approach to Chemical Reactions to Electronic Energy Levels and Reactions Involving Charged Species with Application to DSMC Simulations

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.

    2014-01-01

    The ability to compute rarefied, ionized hypersonic flows is becoming more important as missions such as Earth reentry, landing high mass payloads on Mars, and the exploration of the outer planets and their satellites are being considered. Recently introduced molecular-level chemistry models that predict equilibrium and nonequilibrium reaction rates using only kinetic theory and fundamental molecular properties are extended in the current work to include electronic energy level transitions and reactions involving charged particles. These extensions are shown to agree favorably with reported transition and reaction rates from the literature for near-equilibrium conditions. Also, the extensions are applied to the second flight of the Project FIRE flight experiment at 1634 seconds with a Knudsen number of 0.001 at an altitude of 76.4 km. In order to accomplish this, NASA's direct simulation Monte Carlo code DAC was rewritten to include the ability to simulate charge-neutral ionized flows, take advantage of the recently introduced chemistry model, and to include the extensions presented in this work. The 1634 second data point was chosen for comparisons to be made in order to include a CFD solution. The Knudsen number at this point in time is such that the DSMC simulations are still tractable and the CFD computations are at the edge of what is considered valid because, although near-transitional, the flow is still considered to be continuum. It is shown that the inclusion of electronic energy levels in the DSMC simulation is necessary for flows of this nature and is required for comparison to the CFD solution. The flow field solutions are also post-processed by the nonequilibrium radiation code HARA to compute the radiative portion.

  2. Simulating ungulate herbivory across forest landscapes: A browsing extension for LANDIS-II

    USGS Publications Warehouse

    DeJager, Nathan R.; Drohan, Patrick J.; Miranda, Brian M.; Sturtevant, Brian R.; Stout, Susan L.; Royo, Alejandro; Gustafson, Eric J.; Romanski, Mark C.

    2017-01-01

    Browsing ungulates alter forest productivity and vegetation succession through selective foraging on species that often dominate early succession. However, the long-term and large-scale effects of browsing on forest succession are not possible to project without the use of simulation models. To explore the effects of ungulates on succession in a spatially explicit manner, we developed a Browse Extension that simulates the effects of browsing ungulates on the growth and survival of plant species cohorts within the LANDIS-II spatially dynamic forest landscape simulation model framework. We demonstrate the capabilities of the new extension and explore the spatial effects of ungulates on forest composition and dynamics using two case studies. The first case study examined the long-term effects of persistently high white-tailed deer browsing rates in the northern hardwood forests of the Allegheny National Forest, USA. In the second case study, we incorporated a dynamic ungulate population model to simulate interactions between the moose population and boreal forest landscape of Isle Royale National Park, USA. In both model applications, browsing reduced total aboveground live biomass and caused shifts in forest composition. Simulations that included effects of browsing resulted in successional patterns that were more similar to those observed in the study regions compared to simulations that did not incorporate browsing effects. Further, model estimates of moose population density and available forage biomass were similar to previously published field estimates at Isle Royale and in other moose-boreal forest systems. Our simulations suggest that neglecting effects of browsing when modeling forest succession in ecosystems known to be influenced by ungulates may result in flawed predictions of aboveground biomass and tree species composition.

  3. Xi-cam: Flexible High Throughput Data Processing for GISAXS

    NASA Astrophysics Data System (ADS)

    Pandolfi, Ronald; Kumar, Dinesh; Venkatakrishnan, Singanallur; Sarje, Abinav; Krishnan, Hari; Pellouchoud, Lenson; Ren, Fang; Fournier, Amanda; Jiang, Zhang; Tassone, Christopher; Mehta, Apurva; Sethian, James; Hexemer, Alexander

    With increasing capabilities and data demand for GISAXS beamlines, supporting software is under development to handle larger data rates, volumes, and processing needs. We aim to provide a flexible and extensible approach to GISAXS data treatment as a solution to these rising needs. Xi-cam is the CAMERA platform for data management, analysis, and visualization. The core of Xi-cam is an extensible plugin-based GUI platform which provides users an interactive interface to processing algorithms. Plugins are available for SAXS/GISAXS data and data series visualization, as well as forward modeling and simulation through HipGISAXS. With Xi-cam's advanced mode, data processing steps are designed as a graph-based workflow, which can be executed locally or remotely. Remote execution utilizes HPC or de-localized resources, allowing for effective reduction of high-throughput data. Xi-cam is open-source and cross-platform. The processing algorithms in Xi-cam include parallel cpu and gpu processing optimizations, also taking advantage of external processing packages such as pyFAI. Xi-cam is available for download online.

  4. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
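    As one concrete example of the sampling methods such toolkits offer for uncertainty quantification, a Latin hypercube design places exactly one point in each equal-probability stratum per input dimension. A minimal sketch of the idea (not Dakota's implementation):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """Latin hypercube sample on [0, 1)^d: stratify each dimension into
    n_samples equal bins, draw one point per bin, then shuffle each
    column independently so bins are paired at random across dimensions."""
    rng = np.random.default_rng(seed)
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        rng.shuffle(u[:, j])   # in-place shuffle of one column
    return u
```

    Compared with plain Monte Carlo, this guarantees every marginal distribution is sampled evenly even for small sample counts, which is why stratified designs are popular for expensive simulation studies.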

  5. Drug discovery using very large numbers of patents. General strategy with extensive use of match and edit operations

    NASA Astrophysics Data System (ADS)

    Robson, Barry; Li, Jin; Dettinger, Richard; Peters, Amanda; Boyer, Stephen K.

    2011-05-01

    A patent database of 6.7 million compounds generated by a very high performance computer (Blue Gene) requires new techniques for exploitation when extensive use of chemical similarity is involved. Such exploitation includes the taxonomic classification of chemical themes, and data mining to assess mutual information between themes and companies. Importantly, we also launch candidates that evolve by "natural selection", driven both by failure of partial matches against the patent database and by their ability to bind the protein target appropriately, assessed by simulation on Blue Gene. An unusual feature of our method is that algorithms and workflows rely on dynamic interaction between match-and-edit instructions, which in practice are regular expressions. Similarity testing by these uses SMILES strings and, less frequently, graph or connectivity representations. Examining how this performs in high throughput, we note that chemical similarity and novelty are human concepts that largely have meaning by utility in specific contexts. For some purposes, mutual information involving chemical themes might be a better concept.

  6. NITPICK: peak identification for mass spectrometry data.

    PubMed

    Renard, Bernhard Y; Kirchner, Marc; Steen, Hanno; Steen, Judith A J; Hamprecht, Fred A

    2008-08-28

    The reliable extraction of features from mass spectra is a fundamental step in the automated analysis of proteomic mass spectrometry (MS) experiments. This contribution proposes a sparse template regression approach to peak picking called NITPICK. NITPICK is a Non-greedy, Iterative Template-based peak PICKer that deconvolves complex overlapping isotope distributions in multicomponent mass spectra. NITPICK is based on fractional averaging, a novel extension to Senko's well-known averaging model, and on a modified version of sparse, non-negative least angle regression, for which a suitable, statistically motivated early stopping criterion has been derived. The strength of NITPICK is the deconvolution of overlapping mixture mass spectra. Extensive comparative evaluation has been carried out and results are provided for simulated and real-world data sets. NITPICK outperforms pepex, to date the only other publicly available non-greedy feature extraction routine. NITPICK is available as a software package for the R programming language and can be downloaded from (http://hci.iwr.uni-heidelberg.de/mip/proteomics/).

  7. Theory and simulations of covariance mapping in multiple dimensions for data analysis in high-event-rate experiments

    NASA Astrophysics Data System (ADS)

    Zhaunerchyk, V.; Frasinski, L. J.; Eland, J. H. D.; Feifel, R.

    2014-05-01

    Multidimensional covariance analysis and its validity for correlation of processes leading to multiple products are investigated from a theoretical point of view. The need to correct for false correlations induced by experimental parameters which fluctuate from shot to shot, such as the intensity of self-amplified spontaneous emission x-ray free-electron laser pulses, is emphasized. Threefold covariance analysis based on simple extension of the two-variable formulation is shown to be valid for variables exhibiting Poisson statistics. In this case, false correlations arising from fluctuations in an unstable experimental parameter that scale linearly with signals can be eliminated by threefold partial covariance analysis, as defined here. Fourfold covariance based on the same simple extension is found to be invalid in general. Where fluctuations in an unstable parameter induce nonlinear signal variations, a technique of contingent covariance analysis is proposed here to suppress false correlations. In this paper we also show a method to eliminate false correlations associated with fluctuations of several unstable experimental parameters.
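    The threefold covariance obtained by "simple extension of the two-variable formulation" is the third joint cumulant over shots, which vanishes for independent signals. A sketch, assuming this standard cumulant form matches the paper's definition:

```python
import numpy as np

def threefold_cov(x, y, z):
    """Third joint cumulant of shot-by-shot signals x, y, z:
    E[xyz] - E[xy]E[z] - E[xz]E[y] - E[yz]E[x] + 2 E[x]E[y]E[z],
    equal to the mean triple product of the mean-subtracted signals."""
    x, y, z = (np.asarray(v, float) for v in (x, y, z))
    return ((x * y * z).mean()
            - (x * y).mean() * z.mean()
            - (x * z).mean() * y.mean()
            - (y * z).mean() * x.mean()
            + 2 * x.mean() * y.mean() * z.mean())
```

    Partial covariance corrections for a fluctuating parameter such as pulse intensity would then be subtracted from this quantity, shot statistics permitting.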

  8. A comparison of four streamflow record extension techniques

    USGS Publications Warehouse

    Hirsch, Robert M.

    1982-01-01

    One approach to developing time series of streamflow, which may be used for simulation and optimization studies of water resources development activities, is to extend an existing gage record in time by exploiting the interstation correlation between the station of interest and some nearby (long-term) base station. Four methods of extension are described, and their properties are explored. The methods are regression (REG), regression plus noise (RPN), and two new methods, maintenance of variance extension types 1 and 2 (MOVE.l, MOVE.2). MOVE.l is equivalent to a method which is widely used in psychology, biometrics, and geomorphology and which has been called by various names, e.g., ‘line of organic correlation,’ ‘reduced major axis,’ ‘unique solution,’ and ‘equivalence line.’ The methods are examined for bias and standard error of estimate of moments and order statistics, and an empirical examination is made of the preservation of historic low-flow characteristics using 50-year-long monthly records from seven streams. The REG and RPN methods are shown to have serious deficiencies as record extension techniques. MOVE.2 is shown to be marginally better than MOVE.l, according to the various comparisons of bias and accuracy.
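    The defining property of MOVE.1 (the "line of organic correlation") is that it uses slope sy/sx with the sign of the correlation, so extended records reproduce the mean and variance of the observed record rather than the shrunken variance of a regression estimate. A minimal sketch:

```python
import numpy as np

def move1(x, y, x_new):
    """MOVE.1 record extension: fit y ~ x over the concurrent period with
    slope sign(r) * sy / sx, then estimate y at the base-station values
    x_new. Fitted values preserve the mean and variance of y."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    r = np.corrcoef(x, y)[0, 1]
    m = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
    b = y.mean() - m * x.mean()
    return m * np.asarray(x_new, float) + b
```

    Applied back to the calibration data, the estimates match the short record's mean and standard deviation exactly, which is the variance-maintenance property that ordinary regression (REG) lacks.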

  9. A Comparison of Four Streamflow Record Extension Techniques

    NASA Astrophysics Data System (ADS)

    Hirsch, Robert M.

    1982-08-01

    One approach to developing time series of streamflow, which may be used for simulation and optimization studies of water resources development activities, is to extend an existing gage record in time by exploiting the interstation correlation between the station of interest and some nearby (long-term) base station. Four methods of extension are described, and their properties are explored. The methods are regression (REG), regression plus noise (RPN), and two new methods, maintenance of variance extension types 1 and 2 (MOVE.l, MOVE.2). MOVE.l is equivalent to a method which is widely used in psychology, biometrics, and geomorphology and which has been called by various names, e.g., `line of organic correlation,' `reduced major axis,' `unique solution,' and `equivalence line.' The methods are examined for bias and standard error of estimate of moments and order statistics, and an empirical examination is made of the preservation of historic low-flow characteristics using 50-year-long monthly records from seven streams. The REG and RPN methods are shown to have serious deficiencies as record extension techniques. MOVE.2 is shown to be marginally better than MOVE.l, according to the various comparisons of bias and accuracy.

  10. Simulations of material mixing in laser-driven reshock experiments

    NASA Astrophysics Data System (ADS)

    Haines, Brian M.; Grinstein, Fernando F.; Welser-Sherrill, Leslie; Fincke, James R.

    2013-02-01

    We perform simulations of a laser-driven reshock experiment [Welser-Sherrill et al., High Energy Density Phys. (unpublished)] in the strong-shock high energy-density regime to better understand material mixing driven by the Richtmyer-Meshkov instability. Validation of the simulations is based on direct comparison of simulation and radiographic data. Simulations are also compared with published direct numerical simulation and the theory of homogeneous isotropic turbulence. Despite the fact that the flow is neither homogeneous, isotropic nor fully turbulent, there are local regions in which the flow demonstrates characteristics of homogeneous isotropic turbulence. We identify and isolate these regions by the presence of high levels of turbulent kinetic energy (TKE) and vorticity. After reshock, our analysis shows characteristics consistent with those of incompressible isotropic turbulence. Self-similarity and effective Reynolds number assessments suggest that the results are reasonably converged at the finest resolution. Our results show that in shock-driven transitional flows, turbulent features such as self-similarity and isotropy only fully develop once de-correlation, characteristic vorticity distributions, and integrated TKE, have decayed significantly. Finally, we use three-dimensional simulation results to test the performance of two-dimensional Reynolds-averaged Navier-Stokes simulations. In this context, we also test a presumed probability density function turbulent mixing model extensively used in combustion applications.

  11. Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model.

    PubMed

    Liu, Fang; Velikina, Julia V; Block, Walter F; Kijowski, Richard; Samsonov, Alexey A

    2017-02-01

    We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphical processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing the generalized tissue model with multiple exchanging water and macromolecular proton pools rather than a system of independent proton isochromats typically used in previous simulators. The computational power needed for simulation of the biologically relevant tissue models in large 3D objects is gained using parallelized execution on GPU. Three simulated and one actual MRI experiments were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and demonstrate detrimental effects of simplified treatment of tissue micro-organization adapted in previous simulators. GPU execution allowed a ∼200× improvement in computational speed over a standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to infer quantitatively tissue composition and microstructure.
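    The multi-pool exchange idea can be sketched with the longitudinal Bloch-McConnell equations for two exchanging proton pools, where the back-exchange rate is fixed by detailed balance. This is an illustrative toy integration with hypothetical parameter values, not MRiLab's implementation:

```python
def relax_two_pool(ma0, mb0, m0a=1.0, m0b=0.15, r1a=1.0, r1b=2.0,
                   kab=2.0, dt=1e-3, steps=20000):
    """Euler-integrate longitudinal Bloch-McConnell for two exchanging
    pools (free water a, macromolecular b). Detailed balance fixes
    kba = kab * m0a / m0b, so (m0a, m0b) is the steady state."""
    kba = kab * m0a / m0b
    ma, mb = ma0, mb0
    for _ in range(steps):
        dma = -(r1a + kab) * ma + kba * mb + r1a * m0a
        dmb = kab * ma - (r1b + kba) * mb + r1b * m0b
        ma, mb = ma + dt * dma, mb + dt * dmb
    return ma, mb
```

    Starting from saturation, both pools relax to their equilibrium magnetizations at rates coupled by the exchange terms, which is the behavior a system of independent isochromats cannot reproduce.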

  12. Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model

    PubMed Central

    Velikina, Julia V.; Block, Walter F.; Kijowski, Richard; Samsonov, Alexey A.

    2017-01-01

    We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphical processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing the generalized tissue model with multiple exchanging water and macromolecular proton pools rather than a system of independent proton isochromats typically used in previous simulators. The computational power needed for simulation of the biologically relevant tissue models in large 3D objects is gained using parallelized execution on GPU. Three simulated and one actual MRI experiments were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and demonstrate detrimental effects of simplified treatment of tissue micro-organization adapted in previous simulators. GPU execution allowed a ∼200× improvement in computational speed over a standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to infer quantitatively tissue composition and microstructure. PMID:28113746

  13. Simulation of transmission electron microscope images of biological specimens.

    PubMed

    Rullgård, H; Ofverstedt, L-G; Masich, S; Daneholt, B; Oktem, O

    2011-09-01

    We present a new approach to simulate electron cryo-microscope images of biological specimens. The framework for simulation consists of two parts; the first is a phantom generator that generates a model of a specimen suitable for simulation, the second is a transmission electron microscope simulator. The phantom generator calculates the scattering potential of an atomic structure in aqueous buffer and allows the user to define the distribution of molecules in the simulated image. The simulator includes a well defined electron-specimen interaction model based on the scalar Schrödinger equation, the contrast transfer function for optics, and a noise model that includes shot noise as well as detector noise including detector blurring. To enable optimal performance, the simulation framework also includes a calibration protocol for setting simulation parameters. To test the accuracy of the new framework for simulation, we compare simulated images to experimental images recorded of the Tobacco Mosaic Virus (TMV) in vitreous ice. The simulated and experimental images show good agreement with respect to contrast variations depending on dose and defocus. Furthermore, random fluctuations present in experimental and simulated images exhibit similar statistical properties. The simulator has been designed to provide a platform for development of new instrumentation and image processing procedures in single particle electron microscopy, two-dimensional crystallography and electron tomography with well documented protocols and an open source code into which new improvements and extensions are easily incorporated. © 2011 The Authors Journal of Microscopy © 2011 Royal Microscopical Society.
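    The contrast transfer function for the optics mentioned above has the standard phase form sin(χ(q)) with a defocus and a spherical-aberration term. A sketch under a common sign convention (conventions differ between packages, and this is not the paper's specific parameterization):

```python
import numpy as np

def ctf(q, defocus_um=1.0, cs_mm=2.0, voltage_kv=300.0):
    """Phase contrast transfer function sin(chi(q)) with
    chi = pi*lambda*defocus*q**2 - (pi/2)*Cs*lambda**3*q**4,
    spatial frequency q in 1/Angstrom."""
    v = voltage_kv * 1e3
    # Relativistically corrected electron wavelength in Angstrom
    lam = 12.2639 / np.sqrt(v * (1.0 + 0.97845e-6 * v))
    df = defocus_um * 1e4      # micrometers -> Angstrom
    cs = cs_mm * 1e7           # millimeters -> Angstrom
    chi = np.pi * lam * df * q ** 2 - 0.5 * np.pi * cs * lam ** 3 * q ** 4
    return np.sin(chi)
```

    In a full image simulation this oscillating function multiplies the specimen's scattering spectrum, producing the defocus-dependent contrast variations compared against experiment in the paper.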

  14. Measurement with microscopic MRI and simulation of flow in different aneurysm models.

    PubMed

    Edelhoff, Daniel; Walczak, Lars; Frank, Frauke; Heil, Marvin; Schmitz, Inge; Weichert, Frank; Suter, Dieter

    2015-10-01

    The impact and the development of aneurysms depend to a significant degree on the exchange of liquid between the regular vessel and the pathological extension. A better understanding of this process will lead to improved prediction capabilities. The aim of the current study was to investigate fluid-exchange in aneurysm models of different complexities by combining microscopic magnetic resonance measurements with numerical simulations. In order to evaluate the accuracy and applicability of these methods, the fluid-exchange process between the unaltered vessel lumen and the aneurysm phantoms was analyzed quantitatively using high spatial resolution. Magnetic resonance flow imaging was used to visualize fluid-exchange in two different models produced with a 3D printer. One model of an aneurysm was based on histological findings. The flow distribution in the different models was measured on a microscopic scale using time of flight magnetic resonance imaging. The whole experiment was simulated using fast graphics processing unit-based numerical simulations. The obtained simulation results were compared qualitatively and quantitatively with the magnetic resonance imaging measurements, taking into account flow and spin-lattice relaxation. The results of both presented methods compared well for the used aneurysm models and the chosen flow distributions. The results from the fluid-exchange analysis showed comparable characteristics concerning measurement and simulation. Similar symmetry behavior was observed. Based on these results, the amount of fluid-exchange was calculated. Depending on the geometry of the models, 7% to 45% of the liquid was exchanged per second. The result of the numerical simulations coincides well with the experimentally determined velocity field. The rate of fluid-exchange between vessel and aneurysm was well-predicted. Hence, the results obtained by simulation could be validated by the experiment. The observed deviations can be caused by the noise in the measurement and by the limited resolution of the simulation. The resulting differences are small enough to allow reliable predictions of the flow distribution in vessels with stents and for pulsed blood flow.

  15. Systematic reconstruction of TRANSPATH data into Cell System Markup Language

    PubMed Central

    Nagasaki, Masao; Saito, Ayumu; Li, Chen; Jeong, Euna; Miyano, Satoru

    2008-01-01

    Background Many biological repositories store information based on experimental study of the biological processes within a cell, such as protein-protein interactions, metabolic pathways, signal transduction pathways, or regulations of transcription factors and miRNA. Unfortunately, it is difficult to directly use such information when generating simulation-based models. Thus, modeling rules for encoding biological knowledge into system-dynamics-oriented standardized formats would be very useful for fully understanding cellular dynamics at the system level. Results We selected the TRANSPATH database, a manually curated high-quality pathway database, which provides a plentiful source of cellular events in humans, mice, and rats, collected from over 31,500 publications. In this work, we have developed 16 modeling rules based on hybrid functional Petri net with extension (HFPNe), which is suitable for graphically representing and simulating biological processes. In the modeling rules, each Petri net element is incorporated with Cell System Ontology (CSO) to enable semantic interoperability of models. As a formal ontology for biological pathway modeling with dynamics, CSO also defines biological terminology and corresponding icons. By combining HFPNe with the CSO features, it is possible to convert TRANSPATH data into simulation-based and semantically valid models. The results are encoded into a biological pathway format, Cell System Markup Language (CSML), which eases the exchange and integration of biological data and models. Conclusion By using the 16 modeling rules, 97% of the reactions in TRANSPATH are converted into simulation-based models represented in CSML. This reconstruction demonstrates that it is possible to use our rules to generate quantitative models from static pathway descriptions. PMID:18570683
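    The discrete core that HFPNe extends can be sketched as an ordinary Petri-net firing rule: a transition is enabled when its input places hold enough tokens, and firing consumes and produces tokens. A minimal illustration (not the HFPNe formalism itself, which adds continuous and generic elements):

```python
def fire(marking, transition):
    """Fire one discrete Petri-net transition if enabled. A transition is
    a pair (inputs, outputs) of {place: token_count} dicts; firing consumes
    input tokens and produces output tokens, else leaves marking unchanged."""
    inputs, outputs = transition
    if any(marking.get(p, 0) < n for p, n in inputs.items()):
        return marking                       # not enabled
    new = dict(marking)
    for p, n in inputs.items():
        new[p] -= n
    for p, n in outputs.items():
        new[p] = new.get(p, 0) + n
    return new
```

    In a pathway model, places would correspond to molecular species and transitions to reactions, which is the mapping the 16 modeling rules systematize.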

  16. Systematic reconstruction of TRANSPATH data into cell system markup language.

    PubMed

    Nagasaki, Masao; Saito, Ayumu; Li, Chen; Jeong, Euna; Miyano, Satoru

    2008-06-23

    Many biological repositories store information based on experimental study of the biological processes within a cell, such as protein-protein interactions, metabolic pathways, signal transduction pathways, or regulation of transcription factors and miRNA. Unfortunately, it is difficult to directly use such information when generating simulation-based models. Thus, modeling rules for encoding biological knowledge into system-dynamics-oriented standardized formats would be very useful for fully understanding cellular dynamics at the system level. We selected the TRANSPATH database, a manually curated high-quality pathway database, which provides a plentiful source of cellular events in humans, mice, and rats, collected from over 31,500 publications. In this work, we have developed 16 modeling rules based on hybrid functional Petri net with extension (HFPNe), which is suitable for graphically representing and simulating biological processes. In the modeling rules, each Petri net element is incorporated with the Cell System Ontology (CSO) to enable semantic interoperability of models. As a formal ontology for biological pathway modeling with dynamics, CSO also defines biological terminology and corresponding icons. By combining HFPNe with the CSO features, it is possible to convert TRANSPATH data into simulation-based and semantically valid models. The results are encoded into a biological pathway format, Cell System Markup Language (CSML), which eases the exchange and integration of biological data and models. By using the 16 modeling rules, 97% of the reactions in TRANSPATH are converted into simulation-based models represented in CSML. This reconstruction demonstrates that it is possible to use our rules to generate quantitative models from static pathway descriptions.
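
    The hybrid functional Petri net with extension (HFPNe) at the core of these modeling rules can be illustrated, in a heavily simplified continuous-only form, as places carrying real-valued marks and transitions that move mark between places at marking-dependent speeds. The place names, rate law, and constants below are illustrative assumptions, not TRANSPATH content:

```python
# Highly simplified, continuous-only sketch of the HFPNe idea behind CSML
# models: places carry real-valued marks; each transition moves mark between
# places at a speed computed from the current marking. Illustrative only.

def simulate(places, transitions, dt, steps):
    """Euler-integrate the transition speeds over the marking."""
    m = dict(places)
    for _ in range(steps):
        flows = {p: 0.0 for p in m}
        for inputs, outputs, speed in transitions:
            v = speed(m)                     # marking-dependent firing speed
            for p in inputs:
                flows[p] -= v
            for p in outputs:
                flows[p] += v
        for p in m:
            m[p] = max(0.0, m[p] + dt * flows[p])
    return m

# Hypothetical example: enzymatic conversion A -> B with a
# Michaelis-Menten-style speed function (Vmax = 1.0, Km = 0.5).
conversion = [(("A",), ("B",), lambda m: 1.0 * m["A"] / (0.5 + m["A"]))]
```

    Under these assumptions, substance A is gradually converted to B while total mark is conserved, which is the kind of quantitative behaviour a CSML model derived from a static pathway description is meant to exhibit.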

  17. PhysiCell: An open source physics-based cell simulator for 3-D multicellular systems

    PubMed Central

    Ghaffarizadeh, Ahmadreza; Mumenthaler, Shannon M.

    2018-01-01

    Many multicellular systems problems can only be understood by studying how cells move, grow, divide, interact, and die. Tissue-scale dynamics emerge from systems of many interacting cells as they respond to and influence their microenvironment. The ideal “virtual laboratory” for such multicellular systems simulates both the biochemical microenvironment (the “stage”) and many mechanically and biochemically interacting cells (the “players” upon the stage). PhysiCell—physics-based multicellular simulator—is an open source agent-based simulator that provides both the stage and the players for studying many interacting cells in dynamic tissue microenvironments. It builds upon a multi-substrate biotransport solver to link cell phenotype to multiple diffusing substrates and signaling factors. It includes biologically-driven sub-models for cell cycling, apoptosis, necrosis, solid and fluid volume changes, mechanics, and motility “out of the box.” The C++ code has minimal dependencies, making it simple to maintain and deploy across platforms. PhysiCell has been parallelized with OpenMP, and its performance scales linearly with the number of cells. Simulations up to 10^5-10^6 cells are feasible on quad-core desktop workstations; larger simulations are attainable on single HPC compute nodes. We demonstrate PhysiCell by simulating the impact of necrotic core biomechanics, 3-D geometry, and stochasticity on the dynamics of hanging drop tumor spheroids and ductal carcinoma in situ (DCIS) of the breast. We demonstrate stochastic motility, chemical and contact-based interaction of multiple cell types, and the extensibility of PhysiCell with examples in synthetic multicellular systems (a “cellular cargo delivery” system, with application to anti-cancer treatments), cancer heterogeneity, and cancer immunology. PhysiCell is a powerful multicellular systems simulator that will be continually improved with new capabilities and performance improvements.
It also represents a significant independent code base for replicating results from other simulation platforms. The PhysiCell source code, examples, documentation, and support are available under the BSD license at http://PhysiCell.MathCancer.org and http://PhysiCell.sf.net. PMID:29474446
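
    The stage/players split described above can be sketched with a deliberately minimal agent-based model: a 1-D diffusing substrate field (the stage) and cell agents (the players) that consume it, grow, divide at a volume threshold, and are removed when the local substrate falls below a hypoxic limit. All names, rates, and thresholds are illustrative assumptions, not PhysiCell's actual sub-models:

```python
import random

# The biochemical "stage": a 1-D substrate field with diffusion, decay, and
# per-cell uptake (explicit finite differences, fixed-value boundaries).
DX, DT, DIFF, DECAY = 1.0, 0.1, 1.0, 0.01
UPTAKE, GROWTH, DIVIDE_VOL, HYPOXIC = 0.5, 0.02, 2.0, 0.05

def step_substrate(field, cells):
    new = field[:]
    for i in range(1, len(field) - 1):
        lap = (field[i - 1] - 2.0 * field[i] + field[i + 1]) / DX ** 2
        new[i] = field[i] + DT * (DIFF * lap - DECAY * field[i])
    for c in cells:                       # each agent consumes locally
        i = c["pos"]
        new[i] = max(0.0, new[i] - DT * UPTAKE * field[i])
    return new

# The "players": agents that grow with local substrate, divide at a volume
# threshold, and are removed (necrosis) when the substrate is too low.
def step_cells(cells, field, rng):
    out = []
    for c in cells:
        o2 = field[c["pos"]]
        if o2 < HYPOXIC:
            continue                      # necrotic: drop the agent
        c["vol"] += DT * GROWTH * o2      # substrate-limited growth
        if c["vol"] >= DIVIDE_VOL:        # divide into two half-volume daughters
            c["vol"] *= 0.5
            j = min(len(field) - 1, max(0, c["pos"] + rng.choice((-1, 1))))
            out.append({"pos": j, "vol": c["vol"]})
        out.append(c)
    return out
```

    Note the two update phases per time step (field first, then agents); production simulators like PhysiCell parallelize the per-agent loop, which has no cross-agent dependencies here.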

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierre, John W.; Wies, Richard; Trudnowski, Daniel

    Time-synchronized measurements provide rich information for estimating a power-system's electromechanical modal properties via advanced signal processing. This information is becoming critical for the improved operational reliability of interconnected grids. A given mode's properties are described by its frequency, damping, and shape. Modal frequencies and damping are useful indicators of power-system stress, usually declining with increased load or reduced grid capacity. Mode shape provides critical information for operational control actions. This project investigated many advanced techniques for power system identification from measured data focusing on mode frequency and damping ratio estimation. Investigators from the three universities coordinated their effort with Pacific Northwest National Laboratory (PNNL). Significant progress was made on developing appropriate techniques for system identification with confidence intervals and testing those techniques on field measured data and through simulation. Experimental data from the western area power system was provided by PNNL and Bonneville Power Administration (BPA) for both ambient conditions and for signal injection tests. Three large-scale tests were conducted for the western area in 2005 and 2006. Measured field PMU (Phasor Measurement Unit) data was provided to the three universities. A 19-machine simulation model was enhanced for testing the system identification algorithms. Extensive simulations were run with this model to test the performance of the algorithms. University of Wyoming researchers participated in four primary activities: (1) Block and adaptive processing techniques for mode estimation from ambient signals and probing signals, (2) confidence interval estimation, (3) probing signal design and injection method analysis, and (4) performance assessment and validation from simulated and field measured data.
    Subspace based methods have been used to improve previous results from block processing techniques. Bootstrap techniques have been developed to estimate confidence intervals for the electromechanical modes from field measured data. Results were obtained using injected signal data provided by BPA. A new probing signal was designed that puts more strength into the signal for a given maximum peak to peak swing. Further simulations were conducted on a model based on measured data and with the modifications of the 19-machine simulation model. Montana Tech researchers participated in two primary activities: (1) continued development of the 19-machine simulation test system to include a DC line; and (2) extensive simulation analysis of the various system identification algorithms and bootstrap techniques using the 19 machine model. Researchers at the University of Alaska-Fairbanks focused on the development and testing of adaptive filter algorithms for mode estimation using data generated from simulation models and on data provided in collaboration with BPA and PNNL. Their efforts consisted of pre-processing field data, and testing and refining adaptive filter techniques (specifically the Least Mean Squares (LMS), the Adaptive Step-size LMS (ASLMS), and Error Tracking (ET) algorithms). They also improved convergence of the adaptive algorithms by using an initial estimate from a block-processing AR method to initialize the weight vector for LMS. Extensive testing was performed on simulated data from the 19 machine model. This project was also extensively involved in the WECC (Western Electricity Coordinating Council) system wide tests carried out in 2005 and 2006. These tests involved injecting known probing signals into the western power grid. One of the primary goals of these tests was the reliable estimation of electromechanical mode properties from measured PMU data.
    Applied to the system were three types of probing inputs: (1) activation of the Chief Joseph Dynamic Brake, (2) mid-level probing at the Pacific DC Intertie (PDCI), and (3) low-level probing on the PDCI. The Chief Joseph Dynamic Brake is a 1400 MW disturbance to the system and is injected for half a second. For the mid and low-level probing, the Celilo terminal of the PDCI is modulated with a known probing signal. Similar but less extensive tests were conducted in June of 2000. The low-level probing signals were designed at the University of Wyoming. A number of important design factors are considered. The designed low-level probing signal used in the tests is a multi-sine signal. Its frequency content is focused in the range of the inter-area electromechanical modes. The most frequently used of these low-level multi-sine signals had a period of over two minutes, a root-mean-square (rms) value of 14 MW, and a peak magnitude of 20 MW. Up to 15 cycles of this probing signal were injected into the system resulting in a processing gain of 15. The resulting measured response at points throughout the system was not much larger than the ambient noise present in the measurements.
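
    The workflow named above, a block-processing AR fit used to initialize an LMS adaptive filter, with modal frequency and damping read off the AR poles, can be sketched for a single AR(2) electromechanical mode. This is a simplified illustration with assumed constants, not the project's algorithms:

```python
import math, cmath

def ar2_block_fit(y):
    """Block least-squares fit of y[n] ~ a1*y[n-1] + a2*y[n-2] (2x2 normal eqs)."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for n in range(2, len(y)):
        x1, x2 = y[n - 1], y[n - 2]
        s11 += x1 * x1; s12 += x1 * x2; s22 += x2 * x2
        b1 += x1 * y[n]; b2 += x2 * y[n]
    det = s11 * s22 - s12 * s12
    return [(s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det]

def lms_track(y, w, mu=0.05):
    """Refine/track the AR(2) weights with the LMS stochastic-gradient update."""
    w = list(w)
    for n in range(2, len(y)):
        x = (y[n - 1], y[n - 2])
        e = y[n] - (w[0] * x[0] + w[1] * x[1])   # one-step prediction error
        w[0] += mu * e * x[0]
        w[1] += mu * e * x[1]
    return w

def mode_from_ar2(a1, a2, fs=1.0):
    """Frequency (Hz) and damping ratio of an oscillatory AR(2) pole pair."""
    pole = (a1 + cmath.sqrt(a1 * a1 + 4.0 * a2)) / 2.0
    s = cmath.log(pole) * fs              # map discrete pole to s-plane
    return abs(s.imag) / (2.0 * math.pi), -s.real / abs(s)
```

    With a noiseless damped oscillation, the block fit is already exact, so the LMS pass leaves the weights essentially unchanged; on streaming field data the LMS update would instead track slow changes in the mode.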

  19. Toward a Principled Sampling Theory for Quasi-Orders

    PubMed Central

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, even for item sets of up to 50 items, the new algorithms create close-to-representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets. PMID:27965601

  20. Toward a Principled Sampling Theory for Quasi-Orders.

    PubMed

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, even for item sets of up to 50 items, the new algorithms create close-to-representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets.
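
    For contrast with the principled samplers developed in the paper, the naive baseline is easy to sketch: draw a random reflexive relation, then repair transitivity with Warshall's closure. Every output is a valid quasi-order, but the repair step over-represents large relations, which is precisely the kind of sampling bias the doubly inductive procedure is designed to avoid (the edge probability and item count below are arbitrary):

```python
import random

def transitive_closure(rel, items):
    """Warshall's algorithm: the smallest transitive relation containing rel."""
    closed = set(rel)
    for k in items:
        for i in items:
            if (i, k) in closed:
                for j in items:
                    if (k, j) in closed:
                        closed.add((i, j))
    return closed

def naive_random_quasi_order(items, p, rng):
    """Baseline sampler: random reflexive relation, repaired by closure.

    Every output is a quasi-order, but the repair step biases the sample
    toward larger relations."""
    rel = {(i, i) for i in items}                       # reflexivity
    rel |= {(i, j) for i in items for j in items
            if i != j and rng.random() < p}
    return transitive_closure(rel, items)

def is_quasi_order(rel, items):
    """Check reflexivity and transitivity."""
    if any((i, i) not in rel for i in items):
        return False
    return all((i, j) in rel
               for (i, k1) in rel for (k2, j) in rel if k1 == k2)
```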

  1. Real gas CFD simulations of hydrogen/oxygen supercritical combustion

    NASA Astrophysics Data System (ADS)

    Pohl, S.; Jarczyk, M.; Pfitzner, M.; Rogg, B.

    2013-03-01

    A comprehensive numerical framework has been established to simulate reacting flows under conditions typically encountered in rocket combustion chambers. The model implemented into the commercial CFD code ANSYS CFX includes appropriate real gas relations based on the volume-corrected Peng-Robinson (PR) equation of state (EOS) for the flow field and a real gas extension of the laminar flamelet combustion model. The results indicate that the real gas relations have a considerably larger impact on the flow field than on the detailed flame structure. Generally, a realistic flame shape, as compared to experimental data from the Mascotte test rig V03 operated at ONERA, could be achieved with the real gas approach when differential diffusion processes were considered only within the flame zone.
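
    The pressure-explicit Peng-Robinson EOS underlying these real gas relations can be sketched directly from its standard constants; the optional translation constant c below stands in for the volume correction (the specific correction used in the paper is not reproduced here):

```python
import math

R = 8.314462618  # universal gas constant, J/(mol K)

def pr_pressure(T, v, Tc, Pc, omega, c=0.0):
    """Peng-Robinson pressure [Pa] at temperature T [K] and molar volume
    v [m^3/mol]; c is an optional volume-translation constant ('volume-
    corrected' PR variants shift v by a small c to improve liquid densities)."""
    a = 0.45724 * R ** 2 * Tc ** 2 / Pc
    b = 0.07780 * R * Tc / Pc
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega ** 2
    alpha = (1.0 + kappa * (1.0 - math.sqrt(T / Tc))) ** 2
    vt = v + c                          # translated molar volume
    return R * T / (vt - b) - a * alpha / (vt * (vt + b) + b * (vt - b))
```

    At large molar volume the expression reduces to the ideal-gas law, and at the critical point it returns Pc by construction (the PR critical compressibility is Zc ~ 0.307).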

  2. Cavitation-based hydro-fracturing simulator

    DOEpatents

    Wang, Jy-An John; Wang, Hong; Ren, Fei; Cox, Thomas S.

    2016-11-22

    An apparatus 300 for simulating a pulsed pressure induced cavitation technique (PPCT) from a pressurized working fluid (F) supports laboratory research and development for enhanced geothermal systems (EGS) and oil and gas wells. A pump 304 is configured to deliver a pressurized working fluid (F) to a control valve 306, which produces a pulsed pressure wave in a test chamber 308. The pulsed pressure wave parameters are defined by the pump 304 pressure and control valve 306 cycle rate. When a working fluid (F) and a rock specimen 312 are included in the apparatus, the pulsed pressure wave causes cavitation to occur at the surface of the specimen 312, thus initiating an extensive network of fracturing surfaces and micro fissures, which can be examined by researchers.

  3. Multispectral Resource Sampler (MRS): Proof of concept. Study on bidirectional reflectance. A simulation analysis of bidirectional reflectance properties and their effects on scene radiance. Implications for the MRS

    NASA Technical Reports Server (NTRS)

    Smith, J. A.

    1980-01-01

    A study was performed to evaluate the geometrical implications of the Multispectral Resource Sampler, a pointable sensor. Several vegetative targets representative of natural and agricultural canopies were considered in two wavelength bands. All combinations of Sun and view angles between 5 and 85 degrees zenith for a range of azimuths were simulated to examine the geometrical dependence arising from seasonal as well as latitudinal variation. The effects of three different atmospheres corresponding to clear, medium, and heavy haze conditions are included. An extensive model database was generated to provide investigators with a means for further study of atmospheric correction procedures and sensor design questions.

  4. Demixing-stimulated lane formation in binary complex plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Du, C.-R.; Jiang, K.; Suetterlin, K. R.

    2011-11-29

    Recently lane formation and phase separation have been reported for experiments with binary complex plasmas in the PK3-Plus laboratory onboard the International Space Station (ISS). Positive non-additivity of particle interactions is known to stimulate phase separation (demixing), but its effect on lane formation is unknown. In this work, we used Langevin dynamics (LD) simulation to probe the role of non-additive interactions in lane formation. The competition between laning and demixing leads to thicker lanes. Analysis based on anisotropic scaling indices reveals a crossover from a normal laning mode to a demixing-stimulated laning mode. Extensive numerical simulations enabled us to identify a critical value of the non-additivity parameter Δ for the crossover.

  5. CPV cells cooling system based on submerged jet impingement: CFD modeling and experimental validation

    NASA Astrophysics Data System (ADS)

    Montorfano, Davide; Gaetano, Antonio; Barbato, Maurizio C.; Ambrosetti, Gianluca; Pedretti, Andrea

    2014-09-01

    Concentrating photovoltaic (CPV) cells offer higher efficiencies than conventional PV cells and allow the overall solar cell area to be strongly reduced. However, to operate correctly and exploit their advantages, their temperature has to be kept low and as uniform as possible, and the cooling circuit pressure drops need to be limited. In this work an impingement water jet cooling system specifically designed for an industrial HCPV receiver is studied. Through the literature and by means of accurate computational fluid dynamics (CFD) simulations, the nozzle-to-plate distance, the number of jets, and the nozzle pitch, i.e. the distance between adjacent jets, were optimized. Afterwards, extensive experimental tests were performed to validate the pressure drop and cooling power simulation results.

  6. Mass balances for a biological life support system simulation model

    NASA Technical Reports Server (NTRS)

    Volk, Tyler; Rummel, John D.

    1987-01-01

    Design decisions to aid the development of future space-based biological life support systems (BLSS) can be made with simulation models. The biochemical stoichiometry was developed for: (1) protein, carbohydrate, fat, fiber, and lignin production in the edible and inedible parts of plants; (2) food consumption and production of organic solids in urine, feces, and wash water by the humans; and (3) operation of the waste processor. Flux values for all components are derived for a steady-state system with wheat as the sole food source. The large-scale dynamics of a materially closed BLSS computer model is described in a companion paper. An extension of this methodology can explore multifood systems and more complex biochemical dynamics while maintaining whole-system closure as a focus.

  7. Numerical Simulation on the Dynamic Splitting Tensile Test of reinforced concrete

    NASA Astrophysics Data System (ADS)

    Zhao, Zhuan; Jia, Haokai; Jing, Lin

    2018-03-01

    The crack resistance of reinforced concrete (RC) was studied using the split Hopkinson bar and the numerical simulation software LS-DYNA3D. The differences in dynamic splitting failure modes between plain concrete and reinforced concrete were examined, and the variation of the tensile stress distribution with reinforcement ratio was studied; the relationship between strain rate and crack resistance was also discussed using the radial tensile stress time-history curves of RC specimens under different loading speeds. The results show that reinforcement in the concrete can impede crack extension, delay the failure time of the concrete, and increase its tensile strength; as the strain rate of the concrete increased, the crack resistance of the RC increased.

  8. Use of high performance networks and supercomputers for real-time flight simulation

    NASA Technical Reports Server (NTRS)

    Cleveland, Jeff I., II

    1993-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be consistent in processing time and be completed in as short a time as possible. These operations include simulation mathematical model computation and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to the Computer Automated Measurement and Control (CAMAC) technology which resulted in a factor of ten increase in the effective bandwidth and reduced latency of modules necessary for simulator communication. This technology extension is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC are completing the development of the use of supercomputers for mathematical model computation to support real-time flight simulation. This includes the development of a real-time operating system and development of specialized software and hardware for the simulator network. This paper describes the data acquisition technology and the development of supercomputing for flight simulation.

  9. Double Cluster Heads Model for Secure and Accurate Data Fusion in Wireless Sensor Networks

    PubMed Central

    Fu, Jun-Song; Liu, Yun

    2015-01-01

    Secure and accurate data fusion is an important issue in wireless sensor networks (WSNs) and has been extensively researched in the literature. In this paper, by combining clustering techniques, reputation and trust systems, and data fusion algorithms, we propose a novel cluster-based data fusion model called Double Cluster Heads Model (DCHM) for secure and accurate data fusion in WSNs. Different from traditional clustering models in WSNs, two cluster heads are selected after clustering for each cluster based on the reputation and trust system and they perform data fusion independently of each other. Then, the results are sent to the base station where the dissimilarity coefficient is computed. If the dissimilarity coefficient of the two data fusion results exceeds the threshold preset by the users, the cluster heads are added to a blacklist and must be reelected by the sensor nodes in the cluster. Meanwhile, feedback is sent from the base station to the reputation and trust system, which can help us to identify and delete the compromised sensor nodes in time. Through a series of extensive simulations, we found that the DCHM performed very well in data fusion security and accuracy. PMID:25608211
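
    The base-station check at the heart of DCHM, comparing the two heads' independent fusion results through a dissimilarity coefficient against a user-set threshold, can be sketched as follows. The trimmed-mean fusion rule, the normalized dissimilarity measure, and the threshold value are illustrative assumptions, not the paper's exact definitions:

```python
def fuse(readings):
    """One cluster head's fusion of member readings; a trimmed mean is a
    simple outlier-resistant choice (the fusion rule here is an assumption)."""
    s = sorted(readings)
    k = max(1, len(s) // 10)              # drop top/bottom ~10%
    core = s[k:-k] if len(s) > 2 * k else s
    return sum(core) / len(core)

def dissimilarity(f1, f2):
    """Normalized gap between the two heads' independent results."""
    return abs(f1 - f2) / max(abs(f1), abs(f2), 1e-12)

def base_station_check(readings_h1, readings_h2, threshold=0.05):
    """Accept and combine if the heads agree; otherwise flag the cluster so
    the heads can be blacklisted and re-elected."""
    f1, f2 = fuse(readings_h1), fuse(readings_h2)
    if dissimilarity(f1, f2) > threshold:
        return False, None
    return True, 0.5 * (f1 + f2)
```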

  10. A Novel DFT-Based DOA Estimation by a Virtual Array Extension Using Simple Multiplications for FMCW Radar

    PubMed Central

    Kim, Bongseok; Kim, Sangdong; Lee, Jonghun

    2018-01-01

    We propose a novel discrete Fourier transform (DFT)-based direction of arrival (DOA) estimation by a virtual array extension using simple multiplications for frequency modulated continuous wave (FMCW) radar. DFT-based DOA estimation is usually employed in radar systems because it provides the advantage of low complexity for real-time signal processing. In order to enhance the resolution of DOA estimation or to decrease the missing detection probability, it is essential to have a considerable number of channel signals. However, due to constraints of space and cost, it is not easy to increase the number of channel signals. In order to address this issue, we increase the number of effective channel signals by generating virtual channel signals using simple multiplications of the given channel signals. The increase in channel signals allows the proposed scheme to detect DOA more accurately than the conventional scheme while using the same number of channel signals. Simulation results show that the proposed scheme achieves improved DOA estimation compared to the conventional DFT-based method. Furthermore, the effectiveness of the proposed scheme in a practical environment is verified through the experiment. PMID:29758016
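
    The virtual-array idea can be illustrated in the simplest setting: one unit-amplitude source on a uniform linear array, where multiplying two channel signals adds their phase progressions, so products of the given channels behave like elements beyond the physical aperture, and the DOA is then read off a zero-padded DFT peak. This noiseless single-source sketch with an assumed half-wavelength geometry is not the paper's exact scheme:

```python
import cmath, math

def dft_doa(ch, d_over_lambda=0.5, n_fft=512):
    """Single-source DOA (degrees) from one ULA snapshot: zero-padded DFT,
    pick the peak bin, map spatial frequency to angle."""
    best_k, best_mag = 0, -1.0
    for k in range(n_fft):
        acc = sum(ch[n] * cmath.exp(-2j * math.pi * k * n / n_fft)
                  for n in range(len(ch)))
        if abs(acc) > best_mag:
            best_mag, best_k = abs(acc), k
    f = best_k / n_fft
    if f > 0.5:                          # wrap to (-0.5, 0.5]
        f -= 1.0
    return math.degrees(math.asin(f / d_over_lambda))

def extend_virtually(ch):
    """Append virtual channels: with a single unit-amplitude source,
    ch[N-1] * ch[m] carries the phase of a (N-1+m)-th element, nearly
    doubling the aperture by simple multiplications."""
    N = len(ch)
    return ch + [ch[N - 1] * ch[m] for m in range(1, N)]

# a 4-element lambda/2 array observing a source at 20 degrees
theta = math.radians(20.0)
snapshot = [cmath.exp(2j * math.pi * 0.5 * math.sin(theta) * n)
            for n in range(4)]
```

    The extended snapshot has 2N-1 effective channels, which narrows the DFT mainlobe and lowers the missed-detection probability without adding physical antennas.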

  11. A multilingual audiometer simulator software for training purposes.

    PubMed

    Kompis, Martin; Steffen, Pascal; Caversaccio, Marco; Brugger, Urs; Oesch, Ivo

    2012-04-01

    A set of algorithms, which allows a computer to determine the answers of simulated patients during pure tone and speech audiometry, is presented. Based on these algorithms, a computer program for training in audiometry was written and found to be useful for teaching purposes. The aim was to develop flexible audiometer simulator software as a teaching and training tool for pure tone and speech audiometry, both with and without masking. First, a set of algorithms which allows a computer to determine the answers of a simulated, hearing-impaired patient was developed. Then, the software was implemented. Extensive use was made of simple, editable text files to define all texts in the user interface and all patient definitions. The software 'audiometer simulator' is available for free download. It can be used to train pure tone audiometry (both with and without masking), speech audiometry, measurement of the uncomfortable level, and simple simulation tests. Due to the use of text files, the user can alter or add patient definitions and all texts and labels shown on the screen. So far, English, French, German, and Portuguese user interfaces are available and the user can choose between German or French speech audiometry.

  12. In-flight simulation investigation of rotorcraft pitch-roll cross coupling

    NASA Technical Reports Server (NTRS)

    Watson, Douglas C.; Hindson, William S.

    1988-01-01

    An in-flight simulation experiment investigating the handling qualities effects of the pitch-roll cross-coupling characteristic of single-main-rotor helicopters is described. The experiment was conducted using the NASA/Army CH-47B variable stability helicopter with an explicit-model-following control system. The research is an extension of an earlier ground-based investigation conducted on the NASA Ames Research Center's Vertical Motion Simulator. The model developed for the experiment is for an unaugmented helicopter with cross-coupling implemented using physical rotor parameters. The details of converting the model from the simulation to use in flight are described. A frequency-domain comparison of the model and actual aircraft responses showing the fidelity of the in-flight simulation is described. The evaluation task was representative of nap-of-the-Earth maneuvering flight. The results indicate that task demands are important in determining allowable levels of coupling. In addition, on-axis damping characteristics influence the frequency-dependent characteristics of coupling and affect the handling qualities. Pilot technique, in terms of learned control crossfeeds, can improve performance and lower workload for particular types of coupling. The results obtained in flight corroborated the simulation results.

  13. BlazeDEM3D-GPU A Large Scale DEM simulation code for GPUs

    NASA Astrophysics Data System (ADS)

    Govender, Nicolin; Wilke, Daniel; Pizette, Patrick; Khinast, Johannes

    2017-06-01

    Accurately predicting the dynamics of particulate materials is of importance to numerous scientific and industrial areas with applications ranging across particle scales from powder flow to ore crushing. Computational discrete element simulations are a viable option to aid in the understanding of particulate dynamics and the design of devices such as mixers, silos and ball mills, as laboratory scale tests come at a significant cost. However, the computational time required to simulate an industrial scale simulation which consists of tens of millions of particles can take months to complete on large CPU clusters, making the Discrete Element Method (DEM) unfeasible for industrial applications. Simulations are therefore typically restricted to tens of thousands of particles with highly detailed particle shapes or a few million particles with often oversimplified particle shapes. However, a number of applications require accurate representation of the particle shape to capture the macroscopic behaviour of the particulate system. In this paper we give an overview of the recent extensions to the open source GPU based DEM code, BlazeDEM3D-GPU, that can simulate millions of polyhedra and tens of millions of spheres on a desktop computer with a single or multiple GPUs.
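
    The core DEM contact loop that GPU codes of this kind parallelize can be sketched serially for spheres with a linear spring-dashpot normal contact model; the stiffness, damping, and time step below are illustrative, and real codes add tangential friction, neighbor lists, and polyhedral contact detection:

```python
import math

K_N, C_N, DT = 1.0e4, 5.0, 1.0e-4    # normal stiffness, damping, time step

def contact_forces(pos, vel, radius):
    """Pairwise linear spring-dashpot normal forces between spheres."""
    n = len(pos)
    forces = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            d = [pos[j][k] - pos[i][k] for k in range(3)]
            dist = math.sqrt(sum(c * c for c in d))
            overlap = radius[i] + radius[j] - dist
            if overlap <= 0.0 or dist == 0.0:
                continue                         # no contact
            normal = [c / dist for c in d]       # unit vector i -> j
            rel_vn = sum((vel[j][k] - vel[i][k]) * normal[k]
                         for k in range(3))
            fn = K_N * overlap - C_N * rel_vn    # repulsion + dissipation
            for k in range(3):
                forces[i][k] -= fn * normal[k]
                forces[j][k] += fn * normal[k]
    return forces

def step(pos, vel, radius, mass):
    """Semi-implicit Euler update (the per-particle loop GPUs parallelize)."""
    f = contact_forces(pos, vel, radius)
    for i in range(len(pos)):
        for k in range(3):
            vel[i][k] += DT * f[i][k] / mass[i]
            pos[i][k] += DT * vel[i][k]
```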

  14. Advanced construction management for lunar base construction - Surface operations planner

    NASA Technical Reports Server (NTRS)

    Kehoe, Robert P.

    1992-01-01

    The study proposes a conceptual solution and lays the framework for developing a new, sophisticated and intelligent tool for a lunar base construction crew to use. This concept integrates expert systems for critical decision making, virtual reality for training, logistics and laydown optimization, automated productivity measurements, and an advanced scheduling tool to form a unique new planning tool. The concept features extensive use of computers and expert systems software to support the actual work, while allowing the crew to control the project from the lunar surface. Consideration is given to a logistics data base, laydown area management, flexible critical progress scheduler, video simulation of assembly tasks, and assembly information and tracking documentation.

  15. 2-D modeling of dual-mode acoustic phonon excitation of a triangular nanoplate

    NASA Astrophysics Data System (ADS)

    Tai, Po-Tse; Yu, Pyng; Tang, Jau

    2010-08-01

    In this theoretical work, we investigated coherent phonon excitation of a triangular nanoplate based on a 2-D Fermi-Pasta-Ulam lattice model. Using the two-temperature model commonly employed to describe laser heating of metals, we considered two kinds of forces related to electronic and lattice stresses. Based on extensive simulation and analysis, we identified two major planar phonon modes, namely, a standing wave mode related to the triangle bisector and another mode corresponding to half of the side length. This work elucidates the roles of laser-induced electronic stress and lattice stress in controlling the initial phase and the amplitude ratio between these two phonon modes.
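
    The Fermi-Pasta-Ulam lattice underlying such models can be illustrated in 1-D (the paper uses a 2-D lattice): particles coupled by anharmonic springs with a quartic term, integrated with the symplectic velocity Verlet scheme. Amplitudes and parameters are illustrative:

```python
import math

def fpu_forces(x, beta):
    """Forces on a fixed-end FPU-beta chain, V(r) = r^2/2 + beta*r^4/4."""
    n = len(x)
    f = [0.0] * n
    for k in range(n + 1):                       # bonds; walls at both ends
        xl = x[k - 1] if k > 0 else 0.0
        xr = x[k] if k < n else 0.0
        g = (xr - xl) + beta * (xr - xl) ** 3    # V'(r) for this bond
        if k > 0:
            f[k - 1] += g
        if k < n:
            f[k] -= g
    return f

def energy(x, v, beta):
    """Kinetic plus bond potential energy of the chain."""
    n = len(x)
    e = sum(0.5 * vi * vi for vi in v)
    for k in range(n + 1):
        xl = x[k - 1] if k > 0 else 0.0
        xr = x[k] if k < n else 0.0
        r = xr - xl
        e += 0.5 * r * r + 0.25 * beta * r ** 4
    return e

def verlet_step(x, v, dt, beta):
    """One velocity-Verlet step (symplectic, good long-time energy behavior)."""
    f = fpu_forces(x, beta)
    v = [vi + 0.5 * dt * fi for vi, fi in zip(v, f)]
    x = [xi + dt * vi for xi, vi in zip(x, v)]
    f = fpu_forces(x, beta)
    v = [vi + 0.5 * dt * fi for vi, fi in zip(v, f)]
    return x, v
```

    Exciting a single spatial mode and watching energy migrate between modes is the classic FPU experiment; the 2-D triangular version in the paper adds geometry-dependent mode shapes.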

  16. A generic model of real-world non-ideal behaviour of FES-induced muscle contractions: simulation tool

    NASA Astrophysics Data System (ADS)

    Lynch, Cheryl L.; Graham, Geoff M.; Popovic, Milos R.

    2011-08-01

    Functional electrical stimulation (FES) applications are frequently evaluated in simulation prior to testing in human subjects. Such simulations are usually based on the typical muscle responses to electrical stimulation, which may result in an overly optimistic assessment of likely real-world performance. We propose a novel method for simulating FES applications that includes non-ideal muscle behaviour during electrical stimulation resulting from muscle fatigue, spasms and tremors. A 'non-idealities' block that can be incorporated into existing FES simulations and provides a realistic estimate of real-world performance is described. An implementation example is included, showing how the non-idealities block can be incorporated into a simulation of electrically stimulated knee extension against gravity for both a proportional-integral-derivative controller and a sliding mode controller. The results presented in this paper illustrate that the real-world performance of a FES system may be vastly different from the performance obtained in simulation using nominal muscle models. We believe that our non-idealities block should be included in future simulations that involve muscle response to FES, as this tool will provide neural engineers with a realistic simulation of the real-world performance of FES systems. This simulation strategy will help engineers and organizations save time and money by preventing premature human testing. The non-idealities block will become available free of charge at www.toronto-fes.ca in late 2011.
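
    A non-idealities block of the kind described can be sketched as a wrapper that distorts a nominal muscle torque with fatigue, tremor, and random spasms before it reaches the simulated limb dynamics. The model forms and constants below are illustrative assumptions, not the authors' published block:

```python
import math, random

class NonIdealities:
    """Distorts a nominal FES-induced torque with fatigue, tremor, and
    random spasms, in the spirit of the paper's non-idealities block.
    Model forms and constants are illustrative assumptions."""

    def __init__(self, rng, fatigue_rate=0.001, recovery_rate=0.0002,
                 tremor_amp=0.05, tremor_hz=8.0,
                 spasm_prob=0.001, spasm_torque=5.0):
        self.rng = rng
        self.fitness = 1.0               # 1.0 = fresh muscle
        self.fatigue_rate = fatigue_rate
        self.recovery_rate = recovery_rate
        self.tremor_amp = tremor_amp
        self.tremor_hz = tremor_hz
        self.spasm_prob = spasm_prob
        self.spasm_torque = spasm_torque

    def apply(self, nominal_torque, stim_level, t, dt):
        # fatigue: fitness decays with stimulation, recovers toward 1 at rest
        self.fitness += dt * (self.recovery_rate * (1.0 - self.fitness)
                              - self.fatigue_rate * stim_level * self.fitness)
        torque = nominal_torque * self.fitness
        # physiological tremor as a small sinusoidal ripple
        torque += (self.tremor_amp * nominal_torque
                   * math.sin(2.0 * math.pi * self.tremor_hz * t))
        if self.rng.random() < self.spasm_prob:
            torque += self.spasm_torque   # brief involuntary contraction
        return torque
```

    A controller tuned against the nominal model alone would see none of these effects; inserting the wrapper between the controller output and the plant gives the more pessimistic, realistic assessment the paper argues for.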

  17. Early Validation of Failure Detection, Isolation, and Recovery Design Using Heterogeneous Modelling and Simulation

    NASA Astrophysics Data System (ADS)

    van der Plas, Peter; Guerriero, Suzanne; Cristiano, Leorato; Rugina, Ana

    2012-08-01

    Modelling and simulation can support a number of use cases across the spacecraft development life-cycle. Given the increasing complexity of space missions, the general trend is towards more extensive use of simulation already in the early phases. A major perceived advantage is that modelling and simulation can enable the validation of critical aspects of the spacecraft design before the actual development is started, thereby reducing the risk in later phases. Failure Detection, Isolation, and Recovery (FDIR) is one of the areas with a high potential to benefit from early modelling and simulation. With the increasing level of required spacecraft autonomy, FDIR specifications can grow in such a way that the traditional document-based review process soon becomes inadequate. This paper shows that FDIR modelling and simulation in a system context can provide a powerful tool to support the FDIR verification process. It is highlighted that FDIR modelling at this early stage requires heterogeneous modelling tools and languages, in order to provide an adequate functional description of the different components (i.e. FDIR functions, environment, equipment, etc.) to be modelled. For this reason, an FDIR simulation framework is proposed in this paper. This framework is based on a number of tools already available in the Avionics Systems Laboratory at ESTEC: the Avionics Test Bench Functional Engineering Simulator (ATB FES), Matlab/Simulink, TASTE, and Real Time Developer Studio (RTDS). The paper then discusses the application of the proposed simulation framework to a real case study, i.e. the FDIR modelling of a satellite in support of an actual ESA mission. Challenges and benefits of the approach are described. Finally, lessons learned and the generality of the proposed approach are discussed.

  18. A method to incorporate the effect of beam quality on image noise in a digitally reconstructed radiograph (DRR) based computer simulation for optimisation of digital radiography

    NASA Astrophysics Data System (ADS)

    Moore, Craig S.; Wood, Tim J.; Saunderson, John R.; Beavis, Andrew W.

    2017-09-01

    The use of computer simulated digital x-radiographs for optimisation purposes has become widespread in recent years. To make these optimisation investigations effective, it is vital that simulated radiographs contain accurate anatomical and system noise. Computer algorithms that simulate radiographs based solely on the incident detector x-ray intensity (‘dose’) have been reported extensively in the literature. However, while it has been established for digital mammography that x-ray beam quality is an important factor when modelling noise in simulated images, there are no such studies for diagnostic imaging of the chest, abdomen and pelvis. This study investigates the influence of beam quality on image noise in a digital radiography (DR) imaging system and incorporates these effects into a digitally reconstructed radiograph (DRR) computer simulator. Image noise was measured on a real DR imaging system as a function of dose (absorbed energy) over a range of clinically relevant beam qualities. Simulated ‘absorbed energy’ and ‘beam quality’ DRRs were then created for each patient and tube voltage under investigation. Simulated noise images, corrected for dose and beam quality, were subsequently produced from the absorbed energy and beam quality DRRs, using the measured noise, absorbed energy and beam quality relationships. The noise images were superimposed onto the noiseless absorbed energy DRRs to create the final images. Signal-to-noise measurements in simulated chest, abdomen and spine images were within 10% of the corresponding measurements in real images. This compares favourably to our previous algorithm, where images corrected for dose only were all within 20%.
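    The authors' measured noise-versus-dose-and-kVp relationships are not given in this record, so the following is only a schematic of the general approach: generate a per-pixel Gaussian noise image whose standard deviation scales with absorbed energy (a quantum-noise sqrt(dose) assumption) and with a hypothetical beam-quality correction factor, then superimpose it on a noiseless DRR. The function names and the linear kVp correction are invented for illustration.

```python
# Schematic of dose- and beam-quality-dependent noise superposition on a DRR.
# Not the authors' calibrated model; the quality factor below is a placeholder.
import numpy as np

def beam_quality_factor(kvp, ref_kvp=80.0):
    """Hypothetical linear noise correction about a reference tube voltage."""
    return 1.0 + 0.004 * (kvp - ref_kvp)

def add_noise(drr, dose_map, kvp, k=1.0, rng=None):
    """Superimpose dose- and beam-quality-dependent Gaussian noise on a noiseless DRR."""
    rng = rng or np.random.default_rng(0)
    sigma = k * np.sqrt(dose_map) * beam_quality_factor(kvp)  # per-pixel noise level
    return drr + sigma * rng.standard_normal(drr.shape)
```

    In a real calibration, `k` and the quality factor would be fitted to noise measurements taken on the physical detector at each clinically relevant beam quality, as the paper describes.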

  19. The Logical Extension

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The same software controlling autonomous and crew-assisted operations for the International Space Station (ISS) is enabling commercial enterprises to integrate and automate manual operations, also known as decision logic, in real time across complex and disparate networked applications, databases, servers, and other devices, all with quantifiable business benefits. Auspice Corporation, of Framingham, Massachusetts, developed the Auspice TLX (The Logical Extension) software platform to effectively mimic the human decision-making process. Auspice TLX automates operations across extended enterprise systems, where any given infrastructure can include thousands of computers, servers, switches, and modems that are connected, and therefore, dependent upon each other. The concept behind the Auspice software grew out of a computer program originally developed in 1981 by Cambridge, Massachusetts-based Draper Laboratory for simulating tasks performed by astronauts aboard the Space Shuttle. At the time, the Space Shuttle Program was dependent upon paper-based procedures for its manned space missions, which typically averaged 2 weeks in duration. As the Shuttle Program progressed, NASA began increasing the length of manned missions in preparation for a more permanent space habitat. Acknowledging the need to relinquish paper-based procedures in favor of an electronic processing format to properly monitor and manage the complexities of these longer missions, NASA realized that Draper's task simulation software could be applied to its vision of year-round space occupancy. In 1992, Draper was awarded a NASA contract to build User Interface Language software to enable autonomous operations of a multitude of functions on Space Station Freedom (the station was redesigned in 1993 and converted into the international venture known today as the ISS).

  20. A simulation study on few parameters of Cherenkov photons in extensive air showers of different primaries incident at various zenith angles over a high altitude observation level

    NASA Astrophysics Data System (ADS)

    Das, G. S.; Hazarika, P.; Goswami, U. D.

    2018-07-01

    We have studied the distribution patterns of lateral density, arrival time and angular position of Cherenkov photons generated in Extensive Air Showers (EASs) initiated by γ-ray, proton and iron primaries incident with various energies and at various zenith angles. This study extends our earlier work [1] to cover the wide energy range of ground-based γ-ray astronomy with a wide range of zenith angles (≤40°) of the primary particles, as well as to the angular distribution patterns of Cherenkov photons in EASs. This type of study is important for distinguishing γ-ray initiated showers from hadronic showers in ground-based γ-ray astronomy, where the Atmospheric Cherenkov Technique (ACT) is used; importantly, such a study also gives insight into the nature of γ-ray and hadronic showers in general. In this work, the CORSIKA 6.990 simulation code is used to generate the EASs. As in Ref. [1], this study revealed that the lateral density and arrival time distributions of Cherenkov photons vary almost in accordance with the functions ρ_ch(r) = ρ_0 e^(−βr) and t_ch(r) = t_0 e^(Γ/r^λ) respectively, with parameter values that depend on the type, energy and zenith angle of the primary particle. The distribution of the Cherenkov photons' angular positions with respect to the shower axis shows distinctive features depending on the primary type, its energy and the zenith angle. As a whole, this distribution pattern for the iron primary is noticeably different from those for the γ-ray and proton primaries. The angular position at which the maximum number of Cherenkov photons is concentrated increases with the energy of a vertically incident primary, but for inclined primaries it remains within a small value (≤1°) for almost all energies and primary types. No significant difference was observed between the results obtained with the high-energy hadronic interaction models QGSJETII and EPOS.
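    The two fitted functional forms quoted in the abstract can be written out directly. The sketch below encodes them as transcribed; the parameter values (ρ_0, β, t_0, Γ, λ) are placeholders, not the fitted values from the paper, which vary with primary type, energy and zenith angle.

```python
# The two parameterisations quoted in the abstract, with placeholder parameters.
import math

def lateral_density(r, rho0=1.0e4, beta=0.01):
    """Cherenkov photon lateral density: rho_ch(r) = rho0 * exp(-beta * r)."""
    return rho0 * math.exp(-beta * r)

def arrival_time(r, t0=5.0, gamma=2.0, lam=0.5):
    """Arrival time parameterisation: t_ch(r) = t0 * exp(gamma / r**lam)."""
    return t0 * math.exp(gamma / r ** lam)
```

    In practice such forms would be fitted to CORSIKA output at each primary type, energy and zenith angle, and the resulting parameter values compared to discriminate γ-ray from hadronic showers.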
