Sample records for existing simulation tools

  1. INDOOR AIR QUALITY AND INHALATION EXPOSURE - SIMULATION TOOL KIT

    EPA Science Inventory

    A Microsoft Windows-based indoor air quality (IAQ) simulation software package is presented. Named Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short, this package complements and supplements existing IAQ simulation programs and is desi...

  2. Status of the AIAA Modeling and Simulation Format Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2008-01-01

    The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.
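    For orientation, the sketch below shows how a DAVE-ML-style XML model definition might be read programmatically. The element and attribute names (variableDef, varID, units) follow published DAVE-ML conventions, but the snippet and reader are illustrative, not a conformant import tool.

```python
# Minimal sketch of reading a DAVE-ML-style aerodynamic model definition.
# Element/attribute names follow DAVE-ML conventions, but this is an
# illustrative reader, not a conformant import tool.
import xml.etree.ElementTree as ET

DAVE_ML_SNIPPET = """
<DAVEfunc>
  <variableDef name="angleOfAttack" varID="alpha" units="deg"/>
  <variableDef name="liftCoefficient" varID="CL" units="nd"/>
</DAVEfunc>
"""

root = ET.fromstring(DAVE_ML_SNIPPET)
for var in root.iter("variableDef"):
    print(var.get("varID"), "->", var.get("name"), f"[{var.get('units')}]")
```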

  3. SIMULATION TOOL KIT FOR INDOOR AIR QUALITY AND INHALATION EXPOSURE (IAQX) VERSION 1.0 USER'S GUIDE

    EPA Science Inventory

    The User's Guide describes a Microsoft Windows-based indoor air quality (IAQ) simulation software package designated Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short. This software complements and supplements existing IAQ simulation programs and...

  4. Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-02-01

    A new test procedure evaluates the quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.
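    As an illustration of the comparison step described above, the sketch below checks a tool's predicted retrofit savings against the spread of reference results; the tool names are from the record, the numbers are hypothetical.

```python
# Illustrative BESTEST-EX-style acceptance check (hypothetical numbers):
# a candidate tool's predicted retrofit savings is compared against the
# min/max range of reference results from state-of-the-art simulators.
reference_savings = {"EnergyPlus": 18.2, "SUNREL": 17.1, "DOE-2.1E": 19.0}  # percent

def within_reference_range(predicted: float, refs: dict[str, float]) -> bool:
    lo, hi = min(refs.values()), max(refs.values())
    return lo <= predicted <= hi

print(within_reference_range(17.8, reference_savings))  # True
print(within_reference_range(22.5, reference_savings))  # False
```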

  5. NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-01-01

    This technical highlight describes NREL research to develop the Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.

  6. Visualization in simulation tools: requirements and a tool specification to support the teaching of dynamic biological processes.

    PubMed

    Jørgensen, Katarina M; Haddow, Pauline C

    2011-08-01

    Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has either little or no experience with such tools. As such, this educational glitch is limiting both the potential use of such tools as well as the potential for tighter cooperation between the designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as an educational tool for noncomputing science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than that which is found in most biological science curricula, thus making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction. However, the discussions are relevant to a broader biological simulation tool set.

  7. Residential Simulation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Starke, Michael R; Abdelaziz, Omar A; Jackson, Rogerick K

    The Residential Simulation Tool was developed to understand the impact of residential load consumption on utilities, including the role of demand response. This is complicated because many different residential loads exist and are used for different purposes. The tool models human behavior and applies it to load utilization, which in turn drives the tool's electrical consumption prediction. The tool integrates a number of databases from the Department of Energy and other government websites to support the load consumption prediction.
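    A toy sketch of the behavior-driven prediction idea described above; the occupancy schedule and appliance ratings are invented for illustration and bear no relation to the tool's databases.

```python
# Toy sketch of behavior-driven load prediction: an hourly occupancy
# probability scales each appliance's rated draw. All numbers hypothetical.
occupancy = [0.9]*7 + [0.3]*10 + [0.8]*7          # 24 hourly probabilities
appliances_kw = {"hvac": 3.0, "water_heater": 1.2, "plug_loads": 0.6}

hourly_kwh = [round(p * sum(appliances_kw.values()), 2) for p in occupancy]
print("daily consumption (kWh):", round(sum(hourly_kwh), 1))
```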

  8. An Exploration of the Effectiveness of an Audit Simulation Tool in a Classroom Setting

    ERIC Educational Resources Information Center

    Zelin, Robert C., II

    2010-01-01

    The purpose of this study was to examine the effectiveness of using an audit simulation product in a classroom setting. Many students and professionals feel that a disconnect exists between learning auditing in the classroom and practicing auditing in the workplace. It was hoped that the introduction of an audit simulation tool would help to…

  9. Modeling the Environmental Impact of Air Traffic Operations

    NASA Technical Reports Server (NTRS)

    Chen, Neil

    2011-01-01

    There is increased interest to understand and mitigate the impacts of air traffic on the climate, since greenhouse gases, nitrogen oxides, and contrails generated by air traffic can have adverse impacts on the climate. The models described in this presentation are useful for quantifying these impacts and for studying alternative environmentally aware operational concepts. These models have been developed by leveraging and building upon existing simulation and optimization techniques developed for the design of efficient traffic flow management strategies. Specific enhancements to the existing simulation and optimization techniques include new models that simulate aircraft fuel flow, emissions and contrails. To ensure that these new models are beneficial to the larger climate research community, the outputs of these new models are compatible with existing global climate modeling tools like the FAA's Aviation Environmental Design Tool.
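    The emission bookkeeping such models perform can be illustrated as below. The CO2 emission index (about 3.16 kg CO2 per kg of jet fuel burned) is standard; the NOx index shown is a placeholder, since NOx varies strongly with engine type and thrust setting.

```python
# Emission-index bookkeeping of the kind such models perform. The CO2 index
# (~3.16 kg CO2 per kg of jet fuel burned) is standard; the NOx index here
# is a placeholder, since it varies strongly with engine and thrust setting.
EI_CO2 = 3.16      # kg CO2 / kg fuel
EI_NOX = 0.014     # kg NOx / kg fuel (illustrative only)

def emissions(fuel_burn_kg: float) -> dict:
    return {"CO2_kg": fuel_burn_kg * EI_CO2, "NOx_kg": fuel_burn_kg * EI_NOX}

print(emissions(5000.0))  # e.g. a 5-tonne fuel burn for one flight segment
```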

  10. Open-source framework for power system transmission and distribution dynamics co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Fan, Rui; Daily, Jeff

    The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools to provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, currently there are only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open-source co-simulation framework, the "Framework for Network Co-Simulation" (FNCS), together with the decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.
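    The decoupled exchange that such middleware coordinates can be sketched as follows. This is a conceptual illustration only, not the FNCS API: each step, the transmission solver publishes a boundary-bus voltage, the distribution solver returns its aggregate load, and both advance in lock step.

```python
# Minimal sketch of the decoupled transmission/distribution exchange that a
# co-simulation middleware coordinates (not the actual FNCS API).
def transmission_step(total_load_mw: float) -> float:
    return 1.0 - 0.0005 * total_load_mw          # toy voltage (p.u.)

def distribution_step(voltage_pu: float) -> float:
    return 50.0 * voltage_pu ** 2                # toy voltage-dependent load (MW)

load = 50.0
for t in range(5):
    volts = transmission_step(load)              # transmission solve
    load = distribution_step(volts)              # distribution solve
    print(f"t={t}s  V={volts:.4f} p.u.  load={load:.2f} MW")
```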

  11. SCOUT: A Fast Monte-Carlo Modeling Tool of Scintillation Camera Output

    PubMed Central

    Hunter, William C. J.; Barrett, Harrison H.; Lewellen, Thomas K.; Miyaoka, Robert S.; Muzi, John P.; Li, Xiaoli; McDougald, Wendy; MacDonald, Lawrence R.

    2011-01-01

    We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:22072297
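    The kind of output statistics SCOUT studies can be illustrated with a toy Monte Carlo: scintillation photons shared among a row of PMTs, followed by the classic Anger-logic centroid estimate. The geometry, light yield, and sharing model below are invented for illustration.

```python
# Illustrative Monte Carlo of camera readout statistics (not SCOUT itself):
# scintillation photons are shared among a 1-D row of PMTs and the event
# position is estimated by the classic Anger-logic centroid.
import random
random.seed(1)

pmt_x = [-30.0, -10.0, 10.0, 30.0]               # PMT centres (mm), hypothetical

def detect(event_x: float, n_photons: int = 2000) -> list:
    counts = [0] * len(pmt_x)
    for _ in range(n_photons):
        # Photon lands on a PMT with probability falling off with distance.
        weights = [1.0 / (1.0 + ((event_x - x) / 15.0) ** 2) for x in pmt_x]
        r = random.uniform(0, sum(weights))
        for i, w in enumerate(weights):
            r -= w
            if r <= 0:
                counts[i] += 1
                break
    return counts

signals = detect(event_x=5.0)
anger_x = sum(s * x for s, x in zip(signals, pmt_x)) / sum(signals)
print(signals, f"estimated x = {anger_x:.1f} mm")
```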

  12. SCOUT: a fast Monte-Carlo modeling tool of scintillation camera output†

    PubMed Central

    Hunter, William C J; Barrett, Harrison H.; Muzi, John P.; McDougald, Wendy; MacDonald, Lawrence R.; Miyaoka, Robert S.; Lewellen, Thomas K.

    2013-01-01

    We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout of a scintillation camera. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:23640136

  13. PT-SAFE: a software tool for development and annunciation of medical audible alarms.

    PubMed

    Bennett, Christopher L; McNeer, Richard R

    2012-03-01

    Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving the audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.
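    The annunciator-to-vital-sign mapping described above can be sketched as a threshold check over a data stream. This illustrates the concept only, not PT-SAFE's MATLAB interface; the alarm limits shown are invented.

```python
# Sketch of the annunciator mapping idea (not PT-SAFE's MATLAB interface):
# vital-sign samples are checked against alarm limits and each violation is
# turned into an annunciation event with a timestamp.
limits = {"SpO2": (90, 100), "HeartRate": (50, 120)}  # illustrative limits

def annunciate(t, sign, value):
    lo, hi = limits[sign]
    if not lo <= value <= hi:
        return f"{t:6.1f}s  ALARM {sign}={value}"
    return None

stream = [(0.0, "SpO2", 97), (2.5, "SpO2", 88), (3.0, "HeartRate", 130)]
for event in filter(None, (annunciate(*s) for s in stream)):
    print(event)
```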

  14. Integrated workflows for spiking neuronal network simulations

    PubMed Central

    Antolík, Ján; Davison, Andrew P.

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. PMID:24368902
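    The declarative, config-driven workflow idea can be sketched as below. The structure is illustrative only and is not Mozaik's actual API; the config keys are invented.

```python
# Sketch of a declarative, config-driven workflow of the kind Mozaik
# automates (illustrative structure, not Mozaik's actual API).
config = {
    "model": {"populations": {"exc": 800, "inh": 200}},
    "experiment": {"stimulus": "drifting_grating", "duration_ms": 1000},
    "recording": {"variables": ["spikes"], "populations": ["exc"]},
}

def run_workflow(cfg: dict) -> dict:
    # Each stage receives the full config, so metadata travels with the data.
    results = {"metadata": cfg, "spikes": []}        # simulation stage (stubbed)
    duration_s = cfg["experiment"]["duration_ms"] / 1000
    results["rate_hz"] = len(results["spikes"]) / duration_s   # analysis stage
    return results

print(run_workflow(config)["metadata"]["experiment"])
```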

  15. Integrated workflows for spiking neuronal network simulations.

    PubMed

    Antolík, Ján; Davison, Andrew P

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages.

  16. Technical Highlight: NREL Improves Building Energy Simulation Programs Through Diagnostic Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polly, B.

    2012-01-09

    This technical highlight describes NREL research to develop the Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market.

  17. An Advanced, Interactive, High-Performance Liquid Chromatography Simulator and Instructor Resources

    ERIC Educational Resources Information Center

    Boswell, Paul G.; Stoll, Dwight R.; Carr, Peter W.; Nagel, Megan L.; Vitha, Mark F.; Mabbott, Gary A.

    2013-01-01

    High-performance liquid chromatography (HPLC) simulation software has long been recognized as an effective educational tool, yet many existing HPLC simulators are too expensive or outdated, or lack important features needed to make them widely useful for educational purposes. Here, a free, open-source HPLC simulator is…

  18. Engineering Effort Needed to Design Spacecraft with Radiation Constraints

    NASA Technical Reports Server (NTRS)

    Singleterry, Robert C., Jr.

    2005-01-01

    A roadmap is articulated that describes what is needed to allow designers, including researchers, managers, and engineers, to investigate, design, build, test, and fly spacecraft that meet mission requirements while remaining as low cost as possible. This roadmap describes seven levels of tool fidelity and application: 1) Mission Speculation, 2) Management Overview, 3) Mission Design, 4) Detailed Design, 5) Simulation and Training, 6) Operations, and 7) Research. The interfaces and outputs are described in top-level detail along with the transport engines needed, and deficiencies are noted. This roadmap, if implemented, will allow Multidisciplinary Optimization (MDO) ideas to incorporate radiation concerns. Also, as NASA moves toward Simulation Based Acquisition (SBA), these tools will facilitate the appropriate spending of government money. Most of the tools needed to serve these levels either do not exist or exist only in pieces that must be integrated.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCaskey, Alexander J.

    There is a lack of state-of-the-art HPC simulation tools for simulating general quantum computing. Furthermore, there are no real software tools that integrate current quantum computers into existing classical HPC workflows. This product, the Quantum Virtual Machine (QVM), solves this problem by providing an extensible framework for pluggable virtual, or physical, quantum processing units (QPUs). It enables the execution of low-level quantum assembly codes and returns the results of such executions.
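    What a virtual QPU backend does conceptually, executing a low-level gate sequence against a simulated state, can be shown with a tiny statevector example. This is a generic illustration, not the QVM's interface.

```python
# Tiny statevector execution of a single low-level gate, illustrating what a
# virtual QPU backend does conceptually (not the QVM product's interface).
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)       # Hadamard gate

def apply_1q(state, gate, qubit, n):
    ops = [gate if q == qubit else np.eye(2) for q in range(n)]
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)                   # lift gate to full register
    return full @ state

state = np.zeros(4); state[0] = 1.0                # |00>
state = apply_1q(state, H, qubit=0, n=2)           # H on qubit 0
print(np.round(state, 3))                          # equal superposition on qubit 0
```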

  20. Using Simulation to Teach About Poverty in Nursing Education: A Review of Available Tools.

    PubMed

    Reid, Carol A; Evanson, Tracy A

    2016-01-01

    Poverty is one of the most significant social determinants of health, and as such, it is imperative that nurses have an understanding of the impact that living in poverty has upon one's life and health. A lack of such understanding will impede nurses from providing care that is patient centered, treats all patients fairly, and advocates for social justice. It is essential that nursing educators ensure that poverty-related content and effective teaching strategies are used in nursing curricula in order to help students develop this understanding. Several poverty-simulation tools are available and may be able to assist with development of accurate knowledge, skills, and attitudes. Unfortunately, little evidence exists to evaluate most poverty simulation tools. This article will provide an introduction to several poverty-related simulation tools, discuss any related research that evaluates their effectiveness, and make recommendations for integration of such simulation tools into nursing curricula. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Response simulation and theoretical calibration of a dual-induction resistivity LWD tool

    NASA Astrophysics Data System (ADS)

    Xu, Wei; Ke, Shi-Zhen; Li, An-Zong; Chen, Peng; Zhu, Jun; Zhang, Wei

    2014-03-01

    In this paper, responses of a new dual-induction resistivity logging-while-drilling (LWD) tool in 3D inhomogeneous formation models are simulated by the vector finite element method (VFEM), the influences of the borehole, invaded zone, surrounding strata, and tool eccentricity are analyzed, and calibration loop parameters and calibration coefficients of the LWD tool are discussed. The results show that the tool has a greater depth of investigation than that of the existing electromagnetic propagation LWD tools and is more sensitive to azimuthal conductivity. Both deep and medium induction responses have linear relationships with the formation conductivity, considering optimal calibration loop parameters and calibration coefficients. Due to the different depths of investigation and resolution, deep induction and medium induction are affected differently by the formation model parameters, thereby having different correction factors. The simulation results can provide theoretical references for the research and interpretation of the dual-induction resistivity LWD tools.
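    The linear response/conductivity relationship noted in the abstract suggests a simple two-coefficient calibration, sketched below with synthetic numbers standing in for the simulated tool responses.

```python
# The abstract notes a linear relation between induction response and
# formation conductivity once calibrated. This sketch fits the two
# calibration coefficients from simulated pairs (numbers hypothetical).
import numpy as np

conductivity = np.array([0.1, 0.5, 1.0, 2.0, 5.0])     # S/m
raw_response = 0.92 * conductivity + 0.03               # synthetic tool output

gain, offset = np.polyfit(conductivity, raw_response, 1)
calibrated = (raw_response - offset) / gain
print(f"gain={gain:.3f} offset={offset:.3f}")
print(np.round(calibrated, 3))                          # recovers conductivity
```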

  2. Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.

    2015-01-01

    The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
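    At its core, the over-plot/difference-plot comparison reduces to differencing trajectories on a shared time grid, as in this sketch with synthetic altitude traces standing in for two tools' outputs.

```python
# Sketch of the difference-plotting idea: two tools' trajectories for the
# same check-case are compared on a common time grid and the worst
# deviation is reported (trajectories here are synthetic).
import numpy as np

t = np.linspace(0.0, 10.0, 101)
alt_tool_a = 1000.0 - 4.905 * t**2                 # ideal ballistic drop
alt_tool_b = alt_tool_a + 0.02 * np.sin(3 * t)     # second tool, small differences

diff = np.abs(alt_tool_a - alt_tool_b)
print(f"max |delta altitude| = {diff.max():.4f} m at t = {t[diff.argmax()]:.1f} s")
```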

  3. Federal Highway Administration (FHWA) work zone driver model software

    DOT National Transportation Integrated Search

    2016-11-01

    FHWA and the U.S. Department of Transportation (USDOT) Volpe Center are developing a work zone car-following model and simulation software that interfaces with existing microsimulation tools, enabling more accurate simulation of car-following through...
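    For context, a single step of the standard Intelligent Driver Model (IDM) is sketched below as an example of what any car-following model computes: an acceleration from own speed, gap, and speed difference. This is not the FHWA work zone model itself; the parameter values are typical textbook choices.

```python
# Not the FHWA model itself: a step of the standard Intelligent Driver Model
# (IDM), shown as an example of what a car-following model computes,
# namely an acceleration from own speed, gap, and speed difference.
import math

def idm_accel(v, gap, dv, v0=33.0, T=1.5, a=1.0, b=2.0, s0=2.0):
    s_star = s0 + v * T + v * dv / (2 * math.sqrt(a * b))  # desired gap
    return a * (1 - (v / v0) ** 4 - (s_star / gap) ** 2)

# Follower at 25 m/s, 40 m behind a leader it is closing on at 3 m/s:
print(f"acceleration = {idm_accel(v=25.0, gap=40.0, dv=3.0):.2f} m/s^2")
```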

  4. Coherent tools for physics-based simulation and characterization of noise in semiconductor devices oriented to nonlinear microwave circuit CAD

    NASA Astrophysics Data System (ADS)

    Riah, Zoheir; Sommet, Raphael; Nallatamby, Jean C.; Prigent, Michel; Obregon, Juan

    2004-05-01

    We present in this paper a set of coherent tools for noise characterization and physics-based analysis of noise in semiconductor devices. This noise toolbox relies on a low-frequency noise measurement setup with special high-current capabilities, thanks to an accurate and original calibration. It also relies on a simulation tool based on the drift-diffusion equations and the linear perturbation theory, associated with the Green's function technique. This physics-based noise simulator has been implemented successfully in the Scilab environment and is specifically dedicated to HBTs. Some results are given and compared to those existing in the literature.

  5. A Standalone Vision Impairments Simulator for Java Swing Applications

    NASA Astrophysics Data System (ADS)

    Oikonomou, Theofanis; Votis, Konstantinos; Korn, Peter; Tzovaras, Dimitrios; Likothanasis, Spriridon

    A lot of work has been done lately in an attempt to assess accessibility. For the case of web rich-client applications, several tools exist that simulate how a vision-impaired or colour-blind person would perceive this content. In this work we propose a simulation tool for non-web Java™ Swing applications. Developers and designers face a real challenge when creating software that has to cope with a lot of interaction situations, as well as specific directives for ensuring accessible interaction. The proposed standalone tool will assist them in exploring user-centered design and important accessibility issues for their Java™ Swing implementations.

  6. Simulation of a Start-Up Manufacturing Facility for Nanopore Arrays

    ERIC Educational Resources Information Center

    Field, Dennis W.

    2009-01-01

    Simulation is a powerful tool in developing and troubleshooting manufacturing processes, particularly when considering process flows for manufacturing systems that do not yet exist. Simulation can bridge the gap in terms of setting up full-scale manufacturing for nanotechnology products if limited production experience is an issue. An effective…

  7. A survey of parallel programming tools

    NASA Technical Reports Server (NTRS)

    Cheng, Doreen Y.

    1991-01-01

    This survey examines 39 parallel programming tools. Focus is placed on those tool capabilities needed for parallel scientific programming rather than for general computer science. The tools are classified with current and future needs of the Numerical Aerodynamic Simulator (NAS) in mind: existing and anticipated NAS supercomputers and workstations; operating systems; programming languages; and applications. They are divided into four categories: suggested acquisitions; tools already brought in; tools worth tracking; and tools eliminated from further consideration at this time.

  8. Evaluation and demonstration of commercialization potential of CCSI tools within gPROMS advanced simulation platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawal, Adekola; Schmal, Pieter; Ramos, Alfredo

    PSE, in the first phase of the CCSI commercialization project, set out to identify market opportunities for the CCSI tools combined with existing gPROMS platform capabilities and develop a clear technical plan for the proposed commercialization activities.

  9. MACHETE: Environment for Space Networking Evaluation

    NASA Technical Reports Server (NTRS)

    Jennings, Esther H.; Segui, John S.; Woo, Simon

    2010-01-01

    Space exploration missions require the design and implementation of space networking that differs from terrestrial networks. In a space networking architecture, interplanetary communication protocols need to be designed, validated and evaluated carefully to support different mission requirements. As actual systems are expensive to build, it is essential to have a low-cost method to validate and verify mission/system designs and operations. This can be accomplished through simulation. Simulation can aid design decisions where alternative solutions are being considered, support trade studies and enable fast study of what-if scenarios. It can be used to identify risks, verify system performance against requirements, and serve as an initial test environment as one moves towards emulation and actual hardware implementation of the systems. We describe the development of the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE), its use cases in supporting architecture trade studies and protocol performance evaluation, and its role in hybrid simulation/emulation. The MACHETE environment contains various tools and interfaces such that users may select the set of tools tailored for the specific simulation end goal. The use cases illustrate tool combinations for simulating space networking in different mission scenarios. This simulation environment is useful in supporting space networking design for planned and future missions as well as evaluating performance of existing networks where non-determinism exists in data traffic and/or link conditions.

  10. ST-analyzer: a web-based user interface for simulation trajectory analysis.

    PubMed

    Jeong, Jong Cheol; Jo, Sunhwan; Wu, Emilia L; Qi, Yifei; Monje-Galvan, Viviana; Yeom, Min Sun; Gorenstein, Lev; Chen, Feng; Klauda, Jeffery B; Im, Wonpil

    2014-05-05

    Molecular dynamics (MD) simulation has become one of the key tools to obtain deeper insights into biological systems using various levels of descriptions such as all-atom, united-atom, and coarse-grained models. Recent advances in computing resources and MD programs have significantly accelerated the simulation time and thus increased the amount of trajectory data. Although many laboratories routinely perform MD simulations, analyzing MD trajectories is still time consuming and often a difficult task. ST-analyzer, http://im.bioinformatics.ku.edu/st-analyzer, is a standalone graphical user interface (GUI) toolset to perform various trajectory analyses. ST-analyzer has several outstanding features compared to other existing analysis tools: (i) handling various formats of trajectory files from MD programs, such as CHARMM, NAMD, GROMACS, and Amber, (ii) intuitive web-based GUI environment--minimizing administrative load and reducing burdens on the user from adapting new software environments, (iii) platform independent design--working with any existing operating system, (iv) easy integration into job queuing systems--providing options of batch processing either on the cluster or in an interactive mode, and (v) providing independence between foreground GUI and background modules--making it easier to add personal modules or to recycle/integrate pre-existing scripts utilizing other analysis tools. The current ST-analyzer contains nine main analysis modules that together contain 18 options, including density profile, lipid deuterium order parameters, surface area per lipid, and membrane hydrophobic thickness. This article introduces ST-analyzer with its design, implementation, and features, and also illustrates practical analysis of lipid bilayer simulations. Copyright © 2014 Wiley Periodicals, Inc.
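    One of the listed analyses, the lipid deuterium order parameter, can be computed directly from bond vectors as S_CD = <(3 cos² θ - 1)/2>, where θ is the angle between a C-H bond and the bilayer normal. The sketch below uses synthetic vectors and is not ST-analyzer code.

```python
# One analysis the toolset lists, computed directly: the deuterium order
# parameter S_CD = <(3 cos^2(theta) - 1) / 2>, where theta is the angle
# between a C-H bond vector and the bilayer normal (z). Vectors synthetic.
import numpy as np
rng = np.random.default_rng(0)

ch_vectors = rng.normal(size=(1000, 3))            # stand-in C-H bond vectors
ch_vectors /= np.linalg.norm(ch_vectors, axis=1, keepdims=True)

cos_theta = ch_vectors[:, 2]                       # z-component = cos(theta)
s_cd = np.mean((3 * cos_theta**2 - 1) / 2)
print(f"S_CD = {s_cd:.3f}")                        # ~0 for isotropic vectors
```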

  11. Data-Driven Modeling and Rendering of Force Responses from Elastic Tool Deformation

    PubMed Central

    Rakhmatov, Ruslan; Ogay, Tatyana; Jeon, Seokhee

    2018-01-01

    This article presents a new data-driven model design for rendering force responses from elastic tool deformation. The new design incorporates a six-dimensional input describing the initial position of the contact, as well as the state of the tool deformation. The input-output relationship of the model was represented by a radial basis functions network, which was optimized based on training data collected from real tool-surface contact. Since the input space of the model is represented in the local coordinate system of a tool, the model is independent of recording and rendering devices and can be easily deployed to an existing simulator. The model also supports complex interactions, such as self and multi-contact collisions. In order to assess the proposed data-driven model, we built a custom data acquisition setup and developed a proof-of-concept rendering simulator. The simulator was evaluated through numerical and psychophysical experiments with four different real tools. The numerical evaluation demonstrated the perceptual soundness of the proposed model, meanwhile the user study revealed the force feedback of the proposed simulator to be realistic. PMID:29342964
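    The core of such a model, radial-basis-function regression with weights solved by least squares, can be sketched as below; a 1-D input stands in for the paper's six-dimensional contact state, and the kernel width is an arbitrary choice.

```python
# Minimal radial-basis-function regression of the kind the model uses:
# Gaussian kernels on training inputs, weights solved by least squares.
# The 1-D input/output here stands in for the paper's 6-D contact state.
import numpy as np

x_train = np.linspace(0, 1, 20)
f_train = np.sin(2 * np.pi * x_train)              # stand-in force response

def phi(x, centers, eps=5.0):
    return np.exp(-(eps * (x[:, None] - centers[None, :])) ** 2)

weights, *_ = np.linalg.lstsq(phi(x_train, x_train), f_train, rcond=None)

x_query = np.array([0.13, 0.62])
print(np.round(phi(x_query, x_train) @ weights, 3))  # rendered force estimates
```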

  12. SolarTherm: A flexible Modelica-based simulator for CSP systems

    NASA Astrophysics Data System (ADS)

    Scott, Paul; Alonso, Alberto de la Calle; Hinkley, James T.; Pye, John

    2017-06-01

    Annual performance simulations provide a valuable tool for analysing the viability and overall impact of different concentrating solar power (CSP) component and system designs. However, existing tools work best with conventional systems and are difficult or impossible to adapt when novel components, configurations and operating strategies are of interest. SolarTherm is a new open source simulation tool that fulfils this need for the solar community. It includes a simulation framework and a library of flexible CSP components and control strategies that can be adapted or replaced with new designs to meet the special needs of end users. This paper provides an introduction to SolarTherm and a comparison of models for an energy-based trough system and a physical tower system to those in the well-established and widely-used simulator SAM. Differences were found in some components where the inner workings of SAM are undocumented or not well understood, while the other parts show strong agreement. These results help to validate the fundamentals of SolarTherm and demonstrate that, while at an early stage of development, it is already a useful tool for performing annual simulations.

  13. Development of modelling method selection tool for health services management: from problem structuring methods to modelling and simulation methods.

    PubMed

    Jun, Gyuchan T; Morris, Zoe; Eldabi, Tillal; Harper, Paul; Naseer, Aisha; Patel, Brijesh; Clarkson, John P

    2011-05-19

    There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, the current use is limited and answers to questions such as what methods to use and when remain somewhat underdeveloped. The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them or better commission relevant modelling and simulation work. This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and four input resources required (time, money, knowledge and data). The characterisation is presented in matrix forms to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills are scarce, let alone money and time. A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision making processes. In particular it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to be obtained from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on methods comparison and selection.
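    The matrix-based screening idea can be sketched as a tagged dictionary plus a filter. The method characterisations below are invented for illustration and are not the paper's actual matrix.

```python
# Sketch of matrix-style method screening (characterisations hypothetical):
# each method is tagged with a life-cycle stage and a data demand, and a
# query filters to methods that fit the project's constraints.
methods = {
    "discrete-event simulation": {"stage": "operations", "data_need": "high"},
    "system dynamics":           {"stage": "strategy",   "data_need": "medium"},
    "soft systems methodology":  {"stage": "strategy",   "data_need": "low"},
}

def shortlist(stage, max_data):
    rank = {"low": 0, "medium": 1, "high": 2}
    return [m for m, c in methods.items()
            if c["stage"] == stage and rank[c["data_need"]] <= rank[max_data]]

print(shortlist(stage="strategy", max_data="medium"))
```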

  14. Using Business Simulations as Authentic Assessment Tools

    ERIC Educational Resources Information Center

    Neely, Pat; Tucker, Jan

    2012-01-01

    New modalities for assessing student learning exist as a result of advances in computer technology. Conventional measurement practices have been transformed into computer based testing. Although current testing replicates assessment processes used in college classrooms, a greater opportunity exists to use computer technology to create authentic…

  15. Improving material removal determinacy based on the compensation of tool influence function

    NASA Astrophysics Data System (ADS)

    Zhong, Bo; Chen, Xian-hua; Deng, Wen-hui; Zhao, Shi-jie; Zheng, Nan

    2018-03-01

    In the process of computer-controlled optical surfacing (CCOS), the key to correcting the surface error of optical components is to ensure consistency between the simulated tool influence function and the actual tool influence function (TIF). The existing removal model usually adopts the fixed-point TIF to remove material along the planned path and velocity, treating the polishing process as linear and time-invariant. However, in the actual polishing process, the TIF is a function of the feed speed. In this paper, the relationship between the actual TIF and the feed speed (i.e. the compensation relationship between static removal and dynamic removal) is determined experimentally. The existing removal model is then modified based on this compensation relationship, to improve the agreement between simulated and actual processing. Finally, surface error correction tests are carried out. The results show that the fitting degree of the simulated surface to the experimental surface is better than 88%, and the surface correction accuracy can be better than λ/10 (λ = 632.8 nm).
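    The compensation idea can be sketched with the simplest possible assumption, dwell time inversely proportional to feed speed. The paper determines the actual TIF-vs-feed-speed relation experimentally; the scaling below is purely illustrative.

```python
# Sketch of the compensation idea: the paper determines the TIF-vs-feed-speed
# relation experimentally; here we assume the simplest dwell-time scaling,
# removal ~ static TIF x (v_ref / v), purely for illustration.
import numpy as np

static_tif = np.array([0.2, 0.6, 1.0, 0.6, 0.2])   # removal profile (um/min)

def dynamic_tif(v_mm_s: float, v_ref: float = 10.0) -> np.ndarray:
    return static_tif * (v_ref / v_mm_s)

print(np.round(dynamic_tif(5.0), 2))    # slower feed -> deeper removal
print(np.round(dynamic_tif(20.0), 2))   # faster feed -> shallower removal
```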

  16. Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design

    NASA Technical Reports Server (NTRS)

    Schutte, Paul C.; Trujillo, Anna; Pritchett, Amy R.

    2000-01-01

    While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plug-in' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
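    The plug-in pattern the report describes, components registered against a common interface with core-managed communication, can be sketched as below. This is a minimal illustration in Python, not the actual RFS architecture or its HLA networking; the component names are invented.

```python
# Minimal plug-in registry and stepping loop illustrating the pattern the
# report describes (component registration, shared interface, core-managed
# communication); not the actual RFS architecture or HLA networking.
class Component:
    def step(self, dt: float, bus: dict) -> None:
        raise NotImplementedError

class Dynamics(Component):
    def __init__(self):
        self.alt = 0.0
    def step(self, dt, bus):
        self.alt += bus.get("climb_rate", 0.0) * dt
        bus["altitude"] = self.alt

class Autopilot(Component):
    def step(self, dt, bus):
        bus["climb_rate"] = 5.0 if bus.get("altitude", 0.0) < 100.0 else 0.0

components, bus = [Autopilot(), Dynamics()], {}
for _ in range(3):
    for c in components:                 # core dispatches each registered plug-in
        c.step(dt=1.0, bus=bus)
print(bus)
```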

  17. Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design

    NASA Technical Reports Server (NTRS)

    Pritchett, Amy R.

    2002-01-01

    While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plugin' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).

  18. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed-wing and rotary-wing aircraft, a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that differ from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the National laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well-researched, open-source, high-fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper presents the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available are surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools are also presented.

  19. Modeling of Tool-Tissue Interactions for Computer-Based Surgical Simulation: A Literature Review

    PubMed Central

    Misra, Sarthak; Ramesh, K. T.; Okamura, Allison M.

    2009-01-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators. Researchers have attempted to model tool-tissue interactions in a wide variety of ways, which can be broadly classified as (1) linear elasticity-based methods, (2) nonlinear (hyperelastic) elasticity-based finite element (FE) methods, and (3) other techniques not based on FE methods or continuum mechanics. Realistic modeling of organ deformation requires populating the model with real tissue data (which are difficult to acquire in vivo) and simulating organ response in real time (which is computationally expensive). Further, it is challenging to account for connective tissue supporting the organ, friction, and topological changes resulting from tool-tissue interactions during invasive surgical procedures. Overcoming such obstacles will not only help us to model tool-tissue interactions in real time, but also enable realistic force feedback to the user during surgical simulation. This review paper classifies the existing research on tool-tissue interactions for surgical simulators specifically based on the modeling techniques employed and the kind of surgical operation being simulated, in order to inform and motivate future research on improved tool-tissue interaction models. PMID:20119508

  20. TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenwood, Michael S; Cetiner, Mustafa S; Fugate, David L

    Existing development tools for early-stage design and scoping of energy systems are often time consuming to use, proprietary, and do not contain the necessary functions to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The TRANSFORM tool, based on the Modelica programming language, (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  1. NASA Simulation Capabilities

    NASA Technical Reports Server (NTRS)

    Grabbe, Shon R.

    2017-01-01

    This presentation provides a high-level overview of NASA's Future ATM Concepts Evaluation Tool (FACET) with a high-level description of the system's inputs and outputs. This presentation is designed to support the joint simulations that NASA and the Chinese Aeronautical Establishment (CAE) will conduct under an existing Memorandum of Understanding.

  2. On extending parallelism to serial simulators

    NASA Technical Reports Server (NTRS)

    Nicol, David; Heidelberger, Philip

    1994-01-01

    This paper describes an approach to discrete event simulation modeling that appears to be effective for developing portable and efficient parallel execution of models of large distributed systems and communication networks. In this approach, the modeler develops submodels using an existing sequential simulation modeling tool, using the full expressive power of the tool. A set of modeling language extensions permit automatically synchronized communication between submodels; however, the automation requires that any such communication must take a nonzero amount of simulation time. Within this modeling paradigm, a variety of conservative synchronization protocols can transparently support conservative execution of submodels on potentially different processors. A specific implementation of this approach, U.P.S. (Utilitarian Parallel Simulator), is described, along with performance results on the Intel Paragon.
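    The conservative rule that the nonzero communication delay enables can be stated in a few lines: a submodel may safely advance to the minimum of its neighbours' guaranteed times plus that delay (the lookahead), without risk of receiving a message from its past. The sketch below shows only this rule, not the U.P.S. implementation.

```python
# Sketch of the conservative rule the paper relies on: because inter-submodel
# messages take nonzero simulation time (the "lookahead"), each submodel may
# safely advance to the minimum of its neighbours' guaranteed times plus
# that lookahead.
def safe_advance_time(neighbor_times: dict, lookahead: float) -> float:
    return min(neighbor_times.values()) + lookahead

guarantees = {"subnetA": 12.0, "subnetB": 15.5}
print(safe_advance_time(guarantees, lookahead=0.5))   # -> 12.5
```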

  3. A virtual therapeutic environment with user projective agents.

    PubMed

    Ookita, S Y; Tokuda, H

    2001-02-01

    Today, we see the Internet as more than just an information infrastructure; it is also a socializing place and a safe outlet for inner feelings. Many personalities develop apart from real-world life due to its anonymous environment. Virtual-world interactions are bringing about new psychological illnesses ranging from net addiction to technostress, as well as online personality disorders and conflicts among the multiple identities that exist in the virtual world. Presently, there are no standard therapy models for the virtual environment, and there are very few therapeutic environments or tools made especially for virtual therapeutic environments. The goal of our research is to provide a therapy model and middleware tools for psychologists to use in virtual therapeutic environments. We propose the Cyber Therapy Model and Projective Agents, a tool used in the therapeutic environment. To evaluate the effectiveness of the tool, we created a prototype system, called the Virtual Group Counseling System, a therapeutic environment that allows the user to participate in group counseling through the eyes of their Projective Agent. Projective Agents inherit the user's personality traits. During virtual group counseling, the user's Projective Agent interacts and collaborates to support recovery and psychological growth. The prototype system provides a simulation environment where psychologists can adjust parameters and customize their own simulation environment. The model and tool are a first attempt toward simulating online personalities that may exist only online, and they provide data for observation.

  4. Microscopic transport model animation visualisation on KML base

    NASA Astrophysics Data System (ADS)

    Yatskiv, I.; Savrasovs, M.

    2012-10-01

    Classical literature on simulation theory identifies animation as one of simulation's greatest strengths: the ability to present the processes inside a system visually. Animation gives a simulation model additional value when results are presented to the public and to authorities who are not familiar with simulation. That is why most universal and specialised simulation tools can construct 2D and 3D representations of a model. However, developing such a representation can take much time, and considerable effort must be invested to create an adequate 3D representation of the model. Well-known microscopic traffic flow simulation tools such as VISSIM, AIMSUN and PARAMICS have long been able to produce 2D and 3D animation, but creating a realistic 3D model of the location where traffic flows are simulated is hard and time-consuming even in these professional tools. The goal of this paper is to describe concepts for using existing on-line geographical information systems to visualise animation produced by simulation software. For demonstration purposes the following technologies and tools have been used: PTV VISION VISSIM, KML and Google Earth.
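    The KML output such a visualisation pipeline produces can be as simple as a Placemark containing a LineString of simulated positions, as in this sketch; the coordinates are synthetic lon, lat, alt triples.

```python
# Minimal KML generation of the kind the paper describes: simulated vehicle
# positions written as a Placemark LineString that Google Earth can display.
# Coordinates are synthetic (lon, lat, alt triples).
track = [(24.105, 56.946, 10.0), (24.106, 56.947, 10.0), (24.108, 56.948, 10.0)]

coords = " ".join(f"{lon},{lat},{alt}" for lon, lat, alt in track)
kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>simulated vehicle</name>
    <LineString><coordinates>{coords}</coordinates></LineString>
  </Placemark>
</kml>"""

with open("track.kml", "w") as f:
    f.write(kml)
```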

  5. IGMS: An Integrated ISO-to-Appliance Scale Grid Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan; Hale, Elaine; Hansen, Timothy M.

    This paper describes the Integrated Grid Modeling System (IGMS), a novel electric power system modeling platform for integrated transmission-distribution analysis that co-simulates off-the-shelf tools on high performance computing (HPC) platforms to offer unprecedented resolution from ISO markets down to appliances and other end uses. Specifically, the system simultaneously models hundreds or thousands of distribution systems in co-simulation with detailed Independent System Operator (ISO) markets and AGC-level reserve deployment. IGMS uses a new MPI-based hierarchical co-simulation framework to connect existing sub-domain models. Our initial efforts integrate open-source tools for wholesale markets (FESTIV), bulk AC power flow (MATPOWER), and full-featured distribution systems including physics-based end-use and distributed generation models (many instances of GridLAB-D™). The modular IGMS framework enables tool substitution and additions for multi-domain analyses. This paper describes the IGMS tool, characterizes its performance, and demonstrates the impacts of the coupled simulations for analyzing high-penetration solar PV and price-responsive load scenarios.

  6. LibKiSAO: a Java library for Querying KiSAO.

    PubMed

    Zhukova, Anna; Adams, Richard; Laibe, Camille; Le Novère, Nicolas

    2012-09-24

    The Kinetic Simulation Algorithm Ontology (KiSAO) supplies information about existing algorithms available for the simulation of Systems Biology models, their characteristics, parameters and inter-relationships. KiSAO enables the unambiguous identification of algorithms from simulation descriptions. Information about analogous methods having similar characteristics and about algorithm parameters incorporated into KiSAO is desirable for simulation tools. To retrieve this information programmatically an application programming interface (API) for KiSAO is needed. We developed libKiSAO, a Java library to enable querying of the KiSA Ontology. It implements methods to retrieve information about simulation algorithms stored in KiSAO, their characteristics and parameters, and methods to query the algorithm hierarchy and search for similar algorithms providing comparable results for the same simulation set-up. Using libKiSAO, simulation tools can make logical inferences based on this knowledge and choose the most appropriate algorithm to perform a simulation. LibKiSAO also enables simulation tools to handle a wider range of simulation descriptions by determining which of the available methods are similar and can be used instead of the one indicated in the simulation description if that one is not implemented. LibKiSAO enables Java applications to easily access information about simulation algorithms, their characteristics and parameters stored in the OWL-encoded Kinetic Simulation Algorithm Ontology. LibKiSAO can be used by simulation description editors and simulation tools to improve reproducibility of computational simulation tasks and facilitate model re-use.

  7. A Future Mars Environment for Science and Exploration

    NASA Astrophysics Data System (ADS)

    Green, J. L.; Hollingsworth, J.; Brain, D.; Airapetian, V.; Pulkkinen, A.; Dong, C.; Bamford, R.

    2017-02-01

    Investigation of a greatly enhanced Mars atmosphere, with higher pressure and temperature, can be accomplished using existing simulation tools. Simulation results will be reviewed, along with a projection of how long it may take for Mars to become an exciting new planet to study and to live on.

  8. SpineCreator: a Graphical User Interface for the Creation of Layered Neural Models.

    PubMed

    Cope, A J; Richmond, P; James, S S; Gurney, K; Allerton, D J

    2017-01-01

    There is a growing requirement in computational neuroscience for tools that permit collaborative model building, model sharing, and the combination of existing models into larger systems (multi-scale model integration), and that can simulate models using a variety of simulation engines and hardware platforms. Layered XML model specification formats solve many of these problems; however, they are difficult to write and visualise without tools. Here we describe a new graphical software tool, SpineCreator, which facilitates the creation and visualisation of layered models of point spiking neurons or rate-coded neurons without the need for programming. We demonstrate the tool through the reproduction and visualisation of published models and show simulation results using code generation interfaced directly into SpineCreator. As a unique application for the graphical creation of neural networks, SpineCreator represents an important step forward for neuronal modelling.

  9. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post-processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools for space physics post-processing. Building on the work of the Center for Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The plugin currently provides readers for LFM and Enlil data sets, along with automated tools for data comparison with NASA's CDAWeb database. As work progresses, many additional tools will be added; through open-source collaboration, we hope to add readers for additional model types, as well as any further tools deemed necessary by the scientific community. The ultimate goal of this work is to provide a complete Sun-to-Earth model analysis toolset.
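
    GHOST ships as ParaView plugins, so only the generic ParaView-Python pattern it builds on is sketched here; the file name is a placeholder, and the LFM/Enlil readers come from the plugin rather than stock ParaView.

        # Generic ParaView scripting pattern (load, filter, render).
        from paraview.simple import OpenDataFile, Slice, Show, Render

        src = OpenDataFile("example_enlil_output.nc")  # placeholder path
        cut = Slice(Input=src)
        cut.SliceType.Normal = [0, 0, 1]  # equatorial-plane cut, illustrative
        Show(cut)
        Render()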

  10. Toward Interactive Scenario Analysis and Exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gayle, Thomas R.; Summers, Kenneth Lee; Jungels, John

    2015-01-01

    As Modeling and Simulation (M&S) tools have matured, their applicability and importance have increased across many national security challenges. In particular, they provide a way to test how something may behave without the need to do real world testing. However, current and future changes across several factors including capabilities, policy, and funding are driving a need for rapid response or evaluation in ways that many M&S tools cannot address. Issues around large data, computational requirements, delivery mechanisms, and analyst involvement already exist and pose significant challenges. Furthermore, rising expectations, rising input complexity, and increasing depth of analysis will only increase the difficulty of these challenges. In this study we examine whether innovations in M&S software coupled with advances in "cloud" computing and "big-data" methodologies can overcome many of these challenges. In particular, we propose a simple, horizontally-scalable distributed computing environment that could provide the foundation (i.e., "cloud") for next-generation M&S-based applications based on the notion of "parallel multi-simulation". In our context, the goal of parallel multi-simulation is to consider as many simultaneous paths of execution as possible. Therefore, with sufficient resources, the complexity is dominated by the cost of single scenario runs as opposed to the number of runs required. We show the feasibility of this architecture through a stable prototype implementation coupled with the Umbra Simulation Framework [6]. Finally, we highlight the utility through multiple novel analysis tools and by showing the performance improvement compared to existing tools.

  11. Experience with case tools in the design of process-oriented software

    NASA Astrophysics Data System (ADS)

    Novakov, Ognian; Sicard, Claude-Henri

    1994-12-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime that may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, the lack of real-time simulation tools, and the absence of Object-Oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.

  12. APRON: A Cellular Processor Array Simulation and Hardware Design Tool

    NASA Astrophysics Data System (ADS)

    Barr, David R. W.; Dudek, Piotr

    2009-12-01

    We present a software environment for the efficient simulation of cellular processor arrays (CPAs). This software (APRON) is used to explore algorithms that are designed for massively parallel fine-grained processor arrays, topographic multilayer neural networks, vision chips with SIMD processor arrays, and related architectures. The software uses a highly optimised core combined with a flexible compiler to provide the user with tools for the design of new processor array hardware architectures and the emulation of existing devices. We present performance benchmarks for the software processor array implemented on standard commodity microprocessors. APRON can be configured to use additional processing hardware if necessary and can be used as a complete graphical user interface and development environment for new or existing CPA systems, allowing more users to develop algorithms for CPA systems.

  13. Relaxation estimation of RMSD in molecular dynamics immunosimulations.

    PubMed

    Schreiner, Wolfgang; Karch, Rudolf; Knapp, Bernhard; Ilieva, Nevena

    2012-01-01

    Molecular dynamics simulations have to be sufficiently long to draw reliable conclusions. However, no method exists to prove that a simulation has converged. We suggest the method of "lagged RMSD-analysis" as a tool to judge if an MD simulation has not yet run long enough. The analysis is based on RMSD values between pairs of configurations separated by variable time intervals Δt. Unless RMSD(Δt) has reached a stationary shape, the simulation has not yet converged.
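
    The analysis reduces to a short computation; a minimal NumPy sketch (assuming the trajectory is already aligned, with shape n_frames x n_atoms x 3) might look like this:

        import numpy as np

        def rmsd(a, b):
            # RMSD between two aligned configurations.
            return np.sqrt(np.mean(np.sum((a - b) ** 2, axis=-1)))

        def lagged_rmsd(traj, lag):
            # Mean RMSD over all frame pairs separated by `lag` steps.
            return np.mean([rmsd(traj[i], traj[i + lag])
                            for i in range(len(traj) - lag)])

        # If lagged_rmsd(traj, lag) keeps growing with lag instead of
        # flattening out, the simulation has not yet converged.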

  14. Extending BPM Environments of Your Choice with Performance Related Decision Support

    NASA Astrophysics Data System (ADS)

    Fritzsche, Mathias; Picht, Michael; Gilani, Wasif; Spence, Ivor; Brown, John; Kilpatrick, Peter

    What-if simulations have been identified as one solution for business performance related decision support. Such support is especially useful when it can be generated automatically out of Business Process Management (BPM) environments from the existing business process models and the performance parameters monitored from the executed business process instances. Currently, some of the available BPM environments offer basic performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance related decision support at the business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations, or a combination of such solutions into already existing BPM environments. The approach abstracts from process modelling techniques, which enables automatic decision support spanning processes across numerous BPM environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.

  15. The Many Faces of Patient-Centered Simulation: Implications for Researchers.

    PubMed

    Arnold, Jennifer L; McKenzie, Frederic Rick D; Miller, Jane Lindsay; Mancini, Mary E

    2018-06-01

    Patient-centered simulation for nonhealthcare providers is an emerging and innovative application for healthcare simulation. Currently, no consensus exists on what patient-centered simulation encompasses, and outcomes research in this area is limited. Conceptually, patient-centered simulation aligns with the principles of patient- and family-centered care, bringing this educational tool directly to patients and caregivers with the potential to improve patient care and outcomes. This descriptive article is a summary of findings presented at the 2nd International Meeting for Simulation in Healthcare Research Summit. Experts in the field delineated a categorization for better describing patient-centered simulation and reviewed the literature to identify a research agenda. Three types of patient-centered simulation (patient-directed, patient-driven, and patient-specific) are presented, with research priorities identified for each. Patient-centered simulation has been shown to be an effective educational tool and has the potential to directly improve patient care outcomes. Presenting a typology for patient-centered simulation provides direction for future research.

  16. ESAS Deliverable PS 1.1.2.3: Customer Survey on Code Generations in Safety-Critical Applications

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Denney, Ewen

    2006-01-01

    Automated code generators (ACG) are tools that convert a (higher-level) model of a software (sub-)system into executable code without the necessity for a developer to actually implement the code. Although both commercially supported and in-house tools have been used in many industrial applications, little data exists on how these tools are used in safety-critical domains (e.g., spacecraft, aircraft, automotive, nuclear). The aims of the survey, therefore, were threefold: 1) to determine if code generation is primarily used as a tool for prototyping, including design exploration and simulation, or for flight/production code; 2) to determine the verification issues with code generators relating, in particular, to qualification and certification in safety-critical domains; and 3) to determine perceived gaps in functionality of existing tools.

  17. Open source Modeling and optimization tools for Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peles, S.

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  18. Practises to identify and prevent adverse aircraft-and-rotorcraft-pilot couplings - A ground simulator perspective

    NASA Astrophysics Data System (ADS)

    Pavel, Marilena D.; Jump, Michael; Masarati, Pierangelo; Zaichik, Larisa; Dang-Vu, Binh; Smaili, Hafid; Quaranta, Giuseppe; Stroosma, Olaf; Yilmaz, Deniz; Johnes, Michael; Gennaretti, Massimmo; Ionita, Achim

    2015-08-01

    The aviation community relies heavily on flight simulators as a fundamental tool for research, pilot training and the development of any new aircraft design. The goal of the present paper is to review how effective ground simulation is as an assessment tool for unmasking adverse aircraft- and rotorcraft-pilot couplings (A/RPC). Although it is generally believed that simulators are not reliable in revealing the existence of A/RPC tendencies, the paper demonstrates that a proper selection of high-gain tasks combined with appropriate motion and visual cueing can reveal negative features of a particular aircraft that may lead to A/RPC. The paper discusses new methods for real-time A/RPC detection that can be used as a tool for unmasking adverse A/RPC. Although flight simulators will not achieve the level of reality of in-flight testing, the simulator may be the only convenient and safe place to expose A/RPC tendencies and to evaluate the wide range of conditions that could produce hazardous A/RPC events.

  19. Remote Numerical Simulations of the Interaction of High Velocity Clouds with Random Magnetic Fields

    NASA Astrophysics Data System (ADS)

    Santillan, Alfredo; Hernandez-Cervantes, Liliana; Gonzalez-Ponce, Alejandro; Kim, Jongsoo

    Numerical simulations of the interaction of High Velocity Clouds (HVCs) with the magnetized Galactic interstellar medium (ISM) are a powerful tool to describe the evolution of these objects in our Galaxy. In this work we present a new project, referred to as Theoretical Virtual Observatories, oriented toward performing numerical simulations in real time through a Web page. This is a powerful astrophysical computational tool consisting of an intuitive graphical user interface (GUI) and a database produced by numerical calculations. On this website the user can make use of existing numerical simulations from the database or run a new simulation by introducing initial conditions such as temperatures, densities, velocities, and magnetic field intensities for both the ISM and the HVC. The prototype is programmed using Linux, Apache, MySQL, and PHP (LAMP), based on the open source philosophy. All simulations were performed with the MHD code ZEUS-3D, which solves the ideal MHD equations by finite differences on a fixed Eulerian mesh. Finally, we present typical results that can be obtained with this tool.

  20. Myokit: A simple interface to cardiac cellular electrophysiology.

    PubMed

    Clerx, Michael; Collins, Pieter; de Lange, Enno; Volders, Paul G A

    2016-01-01

    Myokit is a new powerful and versatile software tool for modeling and simulation of cardiac cellular electrophysiology. Myokit consists of an easy-to-read modeling language, a graphical user interface, single and multi-cell simulation engines and a library of advanced analysis tools accessible through a Python interface. Models can be loaded from Myokit's native file format or imported from CellML. Model export is provided to C, MATLAB, CellML, CUDA and OpenCL. Patch-clamp data can be imported and used to estimate model parameters. In this paper, we review existing tools to simulate the cardiac cellular action potential to find that current tools do not cater specifically to model development and that there is a gap between easy-to-use but limited software and powerful tools that require strong programming skills from their users. We then describe Myokit's capabilities, focusing on its model description language, simulation engines and import/export facilities in detail. Using three examples, we show how Myokit can be used for clinically relevant investigations, multi-model testing and parameter estimation in Markov models, all with minimal programming effort from the user. This way, Myokit bridges a gap between performance, versatility and user-friendliness.
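
    A typical Myokit session is only a few lines of Python; in outline (the file name and the logged variable names are illustrative and depend on the model actually loaded):

        import myokit

        # myokit.load() returns the model, pacing protocol, and embedded script.
        model, protocol, script = myokit.load("example-model.mmt")  # placeholder
        sim = myokit.Simulation(model, protocol)
        log = sim.run(1000)  # simulate 1000 ms, returns a DataLog

        # Variable names such as 'membrane.V' are model-specific.
        print(log["engine.time"][:5], log["membrane.V"][:5])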

  1. Objective assessment of technique in laparoscopic colorectal surgery: what are the existing tools?

    PubMed

    Foster, J D; Francis, N K

    2015-01-01

    Assessment can improve the effectiveness of surgical training and enable valid judgments of competence. Laparoscopic colon resection surgery is now taught within surgical residency programs, and assessment tools are increasingly used to stimulate formative feedback and enhance learning. Formal assessment of technical performance in laparoscopic colon resection has been successfully applied at the specialist level in the English "LAPCO" National Training Program. Objective assessment tools need to be developed for training and assessment in laparoscopic rectal cancer resection surgery. Simulation may have a future role in assessment and accreditation in laparoscopic colorectal surgery; however, existing virtual reality models are not ready to be used for assessment of this advanced surgery.

  2. Best opening face system for sweepy, eccentric logs: a user’s guide

    Treesearch

    David W. Lewis

    1985-01-01

    Log breakdown simulation models have gained rapid acceptance within the sawmill industry in the last 15 years. Although they have many advantages over traditional decision making tools, the existing models do not calculate yield correctly when used to simulate the breakdown of eccentric, sweepy logs in North American sawmills producing softwood dimension lumber. In an...

  3. Chapter 8 - Mapping existing vegetation composition and structure for the LANDFIRE Prototype Project

    Treesearch

    Zhiliang Zhu; James Vogelmann; Donald Ohlen; Jay Kost; Xuexia Chen; Brian Tolk

    2006-01-01

    The Landscape Fire and Resource Management Planning Tools Prototype Project, or LANDFIRE Prototype Project, required the mapping of existing vegetation composition (cover type) and structural stages at a 30-m spatial resolution to provide baseline vegetation data for the development of wildland fuel maps and for comparison to simulated historical vegetation reference...

  4. Pika: A snow science simulation tool built using the open-source framework MOOSE

    NASA Astrophysics Data System (ADS)

    Slaughter, A.; Johnson, M.

    2017-12-01

    The Department of Energy (DOE) is currently investing millions of dollars annually in various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase-field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture and crack propagation (via the extended finite-element method), flow in porous media, and others. The heat conduction, tensor mechanics, and phase-field modules, in particular, are well-suited for snow science problems. Pika, an open-source MOOSE-based application, is capable of simulating both coupled nonlinear 3D continuum applications (heat transfer and large-deformation mechanics, such as settlement) and phase-field-based microstructure applications. Additionally, these types of problems may be coupled tightly in a single solve or across length and time scales using a loosely coupled Picard iteration approach. In addition to the wide range of physics capabilities, MOOSE-based applications also inherit an extensible testing framework, graphical user interface, and documentation system; these tools allow MOOSE and other applications to adhere to nuclear software quality standards. The snow science community can learn from the nuclear industry and harness the existing effort to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The snow science community should build on existing tools to enable collaboration between researchers and practitioners throughout the world, and advance the state of the art in line with other scientific research efforts.
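
    The loosely coupled Picard approach mentioned above can be summarized generically; the sketch below is not Pika's actual MOOSE input, just the iteration pattern with toy stand-in solvers.

        # Two stand-in single-physics solves exchange fields until the
        # coupled state stops changing.
        def solve_heat(mech):
            return 0.5 * mech + 1.0      # toy heat-conduction operator

        def solve_mechanics(temp):
            return 0.25 * temp           # toy settlement/mechanics operator

        temp = mech = 0.0
        for it in range(100):
            new_temp = solve_heat(mech)
            new_mech = solve_mechanics(new_temp)
            if abs(new_temp - temp) + abs(new_mech - mech) < 1e-10:
                break                    # Picard fixed point reached
            temp, mech = new_temp, new_mech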

  5. Integrating interactive computational modeling in biology curricula.

    PubMed

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  6. Relaxation Estimation of RMSD in Molecular Dynamics Immunosimulations

    PubMed Central

    Schreiner, Wolfgang; Karch, Rudolf; Knapp, Bernhard; Ilieva, Nevena

    2012-01-01

    Molecular dynamics simulations have to be sufficiently long to draw reliable conclusions. However, no method exists to prove that a simulation has converged. We suggest the method of “lagged RMSD-analysis” as a tool to judge if an MD simulation has not yet run long enough. The analysis is based on RMSD values between pairs of configurations separated by variable time intervals Δt. Unless RMSD(Δt) has reached a stationary shape, the simulation has not yet converged. PMID:23019425

  7. Writing Technical Reports for Simulation in Education for Health Professionals: Suggested Guidelines.

    PubMed

    Dubrowski, Adam; Alani, Sabrina; Bankovic, Tina; Crowe, Andrea; Pollard, Megan

    2015-11-02

    Simulation is an important training tool used in a variety of influential fields. However, development of simulation scenarios - the key component of simulation - occurs in isolation; sharing of scenarios is almost non-existent. This can make simulation use a costly task in terms of resources and time, with possible redundancy of effort. To alleviate these issues, the goal is to strive for an open community of practice (CoP) surrounding simulation. To facilitate this goal, this report describes a set of guidelines for writing technical reports about simulation use for educating health professionals. Using an accepted set of guidelines will allow for homogeneity when building simulation scenarios and facilitate open sharing among simulation users. In addition to optimizing simulation efforts in institutions that are currently using simulation as an educational tool, the development of such a repository may have direct implications for developing countries, where simulation is only starting to be used systematically. Our project facilitates equivalent and global access to information, knowledge, and highest-caliber education - in this context, simulation - collectively, the building blocks of optimal healthcare.

  8. Writing Technical Reports for Simulation in Education for Health Professionals: Suggested Guidelines

    PubMed Central

    Alani, Sabrina; Bankovic, Tina; Crowe, Andrea; Pollard, Megan

    2015-01-01

    Simulation is an important training tool used in a variety of influential fields. However, development of simulation scenarios - the key component of simulation - occurs in isolation; sharing of scenarios is almost non-existent. This can make simulation use a costly task in terms of resources and time, with possible redundancy of effort. To alleviate these issues, the goal is to strive for an open community of practice (CoP) surrounding simulation. To facilitate this goal, this report describes a set of guidelines for writing technical reports about simulation use for educating health professionals. Using an accepted set of guidelines will allow for homogeneity when building simulation scenarios and facilitate open sharing among simulation users. In addition to optimizing simulation efforts in institutions that are currently using simulation as an educational tool, the development of such a repository may have direct implications for developing countries, where simulation is only starting to be used systematically. Our project facilitates equivalent and global access to information, knowledge, and highest-caliber education - in this context, simulation - collectively, the building blocks of optimal healthcare. PMID:26677421

  9. Assessment of the Draft AIAA S-119 Flight Dynamic Model Exchange Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Murri, Daniel G.; Hill, Melissa A.; Jessick, Matthew V.; Penn, John M.; Hasan, David A.; Crues, Edwin Z.; Falck, Robert D.; McCarthy, Thomas G.; Vuong, Nghia

    2011-01-01

    An assessment of a draft AIAA standard for flight dynamics model exchange, ANSI/AIAA S-119-2011, was conducted on behalf of NASA by a team from the NASA Engineering and Safety Center. The assessment included adding the capability of importing standard models into real-time simulation facilities at several NASA Centers as well as into analysis simulation tools. All participants were successful at importing two example models into their respective simulation frameworks by using existing software libraries or by writing new import tools. Deficiencies in the libraries and format documentation were identified and fixed; suggestions for improvements to the standard were provided to the AIAA. An innovative tool to generate C code directly from such a model was developed. Performance of the software libraries compared favorably with compiled code. As a result of this assessment, several NASA Centers can now import standard models directly into their simulations. NASA is considering adopting the now-published S-119 standard as an internal recommended practice.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Richard Edward; Cetiner, Sacit M.; Fugate, David L.

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the third year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled) concepts, including the use of multiple coupled reactors at a single site. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor SMR models, ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface (ICHMI) technical area, and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the Modular Dynamic SIMulation (MoDSIM) tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the program, (2) developing a library of baseline component modules that can be assembled into full plant models using existing geometry and thermal-hydraulic data, (3) defining modeling conventions for interconnecting component models, and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  11. Simulation of recreational use in backcountry settings: an aid to management planning

    Treesearch

    David N. Cole

    2002-01-01

    Simulation models of recreation use patterns can be a valuable tool to managers of backcountry areas, such as wilderness areas and national parks. They can help fine-tune existing management programs, particularly in places that ration recreation use or that require the use of designated campsites. They can assist managers in evaluating the likely effects of increasing...

  12. Systems analysis of a closed loop ECLSS using the ASPEN simulation tool. Thermodynamic efficiency analysis of ECLSS components. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Chatterjee, Sharmista

    1993-01-01

    Our first goal in this project was to perform a systems analysis of a closed-loop Environmental Control and Life Support System (ECLSS). This pertains to the development of a model of an existing real system from which to assess the state or performance of that system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulator tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. Use is made of the second law of thermodynamics to determine the amount of irreversibility, or energy loss, of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
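
    The second-law bookkeeping involved is compact; as a worked micro-example (steady flow, one inlet, one outlet, heat rejected to the surroundings, with made-up numbers):

        T0 = 298.15  # dead-state temperature, K

        def entropy_generation(m_dot, s_in, s_out, q_dot, t_boundary):
            # S_gen = m*(s_out - s_in) - Q/T_b  [kW/K], steady state
            return m_dot * (s_out - s_in) - q_dot / t_boundary

        # Illustrative values: kg/s, kJ/(kg K), kW, K.
        s_gen = entropy_generation(0.1, 1.20, 1.35, -2.0, 300.0)
        lost_work = T0 * s_gen  # Gouy-Stodola theorem: irreversibility, kW
        print(s_gen, lost_work)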

  13. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present an application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight into device performance and aids in topology and system optimization.
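
    OMNEST models are written in C++ and NED, so the snippet below only illustrates the discrete-event paradigm itself, using Python's simpy with made-up arrival and service rates: packets queue for a single output port and their delays are logged.

        import random
        import simpy

        def packet(env, name, port):
            arrive = env.now
            with port.request() as req:
                yield req                                   # queue for the port
                yield env.timeout(random.expovariate(1.0))  # transmission time
            print(f"{name} delay = {env.now - arrive:.2f}")

        def source(env, port):
            for i in range(5):
                env.process(packet(env, f"pkt{i}", port))
                yield env.timeout(random.expovariate(2.0))  # inter-arrival gap

        env = simpy.Environment()
        port = simpy.Resource(env, capacity=1)
        env.process(source(env, port))
        env.run()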

  14. NDE and SHM Simulation for CFRP Composites

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Parker, F. Raymond

    2014-01-01

    Ultrasound-based nondestructive evaluation (NDE) is a common technique for damage detection in composite materials. There is a need for advanced NDE that goes beyond damage detection to damage quantification and characterization in order to enable data-driven prognostics. The damage types that exist in carbon fiber-reinforced polymer (CFRP) composites include microcracking and delaminations, and can be initiated and grown via impact forces (due to ground vehicles, tool drops, bird strikes, etc.), fatigue, and extreme environmental changes. X-ray microfocus computed tomography data, among other methods, have shown that these damage types often result in voids/discontinuities of a complex volumetric shape. The specific damage geometry and location within ply layers affect damage growth. Realistic three-dimensional NDE and structural health monitoring (SHM) simulations can aid in the development and optimization of damage quantification and characterization techniques. This paper is an overview of ongoing work towards realistic NDE and SHM simulation tools for composites, and also discusses NASA's need for such simulation tools in aeronautics and spaceflight. The paper describes the development and implementation of a custom ultrasound simulation tool that is used to model ultrasonic wave interaction with realistic 3-dimensional damage in CFRP composites. The custom code uses the elastodynamic finite integration technique and is parallelized to run efficiently on computing clusters or multicore machines.
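
    The custom EFIT code is not reproduced here; as a heavily simplified analogue of the same staggered velocity-stress update, a 1D finite-difference sketch (with illustrative material constants) is:

        import numpy as np

        n, dt, dx = 200, 1e-7, 1e-3          # grid size, time step (s), spacing (m)
        rho, mu = 1600.0, 5e9                # density, shear modulus (illustrative)
        v = np.zeros(n)                      # particle velocity
        s = np.zeros(n - 1)                  # stress on the staggered grid
        v[n // 2] = 1.0                      # initial perturbation

        for step in range(500):
            s += dt * mu * np.diff(v) / dx         # stress update from dv/dx
            v[1:-1] += dt / rho * np.diff(s) / dx  # velocity update from ds/dx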

  15. Using artificial intelligence to control fluid flow computations

    NASA Technical Reports Server (NTRS)

    Gelsey, Andrew

    1992-01-01

    Computational simulation is an essential tool for the prediction of fluid flow. Many powerful simulation programs exist today. However, using these programs to reliably analyze fluid flow and other physical situations requires considerable human effort and expertise to set up a simulation, determine whether the output makes sense, and repeatedly run the simulation with different inputs until a satisfactory result is achieved. Automating this process is not only of considerable practical importance but will also significantly advance basic artificial intelligence (AI) research in reasoning about the physical world.

  16. Integrated Modeling, Mapping, and Simulation (IMMS) framework for planning exercises.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest J.; Plantenga, Todd D.

    2010-06-01

    The Integrated Modeling, Mapping, and Simulation (IMMS) program is designing and prototyping a simulation and collaboration environment for linking together existing and future modeling and simulation tools to enable analysts, emergency planners, and incident managers to more effectively, economically, and rapidly prepare, analyze, train, and respond to real or potential incidents. When complete, the IMMS program will demonstrate an integrated modeling and simulation capability that supports emergency managers and responders with (1) conducting 'what-if' analyses and exercises to address preparedness, analysis, training, operations, and lessons learned, and (2) effectively, economically, and rapidly verifying response tactics, plans and procedures.

  17. Models and Simulations as a Service: Exploring the Use of Galaxy for Delivering Computational Models

    PubMed Central

    Walker, Mark A.; Madduri, Ravi; Rodriguez, Alex; Greenstein, Joseph L.; Winslow, Raimond L.

    2016-01-01

    We describe the ways in which Galaxy, a web-based reproducible research platform, can be used for web-based sharing of complex computational models. Galaxy allows users to seamlessly customize and run simulations on cloud computing resources, a concept we refer to as Models and Simulations as a Service (MaSS). To illustrate this application of Galaxy, we have developed a tool suite for simulating a high spatial-resolution model of the cardiac Ca2+ spark that requires supercomputing resources for execution. We also present tools for simulating models encoded in the SBML and CellML model description languages, thus demonstrating how Galaxy’s reproducible research features can be leveraged by existing technologies. Finally, we demonstrate how the Galaxy workflow editor can be used to compose integrative models from constituent submodules. This work represents, to our knowledge, an important novel approach to making computational simulations more accessible to the broader scientific community. PMID:26958881
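
    Galaxy servers can also be driven programmatically; here is a sketch using the bioblend client, where the server URL, API key, and tool id are placeholders, since the actual MaSS tool ids depend on the hosting instance.

        from bioblend.galaxy import GalaxyInstance

        gi = GalaxyInstance(url="https://galaxy.example.org", key="YOUR_API_KEY")
        hist = gi.histories.create_history(name="spark-simulation")
        result = gi.tools.run_tool(
            history_id=hist["id"],
            tool_id="cardiac_spark_sim",         # hypothetical tool id
            tool_inputs={"duration": "100"},     # hypothetical parameter
        )
        print(result["jobs"][0]["id"])  # job id of the queued simulation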

  18. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and the hardware simulator provide the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique capabilities to study the effects of a given change to the control system in the context of the distributed paradigm. The simulation tool can support treatment of all components within the control system, both virtual and real; these include the communication data network, smart sensor and actuator nodes, the centralized control system (FADEC, full authority digital engine control), and the aircraft engine itself. The DECsim tool allows simulation-based prototyping of control laws, control architectures, and decentralization strategies before hardware is integrated into the system. With the configuration specified, the simulator allows a variety of key factors to be systematically assessed. Such factors include control system performance, reliability, weight, and bandwidth utilization.

  19. RFI and SCRIMP Model Development and Verification

    NASA Technical Reports Server (NTRS)

    Loos, Alfred C.; Sayre, Jay

    2000-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as in infrastructure. A great deal of work still needs to be done to reduce the costly trial-and-error methods of VARTM processing that are in practice today. A computer simulation model of the VARTM process would provide a cost-effective tool for the manufacturing of composites utilizing this technique. Therefore, the objective of this research was to modify an existing three-dimensional Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify the model with the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool, enabling the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible in all cases. Finally, the VARTM model was used as a process analysis tool. This enabled the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the highly permeable media. A process for a three-stiffener composite panel was proposed. This configuration evolved from the variation of the process constraints in the modeling of several different composite panels. The configuration was proposed by considering such factors as infiltration time, the number of vacuum ports, and possible areas of void entrapment.

  20. Progress on the Multiphysics Capabilities of the Parallel Electromagnetic ACE3P Simulation Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kononenko, Oleksiy

    2015-03-26

    ACE3P is a 3D parallel simulation suite that is being developed at SLAC National Accelerator Laboratory. Effectively utilizing supercomputer resources, ACE3P has become a key tool for the coupled electromagnetic, thermal and mechanical research and design of particle accelerators. Based on the existing finite-element infrastructure, a massively parallel eigensolver is developed for modal analysis of mechanical structures. It complements a set of the multiphysics tools in ACE3P and, in particular, can be used for the comprehensive study of microphonics in accelerating cavities ensuring the operational reliability of a particle accelerator.

  1. Virtual reality simulators for gastrointestinal endoscopy training

    PubMed Central

    Triantafyllou, Konstantinos; Lazaridis, Lazaros Dimitrios; Dimitriadis, George D

    2014-01-01

    The use of simulators as educational tools for medical procedures is spreading rapidly and many efforts have been made for their implementation in gastrointestinal endoscopy training. Endoscopy simulation training has been suggested for ascertaining patient safety while positively influencing the trainees’ learning curve. Virtual simulators are the most promising tool among all available types of simulators. These integrated modalities offer a human-like endoscopy experience by combining virtual images of the gastrointestinal tract with haptic realism delivered through a customized endoscope. From their first steps in the 1980s until today, research involving virtual endoscopic simulators can be divided into two categories: investigation of the impact of virtual simulator training in acquiring endoscopy skills and measuring competence. Emphasis should also be given to the financial impact of their implementation in endoscopy, including the cost of these state-of-the-art simulators and the potential economic benefits from their usage. Advances in technology will contribute to the upgrade of existing models and the development of new ones, while further research should be carried out to discover new fields of application. PMID:24527175

  2. Evaluation of Phosphorus Site Assessment Tools: Lessons from the USA.

    PubMed

    Sharpley, Andrew; Kleinman, Peter; Baffaut, Claire; Beegle, Doug; Bolster, Carl; Collick, Amy; Easton, Zachary; Lory, John; Nelson, Nathan; Osmond, Deanna; Radcliffe, David; Veith, Tamie; Weld, Jennifer

    2017-11-01

    Critical source area identification through phosphorus (P) site assessment is a fundamental part of modern nutrient management planning in the United States, yet there has been only sparse testing of the many versions of the P Index that now exist. Each P site assessment tool was developed to be applicable across a range of field conditions found in a given geographic area, making evaluation extremely difficult. In general, evaluation with in-field monitoring data has been limited, focusing primarily on corroborating manure and fertilizer "source" factors. Thus, a multiregional effort (Chesapeake Bay, Heartland, and Southern States) was undertaken to evaluate P Indices using a combination of limited field data, as well as output from simulation models (i.e., Agricultural Policy Environmental eXtender, Annual P Loss Estimator, Soil and Water Assessment Tool [SWAT], and Texas Best Management Practice Evaluation Tool [TBET]) to compare against P Index ratings. These comparisons show promise for advancing the weighting and formulation of qualitative P Index components but require careful vetting of the simulation models. Differences among regional conclusions highlight model strengths and weaknesses. For example, the Southern States region found that, although models could simulate the effects of nutrient management on P runoff, they often more accurately predicted hydrology than total P loads. Furthermore, SWAT and TBET overpredicted particulate P and underpredicted dissolved P, resulting in correct total P predictions but for the wrong reasons. Experience in the United States supports expanded regional approaches to P site assessment, assuming closely coordinated efforts that engage science, policy, and implementation communities, but limited scientific validity exists for uniform national P site assessment tools at the present time.

  3. Transforming GIS data into functional road models for large-scale traffic simulation.

    PubMed

    Wilkie, David; Sewall, Jason; Lin, Ming C

    2012-06-01

    There exists a vast amount of geographic information system (GIS) data that model road networks around the world as polylines with attributes. In this form, the data are insufficient for applications such as simulation and 3D visualization, tools which will grow in power and demand as sensor data become more pervasive and as governments try to optimize their existing physical infrastructure. In this paper, we propose an efficient method for enhancing a road map from a GIS database to create a geometrically and topologically consistent 3D model to be used in real-time traffic simulation, interactive visualization of virtual worlds, and autonomous vehicle navigation. The resulting representation provides important road features for traffic simulations, including ramps, highways, overpasses, legal merge zones, and intersections with arbitrary states, and it is independent of the simulation methodologies. We test the 3D models of road networks generated by our algorithm on real-time traffic simulation using both macroscopic and microscopic techniques.
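
    One core step of such a pipeline, merging attribute-carrying polylines into a routable graph, can be sketched with networkx; the two sample records below stand in for real GIS data.

        import networkx as nx

        roads = [  # each record: polyline vertices plus attributes
            {"pts": [(0, 0), (1, 0), (2, 0)], "lanes": 2, "speed": 50},
            {"pts": [(1, 0), (1, 1)], "lanes": 1, "speed": 30},
        ]

        G = nx.Graph()
        for road in roads:
            for a, b in zip(road["pts"], road["pts"][1:]):
                length = ((b[0] - a[0]) ** 2 + (b[1] - a[1]) ** 2) ** 0.5
                # Shared endpoints merge automatically, recovering the
                # intersection topology the raw polylines leave implicit.
                G.add_edge(a, b, length=length,
                           lanes=road["lanes"], speed=road["speed"])

        print(G.number_of_nodes(), G.number_of_edges())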

  4. Simulations and Social Empathy: Domestic Violence Education in the New Millennium.

    PubMed

    Adelman, Madelaine; Rosenberg, Karen E; Hobart, Margaret

    2016-10-01

    When teaching about domestic violence, we hope that our students will be moved to act and organize against it within a social justice framework. We argue that instructional simulations can be used to inspire students to do so. Instructional simulations and gaming tools have been part of higher education pedagogical tool kits since at least the 1960s. Yet it is only recently that a domestic violence resource exists that reflects the interdisciplinary, interactive, and empathy-building orientation of feminist pedagogy. Drawing on the concept of "social empathy," we analyze the potential of the instructional simulation "In Her Shoes," developed by the Washington State Coalition Against Domestic Violence, to help students gain knowledge of and empathy for the constrained choices facing battered women, understand the frequent disjuncture between leaving and safety, and close the gap between cultural perceptions and lived realities.

  5. A Framework for Daylighting Optimization in Whole Buildings with OpenStudio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2016-08-12

    We present a toolkit and workflow for leveraging the OpenStudio (Guglielmetti et al. 2010) platform to perform daylighting analysis and optimization in a whole-building energy modeling (BEM) context. We have re-implemented OpenStudio's integrated Radiance and EnergyPlus functionality as an OpenStudio Measure. The OpenStudio Radiance Measure works within the OpenStudio Application and Parametric Analysis Tool, as well as the OpenStudio Server large-scale analysis framework, allowing a rigorous daylighting simulation to be performed on a single building model or potentially an entire population of programmatically generated models. The Radiance simulation results can automatically inform the broader building energy model, and provide dynamic daylight metrics as a basis for decision-making. Through introduction and example, this paper illustrates the utility of the OpenStudio building energy modeling platform to leverage existing simulation tools for integrated building energy performance simulation, daylighting analysis, and reporting.

  6. Adaptation of non-technical skills behavioural markers for delivery room simulation.

    PubMed

    Bracco, Fabrizio; Masini, Michele; De Tonetti, Gabriele; Brogioni, Francesca; Amidani, Arianna; Monichino, Sara; Maltoni, Alessandra; Dato, Andrea; Grattarola, Claudia; Cordone, Massimo; Torre, Giancarlo; Launo, Claudio; Chiorri, Carlo; Celleno, Danilo

    2017-03-17

    Simulation in healthcare has proved to be a useful method in improving skills and increasing the safety of clinical operations. The debriefing session, after the simulated scenario, is the core of the simulation, since it allows participants to integrate the experience with the theoretical frameworks and the procedural guidelines. There is consistent evidence for the relevance of non-technical skills (NTS) to the safe and efficient accomplishment of operations. However, the observation, assessment and feedback on these skills is particularly complex, because the process needs expert observers and the feedback is often provided in judgmental and ineffective ways. The aim of this study was therefore to develop and test a set of observation and rating forms for the NTS behavioural markers of multi-professional teams involved in delivery room emergency simulations (MINTS-DR, Multi-professional Inventory for Non-Technical Skills in the Delivery Room). The MINTS-DR was developed by adapting existing tools and, when needed, by designing new tools according to the literature. We followed a bottom-up process of interviews and co-design between practitioners and psychology experts. The forms are specific to anaesthetists, gynaecologists, nurses/midwives, and assistants, plus a global team assessment tool. We administered the tools in five editions of a simulation training course that involved 48 practitioners. Ratings of usability and usefulness were collected. The mean ratings of the usability and usefulness of the tools were not statistically different from, or were higher than, 4 on a 5-point rating scale. In both cases, no significant differences were found across professional categories. The MINTS-DR is quick and easy to administer. It is judged to be a useful asset in maximising the learning experience provided by the simulation.

  7. Production code control system for hydrodynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slone, D.M.

    1997-08-18

    We describe how the Production Code Control System (PCCS), written in Perl, has been used to control and monitor the execution of a large hydrodynamics simulation code in a production environment. We have been able to integrate new, disparate, and often independent applications into the PCCS framework without the need to modify any of our existing application codes. Both users and code developers see a consistent interface to the simulation code and associated applications regardless of the physical platform, whether an MPP, SMP, server, or desktop workstation. We also describe our use of Perl to develop a configuration management system for the simulation code, as well as a code usage database and report generator. We used Perl to write a backplane that allows us to plug in preprocessors, the hydrocode, postprocessors, visualization tools, persistent storage requests, and other codes. We need only teach PCCS a minimal amount about any new tool or code to essentially plug it in and make it usable to the hydrocode. PCCS has made it easier to link together disparate codes, since using Perl has removed the need to learn the idiosyncrasies of system or RPC programming. The text handling in Perl makes it easy to teach PCCS about new codes, or changes to existing codes.
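
    PCCS itself is written in Perl; the "backplane" idea it describes can be sketched in a few lines of Python, where the stage names and stand-in tools are invented for illustration.

        REGISTRY = {}

        def register(stage):
            # Teach the backplane a tool by naming the stage it serves.
            def wrap(fn):
                REGISTRY.setdefault(stage, []).append(fn)
                return fn
            return wrap

        @register("preprocess")
        def normalize_input(deck):
            return deck.strip()

        @register("postprocess")
        def summarize(result):
            return f"summary({result})"

        def run_pipeline(deck):
            for tool in REGISTRY.get("preprocess", []):
                deck = tool(deck)
            result = f"hydro_run({deck})"  # stand-in for the simulation run
            for tool in REGISTRY.get("postprocess", []):
                result = tool(result)
            return result

        print(run_pipeline("  input deck  "))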

  8. Malicious Activity Simulation Tool (MAST) and Trust

    DTIC Science & Technology

    2015-06-01


  9. Real simulation tools in introductory courses: packaging and repurposing our research code.

    NASA Astrophysics Data System (ADS)

    Heagy, L. J.; Cockett, R.; Kang, S.; Oldenburg, D.

    2015-12-01

    Numerical simulations are an important tool for scientific research and applications in industry. They provide a means to experiment with physics in a tangible, visual way, often providing insights into the problem. Over the last two years, we have been developing course and laboratory materials for an undergraduate geophysics course primarily taken by non-geophysics majors, including engineers and geologists. Our aim is to provide the students with resources to build intuition about geophysical techniques, promote curiosity-driven exploration, and help them develop the skills necessary to communicate across disciplines. Using open-source resources and our existing research code, we have built modules around simulations, with supporting content that gives students interactive tools for exploring the impacts of input parameters and visualizing the resulting fields, fluxes, and data for a variety of problems in applied geophysics, including magnetics, seismic, electromagnetics, and direct current resistivity. The content provides context for the problems, along with exercises aimed at getting students to experiment and ask 'what if...?' questions. In this presentation, we will discuss our approach to designing the structure of the simulation-based modules, the resources we have used, challenges we have encountered, and general feedback from students and instructors, as well as our goals and roadmap for future improvement. We hope that our experiences and approach will be beneficial to other instructors who aim to put simulation tools in the hands of students.

  10. SmaggIce 2D Version 1.8: Software Toolkit Developed for Aerodynamic Simulation Over Iced Airfoils

    NASA Technical Reports Server (NTRS)

    Choo, Yung K.; Vickerman, Mary B.

    2005-01-01

    SmaggIce 2D version 1.8 is a software toolkit developed at the NASA Glenn Research Center that consists of tools for modeling the geometry of and generating the grids for clean and iced airfoils. Plans call for the completed SmaggIce 2D version 2.0 to streamline the entire aerodynamic simulation process--the characterization and modeling of ice shapes, grid generation, and flow simulation--and to be closely coupled with the public-domain application flow solver, WIND. Grids generated using version 1.8, however, can be used by other flow solvers. SmaggIce 2D will help researchers and engineers study the effects of ice accretion on airfoil performance, which is difficult to do with existing software tools because of complex ice shapes. Using SmaggIce 2D, when fully developed, to simulate flow over an iced airfoil will help to reduce the cost of performing flight and wind-tunnel tests for certifying aircraft in natural and simulated icing conditions.

  11. Lipid-converter, a framework for lipid manipulations in molecular dynamics simulations

    PubMed Central

    Larsson, Per; Kasson, Peter M.

    2014-01-01

    Construction of lipid membrane and membrane protein systems for molecular dynamics simulations can be a challenging process. In addition, there are few available tools to extend existing studies by repeating simulations using other force fields and lipid compositions. To facilitate this, we introduce lipidconverter, a modular Python framework for exchanging force fields and lipid composition in coordinate files obtained from simulations. Force fields and lipids are specified by simple text files, making it easy to introduce support for additional force fields and lipids. The converter produces simulation input files that can be used for structural relaxation of the new membranes. PMID:25081234
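
    The text-file-driven design described above can be illustrated with a short Python sketch that loads an atom-name mapping file and applies it to a list of atom names; the one-pair-per-line file format shown is an assumption for illustration, not lipid-converter's actual specification format:

      # Each mapping file line: "old_atom_name new_atom_name" ('#' starts a comment).
      def load_mapping(path):
          mapping = {}
          with open(path) as fh:
              for line in fh:
                  line = line.split("#")[0].strip()
                  if line:
                      old, new = line.split()
                      mapping[old] = new
          return mapping

      def convert_atoms(atom_names, mapping):
          # Rename atoms for the target force field; unknown names pass through.
          return [mapping.get(name, name) for name in atom_names]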

  12. Role-playing simulation as an educational tool for health care personnel: developing an embedded assessment framework.

    PubMed

    Libin, Alexander; Lauderdale, Manon; Millo, Yuri; Shamloo, Christine; Spencer, Rachel; Green, Brad; Donnellan, Joyce; Wellesley, Christine; Groah, Suzanne

    2010-04-01

    Simulation- and video game-based role-playing techniques have been proven effective in changing behavior and enhancing positive decision making in a variety of professional settings, including education, the military, and health care. Although the need for developing assessment frameworks for learning outcomes has been clearly defined, there is a significant gap between the variety of existing multimedia-based instruction and technology-mediated learning systems and the number of reliable assessment algorithms. This study, based on a mixed methodology research design, aims to develop an embedded assessment algorithm, a Knowledge Assessment Module (NOTE), to capture both user interaction with the educational tool and knowledge gained from the training. The study is regarded as the first step in developing an assessment framework for a multimedia educational tool for health care professionals, Anatomy of Care (AOC), that utilizes Virtual Experience Immersive Learning Simulation (VEILS) technology. Ninety health care personnel of various backgrounds took part in online AOC training, choosing from five possible scenarios presenting difficult situations of everyday care. The results suggest that although the simulation-based training tool demonstrated partial effectiveness in improving learners' decision-making capacity, a differential learner-oriented approach might be more effective and capable of synchronizing educational efforts with identifiable relevant individual factors such as sociobehavioral profile and professional background.

  13. Development and psychometric testing of a Clinical Reasoning Evaluation Simulation Tool (CREST) for assessing nursing students' abilities to recognize and respond to clinical deterioration.

    PubMed

    Liaw, Sok Ying; Rashasegaran, Ahtherai; Wong, Lai Fun; Deneen, Christopher Charles; Cooper, Simon; Levett-Jones, Tracy; Goh, Hongli Sam; Ignacio, Jeanette

    2018-03-01

    The development of clinical reasoning skills in recognising and responding to clinical deterioration is essential in pre-registration nursing education. Simulation has been increasingly used by educators to develop this skill. To develop and evaluate the psychometric properties of a Clinical Reasoning Evaluation Simulation Tool (CREST) for measuring clinical reasoning skills in recognising and responding to clinical deterioration in a simulated environment. A scale-development study with psychometric testing and a mixed-methods design. Nursing students and academic staff were recruited at a university. A three-phase prospective study was conducted. Phase 1 involved the development and content validation of the CREST; Phase 2 included the psychometric testing of the tool with 15 second-year and 15 third-year nursing students who undertook the simulation-based assessment; Phase 3 involved the usability testing of the tool with nine academic staff through a survey questionnaire and focus group discussion. A 10-item CREST was developed based on a model of clinical reasoning. A content validity index of 0.93 was obtained from validation by 15 international experts. The construct validity was supported as the third-year students demonstrated significantly higher (p<0.001) clinical reasoning scores than the second-year students. The concurrent validity was also supported with significant positive correlations between global rating scores and almost all subscale scores, as well as the total scores. The predictive validity was supported by comparison with an existing tool. The internal consistency was high with a Cronbach's alpha of 0.92. A high inter-rater reliability was demonstrated with an intraclass correlation coefficient of 0.88. The usability of the tool was rated positively by the nurse educators, but the need to ease the scoring process was highlighted. A valid and reliable tool was developed to measure the effectiveness of simulation in developing clinical reasoning skills for recognising and responding to clinical deterioration. Copyright © 2017. Published by Elsevier Ltd.
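
    For reference, the internal-consistency figure reported above (Cronbach's alpha of 0.92) follows from the standard formula relating item variances to total-score variance; a minimal Python illustration (the data layout is assumed):

      import numpy as np

      def cronbach_alpha(scores):
          # scores: (n_students, n_items) array of item-level ratings.
          scores = np.asarray(scores, dtype=float)
          k = scores.shape[1]
          item_var = scores.var(axis=0, ddof=1).sum()   # sum of item variances
          total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
          return k / (k - 1) * (1 - item_var / total_var)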

  14. How sleep problems contribute to simulator sickness: Preliminary results from a realistic driving scenario.

    PubMed

    Altena, Ellemarije; Daviaux, Yannick; Sanz-Arigita, Ernesto; Bonhomme, Emilien; de Sevin, Étienne; Micoulaud-Franchi, Jean-Arthur; Bioulac, Stéphanie; Philip, Pierre

    2018-04-17

    Virtual reality and simulation tools enable us to assess daytime functioning in environments that simulate real life as closely as possible. Simulator sickness, however, poses a problem in the application of these tools, and has been related to pre-existing health problems. How sleep problems contribute to simulator sickness has not yet been investigated. In the current study, 20 female chronic insomnia patients and 32 female age-matched controls drove in a driving simulator covering realistic city, country and highway scenes. Fifty percent of the insomnia patients, as opposed to 12.5% of controls, reported excessive simulator sickness leading to withdrawal from the experiment. In the remaining participants, patients with insomnia showed overall increased levels of oculomotor symptoms even before driving, while nausea symptoms further increased after driving. These results, as well as the realistic simulation paradigm developed, give more insight into how vestibular, oculomotor and interoceptive functions are affected in insomnia. Importantly, our results have direct implications both for the actual driving experience and for the wider context of deploying simulation techniques to mimic real-life functioning, particularly in professions often exposed to sleep problems. © 2018 European Sleep Research Society.

  15. Integration of Irma tactical scene generator into directed-energy weapon system simulation

    NASA Astrophysics Data System (ADS)

    Owens, Monte A.; Cole, Madison B., III; Laine, Mark R.

    2003-08-01

    Integrated high-fidelity physics-based simulations that include engagement models, image generation, electro-optical hardware models and control system algorithms have previously been developed by Boeing-SVS for various tracking and pointing systems. These simulations, however, had always used images with featureless or random backgrounds and simple target geometries. With the requirement to engage tactical ground targets in the presence of cluttered backgrounds, a new type of scene generation tool was required to fully evaluate system performance in this challenging environment. To answer this need, Irma was integrated into the existing suite of Boeing-SVS simulation tools, allowing scene generation capabilities with unprecedented realism. Irma is a US Air Force research tool used for high-resolution rendering and prediction of target and background signatures. The MATLAB/Simulink-based simulation achieves closed-loop tracking by running track algorithms on the Irma-generated images, processing the track errors through optical control algorithms, and moving simulated electro-optical elements. The geometry of these elements determines the sensor orientation with respect to the Irma database containing the three-dimensional background and target models. This orientation is dynamically passed to Irma through a Simulink S-function to generate the next image. This integrated simulation provides a test-bed for development and evaluation of tracking and control algorithms against representative images including complex background environments and realistic targets calibrated using field measurements.

  16. An object oriented Python interface for atomistic simulations

    NASA Astrophysics Data System (ADS)

    Hynninen, T.; Himanen, L.; Parkkinen, V.; Musso, T.; Corander, J.; Foster, A. S.

    2016-01-01

    Programmable simulation environments allow one to monitor and control calculations efficiently and automatically before, during, and after runtime. Environments directly accessible from a programming language can be interfaced with powerful external analysis tools and extensions that enhance the functionality of the core program, and by incorporating a flexible object-based structure, they make building and analysing computational setups intuitive. In this work, we present a classical atomistic force field with an interface written in the Python language. The program is an extension for an existing object-based atomistic simulation environment.
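
    The following generic sketch shows what such an object-based interface can look like from the user's side; the class and method names are illustrative, not the published package's actual API:

      import itertools, math

      class Atom:
          def __init__(self, element, position):
              self.element = element
              self.position = position  # (x, y, z)

      def lennard_jones(r, epsilon=1.0, sigma=1.0):
          return 4 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)

      class Calculator:
          # Classical pair-potential force field evaluated on demand from Python.
          def energy(self, atoms):
              return sum(lennard_jones(math.dist(a.position, b.position))
                         for a, b in itertools.combinations(atoms, 2))

      dimer = [Atom("Ar", (0.0, 0.0, 0.0)), Atom("Ar", (1.12, 0.0, 0.0))]
      print(Calculator().energy(dimer))  # ~ -1.0, near the LJ minimum at r = 2**(1/6)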

  17. A Novel Tool Improves Existing Estimates of Recent Tuberculosis Transmission in Settings of Sparse Data Collection.

    PubMed

    Kasaie, Parastu; Mathema, Barun; Kelton, W David; Azman, Andrew S; Pennington, Jeff; Dowdy, David W

    2015-01-01

    In any setting, a proportion of incident active tuberculosis (TB) reflects recent transmission ("recent transmission proportion"), whereas the remainder represents reactivation. Appropriately estimating the recent transmission proportion has important implications for local TB control, but existing approaches have known biases, especially where data are incomplete. We constructed a stochastic individual-based model of a TB epidemic and designed a set of simulations (derivation set) to develop two regression-based tools for estimating the recent transmission proportion from five inputs: underlying TB incidence, sampling coverage, study duration, clustered proportion of observed cases, and proportion of observed clusters in the sample. We tested these tools on a set of unrelated simulations (validation set), and compared their performance against that of the traditional 'n-1' approach. In the validation set, the regression tools reduced the absolute estimation bias (difference between estimated and true recent transmission proportion) in the 'n-1' technique by a median [interquartile range] of 60% [9%, 82%] and 69% [30%, 87%]. The bias in the 'n-1' model was highly sensitive to underlying levels of study coverage and duration, and substantially underestimated the recent transmission proportion in settings of incomplete data coverage. By contrast, the regression models' performance was more consistent across different epidemiological settings and study characteristics. We provide one of these regression models as a user-friendly, web-based tool. Novel tools can improve our ability to estimate the recent TB transmission proportion from data that are observable (or estimable) by public health practitioners with limited available molecular data.
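
    A sketch of the contrast drawn above: the traditional 'n-1' estimate assumes each genotypic cluster of n cases contains one reactivation source and n-1 recently transmitted cases, while the paper's tools correct this estimate with a regression over study characteristics. The 'n-1' formula below is the standard one; the regression coefficients are placeholders, not the published model:

      def n_minus_1(cluster_sizes, total_cases):
          # Recent transmission proportion under the classic 'n-1' assumption.
          clustered = sum(s for s in cluster_sizes if s > 1)
          clusters = sum(1 for s in cluster_sizes if s > 1)
          return (clustered - clusters) / total_cases

      def corrected_estimate(n1_est, coverage, duration_yr, incidence):
          # Hypothetical regression-style correction; coefficients are illustrative.
          b0, b1, b2, b3 = 0.05, 1.1, -0.02, 0.001
          return b0 + b1 * n1_est + b2 * (1 - coverage) + b3 * duration_yr * incidence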

  18. A Novel Tool Improves Existing Estimates of Recent Tuberculosis Transmission in Settings of Sparse Data Collection

    PubMed Central

    Kasaie, Parastu; Mathema, Barun; Kelton, W. David; Azman, Andrew S.; Pennington, Jeff; Dowdy, David W.

    2015-01-01

    In any setting, a proportion of incident active tuberculosis (TB) reflects recent transmission (“recent transmission proportion”), whereas the remainder represents reactivation. Appropriately estimating the recent transmission proportion has important implications for local TB control, but existing approaches have known biases, especially where data are incomplete. We constructed a stochastic individual-based model of a TB epidemic and designed a set of simulations (derivation set) to develop two regression-based tools for estimating the recent transmission proportion from five inputs: underlying TB incidence, sampling coverage, study duration, clustered proportion of observed cases, and proportion of observed clusters in the sample. We tested these tools on a set of unrelated simulations (validation set), and compared their performance against that of the traditional ‘n-1’ approach. In the validation set, the regression tools reduced the absolute estimation bias (difference between estimated and true recent transmission proportion) in the ‘n-1’ technique by a median [interquartile range] of 60% [9%, 82%] and 69% [30%, 87%]. The bias in the ‘n-1’ model was highly sensitive to underlying levels of study coverage and duration, and substantially underestimated the recent transmission proportion in settings of incomplete data coverage. By contrast, the regression models’ performance was more consistent across different epidemiological settings and study characteristics. We provide one of these regression models as a user-friendly, web-based tool. Novel tools can improve our ability to estimate the recent TB transmission proportion from data that are observable (or estimable) by public health practitioners with limited available molecular data. PMID:26679499

  19. PetriScape - A plugin for discrete Petri net simulations in Cytoscape.

    PubMed

    Almeida, Diogo; Azevedo, Vasco; Silva, Artur; Baumbach, Jan

    2016-06-04

    Systems biology plays a central role in biological network analysis in the post-genomic era. Cytoscape is the standard bioinformatics tool offering the community an extensible platform for computational analysis of emerging cellular networks together with experimental omics data sets. However, only a few apps/plugins/tools are available for simulating network dynamics in Cytoscape 3. Many approaches of varying complexity exist, but none of them has been integrated into Cytoscape as an app/plugin yet. Here, we introduce PetriScape, the first Petri net simulator for Cytoscape. Although discrete Petri nets are quite simplistic models, they are capable of modeling global network properties and simulating their behaviour. In addition, they are easily understood and well suited to visualization. PetriScape comes with the following main functionalities: (1) import of biological networks in SBML format, (2) conversion into a Petri net, (3) visualization as a Petri net, and (4) simulation and visualization of the token flow in Cytoscape. PetriScape is the first Cytoscape plugin for Petri nets. It allows straightforward Petri net model creation, simulation and visualization with Cytoscape, providing clues about the activity of key components in biological networks.
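
    The token-flow simulation in step (4) can be illustrated with a minimal discrete Petri net in Python; the firing rule below is the textbook one (PetriScape's actual conversion from SBML may differ in details):

      # A transition fires when every input place holds at least one token,
      # consuming one token per input and producing one per output.
      def fire(marking, transition):
          inputs, outputs = transition
          if all(marking[p] >= 1 for p in inputs):
              for p in inputs:
                  marking[p] -= 1
              for p in outputs:
                  marking[p] += 1
              return True
          return False

      marking = {"A": 2, "B": 1, "C": 0}
      t1 = (("A", "B"), ("C",))  # reaction A + B -> C
      while fire(marking, t1):
          print(marking)         # {'A': 1, 'B': 0, 'C': 1}, then firing stops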

  20. PetriScape - A plugin for discrete Petri net simulations in Cytoscape.

    PubMed

    Almeida, Diogo; Azevedo, Vasco; Silva, Artur; Baumbach, Jan

    2016-03-01

    Systems biology plays a central role in biological network analysis in the post-genomic era. Cytoscape is the standard bioinformatics tool offering the community an extensible platform for computational analysis of emerging cellular networks together with experimental omics data sets. However, only a few apps/plugins/tools are available for simulating network dynamics in Cytoscape 3. Many approaches of varying complexity exist, but none of them has been integrated into Cytoscape as an app/plugin yet. Here, we introduce PetriScape, the first Petri net simulator for Cytoscape. Although discrete Petri nets are quite simplistic models, they are capable of modeling global network properties and simulating their behaviour. In addition, they are easily understood and well suited to visualization. PetriScape comes with the following main functionalities: (1) import of biological networks in SBML format, (2) conversion into a Petri net, (3) visualization as a Petri net, and (4) simulation and visualization of the token flow in Cytoscape. PetriScape is the first Cytoscape plugin for Petri nets. It allows straightforward Petri net model creation, simulation and visualization with Cytoscape, providing clues about the activity of key components in biological networks.

  1. What we call what we do affects how we do it: a new nomenclature for simulation research in medical education.

    PubMed

    Haji, Faizal A; Hoppe, Daniel J; Morin, Marie-Paule; Giannoulakis, Konstantine; Koh, Jansen; Rojas, David; Cheung, Jeffrey J H

    2014-05-01

    Rapid technological advances and concern for patient safety have increased the focus on simulation as a pedagogical tool for educating health care providers. To date, simulation research scholarship has focused on two areas: evaluating the instructional design of simulation programs, and integrating simulation into a broader educational context. However, these two categories of research currently exist under a single label, Simulation-Based Medical Education. In this paper we argue that introducing a more refined nomenclature within which to frame simulation research is necessary both for researchers, to appropriately design research studies and describe their findings, and for end-point users (such as program directors and educators), to more appropriately understand and utilize this evidence.

  2. Technologies to Increase PV Hosting Capacity in Distribution Feeders: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Mather, Barry; Gotseff, Peter

    This paper studies the distributed photovoltaic (PV) hosting capacity in distribution feeders by using a stochastic analysis approach. Multiple scenario simulations are conducted to analyze several factors that affect PV hosting capacity, including the existence of a voltage regulator, PV location, the power factor of the PV inverter, and Volt/VAR control. Based on the conclusions obtained from the simulation results, three approaches are then proposed to increase distributed PV hosting capacity, each of which can be formulated as an optimization problem to obtain the optimal solution. All technologies investigated in this paper utilize only existing assets in the feeder and are therefore implementable at low cost. Additionally, the tool developed for these studies is described.

  3. Technologies to Increase PV Hosting Capacity in Distribution Feeders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Mather, Barry; Gotseff, Peter

    This paper studies the distributed photovoltaic (PV) hosting capacity in distribution feeders by using a stochastic analysis approach. Multiple scenario simulations are conducted to analyze several factors that affect PV hosting capacity, including the existence of a voltage regulator, PV location, the power factor of the PV inverter, and Volt/VAR control. Based on the conclusions obtained from the simulation results, three approaches are then proposed to increase distributed PV hosting capacity, each of which can be formulated as an optimization problem to obtain the optimal solution. All technologies investigated in this paper utilize only existing assets in the feeder and are therefore implementable at low cost. Additionally, the tool developed for these studies is described.

  4. A tool to convert CAD models for importation into Geant4

    NASA Astrophysics Data System (ADS)

    Vuosalo, C.; Carlsmith, D.; Dasu, S.; Palladino, K.; LUX-ZEPLIN Collaboration

    2017-10-01

    The engineering design of a particle detector is usually performed in a Computer Aided Design (CAD) program, and simulation of the detector’s performance can be done with a Geant4-based program. However, transferring the detector design from the CAD program to Geant4 can be laborious and error-prone. SW2GDML is a tool that reads a design in the popular SOLIDWORKS CAD program and outputs Geometry Description Markup Language (GDML), used by Geant4 for importing and exporting detector geometries. Other methods for outputting CAD designs are available, such as the STEP format, and tools exist to convert these formats into GDML. However, these conversion methods produce very large and unwieldy designs composed of tessellated solids that can reduce Geant4 performance. In contrast, SW2GDML produces compact, human-readable GDML that employs standard geometric shapes rather than tessellated solids. This paper will describe the development and current capabilities of SW2GDML and plans for its enhancement. The aim of this tool is to automate importation of detector engineering models into Geant4-based simulation programs to support rapid, iterative cycles of detector design, simulation, and optimization.
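
    For illustration, this is the kind of compact output SW2GDML targets: a parametric solid expressed as a standard GDML <box> element instead of thousands of tessellated facets. The sketch uses only the Python standard library; the solid's name and dimensions are made up:

      from xml.etree.ElementTree import Element, SubElement, tostring

      solids = Element("solids")
      SubElement(solids, "box", name="vessel_wall",
                 x="100.0", y="50.0", z="25.0", lunit="mm")
      print(tostring(solids).decode())
      # <solids><box name="vessel_wall" x="100.0" y="50.0" z="25.0" lunit="mm" /></solids>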

  5. A review of training research and virtual reality simulators for the da Vinci surgical system.

    PubMed

    Liu, May; Curet, Myriam

    2015-01-01

    PHENOMENON: Virtual reality simulators are the subject of several recent studies of skills training for robot-assisted surgery. Yet no consensus exists regarding what a core skill set comprises or how to measure skill performance. Defining a core skill set and relevant metrics would help surgical educators evaluate different simulators. This review draws from published research to propose a core technical skill set for using the da Vinci surgeon console. Publications on three commercial simulators were used to evaluate the simulators' content addressing these skills and associated metrics. An analysis of published research suggests that a core technical skill set for operating the surgeon console includes bimanual wristed manipulation, camera control, master clutching to manage hand position, use of a third instrument arm, activating energy sources, appropriate depth perception, and awareness of forces applied by instruments. Validity studies of three commercial virtual reality simulators for robot-assisted surgery suggest that all three have comparable content and metrics. However, none has comprehensive content and metrics for all core skills. INSIGHTS: Virtual reality simulation remains a promising tool to support skill training for robot-assisted surgery, yet existing commercial simulator content is inadequate for performing and assessing a comprehensive basic skill set. The results of this evaluation help identify opportunities and challenges that exist for future developments in virtual reality simulation for robot-assisted surgery. Specifically, the inclusion of educational experts in the development cycle alongside clinical and technological experts is recommended.

  6. Blending technology in teaching advanced health assessment in a family nurse practitioner program: using personal digital assistants in a simulation laboratory.

    PubMed

    Elliott, Lydia; DeCristofaro, Claire; Carpenter, Alesia

    2012-09-01

    This article describes the development and implementation of the integrated use of personal handheld devices (personal digital assistants, PDAs) and high-fidelity simulation in an advanced health assessment course in a graduate family nurse practitioner (NP) program. A teaching tool was developed that can be utilized as a template for clinical case scenarios blending these separate technologies. Review of the evidence-based literature, including peer-reviewed articles and reviews. Blending the technologies of high-fidelity simulation and handheld devices (PDAs) provided a positive learning experience for graduate NP students in a teaching laboratory setting. Combining both technologies in clinical case scenarios offered a more real-world learning experience, with a focus on point-of-care service and integration of interview and physical assessment skills with existing standards of care and external clinical resources. Faculty modeling and advance training with PDA technology were crucial to success. Faculty developed a general template tool and systems-based clinical scenarios integrating PDA and high-fidelity simulation. Faculty observations, the general template tool, and one scenario example are included in this article. ©2012 The Author(s) Journal compilation ©2012 American Academy of Nurse Practitioners.

  7. Web-based applications for building, managing and analysing kinetic models of biological systems.

    PubMed

    Lee, Dong-Yup; Saha, Rajib; Yusufi, Faraaz Noor Khan; Park, Wonjun; Karimi, Iftekhar A

    2009-01-01

    Mathematical modelling and computational analysis play an essential role in improving our capability to elucidate the functions and characteristics of complex biological systems such as metabolic, regulatory and cell signalling pathways. The modelling and concomitant simulation render it possible to predict the cellular behaviour of systems under various genetically and/or environmentally perturbed conditions. This motivates systems biologists/bioengineers/bioinformaticians to develop new tools and applications, allowing non-experts to easily conduct such modelling and analysis. However, among a multitude of systems biology tools developed to date, only a handful of projects have adopted a web-based approach to kinetic modelling. In this report, we evaluate the capabilities and characteristics of current web-based tools in systems biology and identify desirable features, limitations and bottlenecks for further improvements in terms of usability and functionality. A short discussion on software architecture issues involved in web-based applications and the approaches taken by existing tools is included for those interested in developing their own simulation applications.
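
    As a minimal example of the kind of kinetic model such web-based tools build and simulate, here is a one-reaction Michaelis-Menten system integrated with SciPy (parameter values are arbitrary):

      import numpy as np
      from scipy.integrate import solve_ivp

      Vmax, Km = 1.0, 0.5

      def rhs(t, y):
          # Michaelis-Menten consumption of substrate S.
          s = y[0]
          return [-Vmax * s / (Km + s)]

      sol = solve_ivp(rhs, (0.0, 10.0), [2.0], t_eval=np.linspace(0, 10, 50))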

  8. U.S. Army Research Laboratory (ARL) XPairIt Simulator for Peptide Docking and Analysis

    DTIC Science & Technology

    2014-07-01

    results from a case study, docking a short peptide to a small protein. For this test we choose the 1RXZ system from the Protein Data Bank, which...core of XPairIt, which additionally contains many data management and organization options, analysis tools, and custom simulation methodology. Two

  9. Evaluation of existing and modified wetland equations in the SWAT model

    USDA-ARS?s Scientific Manuscript database

    Drainage significantly alters flow and nutrient pathways in small watersheds, and reliable simulation at this scale is needed for effective planning of nutrient reduction strategies. The Soil and Water Assessment Tool (SWAT) has been widely utilized for prediction of flow and nutrient loads, but...

  10. A SCREENING MODEL FOR SIMULATING DNAPL FLOW AND TRANSPORT IN POROUS MEDIA: THEORETICAL DEVELOPMENT

    EPA Science Inventory

    There exists a need for a simple tool that will allow us to analyze a DNAPL contamination scenario from free-product release to transport of soluble constituents to downgradient receptor wells. The objective of this manuscript is to present the conceptual model and formulate the ...

  11. Rapid ISS Power Availability Simulator

    NASA Technical Reports Server (NTRS)

    Downing, Nicholas

    2011-01-01

    The ISS (International Space Station) Power Resource Officers (PROs) needed a tool to automate the calculation of thousands of ISS power availability simulations used to generate power constraint matrices. Each matrix contains 864 cells, and each cell represents a single power simulation that must be run. The tools available to the flight controllers were very operator-intensive and not conducive to rapidly running the thousands of simulations necessary to generate the power constraint data. SOLAR is a Java-based tool that leverages commercial-off-the-shelf software (Satellite Toolkit) and an existing in-house ISS EPS model (SPEED) to rapidly perform thousands of power availability simulations. SOLAR has a very modular architecture and consists of a series of plug-ins that are loosely coupled. The modular architecture of the software allows for the easy replacement of the ISS power system model simulator, re-use of the Satellite Toolkit integration code, and separation of the user interface from the core logic. Satellite Toolkit (STK) is used to generate ISS eclipse and insolation times, solar beta angle, position of the solar arrays over time, and the amount of shadowing on the solar arrays, which is then provided to SPEED to calculate power generation forecasts. The power planning turn-around time is reduced from three months to two weeks (83-percent decrease) using SOLAR, and the amount of PRO power planning support effort is reduced by an estimated 30 percent.
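
    A sketch of the automation idea: treat each of the 864 matrix cells as one power-availability simulation over a grid of conditions and sweep them programmatically. The cell dimensions and the stub function below are illustrative only; the real pipeline couples STK and SPEED:

      import itertools

      def simulate_power(beta_angle, attitude, config):
          ...  # placeholder for the eclipse/insolation + EPS model run
          return 0.0  # available power (stub)

      betas = range(-75, 76, 10)    # 16 solar beta angles (illustrative)
      attitudes = ["+XVV", "-XVV"]  # hypothetical attitude labels
      configs = range(1, 28)        # 27 hypothetical system configurations
      matrix = {cell: simulate_power(*cell)
                for cell in itertools.product(betas, attitudes, configs)}
      assert len(matrix) == 864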

  12. Computer Simulation Is an Undervalued Tool for Genetic Analysis: A Historical View and Presentation of SHIMSHON – A Web-Based Genetic Simulation Package

    PubMed Central

    Greenberg, David A.

    2011-01-01

    Computer simulation methods are under-used tools in genetic analysis because simulation approaches have been portrayed as inferior to analytic methods. Even when simulation is used, its advantages are not fully exploited. Here, I present SHIMSHON, our package of genetic simulation programs that have been developed, tested, used for research, and used to generate data for Genetic Analysis Workshops (GAW). These simulation programs, now web-accessible, can be used by anyone to answer questions about designing and analyzing genetic disease studies for locus identification. This work has three foci: (1) the historical context of SHIMSHON's development, suggesting why simulation has not been more widely used so far. (2) Advantages of simulation: computer simulation helps us to understand how genetic analysis methods work. It has advantages for understanding disease inheritance and methods for gene searches. Furthermore, simulation methods can be used to answer fundamental questions that either cannot be answered by analytical approaches or cannot even be defined until the problems are identified and studied, using simulation. (3) I argue that, because simulation was not accepted, there was a failure to grasp the meaning of some simulation-based studies of linkage. This may have contributed to perceived weaknesses in linkage analysis; weaknesses that did not, in fact, exist. PMID:22189467

  13. Simulation Tools and Techniques for Analyzing the Impacts of Photovoltaic System Integration

    NASA Astrophysics Data System (ADS)

    Hariri, Ali

    Solar photovoltaic (PV) energy integration in distribution networks is one of the fastest-growing sectors of distributed energy integration. The growth in solar PV integration is incentivized by various clean power policies, global interest in solar energy, and reductions in the manufacturing and installation costs of solar energy systems. The increase in solar PV integration has raised a number of concerns regarding the potential impacts that might arise as a result of high PV penetration. Some impacts have already been recorded in networks with high PV penetration, such as in China, Germany, and the USA (Hawaii and California). Network planning is therefore becoming more intricate as new technologies are integrated into the existing electric grid, and these newly integrated technologies pose certain compatibility concerns regarding the existing grid infrastructure. PV integration impact studies are thus becoming essential for a better understanding of how to advance solar PV integration efforts without introducing adverse impacts into the network. PV impact studies are important for understanding the nature of the newly introduced phenomena, and understanding the nature of the potential impacts is a key factor in mitigating and accommodating them. Traditionally, electric power utilities have relied on phasor-based power flow simulations for planning their electric networks. However, the conventional, commercially available, phasor-based simulation tools do not provide proper visibility across a wide spectrum of electric phenomena. Moreover, different types of simulation approaches are suitable for specific types of studies. For instance, power flow software cannot be used for studying time-varying phenomena. At the same time, it is not practical to use electromagnetic transient (EMT) tools to perform power flow solutions. Therefore, some electric phenomena caused by the variability of PV generation are not visible using conventional utility simulation software. On the other hand, EMT simulation tools provide high accuracy and visibility over a wide bandwidth of frequencies at the expense of larger processing and memory requirements, limited network size, and long simulation times. There is thus a gap in simulation tools and techniques that can efficiently and effectively identify potential PV impacts, and new planning simulation tools are needed to accommodate the simulation requirements of newly integrated technologies in the electric grid. The dissertation at hand starts by identifying some of the potential impacts that are caused by high PV penetration. A phasor-based quasi-static time series (QSTS) analysis tool is developed in order to study the slow dynamics caused by variations in PV generation that lead to voltage fluctuations. Moreover, some EMT simulations are performed in order to study the impacts of PV systems on network harmonic levels. These studies provide insights into the type and duration of certain impacts, as well as the conditions that may lead to adverse phenomena, and they indicate which types of simulation tools are sufficient for each type of study. After identifying some of the potential impacts, certain planning tools and techniques are proposed. The potential PV impacts may cause certain utilities to refrain from integrating PV systems into their networks.
However, each electric network has a certain limit beyond which the impacts become substantial and may adversely interfere with the system operation and the equipment along the feeder; this limit is referred to as the hosting limit (or hosting capacity). Therefore, it is important for utilities to identify the PV hosting limit of a specific electric network in order to safely and confidently integrate the maximum possible PV capacity. In this dissertation, two approaches are proposed for identifying the hosting limit: 1. Analytical approach: a theoretical mathematical approach grounded in the fundamentals of electric power system operation. It provides an easy way to estimate the maximum amount of PV power that can be injected at each node in the network. This approach has been tested and validated. 2. Stochastic simulation software approach: a comprehensive simulation software package that can be used to identify the PV hosting limit. The software performs a large number of stochastic simulations while varying the PV system size and location. The collected data are then analyzed for violations in voltage levels, voltage fluctuations, and reverse power flow. (Abstract shortened by ProQuest.).
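
    A compact sketch of the stochastic software approach: repeatedly place PV at random nodes, grow the system size until a violation appears, and take the smallest violating size across trials as a conservative feeder-wide hosting limit. The power-flow call is a stub, and the thresholds and step sizes are illustrative:

      import random

      def run_power_flow(pv_node, pv_kw):
          ...  # placeholder for the feeder power-flow solve
          return 1.0 + 0.00002 * pv_kw  # stub: maximum per-unit voltage

      def hosting_limit(nodes, v_max=1.05, step_kw=50, trials=100):
          limits = []
          for _ in range(trials):
              node, size = random.choice(nodes), 0
              while run_power_flow(node, size + step_kw) <= v_max:
                  size += step_kw
              limits.append(size)
          return min(limits)  # most conservative limit observed

      print(hosting_limit(["bus1", "bus2", "bus3"]))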

  14. A Tool to Simulate the Transmission, Reception, and Execution of Interactive TV Applications

    PubMed Central

    Kulesza, Raoni; Rodrigues, Thiago; Machado, Felipe A. L.; Santos, Celso A. S.

    2017-01-01

    The emergence of Interactive Digital Television (iDTV) opened a set of technological possibilities that go beyond those offered by conventional TV. Among these opportunities we can highlight interactive content that runs together with the linear TV program (a television service where the viewer has to watch a scheduled TV program at the particular time it is offered and on the particular channel on which it is presented). However, developing interactive content for this new platform is not as straightforward as, for example, developing Internet applications. One of the options to make this development process easier and safer is to use an iDTV simulator. However, after having investigated some of the existing iDTV simulation environments, we have found a limitation: these simulators mainly present solutions focused on the TV receiver, whose interactive content must be loaded in advance by the programmer to a local repository (e.g., Hard Drive, USB). Therefore, in this paper, we propose a tool, named BiS (Broadcast iDTV content Simulator), which enables a broader solution for the simulation of interactive content. It allows simulating the transmission of interactive content along with the linear TV program (that is, simulating over-the-air broadcast transmission of content to the receivers). To enable this, we defined a generic and easy-to-customize communication protocol that was implemented in the tool. The proposed environment differs from others because it allows simulating reception of both linear and interactive content while running Java applications to present that content. PMID:28280770

  15. Pydna: a simulation and documentation tool for DNA assembly strategies using python.

    PubMed

    Pereira, Filipa; Azevedo, Flávio; Carvalho, Ângela; Ribeiro, Gabriela F; Budde, Mark W; Johansson, Björn

    2015-05-02

    Recent advances in synthetic biology have provided tools to efficiently construct complex DNA molecules which are an important part of many molecular biology and biotechnology projects. The planning of such constructs has traditionally been done manually using a DNA sequence editor which becomes error-prone as scale and complexity of the construction increase. A human-readable formal description of cloning and assembly strategies, which also allows for automatic computer simulation and verification, would therefore be a valuable tool. We have developed pydna, an extensible, free and open source Python library for simulating basic molecular biology DNA unit operations such as restriction digestion, ligation, PCR, primer design, Gibson assembly and homologous recombination. A cloning strategy expressed as a pydna script provides a description that is complete, unambiguous and stable. Execution of the script automatically yields the sequence of the final molecule(s) and that of any intermediate constructs. Pydna has been designed to be understandable for biologists with limited programming skills by providing interfaces that are semantically similar to the description of molecular biology unit operations found in literature. Pydna simplifies both the planning and sharing of cloning strategies and is especially useful for complex or combinatorial DNA molecule construction. An important difference compared to existing tools with similar goals is the use of Python instead of a specifically constructed language, providing a simulation environment that is more flexible and extensible by the user.
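
    To illustrate the kind of unit operation being simulated (without reproducing pydna's actual API), here is a bare-bones restriction digest on a sequence string; real tools additionally track strandedness, overhangs and circular topology:

      def digest(seq, site, cut_offset):
          # Cut at every occurrence of the recognition site.
          fragments, start = [], 0
          pos = seq.find(site)
          while pos != -1:
              fragments.append(seq[start:pos + cut_offset])
              start = pos + cut_offset
              pos = seq.find(site, pos + 1)
          fragments.append(seq[start:])
          return fragments

      print(digest("aaGAATTCttGAATTCcc", "GAATTC", 1))  # EcoRI cuts G^AATTC
      # ['aaG', 'AATTCttG', 'AATTCcc']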

  16. Modeling Tools Predict Flow in Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    2010-01-01

    "Because rocket engines operate under extreme temperature and pressure, they present a unique challenge to designers who must test and simulate the technology. To this end, CRAFT Tech Inc., of Pipersville, Pennsylvania, won Small Business Innovation Research (SBIR) contracts from Marshall Space Flight Center to develop software to simulate cryogenic fluid flows and related phenomena. CRAFT Tech enhanced its CRUNCH CFD (computational fluid dynamics) software to simulate phenomena in various liquid propulsion components and systems. Today, both government and industry clients in the aerospace, utilities, and petrochemical industries use the software for analyzing existing systems as well as designing new ones."

  17. A NEO population generation and observation simulation software tool

    NASA Astrophysics Data System (ADS)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data are needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs, and which observation strategies work best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool comprises two components, the "Population Generator" and the "Observation Simulator". The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called "Bottke Model" (Bottke et al. 2000, 2002) and the new "Granvik Model" (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool "gnuplot". The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as optical or radar sensors and simulate observation campaigns. The tool outputs field-of-view crossings and actual detections of the selected NEO population objects. Using the Observation Analysis, users are able to process and plot the results of the Observation Simulation. In order to enable end-users to handle the tool in a user-intuitive and comfortable way, a GUI has been created based on the modular Eclipse Rich Client Platform (RCP) technology. Through the GUI users can easily enter input data for the tool, execute it and view its output data in a clear way. Additionally, the GUI runs gnuplot to create plot pictures and presents them to the user. Furthermore, users can create projects to organise executions of the tool.
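
    A toy version of the fictitious-population mode: sample orbital elements and absolute magnitudes from simple assumed ranges and apply the standard NEO condition of perihelion distance q = a(1-e) < 1.3 au. The model-based Bottke/Granvik populations used by the tool are far more sophisticated:

      import numpy as np

      rng = np.random.default_rng(42)
      n = 10_000
      a = rng.uniform(0.5, 3.0, n)      # semi-major axis [au]
      e = rng.uniform(0.0, 0.9, n)      # eccentricity
      i = rng.uniform(0.0, 40.0, n)     # inclination [deg]
      H = rng.uniform(15.0, 28.0, n)    # absolute magnitude (size proxy)

      q = a * (1 - e)                   # perihelion distance
      neo = q < 1.3                     # NEO condition
      print(neo.sum(), "of", n, "sampled objects are NEOs")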

  18. Air freight demand models: An overview

    NASA Technical Reports Server (NTRS)

    Dajani, J. S.; Bernstein, G. W.

    1978-01-01

    A survey is presented of some of the approaches which have been considered in freight demand estimation. The few existing continuous-time computer simulations of aviation systems are reviewed, with a view toward assessing this approach as a tool for structuring air freight studies and for relating the different components of the air freight system. The variety of available data types and sources, without which the calibration, validation, and testing of both modal-split and simulation models would be impossible, is also reviewed.

  19. Numerical simulation of deformation and failure processes of a complex technical object under impact loading

    NASA Astrophysics Data System (ADS)

    Kraus, E. I.; Shabalin, I. I.; Shabalin, T. I.

    2018-04-01

    The main points in the development of numerical tools for simulating the deformation and failure of complex technical objects under nonstationary conditions of extreme loading are presented. The possibility of extending the dynamic method for the construction of difference grids to the 3D case is shown. A 3D realization of the discrete-continuum approach to the deformation and failure of complex technical objects is carried out. The efficiency of the existing software package for 3D modelling is shown.

  20. Publishing and sharing of hydrologic models through WaterHUB

    NASA Astrophysics Data System (ADS)

    Merwade, V.; Ruddell, B. L.; Song, C.; Zhao, L.; Kim, J.; Assi, A.

    2011-12-01

    Most hydrologists use hydrologic models to simulate hydrologic processes in order to understand pathways and fluxes for research, decision making, and engineering design. Once these tasks are complete, including publication of results, the models generally are not published or made available to the public for further use and improvement. Although publication or sharing of models is not required for journal publications, sharing models may open doors for new collaborations and avoid duplication of effort if other researchers are interested in simulating a particular watershed for which a model already exists. For researchers who are interested in sharing models, there are limited avenues for publishing their models to the wider community. Towards filling this gap, a prototype cyberinfrastructure (CI), called WaterHUB, is developed for sharing hydrologic data and modeling tools in an interactive environment. To test the utility of WaterHUB for sharing hydrologic models, a system to publish and share SWAT (Soil and Water Assessment Tool) models is developed. Users can utilize WaterHUB to search and download existing SWAT models, and also upload new SWAT models. Metadata such as the name of the watershed, the name of the person or agency who developed the model, the simulation period, the time step, and the list of calibrated parameters are also published with individual models.

  1. FLASH Interface; a GUI for managing runtime parameters in FLASH simulations

    NASA Astrophysics Data System (ADS)

    Walker, Christopher; Tzeferacos, Petros; Weide, Klaus; Lamb, Donald; Flocke, Norbert; Feister, Scott

    2017-10-01

    We present FLASH Interface, a novel graphical user interface (GUI) for managing runtime parameters in simulations performed with the FLASH code. FLASH Interface supports full text search of available parameters; provides descriptions of each parameter's role and function; allows for the filtering of parameters based on categories; performs input validation; and maintains all comments and non-parameter information already present in existing parameter files. The GUI can be used to edit existing parameter files or generate new ones. FLASH Interface is open source and was implemented with the Electron framework, making it available on Mac OSX, Windows, and Linux operating systems. The new interface lowers the entry barrier for new FLASH users and provides an easy-to-use tool for experienced FLASH simulators. U.S. Department of Energy (DOE), NNSA ASC/Alliances Center for Astrophysical Thermonuclear Flashes, U.S. DOE NNSA ASC through the Argonne Institute for Computing in Science, U.S. National Science Foundation.
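
    A sketch of the parameter-file handling behind such a GUI: FLASH runtime parameter files are typically plain "name = value" lines with '#' comments (as in a flash.par file), and validation can start as a simple membership check against the set of known parameters. The parsing rules below are assumptions for illustration:

      def read_par(path, known_params):
          params, unknown = {}, []
          with open(path) as fh:
              for line in fh:
                  line = line.split("#")[0].strip()  # drop comments
                  if "=" in line:
                      name, value = (s.strip() for s in line.split("=", 1))
                      params[name] = value
                      if name not in known_params:
                          unknown.append(name)  # candidate for input validation
          return params, unknown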

  2. Landscape analysis software tools

    Treesearch

    Don Vandendriesche

    2008-01-01

    Recently, several new computer programs have been developed to assist in landscape analysis. The “Sequential Processing Routine for Arraying Yields” (SPRAY) program was designed to run a group of stands with particular treatment activities to produce vegetation yield profiles for forest planning. SPRAY uses existing Forest Vegetation Simulator (FVS) software coupled...

  3. How to identify dislocations in molecular dynamics simulations?

    NASA Astrophysics Data System (ADS)

    Li, Duo; Wang, FengChao; Yang, ZhenYu; Zhao, YaPu

    2014-12-01

    Dislocations are of great importance in revealing the underlying mechanisms of deformed solid crystals. With the development of computational facilities and technologies, observation of dislocations at the atomic level through numerical simulations has become possible. Molecular dynamics (MD) simulation suggests itself as a powerful tool for understanding and visualizing the creation of dislocations as well as the evolution of crystal defects. However, the numerical results from large-scale MD simulations are not very illuminating by themselves, and various techniques exist for analyzing dislocations and the deformed crystal structures. Thus, it is a big challenge for beginners in this community to choose a proper method with which to start their investigations. In this review, we summarize and discuss up to twelve existing structure characterization methods used in MD simulations of deformed crystalline solids. A comprehensive comparison is made between the advantages and disadvantages of these typical techniques. We also examine some of the recent advances in the dynamics of dislocations related to hydraulic fracturing. It was found that dislocation emission has a significant effect on the propagation and bifurcation of the crack tip in hydraulic fracturing.
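
    One widely used identification method of this kind (named here as a concrete example; the review compares many others) is the centrosymmetry parameter of Kelchner et al. (1998): in a perfect centrosymmetric lattice, nearest-neighbor bond vectors cancel in opposite pairs, so the parameter is near zero in bulk fcc and large near defects such as dislocation cores. A greedy-pairing sketch:

      import numpy as np

      def centrosymmetry(neighbor_vectors):
          # Pair each bond vector with the remaining vector closest to its
          # opposite; in perfect fcc (12 neighbors) every pair cancels exactly.
          vecs = [np.asarray(v, dtype=float) for v in neighbor_vectors]
          assert len(vecs) % 2 == 0, "expects an even number of neighbors"
          csp = 0.0
          while vecs:
              v = vecs.pop()
              j = min(range(len(vecs)), key=lambda k: np.linalg.norm(vecs[k] + v))
              csp += np.linalg.norm(v + vecs.pop(j)) ** 2
          return csp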

  4. PSAMM: A Portable System for the Analysis of Metabolic Models

    PubMed Central

    Steffensen, Jon Lund; Dufault-Thompson, Keith; Zhang, Ying

    2016-01-01

    The genome-scale models of metabolic networks have been broadly applied in phenotype prediction, evolutionary reconstruction, community functional analysis, and metabolic engineering. Despite the development of tools that support individual steps along the modeling procedure, it is still difficult to associate mathematical simulation results with the annotation and biological interpretation of metabolic models. In order to solve this problem, here we developed a Portable System for the Analysis of Metabolic Models (PSAMM), a new open-source software package that supports the integration of heterogeneous metadata in model annotations and provides a user-friendly interface for the analysis of metabolic models. PSAMM is independent of paid software environments like MATLAB, and all its dependencies are freely available for academic users. Compared to existing tools, PSAMM significantly reduced the running time of constraint-based analysis and enabled flexible settings of simulation parameters using simple one-line commands. The integration of heterogeneous, model-specific annotation information in PSAMM is achieved with a novel format of YAML-based model representation, which has several advantages, such as providing a modular organization of model components and simulation settings, enabling model version tracking, and permitting the integration of multiple simulation problems. PSAMM also includes a number of quality checking procedures to examine stoichiometric balance and to identify blocked reactions. Applying PSAMM to 57 models collected from current literature, we demonstrated how the software can be used for managing and simulating metabolic models. We identified a number of common inconsistencies in existing models and constructed an updated model repository to document the resolution of these inconsistencies. PMID:26828591

  5. SOAP based web services and their future role in VO projects

    NASA Astrophysics Data System (ADS)

    Topf, F.; Jacquey, C.; Génot, V.; Cecconi, B.; André, N.; Zhang, T. L.; Kallio, E.; Lammer, H.; Facsko, G.; Stöckler, R.; Khodachenko, M.

    2011-10-01

    Modern state-of-the-art web services are of crucial importance for the interoperability of different VO tools existing in the planetary community. SOAP-based web services assure the interconnectivity between different data sources and tools by providing a common protocol for communication. This paper points out a best-practice approach with the Automated Multi-Dataset Analysis Tool (AMDA) developed by CDPP, Toulouse, and the provision of VEX/MAG data from a remote database located at IWF, Graz. Furthermore, the new FP7 project IMPEx is introduced, along with a potential usage example of AMDA web services in conjunction with simulation models.

  6. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures.

    PubMed

    Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena

    2018-01-01

    The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun the models just because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including the models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another to comprehend if the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement on experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards which underline replicability and reproducibility of research results.

  7. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures

    PubMed Central

    Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena

    2018-01-01

    The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun the models just because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including the models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another to comprehend if the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement on experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards which underline replicability and reproducibility of research results. PMID:29765315

  8. A generic model of real-world non-ideal behaviour of FES-induced muscle contractions: simulation tool

    NASA Astrophysics Data System (ADS)

    Lynch, Cheryl L.; Graham, Geoff M.; Popovic, Milos R.

    2011-08-01

    Functional electrical stimulation (FES) applications are frequently evaluated in simulation prior to testing in human subjects. Such simulations are usually based on the typical muscle responses to electrical stimulation, which may result in an overly optimistic assessment of likely real-world performance. We propose a novel method for simulating FES applications that includes non-ideal muscle behaviour during electrical stimulation resulting from muscle fatigue, spasms and tremors. A 'non-idealities' block that can be incorporated into existing FES simulations and provides a realistic estimate of real-world performance is described. An implementation example is included, showing how the non-idealities block can be incorporated into a simulation of electrically stimulated knee extension against gravity for both a proportional-integral-derivative controller and a sliding mode controller. The results presented in this paper illustrate that the real-world performance of a FES system may be vastly different from the performance obtained in simulation using nominal muscle models. We believe that our non-idealities block should be included in future simulations that involve muscle response to FES, as this tool will provide neural engineers with a realistic simulation of the real-world performance of FES systems. This simulation strategy will help engineers and organizations save time and money by preventing premature human testing. The non-idealities block will become available free of charge at www.toronto-fes.ca in late 2011.
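
    As an illustration of the idea (not the authors' published block), a minimal "non-idealities" stage can be written as a function that distorts the ideal muscle torque with fatigue, tremor, and random spasms before it reaches the plant model. The Python sketch below uses purely illustrative parameter values and state handling:

        import numpy as np

        def non_idealities(torque_cmd, t, dt, state, fatigue_tau=60.0,
                           tremor_amp=0.05, spasm_prob=0.001, spasm_amp=0.3):
            """Distort an ideal FES-induced torque with fatigue, tremor, and
            spasms (all parameter values are illustrative assumptions)."""
            if torque_cmd > 0:  # fatigue only accrues while stimulating
                state['fitness'] *= np.exp(-dt / fatigue_tau)
            tremor = tremor_amp * torque_cmd * np.sin(2 * np.pi * 8.0 * t)  # ~8 Hz
            spasm = spasm_amp * torque_cmd if np.random.rand() < spasm_prob else 0.0
            return state['fitness'] * torque_cmd + tremor + spasm

        # Usage: call once per control step, between controller and knee model
        state = {'fitness': 1.0}
        print(non_idealities(10.0, t=1.0, dt=0.01, state=state))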

  9. Materials Genome Initiative

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The Materials Genome Initiative (MGI) project element is a cross-Center effort that is focused on the integration of computational tools to simulate manufacturing processes and materials behavior. These computational simulations will be utilized to gain understanding of processes and materials behavior to accelerate process development and certification, to more efficiently integrate new materials in existing NASA projects, and to lead to the design of new materials for improved performance. This NASA effort looks to collaborate with efforts at other government agencies and universities working under the national MGI. MGI plans to develop integrated computational/experimental/processing methodologies for accelerating discovery and insertion of materials to satisfy NASA's unique mission demands. The challenges include validated design tools that incorporate materials properties, processes, and design requirements; and materials process control to rapidly mature emerging manufacturing methods and develop certified manufacturing processes.

  10. RF transient analysis and stabilization of the phase and energy of the proposed PIP-II LINAC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edelen, J. P.; Chase, B. E.

    This paper describes a recent effort to develop and benchmark a simulation tool for the analysis of RF transients and their compensation in an H- linear accelerator. Existing tools in this area either focus on electron LINACs or lack fundamental details about the LLRF system that are necessary to provide realistic performance estimates. In our paper we begin with a discussion of our computational models followed by benchmarking with existing beam-dynamics codes and measured data. We then analyze the effect of RF transients and their compensation in the PIP-II LINAC, followed by an analysis of calibration errors and how a Newton's Method based feedback scheme can be used to regulate the beam energy to within the specified limits.
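
    The abstract does not give the feedback law in detail, but a Newton's-method energy regulator of the kind described can be sketched as follows: treat the measured output energy as a function of an RF set-point (here a hypothetical cavity phase) and iterate the standard Newton update with a finite-difference slope. A minimal sketch under those assumptions:

        import math

        def newton_energy_feedback(measure_energy, phase, e_target,
                                   tol=1e-3, dphi=0.1, max_iter=20):
            """Adjust an RF set-point until the measured beam energy matches
            the target, via Newton's method with a finite-difference slope."""
            for _ in range(max_iter):
                err = measure_energy(phase) - e_target
                if abs(err) < tol:
                    break
                slope = (measure_energy(phase + dphi)
                         - measure_energy(phase - dphi)) / (2 * dphi)
                phase -= err / slope  # Newton update
            return phase

        # Toy plant: output energy (MeV) varies as a cosine of the phase (deg)
        print(newton_energy_feedback(lambda p: 800.0 * math.cos(math.radians(p)),
                                     phase=15.0, e_target=799.0))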

  11. GeNeDA: An Open-Source Workflow for Design Automation of Gene Regulatory Networks Inspired from Microelectronics.

    PubMed

    Madec, Morgan; Pecheux, François; Gendrault, Yves; Rosati, Elise; Lallement, Christophe; Haiech, Jacques

    2016-10-01

    The topic of this article is the development of an open-source automated design framework for synthetic biology, specifically for the design of artificial gene regulatory networks based on a digital approach. In contrast to other tools, GeNeDA is an open-source online software package based on existing tools used in microelectronics that have proven their efficiency over the last 30 years. The complete framework is composed of a computation core directly adapted from an Electronic Design Automation tool, input and output interfaces, a library of elementary parts that can be achieved with gene regulatory networks, and an interface with an electrical circuit simulator. Each of these modules is an extension of microelectronics tools and concepts: ODIN II, ABC, the Verilog language, the SPICE simulator, and SystemC-AMS. GeNeDA is first validated on a benchmark of several combinatorial circuits. The results highlight the importance of the part library. Then, this framework is used for the design of a sequential circuit including a biological state machine.

  12. WENESSA, Wide Eye-Narrow Eye Space Simulation for Situational Awareness

    NASA Astrophysics Data System (ADS)

    Albarait, O.; Payne, D. M.; LeVan, P. D.; Luu, K. K.; Spillar, E.; Freiwald, W.; Hamada, K.; Houchard, J.

    In an effort to achieve timelier indications of anomalous object behaviors in geosynchronous earth orbit, a Planning Capability Concept (PCC) for a “Wide Eye-Narrow Eye” (WE-NE) telescope network has been established. The PCC addresses the problem of providing continuous and operationally robust, layered and cost-effective, Space Situational Awareness (SSA) that is focused on monitoring deep space for anomalous behaviors. It does this by first detecting the anomalies with wide field of regard systems, and then providing reliable handovers for detailed observational follow-up by another optical asset. WENESSA will explore the added value of such a system to the existing Space Surveillance Network (SSN). The study will assess and quantify the degree to which the PCC completely fulfills, or improves or augments, these deep space knowledge deficiencies relative to current operational systems. In order to improve organic simulation capabilities, we will explore options for the federation of diverse community simulation approaches, while evaluating the efficiencies offered by a network of small and larger aperture, ground-based telescopes. Existing Space Modeling and Simulation (M&S) tools designed for evaluating WENESSA-like problems will be taken into consideration as we proceed in defining and developing the tools needed to perform this study, leading to the creation of a unified Space M&S environment for the rapid assessment of new capabilities. The primary goal of this effort is to perform a utility assessment of the WE-NE concept. The assessment will explore the mission utility of various WE-NE concepts in discovering deep space anomalies in concert with the SSN. The secondary goal is to generate an enduring modeling and simulation environment to explore the utility of future proposed concepts and supporting technologies. Ultimately, our validated simulation framework would support the inclusion of other ground- and space-based SSA assets through integrated analysis. Options will be explored using at least two competing simulation capabilities, but emphasis will be placed on reasoned analyses as supported by the simulations.

  13. Estimation of Nutation Time Constant Model Parameters for On-Axis Spinning Spacecraft

    NASA Technical Reports Server (NTRS)

    Schlee, Keith; Sudermann, James

    2008-01-01

    Calculating an accurate nutation time constant for a spinning spacecraft is an important step for ensuring mission success. Spacecraft nutation is caused by energy dissipation about the spin axis. Propellant slosh in the spacecraft fuel tanks is the primary source for this dissipation and can be simulated using a forced motion spin table. Mechanical analogs, such as pendulums and rotors, are typically used to simulate propellant slosh. A strong desire exists for an automated method to determine these analog parameters. The method presented accomplishes this task by using a MATLAB Simulink/SimMechanics based simulation that utilizes the Parameter Estimation Tool.
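
    The underlying estimate can be illustrated without Simulink: nutation growth or decay is commonly modeled as an exponential envelope exp(t/tau), so a time constant can be recovered from test data by a log-linear least-squares fit. A small Python sketch with synthetic data (an assumed model form, not the contract deliverable):

        import numpy as np

        def nutation_time_constant(t, envelope):
            """Fit A*exp(t/tau) to a measured nutation-angle envelope by
            linear least squares on log(envelope); returns tau."""
            slope, _ = np.polyfit(t, np.log(envelope), 1)
            return 1.0 / slope

        # Synthetic check: tau = 250 s divergent nutation with 2% noise
        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 500.0, 200)
        env = 0.5 * np.exp(t / 250.0) * (1.0 + 0.02 * rng.standard_normal(t.size))
        print(nutation_time_constant(t, env))  # close to 250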

  14. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with a high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.

  15. Method for simulating paint mixing on computer monitors

    NASA Astrophysics Data System (ADS)

    Carabott, Ferdinand; Lewis, Garth; Piehl, Simon

    2002-06-01

    Computer programs like Adobe Photoshop can generate a mixture of two 'computer' colors by using the Gradient control. However, the resulting colors diverge from the equivalent paint mixtures in both hue and value. This study examines why programs like Photoshop are unable to simulate paint or pigment mixtures, and offers a solution using Photoshop's existing tools. The article discusses how a library of colors, simulating paint mixtures, is created from 13 artists' colors. The mixtures can be imported into Photoshop as a color swatch palette of 1248 colors and as 78 continuous or stepped gradient files, all accessed in a new software package, Chromafile.
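
    The divergence the authors describe is easy to reproduce. A gradient tool interpolates channels linearly (additively), whereas pigment mixing is subtractive; even a crude per-channel geometric mean shows why the two disagree. The sketch below illustrates the phenomenon and is not the paper's measured-swatch method:

        def additive_mix(c1, c2, w=0.5):
            """Linear RGB interpolation, as a Gradient control does (0-255)."""
            return tuple(round((1 - w) * a + w * b) for a, b in zip(c1, c2))

        def subtractive_mix(c1, c2, w=0.5):
            """Weighted geometric mean of channel reflectances: a crude
            stand-in for pigment mixing (real paint needs Kubelka-Munk)."""
            return tuple(round(255 * (a / 255) ** (1 - w) * (b / 255) ** w)
                         for a, b in zip(c1, c2))

        yellow, blue = (255, 220, 0), (30, 60, 200)
        print(additive_mix(yellow, blue))     # (142, 140, 100): greyish
        print(subtractive_mix(yellow, blue))  # (87, 115, 0): dark green, paint-like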

  16. State of the Art Assessment of Simulation in Advanced Materials Development

    NASA Technical Reports Server (NTRS)

    Wise, Kristopher E.

    2008-01-01

    Advances in both the underlying theory and in the practical implementation of molecular modeling techniques have increased their value in the advanced materials development process. The objective is to accelerate the maturation of emerging materials by tightly integrating modeling with the other critical processes: synthesis, processing, and characterization. The aims of this report are to summarize the state of the art of existing modeling tools and to highlight a number of areas in which additional development is required. In an effort to maintain focus and limit length, this survey is restricted to classical simulation techniques including molecular dynamics and Monte Carlo simulations.

  17. Research of TREETOPS Structural Dynamics Controls Simulation Upgrade

    NASA Technical Reports Server (NTRS)

    Yates, Rose M.

    1996-01-01

    Under the provisions of contract number NAS8-40194, which was entitled 'TREETOPS Structural Dynamics and Controls Simulation System Upgrade', Oakwood College contracted to produce an upgrade to the existing TREETOPS suite of analysis tools. This suite includes the main simulation program, TREETOPS, two interactive preprocessors, TREESET and TREEFLX, an interactive post processor, TREEPLOT, and an adjunct program, TREESEL. A 'Software Design Document', which provides descriptions of the argument lists and internal variables for each subroutine in the TREETOPS suite, was established. Additionally, installation guides for both DOS and UNIX platforms were developed. Finally, updated User's Manuals, as well as a Theory Manual, were generated.

  18. TU-EF-204-07: Add Tube Current Modulation to a Low Dose Simulation Tool for CT Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Y.; Department of Physics, University of Arizona, Tucson, AZ; Wen, G.

    2015-06-15

    Purpose: We extended the capabilities of a low dose simulation tool to model Tube-Current Modulation (TCM). TCM is widely used in clinical practice to reduce radiation dose in CT scans. We expect the tool to be valuable for various clinical applications (e.g., optimize protocols, compare reconstruction techniques and evaluate TCM methods). Methods: The tube current is input as a function of z location, instead of a fixed value. Starting from the line integrals of a scan, a new Poisson noise realization at a lower dose is generated for each view. To validate the new functionality, we compared simulated scans with real scans in image space. Results: First we assessed noise in the difference between the low-dose simulations and the original high-dose scan. When the simulated tube current is a step function of z location, the noise at each segment matches the noise of 3 separate constant-tube-current simulations. Secondly, with a phantom that forces TCM, we compared a low-dose simulation with an equivalent real low-dose scan. The mean CT number of the simulated scan and the real low-dose scan were 137.7±0.6 and 137.8±0.5 respectively. Furthermore, with 240 ROIs, the noise of the simulated scan and the real low-dose scan were 24.03±0.45 and 23.99±0.43 respectively, and they were not statistically different (2-sample t-test, p-value=0.28). The facts that the noise reflected the trend of the TCM curve, and that the absolute noise measurements were not statistically different, validated the TCM function. Conclusion: We successfully added tube-current modulation functionality in an existing low dose simulation tool. We demonstrated that the noise reflected an input tube-current modulation curve. In addition, we verified that the noise and mean CT number of our simulation agreed with a real low dose scan. The authors are all employees of Philips. Yijun Ding is also supported by NIBIB P41EB002035 and NIBIB R01EB000803.
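
    The core of the method, as described, can be sketched in a few lines: each view's unattenuated photon count is scaled by that view's tube current, a Poisson realization is drawn, and counts are converted back to line integrals. The snippet below is a simplified reading of that description (it ignores electronic noise and the bowtie filter; variable names are assumptions):

        import numpy as np

        def simulate_low_dose(line_integrals, ma_profile, n0_per_ma, seed=0):
            """line_integrals: (views, detectors) noiseless sinogram;
            ma_profile: per-view tube current (mA) from the TCM curve;
            n0_per_ma: unattenuated photons per detector per mA."""
            rng = np.random.default_rng(seed)
            sino = np.empty_like(line_integrals)
            for v, p in enumerate(line_integrals):
                n0 = n0_per_ma * ma_profile[v]          # dose scales with mA
                counts = rng.poisson(n0 * np.exp(-p))   # new noise realization
                sino[v] = -np.log(np.maximum(counts, 1) / n0)
            return sino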

  19. Designing a Qualitative Data Collection Strategy (QDCS) for Africa - Phase 1: A Gap Analysis of Existing Models, Simulations, and Tools Relating to Africa

    DTIC Science & Technology

    2012-06-01

    generalized behavioral model characterized after the fictional Seldon equations (the one elaborated upon by Isaac Asimov in the 1951 novel, The...Foundation). Asimov described the Seldon equations as essentially statistical models with historical data of a sufficient size and variability that they

  20. A Facility and Architecture for Autonomy Research

    NASA Technical Reports Server (NTRS)

    Pisanich, Greg; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Autonomy is a key enabling factor in the advancement of remote robotic exploration. There is currently a large gap between autonomy software at the research level and software that is ready for insertion into near-term space missions. The Mission Simulation Facility (MSF) will bridge this gap by providing a simulation framework and suite of simulation tools to support research in autonomy for remote exploration. This system will allow developers of autonomy software to test their models in a high-fidelity simulation and evaluate their system's performance against a set of integrated, standardized simulations. The Mission Simulation ToolKit (MST) uses a distributed architecture with a communication layer that is built on top of the standardized High Level Architecture (HLA). This architecture enables the use of existing high-fidelity models, allows mixing simulation components from various computing platforms and enforces the use of a standardized high-level interface among components. The components needed to achieve a realistic simulation can be grouped into four categories: environment generation (terrain, environmental features), robotic platform behavior (robot dynamics), instrument models (camera/spectrometer/etc.), and data analysis. The MST will provide basic components in these areas but allows users to easily plug in any refined model by means of a communication protocol. Finally, a description file defines the robot and environment parameters for easy configuration and ensures that all the simulation models share the same information.

  1. Cement bond evaluation method in horizontal wells using segmented bond tool

    NASA Astrophysics Data System (ADS)

    Song, Ruolong; He, Li

    2018-06-01

    Most of the existing cement evaluation technologies suffer from tool eccentralization due to gravity in highly deviated wells and horizontal wells. This paper proposes a correction method to lessen the effects of tool eccentralization on cement bond evaluation results from the segmented bond tool, which has an omnidirectional sonic transmitter and eight segmented receivers evenly arranged around the tool 2 ft from the transmitter. Using a 3-D finite-difference parallel numerical simulation method, we investigate the logging responses of the centred and eccentred segmented bond tool in a variety of bond conditions. From the numerical results, we find that the tool eccentricity and channel azimuth can be estimated from the measured sector amplitudes. The average sector amplitude measured with an eccentred tool can be corrected to the value for a centred tool. The corrected amplitude is then used to calculate the channel size. The proposed method is applied to both synthetic and field data. For synthetic data, it turns out that this method can estimate the tool eccentricity with small error, and the bond map is improved after correction. For field data, the estimated tool eccentricity agrees well with the measured well deviation angle. Though this method still suffers from the low accuracy of calculating channel azimuth, the credibility of the corrected bond map is improved, especially in horizontal wells. This provides a way to evaluate the bond condition in horizontal wells using an existing logging tool. The numerical results in this paper can aid in understanding measurements of the segmented tool in both vertical and horizontal wells.
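
    One plausible reading of the correction (the paper's exact formulas are not given in the abstract) treats eccentering as the first azimuthal Fourier harmonic of the eight sector amplitudes: the harmonic's phase gives the eccentering direction, its magnitude tracks eccentricity, and the mean over sectors approximates the centred-tool amplitude. A hypothetical sketch:

        import numpy as np

        def sector_correction(amps):
            """amps: 8 sector amplitudes. Returns (corrected amplitude,
            first-harmonic magnitude, eccentering azimuth in degrees).
            Illustrative decomposition only, not the published formulas."""
            amps = np.asarray(amps, dtype=float)
            az = np.arange(8) * 45.0                       # receiver azimuths (deg)
            h1 = np.sum(amps * np.exp(1j * np.radians(az))) / 8.0
            return amps.mean(), 2.0 * np.abs(h1), np.degrees(np.angle(h1)) % 360.0

        # Amplitudes peaking near the 90-degree receiver suggest eccentering there
        print(sector_correction([5.1, 5.6, 6.2, 5.7, 5.0, 4.5, 4.1, 4.6]))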

  2. Virtual Observatories for Space Physics Observations and Simulations: New Routes to Efficient Access and Visualization

    NASA Technical Reports Server (NTRS)

    Roberts, Aaron

    2005-01-01

    New tools for data access and visualization promise to make the analysis of space plasma data both more efficient and more powerful, especially for answering questions about the global structure and dynamics of the Sun-Earth system. We will show how new tools (particularly the Virtual Space Physics Observatory-VSPO-and the Visual System for Browsing, Analysis and Retrieval of Data-ViSBARD; look for the acronyms in Google) already provide rapid access to such information as spacecraft orbits, browse plots, and detailed data, as well as visualizations that can quickly unite our view of multispacecraft observations. We will show movies illustrating multispacecraft observations of the solar wind and magnetosphere during a magnetic storm, and of simulated 30-spacecraft observations derived from MHD simulations of the magnetosphere sampled along likely trajectories of the spacecraft for the MagCon mission. An important issue remaining to be solved is how best to integrate simulation data and services into the Virtual Observatory environment, and this talk will hopefully stimulate further discussion along these lines.

  3. Linguistic geometry for technologies procurement

    NASA Astrophysics Data System (ADS)

    Stilman, Boris; Yakhnis, Vladimir; Umanskiy, Oleg; Boyd, Ron

    2005-05-01

    In the modern world of rapidly rising prices of new military hardware, the importance of Simulation Based Acquisition (SBA) is hard to overestimate. With SBA, DOD would be able to test, develop CONOPS for, debug, and evaluate new conceptual military equipment before actually building the expensive hardware. However, only recently have powerful tools for real SBA been developed. Linguistic Geometry (LG) permits full-scale modeling and evaluation of new military technologies, combinations of hardware systems and concepts of their application. Using LG tools, the analysts can create a gaming environment populated with the Blue forces armed with the new conceptual hardware as well as with appropriate existing weapons and equipment. This environment will also contain the intelligent enemy with appropriate weaponry and, if desired, with conceptual counters to the new Blue weapons. Within such an LG gaming environment, the analyst can run various what-ifs with the LG tools providing the simulated combatants with strategies and tactics solving their goals with minimal resources spent.

  4. Dataflow computing approach in high-speed digital simulation

    NASA Technical Reports Server (NTRS)

    Ercegovac, M. D.; Karplus, W. J.

    1984-01-01

    New computational tools and methodologies for the digital simulation of continuous systems were explored. Programmability and cost-effective performance in multiprocessor organizations for real-time simulation were investigated. The approach is based on functional-style languages and data flow computing principles, which allow for the natural representation of parallelism in algorithms and provide a suitable basis for the design of cost-effective, high-performance distributed systems. The objectives of this research are to: (1) perform a comparative evaluation of several existing data flow languages and develop an experimental data flow language suitable for real-time simulation using multiprocessor systems; (2) investigate the main issues that arise in the architecture and organization of data flow multiprocessors for real-time simulation; and (3) develop and apply performance evaluation models in typical applications.

  5. [Application of microelectronics CAD tools to synthetic biology].

    PubMed

    Madec, Morgan; Haiech, Jacques; Rosati, Élise; Rezgui, Abir; Gendrault, Yves; Lallement, Christophe

    2017-02-01

    Synthetic biology is an emerging science that aims to create new biological functions that do not exist in nature, based on the knowledge acquired in life science over the last century. Since the beginning of this century, several projects in synthetic biology have emerged. The complexity of the developed artificial bio-functions is relatively low, so that empirical design methods could be used for the design process. Nevertheless, with the increasing complexity of biological circuits, this is no longer the case, and a large number of computer-aided design software tools have been developed in the past few years. These tools include languages for the behavioral description and the mathematical modelling of biological systems, simulators at different levels of abstraction, libraries of biological devices and circuit design automation algorithms. All of these tools already exist in other fields of engineering sciences, particularly in microelectronics. This is the approach that is put forward in this paper. © 2017 médecine/sciences – Inserm.

  6. Coke formation in the thermal cracking of hydrocarbons. 4: Modeling of coke formation in naphtha cracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reyniers, G.C.; Froment, G.F.; Kopinke, F.D.

    1994-11-01

    An extensive experimental program has been carried out in a pilot unit for the thermal cracking of hydrocarbons. On the basis of the experimental information and the insight into the mechanisms for coke formation in pyrolysis reactors, a mathematical model describing the coke formation has been derived. This model has been incorporated into the existing simulation tools at the Laboratorium voor Petrochemische Techniek, and the run length of an industrial naphtha cracking furnace has been accurately simulated. In this way the coking model has been validated.

  7. Building energy simulation in real time through an open standard interface

    DOE PAGES

    Pang, Xiufeng; Nouidui, Thierry S.; Wetter, Michael; ...

    2015-10-20

    Building energy models (BEMs) are typically used for design and code compliance for new buildings and in the renovation of existing buildings to predict energy use. The increasing adoption of BEM as standard practice in the building industry presents an opportunity to extend the use of BEMs into construction, commissioning and operation. In 2009, the authors developed a real-time simulation framework to execute an EnergyPlus model in real time to improve building operation. This paper reports an enhancement of that real-time energy simulation framework. The previous version only works with software tools that implement the custom co-simulation interface of the Building Controls Virtual Test Bed (BCVTB), such as EnergyPlus, Dymola and TRNSYS. The new version uses an open standard interface, the Functional Mockup Interface (FMI), to provide a generic interface to any application that supports the FMI protocol. In addition, the new version utilizes the Simple Measurement and Actuation Profile (sMAP) tool as the data acquisition system to acquire, store and present data. Lastly, this paper introduces the updated architecture of the real-time simulation framework using FMI and presents proof-of-concept demonstration results which validate the new framework.
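
    The FMI side of such a coupling can be exercised with generic FMI tooling. For example, the open-source FMPy library (not part of the framework described above) can run any FMU that follows the standard; the file name and variable names below are placeholders:

        from fmpy import simulate_fmu

        # Run a hypothetical building FMU for one simulated hour, overriding a
        # cooling setpoint and recording a zone air temperature.
        result = simulate_fmu(
            'building_model.fmu',              # placeholder FMU path
            start_time=0.0,
            stop_time=3600.0,
            start_values={'TSetCoo': 297.15},  # assumed variable name [K]
            output=['TRooAir'],                # assumed output name
        )
        print(result['time'][-1], result['TRooAir'][-1])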

  8. Simulation in Otolaryngology: A teaching and training tool.

    PubMed

    Thone, Natalie; Winter, Matías; García-Matte, Raimundo J; González, Claudia

    Simulation in medical education is an effective method of teaching and learning, allowing standardisation of the learning and teaching processes without compromising the patient. Different types of simulation exist within the subspecialty areas of Otolaryngology. Models that have been developed include phantom imaging, dummy patients, virtual models and animal models that are used to teach and practice different skills. Each model has advantages and disadvantages; virtual reality is an emerging model with a promising future. However, there is still a need for further development of simulation in the area of Otolaryngology. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Otorrinolaringología y Cirugía de Cabeza y Cuello. All rights reserved.

  9. Predicting the excess solubility of acetanilide, acetaminophen, phenacetin, benzocaine, and caffeine in binary water/ethanol mixtures via molecular simulation.

    PubMed

    Paluch, Andrew S; Parameswaran, Sreeja; Liu, Shuai; Kolavennu, Anasuya; Mobley, David L

    2015-01-28

    We present a general framework to predict the excess solubility of small molecular solids (such as pharmaceutical solids) in binary solvents via molecular simulation free energy calculations at infinite dilution with conventional molecular models. The present study used molecular dynamics with the General AMBER Force Field to predict the excess solubility of acetanilide, acetaminophen, phenacetin, benzocaine, and caffeine in binary water/ethanol solvents. The simulations are able to predict the existence of solubility enhancement and the results are in good agreement with available experimental data. The accuracy of the predictions in addition to the generality of the method suggests that molecular simulations may be a valuable design tool for solvent selection in drug development processes.

  10. Predicting the excess solubility of acetanilide, acetaminophen, phenacetin, benzocaine, and caffeine in binary water/ethanol mixtures via molecular simulation

    NASA Astrophysics Data System (ADS)

    Paluch, Andrew S.; Parameswaran, Sreeja; Liu, Shuai; Kolavennu, Anasuya; Mobley, David L.

    2015-01-01

    We present a general framework to predict the excess solubility of small molecular solids (such as pharmaceutical solids) in binary solvents via molecular simulation free energy calculations at infinite dilution with conventional molecular models. The present study used molecular dynamics with the General AMBER Force Field to predict the excess solubility of acetanilide, acetaminophen, phenacetin, benzocaine, and caffeine in binary water/ethanol solvents. The simulations are able to predict the existence of solubility enhancement and the results are in good agreement with available experimental data. The accuracy of the predictions in addition to the generality of the method suggests that molecular simulations may be a valuable design tool for solvent selection in drug development processes.
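
    The connection between the computed free energies and solubility can be made explicit. Under the common assumption that the solid-phase reference term cancels, the solubility ratio of a dilute solute between two solvents follows from its infinite-dilution solvation free energies; a worked example with illustrative numbers:

        import numpy as np

        R = 8.314462618e-3  # gas constant, kJ/(mol K)

        def solubility_ratio(dG_mix, dG_ref, T=298.15):
            """S_mix/S_ref = exp(-(dG_mix - dG_ref)/RT), with dG the
            infinite-dilution solvation free energies in kJ/mol."""
            return np.exp(-(dG_mix - dG_ref) / (R * T))

        # Illustrative only: solvation 3 kJ/mol more favourable in the
        # water/ethanol mixture than in pure water -> ~3.4x enhancement
        print(solubility_ratio(-43.0, -40.0))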

  11. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies.

    PubMed

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-03-01

    Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events. While CONTAM has been used to address design and performance of buildings implementing energy conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to apply the tool to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system.

  12. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies

    PubMed Central

    Dols, W. Stuart.; Emmerich, Steven J.; Polidoro, Brian J.

    2016-01-01

    Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. Practical Application: CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events. While CONTAM has been used to address design and performance of buildings implementing energy conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to apply the tool to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system. PMID:27099405

  13. The acoustic performance of double-skin facades: A design support tool for architects

    NASA Astrophysics Data System (ADS)

    Batungbakal, Aireen

    This study assesses and validates the influence of measuring sound in the urban environment and the influence of glass facade components in reducing sound transmission to the indoor environment. Among the most reported issues affecting workspaces, increased awareness of the need to minimize noise has led building designers to reconsider the design of building envelopes and their site environment. Outdoor sound conditions, such as traffic noise, challenge designers to accurately estimate the capability of glass facades in achieving an appropriate indoor sound quality. To characterize the density of the urban environment, field tests captured existing sound levels in areas of high commercial development, employment, and traffic activity, establishing a baseline for sound levels common in urban work areas. Direct sound transmission loss data for glass facades, simulated with the sound insulation software INSUL, are used as an informative tool correlating the response of glass facade components to the existing outdoor sound levels of a project site, in order to achieve desired indoor sound levels. The study then addresses the gap in validating the acoustic performance of glass facades early in a project's design, from conditioned settings such as field testing and simulations through project completion. Results from the study's facade simulations and comparisons support the view that acoustic comfort is not limited to a single solution, but encompasses multiple design options responsive to the environment.
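
    The bookkeeping such a design-support tool performs can be shown in miniature: subtract the facade's per-band transmission loss from the outdoor spectrum and combine the residual bands into one indoor level. The numbers below are placeholders, not INSUL output, and room absorption and flanking paths are ignored:

        import numpy as np

        outdoor = np.array([78, 75, 73, 70, 66, 62])   # dB, 125 Hz-4 kHz octaves
        tl_glass = np.array([24, 27, 31, 35, 38, 41])  # dB, assumed facade TL
        indoor = outdoor - tl_glass                    # per-band indoor levels
        overall = 10 * np.log10(np.sum(10 ** (indoor / 10)))  # energy sum
        print(indoor, round(overall, 1))               # ~55 dB overall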

  14. The Processing of Airspace Concept Evaluations Using FASTE-CNS as a Pre- or Post-Simulation CNS Analysis Tool

    NASA Technical Reports Server (NTRS)

    Mainger, Steve

    2004-01-01

    As NASA speculates on and explores the future of aviation, the technological and physical aspects of our environment increasingly become hurdles that must be overcome for success. Methods for overcoming some of these hurdles have been proposed by several NASA research partners as concepts. The task of establishing a common evaluation environment was placed on NASA's Virtual Airspace Simulation Technologies (VAST) project (a sub-project of VAMS), which responded with the development of the Airspace Concept Evaluation System (ACES). When one examines the ACES environment from a communication, navigation or surveillance (CNS) perspective, the simulation parameters are built with assumed perfection in the transactions associated with CNS. To truly evaluate these concepts in a realistic sense, the contributions and effects of CNS must be part of ACES. NASA Glenn Research Center (GRC) has supported the Virtual Airspace Modeling and Simulation (VAMS) project through the continued development of CNS models and analysis capabilities supporting the ACES environment. As part of this support, NASA GRC initiated the development of a communications traffic loading analysis tool, called the Future Aeronautical Sub-network Traffic Emulator for Communications, Navigation and Surveillance (FASTE-CNS). This tool allows for forecasting of communications load, with the understanding that there is no single, common source for the loading models used to evaluate existing and planned communications channels, and that consensus and accuracy in the traffic load models are very important inputs to decisions on the acceptability of communication techniques used to fulfill aeronautical requirements. Leveraging the existing capabilities of the FASTE-CNS tool, GRC has called for FASTE-CNS to have the functionality to pre- and post-process the simulation runs of ACES to report on instances when traffic density, frequency congestion or aircraft spacing/distance violations have occurred. The integration of these functions requires that the CNS models used to characterize these avionic systems be of higher fidelity and better consistency than is present in the FASTE-CNS system. This presentation will explore the capabilities of FASTE-CNS, with renewed emphasis on the enhancements being added to perform these processing functions; the fidelity and reliability of CNS models necessary to make the enhancements work; and the benchmarking of FASTE-CNS results to improve confidence in the results of the new processing capabilities.

  15. A multiphysics and multiscale software environment for modeling astrophysical systems

    NASA Astrophysics Data System (ADS)

    Portegies Zwart, Simon; McMillan, Steve; Harfst, Stefan; Groen, Derek; Fujii, Michiko; Nualláin, Breanndán Ó.; Glebbeek, Evert; Heggie, Douglas; Lombardi, James; Hut, Piet; Angelou, Vangelis; Banerjee, Sambaran; Belkus, Houria; Fragos, Tassos; Fregeau, John; Gaburov, Evghenii; Izzard, Rob; Jurić, Mario; Justham, Stephen; Sottoriva, Andrea; Teuben, Peter; van Bever, Joris; Yaron, Ofer; Zemp, Marcel

    2009-05-01

    We present MUSE, a software framework for combining existing computational tools for different astrophysical domains into a single multiphysics, multiscale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a "Noah's Ark" milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multiscale and multiphysics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe three examples calculated using MUSE: the merger of two galaxies, the merger of two evolving stars, and a hybrid N-body simulation. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.

  16. Status and future of MUSE

    NASA Astrophysics Data System (ADS)

    Harfst, S.; Portegies Zwart, S.; McMillan, S.

    2008-12-01

    We present MUSE, a software framework for combining existing computational tools from different astrophysical domains into a single multi-physics, multi-scale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly-coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a "Noah's Ark" milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multi-scale and multi-physics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe two examples calculated using MUSE: the merger of two galaxies and an N-body simulation with live stellar evolution. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.

  17. Modeling the direct sun component in buildings using matrix algebraic approaches: Methods and validation

    DOE PAGES

    Lee, Eleanor S.; Geisler-Moroder, David; Ward, Gregory

    2017-12-23

    Simulation tools that enable annual energy performance analysis of optically-complex fenestration systems have been widely adopted by the building industry for use in building design, code development, and the development of rating and certification programs for commercially-available shading and daylighting products. The tools rely on a three-phase matrix operation to compute solar heat gains, using as input low-resolution bidirectional scattering distribution function (BSDF) data (10–15° angular resolution; BSDF data define the angle-dependent behavior of light-scattering materials and systems). Measurement standards and product libraries for BSDF data are undergoing development to support solar heat gain calculations. Simulation of other metrics such as discomfort glare, annual solar exposure, and potentially thermal discomfort, however, require algorithms and BSDF input data that more accurately model the spatial distribution of transmitted and reflected irradiance or illuminance from the sun (0.5° resolution). This study describes such algorithms and input data, then validates the tools (i.e., an interpolation tool for measured BSDF data and the five-phase method) through comparisons with ray-tracing simulations and field monitored data from a full-scale testbed. Simulations of daylight-redirecting films, a micro-louvered screen, and venetian blinds using variable resolution, tensor tree BSDF input data derived from interpolated scanning goniophotometer measurements were shown to agree with field monitored data to within 20% for greater than 75% of the measurement period for illuminance-based performance parameters. The three-phase method delivered significantly less accurate results. We discuss the ramifications of these findings on industry and provide recommendations to increase end user awareness of the current limitations of existing software tools and BSDF product libraries.

  18. Modeling the direct sun component in buildings using matrix algebraic approaches: Methods and validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Eleanor S.; Geisler-Moroder, David; Ward, Gregory

    Simulation tools that enable annual energy performance analysis of optically-complex fenestration systems have been widely adopted by the building industry for use in building design, code development, and the development of rating and certification programs for commercially-available shading and daylighting products. The tools rely on a three-phase matrix operation to compute solar heat gains, using as input low-resolution bidirectional scattering distribution function (BSDF) data (10–15° angular resolution; BSDF data define the angle-dependent behavior of light-scattering materials and systems). Measurement standards and product libraries for BSDF data are undergoing development to support solar heat gain calculations. Simulation of other metrics such as discomfort glare, annual solar exposure, and potentially thermal discomfort, however, require algorithms and BSDF input data that more accurately model the spatial distribution of transmitted and reflected irradiance or illuminance from the sun (0.5° resolution). This study describes such algorithms and input data, then validates the tools (i.e., an interpolation tool for measured BSDF data and the five-phase method) through comparisons with ray-tracing simulations and field monitored data from a full-scale testbed. Simulations of daylight-redirecting films, a micro-louvered screen, and venetian blinds using variable resolution, tensor tree BSDF input data derived from interpolated scanning goniophotometer measurements were shown to agree with field monitored data to within 20% for greater than 75% of the measurement period for illuminance-based performance parameters. The three-phase method delivered significantly less accurate results. We discuss the ramifications of these findings on industry and provide recommendations to increase end user awareness of the current limitations of existing software tools and BSDF product libraries.
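
    The three-phase matrix operation both versions of this record refer to has a compact algebraic form: interior illuminance E is the chained product of a view matrix V, the fenestration BSDF T, a daylight matrix D, and a sky vector s. The dimensions below follow the commonly used Klems (145-patch) and Reinhart MF:1 (146-patch) bases, with random placeholder data:

        import numpy as np

        rng = np.random.default_rng(0)
        V = rng.random((10, 145))    # sensors <- window exiting directions
        T = rng.random((145, 145))   # window BSDF in the Klems basis
        D = rng.random((145, 146))   # window incident <- sky patches
        s = rng.random(146)          # sky patch luminances, one timestep
        E = V @ T @ D @ s            # illuminance at the 10 sensors
        print(E.shape)               # (10,)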

  19. MASTODON: A geosciences simulation tool built using the open-source framework MOOSE

    NASA Astrophysics Data System (ADS)

    Slaughter, A.

    2017-12-01

    The Department of Energy (DOE) is currently investing millions of dollars annually into various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture (extended finite-element method), and porous media, among others. The tensor mechanics and contact modules, in particular, are well suited for nonlinear geosciences problems. Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON; https://seismic-research.inl.gov/SitePages/Mastodon.aspx)--a MOOSE-based application--is capable of analyzing the response of 3D soil-structure systems to external hazards with current development focused on earthquakes. It is capable of simulating seismic events and can perform extensive "source-to-site" simulations including earthquake fault rupture, nonlinear wave propagation, and nonlinear soil-structure interaction analysis. MASTODON also includes a dynamic probabilistic risk assessment capability that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment. Although MASTODON has been developed for the nuclear industry, it can be used to assess the risk for any structure subjected to earthquakes. The geosciences community can learn from the nuclear industry and harness the enormous effort underway to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The geosciences community could benefit from existing tools by enabling collaboration between researchers and practitioners throughout the world and advance the state-of-the-art in line with other scientific research efforts.

  20. Using a medical simulation center as an electronic health record usability laboratory

    PubMed Central

    Landman, Adam B; Redden, Lisa; Neri, Pamela; Poole, Stephen; Horsky, Jan; Raja, Ali S; Pozner, Charles N; Schiff, Gordon; Poon, Eric G

    2014-01-01

    Usability testing is increasingly being recognized as a way to increase the usability and safety of health information technology (HIT). Medical simulation centers can serve as testing environments for HIT usability studies. We integrated the quality assurance version of our emergency department (ED) electronic health record (EHR) into our medical simulation center and piloted a clinical care scenario in which emergency medicine resident physicians evaluated a simulated ED patient and documented electronically using the ED EHR. Meticulous planning and close collaboration with expert simulation staff was important for designing test scenarios, pilot testing, and running the sessions. Similarly, working with information systems teams was important for integration of the EHR. Electronic tools are needed to facilitate entry of fictitious clinical results while the simulation scenario is unfolding. EHRs can be successfully integrated into existing simulation centers, which may provide realistic environments for usability testing, training, and evaluation of human–computer interactions. PMID:24249778

  1. Development of a simulation of the surficial groundwater system for the CONUS

    NASA Astrophysics Data System (ADS)

    Zell, W.; Sanford, W. E.

    2016-12-01

    Water resource and environmental managers across the country face a variety of questions involving groundwater availability and/or groundwater transport pathways. Emerging management questions require prediction of groundwater response to changing climate regimes (e.g., how drought-induced water-table recession may degrade near-stream vegetation and result in increased wildfire risks), while existing questions can require identification of current groundwater contributions to surface water (e.g., groundwater linkages between landscape contaminant inputs and receiving streams may help explain in-stream phenomena such as fish intersex). At present, few national-coverage simulation tools exist to help characterize groundwater contributions to receiving streams and predict potential changes in base-flow regimes under changing climate conditions. We will describe the Phase 1 development of a simulation of the water table and shallow groundwater system for the entire CONUS. We use national-scale datasets such as the National Recharge Map and the Map Database for Surficial Materials in the CONUS to develop groundwater flow (MODFLOW) and transport (MODPATH) models that are calibrated against groundwater level and stream elevation data from NWIS and NHD, respectively. Phase 1 includes the development of a national transmissivity map for the surficial groundwater system and examines the impact of model-grid resolution on the simulated steady-state discharge network (and associated recharge areas) and base-flow travel time distributions for different HUC scales. In the course of developing the transmissivity map we show that transmissivity in fractured bedrock systems is dependent on depth to water. Subsequent phases of this work will simulate water table changes at a monthly time step (using MODIS-dependent recharge estimates) and serve as a critical complement to surface-water-focused USGS efforts to provide national coverage hydrologic modeling tools.

  2. SOA approach to battle command: simulation interoperability

    NASA Astrophysics Data System (ADS)

    Mayott, Gregory; Self, Mid; Miller, Gordon J.; McDonnell, Joseph S.

    2010-04-01

    NVESD is developing a Sensor Data and Management Services (SDMS) Service Oriented Architecture (SOA) that provides an innovative approach to achieve seamless application functionality across simulation and battle command systems. In 2010, CERDEC will conduct a SDMS Battle Command demonstration that will highlight the SDMS SOA capability to couple simulation applications to existing Battle Command systems. The demonstration will leverage RDECOM MATREX simulation tools and TRADOC Maneuver Support Battle Laboratory Virtual Base Defense Operations Center facilities. The battle command systems are those specific to the operation of a base defense operations center in support of force protection missions. The SDMS SOA consists of four components that will be discussed. An Asset Management Service (AMS) will automatically discover the existence, state, and interface definition required to interact with a named asset (a sensor or sensor platform, a process such as level-1 fusion, or an interface to a sensor or other network endpoint). A Streaming Video Service (SVS) will automatically discover the existence, state, and interfaces required to interact with a named video stream, and abstract the consumers of the video stream from the originating device. A Task Manager Service (TMS) will be used to automatically discover the existence of a named mission task, and will interpret, translate and transmit a mission command for the blue force unit(s) described in a mission order. JC3IEDM data objects, and a software development kit (SDK), will be utilized as the basic data object definition for implemented web services.

  3. A Simulation Study of Acoustic-Assisted Tracking of Whales for Mark-Recapture Surveys

    PubMed Central

    Peel, David; Miller, Brian S.; Kelly, Natalie; Dawson, Steve; Slooten, Elisabeth; Double, Michael C.

    2014-01-01

    Collecting enough data to obtain reasonable abundance estimates of whales is often difficult, particularly when studying rare species. Passive acoustics can be used to detect whale sounds and are increasingly used to estimate whale abundance. Much of the existing effort centres on the use of acoustics to estimate abundance directly, e.g. analysing detections in a distance sampling framework. Here, we focus on acoustics as a tool incorporated within mark-recapture surveys. In this context, acoustic tools are used to detect and track whales, which are then photographed or biopsied to provide data for mark-recapture analyses. The purpose of incorporating acoustics is to increase the encounter rate beyond using visual searching only. While this general approach is not new, its utility is rarely quantified. This paper predicts the “acoustically-assisted” encounter rate using a discrete-time individual-based simulation of whales and survey vessel. We validate the simulation framework using existing data from studies of sperm whales. We then use the framework to predict potential encounter rates in a study of Antarctic blue whales. We also investigate the effects of a number of the key parameters on encounter rate. Mean encounter rates from the simulation of sperm whales matched well with empirical data. Variance of encounter rate, however, was underestimated. The simulation of Antarctic blue whales found that passive acoustics should provide a 1.7–3.0 fold increase in encounter rate over visual-only methods. Encounter rate was most sensitive to acoustic detection range, followed by vocalisation rate. During survey planning and design, some indication of the relationship between expected sample size and effort is paramount; this simulation framework can be used to predict encounter rates and establish this relationship. For a case in point, the simulation framework indicates unequivocally that real-time acoustic tracking should be considered for quantifying the abundance of Antarctic blue whales via mark-recapture methods. PMID:24827919
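
    A toy version of such a discrete-time individual-based simulation conveys the mechanics: whales are scattered over a survey block, the vessel advances each time step, and an encounter is logged when a whale falls within visual range, or within the (larger) acoustic range while vocalising. All parameter values below are made up for illustration:

        import numpy as np

        def mean_encounters(trials=200, box=200.0, whales=5, v_ship=10.0,
                            t_max=20.0, dt=0.05, r_vis=2.0, r_ac=15.0,
                            p_vocal=0.3, seed=0):
            """Expected whales encountered per survey leg (toy model)."""
            rng = np.random.default_rng(seed)
            totals = []
            for _ in range(trials):
                pos = rng.uniform(-box / 2, box / 2, size=(whales, 2))
                found = np.zeros(whales, dtype=bool)
                for t in np.arange(0.0, t_max, dt):
                    ship = np.array([v_ship * t - box / 2, 0.0])
                    d = np.linalg.norm(pos - ship, axis=1)
                    vocal = rng.random(whales) < p_vocal
                    found |= (d < r_vis) | ((d < r_ac) & vocal)
                totals.append(found.sum())
            return float(np.mean(totals))

        print(mean_encounters())  # the acoustic range dominates the rate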

  4. A simulation study of acoustic-assisted tracking of whales for mark-recapture surveys.

    PubMed

    Peel, David; Miller, Brian S; Kelly, Natalie; Dawson, Steve; Slooten, Elisabeth; Double, Michael C

    2014-01-01

    Collecting enough data to obtain reasonable abundance estimates of whales is often difficult, particularly when studying rare species. Passive acoustics can be used to detect whale sounds and are increasingly used to estimate whale abundance. Much of the existing effort centres on the use of acoustics to estimate abundance directly, e.g. analysing detections in a distance sampling framework. Here, we focus on acoustics as a tool incorporated within mark-recapture surveys. In this context, acoustic tools are used to detect and track whales, which are then photographed or biopsied to provide data for mark-recapture analyses. The purpose of incorporating acoustics is to increase the encounter rate beyond using visual searching only. While this general approach is not new, its utility is rarely quantified. This paper predicts the "acoustically-assisted" encounter rate using a discrete-time individual-based simulation of whales and survey vessel. We validate the simulation framework using existing data from studies of sperm whales. We then use the framework to predict potential encounter rates in a study of Antarctic blue whales. We also investigate the effects of a number of the key parameters on encounter rate. Mean encounter rates from the simulation of sperm whales matched well with empirical data. Variance of encounter rate, however, was underestimated. The simulation of Antarctic blue whales found that passive acoustics should provide a 1.7-3.0 fold increase in encounter rate over visual-only methods. Encounter rate was most sensitive to acoustic detection range, followed by vocalisation rate. During survey planning and design, some indication of the relationship between expected sample size and effort is paramount; this simulation framework can be used to predict encounter rates and establish this relationship. For a case in point, the simulation framework indicates unequivocally that real-time acoustic tracking should be considered for quantifying the abundance of Antarctic blue whales via mark-recapture methods.
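
    A minimal sketch of the kind of discrete-time, individual-based encounter simulation described above, with hypothetical detection ranges, whale density, and vessel speed (not the authors' parameter values): whales are scattered over a square region, the vessel advances one step per time tick along a transect, and an encounter is logged whenever a whale falls within the detection radius.

        import random, math

        random.seed(42)
        # assumed, illustrative parameters -- not the values used in the paper
        area_km, n_whales = 200.0, 40
        visual_km, acoustic_km = 5.0, 30.0
        speed_km_h, hours = 12.0, 24 * 5

        whales = [(random.uniform(0, area_km), random.uniform(0, area_km))
                  for _ in range(n_whales)]

        def encounters(detect_km):
            found, x = set(), 0.0
            for _ in range(hours):                 # vessel runs a straight transect
                x = (x + speed_km_h) % area_km
                for i, (wx, wy) in enumerate(whales):
                    if math.hypot(wx - x, wy - area_km / 2) < detect_km:
                        found.add(i)
            return len(found)

        print("visual-only encounters:", encounters(visual_km))
        print("acoustic encounters   :", encounters(acoustic_km))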

  5. Arrhythmic risk biomarkers for the assessment of drug cardiotoxicity: from experiments to computer simulations

    PubMed Central

    Corrias, A.; Jie, X.; Romero, L.; Bishop, M. J.; Bernabeu, M.; Pueyo, E.; Rodriguez, B.

    2010-01-01

    In this paper, we illustrate how advanced computational modelling and simulation can be used to investigate drug-induced effects on cardiac electrophysiology and on specific biomarkers of pro-arrhythmic risk. To do so, we first perform a thorough literature review of proposed arrhythmic risk biomarkers from the ionic to the electrocardiogram levels. The review highlights the variety of proposed biomarkers, the complexity of the mechanisms of drug-induced pro-arrhythmia and the existence of significant animal species differences in drug-induced effects on cardiac electrophysiology. Predicting drug-induced pro-arrhythmic risk solely using experiments is challenging both preclinically and clinically, as attested by the rise in the cost of releasing new compounds to the market. Computational modelling and simulation has significantly contributed to the understanding of cardiac electrophysiology and arrhythmias over the last 40 years. In the second part of this paper, we illustrate how state-of-the-art open source computational modelling and simulation tools can be used to simulate multi-scale effects of drug-induced ion channel block in ventricular electrophysiology at the cellular, tissue and whole ventricular levels for different animal species. We believe that the use of computational modelling and simulation in combination with experimental techniques could be a powerful tool for the assessment of drug safety pharmacology. PMID:20478918
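
    Drug-induced channel block in such multi-scale simulations is typically represented by scaling a channel's maximal conductance with a pore-block (Hill) factor; a minimal sketch, with hypothetical IC50, concentration, and conductance values:

        def fractional_block(conc, ic50, hill=1.0):
            """Fraction of channels blocked by a drug (simple pore-block model)."""
            return 1.0 / (1.0 + (ic50 / conc) ** hill)

        # hypothetical hERG example: scale the conductance before running the model
        g_kr_control = 0.153                            # assumed baseline (mS/uF)
        block = fractional_block(conc=1.0, ic50=0.5)    # drug at 2x IC50
        g_kr_drug = g_kr_control * (1.0 - block)
        print(f"{block:.0%} block -> g_Kr = {g_kr_drug:.4f}")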

  6. Hydrothermal Microflow Technology as a Research Tool for Origin-of-Life Studies in Extreme Earth Environments

    PubMed Central

    Kawamura, Kunio

    2017-01-01

    Although studies about the origin of life are a frontier in science and a number of effective approaches have been developed, drawbacks still exist. Examples include: (1) simulation of chemical evolution experiments (which were demonstrated for the first time by Stanley Miller); (2) approaches tracing back the most primitive life-like systems (on the basis of investigations of present organisms); and (3) constructive approaches for making life-like systems (on the basis of molecular biology), such as in vitro construction of the RNA world. Naturally, simulation experiments of chemical evolution under plausible ancient Earth environments have been recognized as a potentially fruitful approach. Nevertheless, simulation experiments seem not to be sufficient for identifying the scenario from molecules to life. This is because primitive Earth environments are still not clearly defined and a number of possibilities should be taken into account. In addition, such environments frequently comprise extreme conditions when compared to the environments of present organisms. Therefore, we need to realize the importance of accurate and convenient experimental approaches that use practical research tools, which are resistant to high temperature and pressure, to facilitate chemical evolution studies. This review summarizes improvements made in such experimental approaches over the last two decades, focusing primarily on our hydrothermal microflow reactor technology. Microflow reactor systems are a powerful tool for performing simulation experiments in diverse simulated hydrothermal Earth conditions in order to measure the kinetics of formation and degradation and the interactions of biopolymers. PMID:28974048

  7. Generating Virtual Patients by Multivariate and Discrete Re-Sampling Techniques.

    PubMed

    Teutonico, D; Musuamba, F; Maas, H J; Facius, A; Yang, S; Danhof, M; Della Pasqua, O

    2015-10-01

    Clinical Trial Simulations (CTS) are a valuable tool for decision-making during drug development. However, to obtain realistic simulation scenarios, the patients included in the CTS must be representative of the target population. This is particularly important when covariate effects exist that may affect the outcome of a trial. The objective of our investigation was to evaluate and compare CTS results using re-sampling from a population pool and multivariate distributions to simulate patient covariates. COPD was selected as the paradigm disease for the purposes of our analysis, FEV1 was used as the response measure, and the effects of a hypothetical intervention were evaluated in different populations in order to assess the predictive performance of the two methods. Our results show that the multivariate distribution method produces realistic covariate correlations, comparable to the real population. Moreover, it allows simulation of patient characteristics beyond the limits of the inclusion and exclusion criteria in historical protocols. Both methods, discrete re-sampling and multivariate distribution, generate realistic pools of virtual patients. However, the use of a multivariate distribution enables more flexible simulation scenarios, since it is not bound to the covariate combinations existing in the available clinical data sets.
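
    A sketch of the two covariate-generation strategies compared above, using NumPy and hypothetical height/weight statistics: discrete re-sampling draws whole covariate vectors from the observed pool, while the multivariate approach samples new combinations from a fitted joint distribution.

        import numpy as np

        rng = np.random.default_rng(0)
        # hypothetical observed pool: columns are height (cm) and weight (kg)
        pool = rng.multivariate_normal([170, 75], [[80, 40], [40, 120]], size=200)

        # 1) discrete re-sampling: bootstrap whole patients from the pool
        resampled = pool[rng.integers(0, len(pool), size=1000)]

        # 2) multivariate distribution: fit mean/covariance, sample new patients
        mu, cov = pool.mean(axis=0), np.cov(pool, rowvar=False)
        simulated = rng.multivariate_normal(mu, cov, size=1000)

        # both should reproduce the height/weight correlation of the pool
        print(np.corrcoef(resampled.T)[0, 1], np.corrcoef(simulated.T)[0, 1])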

  8. NEURON and Python.

    PubMed

    Hines, Michael L; Davison, Andrew P; Muller, Eilif

    2009-01-01

    The NEURON simulation program now allows Python to be used, alone or in combination with NEURON's traditional Hoc interpreter. Adding Python to NEURON has the immediate benefit of making available a very extensive suite of analysis tools written for engineering and science. It also catalyzes NEURON software development by offering users a modern programming tool that is recognized for its flexibility and power to create and maintain complex programs. At the same time, nothing is lost because all existing models written in Hoc, including graphical user interface tools, continue to work without change and are also available within the Python context. An example of the benefits of Python availability is the use of the xml module in implementing NEURON's Import3D and CellBuild tools to read MorphML and NeuroML model specifications.

  9. NEURON and Python

    PubMed Central

    Hines, Michael L.; Davison, Andrew P.; Muller, Eilif

    2008-01-01

    The NEURON simulation program now allows Python to be used, alone or in combination with NEURON's traditional Hoc interpreter. Adding Python to NEURON has the immediate benefit of making available a very extensive suite of analysis tools written for engineering and science. It also catalyzes NEURON software development by offering users a modern programming tool that is recognized for its flexibility and power to create and maintain complex programs. At the same time, nothing is lost because all existing models written in Hoc, including graphical user interface tools, continue to work without change and are also available within the Python context. An example of the benefits of Python availability is the use of the xml module in implementing NEURON's Import3D and CellBuild tools to read MorphML and NeuroML model specifications. PMID:19198661
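
    For illustration, a minimal NEURON-from-Python session in the style the paper describes (standard h interface; geometry and stimulus values are arbitrary):

        from neuron import h
        h.load_file("stdrun.hoc")            # brings in the standard run system

        soma = h.Section(name="soma")
        soma.L = soma.diam = 20              # um
        soma.insert("hh")                    # Hodgkin-Huxley channels

        stim = h.IClamp(soma(0.5))
        stim.delay, stim.dur, stim.amp = 1, 5, 0.3   # ms, ms, nA

        t = h.Vector().record(h._ref_t)
        v = h.Vector().record(soma(0.5)._ref_v)
        h.finitialize(-65)
        h.continuerun(20)                    # ms
        print("peak Vm:", v.max(), "mV")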

  10. Potential impacts of carbon taxes on carbon flux in western Oregon private forests

    Treesearch

    Eun Ho Im; Darius M. Adams; Gregory S. Latta

    2007-01-01

    This study considers a carbon tax system as a policy tool for encouraging carbon sequestration through modification of management in existing forests and examines its welfare impacts and costs of the carbon sequestered. The simulated carbon tax leads to reduced harvest and increased carbon stock in the standing trees and understory biomass. Changes in the level of...

  11. Simulation of ridesourcing using agent-based demand and supply regional models : potential market demand for first-mile transit travel and reduction in vehicle miles traveled in the San Francisco Bay Area.

    DOT National Transportation Integrated Search

    2016-01-01

    In this study, we use existing modeling tools and data from the San Francisco Bay Area : (California) to understand the potential market demand for a first mile transit access service : and possible reductions in vehicle miles traveled (VMT) (a...

  12. Using historical simulations of vegetation to assess departure of current vegetation conditions across large landscapes[Chapter 11

    Treesearch

    Lisa Holsinger; Robert E. Keane; Brian Steele; Matthew C. Reeves; Sarah Pratt

    2006-01-01

    The Landscape Fire and Resource Management Planning Tools Prototype Project, or LANDFIRE Prototype Project, was conceived, in part, to identify areas across the nation where existing landscape conditions are markedly different from historical conditions (Keane and Rollins, Ch. 3). This objective arose from the recognition that over 100 years of land use and wildland...

  13. The use of psychiatry-focused simulation in undergraduate nursing education: A systematic search and review.

    PubMed

    Vandyk, Amanda D; Lalonde, Michelle; Merali, Sabrina; Wright, Erica; Bajnok, Irmajean; Davies, Barbara

    2018-04-01

    Evidence on the use of simulation to teach psychiatry and mental health (including addiction) content is emerging, yet no summary of the implementation processes or associated outcomes exists. The aim of this study was to systematically search and review empirical literature on the use of psychiatry-focused simulation in undergraduate nursing education. Objectives were to (i) assess the methodological quality of existing evidence on the use of simulation to teach mental health content to undergraduate nursing students, (ii) describe the operationalization of the simulations, and (iii) summarize the associated quantitative and qualitative outcomes. We conducted online database (MEDLINE, Embase, ERIC, CINAHL, PsycINFO from January 2004 to October 2015) and grey literature searches. Thirty-two simulation studies were identified describing and evaluating six types of simulations (standardized patients, audio simulations, high-fidelity simulators, virtual world, multimodal, and tabletop). Overall, 2724 participants were included in the studies. Studies reflected a limited number of intervention designs, and outcomes were evaluated with qualitative and quantitative methods incorporating a variety of tools. Results indicated that simulation was effective in reducing student anxiety and improving their knowledge, empathy, communication, and confidence. The summarized qualitative findings all supported the benefit of simulation; however, more research is needed to assess the comparative effectiveness of the types of simulations. Recommendations from the findings include the development of guidelines for educators to deliver each simulation component (briefing, active simulation, debriefing). Finally, consensus around appropriate training of facilitators is needed, as is consistent and agreed upon simulation terminology. © 2017 Australian College of Mental Health Nurses Inc.

  14. X-ray optics simulation and beamline design for the APS upgrade

    NASA Astrophysics Data System (ADS)

    Shi, Xianbo; Reininger, Ruben; Harder, Ross; Haeffner, Dean

    2017-08-01

    The upgrade of the Advanced Photon Source (APS) to a Multi-Bend Achromat (MBA) will increase the brightness of the APS by two to three orders of magnitude. The APS Upgrade (APS-U) project includes a list of feature beamlines that will take full advantage of the new machine, and many of the existing beamlines will also be upgraded to profit from this significant machine enhancement. Optics simulations are essential in the design and optimization of these new and existing beamlines. In this contribution, the simulation tools used and developed at the APS, ranging from analytical to numerical methods, are summarized. Three general optical layouts are compared in terms of their coherence control and focusing capabilities. The concept of zoom optics, where two sets of focusing elements (e.g., CRLs and KB mirrors) are used to provide variable beam sizes at a fixed focal plane, is optimized analytically. The effects of figure errors on the vertical spot size and on the local coherence along the vertical direction of the optimized design are investigated.
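
    For context, the fixed-image-plane ("zoom") behaviour of two thin focusing elements separated by a distance d follows from the standard combined-lens relation (a textbook identity, not the APS-U optimization itself):

        \frac{1}{f} = \frac{1}{f_1} + \frac{1}{f_2} - \frac{d}{f_1 f_2}

    Tuning f_1 and f_2 together (e.g., the number of lenses in the CRL stack and the KB mirror curvature) varies the overall demagnification while the focal plane is held fixed.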

  15. Computational fluid dynamics-habitat suitability index (CFD-HSI) modelling as an exploratory tool for assessing passability of riverine migratory challenge zones for fish

    USGS Publications Warehouse

    Haro, Alexander J.; Chelminski, Michael; Dudley, Robert W.

    2015-01-01

    We developed two-dimensional computational fluid dynamics-habitat suitability index (CFD-HSI) models to identify and qualitatively assess potential zones of shallow water depth and high water velocity that may present passage challenges for five major anadromous fish species in a 2.63-km reach of the main-stem Penobscot River, Maine, as a result of a dam removal downstream of the reach. Suitability parameters were based on distributions of fish lengths and body depths, transformed to cruising, maximum sustained, and sprint swimming speeds. Zones of potential depth and velocity challenges were calculated from the hydraulic models; the ability of fish to pass a challenge zone was based on the percent of the river channel that the contiguous zone spanned and its maximum along-current length. Three river flows (low: 99.1 m³ s⁻¹; normal: 344.9 m³ s⁻¹; and high: 792.9 m³ s⁻¹) were modelled to simulate existing hydraulic conditions and hydraulic conditions following removal of a dam at the downstream boundary of the reach. Potential depth challenge zones were nonexistent for all low-flow simulations of existing conditions for deeper-bodied fishes. Increasing flows under existing conditions, and removal of the dam under all flow conditions, increased the number and size of potential velocity challenge zones, with the effects being more pronounced for smaller species. The two-dimensional CFD-HSI model has utility in demonstrating gross effects of flow and hydraulic alteration, but may not be as precise a predictive tool as a three-dimensional model. Passability of the potential challenge zones cannot be precisely quantified for two- or three-dimensional models due to untested assumptions and incomplete data on fish swimming performance and behaviours.
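
    The zone-identification step reduces to thresholding the modelled depth and velocity fields against species-specific swimming limits; a toy sketch with assumed threshold values and random stand-in fields:

        import numpy as np

        rng = np.random.default_rng(1)
        depth = rng.uniform(0.0, 3.0, (200, 200))    # m, stand-in for model output
        vel = rng.uniform(0.0, 4.0, (200, 200))      # m/s, stand-in for model output

        min_depth, sprint = 0.5, 2.4                 # assumed limits for one species
        challenge = (depth < min_depth) | (vel > sprint)
        print(f"{challenge.mean():.0%} of cells are potential challenge zones")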

  16. A surgical simulation curriculum for senior medical students based on TeamSTEPPS.

    PubMed

    Meier, Andreas H; Boehler, Maggie L; McDowell, Chris M; Schwind, Cathy; Markwell, Steve; Roberts, Nicole K; Sanfey, Hilary

    2012-08-01

    To investigate whether the existing Team Strategies and Tools to Enhance Performance and Patient Safety (TeamSTEPPS) curriculum can effectively teach senior medical students team skills, we conducted a single-group preintervention and postintervention study. We integrated a TeamSTEPPS module into our existing resident readiness elective. The curriculum included interactive didactic sessions, discussion groups, role-plays, and videotaped immersive simulation scenarios. Outcomes were improvement of self-assessment scores, multiple-choice examination scores, and performance ratings of videotaped simulation scenarios before and after the intervention. The videos were rated by masked reviewers using a global rating instrument (TeamSTEPPS) and a more detailed nontechnical skills evaluation tool (NOTECHS). Seventeen students participated and completed the study. The self-evaluation scores improved from 12.76 to 16.06 (P < .001). The increase was significant for all of the TeamSTEPPS competencies and highest for leadership skills (from 2.2 to 3.2; P < .001). The multiple-choice score rose from 84.9% to 94.1% (P < .01). The postintervention video ratings were significantly higher for both instruments (TeamSTEPPS, from 2.99 to 3.56; P < .01; and NOTECHS, from 4.07 to 4.59; P < .001). The curriculum led to improved self-evaluation and multiple-choice scores as well as improved team skills during simulated immersive patient encounters. The TeamSTEPPS framework may be suitable for teaching medical students teamwork concepts and improving their competencies. Larger studies using this framework should be considered to further evaluate the generalizability of our results and the effectiveness of TeamSTEPPS for medical students.

  17. Matlab Geochemistry: An open source geochemistry solver based on MRST

    NASA Astrophysics Data System (ADS)

    McNeece, C. J.; Raynaud, X.; Nilsen, H.; Hesse, M. A.

    2017-12-01

    The study of geological systems often requires the solution of complex geochemical relations. To address this need we present an open source geochemical solver based on the Matlab Reservoir Simulation Toolbox (MRST) developed by SINTEF. The implementation supports non-isothermal multicomponent aqueous complexation, surface complexation, ion exchange, and dissolution/precipitation reactions. The suite of tools available in MRST allows for rapid model development, in particular the incorporation of geochemical calculations into transport simulations with multiple phases, complex domain geometry, and geomechanics. Different numerical schemes and additional physics can be easily incorporated into the existing tools through the object-oriented framework employed by MRST. The solver leverages the automatic differentiation tools available in MRST to solve arbitrarily complex geochemical systems with any choice of species or element concentration as input. Four mathematical approaches make the solver robust: 1) the choice of chemical elements as the basis components makes all entries in the composition matrix positive, thus preserving convexity; 2) a log-variable transformation transfers the nonlinearity to the convex composition matrix; 3) a priori bounds on variables are calculated from the structure of the problem, constraining Newton's path; and 4) an initial guess is calculated implicitly by sequentially adding model complexity. As a benchmark we compare the model to experimental and semi-analytic solutions of the coupled salinity-acidity transport system. Together with the reservoir simulation capabilities of MRST, the solver offers a promising tool for geochemical simulations in reservoir domains, with applications in fields ranging from enhanced oil recovery to radionuclide storage.
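
    The log-variable idea (point 2 above) can be illustrated outside MRST with a toy speciation problem: equilibrating 0.01 M acetic acid in water by Newton-type iteration on u = ln c, so that mass-action laws become linear in the unknowns and concentrations stay positive by construction. A sketch using SciPy (illustrative only, not the MRST solver):

        import numpy as np
        from scipy.optimize import fsolve

        Kw, Ka, T = 1e-14, 1.8e-5, 0.01   # water, acetic acid constants; total acid

        def residuals(u):
            h, oh, ha, a = np.exp(u)       # species: H+, OH-, HA, A-
            return [np.log(h * oh / Kw),   # mass action: linear in log variables
                    np.log(h * a / (ha * Ka)),
                    (ha + a) / T - 1.0,    # mass balance
                    (h - oh - a) / T]      # charge balance

        u = fsolve(residuals, np.log([1e-4] * 4))
        print("pH =", -np.log10(np.exp(u[0])))   # ~3.4 for 0.01 M acetic acid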

  18. Real-time scene and signature generation for ladar and imaging sensors

    NASA Astrophysics Data System (ADS)

    Swierkowski, Leszek; Christie, Chad L.; Antanovskii, Leonid; Gouthas, Efthimios

    2014-05-01

    This paper describes the development of two key functionalities within the VIRSuite scene simulation program, broadening its scene generation capabilities and increasing the accuracy of its thermal signatures. First, a new LADAR scene generation module has been designed, capable of simulating range imagery for Geiger-mode LADAR in addition to the existing functionality for linear-mode systems. Second, a new 3D heat diffusion solver has been developed within the VIRSuite signature prediction module, capable of calculating the temperature distribution in complex three-dimensional objects for enhanced dynamic prediction of thermal signatures. With these enhancements, VIRSuite is now a robust tool for dynamic simulation of missiles with multi-mode seekers.
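
    The heat-diffusion component can be illustrated with a standard explicit (FTCS) finite-difference scheme; a 1D sketch with assumed material values, keeping the time step inside the stability limit dt <= dx^2/(2*alpha). VIRSuite's own 3D solver is more elaborate; this only shows the numerical idea.

        import numpy as np

        alpha, L, n = 1e-5, 0.1, 101       # assumed diffusivity (m^2/s), 0.1 m bar
        dx = L / (n - 1)
        dt = 0.4 * dx * dx / alpha         # inside the 1D stability limit

        T = np.full(n, 300.0)              # K, uniform start
        T[0], T[-1] = 400.0, 300.0         # fixed-temperature boundaries

        for _ in range(2000):
            T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
        print("mid-point temperature:", round(T[n // 2], 1), "K")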

  19. Understanding Building Infrastructure and Building Operation through DOE Asset Score Model: Lessons Learned from a Pilot Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Na; Goel, Supriya; Gorrissen, Willy J.

    2013-06-24

    The U.S. Department of Energy (DOE) is developing a national voluntary energy asset score system to help building owners evaluate the as-built physical characteristics (including the building envelope and the mechanical and electrical systems) and overall building energy efficiency, independent of occupancy and operational choices. The energy asset score breaks down building energy use information by simulating building performance under typical operating and occupancy conditions for a given use type. A web-based modeling tool, the energy asset score tool, facilitates the implementation of the asset score system. The tool consists of a simplified user interface built on a centralized simulation engine (EnergyPlus). It is intended to reduce the implementation cost for users and to increase modeling standardization, compared with an approach that requires users to build their own energy models. A pilot project with forty-two buildings (consisting mostly of offices and schools) was conducted in 2012. This paper reports the findings. Participants were asked to collect a minimum set of building data and enter it into the asset score tool. Participants also provided their utility bills, existing ENERGY STAR scores, and previous energy audit/modeling results if available. The results from the asset score tool were compared with the building energy use data provided by the pilot participants. Three comparisons were performed. First, the actual building energy use, either from utility bills or via ENERGY STAR Portfolio Manager, was compared with the modeled energy use, to examine how well the energy asset score represents a building's system efficiencies and how well it correlates with actual energy consumption. Second, calibrated building energy models (where they existed) were used to examine discrepancies between the asset score model and the pilot buildings' known energy use patterns; this comparison examined end-use breakdowns and more detailed time-series data. Third, ASHRAE 90.1 prototype buildings were used as an industry-standard modeling approach to test the accuracy of the asset score tool. Our analysis showed that the asset score tool, which uses simplified building simulation, can provide results comparable to a more detailed energy model. A building's as-built efficiency can be reflected in the energy asset score, and an analysis of the modeled energy use from the asset score tool against the actual energy use from utility bills can further inform building owners about the effectiveness of their building's operation and maintenance.

  20. SNSEDextend: SuperNova Spectral Energy Distributions extrapolation toolkit

    NASA Astrophysics Data System (ADS)

    Pierel, Justin D. R.; Rodney, Steven A.; Avelino, Arturo; Bianco, Federica; Foley, Ryan J.; Friedman, Andrew; Hicken, Malcolm; Hounsell, Rebekah; Jha, Saurabh W.; Kessler, Richard; Kirshner, Robert; Mandel, Kaisey; Narayan, Gautham; Filippenko, Alexei V.; Scolnic, Daniel; Strolger, Louis-Gregory

    2018-05-01

    SNSEDextend extrapolates core-collapse and Type Ia Spectral Energy Distributions (SEDs) into the UV and IR for use in simulations and photometric classifications. The user provides a library of existing SED templates (such as those in the authors' SN SED Repository) along with new photometric constraints in the UV and/or NIR wavelength ranges. The software then extends the existing template SEDs so their colors match the input data at all phases. SNSEDextend can also extend the SALT2 spectral time-series model for Type Ia SN for a "first-order" extrapolation of the SALT2 model components, suitable for use in survey simulations and photometric classification tools; as the code does not do a rigorous re-training of the SALT2 model, the results should not be relied on for precision applications such as light curve fitting for cosmology.

  1. Preliminary validation of a new methodology for estimating dose reduction protocols in neonatal chest computed radiographs

    NASA Astrophysics Data System (ADS)

    Don, Steven; Whiting, Bruce R.; Hildebolt, Charles F.; Sehnert, W. James; Ellinwood, Jacquelyn S.; Töpfer, Karin; Masoumzadeh, Parinaz; Kraus, Richard A.; Kronemer, Keith A.; Herman, Thomas; McAlister, William H.

    2006-03-01

    The risk of radiation exposure is greatest for pediatric patients and, thus, there is a great incentive to reduce the radiation dose used in diagnostic procedures for children to "as low as reasonably achievable" (ALARA). Testing of low-dose protocols presents a dilemma, as it is unethical to repeatedly expose patients to ionizing radiation in order to determine optimum protocols. To overcome this problem, we have developed a computed-radiography (CR) dose-reduction simulation tool that takes existing images and adds synthetic noise to create realistic images that correspond to images generated with lower doses. The objective of our study was to determine the extent to which simulated low-dose images corresponded with original (non-simulated) low-dose images. To make this determination, we created pneumothoraces of known volumes in five neonate cadavers and obtained images of the neonates at 10 mR, 1 mR, and 0.1 mR (as measured at the cassette plate). The 10-mR exposures were considered "relatively noise-free" images. We used these 10-mR images and our simulation tool to create simulated 0.1- and 1-mR images. For the simulated and original images, we identified regions of interest (ROI) covering the entire chest, a free-in-air region, and the liver. We compared the means and standard deviations of the ROI grey-scale values of the simulated and original images with paired t tests. We also had observers rate simulated and original images for image quality and for the presence or absence of pneumothoraces. There was no statistically significant difference between simulated and original entire-chest ROIs in either the means or the standard deviations of the grey-scale values. The observer performance suggests that an exposure of ≥0.2 mR is required to detect the presence or absence of pneumothoraces. These preliminary results indicate that the simulation tool is promising for achieving ALARA exposures in children.
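
    The core of such a dose-reduction simulation can be sketched as follows: treat the high-dose image as nearly noise-free, scale it to the target exposure, and add the extra quantum (Poisson) noise that the lower dose would have produced. This is a simplified stand-in, with an assumed detector gain, for the authors' calibrated CR noise model.

        import numpy as np

        def simulate_low_dose(img_high, dose_ratio, photons_per_unit=50.0, rng=None):
            """Add synthetic quantum noise so img_high mimics a lower-dose exposure.

            dose_ratio: target dose / original dose (e.g. 0.01 for 10 mR -> 0.1 mR).
            photons_per_unit: assumed gain (photons per grey-level unit).
            """
            rng = rng or np.random.default_rng()
            lam = np.clip(img_high, 0, None) * dose_ratio * photons_per_unit
            return rng.poisson(lam) / (dose_ratio * photons_per_unit)

        img = np.full((256, 256), 100.0)          # stand-in "10 mR" image
        low = simulate_low_dose(img, dose_ratio=0.01)
        print("relative noise:", low.std() / low.mean())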

  2. Update on SLD Engineering Tools Development

    NASA Technical Reports Server (NTRS)

    Miller, Dean R.; Potapczuk, Mark G.; Bond, Thomas H.

    2004-01-01

    The airworthiness authorities (FAA, JAA, Transport Canada) will be releasing a draft rule in the 2006 timeframe concerning the operation of aircraft in a Supercooled Large Droplet (SLD) environment aloft. The draft rule will require aircraft manufacturers to demonstrate that their aircraft can operate safely in an SLD environment for a period of time to facilitate a safe exit from the condition. It is anticipated that aircraft manufacturers will require a capability to demonstrate compliance with this rule via experimental means (icing tunnels or tankers) and by analytical means (ice prediction codes). Since existing icing research facilities and analytical codes were not developed to account for SLD conditions, current engineering tools are not adequate to support compliance activities in SLD conditions. Therefore, existing capabilities need to be augmented to include SLD conditions. In response to this need, NASA and its partners conceived a strategy or Roadmap for developing experimental and analytical SLD simulation tools. Following review and refinement by the airworthiness authorities and other international research partners, this technical strategy has been crystallized into a project plan to guide the SLD Engineering Tool Development effort. This paper will provide a brief overview of the latest version of the project plan and technical rationale, and provide a status of selected SLD Engineering Tool Development research tasks which are currently underway.

  3. Standardizing Exoplanet Analysis with the Exoplanet Characterization Tool Kit (ExoCTK)

    NASA Astrophysics Data System (ADS)

    Fowler, Julia; Stevenson, Kevin B.; Lewis, Nikole K.; Fraine, Jonathan D.; Pueyo, Laurent; Bruno, Giovanni; Filippazzo, Joe; Hill, Matthew; Batalha, Natasha; Wakeford, Hannah; Bushra, Rafia

    2018-06-01

    Exoplanet characterization depends critically on analysis tools, models, and spectral libraries that are constantly under development and have no single source or unified style and methods. The complexity of spectroscopic analysis and the initial time commitment required to become competitive are prohibitive to new researchers entering the field, as well as a remaining obstacle for established groups hoping to contribute in a comparable manner to their peers. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface including tools that address atmospheric characterization, transit observation planning with JWST, JWST coronagraphy simulations, limb darkening, forward modeling, and data reduction, as well as libraries of stellar, planet, and opacity models. The foundation of these software tools and libraries exists within pockets of the exoplanet community, but our project will gather these seedling tools and grow a robust, uniform, and well-maintained exoplanet characterization toolkit.

  4. Methodology for Computational Fluid Dynamic Validation for Medical Use: Application to Intracranial Aneurysm.

    PubMed

    Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui

    2017-12-01

    Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it uses assumptions that simplify the complexities of the real cardiovascular flow. Due to the high stakes in the clinical setting, it is critical to quantify the effect of these assumptions on the CFD simulation results. Existing CFD validation approaches, however, do not quantify the error in simulation results that arises from the CFD solver's modeling assumptions; instead, they directly compare CFD simulation results against validation data. To quantify the accuracy of a CFD solver, we therefore developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out the other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software STAR-CCM+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the STAR-CCM+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset, and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.
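
    The parsing-out step can be written compactly in the spirit of standard validation frameworks such as ASME V&V 20 (the paper's exact bookkeeping may differ). With S the simulation result and D the experimental datum, the comparison error E = S - D decomposes as

        E = \delta_{\mathrm{model}} + \delta_{\mathrm{num}} + \delta_{\mathrm{input}} - \delta_{D}

    so the model error \delta_{\mathrm{model}} is recovered by subtracting the independently quantified numerical, input, and experimental (here, PIV) contributions from E.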

  5. On the evaluation of segmentation editing tools

    PubMed Central

    Heckel, Frank; Moltz, Jan H.; Meine, Hans; Geisler, Benjamin; Kießling, Andreas; D’Anastasi, Melvin; dos Santos, Daniel Pinto; Theruvath, Ashok Joseph; Hahn, Horst K.

    2014-01-01

    Efficient segmentation editing tools are important components in the segmentation process, as no automatic methods exist that always generate sufficient results. Evaluating segmentation editing algorithms is challenging, because their quality depends on the user’s subjective impression. So far, no established methods for an objective, comprehensive evaluation of such tools exist and, particularly, intermediate segmentation results are not taken into account. We discuss the evaluation of editing algorithms in the context of tumor segmentation in computed tomography. We propose a rating scheme to qualitatively measure the accuracy and efficiency of editing tools in user studies. In order to objectively summarize the overall quality, we propose two scores based on the subjective rating and the quantified segmentation quality over time. Finally, a simulation-based evaluation approach is discussed, which allows a more reproducible evaluation without the need for human input. This automated evaluation complements user studies, allowing a more convincing evaluation, particularly during development, where frequent user studies are not possible. The proposed methods have been used to evaluate two dedicated editing algorithms on 131 representative tumor segmentations. We show how the comparison of editing algorithms benefits from the proposed methods. Our results also show the correlation of the suggested quality score with the qualitative ratings. PMID:26158063

  6. Brian: a simulator for spiking neural networks in python.

    PubMed

    Goodman, Dan; Brette, Romain

    2008-01-01

    "Brian" is a new simulator for spiking neural networks, written in Python (http://brian. di.ens.fr). It is an intuitive and highly flexible tool for rapidly developing new models, especially networks of single-compartment neurons. In addition to using standard types of neuron models, users can define models by writing arbitrary differential equations in ordinary mathematical notation. Python scientific libraries can also be used for defining models and analysing data. Vectorisation techniques allow efficient simulations despite the overheads of an interpreted language. Brian will be especially valuable for working on non-standard neuron models not easily covered by existing software, and as an alternative to using Matlab or C for simulations. With its easy and intuitive syntax, Brian is also very well suited for teaching computational neuroscience.

  7. Predicting the excess solubility of acetanilide, acetaminophen, phenacetin, benzocaine, and caffeine in binary water/ethanol mixtures via molecular simulation

    PubMed Central

    Paluch, Andrew S.; Parameswaran, Sreeja; Liu, Shuai; Kolavennu, Anasuya; Mobley, David L.

    2015-01-01

    We present a general framework to predict the excess solubility of small molecular solids (such as pharmaceutical solids) in binary solvents via molecular simulation free energy calculations at infinite dilution with conventional molecular models. The present study used molecular dynamics with the General AMBER Force Field to predict the excess solubility of acetanilide, acetaminophen, phenacetin, benzocaine, and caffeine in binary water/ethanol solvents. The simulations are able to predict the existence of solubility enhancement and the results are in good agreement with available experimental data. The accuracy of the predictions in addition to the generality of the method suggests that molecular simulations may be a valuable design tool for solvent selection in drug development processes. PMID:25637996
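
    For reference, excess solubility is commonly defined relative to a mole-fraction-weighted average of the pure-solvent values (generic notation, not copied from the paper), with the solubilities here obtained from infinite-dilution solvation free energies rather than direct solubility calculations:

        \ln x_3^{E} = \ln x_3^{\mathrm{mix}} - \left( x_1 \ln x_3^{(1)} + x_2 \ln x_3^{(2)} \right)

    where x_3 is the solute mole-fraction solubility and x_1, x_2 are the solute-free solvent mole fractions.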

  8. The STARTEC Decision Support Tool for Better Tradeoffs between Food Safety, Quality, Nutrition, and Costs in Production of Advanced Ready-to-Eat Foods.

    PubMed

    Skjerdal, Taran; Gefferth, Andras; Spajic, Miroslav; Estanga, Edurne Gaston; de Cecare, Alessandra; Vitali, Silvia; Pasquali, Frederique; Bovo, Federica; Manfreda, Gerardo; Mancusi, Rocco; Trevisiani, Marcello; Tessema, Girum Tadesse; Fagereng, Tone; Moen, Lena Haugland; Lyshaug, Lars; Koidis, Anastasios; Delgado-Pando, Gonzalo; Stratakos, Alexandros Ch; Boeri, Marco; From, Cecilie; Syed, Hyat; Muccioli, Mirko; Mulazzani, Roberto; Halbert, Catherine

    2017-01-01

    A prototype decision support IT-tool for the food industry was developed in the STARTEC project. Typical processes and decision steps were mapped using real life production scenarios of participating food companies manufacturing complex ready-to-eat foods. Companies looked for a more integrated approach when making food safety decisions that would align with existing HACCP systems. The tool was designed with shelf life assessments and data on safety, quality, and costs, using a pasta salad meal as a case product. The process flow chart was used as starting point, with simulation options at each process step. Key parameters like pH, water activity, costs of ingredients and salaries, and default models for calculations of Listeria monocytogenes , quality scores, and vitamin C, were placed in an interactive database. Customization of the models and settings was possible on the user-interface. The simulation module outputs were provided as detailed curves or categorized as "good"; "sufficient"; or "corrective action needed" based on threshold limit values set by the user. Possible corrective actions were suggested by the system. The tool was tested and approved by end-users based on selected ready-to-eat food products. Compared to other decision support tools, the STARTEC-tool is product-specific and multidisciplinary and includes interpretation and targeted recommendations for end-users.

  9. The STARTEC Decision Support Tool for Better Tradeoffs between Food Safety, Quality, Nutrition, and Costs in Production of Advanced Ready-to-Eat Foods

    PubMed Central

    Gefferth, Andras; Spajic, Miroslav; Estanga, Edurne Gaston; Vitali, Silvia; Pasquali, Frederique; Bovo, Federica; Manfreda, Gerardo; Mancusi, Rocco; Tessema, Girum Tadesse; Fagereng, Tone; Moen, Lena Haugland; Lyshaug, Lars; Koidis, Anastasios; Delgado-Pando, Gonzalo; Stratakos, Alexandros Ch.; Boeri, Marco; From, Cecilie; Syed, Hyat; Muccioli, Mirko; Mulazzani, Roberto; Halbert, Catherine

    2017-01-01

    A prototype decision support IT-tool for the food industry was developed in the STARTEC project. Typical processes and decision steps were mapped using real life production scenarios of participating food companies manufacturing complex ready-to-eat foods. Companies looked for a more integrated approach when making food safety decisions that would align with existing HACCP systems. The tool was designed with shelf life assessments and data on safety, quality, and costs, using a pasta salad meal as a case product. The process flow chart was used as starting point, with simulation options at each process step. Key parameters like pH, water activity, costs of ingredients and salaries, and default models for calculations of Listeria monocytogenes, quality scores, and vitamin C, were placed in an interactive database. Customization of the models and settings was possible on the user-interface. The simulation module outputs were provided as detailed curves or categorized as “good”; “sufficient”; or “corrective action needed” based on threshold limit values set by the user. Possible corrective actions were suggested by the system. The tool was tested and approved by end-users based on selected ready-to-eat food products. Compared to other decision support tools, the STARTEC-tool is product-specific and multidisciplinary and includes interpretation and targeted recommendations for end-users. PMID:29457031
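
    The traffic-light style output described above amounts to banding each simulated quantity against user-set threshold limit values; a schematic sketch (function and threshold names are illustrative, not the tool's API):

        def categorize(value, good_limit, sufficient_limit):
            """Band a simulated output, e.g. predicted L. monocytogenes count,
            where larger values are worse (limits are user-set, assumed here)."""
            if value <= good_limit:
                return "good"
            if value <= sufficient_limit:
                return "sufficient"
            return "corrective action needed"

        print(categorize(1.8, good_limit=2.0, sufficient_limit=3.0))  # log CFU/g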

  10. Using a simulation assistant in modeling manufacturing systems

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.

    1988-01-01

    Numerous simulation languages exist for modeling discrete event processes, and many have now been ported to microcomputers. Graphic and animation capabilities have been added to many of these languages to help users build models and evaluate simulation results. Even with these languages and added features, the user is still burdened with learning the simulation language. Furthermore, the time to construct and then validate the simulation model is always greater than originally anticipated. One approach to minimizing this time requirement is to use pre-defined macros that describe common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on its overall structure, its elements, and the five manufacturing simulation generators. A typical manufacturing system is modeled using the simulation assistant, and the advantages and disadvantages are discussed.

  11. Testing forward model against OCO-2 and TANSO-FTS/GOSAT observed spectra in near infrared range

    NASA Astrophysics Data System (ADS)

    Zadvornykh, Ilya V.; Gribanov, Konstantin G.

    2015-11-01

    An existing software package, FIRE-ARMS (Fine InfraRed Explorer for Atmospheric Remote MeasurementS), was modified by embedding the vector radiative transfer model VLIDORT, so that the tool now covers both the thermal infrared (TIR) and near-infrared (NIR) regions. We performed forward simulations of near-infrared top-of-atmosphere spectra of outgoing radiation, accounting for multiple scattering in a cloudless atmosphere. Simulated spectra are compared with spectra measured by TANSO-FTS/GOSAT and OCO-2 under cloudless conditions over Western Siberia. NCEP/NCAR reanalysis data were used to complete the model atmosphere.

  12. ModFossa: A library for modeling ion channels using Python.

    PubMed

    Ferneyhough, Gareth B; Thibealut, Corey M; Dascalu, Sergiu M; Harris, Frederick C

    2016-06-01

    The creation and simulation of ion channel models using continuous-time Markov processes is a powerful and well-used tool in the field of electrophysiology and ion channel research. While several software packages exist for the purpose of ion channel modeling, most are GUI based, and none are available as a Python library. In an attempt to provide an easy-to-use, yet powerful Markov model-based ion channel simulator, we have developed ModFossa, a Python library supporting easy model creation and stimulus definition, complete with a fast numerical solver, and attractive vector graphics plotting.
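
    The underlying mathematics is a continuous-time Markov chain over channel states; a generic two-state (closed/open) sketch using the matrix exponential, independent of the ModFossa API and with assumed rate constants:

        import numpy as np
        from scipy.linalg import expm

        alpha, beta = 0.5, 0.2                 # assumed C->O and O->C rates (1/ms)
        Q = np.array([[-alpha, alpha],         # generator matrix: rows sum to zero
                      [ beta, -beta]])

        p = np.array([1.0, 0.0])               # all channels closed at t = 0
        for t in (0.0, 1.0, 5.0, 20.0):        # ms
            print(t, p @ expm(Q * t))          # occupancy; P(open) -> a/(a+b)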

  13. Integrating the simulation of domestic water demand behaviour to an urban water model using agent based modelling

    NASA Astrophysics Data System (ADS)

    Koutiva, Ifigeneia; Makropoulos, Christos

    2015-04-01

    The urban water system's sustainable evolution requires tools that can analyse and simulate the complete cycle, including both the physical and the cultural environment. One of the main challenges in this regard is the design and development of tools that can simulate society's water demand behaviour and the way policy measures affect it. The effects of these policy measures are a function of personal opinions that lead to the formation of people's attitudes, and these attitudes eventually form behaviours. This work presents the design of an ABM tool for addressing the social dimension of the urban water system. The created tool, called the Urban Water Agents' Behaviour (UWAB) model, was implemented using the NetLogo agent programming language. The main aim of the UWAB model is to capture the effects of policies and environmental pressures on the water conservation behaviour of urban households. The model consists of agents representing urban households, linked to each other in a social network that influences the water conservation behaviour of its members. Household agents are also influenced by policies and environmental pressures, such as drought. The UWAB model simulates the evolution of water conservation behaviour within an urban population; its final outcome is the evolution of the distribution of conservation levels (no, low, high) across the selected population (see the sketch below). In addition, UWAB is implemented in combination with an existing urban water management simulation tool, the Urban Water Optioneering Tool (UWOT), to create a modelling platform aiming to facilitate an adaptive approach to water resources management. For the purposes of this modelling platform, UWOT is used in a twofold manner: (1) to simulate the evolution of domestic water demand and (2) to simulate the response of the water system to that evolution. The main advantage of the UWAB-UWOT integration is that it allows investigation of the effects of different water demand management strategies on an urban population's water demand behaviour and, ultimately, of the effects of these policies on the volume of domestic water demand and the water resources system. The proposed modelling platform is calibrated to simulate the effects of water policies during the Athens drought period of 1988-1994, and the calibrated platform is then applied to evaluate scenarios of water supply, water demand, and water demand management strategies.
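
    The social-influence mechanism can be sketched in a few lines of generic agent-based code (illustrative only; UWAB itself is written in NetLogo and is far richer): each household raises its conservation level with a probability driven by its neighbours' levels plus an external policy/drought pressure.

        import random

        random.seed(1)
        n, steps = 200, 50
        neighbours = [random.sample(range(n), 4) for _ in range(n)]  # random links
        level = [0] * n                # 0 = no, 1 = low, 2 = high conservation
        pressure = 0.05                # assumed policy/drought effect per step

        for _ in range(steps):
            nxt = level[:]
            for i in range(n):
                peer = sum(level[j] for j in neighbours[i]) / (2 * len(neighbours[i]))
                if level[i] < 2 and random.random() < 0.5 * peer + pressure:
                    nxt[i] = level[i] + 1
            level = nxt

        print({v: level.count(v) for v in (0, 1, 2)})   # final distribution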

  14. Learning to Make Change Happen in Chinese Schools: Adapting a Problem-Based Computer Simulation for Developing School Leaders

    ERIC Educational Resources Information Center

    Hallinger, Philip; Shaobing, Tang; Jiafang, Lu

    2017-01-01

    School leader training has become a critical strategy in educational reform. However, in China, there still exists a big gap in terms of how to transfer leadership knowledge into practice. Thus, tools that can integrate formal knowledge into practice are called for urgently in school leader training. This paper presents the results of a research…

  15. A Multirate Control Strategy to the Slow Sensors Problem: An Interactive Simulation Tool for Controller Assisted Design

    PubMed Central

    Salt, Julián; Cuenca, Ángel; Palau, Francisco; Dormido, Sebastián

    2014-01-01

    In many control applications, the sensor technology used to measure the controlled variable cannot maintain a restricted sampling period. In this context, the assumption of a regular and uniform sampling pattern is questionable. Moreover, if the control action can be updated faster than the output measurement frequency in order to achieve the desired closed-loop behaviour, the solution is usually a multirate controller. Several known aspects require care when designing a multirate (MR) system: the proper multiplicity between input and output sampling periods, the proper controller structure, the possible existence of ripples, and other issues. A useful way to save time and achieve good results is a computer-assisted design tool, and an interactive simulation tool for dealing with MR systems is a natural fit. This paper presents such a simulation application. It allows an easy understanding of the performance degradation or improvement obtained when changing the multirate sampling pattern parameters. The tool was developed using Sysquake, a Matlab-like language with fast execution and powerful graphic facilities, and it can be delivered as an executable. The paper also includes a detailed explanation of MR treatment and presents the design of four different MR controllers with a flexible structure that can be adapted to different schemes. The Smith predictor in these MR schemes is also explained, justified, and used when time delays appear. Finally, some interesting observations obtained using this interactive tool are included.
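
    The slow-sensor situation the tool addresses can be reproduced in a few lines: the controller updates every fast period h while a new measurement only arrives every N·h and is held in between. The plant, gains, and rates below are assumed for illustration.

        import numpy as np

        h, N, steps, ref = 0.01, 10, 3000, 1.0
        a, b = np.exp(-h), 1.0 - np.exp(-h)   # ZOH discretisation of x' = -x + u

        x = y_held = integ = 0.0
        for k in range(steps):
            if k % N == 0:                    # slow sensor: new sample every N ticks
                y_held = x
            e = ref - y_held
            integ += e * h
            u = 2.0 * e + 1.0 * integ         # PI law updated at the fast rate
            x = a * x + b * u
        print("output after", steps * h, "s:", round(x, 3))   # settles near ref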

  16. Rule-based modeling with Virtual Cell

    PubMed Central

    Schaff, James C.; Vasilescu, Dan; Moraru, Ion I.; Loew, Leslie M.; Blinov, Michael L.

    2016-01-01

    Summary: Rule-based modeling is invaluable when the number of possible species and reactions in a model become too large to allow convenient manual specification. The popular rule-based software tools BioNetGen and NFSim provide powerful modeling and simulation capabilities at the cost of learning a complex scripting language which is used to specify these models. Here, we introduce a modeling tool that combines new graphical rule-based model specification with existing simulation engines in a seamless way within the familiar Virtual Cell (VCell) modeling environment. A mathematical model can be built integrating explicit reaction networks with reaction rules. In addition to offering a large choice of ODE and stochastic solvers, a model can be simulated using a network free approach through the NFSim simulation engine. Availability and implementation: Available as VCell (versions 6.0 and later) at the Virtual Cell web site (http://vcell.org/). The application installs and runs on all major platforms and does not require registration for use on the user’s computer. Tutorials are available at the Virtual Cell website and Help is provided within the software. Source code is available at Sourceforge. Contact: vcell_support@uchc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27497444

  17. Simulation as a planning tool for job-shop production environment

    NASA Astrophysics Data System (ADS)

    Maram, Venkataramana; Nawawi, Mohd Kamal Bin Mohd; Rahman, Syariza Abdul; Sultan, Sultan Juma

    2015-12-01

    In this paper, we use the discrete event simulation software ARENA® as a planning tool for a job-shop production environment. We considered a job shop that produces three types of Jigs with different sequences of operations, with the aim of studying and improving shop-floor performance. The purpose of the study is to identify options for improving machine utilization and reducing job waiting times at bottleneck machines. First, the performance of the existing system was evaluated using ARENA®. Improvement opportunities were then identified by analyzing the base-system results, and the model was updated with the most economical options. The proposed system outperforms the current base system: delay times at the paint shop improve by 816% when its capacity is increased from 2 to 3, and Jig cycle times are reduced by 92% (Jig1), 65% (Jig2), and 41% (Jig3); the new proposal was therefore recommended.

  18. Application Of Moldex3D For Thin-wall Injection Moulding Simulation

    NASA Astrophysics Data System (ADS)

    Šercer, Mladen; Godec, Damir; Bujanić, Božo

    2007-05-01

    The benefits associated with decreasing wall thicknesses below their current values are still measurable and desirable, even if the final wall thickness is nowhere near those of the aggressive portable electronics industry. It is important to note that gains from wall-section reduction do not always come without investment, in this case in tooling and machinery upgrades. Equally important is the fact that the productivity and performance benefits of reduced material usage, fast cycle times, and lighter weight can often outweigh most of the added costs. In order to eliminate unnecessary mould trials, shorten the product development cycle, reduce overall costs, and improve product quality, polymer engineers use CAE (Computer Aided Engineering) technology. This technology is a simulation tool which combines proven theories, material properties, and process conditions to generate realistic simulations and produce valuable recommendations. Based on these recommendations, an optimal combination of product design, material, and process conditions can be identified. In this work, Moldex3D software was used to simulate injection moulding in order to avoid potential moulding problems. The results of the simulation were used to optimize an existing product design, to develop the mould, and to optimize processing parameters, e.g., injection pressure and mould cavity temperature.

  19. Comparing Natural Gas Leakage Detection Technologies Using an Open-Source "Virtual Gas Field" Simulator.

    PubMed

    Kemp, Chandler E; Ravikumar, Arvind P; Brandt, Adam R

    2016-04-19

    We present a tool for modeling the performance of methane leak detection and repair programs that can be used to evaluate the effectiveness of detection technologies and proposed mitigation policies. The tool uses a two-state Markov model to simulate the evolution of methane leakage from an artificial natural gas field. Leaks are created stochastically, drawing from the current understanding of the frequency and size distributions at production facilities. Various leak detection and repair programs can be simulated to determine the rate at which each would identify and repair leaks. Integrating the methane leakage over time enables a meaningful comparison between technologies, using both economic and environmental metrics. We simulate four existing or proposed detection technologies: flame ionization detection, manual infrared camera, automated infrared drone, and distributed detectors. Comparing these four technologies, we found that over 80% of simulated leakage could be mitigated with a positive net present value, although the maximum benefit is realized by selectively targeting larger leaks. Our results show that low-cost leak detection programs can rely on high-cost technology, as long as it is applied in a way that allows for rapid detection of large leaks. Any strategy to reduce leakage should require a careful consideration of the differences between low-cost technologies and low-cost programs.
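
    A stripped-down version of such a two-state (leaking/repaired) simulator, with assumed leak-initiation rate, size distribution, and survey policy (the open-source tool draws its distributions from field data):

        import random

        random.seed(0)
        n_wells, dt, t_end = 500, 1.0, 3650.0          # days
        leak_rate = 1 / 3650.0                          # assumed: ~1 leak/well/10 yr
        survey_interval, min_detectable = 180.0, 0.5    # days; kg/h (assumed)

        leaks, emitted, t = [], 0.0, 0.0                # active leak sizes in kg/h
        while t < t_end:
            for _ in range(n_wells):
                if random.random() < leak_rate * dt:    # stochastic leak creation
                    leaks.append(random.lognormvariate(-1.0, 1.5))  # assumed sizes
            emitted += sum(leaks) * 24 * dt             # kg emitted this step
            t += dt
            if t % survey_interval < dt:                # survey repairs big leaks
                leaks = [q for q in leaks if q < min_detectable]
        print(round(emitted / 1000, 1), "t CH4 emitted;", len(leaks), "active leaks")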

  20. Haptic feedback improves surgeons' user experience and fracture reduction in facial trauma simulation.

    PubMed

    Girod, Sabine; Schvartzman, Sara C; Gaudilliere, Dyani; Salisbury, Kenneth; Silva, Rebeka

    2016-01-01

    Computer-assisted surgical (CAS) planning tools are available for craniofacial surgery, but are usually based on computer-aided design (CAD) tools that lack the ability to detect the collision of virtual objects (i.e., fractured bone segments). We developed a CAS system featuring a sense of touch (haptic) that enables surgeons to physically interact with individual, patient-specific anatomy and immerse in a three-dimensional virtual environment. In this study, we evaluated initial user experience with our novel system compared to an existing CAD system. Ten surgery resident trainees received a brief verbal introduction to both the haptic and CAD systems. Users simulated mandibular fracture reduction in three clinical cases within a 15 min time limit for each system and completed a questionnaire to assess their subjective experience. We compared standard landmarks and linear and angular measurements between the simulated results and the actual surgical outcome and found that haptic simulation results were not significantly different from actual postoperative outcomes. In contrast, CAD results significantly differed from both the haptic simulation and actual postoperative results. In addition to enabling a more accurate fracture repair, the haptic system provided a better user experience than the CAD system in terms of intuitiveness and self-reported quality of repair.

  1. Integration of SimSET photon history generator in GATE for efficient Monte Carlo simulations of pinhole SPECT.

    PubMed

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J S; Tsui, Benjamin M W

    2008-07-01

    The authors developed and validated an efficient Monte Carlo simulation (MCS) workflow to facilitate small animal pinhole SPECT imaging research. This workflow seamlessly integrates two existing MCS tools: the simulation system for emission tomography (SimSET) and the GEANT4 application for tomographic emission (GATE). Specifically, we retained the strength of GATE in describing complex collimator/detector configurations to meet the anticipated needs for studying advanced pinhole collimation (e.g., multipinhole) geometry, while inserting the fast SimSET photon history generator (PHG) to circumvent the relatively slow GEANT4 MCS code used by GATE for simulating photon interactions inside voxelized phantoms. For validation, data generated from this new SimSET-GATE workflow were compared with those from GATE-only simulations as well as with experimental measurements obtained using a commercial small animal pinhole SPECT system. Our results showed excellent agreement (e.g., in system point response functions and energy spectra) between SimSET-GATE and GATE-only simulations and, more importantly, a significant computational speedup (up to approximately 10-fold) provided by the new workflow. Satisfactory agreement between MCS results and experimental data was also observed. In conclusion, the authors have successfully integrated the SimSET photon history generator into GATE for fast and realistic pinhole SPECT simulations, which can facilitate research in, for example, the development and application of quantitative pinhole and multipinhole SPECT for small animal imaging. This integrated simulation tool can also be adapted for studying other preclinical and clinical SPECT techniques.

  2. Estimating rare events in biochemical systems using conditional sampling.

    PubMed

    Sundar, V S

    2017-01-28

    The paper focuses on the development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining this probability using brute-force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
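
    A self-contained toy version of subset simulation illustrates the product-of-conditional-probabilities estimator, with a standard normal vector standing in for the mapped trajectory of the stochastic simulation algorithm; the response function, level fraction p0 and proposal width below are illustrative choices, not those of the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        g = lambda u: u.sum() / np.sqrt(u.size)   # toy response; rare event: g(U) > y*

        def subset_sim(dim=10, y_star=4.0, n=1000, p0=0.1):
            u = rng.standard_normal((n, dim))
            y = np.apply_along_axis(g, 1, u)
            p = 1.0
            while True:
                level = np.quantile(y, 1.0 - p0)       # intermediate threshold
                if level >= y_star:                    # final level reached
                    return p * np.mean(y > y_star)
                p *= p0                                # P(next level | current) = p0
                seeds = u[y > level]
                samples = []
                for s in seeds:                        # modified Metropolis-Hastings chains
                    x = s.copy()
                    for _ in range(n // len(seeds)):
                        cand = x + 0.5 * rng.standard_normal(dim)
                        ratio = np.exp(0.5 * (x**2 - cand**2))   # componentwise N(0,1) ratio
                        prop = np.where(rng.random(dim) < ratio, cand, x)
                        if g(prop) > level:            # accept only inside the conditional region
                            x = prop
                        samples.append(x.copy())
                u = np.array(samples[:n])
                y = np.apply_along_axis(g, 1, u)

        print(f"P(g > 4) ~ {subset_sim():.2e}  (exact 1 - Phi(4) = 3.17e-05)")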

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dierickx, Marion I. P.; Loeb, Abraham, E-mail: mdierickx@cfa.harvard.edu, E-mail: aloeb@cfa.harvard.edu

    The extensive span of the Sagittarius (Sgr) stream makes it a promising tool for studying the gravitational potential of the Milky Way (MW). Characterizing its stellar kinematics can constrain halo properties and provide a benchmark for the paradigm of galaxy formation from cold dark matter. Accurate models of the disruption dynamics of the Sgr progenitor are necessary to employ this tool. Using a combination of analytic modeling and N-body simulations, we build a new model of the Sgr orbit and resulting stellar stream. In contrast to previous models, we simulate the full infall trajectory of the Sgr progenitor from the time it first crossed the MW virial radius 8 Gyr ago. An exploration of the parameter space of initial phase-space conditions yields tight constraints on the angular momentum of the Sgr progenitor. Our best-fit model is the first to accurately reproduce existing data on the 3D positions and radial velocities of the debris detected 100 kpc away in the MW halo. In addition to replicating the mapped stream, the simulation also predicts the existence of several arms of the Sgr stream extending to hundreds of kiloparsecs. The two most distant stars known in the MW halo coincide with the predicted structure. Additional stars in the newly predicted arms can be found with future data from the Large Synoptic Survey Telescope. Detecting a statistical sample of stars in the most distant Sgr arms would provide an opportunity to constrain the MW potential out to unprecedented Galactocentric radii.

  4. Micromagnetic computer simulations of spin waves in nanometre-scale patterned magnetic elements

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Koog

    2010-07-01

    Current needs for further advances in the nanotechnologies of information-storage and -processing devices have attracted a great deal of interest in spin (magnetization) dynamics in nanometre-scale patterned magnetic elements. For instance, the unique dynamic characteristics of non-uniform magnetic microstructures such as various types of domain walls, magnetic vortices and antivortices, as well as spin wave dynamics in laterally restricted thin-film geometries, have been at the centre of extensive and intensive research. Understanding the fundamentals of their unique spin structures as well as their robust and novel dynamic properties allows us to implement new functionalities into existing or future devices. Although experimental tools and theoretical approaches are effective means of understanding the fundamentals of spin dynamics and of gaining new insights into them, the limitations of those same tools and approaches have left gaps of unresolved questions in the pertinent physics. As an alternative, however, micromagnetic modelling and numerical simulation have recently emerged as a powerful tool for the study of a variety of phenomena related to the spin dynamics of nanometre-scale magnetic elements. In this review paper, I summarize recent results of simulations of the excitation, propagation and other novel wave characteristics of spin waves, highlighting how the micromagnetic computer simulation approach contributes to an understanding of the spin dynamics of nanomagnetism and considering some of the merits of numerical simulation studies. Many examples of micromagnetic modelling for numerical calculations, employing various dimensions and shapes of patterned magnetic elements, are given. The current limitations of continuum micromagnetic modelling and of simulations based on the Landau-Lifshitz-Gilbert equation of motion of magnetization are also discussed, along with further research directions for spin-wave studies.
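
    Such simulations integrate the Landau-Lifshitz-Gilbert (LLG) equation numerically for every cell of a discretized element. As a minimal illustration, the sketch below steps a single macrospin (one cell, with the exchange and demagnetizing contributions omitted) relaxing toward an applied field; the damping constant, field strength and time step are assumed values.

        import numpy as np

        GAMMA = 2.211e5        # gyromagnetic ratio [m/(A s)]
        ALPHA = 0.01           # Gilbert damping constant (assumed)

        def llg_rhs(m, h_eff):
            """Landau-Lifshitz-Gilbert right-hand side for a unit vector m."""
            prec = -GAMMA * np.cross(m, h_eff)                       # precession term
            damp = -ALPHA * GAMMA * np.cross(m, np.cross(m, h_eff))  # damping term
            return (prec + damp) / (1.0 + ALPHA**2)

        m = np.array([1.0, 0.0, 0.0])       # initial magnetization along x
        h = np.array([0.0, 0.0, 4.0e4])     # applied field along z [A/m] (assumed)
        dt = 5.0e-13                        # explicit time step [s]
        for _ in range(40000):              # 20 ns of damped precession
            m = m + dt * llg_rhs(m, h)
            m /= np.linalg.norm(m)          # keep |m| = 1
        print("final magnetization:", np.round(m, 4))

    A full micromagnetic code repeats this update over millions of exchange-coupled cells, which is what makes the simulations reviewed here both powerful and computationally demanding.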

  5. Damping parameter study of a perforated plate with bias flow

    NASA Astrophysics Data System (ADS)

    Mazdeh, Alireza

    One of the main impediments to successful operation of combustion systems in industrial and aerospace applications, including gas turbines, ramjets, rocket motors, afterburners (augmenters) and even large heaters and boilers, is dynamic instability, also known as thermo-acoustic instability. Concerns with this ongoing problem have grown with the introduction of the Lean Premixed Combustion (LPC) systems developed to address the environmental concerns associated with conventional combustion systems. The most common way to mitigate thermo-acoustic instability is to add acoustic damping to the combustor using acoustic liners. Recently, the damping properties of bias flow, initially introduced to liners only for cooling purposes, have been recognized and proven to be an asset in enhancing the damping effectiveness of liners. Acoustic liners are currently designed using empirical design rules followed by build-test-improve steps, essentially by trial and error. There are growing concerns about the lack of reliability associated with the experimental evaluation of acoustic liners with small apertures, and the development of physics-based tools to assist the design of such liners has become of great interest to practitioners. This dissertation focuses primarily on how Large-Eddy Simulation (LES) or similar techniques, such as Scale-Adaptive Simulation (SAS), can be used to characterize the damping properties of bias flow. The dissertation also reviews the assumptions made in existing analytical, semi-empirical and numerical models, provides criteria to rank-order the existing models, and identifies the best existing theoretical model. Flow field calculations by LES provide good insight into the mechanisms that lead to acoustic damping. Comparison of simulation results with empirical and analytical studies shows that LES is a viable alternative to empirical and analytical methods and can accurately predict the damping behavior of liners. Currently the role of LES in research on the damping properties of liners is limited to validating other empirical or theoretical approaches. This research has shown that LES can go beyond that: it can be used to perform parametric studies characterizing the sensitivity of the acoustic properties of multi-perforated liners to changes in geometry and flow conditions, and thus serve as a tool for designing acoustic liners. The conducted research provides an insightful understanding of the contributions of different flow and geometry parameters such as perforated plate thickness, aperture radius, porosity factor and bias flow velocity. While the study agrees with previous observations obtained by analytical or experimental methods, it also quantifies the impact of these parameters on the acoustic impedance of the perforated plate, a key parameter in determining the acoustic performance of any system. The study has also explored the limitations and capabilities of commercial tools when applied to simulations of the damping properties of liners. The overall agreement between LES results and previous studies shows that commercial tools can be used effectively for these applications under certain conditions.

  6. A software system for the simulation of chest lesions

    NASA Astrophysics Data System (ADS)

    Ryan, John T.; McEntee, Mark; Barrett, Saoirse; Evanoff, Michael; Manning, David; Brennan, Patrick

    2007-03-01

    We report on the development of a novel software tool for the simulation of chest lesions. This software tool was developed for use in our study to determine optimal ambient lighting conditions for chest radiology, which involved 61 consultant radiologists from the American Board of Radiology. Because of its success, we intend to use the same tool for future studies. The software has two main functions: the simulation of lesions and the retrieval of information for ROC (Receiver Operating Characteristic) and JAFROC (Jack-Knife Free Response ROC) analysis. The simulation layer operates by randomly selecting an image from a bank of reportedly normal chest x-rays. A random location is then generated for each lesion and checked against a reference lung-map. If the location is within the lung fields, as derived from the lung-map, a lesion is superimposed. Lesions themselves are randomly selected from a bank of manually created chest lesion images. A blending algorithm determines the intensity levels at which the lesion sits most naturally within the chest x-ray. The same software was used to run the study for all 61 radiologists: a sequence of images is displayed in random order, half containing simulated lesions ranging from subtle to obvious and half normal. The operator then selects locations where he or she thinks lesions exist and grades each lesion accordingly. We found this software to be very effective in the study and intend to use the same principles in future studies.
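
    The placement logic described above (random location, lung-map check, blended superimposition) can be sketched as follows. The array names, blend weight and the simple additive blend are illustrative stand-ins; the study's blending algorithm chose intensity levels adaptively.

        import numpy as np

        rng = np.random.default_rng(42)

        def simulate_lesion(chest, lung_map, lesion, alpha=0.35, max_tries=100):
            """Superimpose a lesion template on a normal chest image at a random
            location constrained to the lung fields (all names are illustrative)."""
            H, W = chest.shape
            h, w = lesion.shape
            for _ in range(max_tries):
                r = rng.integers(0, H - h)
                c = rng.integers(0, W - w)
                # accept the location only if it falls entirely inside the lung fields
                if lung_map[r:r + h, c:c + w].all():
                    patch = chest[r:r + h, c:c + w]
                    # simple additive blend as a stand-in for the intensity-matching step
                    chest[r:r + h, c:c + w] = np.clip(patch + alpha * lesion, 0.0, 1.0)
                    return (r, c)
            return None

        chest = rng.random((512, 512))            # placeholder "normal" radiograph
        lung_map = np.zeros((512, 512), dtype=bool)
        lung_map[100:400, 80:230] = True          # crude left lung field
        lung_map[100:400, 280:430] = True         # crude right lung field
        lesion = np.exp(-((np.indices((21, 21)) - 10) ** 2).sum(0) / 30.0)  # Gaussian blob
        print("lesion placed at:", simulate_lesion(chest, lung_map, lesion))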

  7. Design tool for estimating chemical hydrogen storage system characteristics for light-duty fuel cell vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooks, Kriston P.; Sprik, Samuel J.; Tamburello, David A.

    The U.S. Department of Energy (DOE) has developed a vehicle framework model to simulate fuel cell-based light-duty vehicle operation for various hydrogen storage systems. This transient model simulates the performance of the storage system, fuel cell, and vehicle for comparison to DOE's Technical Targets using four drive cycles/profiles. Chemical hydrogen storage models have been developed for the Framework model for both exothermic and endothermic materials. Despite the utility of such models, they require that material researchers input system design specifications that cannot be easily estimated. To address this challenge, a design tool has been developed that allows researchers to directly enter kinetic and thermodynamic chemical hydrogen storage material properties into a simple sizing module that then estimates the system parameters required to run the storage system model. Additionally, this design tool can be used as a standalone executable file to estimate the storage system mass and volume outside of the framework model and compare them to the DOE Technical Targets. These models will be explained and exercised with existing hydrogen storage materials.
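
    The standalone sizing logic amounts to scaling the storage material mass and volume from the hydrogen demand and adding balance-of-plant estimates, roughly as in the sketch below; every number is an illustrative assumption, not a value or model from the tool.

        h2_required = 5.6          # kg usable H2 for the drive cycles (illustrative)
        wt_capacity = 0.10         # kg H2 released per kg storage material (assumed)
        density = 1000.0           # storage material density [kg/m^3] (assumed)
        bop_mass = 45.0            # balance-of-plant mass [kg] (assumed)
        bop_volume = 0.040         # balance-of-plant volume [m^3] (assumed)

        material_mass = h2_required / wt_capacity
        system_mass = material_mass + bop_mass
        system_volume = material_mass / density + bop_volume
        print(f"gravimetric capacity: {h2_required / system_mass:.3f} kg H2 / kg system")
        print(f"volumetric capacity:  {h2_required / system_volume:.1f} kg H2 / m^3 system")

    The resulting capacities are what would be compared against the DOE Technical Targets; the real module additionally sizes reactors, heat exchangers and flow rates from the entered kinetic and thermodynamic properties.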

  8. Progress in modeling and simulation.

    PubMed

    Kindler, E

    1998-01-01

    For the modeling of systems, computers are used more and more, while the other "media" (including the human intellect) that carry models are being abandoned. For the modeling of knowledge, i.e. of more or less general concepts (possibly used to model systems composed of instances of such concepts), object-oriented programming is nowadays widely used. For the modeling of processes that exist and develop in time, computer simulation is used, the results of which are often presented by means of animation (graphical pictures moving and changing in time). Unfortunately, object-oriented programming tools are commonly not designed to be of great use for simulation, while programming tools for simulation do not enable their users to apply the advantages of object-oriented programming. Nevertheless, there are exceptions that make it possible to use general concepts represented at a computer for constructing simulation models and for modifying them easily. They are described in the present paper, together with proper definitions of modeling, simulation and object-oriented programming (including cases that do not satisfy the definitions and risk introducing misunderstanding), an outline of their applications, and a view of their further development. Given that computing systems are being introduced as control components into a large spectrum of (technological, social and biological) systems, attention is directed to models of systems that themselves contain modeling components.

  9. Transforming BIM to BEM: Generation of Building Geometry for the NASA Ames Sustainability Base BIM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Donnell, James T.; Maile, Tobias; Rose, Cody

    Typical processes of whole Building Energy simulation Model (BEM) generation are subjective, labor intensive, time intensive and error prone. Essentially, these typical processes reproduce already existing data, i.e. building models already created by the architect. Accordingly, Lawrence Berkeley National Laboratory (LBNL) developed a semi-automated process that enables reproducible conversions of Building Information Model (BIM) representations of building geometry into a format required by building energy modeling (BEM) tools. This is a generic process that may be applied to all building energy modeling tools but to date has only been used for EnergyPlus. This report describes and demonstrates each stage in the semi-automated process for building geometry using the recently constructed NASA Ames Sustainability Base throughout. This example uses ArchiCAD (Graphisoft, 2012) as the originating CAD tool and EnergyPlus as the concluding whole building energy simulation tool. It is important to note that the process is also applicable for professionals that use other CAD tools such as Revit (“Revit Architecture,” 2012) and DProfiler (Beck Technology, 2012) and can be extended to provide geometry definitions for BEM tools other than EnergyPlus. The Geometry Simplification Tool (GST) was used during the NASA Ames project and was the enabling software that facilitated semi-automated data transformations. GST has now been superseded by the Space Boundary Tool (SBT-1) and will be referred to as SBT-1 throughout this report. The benefits of this semi-automated process are fourfold: 1) reduce the amount of time and cost required to develop a whole building energy simulation model, 2) enable rapid generation of design alternatives, 3) improve the accuracy of BEMs, and 4) result in significantly better performing buildings with significantly lower energy consumption than those created using the traditional design process, especially if the simulation model was used as a predictive benchmark during operation. Developing BIM-based criteria to support the semi-automated process should result in significant, reliable improvements and time savings in the development of BEMs. In order to define successful BIMs, CAD export of IFC-based BIMs for BEM must adhere to a standard Model View Definition (MVD) for simulation, as provided by the concept design BIM MVD (buildingSMART, 2011). In order to ensure wide-scale adoption, companies would also need to develop their own material libraries to support automated activities and undertake a pilot project to improve understanding of modeling conventions and design tool features and limitations.

  10. A Strategy for Autogeneration of Space Shuttle Ground Processing Simulation Models for Project Makespan Estimations

    NASA Technical Reports Server (NTRS)

    Madden, Michael G.; Wyrick, Roberta; O'Neill, Dale E.

    2005-01-01

    Space Shuttle processing is a complicated and highly variable project. The planning and scheduling problem, categorized as a Resource Constrained Stochastic Project Scheduling Problem (RC-SPSP), has a great deal of variability in the Orbiter Processing Facility (OPF) process flow from one flight to the next. Simulation modeling is a useful tool for estimating the makespan of the overall process. However, simulation requires a model to be developed, which is itself a labor- and time-consuming effort. With such a dynamic process, the model would often be out of synchronization with the actual process, limiting the applicability of the simulation answers in solving the actual estimation problem. The basis of our solution is the integration of TEAMS model-enabling software with our existing scheduling software. This paper explains the approach used to develop an auto-generated simulation model from planning and scheduling efforts and available data.

  11. Simulation for ward processes of surgical care.

    PubMed

    Pucher, Philip H; Darzi, Ara; Aggarwal, Rajesh

    2013-07-01

    The role of simulation in surgical education, initially confined to technical skills and procedural tasks, increasingly includes training nontechnical skills including communication, crisis management, and teamwork. Research suggests that many preventable adverse events can be attributed to nontechnical error occurring within a ward context. Ward rounds represent the primary point of interaction between patient and physician but take place without formalized training or assessment. The simulated ward should provide an environment in which processes of perioperative care can be performed safely and realistically, allowing multidisciplinary assessment and training of full ward rounds. We review existing literature and describe our experience in setting up our ward simulator. We examine the facilities, equipment, cost, and personnel required for establishing a surgical ward simulator and consider the scenario development, assessment, and feedback tools necessary to integrate it into a surgical curriculum.

  12. Local-feature analysis for automated coarse-graining of bulk-polymer molecular dynamics simulations.

    PubMed

    Xue, Y; Ludovice, P J; Grover, M A

    2012-12-01

    A method for automated coarse-graining of bulk polymers is presented, using the data-mining tool of local feature analysis. Most existing methods for polymer coarse-graining define superatoms based on their covalent bonding topology along the polymer backbone, but here superatoms are defined based only on their correlated motions, as observed in molecular dynamics simulations. Correlated atomic motions are identified in the simulation data using local feature analysis, between atoms in the same or in different polymer chains. Groups of highly correlated atoms constitute the superatoms in the coarse-graining scheme, and the positions of their seed coordinates are then projected forward in time. Based on only the seed positions, local feature analysis enables the full reconstruction of all atomic positions. This reconstruction suggests an iterative scheme for reducing the computational cost of the simulations: project the seed positions forward, reconstruct the atomic positions, initialize another short molecular dynamics simulation, identify new superatoms, and again project forward in time.
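
    The grouping step can be illustrated on synthetic data: compute the correlation of atomic motions across frames and cluster highly correlated atoms into superatoms. The sketch below substitutes plain correlation plus hierarchical clustering for the paper's local feature analysis, and the "trajectory" is synthetic.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        rng = np.random.default_rng(0)

        # synthetic stand-in for MD output: 40 atoms over 500 frames, built from
        # two internally correlated groups plus per-atom noise
        n_atoms, n_frames = 40, 500
        base = rng.standard_normal((2, n_frames))
        disp = np.vstack([base[i % 2] + 0.3 * rng.standard_normal(n_frames)
                          for i in range(n_atoms)])

        corr = np.corrcoef(disp)                  # correlation of atomic motions
        dist = 1.0 - np.abs(corr)                 # high correlation -> small distance
        Z = linkage(dist[np.triu_indices(n_atoms, 1)], method="average")
        labels = fcluster(Z, t=0.5, criterion="distance")
        print("superatom assignment per atom:", labels)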

  13. Adaptive System Modeling for Spacecraft Simulation

    NASA Technical Reports Server (NTRS)

    Thomas, Justin

    2011-01-01

    This invention introduces a methodology and associated software tools for automatically learning spacecraft system models without any assumptions regarding system behavior. Data stream mining techniques were used to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). Evaluation on historical ISS telemetry data shows that adaptive system modeling reduces simulation error anywhere from 50 to 90 percent over existing approaches. The purpose of the methodology is to outline how someone can create accurate system models from sensor (telemetry) data. The purpose of the software is to support the methodology: it provides analysis tools to design the adaptive models, as well as the algorithms to initially build system models and continuously update them from the latest streaming sensor data. The main strengths are as follows: it creates accurate spacecraft system models without in-depth system knowledge or any assumptions about system behavior; it automatically updates and calibrates system models using the latest streaming sensor data; it creates device-specific models that capture the exact behavior of devices of the same type; it adapts to evolving systems; and it can reduce computational complexity (faster simulations).
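
    One simple way to picture "continuously updating a system model from streaming telemetry" is recursive least squares with a forgetting factor, sketched below; this is an illustrative stand-in, not the algorithm of the invention.

        import numpy as np

        rng = np.random.default_rng(0)

        class StreamingModel:
            """Recursive least squares: y ~ w.x, updated one telemetry sample at a time."""
            def __init__(self, dim, lam=0.99):
                self.w = np.zeros(dim)
                self.P = np.eye(dim) * 1e3
                self.lam = lam                    # forgetting factor < 1 tracks evolving systems
            def update(self, x, y):
                Px = self.P @ x
                k = Px / (self.lam + x @ Px)      # gain vector
                self.w += k * (y - self.w @ x)    # correct toward the new sample
                self.P = (self.P - np.outer(k, Px)) / self.lam

        model = StreamingModel(dim=3)
        w_true = np.array([2.0, -1.0, 0.5])       # unknown device behavior
        for _ in range(2000):
            w_true += 1e-4 * rng.standard_normal(3)        # slowly drifting plant
            x = rng.standard_normal(3)
            y = w_true @ x + 0.01 * rng.standard_normal()  # noisy sensor reading
            model.update(x, y)
        print("tracked:", np.round(model.w, 2), " true:", np.round(w_true, 2))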

  14. Operational Simulation Tools and Long Term Strategic Planning for High Penetrations of PV in the Southeastern United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuohy, Aidan; Smith, Jeff; Rylander, Matt

    2016-07-11

    Increasing levels of distributed and utility-scale solar photovoltaics (PV) will have an impact on many utility functions, including distribution system operations, bulk system performance, business models and scheduling of generation. In this project, EPRI worked with Southern Company Services and its affiliates and the Tennessee Valley Authority to assist these utilities in their strategic planning efforts for integrating PV, based on modeling, simulation and analysis using a set of innovative tools. Advanced production simulation models were used to investigate operating reserve requirements; to leverage existing work and datasets, this last task was carried out on the California system. Overall, the project provided useful information to both of the utilities involved, through the final reports and through interactions during the project. The results can be used to inform the industry about new and improved methodologies for understanding solar PV penetration, and will influence ongoing and future research. This report summarizes each of the topics investigated over the 2.5-year project period.

  15. Computational Methods for HSCT-Inlet Controls/CFD Interdisciplinary Research

    NASA Technical Reports Server (NTRS)

    Cole, Gary L.; Melcher, Kevin J.; Chicatelli, Amy K.; Hartley, Tom T.; Chung, Joongkee

    1994-01-01

    A program aimed at facilitating the use of computational fluid dynamics (CFD) simulations by the controls discipline is presented. The objective is to reduce the development time and cost for propulsion system controls by using CFD simulations to obtain high-fidelity system models for control design and as numerical test beds for control system testing and validation. An interdisciplinary team has been formed to develop analytical and computational tools in three discipline areas: controls, CFD, and computational technology. The controls effort has focused on specifying requirements for an interface between the controls specialist and CFD simulations and a new method for extracting linear, reduced-order control models from CFD simulations. Existing CFD codes are being modified to permit time accurate execution and provide realistic boundary conditions for controls studies. Parallel processing and distributed computing techniques, along with existing system integration software, are being used to reduce CFD execution times and to support the development of an integrated analysis/design system. This paper describes: the initial application for the technology being developed, the high speed civil transport (HSCT) inlet control problem; activities being pursued in each discipline area; and a prototype analysis/design system in place for interactive operation and visualization of a time-accurate HSCT-inlet simulation.

  16. PyRhO: A Multiscale Optogenetics Simulation Platform

    PubMed Central

    Evans, Benjamin D.; Jarvis, Sarah; Schultz, Simon R.; Nikolic, Konstantin

    2016-01-01

    Optogenetics has become a key tool for understanding the function of neural circuits and controlling their behavior. An array of directly light-driven opsins has been genetically isolated from several families of organisms, with a wide range of temporal and spectral properties. In order to characterize, understand and apply these opsins, we present an integrated suite of open-source, multi-scale computational tools called PyRhO. The purpose of developing PyRhO is three-fold: (i) to characterize new (and existing) opsins by automatically fitting a minimal set of experimental data to three-, four-, or six-state kinetic models, (ii) to simulate these models at the channel, neuron and network levels, and (iii) to provide functional insights through model selection and virtual experiments in silico. The module is written in Python with an additional IPython/Jupyter notebook based GUI, allowing models to be fit, simulations to be run and results to be shared through simply interacting with a webpage. The seamless integration of model fitting algorithms with simulation environments (including NEURON and Brian2) for these virtual opsins will enable neuroscientists to gain a comprehensive understanding of their behavior and rapidly identify the most suitable variant for application in a particular biological system. This process may thereby guide not only experimental design and opsin choice but also alterations of the opsin genetic code in a neuro-engineering feedback loop. In this way, we expect PyRhO will help to significantly advance optogenetics as a tool for transforming biological sciences. PMID:27148037
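
    For readers unfamiliar with the kinetic models mentioned, a three-state opsin scheme (closed -> open -> desensitised -> closed) integrates in a few lines; the rate constants and light sensitivity below are illustrative values, not PyRhO's fitted parameters.

        import numpy as np

        G_D, G_R, EPS = 0.1, 0.01, 0.5   # O->D, D->C rates [1/ms]; activation per irradiance (assumed)

        def simulate(light, dt=0.1, t_end=500.0):
            """Euler-integrate the C->O->D->C kinetics under a light(t) schedule."""
            n = int(t_end / dt)
            O = D = 0.0
            trace = np.empty(n)
            for i in range(n):
                C = 1.0 - O - D                           # closed fraction by conservation
                dO = EPS * light(i * dt) * C - G_D * O    # light-driven opening, desensitisation
                dD = G_D * O - G_R * D                    # slow recovery to the closed state
                O += dt * dO
                D += dt * dD
                trace[i] = O                              # photocurrent is proportional to O
            return trace

        pulse = lambda t: 1.0 if 100.0 <= t < 300.0 else 0.0   # light pulse [arb. irradiance]
        print(f"peak open fraction: {simulate(pulse).max():.3f}")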

  18. New orthopaedic implant management tool for computer-assisted planning, navigation, and simulation: from implant CAD files to a standardized XML-based implant database.

    PubMed

    Sagbo, S; Blochaou, F; Langlotz, F; Vangenot, C; Nolte, L-P; Zheng, G

    2005-01-01

    Computer-Assisted Orthopaedic Surgery (CAOS) has made much progress over the last 10 years. Navigation systems have been recognized as important tools that help surgeons, and various such systems have been developed. A disadvantage of these systems is that they use non-standard formalisms and techniques. As a result, there are no standard concepts for implant and tool management or data formats to store information for use in 3D planning and navigation. We addressed these limitations and developed a practical and generic solution that offers benefits for surgeons, implant manufacturers, and CAS application developers. We developed a virtual implant database containing geometrical as well as calibration information for orthopedic implants and instruments, with a focus on trauma. This database has been successfully tested for various applications in the client/server mode. The implant information is not static, however, because manufacturers periodically revise their implants, resulting in the deletion of some implants and the introduction of new ones. Tracking these continuous changes and keeping CAS systems up to date is a tedious task if done manually. This leads to additional costs for system development, and some errors are inevitably generated due to the huge amount of information that has to be processed. To ease management with respect to implant life cycle, we developed a tool to assist end-users (surgeons, hospitals, CAS system providers, and implant manufacturers) in managing their implants. Our system can be used for pre-operative planning and intra-operative navigation, and also for any surgical simulation involving orthopedic implants. Currently, this tool allows addition of new implants, modification of existing ones, deletion of obsolete implants, export of a given implant, and also creation of backups. Our implant management system has been successfully tested in the laboratory with very promising results. It makes it possible to fill the current gap that exists between the CAS system and implant manufacturers, hospitals, and surgeons.

  19. Creating executable architectures using Visual Simulation Objects (VSO)

    NASA Astrophysics Data System (ADS)

    Woodring, John W.; Comiskey, John B.; Petrov, Orlin M.; Woodring, Brian L.

    2005-05-01

    Investigations have been performed to identify a methodology for creating executable models of architectures and simulations of architectures that lead to an understanding of their dynamic properties. Colored Petri Nets (CPNs) are used to describe architecture because of their strong mathematical foundations, the existence of techniques for their verification, and graph theory's well-established history of success in modern science. CPNs have been extended to interoperate with legacy simulations via a High Level Architecture (HLA) compliant interface. It has also been demonstrated that an architecture created as a CPN can be integrated with Department of Defense Architecture Framework products to ensure consistency between static and dynamic descriptions. A computer-aided tool, Visual Simulation Objects (VSO), which aids analysts in specifying, composing and executing architectures, has been developed to verify the methodology and as a prototype commercial product.

  20. The Resource Usage Aware Backfilling

    NASA Astrophysics Data System (ADS)

    Guim, Francesc; Rodero, Ivan; Corbalan, Julita

    Job scheduling policies for HPC centers have been extensively studied in the last few years, especially backfilling-based policies. Almost all of these studies have been done using simulation tools, and all existing simulators use the runtime (either estimated or real) provided in the workload as the basis of their simulations. In our previous work we analyzed the impact on system performance of considering the resource sharing (memory bandwidth) of running jobs, including a new resource model in the Alvio simulator. Based on these studies we proposed the LessConsume and LessConsume Threshold resource selection policies. Both are oriented to reducing the saturation of shared resources, thus increasing the performance of the system. The results showed that both resource selection policies can improve system performance by considering where jobs are finally allocated.
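
    For context, the backfilling family of policies that such simulators implement reduces to a small kernel: jobs start in FCFS order, and a later job may jump ahead only if it does not delay the reservation made for the queue head. The sketch below is a simplified EASY-style variant (it ignores the extra-node condition of the full algorithm, and the jobs are illustrative).

        import heapq
        from dataclasses import dataclass

        @dataclass
        class Job:
            name: str
            cpus: int
            runtime: float        # user-provided runtime estimate, as in the workloads

        def easy_backfill(jobs, total_cpus):
            free, now = total_cpus, 0.0
            running, starts, queue = [], {}, list(jobs)   # running: heap of (end, cpus)
            while queue:
                head = queue[0]
                if head.cpus <= free:                     # head starts immediately
                    queue.pop(0)
                    starts[head.name], free = now, free - head.cpus
                    heapq.heappush(running, (now + head.runtime, head.cpus))
                    continue
                avail, shadow = free, None                # find the head's reservation time
                for end, cpus in sorted(running):
                    avail += cpus
                    if avail >= head.cpus:
                        shadow = end
                        break
                for j in list(queue[1:]):                 # backfill jobs that end by then
                    if j.cpus <= free and now + j.runtime <= shadow:
                        queue.remove(j)
                        starts[j.name], free = now, free - j.cpus
                        heapq.heappush(running, (now + j.runtime, j.cpus))
                end, cpus = heapq.heappop(running)        # advance to next completion
                now, free = end, free + cpus
            return starts

        jobs = [Job("A", 8, 10.0), Job("B", 12, 5.0), Job("C", 2, 4.0)]
        print(easy_backfill(jobs, total_cpus=16))   # C backfills ahead of B

    A resource-usage-aware variant such as LessConsume would additionally score candidate allocations by the shared-resource (e.g. memory bandwidth) pressure they create before committing a start time.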

  1. Lightweight computational steering of very large scale molecular dynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beazley, D.M.; Lomdahl, P.S.

    1996-09-01

    We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages.

  2. Taxiing, Take-Off, and Landing Simulation of the High Speed Civil Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Reaves, Mercedes C.; Horta, Lucas G.

    1999-01-01

    The aircraft industry, jointly with NASA, is studying enabling technologies for higher speed, longer range aircraft configurations. Higher speeds, higher temperatures, and aerodynamics are driving these newer aircraft configurations towards long, slender, flexible fuselages. Aircraft response during ground operations, although often overlooked, is a concern due to the increased fuselage flexibility. This paper discusses modeling and simulation of the High Speed Civil Transport aircraft during taxiing, take-off, and landing. Finite element models of the airframe for various configurations are used and combined with nonlinear landing gear models to provide a simulation tool for studying responses to different ground input conditions. A commercial computer simulation program is used to numerically integrate the equations of motion and to compute estimates of the responses using an existing runway profile. Results show aircraft responses exceeding safe, acceptable human response levels.

  3. Multidisciplinary analysis and design of printed wiring boards

    NASA Astrophysics Data System (ADS)

    Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin

    1991-04-01

    Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment denoted Thermal Structural Electromagnetic Testability (TSET) being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic structured methodology.

  4. Simulation modelling as a tool for knowledge mobilisation in health policy settings: a case study protocol.

    PubMed

    Freebairn, L; Atkinson, J; Kelly, P; McDonnell, G; Rychetnik, L

    2016-09-21

    Evidence-informed decision-making is essential to ensure that health programs and services are effective and offer value for money; however, barriers to the use of evidence persist. Emerging systems science approaches and advances in technology are providing new methods and tools to facilitate evidence-based decision-making. Simulation modelling offers a unique tool for synthesising and leveraging existing evidence, data and expert local knowledge to examine, in a robust, low risk and low cost way, the likely impact of alternative policy and service provision scenarios. This case study will evaluate participatory simulation modelling to inform the prevention and management of gestational diabetes mellitus (GDM). The risks associated with GDM are well recognised; however, debate remains regarding diagnostic thresholds and whether screening and treatment to reduce maternal glucose levels reduce the associated risks. A diagnosis of GDM may provide a leverage point for multidisciplinary lifestyle modification interventions. This research will apply and evaluate a simulation modelling approach to understand the complex interrelation of factors that drive GDM rates, test options for screening and interventions, and optimise the use of evidence to inform policy and program decision-making. The study design will use mixed methods to achieve the objectives. Policy, clinical practice and research experts will work collaboratively to develop, test and validate a simulation model of GDM in the Australian Capital Territory (ACT). The model will be applied to support evidence-informed policy dialogues with diverse stakeholders for the management of GDM in the ACT. Qualitative methods will be used to evaluate simulation modelling as an evidence synthesis tool to support evidence-based decision-making. Interviews and analysis of workshop recordings will focus on the participants' engagement in the modelling process; perceived value of the participatory process, perceived commitment, influence and confidence of stakeholders in implementing policy and program decisions identified in the modelling process; and the impact of the process in terms of policy and program change. The study will generate empirical evidence on the feasibility and potential value of simulation modelling to support knowledge mobilisation and consensus building in health settings.

  5. Analyzing and Visualizing Cosmological Simulations with ParaView

    NASA Astrophysics Data System (ADS)

    Woodring, Jonathan; Heitmann, Katrin; Ahrens, James; Fasel, Patricia; Hsu, Chung-Hsing; Habib, Salman; Pope, Adrian

    2011-07-01

    The advent of large cosmological sky surveys, ushering in the era of precision cosmology, has been accompanied by ever larger cosmological simulations. The analysis of these simulations, which currently encompass tens of billions of particles and up to a trillion particles in the near future, is often as daunting as carrying out the simulations in the first place. Therefore, the development of very efficient analysis tools combining qualitative and quantitative capabilities is a matter of some urgency. In this paper, we introduce new analysis features implemented within ParaView, a fully parallel, open-source visualization toolkit, to analyze large N-body simulations. A major aspect of ParaView is that it can live and operate on the same machines and utilize the same parallel power as the simulation codes themselves. In addition, data movement is a serious bottleneck now and will become even more of an issue in the future; an interactive visualization and analysis tool that can handle data in situ is fast becoming essential. The new features in ParaView include particle readers and a very efficient halo finder that identifies friends-of-friends halos and determines common halo properties, including spherical overdensity properties. In combination with many other functionalities already existing within ParaView, such as histogram routines or interfaces to programming languages like Python, this enhanced version enables fast, interactive, and convenient analyses of large cosmological simulations. In addition, development paths are available for future extensions.
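
    A friends-of-friends finder links every pair of particles closer than a linking length (customarily b = 0.2 times the mean interparticle separation) and reports the connected components as halos. The sketch below shows the serial idea on synthetic particles; ParaView's implementation is parallel and far more efficient.

        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(3)
        # toy snapshot: two dense clumps plus a uniform background in a 100^3 box
        pos = np.vstack([
            rng.normal([25, 25, 25], 0.5, size=(300, 3)),
            rng.normal([75, 70, 40], 0.5, size=(200, 3)),
            rng.uniform(0, 100, size=(500, 3)),
        ])

        b = 0.2
        link = b * 100.0 / len(pos) ** (1.0 / 3.0)         # linking length
        pairs = cKDTree(pos).query_pairs(r=link, output_type="ndarray")

        parent = np.arange(len(pos))                       # union-find over linked pairs
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]              # path halving
                i = parent[i]
            return i
        for i, j in pairs:
            parent[find(i)] = find(j)

        roots = np.array([find(i) for i in range(len(pos))])
        _, counts = np.unique(roots, return_counts=True)
        print("halo sizes:", np.sort(counts[counts >= 20])[::-1])   # halos with >= 20 particles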

  6. Methodology For Evaluation Of Technology Impacts In Space Electric Power Systems

    NASA Technical Reports Server (NTRS)

    Holda, Julie

    2004-01-01

    The Analysis and Management branch of the Power and Propulsion Office at NASA Glenn Research Center is responsible for performing complex analyses of the space power and in-space propulsion products developed by GRC. This work quantifies the benefits of the advanced technologies to support on-going advocacy efforts. The Power and Propulsion Office is committed to understanding how advancements in space technologies could benefit future NASA missions, and it supports many diverse projects and missions throughout NASA as well as industry and academia. The area of work that we are concentrating on is space technology investment strategies. Our goal is to develop a Monte Carlo-based tool to investigate technology impacts in space electric power systems. The framework being developed at this stage will be used to set up a computer simulation of a space electric power system (EPS); the outcome is expected to be a probabilistic assessment of critical technologies and potential development issues. We are developing methods for integrating existing spreadsheet-based tools into the simulation tool, and work is being done on defining interface protocols to enable rapid integration of future tools. The first task was to select a Monte Carlo-based simulation program for statistical modeling of the EPS model: I decided to learn and evaluate Palisade's @Risk and Risk Optimizer software and utilize their capabilities for the EPS model, and I also evaluated similar software packages (JMP, SPSS, Crystal Ball, VenSim, Analytica) available from other suppliers. The second task was to develop the framework for the tool, in which we had to define technology characteristics using weighting factors and probability distributions, define the simulation space, and add hard and soft constraints to the model. The third task is to incorporate (preliminary) cost factors into the model. A final task is developing a cross-platform solution of this framework.

  7. Benchmark Problems of the Geothermal Technologies Office Code Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Mark D.; Podgorney, Robert; Kelkar, Sharad M.

    A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office has sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams participating in the study represented U.S. national laboratories, universities, and industries, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study: benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. Study participants submitted solutions to problems for which their simulation tools were deemed capable or nearly capable. Some participating codes were originally developed for EGS applications whereas others were designed for different applications but can simulate processes similar to those in EGS; solution submissions from both were encouraged. In some cases, participants made small incremental changes to their numerical simulation codes to address specific elements of the problem, and in other cases participants submitted solutions with existing simulation tools, acknowledging the limitations of the code. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995. The problems involved two phases of research: stimulation, development, and circulation in two separate reservoirs. The challenge problems had specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery. Whereas the benchmark class of problems was designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of modern numerical simulation tools by recognized expert practitioners.

  8. MeshVoro: A Three-Dimensional Voronoi Mesh Building Tool for the TOUGH Family of Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, C. M.; Boyle, K. L.; Reagan, M.

    2013-09-30

    Few tools exist for creating and visualizing complex three-dimensional simulation meshes, and these have limitations that restrict their application to particular geometries and circumstances. Mesh generation needs to trend toward ever more general applications. To that end, we have developed MeshVoro, a tool that is based on the Voro (Rycroft 2009) library and is capable of generating complex three-dimensional Voronoi tessellation-based (unstructured) meshes for the solution of problems of flow and transport in subsurface geologic media that are addressed by the TOUGH (Pruess et al. 1999) family of codes. MeshVoro, which includes built-in data visualization routines, is a particularly useful tool because it extends the applicability of the TOUGH family of codes by enabling the scientifically robust and relatively easy discretization of systems with challenging 3D geometries. We describe several applications of MeshVoro. We illustrate the ability of the tool to straightforwardly transform a complex geological grid into a simulation mesh that conforms to the specifications of the TOUGH family of codes. We demonstrate how MeshVoro can describe complex system geometries with a relatively small number of grid blocks, and we construct meshes for geometries that would have been practically intractable with a standard Cartesian grid approach. We also discuss the limitations and appropriate applications of this new technology.
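
    The essence of such a mesh builder is to turn seed points into Voronoi cells and extract the cell-to-cell connections that TOUGH-style simulators consume. The sketch below uses SciPy's Qhull wrapper rather than the Voro library, and it omits the facet areas and cell volumes that a real builder must also compute.

        import numpy as np
        from scipy.spatial import Voronoi

        rng = np.random.default_rng(7)
        points = rng.uniform(0.0, 100.0, size=(50, 3))   # seed points, one per mesh cell
        vor = Voronoi(points)

        # ridge_points lists each pair of seeds whose cells share a face;
        # the centre-to-centre distance is the kind of datum a TOUGH CONNE entry carries
        connections = []
        for (i, j), ridge in zip(vor.ridge_points, vor.ridge_vertices):
            if -1 in ridge:
                continue          # skip unbounded faces on the domain hull
            connections.append((i, j, np.linalg.norm(points[i] - points[j])))

        print(f"{len(connections)} bounded cell-to-cell connections")
        print("first connection (cell i, cell j, centre distance):", connections[0])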

  9. Computer Simulation Of An In-Process Surface Finish Sensor.

    NASA Astrophysics Data System (ADS)

    Rakels, Jan H.

    1987-01-01

    It is generally accepted that optical methods are the most promising for in-process measurement of surface finish, having the advantages of non-contact operation and fast data acquisition. Furthermore, these optical instruments can be easily retrofitted on existing machine tools. In the Micro-Engineering Centre at the University of Warwick, an optical sensor has been developed which can measure the rms roughness, slope and wavelength of turned and precision-ground surfaces during machining. The operation of this device is based upon the Kirchhoff-Fresnel diffraction integral. Application of this theory to ideal turned and ground surfaces is straightforward, and indeed the calculated diffraction patterns are in close agreement with patterns produced by an actual optical instrument. Since it is mathematically difficult to introduce real machine-tool behaviour into the diffraction integral, a computer program has been devised which simulates the operation of the optical sensor and produces a diffraction pattern as graphical output. Comparison between computer-generated and actual diffraction patterns of the same surfaces shows a high correlation. The main aim of this program is to construct an atlas which maps known machine-tool errors to optical diffraction patterns. This atlas can then be used for machine-tool condition diagnostics. It has been found that optical monitoring is very sensitive to minor defects, so machine-tool deterioration can be detected before it becomes detrimental.
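
    Under the scalar Kirchhoff approximation at normal incidence, the field reflected from a surface of height h(x) acquires a phase 2kh(x), and the far-field diffraction pattern follows from a Fourier transform of that field. A minimal simulation for an ideal turned surface (all dimensions are illustrative):

        import numpy as np

        N, dx = 4096, 0.5e-6                  # samples and spacing [m]
        x = np.arange(N) * dx
        amp, feed = 0.1e-6, 50e-6             # tool-mark height and feed [m] (assumed)
        h = amp * np.sin(2 * np.pi * x / feed)

        lam = 633e-9                          # HeNe laser wavelength [m]
        k = 2 * np.pi / lam
        field = np.exp(2j * k * h)            # reflected field, phase 2kh(x)
        I = np.abs(np.fft.fftshift(np.fft.fft(field)))**2
        I /= I.max()

        # diffraction orders appear at sin(theta) = m * lam / feed
        s = np.fft.fftshift(np.fft.fftfreq(N, d=dx)) * lam   # sin(theta) axis
        for idx in np.argsort(I)[::-1][:5]:
            print(f"sin(theta) = {s[idx]: .4f}   relative intensity = {I[idx]:.3f}")

    Distorting h(x) with a simulated machine-tool error (chatter, spindle runout) and recording how the order intensities change is how such an atlas of fault signatures could be assembled.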

  10. Application of a forest-simulation model to assess the energy yield and ecological impact of forest utilization for energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doyle, T W; Shugart, H H; West, D C

    1981-01-01

    This study examines the utilization and management of natural forest lands to meet growing wood-energy demands. An application of a forest simulation model is described for assessing energy returns and long-term ecological impacts of wood-energy harvesting under four general silvicultural practices. Results indicate that moderate energy yields could be expected from mild cutting operations, which would significantly affect neither the commercial timber market nor the composition, structure, or diversity of these forests. Forest models can provide an effective tool for determining optimal management strategies that maximize energy returns, minimize environmental detriment, and complement existing land-use plans.

  11. A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON

    PubMed Central

    King, James G.; Hines, Michael; Hill, Sean; Goodman, Philip H.; Markram, Henry; Schürmann, Felix

    2008-01-01

    As neuronal simulations approach larger scales with increasing levels of detail, the neurosimulator software represents only a part of a chain of tools ranging from setup, simulation, and interaction with virtual environments to analysis and visualization. Previously published approaches to abstracting simulator engines have not received widespread acceptance, which in part may be due to the fact that they tried to address the challenge of solving the model specification problem. Here, we present an approach that uses a neurosimulator, in this case NEURON, to describe and instantiate the network model in the simulator's native model language, but then replaces the main integration loop with its own. Existing parallel network models are easily adapted to run in the presented framework. The presented approach is thus an extension to NEURON but uses a component-based architecture to allow for replaceable spike exchange components and pluggable components for monitoring, analysis, or control that can run in this framework alongside the simulation. PMID:19430597
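
    The "replace the main integration loop" idea can be seen in miniature through NEURON's Python interface: instead of calling the stock run(), the host program steps fadvance() itself and interleaves its own components. A single-compartment toy follows (requires NEURON installed; the online spike counter stands in for a pluggable monitoring component).

        from neuron import h

        h.load_file("stdrun.hoc")

        soma = h.Section(name="soma")        # minimal stand-in for a network model
        soma.insert("hh")
        stim = h.IClamp(soma(0.5))
        stim.delay, stim.dur, stim.amp = 5.0, 50.0, 0.1

        h.finitialize(-65.0)
        spikes, above = 0, False
        while h.t < 60.0:
            h.fadvance()                     # advance NEURON by one internal time step
            v = soma(0.5).v
            if v > 0.0 and not above:        # crude threshold-crossing spike detector
                spikes += 1
            above = v > 0.0
        print("spikes counted while stepping manually:", spikes)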

  12. Assessment of the Impacts of ACLS on the ISS Life Support System Using Dynamic Simulations in V-HAB

    NASA Technical Reports Server (NTRS)

    Putz, Daniel; Olthoff, Claas; Ewert, Michael; Anderson, Molly

    2016-01-01

    The Advanced Closed Loop System (ACLS) is currently under development by Airbus Defense and Space and is slated for launch to the International Space Station (ISS) in 2017. The addition of new hardware into an already complex system such as the ISS life support system (LSS) always poses operational risks, so it is important to understand the impacts ACLS will have on the existing systems to ensure smooth operations for the ISS. This analysis can be done using dynamic computer simulations, and one possible tool for such a simulation is the Virtual Habitat (V-HAB). Based on MATLAB, V-HAB has been under development at the Institute of Astronautics of the Technical University of Munich (TUM) since 2004 and in the past has been successfully used to simulate the ISS life support systems. The existing V-HAB ISS simulation model treated the interior volume of the space station as one large, ideally-stirred container. This model was improved to allow the calculation of the atmospheric composition inside individual modules of the ISS by splitting it into twelve distinct volumes, connected by a simulation of the inter-module ventilation flows. This allows for a combined simulation of the LSS hardware and the atmospheric composition aboard the ISS. A dynamic model of ACLS is added to the ISS simulation, and several different operating modes for both ACLS and the existing ISS life support systems are studied to determine the impacts of ACLS on the rest of the system. The results suggest that the US, Russian and ACLS CO2 systems can operate at the same time without impeding each other. Furthermore, based on the results of this analysis, the US and ACLS Sabatier systems can be operated in parallel as well, to achieve a very low CO2 concentration in the cabin atmosphere.
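
    Splitting the cabin into distinct volumes connected by inter-module ventilation amounts, in its simplest form, to a compartmental mass balance. The toy three-module CO2 balance below illustrates the structure only; volumes, ventilation flows, generation and scrubber rates are all assumed, and the actual V-HAB model is far more detailed.

        import numpy as np

        V = np.array([100.0, 150.0, 80.0])       # module free volumes [m^3] (assumed)
        Q = np.array([[0.0, 0.2, 0.0],           # symmetric inter-module ventilation [m^3/s]
                      [0.2, 0.0, 0.15],
                      [0.0, 0.15, 0.0]])
        gen = np.array([0.0, 1.0e-5, 0.0])       # crew CO2 generation [kg/s], module 2 (assumed)
        scrub = np.array([2.0e-3, 0.0, 0.0])     # scrubber airflow equivalent [m^3/s], module 1

        c = np.full(3, 1.0e-3)                   # CO2 concentration per module [kg/m^3]
        dt = 1.0
        for _ in range(24 * 3600):               # one day, one-second steps
            inflow = Q @ c                       # CO2 carried in from neighbouring modules
            outflow = Q.sum(axis=1) * c          # CO2 carried out to neighbouring modules
            c += dt * (inflow - outflow + gen - scrub * c) / V
        print("CO2 after one day [kg/m^3]:", np.round(c, 5))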

  13. Assessment of the Impacts of ACLS on the ISS Life Support System using Dynamic Simulations in V-HAB

    NASA Technical Reports Server (NTRS)

    Puetz, Daniel; Olthoff, Claas; Ewert, Michael K.; Anderson, Molly S.

    2016-01-01

    The Advanced Closed Loop System (ACLS) is currently under development by Airbus Defense and Space and is slated for launch to the International Space Station (ISS) in 2017. The addition of new hardware into an already complex system such as the ISS life support system (LSS) always poses operational risks. It is therefore important to understand the impacts ACLS will have on the existing systems to ensure smooth operations for the ISS. This analysis can be done by using dynamic computer simulations, and one possible tool for such a simulation is Virtual Habitat (V-HAB). Based on Matlab (Registered Trademark), V-HAB has been under development at the Institute of Astronautics of the Technical University Munich (TUM) since 2006 and in the past has been successfully used to simulate the ISS life support systems. The existing V-HAB ISS simulation model treated the interior volume of the space station as one large, ideally-stirred container. This model was improved to allow the calculation of the atmospheric composition inside the individual modules of the ISS by splitting it into ten distinct volumes. The virtual volumes are connected by a simulation of the inter-module ventilation flows. This allows for a combined simulation of the LSS hardware and the atmospheric composition aboard the ISS. A dynamic model of ACLS is added to the ISS simulation and different operating modes for both ACLS and the existing ISS life support systems are studied to determine the impacts of ACLS on the rest of the system. The results suggest that the US, Russian and ACLS CO2 systems can operate at the same time without impeding each other. Furthermore, based on the results of this analysis, the US and ACLS Sabatier systems can be operated in parallel as well, to achieve the highest possible CO2 recycling together with a low CO2 concentration.

  14. Design and implementation of a general main axis controller for the ESO telescopes

    NASA Astrophysics Data System (ADS)

    Sandrock, Stefan; Di Lieto, Nicola; Pettazzi, Lorenzo; Erm, Toomas

    2012-09-01

    Most of the real-time control systems at the existing ESO telescopes were developed with "traditional" methods, using general purpose VMEbus electronics, and running applications that were coded by hand, mostly using the C programming language under VxWorks. As we are moving towards more modern design methods, we have explored a model-based design approach for real-time applications in the telescope area, and used the control algorithm of a standard telescope main axis as a first example. We wanted to have a clear work-flow that follows the "correct-by-construction" paradigm, where the implementation is testable in simulation on the development host, and where the testing time spent by debugging on target is minimized. It should respect the domains of control, electronics, and software engineers in the choice of tools. It should be a target-independent approach so that the result could be deployed on various platforms. We have selected the MathWorks tools Simulink, Stateflow, and Embedded Coder for design and implementation, and LabVIEW with NI hardware for hardware-in-the-loop testing, all of which are widely used in industry. We describe how these tools have been used in order to model, simulate, and test the application. We also evaluate the benefits of this approach compared to the traditional method with respect to testing effort and maintainability. For a specific axis controller application we have successfully integrated the result into the legacy platform of the existing VLT software, as well as demonstrated how to use the same design for a new development with a completely different environment.
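
    The actual controller above is autogenerated from Simulink models; purely as a hedged illustration of the kind of logic a main-axis position loop executes, here is a generic discrete-time PI controller with output saturation and anti-windup, exercised against a trivial host-side plant model. All gains, limits, and the velocity-commanded plant are invented, not ESO parameters.

    ```python
    class AxisPI:
        """Discrete-time PI loop with output clamping and anti-windup."""
        def __init__(self, kp, ki, dt, u_max):
            self.kp, self.ki, self.dt, self.u_max = kp, ki, dt, u_max
            self.integ = 0.0

        def update(self, pos_ref, pos_meas):
            err = pos_ref - pos_meas
            u = self.kp * err + self.integ
            if abs(u) > self.u_max:            # saturate and stop integrating
                u = max(-self.u_max, min(self.u_max, u))
            else:
                self.integ += self.ki * err * self.dt
            return u                           # velocity command to the drive

    # host-side test, mirroring "test in simulation before deploying on target"
    ctrl = AxisPI(kp=5.0, ki=1.0, dt=0.001, u_max=0.5)
    pos = 0.0
    for _ in range(10_000):                    # 10 s of simulated time
        v_cmd = ctrl.update(1.0, pos)          # slew the axis to 1 rad
        pos += v_cmd * 0.001                   # idealized velocity-loop plant
    print(round(pos, 3))                       # converges near 1.0
    ```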

  15. The cutting-edge training modalities and educational platforms for accredited surgical training: A systematic review.

    PubMed

    Forgione, Antonello; Guraya, Salman Y

    2017-01-01

    Historically, the operating room (OR) has always been considered a stand-alone trusted platform for surgical education and training. However, concerns about financial constraints, quality control, and patient safety have urged surgical educators to develop more cost-effective surgical educational platforms that can be employed outside the OR. Furthermore, trained surgeons need to regularly update their surgical skills to keep abreast of emerging surgical technologies. This research aimed to explore the value of currently available modern surgical tools that can be used outside the OR and also elaborates on the existing laparoscopic surgical training programs in world-class centers across the globe, with a view to formulating a blended and unified structured surgical training program. Several data sources were searched using the MeSH terms "Laparoscopic surgery," "Surgical training," "Surgical curriculum," "fundamentals of endoscopic surgery," "fundamentals of laparoscopic surgery," "Telementoring," and "Box trainer." The eligibility criteria admitted original and review articles and excluded editorials, short communications, conference proceedings, personal views, and commentaries. Data synthesis and data analysis were done by reviewing the initially retrieved 211 articles; irrelevant, duplicate, and redundant articles were excluded from the study. Finally, 12 articles were selected for this systematic review. The results showed that a myriad of cutting-edge technical innovations have provided modern surgical training tools such as simulation-based mechanical and virtual reality simulators, animal and cadaveric labs, telementoring, telerobotic-assisted surgery, and video games. Surgical simulators allow trainees to acquire surgical skills in a tension-free environment without supervision or time constraints. The existing world-renowned surgical training centers employ various clusters of training tools that essentially endeavor to embed the acquisition of knowledge and technical skills. However, a unified training curriculum that may be accepted worldwide is currently not available.

  16. The cutting-edge training modalities and educational platforms for accredited surgical training: A systematic review

    PubMed Central

    Forgione, Antonello; Guraya, Salman Y.

    2017-01-01

    Background: Historically, the operating room (OR) has always been considered a stand-alone trusted platform for surgical education and training. However, concerns about financial constraints, quality control, and patient safety have urged surgical educators to develop more cost-effective surgical educational platforms that can be employed outside the OR. Furthermore, trained surgeons need to regularly update their surgical skills to keep abreast of emerging surgical technologies. This research aimed to explore the value of currently available modern surgical tools that can be used outside the OR and also elaborates on the existing laparoscopic surgical training programs in world-class centers across the globe, with a view to formulating a blended and unified structured surgical training program. Materials and Methods: Several data sources were searched using the MeSH terms “Laparoscopic surgery,” “Surgical training,” “Surgical curriculum,” “fundamentals of endoscopic surgery,” “fundamentals of laparoscopic surgery,” “Telementoring,” and “Box trainer.” The eligibility criteria admitted original and review articles and excluded editorials, short communications, conference proceedings, personal views, and commentaries. Data synthesis and data analysis were done by reviewing the initially retrieved 211 articles; irrelevant, duplicate, and redundant articles were excluded from the study. Results: Finally, 12 articles were selected for this systematic review. The results showed that a myriad of cutting-edge technical innovations have provided modern surgical training tools such as simulation-based mechanical and virtual reality simulators, animal and cadaveric labs, telementoring, telerobotic-assisted surgery, and video games. Surgical simulators allow trainees to acquire surgical skills in a tension-free environment without supervision or time constraints. Conclusion: The existing world-renowned surgical training centers employ various clusters of training tools that essentially endeavor to embed the acquisition of knowledge and technical skills. However, a unified training curriculum that may be accepted worldwide is currently not available. PMID:28567070

  17. Development of an Advanced Stimulation / Production Predictive Simulator for Enhanced Geothermal Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pritchett, John W.

    2015-04-15

    There are several well-known obstacles to the successful deployment of EGS projects on a commercial scale, of course. EGS projects are expected to be deeper, on the average, than conventional “natural” geothermal reservoirs, and drilling costs are already a formidable barrier to conventional geothermal projects. Unlike conventional resources (which frequently announce their presence with natural manifestations such as geysers, hot springs and fumaroles), EGS prospects are likely to appear fairly undistinguished from the earth surface. And, of course, the probable necessity of fabricating a subterranean fluid circulation network to mine the heat from the rock (instead of simply relying on natural, pre-existing permeable fractures) adds a significant degree of uncertainty to the prospects for success. Accordingly, the basic motivation for the work presented herein was to try to develop a new set of tools that would be more suitable for this purpose. Several years ago, the Department of Energy’s Geothermal Technologies Office recognized this need and funded a cost-shared grant to our company (then SAIC, now Leidos) to partner with Geowatt AG of Zurich, Switzerland and undertake the development of a new reservoir simulator that would be more suitable for EGS forecasting than the existing tools. That project has now been completed and a new numerical geothermal reservoir simulator has been developed. It is named “HeatEx” (for “Heat Extraction”) and is almost completely new, although its methodology owes a great deal to other previous geothermal software development efforts, including Geowatt’s “HEX-S” code, the STAR and SPFRAC simulators developed here at SAIC/Leidos, the MINC approach originally developed at LBNL, and tracer analysis software originally formulated at INEL. Furthermore, the development effort was led by engineers with many years of experience in using reservoir simulation software to make meaningful forecasts for real geothermal projects, not just software designers. It is hoped that, as a result, HeatEx will prove useful during the early stages of the development of EGS technology. The basic objective was to design a tool that could use field data that are likely to become available during the early phases of an EGS project (that is, during initial reconnaissance and fracture stimulation operations) to guide forecasts of the longer-term behavior of the system during production and heat-mining.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutqvist, Jonny; Blanco Martin, Laura; Mukhopadhyay, Sumit

    In this report, we present FY2014 progress by Lawrence Berkeley National Laboratory (LBNL) related to modeling of coupled thermal-hydrological-mechanical-chemical (THMC) processes in salt and their effect on brine migration at high temperatures. LBNL’s work on the modeling of coupled THMC processes in salt was initiated in FY2012, focusing on exploring and demonstrating the capabilities of an existing LBNL modeling tool (TOUGH-FLAC) for simulating temperature-driven coupled flow and geomechanical processes in salt. This work includes development related to, and implementation of, essential capabilities, as well as testing the model against relevant information and published experimental data related to the fate and transport of water. Here we provide more details on the FY2014 work, first presenting updated tools and improvements made to the TOUGH-FLAC simulator, and the use of this updated tool in a new model simulation of long-term THM behavior within a generic repository in a salt formation. This is followed by the description of current benchmarking and validation efforts, including the TSDE experiment. We then present the current status in the development of constitutive relationships and the dual-continuum model for brine migration. We conclude with an outlook for FY2015, which will be much focused on model validation against field experiments and on the use of the model for the design studies related to a proposed heater experiment.

  19. The reliability of a modified Kalamazoo Consensus Statement Checklist for assessing the communication skills of multidisciplinary clinicians in the simulated environment.

    PubMed

    Peterson, Eleanor B; Calhoun, Aaron W; Rider, Elizabeth A

    2014-09-01

    With increased recognition of the importance of sound communication skills and communication skills education, reliable assessment tools are essential. This study reports on the psychometric properties of an assessment tool based on the Kalamazoo Consensus Statement Essential Elements Communication Checklist. The Gap-Kalamazoo Communication Skills Assessment Form (GKCSAF), a modified version of an existing communication skills assessment tool, the Kalamazoo Essential Elements Communication Checklist-Adapted, was used to assess learners in a multidisciplinary, simulation-based communication skills educational program using multiple raters. A total of 118 simulated conversations were available for analysis. Internal consistency and inter-rater reliability were determined by calculating a Cronbach's alpha score and intra-class correlation coefficients (ICC), respectively. The GKCSAF demonstrated high internal consistency with a Cronbach's alpha score of 0.844 (faculty raters) and 0.880 (peer observer raters), and high inter-rater reliability with an ICC of 0.830 (faculty raters) and 0.89 (peer observer raters). The Gap-Kalamazoo Communication Skills Assessment Form is therefore a reliable method of assessing the communication skills of multidisciplinary learners using multi-rater methods within the learning environment, and can be used by educational programs that wish to implement a reliable assessment and feedback system for a variety of learners.
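
    For readers unfamiliar with the internal-consistency statistic used above, here is a minimal sketch of Cronbach's alpha on a small synthetic score matrix (rows are rated encounters, columns are checklist items; the numbers are invented, not the study's data).

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """scores: 2-D array, shape (n_observations, n_items)."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_var = scores.var(axis=0, ddof=1).sum()   # sum of per-item variances
        total_var = scores.sum(axis=1).var(ddof=1)    # variance of summed scores
        return k / (k - 1) * (1.0 - item_var / total_var)

    ratings = np.array([[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2]])
    print(round(cronbach_alpha(ratings), 3))          # ~0.94 for this toy data
    ```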

  20. Challenges of NDE simulation tool validation, optimization, and utilization for composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter

    2016-02-01

    Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.

  1. Examining the role of fluctuations in the early stages of homogenous polymer crystallization with simulation and statistical learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welch, Jr., Paul Michael

    Here, we propose a relationship between the dynamics in the amorphous and crystalline domains during polymer crystallization: the fluctuations of ordering-rate about a material-specific value in the amorphous phase drive those fluctuations associated with the increase in percent crystallinity. This suggests a differential equation that satisfies the three experimentally observed time regimes for the rate of crystal growth. To test this postulated expression, we applied a suite of statistical learning tools to molecular dynamics simulations to extract the relevant phenomenology. This study shows that the proposed relationship holds in the early time regime. It illustrates the effectiveness of soft computing tools in the analysis of coarse-grained simulations in which patterns exist, but may not easily yield to strict quantitative evaluation. This ability assists us in characterizing the critical early time molecular arrangement during the primary nucleation phase of polymer melt crystallization. In addition to supporting the validity of the proposed kinetics expression, the simulations show that (i) the classical nucleation and growth mechanism is active in the early stages of ordering; (ii) the number of nuclei and their masses grow linearly during this early time regime; and (iii) a fixed inter-nuclei distance is established.

  2. Examining the role of fluctuations in the early stages of homogenous polymer crystallization with simulation and statistical learning

    DOE PAGES

    Welch, Jr., Paul Michael

    2017-01-23

    Here, we propose a relationship between the dynamics in the amorphous and crystalline domains during polymer crystallization: the fluctuations of ordering-rate about a material-specific value in the amorphous phase drive those fluctuations associated with the increase in percent crystallinity. This suggests a differential equation that satisfies the three experimentally observed time regimes for the rate of crystal growth. To test this postulated expression, we applied a suite of statistical learning tools to molecular dynamics simulations to extract the relevant phenomenology. This study shows that the proposed relationship holds in the early time regime. It illustrates the effectiveness of soft computing tools in the analysis of coarse-grained simulations in which patterns exist, but may not easily yield to strict quantitative evaluation. This ability assists us in characterizing the critical early time molecular arrangement during the primary nucleation phase of polymer melt crystallization. In addition to supporting the validity of the proposed kinetics expression, the simulations show that (i) the classical nucleation and growth mechanism is active in the early stages of ordering; (ii) the number of nuclei and their masses grow linearly during this early time regime; and (iii) a fixed inter-nuclei distance is established.

  3. Numerical Simulations of SCR DeNOx System for a 660MW coal-fired power station

    NASA Astrophysics Data System (ADS)

    Yongqiang, Deng; Zhongming, Mei; Yijun, Mao; Nianping, Liu; Guoming, Yin

    2018-06-01

    Numerical simulations of the selective catalytic reduction (SCR) DeNOx system of a 660 MW coal-fired power station, which suffers from low denitrification efficiency, large ammonia consumption, and an excessive ammonia escape rate, were conducted using the CFD tool STAR-CCM+. The simulation results revealed the problems existing in the SCR DeNOx system. To address these limitations, factors affecting the denitrification performance of the SCR, including the structural parameters and the ammonia injected by the ammonia nozzles, were optimized. Under the optimized operational conditions, the denitrification efficiency of the SCR system was enhanced, while the ammonia escape rate was reduced below 3 ppm. This study serves as a reference for the optimization and modification of SCR systems.

  4. Development of a polarized neutron beam line at Algerian research reactors using McStas software

    NASA Astrophysics Data System (ADS)

    Makhloufi, M.; Salah, H.

    2017-02-01

    Unpolarized instrumentation has long been studied and designed using the McStas simulation tool, but only recently have new models been developed for McStas to simulate polarized neutron scattering instruments. In the present contribution, we used the McStas software to design a polarized neutron beam line, taking advantage of the reflectometer and diffractometer spectrometers available in Algeria. Both thermal and cold neutrons were considered. The polarization was achieved by two types of supermirror polarizers, FeSi and CoCu, provided by the HZB institute. For the sake of performance and comparison, the polarizers were characterized and their characteristics reproduced. The simulated instruments are reported. A flipper and electromagnets for the guide field were developed. Further developments, including analyzers and upgrading of the existing spectrometers, are underway.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shahidehpour, Mohammad

    Integrating 20% or more wind energy into the system and transmitting large amounts of wind energy over long distances will require a decision-making capability that can handle very large scale power systems with tens of thousands of buses and lines. There is a need to explore innovative analytical and implementation solutions for continuing reliable operations with the most economical integration of additional wind energy in power systems. A number of wind integration solution paths involve the adoption of new operating policies, dynamic scheduling of wind power across interties, pooling integration services, and adopting new transmission scheduling practices. Such practices can be examined by the decision tool developed by this project. This project developed a very efficient decision tool called Wind INtegration Simulator (WINS) and applied WINS to facilitate wind energy integration studies. WINS focused on augmenting the existing power utility capabilities to support collaborative planning, analysis, and wind integration project implementations. WINS also had the capability of simulating energy storage facilities so that feasibility studies of integrated wind energy system applications can be performed for systems with high wind energy penetrations. The development of WINS represents a major expansion of a very efficient decision tool called POwer Market Simulator (POMS), which was developed by IIT and has been used extensively for power system studies for decades. Specifically, WINS provides the following advantages: (1) An integrated framework is included in WINS for the comprehensive modeling of DC transmission configurations, including mono-pole, bi-pole, tri-pole, back-to-back, and multi-terminal connection, as well as AC/DC converter models including current source converters (CSC) and voltage source converters (VSC); (2) An existing shortcoming of traditional decision tools for wind integration is the limited availability of user interfaces, i.e., decision results are often text-based demonstrations. WINS includes a powerful visualization tool and user interface capability for transmission analyses, planning, and assessment, which will be of great interest to power market participants, power system planners and operators, and state and federal regulatory entities; and (3) WINS can handle extended transmission models for wind integration studies. WINS models include limitations on transmission flow as well as bus voltage for analyzing power system states. The existing decision tools often consider transmission flow constraints (dc power flow) alone, which could result in the over-utilization of existing resources when analyzing wind integration. WINS can be used to assist power market participants including transmission companies, independent system operators, power system operators in vertically integrated utilities, wind energy developers, and regulatory agencies to analyze the economics, security, and reliability of various options for wind integration, including transmission upgrades and the planning of new transmission facilities. WINS can also be used by industry for the offline training of reliability and operation personnel when analyzing wind integration uncertainties, identifying critical spots in power system operation, analyzing power system vulnerabilities, and providing credible decisions for examining operation and planning options for wind integration.
    Research activities in this project on wind integration included (1) Development of WINS; (2) Transmission Congestion Analysis in the Eastern Interconnection; (3) Analysis of 2030 Large-Scale Wind Energy Integration in the Eastern Interconnection; and (4) Large-scale Analysis of 2018 Wind Energy Integration in the Eastern U.S. Interconnection. The research resulted in 33 papers, 9 presentations, 9 PhD degrees, 4 MS degrees, and 7 awards. The education activities in this project on wind energy included (1) Wind Energy Training Facility Development and (2) Wind Energy Course Development.
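
    As a hedged illustration of the dc power flow mentioned in point (3) above (a lossless linear model where line flows follow from bus voltage angles), here is a minimal sketch on a made-up 3-bus system; WINS itself models far more, including AC/DC converters and voltage limits.

    ```python
    import numpy as np

    # line list: (from_bus, to_bus, reactance x [p.u.]) -- invented test system
    lines = [(0, 1, 0.10), (1, 2, 0.20), (0, 2, 0.25)]
    P = np.array([1.5, -0.5, -1.0])        # bus injections [p.u.], sum to zero

    n = 3
    B = np.zeros((n, n))
    for i, j, x in lines:                  # assemble the dc susceptance matrix
        B[i, i] += 1 / x; B[j, j] += 1 / x
        B[i, j] -= 1 / x; B[j, i] -= 1 / x

    theta = np.zeros(n)                    # bus 0 is the slack (angle fixed at 0)
    theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])   # solve the reduced system
    for i, j, x in lines:
        print(f"flow {i}->{j}: {(theta[i] - theta[j]) / x:+.3f} p.u.")
    ```

    A flow limit check against such results is exactly the kind of transmission constraint the abstract says traditional tools stop at.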

  6. Discrete event simulation as a tool in optimization of a professional complex adaptive system.

    PubMed

    Nielsen, Anders Lassen; Hilwig, Helmer; Kissoon, Niranjan; Teelucksingh, Surujpal

    2008-01-01

    Similar urgent needs for improvement of health care systems exist in the developed and developing world. The culture and the organization of an emergency department in developing countries can best be described as a professional complex adaptive system, where each agent (employee) is ignorant of the behavior of the system as a whole; no one understands the entire system. Each agent's actions are based on the state of the system at the moment (i.e. lack of medicine, unavailable laboratory investigations, lack of beds, and lack of staff in certain functions). An important question is how one can improve the emergency service within the given constraints. The use of simulation is one new approach to studying issues amenable to improvement. Discrete event simulation was used to simulate part of the patient flow in an emergency department. A simple model was built using a prototyping approach. The simulation showed that a minor rotation among the nurses could reduce the mean number of visitors that had to be referred to alternative flows within the hospital from 87 to 37 on a daily basis, with a mean utilization of the staff between 95.8% (the nurses) and 87.4% (the doctors). We conclude that, even faced with resource constraints and a lack of accessible data, discrete event simulation is a tool that can be used successfully to study the consequences of changes in very complex and self-organizing professional complex adaptive systems.
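
    To show the mechanics of discrete event simulation, here is a minimal event-queue sketch of one patient flow: arrivals queue for a small nurse pool, and overflow is referred to an alternative flow. All rates, capacities, and the queue cap are invented parameters, not the paper's data.

    ```python
    import heapq, random

    random.seed(1)
    NURSES, T_END = 2, 480.0                 # staff on shift, minutes simulated
    busy, queue, served, diverted = 0, [], 0, 0
    events = [(random.expovariate(1 / 5.0), "arrive")]   # first arrival

    while events:
        t, kind = heapq.heappop(events)      # next event in time order
        if t > T_END:
            break
        if kind == "arrive":
            heapq.heappush(events, (t + random.expovariate(1 / 5.0), "arrive"))
            if busy < NURSES:                # a nurse is free: start service
                busy += 1
                heapq.heappush(events, (t + random.expovariate(1 / 8.0), "done"))
            elif len(queue) < 10:            # otherwise wait, if room remains
                queue.append(t)
            else:
                diverted += 1                # referred to an alternative flow
        else:                                # a nurse finishes
            served += 1
            if queue:
                queue.pop(0)
                heapq.heappush(events, (t + random.expovariate(1 / 8.0), "done"))
            else:
                busy -= 1
    print(served, diverted)
    ```

    Changing NURSES or the service rate and re-running is the simulation analogue of the staffing rotation the study evaluated.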

  7. Objective structured assessment of nontechnical skills: Reliability of a global rating scale for the in-training assessment in the operating room.

    PubMed

    Dedy, Nicolas J; Szasz, Peter; Louridas, Marisa; Bonrath, Esther M; Husslein, Heinrich; Grantcharov, Teodor P

    2015-06-01

    Nontechnical skills are critical for patient safety in the operating room (OR). As a result, regulatory bodies for accreditation and certification have mandated the integration of these competencies into postgraduate education. A generally accepted approach to the in-training assessment of nontechnical skills, however, is lacking. The goal of the present study was to develop an evidence-based and reliable tool for the in-training assessment of residents' nontechnical performance in the OR. The Objective Structured Assessment of Nontechnical Skills tool was designed as a 5-point global rating scale with descriptive anchors for each item, based on existing evidence-based frameworks of nontechnical skills, as well as resident training requirements. The tool was piloted on scripted videos and refined in an iterative process. The final version was used to rate residents' performance in recorded OR crisis simulations and during live observations in the OR. A total of 37 simulations and 10 live procedures were rated. Interrater agreement was good for total mean scores, both in simulation and in the real OR, with intraclass correlation coefficients >0.90 in all settings for average and single measures. Internal consistency of the scale was high (Cronbach's alpha = 0.80). The Objective Structured Assessment of Nontechnical Skills global rating scale was developed as an evidence-based tool for the in-training assessment of residents' nontechnical performance in the OR. Unique descriptive anchors allow for a criterion-referenced assessment of performance. Good reliability was demonstrated in different settings, supporting applications in research and education.
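
    Complementing the Cronbach's alpha sketch shown earlier in this listing, here is a minimal implementation of the other reliability statistic reported above: the intraclass correlation coefficient ICC(2,1) (two-way random effects, absolute agreement, single rater, per Shrout and Fleiss). The ratings are invented, not the study's data.

    ```python
    import numpy as np

    def icc_2_1(x):
        """x: 2-D array, rows = rated performances, columns = raters."""
        x = np.asarray(x, dtype=float)
        n, k = x.shape
        grand = x.mean()
        ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()   # subjects
        ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()   # raters
        ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols # residual
        msr = ss_rows / (n - 1)
        msc = ss_cols / (k - 1)
        mse = ss_err / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    scores = np.array([[4, 4, 5], [2, 3, 2], [5, 5, 5], [3, 3, 4], [1, 2, 1]])
    print(round(icc_2_1(scores), 3))
    ```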

  8. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    PubMed

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique to understand the dynamics of complex biochemical systems. To promote such modeling, we had developed the CADLIVE dynamic simulator that automatically converted a biochemical map into its associated mathematical model, simulated its dynamic behaviors, and analyzed its robustness. To enhance the usability of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with instructions.
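
    For readers unfamiliar with the S-system form mentioned above, each state obeys dX_i/dt = alpha_i * prod_j X_j^g_ij - beta_i * prod_j X_j^h_ij (power-law production minus power-law degradation). Below is a hedged two-variable toy with invented parameters, Euler-integrated to its steady state; it illustrates the representation, not the toolbox's API.

    ```python
    import numpy as np

    alpha = np.array([2.0, 1.5])          # production rate constants (invented)
    beta = np.array([1.0, 1.0])           # degradation rate constants
    g = np.array([[0.0, -0.5],            # production kinetic orders
                  [0.5,  0.0]])
    h = np.array([[0.5, 0.0],             # degradation kinetic orders
                  [0.0, 0.5]])

    def dxdt(x):
        production = alpha * np.prod(x ** g, axis=1)
        degradation = beta * np.prod(x ** h, axis=1)
        return production - degradation

    x = np.array([1.0, 1.0])
    for _ in range(20_000):               # integrate 20 time units at dt = 1e-3
        x = x + 1e-3 * dxdt(x)
    print(np.round(x, 3))                 # analytic steady state: [4/3, 3]
    ```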

  9. Using driving simulators to assess driving safety.

    PubMed

    Boyle, Linda Ng; Lee, John D

    2010-05-01

    Changes in drivers, vehicles, and roadways pose substantial challenges to the transportation safety community. Crash records and naturalistic driving data are useful for examining the influence of past or existing technology on drivers, and the associations between risk factors and crashes. However, they are limited because causation cannot be established and technology not yet installed in production vehicles cannot be assessed. Driving simulators have become an increasingly widespread tool to understand evolving and novel technologies. The ability to manipulate independent variables in a randomized, controlled setting also provides the added benefit of identifying causal links. This paper introduces a special issue on simulator-based safety studies. The special issue comprises 25 papers that demonstrate the use of driving simulators to address pressing transportation safety problems and includes topics as diverse as neurological dysfunction, work zone design, and driver distraction.

  10. Development of an Efficient Approach to Perform Neutronics Simulations for Plutonium-238 Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chandler, David; Ellis, Ronald James

    Conversion of 238Pu decay heat into usable electricity is imperative to power National Aeronautics and Space Administration (NASA) deep space exploration missions; however, the current stockpile of 238Pu is diminishing and the quality is less than ideal. In response, the US Department of Energy and NASA have undertaken a program to reestablish a domestic 238Pu production program and a technology demonstration sub-project has been initiated. Neutronics simulations for 238Pu production play a vital role in this project because the results guide reactor safety-basis, target design and optimization, and post-irradiation examination activities. A new, efficient neutronics simulation tool written in Python was developed to evaluate, with the highest fidelity possible with approved tools, the time-dependent nuclide evolution and heat deposition rates in 238Pu production targets irradiated in the High Flux Isotope Reactor (HFIR). The Python Activation and Heat Deposition Script (PAHDS) was developed specifically for experiment analysis in HFIR and couples the quality-assured MCNP5 and SCALE 6.1.3 software tools to take advantage of an existing high-fidelity MCNP HFIR model, the most up-to-date ORIGEN code, and the most up-to-date nuclear data. Three cycle simulations were performed with PAHDS implementing ENDF/B-VII.0, ENDF/B-VII.1, and the Hybrid Library GPD-Rev0 cross-section libraries. The 238Pu production results were benchmarked against VESTA-obtained results and the impact of various cross-section libraries on the calculated metrics was assessed.
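
    PAHDS itself couples MCNP and ORIGEN; purely as a hedged back-of-the-envelope sketch of the production chain it tracks, here is a Bateman-style integration of 237Np(n,g) -> 238Np -> (beta decay) -> 238Pu with order-of-magnitude thermal cross sections and a HFIR-like flux. All numbers are illustrative placeholders, not evaluation-quality data.

    ```python
    import numpy as np

    barn = 1.0e-24                     # cm^2
    phi = 2.0e15                       # thermal flux [n/cm^2/s], flux-trap scale
    sig_np237 = 175 * barn             # 237Np capture (approximate)
    sig_np238 = 2200 * barn            # 238Np absorption, mostly fission loss
    sig_pu238 = 540 * barn             # 238Pu capture loss (approximate)
    lam = np.log(2) / (2.117 * 86400)  # 238Np decay constant [1/s], t1/2 ~ 2.1 d

    n = np.array([1.0, 0.0, 0.0])      # relative atoms: 237Np, 238Np, 238Pu
    dt = 3600.0                        # 1 h Euler step
    for _ in range(26 * 24):           # roughly one 26-day HFIR cycle
        d237 = -sig_np237 * phi * n[0]
        d238 = sig_np237 * phi * n[0] - (lam + sig_np238 * phi) * n[1]
        dpu = lam * n[1] - sig_pu238 * phi * n[2]
        n = n + dt * np.array([d237, d238, dpu])
    print(np.round(n, 4))              # remaining 237Np, in-pile 238Np, 238Pu
    ```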

  11. A Fixed-Wing Aircraft Simulation Tool for Improving the efficiency of DoD Acquisition

    DTIC Science & Technology

    2015-10-05

    simulation tool, CREATE(TM)-AV Helios [12-14], a high fidelity rotary wing vehicle simulation tool, and CREATE(TM)-AV DaVinci [15-16], a conceptual through...05/2015 Oct 2008-Sep 2015 A Fixed-Wing Aircraft Simulation Tool for Improving the Efficiency of DoD Acquisition Scott A. Morton and David R...multi-disciplinary fixed-wing virtual aircraft simulation tool incorporating aerodynamics, structural dynamics, kinematics, and kinetics. Kestrel allows

  12. Multi-physics CFD simulations in engineering

    NASA Astrophysics Data System (ADS)

    Yamamoto, Makoto

    2013-08-01

    Nowadays, Computational Fluid Dynamics (CFD) software is adopted as a design and analysis tool in a great number of engineering fields. From a practical point of view, single-physics CFD can be considered sufficiently mature; the main target of existing CFD software is single-phase flows such as water and air. However, many multi-physics problems exist in engineering. Most of them consist of flow coupled with other physics, and the interactions between the different physics are very important. Obviously, multi-physics phenomena are critical in developing machines and processes. A multi-physics phenomenon is typically very complex and difficult to predict simply by adding other physics to a flow simulation. Therefore, multi-physics CFD techniques are still under research and development. This is because the processing speed of current computers is not yet sufficient for conducting multi-physics simulations, and because physical models other than flow physics have not been suitably established. In the near future, we therefore have to develop various physical models and efficient CFD techniques in order to carry out successful multi-physics simulations in engineering. In the present paper, I describe the present state of multi-physics CFD simulations, and then show numerical results, such as ice accretion and the electro-chemical machining of a three-dimensional compressor blade, obtained in my laboratory. Multi-physics CFD simulation is likely to be a key technology in the near future.

  13. Concept of Operations Visualization for Ares I Production

    NASA Technical Reports Server (NTRS)

    Chilton, Jim; Smith, David Alan

    2008-01-01

    Establishing Computer Aided Design models of the Ares I production facility, tooling and vehicle components and integrating them into manufacturing visualizations/simulations allows Boeing and NASA to collaborate in real time early in the design/development cycle. This collaboration identifies cost effective and lean solutions that can be easily shared with Ares stakeholders (e.g., other NASA Centers and potential science users). These Ares I production visualizations and analyses by their nature serve as early manufacturing improvement precursors for other Constellation elements to be built at the Michoud Assembly Facility such as Ares V and the Altair Lander. Key to this Boeing and Marshall Space Flight Center collaboration has been the use of advanced virtual manufacturing tools to understand the existing Shuttle era infrastructure and trade potential modifications to support Ares I production. These approaches are then used to determine an optimal manufacturing configuration in terms of labor efficiency, safety and facility enhancements. These same models and tools can be used in an interactive simulation of Ares I and V flight to the Space Station or moon to educate the human space constituency (e.g., government, academia, media and the public) in order to increase national and international understanding of Constellation goals and benefits.

  14. Development and use of mathematical models and software frameworks for integrated analysis of agricultural systems and associated water use impacts

    USGS Publications Warehouse

    Fowler, K. R.; Jenkins, E.W.; Parno, M.; Chrispell, J.C.; Colón, A. I.; Hanson, Randall T.

    2016-01-01

    The development of appropriate water management strategies requires, in part, a methodology for quantifying and evaluating the impact of water policy decisions on regional stakeholders. In this work, we describe the framework we are developing to enhance the body of resources available to policy makers, farmers, and other community members in their efforts to understand, quantify, and assess the often competing objectives water consumers have with respect to usage. The foundation for the framework is the construction of a simulation-based optimization software tool using two existing software packages. In particular, we couple a robust optimization software suite (DAKOTA) with the USGS MF-OWHM water management simulation tool to provide a flexible software environment that will enable the evaluation of one or multiple (possibly competing) user-defined (or stakeholder) objectives. We introduce the individual software components and outline the communication strategy we defined for the coupled development. We present numerical results for case studies related to crop portfolio management with several defined objectives. The objectives are not optimally satisfied for any single user class, demonstrating the capability of the software tool to aid in the evaluation of a variety of competing interests.
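
    The coupling pattern described above (an optimizer repeatedly driving a simulator through a black-box objective) can be sketched in a few lines. In this hedged illustration the "simulator" is a stand-in quadratic crop-revenue model, not MF-OWHM, and the optimizer is SciPy rather than DAKOTA; only the communication pattern is the point, and every number is invented.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def run_simulator(acres):
        """Stand-in for a model run: returns (revenue, water_use) for a crop mix."""
        price = np.array([900.0, 450.0])     # $/acre for two crops (invented)
        water = np.array([3.5, 1.2])         # acre-ft/acre demands (invented)
        revenue = price @ acres - 0.02 * (acres ** 2).sum()  # diminishing returns
        return revenue, water @ acres

    def objective(acres):
        revenue, water_use = run_simulator(acres)
        penalty = 1e3 * max(0.0, water_use - 500.0) ** 2     # 500 acre-ft cap
        return -revenue + penalty                            # minimize -revenue

    res = minimize(objective, x0=np.array([50.0, 50.0]),
                   bounds=[(0, 400), (0, 400)], method="L-BFGS-B")
    print(np.round(res.x, 1), round(-res.fun, 0))            # acreage, revenue
    ```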

  15. CTViz: A tool for the visualization of transport in nanocomposites.

    PubMed

    Beach, Benjamin; Brown, Joshua; Tarlton, Taylor; Derosa, Pedro A

    2016-05-01

    A visualization tool (CTViz) for charge transport processes in 3-D hybrid materials (nanocomposites) was developed, inspired by the need for a graphical application to assist in code debugging and data presentation of an existing in-house code. As the simulation code grew, troubleshooting problems grew increasingly difficult without an effective way to visualize 3-D samples and charge transport in those samples. CTViz is able to produce publication and presentation quality visuals of the simulation box, as well as static and animated visuals of the paths of individual carriers through the sample. CTViz was designed to provide a high degree of flexibility in the visualization of the data. A feature that characterizes this tool is the use of shade and transparency levels to highlight important details in the morphology or in the transport paths by hiding or dimming elements of little relevance to the current view. This is fundamental for the visualization of 3-D systems with complex structures. The code presented here provides these required capabilities, but has gone beyond the original design and could be used as is or easily adapted for the visualization of other particulate transport where transport occurs on discrete paths.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis; Mandelli, Diego; Prescott, Steven

    The existing fleet of nuclear power plants is in the process of extending its lifetime and increasing the power generated from these plants via power uprates. In order to evaluate the impact of these factors on the safety of the plant, the Risk Informed Safety Margin Characterization (RISMC) project aims to provide insight to decision makers through a series of simulations of the plant dynamics for different initial conditions (e.g., probabilistic analysis and uncertainty quantification). This report focuses, in particular, on the application of a RISMC detailed demonstration case study for an emergent issue using the RAVEN and RELAP-7 tools. This case study looks at the impact of several challenges to a hypothetical pressurized water reactor, including: (1) a power uprate, (2) a potential loss of off-site power followed by the possible loss of all diesel generators (i.e., a station black-out event), (3) an earthquake-induced station blackout, and (4) a potential earthquake-induced tsunami flood. The analysis is performed by using a set of codes: a thermal-hydraulic code (RELAP-7), a flooding simulation tool (NEUTRINO) and a stochastic analysis tool (RAVEN) – these are currently under development at the Idaho National Laboratory.
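
    Not RAVEN/RELAP-7, but a toy of the probabilistic question such studies answer: sample an uncertain offsite-power recovery time against an uncertain time to core damage during a station blackout and estimate the failure fraction. All distributions and parameters below are invented for illustration.

    ```python
    import random

    random.seed(42)
    N, failures = 100_000, 0
    for _ in range(N):
        t_recover = random.lognormvariate(1.0, 0.8)    # hours to restore AC power
        t_battery = random.uniform(4.0, 8.0)           # DC coping time
        t_damage = t_battery + random.gauss(6.0, 1.5)  # margin past battery loss
        if t_recover > t_damage:                       # power back too late
            failures += 1
    print(f"P(core damage) ~ {failures / N:.4f}")
    ```

    Replacing each sampled quantity with a full plant-dynamics run is what turns this sketch into the RISMC simulation-based margin characterization.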

  17. NetCoffee: a fast and accurate global alignment approach to identify functionally conserved proteins in multiple networks.

    PubMed

    Hu, Jialu; Kehr, Birte; Reinert, Knut

    2014-02-15

    Owing to recent advancements in high-throughput technologies, protein-protein interaction networks of more and more species have become available in public databases. The question of how to identify functionally conserved proteins across species attracts a lot of attention in computational biology. Network alignments provide a systematic way to solve this problem. However, most existing alignment tools encounter limitations in tackling this problem. Therefore, the demand for faster and more efficient alignment tools is growing. We present a fast and accurate algorithm, NetCoffee, which finds a global alignment of multiple protein-protein interaction networks. NetCoffee searches for a global alignment by maximizing a target function using simulated annealing on a set of weighted bipartite graphs that are constructed using a triplet approach similar to T-Coffee. To assess its performance, NetCoffee was applied to four real datasets. Our results suggest that NetCoffee remedies several limitations of previous algorithms, outperforms all existing alignment tools in terms of speed and nevertheless identifies biologically meaningful alignments. The source code and data are freely available for download under the GNU GPL v3 license at https://code.google.com/p/netcoffee/.
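
    The optimization engine named above, simulated annealing maximizing a target function, looks like the following minimal sketch. The state here is a toy assignment vector with an invented score matrix, not NetCoffee's weighted bipartite graphs.

    ```python
    import math, random

    random.seed(0)
    weights = [[3, 1, 2], [1, 3, 1], [2, 1, 3]]     # invented match scores

    def score(perm):                                # target function to maximize
        return sum(weights[i][perm[i]] for i in range(len(perm)))

    state = [2, 1, 0]                               # some starting assignment
    best, best_s = state[:], score(state)
    T = 5.0
    while T > 0.01:
        i, j = random.sample(range(3), 2)           # propose swapping two matches
        cand = state[:]
        cand[i], cand[j] = cand[j], cand[i]
        delta = score(cand) - score(state)
        if delta >= 0 or random.random() < math.exp(delta / T):
            state = cand                            # accept improving/lucky moves
            if score(state) > best_s:
                best, best_s = state[:], score(state)
        T *= 0.995                                  # geometric cooling schedule
    print(best, best_s)
    ```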

  18. CATCh, an Ensemble Classifier for Chimera Detection in 16S rRNA Sequencing Studies

    PubMed Central

    Mysara, Mohamed; Saeys, Yvan; Leys, Natalie; Raes, Jeroen

    2014-01-01

    In ecological studies, microbial diversity is nowadays mostly assessed via the detection of phylogenetic marker genes, such as 16S rRNA. However, PCR amplification of these marker genes produces a significant amount of artificial sequences, often referred to as chimeras. Different algorithms have been developed to remove these chimeras, but efforts to combine different methodologies are limited. Therefore, two machine learning classifiers (reference-based and de novo CATCh) were developed by integrating the output of existing chimera detection tools into a new, more powerful method. When comparing our classifiers with existing tools in either the reference-based or de novo mode, a higher performance of our ensemble method was observed on a wide range of sequencing data, including simulated, 454 pyrosequencing, and Illumina MiSeq data sets. Since our algorithm combines the advantages of different individual chimera detection tools, our approach produces more robust results when challenged with chimeric sequences having a low parent divergence, short length of the chimeric range, and various numbers of parents. Additionally, it could be shown that integrating CATCh in the preprocessing pipeline has a beneficial effect on the quality of the clustering in operational taxonomic units. PMID:25527546
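
    The ensemble idea above, feeding the outputs of individual chimera detectors into one meta-classifier, can be sketched as follows. Tool scores and labels here are synthetic, and the meta-learner is a plain logistic regression; CATCh's actual features, tools, and training data differ.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 400
    labels = rng.integers(0, 2, n)                  # 1 = chimera, 0 = genuine
    # three "detection tools", each a noisy but informative score of the label
    tool_scores = labels[:, None] + rng.normal(0.0, [0.8, 1.0, 1.2], (n, 3))

    meta = LogisticRegression().fit(tool_scores[:300], labels[:300])
    print(f"held-out accuracy: {meta.score(tool_scores[300:], labels[300:]):.2f}")
    ```

    Because the meta-classifier learns how much to trust each tool, it can beat any single detector when their error patterns differ, which is the effect the abstract reports.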

  19. New approaches for real time decision support systems

    NASA Technical Reports Server (NTRS)

    Hair, D. Charles; Pickslay, Kent

    1994-01-01

    NCCOSC RDT&E Division (NRaD) is conducting research into ways of improving decision support systems (DSS) that are used in tactical Navy decision making situations. The research has focused on the incorporation of findings about naturalistic decision-making processes into the design of the DSS. As part of that research, two computer tools were developed that model the two primary naturalistic decision-making strategies used by Navy experts in tactical settings. Current work is exploring how best to incorporate the information produced by those tools into an existing simulation of current Navy decision support systems. This work has implications for any applications involving the need to make decisions under time constraints, based on incomplete or ambiguous data.

  20. A framework for modeling scenario-based barrier island storm impacts

    USGS Publications Warehouse

    Mickey, Rangley; Long, Joseph W.; Dalyander, P. Soupy; Plant, Nathaniel G.; Thompson, David M.

    2018-01-01

    Methods for investigating the vulnerability of existing or proposed coastal features to storm impacts often rely on simplified parametric models or one-dimensional process-based modeling studies that focus on changes to a profile across a dune or barrier island. These simple studies tend to neglect the impacts to curvilinear or alongshore varying island planforms, influence of non-uniform nearshore hydrodynamics and sediment transport, irregular morphology of the offshore bathymetry, and impacts from low magnitude wave events (e.g. cold fronts). Presented here is a framework for simulating regionally specific, low and high magnitude scenario-based storm impacts to assess the alongshore variable vulnerabilities of a coastal feature. Storm scenarios based on historic hydrodynamic conditions were derived and simulated using the process-based morphologic evolution model XBeach. Model results show that the scenarios predicted similar patterns of erosion and overwash when compared to observed qualitative morphologic changes from recent storm events that were not included in the dataset used to build the scenarios. The framework model simulations were capable of predicting specific areas of vulnerability in the existing feature and the results illustrate how this storm vulnerability simulation framework could be used as a tool to help inform the decision-making process for scientists, engineers, and stakeholders involved in coastal zone management or restoration projects.

  1. BridgeUP: STEM and Learning Astrophysics Interactively

    NASA Astrophysics Data System (ADS)

    Hernandez, Betsy; Geogdzhayeva, Maria; Beltre, Chasity; Ocasio, Adrienne; Skarbinski, Maya; Zbib, Daniela; Swar, Prachi; Mac Low, Mordecai

    2018-01-01

    BridgeUP: STEM is an initiative responding to the gender and opportunity gaps that exist in the STEM pipeline for women, girls, and under-resourced youth. The program engages high school girls in experiences at the intersection of computer science, scientific research, and visualization that will position them to succeed and lead in these fields. Students work on projects closely aligned with research taking place at the American Museum of Natural History. One of the current astronomy research projects at the museum simulates migration of black holes in active galactic nucleus disks using the Pencil Code. The work presented here focuses on interactive tools used to teach dynamical concepts pertaining to this project. These include Logger Pro, along with Vernier equipment, PhET Interactive Simulations, and Python. Throughout the internship, students also learn qualitative astrophysics via presentations, animations and videos. We discuss the success of utilizing the aforementioned tools in teaching, as well as showing work conducted by the six current students participating in this Astronomy research project.

  2. On the Genealogy of Asexual Diploids

    NASA Astrophysics Data System (ADS)

    Lam, Fumei; Langley, Charles H.; Song, Yun S.

    Given molecular genetic data from diploid individuals that, at present, reproduce mostly or exclusively asexually without recombination, an important problem in evolutionary biology is detecting evidence of past sexual reproduction (i.e., meiosis and mating) and recombination (both meiotic and mitotic). However, currently there is a lack of computational tools for carrying out such a study. In this paper, we formulate a new problem of reconstructing diploid genealogies under the assumption of no sexual reproduction or recombination, with the ultimate goal being to devise genealogy-based tools for testing deviation from these assumptions. We first consider the infinite-sites model of mutation and develop linear-time algorithms to test the existence of an asexual diploid genealogy compatible with the infinite-sites model of mutation, and to construct one if it exists. Then, we relax the infinite-sites assumption and develop an integer linear programming formulation to reconstruct asexual diploid genealogies with the minimum number of homoplasy (back or recurrent mutation) events. We apply our algorithms on simulated data sets with sizes of biological interest.
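
    The paper's linear-time diploid tests are more involved than can be shown here; as a hedged, related building block, the classic four-gamete test decides infinite-sites compatibility for haploid binary sequences: two sites are incompatible with a recombination-free history if all four gametes 00, 01, 10, 11 occur.

    ```python
    def four_gamete_compatible(seqs):
        """seqs: list of equal-length 0/1 strings (rows = sequences).
        Returns (True, None) if compatible, else (False, offending site pair)."""
        m = len(seqs[0])
        for a in range(m):
            for b in range(a + 1, m):
                gametes = {(s[a], s[b]) for s in seqs}
                if len(gametes) == 4:       # all four combinations present
                    return False, (a, b)
        return True, None

    seqs = ["0011", "0101", "1100", "1001"]
    print(four_gamete_compatible(seqs))     # (False, (0, 1)) for this toy data
    ```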

  3. Objective assessment of laparoscopic skills using a virtual reality simulator.

    PubMed

    Eriksen, J R; Grantcharov, T

    2005-09-01

    Virtual reality simulation has great potential as a training and assessment tool for laparoscopic skills. The study was carried out to investigate whether the LapSim system (Surgical Science Ltd., Gothenburg, Sweden) was able to differentiate between subjects with different laparoscopic experience and thus to demonstrate its construct validity. Twenty-four subjects were divided into two groups: experienced (performed >100 laparoscopic procedures, n = 10) and beginners (performed <10 laparoscopic procedures, n = 14). Assessment of laparoscopic skills was based on parameters measured by the computer system. Experienced surgeons performed consistently better than the residents. Significant differences in the parameters time and economy of motion existed between the two groups in seven of seven tasks. Regarding error parameters, differences existed in most but not all tasks. LapSim was able to differentiate between subjects with different laparoscopic experience. This indicates that the system measures skills relevant for laparoscopic surgery and can be used in training programs as a valid assessment tool.

  4. BDA: A novel method for identifying defects in body-centered cubic crystals.

    PubMed

    Möller, Johannes J; Bitzek, Erik

    2016-01-01

    The accurate and fast identification of crystallographic defects plays a key role in the analysis of atomistic simulation output data. For face-centered cubic (fcc) metals, most existing structure analysis tools allow for the direct distinction of common defects, such as stacking faults or certain low-index surfaces. For body-centered cubic (bcc) metals, on the other hand, a robust way to identify such defects is currently not easily available. We therefore introduce a new method for analyzing atomistic configurations of bcc metals, the BCC Defect Analysis (BDA). It uses existing structure analysis algorithms and combines their results to uniquely distinguish between typical defects in bcc metals. In essence, the BDA method offers the following features:
    • Identification of typical defect structures in bcc metals.
    • Reduction of erroneously identified defects by iterative comparison to the defects in the atom's neighborhood.
    • Availability as a ready-to-use Python script for the widespread visualization tool OVITO [http://ovito.org].

  5. Realistic natural atmospheric phenomena and weather effects for interactive virtual environments

    NASA Astrophysics Data System (ADS)

    McLoughlin, Leigh

    Clouds and the weather are important aspects of any natural outdoor scene, but existing dynamic techniques within computer graphics only offer the simplest of cloud representations. The problem that this work looks to address is how to provide a means of simulating clouds and weather features, such as precipitation, that is suitable for virtual environments. Techniques for cloud simulation are available within the area of meteorology, but numerical weather prediction systems are computationally expensive, give more numerical accuracy than we require for graphics, and are restricted to the laws of physics. Within computer graphics, we often need to direct and adjust physical features or to bend reality to meet artistic goals, which is a key difference between the subjects of computer graphics and physical science. Pure physically-based simulations, however, evolve their solutions according to pre-set rules and are notoriously difficult to control. The challenge then is for the solution to be computationally lightweight and able to be directed in some measure, while at the same time producing believable results. This work presents a lightweight physically-based cloud simulation scheme that simulates the dynamic properties of cloud formation and weather effects. The system simulates water vapour, cloud water, cloud ice, rain, snow and hail. The water model incorporates control parameters, and the cloud model uses an arbitrary vertical temperature profile, with a tool described to allow the user to define this. The result of this work is that clouds can now be simulated in near real-time, complete with precipitation. The temperature profile and tool then provide a means of directing the resulting formation.
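
    A hedged sketch of the core microphysics step such a water model performs: condense vapour in excess of saturation into cloud water, with saturation vapour pressure from the Magnus formula. The constants are standard approximations; the thesis's grid handling, temperature-profile tool, and control parameters are omitted.

    ```python
    import math

    def saturation_mixing_ratio(T_c, p_hpa):
        """Approximate saturation mixing ratio [kg/kg] over liquid water."""
        e_s = 6.112 * math.exp(17.62 * T_c / (243.12 + T_c))   # Magnus, hPa
        return 0.622 * e_s / (p_hpa - e_s)

    def condense(q_v, q_c, T_c, p_hpa):
        """One saturation-adjustment step: move excess vapour into cloud water."""
        excess = q_v - saturation_mixing_ratio(T_c, p_hpa)
        dq = max(0.0, excess)            # condense only; evaporation omitted
        return q_v - dq, q_c + dq

    # moist parcel cooled to 10 C at 850 hPa: some vapour becomes cloud water
    print(condense(q_v=0.012, q_c=0.0, T_c=10.0, p_hpa=850.0))
    ```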

  6. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on an optimization problem have been successfully integrated into the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulations for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.

  7. Potential effects of existing and proposed groundwater withdrawals on water levels and natural groundwater discharge in Snake Valley and surrounding areas, Utah and Nevada

    USGS Publications Warehouse

    Masbruch, Melissa D.; Brooks, Lynette E.

    2017-04-14

    Several U.S. Department of Interior (DOI) agencies are concerned about the cumulative effects of groundwater development on groundwater resources managed by, and other groundwater resources of interest to, these agencies in Snake Valley and surrounding areas. The new water uses that potentially concern the DOI agencies include 12 water-right applications filed in 2005, totaling approximately 8,864 acre-feet per year. To date, only one of these applications has been approved and partially developed. In addition, the DOI agencies are interested in the potential effects of three new water-right applications (UT 18-756, UT 18-758, and UT 18-759) and one water-right change application (UT a40687), which were the subject of a water-right hearing on April 19, 2016.

    This report presents a hydrogeologic analysis of areas in and around Snake Valley to assess potential effects of existing and future groundwater development on groundwater resources, specifically groundwater discharge sites, of interest to the DOI agencies. A previously developed steady-state numerical groundwater-flow model was modified to transient conditions with respect to well withdrawals and used to quantify drawdown and capture (withdrawals that result in depletion) of natural discharge from existing and proposed groundwater withdrawals. The original steady-state model simulates and was calibrated to 2009 conditions. To investigate the potential effects of existing and proposed groundwater withdrawals on the groundwater resources of interest to the DOI agencies, 10 withdrawal scenarios were simulated. All scenarios were simulated for periods of 5, 10, 15, 30, 55, and 105 years from the start of 2010; additionally, all scenarios were simulated to a new steady state to determine the ultimate long-term effects of the withdrawals. Capture maps were also constructed as part of this analysis. The simulations used to develop the capture maps test the response of the system, specifically the reduction of natural discharge, to future stresses at a point in the area represented by the model. In this way, these maps can be used as a tool to determine the source of water to, and potential effects at specific areas from, future well withdrawals.

    Downward trends in water levels measured in wells indicate that existing groundwater withdrawals in Snake Valley are affecting water levels. The numerical model simulates similar downward trends in water levels; simulated drawdowns in the model, however, are generally less than observed water-level declines. At the groundwater discharge sites of interest to the DOI agencies, simulated drawdowns from existing well withdrawals (projected into the future) range from 0 to about 50 feet. Following the addition of the proposed withdrawals, simulated drawdowns at some sites increase by 25 feet. Simulated drawdown resulting from the proposed withdrawals began in as few as 5 years after 2014 at several of the sites. At the groundwater discharge sites of interest to the DOI agencies, simulated capture of natural discharge resulting from the existing withdrawals ranged from 0 to 87 percent. Following the addition of the proposed withdrawals, simulated capture at several of the sites reached 100 percent, indicating that groundwater discharge at that site would cease. Simulated capture following the addition of the proposed withdrawals increased in as few as 5 years after 2014 at several of the sites.
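
    The USGS model is a full numerical groundwater-flow simulation; for a flavor of the underlying drawdown calculation, the classic Theis solution gives drawdown from a single pumping well as s = Q/(4*pi*T) * W(u) with u = r^2*S/(4*T*t), where W is the well function. The aquifer parameters below are invented, not Snake Valley values.

    ```python
    import numpy as np
    from scipy.special import exp1        # exponential integral = well function W(u)

    Q = 4000.0 * 1233.48 / (365.25 * 86400)  # 4,000 acre-ft/yr pumping, in m^3/s
    T = 5.0e-4                               # transmissivity [m^2/s] (invented)
    S = 1.0e-3                               # storativity [-] (invented)

    def drawdown(r_m, t_s):
        """Theis drawdown [m] at radius r_m [m] after pumping for t_s [s]."""
        u = r_m ** 2 * S / (4 * T * t_s)
        return Q / (4 * np.pi * T) * exp1(u)

    for years in (5, 30, 105):               # horizons echoing the report
        t = years * 365.25 * 86400
        print(years, "yr:", np.round(drawdown(5000.0, t), 2), "m at 5 km")
    ```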

  8. Connecting Artificial Brains to Robots in a Comprehensive Simulation Framework: The Neurorobotics Platform

    PubMed Central

    Falotico, Egidio; Vannucci, Lorenzo; Ambrosano, Alessandro; Albanese, Ugo; Ulbrich, Stefan; Vasquez Tieck, Juan Camilo; Hinkel, Georg; Kaiser, Jacques; Peric, Igor; Denninger, Oliver; Cauli, Nino; Kirtay, Murat; Roennau, Arne; Klinker, Gudrun; Von Arnim, Axel; Guyot, Luc; Peppicelli, Daniel; Martínez-Cañada, Pablo; Ros, Eduardo; Maier, Patrick; Weber, Sandro; Huber, Manuel; Plecher, David; Röhrbein, Florian; Deser, Stefan; Roitberg, Alina; van der Smagt, Patrick; Dillman, Rüdiger; Levi, Paul; Laschi, Cecilia; Knoll, Alois C.; Gewaltig, Marc-Oliver

    2017-01-01

    Combined efforts in the fields of neuroscience, computer science, and biology have made it possible to design biologically realistic models of the brain based on spiking neural networks. For a proper validation of these models, an embodiment in a dynamic and rich sensory environment, where the model is exposed to a realistic sensory-motor task, is needed. Because these brain models are, at the current stage, too complex to meet real-time constraints, they cannot be embedded in a real-world task; rather, the embodiment has to be simulated as well. While adequate tools exist to simulate either complex neural networks or robots and their environments, there is so far no tool that makes it easy to establish communication between brain and body models. The Neurorobotics Platform is a new web-based environment that aims to fill this gap by offering scientists and technology developers a software infrastructure that allows them to connect brain models to detailed simulations of robot bodies and environments and to use the resulting neurorobotic systems for in silico experimentation. In order to simplify the workflow and reduce the required programming skills, the platform provides editors for the specification of experimental sequences and conditions, environments, robots, and brain–body connectors. In addition, a variety of existing robots and environments are provided. This work presents the architecture of the first release of the Neurorobotics Platform developed in subproject 10 “Neurorobotics” of the Human Brain Project (HBP). At its current state, the Neurorobotics Platform allows researchers to design and run basic experiments in neurorobotics using simulated robots and simulated environments linked to simplified versions of brain models. We illustrate the capabilities of the platform with three example experiments: a Braitenberg task implemented on a mobile robot, a sensory-motor learning task based on a robotic controller, and a visual tracking task embedding a retina model on the iCub humanoid robot. These use cases demonstrate the applicability of the Neurorobotics Platform to robotic tasks as well as to neuroscientific experiments. PMID:28179882
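
    To give a flavor of the closed sensory-motor loop that the Braitenberg use case exercises, the sketch below implements a generic two-sensor Braitenberg controller in plain Python. This is an illustration only, not the Neurorobotics Platform API, which couples spiking brain models rather than hand-written functions to the simulated robot:

      # Braitenberg "vehicle 2b": each light sensor excites the opposite
      # wheel, so the robot turns toward the light source. Generic sketch.
      def braitenberg_step(left_light, right_light, gain=1.0, base_speed=0.2):
          left_wheel = base_speed + gain * right_light
          right_wheel = base_speed + gain * left_light
          return left_wheel, right_wheel

      # Light stronger on the right -> left wheel spins faster -> robot
      # turns toward the brighter right side.
      print(braitenberg_step(0.1, 0.8))  # -> (1.0, 0.3) up to float rounding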

  9. Connecting Artificial Brains to Robots in a Comprehensive Simulation Framework: The Neurorobotics Platform.

    PubMed

    Falotico, Egidio; Vannucci, Lorenzo; Ambrosano, Alessandro; Albanese, Ugo; Ulbrich, Stefan; Vasquez Tieck, Juan Camilo; Hinkel, Georg; Kaiser, Jacques; Peric, Igor; Denninger, Oliver; Cauli, Nino; Kirtay, Murat; Roennau, Arne; Klinker, Gudrun; Von Arnim, Axel; Guyot, Luc; Peppicelli, Daniel; Martínez-Cañada, Pablo; Ros, Eduardo; Maier, Patrick; Weber, Sandro; Huber, Manuel; Plecher, David; Röhrbein, Florian; Deser, Stefan; Roitberg, Alina; van der Smagt, Patrick; Dillman, Rüdiger; Levi, Paul; Laschi, Cecilia; Knoll, Alois C; Gewaltig, Marc-Oliver

    2017-01-01

    Combined efforts in the fields of neuroscience, computer science, and biology have made it possible to design biologically realistic models of the brain based on spiking neural networks. For a proper validation of these models, an embodiment in a dynamic and rich sensory environment, where the model is exposed to a realistic sensory-motor task, is needed. Because these brain models are, at the current stage, too complex to meet real-time constraints, they cannot be embedded in a real-world task; rather, the embodiment has to be simulated as well. While adequate tools exist to simulate either complex neural networks or robots and their environments, there is so far no tool that makes it easy to establish communication between brain and body models. The Neurorobotics Platform is a new web-based environment that aims to fill this gap by offering scientists and technology developers a software infrastructure that allows them to connect brain models to detailed simulations of robot bodies and environments and to use the resulting neurorobotic systems for in silico experimentation. In order to simplify the workflow and reduce the required programming skills, the platform provides editors for the specification of experimental sequences and conditions, environments, robots, and brain-body connectors. In addition, a variety of existing robots and environments are provided. This work presents the architecture of the first release of the Neurorobotics Platform developed in subproject 10 "Neurorobotics" of the Human Brain Project (HBP). At its current state, the Neurorobotics Platform allows researchers to design and run basic experiments in neurorobotics using simulated robots and simulated environments linked to simplified versions of brain models. We illustrate the capabilities of the platform with three example experiments: a Braitenberg task implemented on a mobile robot, a sensory-motor learning task based on a robotic controller, and a visual tracking task embedding a retina model on the iCub humanoid robot. These use cases demonstrate the applicability of the Neurorobotics Platform to robotic tasks as well as to neuroscientific experiments.

  10. Modeling biochemical transformation processes and information processing with Narrator.

    PubMed

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with, and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is that they lack support for an integrative representation of transport, transformation, and biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI lies a flexible mapping mechanism that makes it relatively easy to translate models defined via the graphical notation into mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language, or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. The combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as a Java software program and is available as open source from http://www.narrator-tool.org.
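
    Of the target formalisms named above, Gillespie's direct method is compact enough to sketch. The generic implementation below (not Narrator source code) shows the kind of stochastic simulation a graphical model can be mapped onto:

      # One step of Gillespie's direct method: draw an exponential waiting
      # time from the total propensity, then pick a reaction proportionally.
      import math
      import random

      def gillespie_step(state, reactions):
          propensities = [rate(state) for rate, _ in reactions]
          a0 = sum(propensities)
          if a0 == 0.0:
              return None                      # no reaction can fire
          dt = -math.log(1.0 - random.random()) / a0
          r, chosen = random.random() * a0, 0
          while r >= propensities[chosen]:     # select the fired reaction
              r -= propensities[chosen]
              chosen += 1
          reactions[chosen][1](state)          # apply the state change
          return dt

      # Example: first-order decay A -> 0 with rate constant k = 0.5.
      state = {"A": 100}
      decay = (lambda s: 0.5 * s["A"], lambda s: s.update(A=s["A"] - 1))
      t = 0.0
      while state["A"] > 0:
          t += gillespie_step(state, [decay])
      print(round(t, 2))  # time at which the last A molecule decays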

  11. Neuromechanic: a computational platform for simulation and analysis of the neural control of movement

    PubMed Central

    Bunderson, Nathan E.; Bingham, Jeffrey T.; Sohn, M. Hongchul; Ting, Lena H.; Burkholder, Thomas J.

    2015-01-01

    Neuromusculoskeletal models solve the basic problem of determining how the body moves under the influence of external and internal forces. Existing biomechanical modeling programs often emphasize dynamics, with the goal of finding a feed-forward neural program to replicate experimental data or of estimating the force contributions of individual muscles. The computation of rigid-body dynamics, muscle forces, and activation of the muscles are often performed separately. We have developed an intrinsically forward computational platform (Neuromechanic, www.neuromechanic.com) that explicitly represents the interdependencies among rigid body dynamics, frictional contact, muscle mechanics, and neural control modules. This formulation has significant advantages for optimization and forward simulation, particularly with application to neural controllers with feedback or regulatory features. Explicit inclusion of all state dependencies allows calculation of system derivatives with respect to kinematic states as well as muscle and neural control states, thus affording a wealth of analytical tools, including linearization, stability analyses, and calculation of initial conditions for forward simulations. In this review, we describe our algorithm for generating state equations and explain how they may be used in integration, linearization and stability analysis tools to provide structural insights into the neural control of movement. PMID:23027632
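
    The linearization and stability workflow described above can be illustrated with generic finite differences (a hedged sketch of the idea, not Neuromechanic's own algorithm): numerically form the Jacobian of the state equations about an equilibrium and inspect its eigenvalues.

      import numpy as np

      def jacobian(f, x0, eps=1e-6):
          """Linearize dx/dt = f(x) about x0: J[i, j] = d f_i / d x_j."""
          f0 = np.asarray(f(x0))
          J = np.zeros((f0.size, x0.size))
          for j in range(x0.size):
              dx = np.zeros_like(x0)
              dx[j] = eps
              J[:, j] = (np.asarray(f(x0 + dx)) - f0) / eps
          return J

      # Damped pendulum with state x = [angle, angular velocity]; the
      # equilibrium is stable if all eigenvalues have negative real part.
      f = lambda x: np.array([x[1], -9.81 * np.sin(x[0]) - 0.5 * x[1]])
      eigvals = np.linalg.eigvals(jacobian(f, np.array([0.0, 0.0])))
      print(eigvals.real.max() < 0)  # -> True: the hanging pose is stable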

  12. plasmaFoam: An OpenFOAM framework for computational plasma physics and chemistry

    NASA Astrophysics Data System (ADS)

    Venkattraman, Ayyaswamy; Verma, Abhishek Kumar

    2016-09-01

    As emphasized in the 2012 Roadmap for low temperature plasmas (LTP), scientific computing has emerged as an essential tool for the investigation and prediction of the fundamental physical and chemical processes associated with these systems. While several in-house and commercial codes exist, each having its own advantages and disadvantages, a common framework that can be developed by researchers from all over the world will likely accelerate the impact of computational studies on advances in low-temperature plasma physics and chemistry. In this regard, we present a finite volume computational toolbox to perform high-fidelity simulations of LTP systems. This framework, primarily based on the OpenFOAM solver suite, allows us to enhance our understanding of multiscale plasma phenomena by performing massively parallel, three-dimensional simulations on unstructured meshes using well-established high performance computing tools that are widely used in the computational fluid dynamics community. In this talk, we will present preliminary results obtained using the OpenFOAM-based solver suite with benchmark three-dimensional simulations of microplasma devices, including both dielectric and plasma regions. We will also discuss the future outlook for the solver suite.

  13. Neuromechanic: a computational platform for simulation and analysis of the neural control of movement.

    PubMed

    Bunderson, Nathan E; Bingham, Jeffrey T; Sohn, M Hongchul; Ting, Lena H; Burkholder, Thomas J

    2012-10-01

    Neuromusculoskeletal models solve the basic problem of determining how the body moves under the influence of external and internal forces. Existing biomechanical modeling programs often emphasize dynamics, with the goal of finding a feed-forward neural program to replicate experimental data or of estimating the force contributions of individual muscles. The computation of rigid-body dynamics, muscle forces, and activation of the muscles are often performed separately. We have developed an intrinsically forward computational platform (Neuromechanic, www.neuromechanic.com) that explicitly represents the interdependencies among rigid body dynamics, frictional contact, muscle mechanics, and neural control modules. This formulation has significant advantages for optimization and forward simulation, particularly with application to neural controllers with feedback or regulatory features. Explicit inclusion of all state dependencies allows calculation of system derivatives with respect to kinematic states and muscle and neural control states, thus affording a wealth of analytical tools, including linearization, stability analyses, and calculation of initial conditions for forward simulations. In this review, we describe our algorithm for generating state equations and explain how they may be used in integration, linearization, and stability analysis tools to provide structural insights into the neural control of movement.

  14. Computational Planning in Facial Surgery.

    PubMed

    Zachow, Stefan

    2015-10-01

    This article reflects the research of the last two decades in computational planning for cranio-maxillofacial surgery. Model-guided and computer-assisted surgery planning has developed tremendously owing to ever-increasing computational capabilities. Simulators for education, planning, and training of surgery are often compared with flight simulators, where maneuvers are also trained to reduce a possible risk of failure. Meanwhile, digital patient models can be derived from medical image data with astonishing accuracy and thus can serve for model surgery to derive a surgical template model that represents the envisaged result. Computerized surgical planning approaches, however, are often still explorative, meaning that a surgeon tries to find a therapeutic concept based on his or her expertise using computational tools that mimic real procedures. A future perspective for improved computerized planning is that surgical objectives will be generated algorithmically by employing mathematical modeling, simulation, and optimization techniques. Planning systems would thus act as intelligent decision support systems. Surgeons can still use the existing tools to vary the proposed approach, while mainly focusing on how to transfer objectives into reality. Such a development may result in a paradigm shift for future surgery planning.

  15. 10 CFR 434.606 - Simulation tool.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...

  16. 10 CFR 434.606 - Simulation tool.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...

  17. 10 CFR 434.606 - Simulation tool.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...

  18. 10 CFR 434.606 - Simulation tool.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...

  19. Strategy and gaps for modeling, simulation, and control of hybrid systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rabiti, Cristian; Garcia, Humberto E.; Hovsapian, Rob

    2015-04-01

    The purpose of this report is to establish a strategy for modeling and simulation of candidate hybrid energy systems. Modeling and simulation is necessary to design, evaluate, and optimize the system's technical and economic performance. Accordingly, this report first establishes the simulation requirements for analyzing candidate hybrid systems. Simulation fidelity levels are established based on the temporal scale, real and synthetic data availability or needs, solution accuracy, and output parameters needed to evaluate case-specific figures of merit. Accordingly, the associated computational and co-simulation resources needed are established, including physical models where needed, code assembly and integrated solution platforms, mathematical solvers, and data processing. This report then describes the figures of merit, systems requirements, and constraints that are necessary and sufficient to characterize the grid and hybrid systems behavior and market interactions. Loss of Load Probability (LOLP) and Effective Cost of Energy (ECE), as opposed to the standard Levelized Cost of Electricity (LCOE), are introduced as technical and economic indices for integrated energy system evaluations. Financial assessment methods are subsequently introduced for evaluation of non-traditional, hybrid energy systems. Algorithms for coupled and iterative evaluation of the technical and economic performance are subsequently discussed. This report further defines modeling objectives, computational tools, solution approaches, and real-time data collection and processing (in some cases using real test units) that will be required to model, co-simulate, and optimize: (a) energy system components (e.g., power generation unit, chemical process, electricity management unit), (b) system domains (e.g., thermal, electrical or chemical energy generation, conversion, and transport), and (c) system control modules. Co-simulation of complex, tightly coupled, dynamic energy systems requires multiple simulation tools, potentially developed in several programming languages and resolved on separate time scales. Whereas further investigation and development of hybrid concepts will provide a more complete understanding of the joint computational and physical modeling needs, this report highlights areas in which co-simulation capabilities are warranted. The current development status, quality assurance, availability, and maintainability of simulation tools available for hybrid systems modeling are presented. Existing gaps in the modeling and simulation toolsets and development needs are subsequently discussed. This effort will feed into a broader Roadmap activity for designing, developing, and demonstrating hybrid energy systems.

  20. ProtSqueeze: simple and effective automated tool for setting up membrane protein simulations.

    PubMed

    Yesylevskyy, Semen O

    2007-01-01

    The major challenge in setting up membrane protein simulations is embedding the protein into the pre-equilibrated lipid bilayer. Several techniques have been proposed to achieve optimal packing of the lipid molecules around the protein. However, all of them have serious disadvantages, which limit their applicability and discourage users of simulation packages from adopting them. In the present work, we analyzed existing approaches and proposed a new procedure for inserting the protein into the lipid bilayer, which is implemented in the ProtSqueeze software. The advantages of ProtSqueeze are as follows: (1) the insertion algorithm is simple, understandable, and controllable; (2) the software can work with virtually any simulation package on virtually any platform; (3) no modification of the source code of the simulation package is needed; (4) the procedure of insertion is as automated as possible; (5) ProtSqueeze is distributed free of charge under a general public license. In this work, we present the architecture and the algorithm of ProtSqueeze and demonstrate its usage in case studies.

  1. Three-dimensional implementation of the Low Diffusion method for continuum flow simulations

    NASA Astrophysics Data System (ADS)

    Mirza, A.; Nizenkov, P.; Pfeiffer, M.; Fasoulas, S.

    2017-11-01

    Concepts of a particle-based continuum method have existed for many years. The ultimate goal is to couple such a method with Direct Simulation Monte Carlo (DSMC) in order to bridge the gap in numerical tools for the treatment of the transitional flow regime between near-equilibrium and rarefied gas flows. For this purpose, the Low Diffusion (LD) method, first introduced by Burt and Boyd, offers a promising solution. In this paper, the LD method is revisited and its implementation in a modern particle solver named PICLas is described. The modifications of the LD routines enable three-dimensional continuum flow simulations. The implementation is successfully verified through a series of test cases: a simple stationary shock, an oblique shock simulation, and thermal Couette flow. Additionally, the capability of this method is demonstrated by the simulation of a hypersonic nitrogen flow around a 70°-blunted cone. Overall results are in very good agreement with experimental data. Finally, the scalability of PICLas using LD on a high-performance cluster is presented.

  2. Simulation of arthroscopic surgery using MRI data

    NASA Technical Reports Server (NTRS)

    Heller, Geoffrey; Genetti, Jon

    1994-01-01

    With the availability of Magnetic Resonance Imaging (MRI) technology in the medical field and the development of powerful graphics engines in the computer world, the possibility now exists for the simulation of surgery using data obtained from an actual patient. This paper describes a surgical simulation system which will allow a physician or a medical student to practice surgery on a patient without ever entering an operating room. This could substantially lower the cost of medical training by providing an alternative to the use of cadavers. The project involves the use of volume data acquired by MRI, which are converted to polygonal form using a corrected marching cubes algorithm. The data are then colored, and a simulation of surface response based on springy structures is performed in real time. Control of the system is obtained through an attached analog-to-digital unit. A remote electronic device is described which simulates an imaginary tool having features in common with both the arthroscope and the laparoscope.
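
    The "springy structures" surface response lends itself to a small sketch: a damped explicit-Euler update of a mass-spring mesh. This is a generic illustration with assumed parameters, not the original simulator's code:

      import numpy as np

      def spring_step(pos, vel, edges, rest, k=50.0, damping=0.9, dt=0.005):
          """Advance unit-mass vertices one step under linear springs."""
          force = np.zeros_like(pos)
          for (i, j), length0 in zip(edges, rest):
              d = pos[j] - pos[i]
              length = np.linalg.norm(d)
              f = k * (length - length0) * d / length  # Hooke's law
              force[i] += f
              force[j] -= f
          vel = damping * (vel + dt * force)
          return pos + dt * vel, vel

      # Two vertices joined by a stretched spring relax toward rest length 1.
      pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
      vel = np.zeros_like(pos)
      for _ in range(2000):
          pos, vel = spring_step(pos, vel, edges=[(0, 1)], rest=[1.0])
      print(round(float(np.linalg.norm(pos[1] - pos[0])), 3))  # -> ~1.0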

  3. Monte-Carlo background simulations of present and future detectors in x-ray astronomy

    NASA Astrophysics Data System (ADS)

    Tenzer, C.; Kendziorra, E.; Santangelo, A.

    2008-07-01

    Reaching a low-level and well-understood internal instrumental background is crucial for the scientific performance of an X-ray detector and, therefore, a main objective of the instrument designers. Monte-Carlo simulations of the physics processes and interactions taking place in a space-based X-ray detector as a result of its orbital environment can be applied to explain the measured background of existing missions. They are thus an excellent tool to predict and optimize the background of future observatories. Weak points of a design and the main sources of the background can be identified, and methods to reduce them can be implemented and studied within the simulations. Using the Geant4 Monte-Carlo toolkit, we have created a simulation environment for space-based detectors, and we present results of such background simulations for XMM-Newton's EPIC pn-CCD camera. The environment is also currently used to estimate and optimize the background of the future instruments Simbol-X and eRosita.

  4. Further Validation of Simulated Dynamic Interface Testing Techniques as a Tool in the Forecasting of Air Vehicle Deck Limits

    DTIC Science & Technology

    2010-01-01

    UAV Autonomy program which includes intelligent reasoning for autonomy, technologies to enhance see and avoid capabilities, object identification ...along the ship’s base recovery course (BRC). The pilot then flies toward the stern of the ship, aligning his approach path with the ship’s lineup line...quiescent point identification . CONCLUSIONS The primary goal for conducting dynamic interface analysis is to expand existing operating envelopes and

  5. Gunshot identification system by integration of open source consumer electronics

    NASA Astrophysics Data System (ADS)

    López R., Juan Manuel; Marulanda B., Jose Ignacio

    2014-05-01

    This work presents a prototype of a low-cost gunshot identification system that uses consumer electronics to detect gunshots and then classify them against a previously established database. Deploying this tool in urban areas provides records that support forensic investigation, thereby improving law enforcement, including in developing countries. An analysis of its effectiveness is presented in comparison with theoretical results obtained from numerical simulations.

  6. Left ventricular fluid mechanics: the long way from theoretical models to clinical applications.

    PubMed

    Pedrizzetti, Gianni; Domenichini, Federico

    2015-01-01

    The flow inside the left ventricle is characterized by the formation of vortices that smoothly accompany blood from the mitral inlet to the aortic outlet. Computational fluid dynamics has shed some light on the fundamental processes involved in vortex motion. More recently, patient-specific numerical simulations have become an increasingly feasible tool that can be integrated with developing imaging technologies. The existing computational methods are reviewed from the perspective of their potential role as a novel aid for advanced clinical analysis. The current results obtained by simulation methods, either alone or in combination with medical imaging, are summarized. Open problems are highlighted and prospective clinical applications are discussed.

  7. Design of Low-Complexity and High-Speed Coplanar Four-Bit Ripple Carry Adder in QCA Technology

    NASA Astrophysics Data System (ADS)

    Balali, Moslem; Rezai, Abdalhossein

    2018-07-01

    Quantum-dot Cellular Automata (QCA) technology is a suitable replacement for CMOS technology due to its low power consumption, high speed, and high device density. The full adder plays an important role in digital circuit design. This paper presents and evaluates a novel single-layer four-bit QCA Ripple Carry Adder (RCA) circuit. The developed four-bit QCA RCA circuit is based on a novel QCA full adder circuit. The developed circuits are simulated using the QCADesigner tool, version 2.0.3. The simulation results show that the developed circuits have advantages over existing single-layer and multilayer circuits in terms of cell count, area occupation, and circuit latency.
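
    Although the paper's contribution is the QCA cell layout, the logic being laid out is the classic majority-gate full adder, which is easy to sketch in software (an illustration of the logic only, not of the proposed circuit):

      def maj(a, b, c):
          """Three-input majority vote: the native logic primitive of QCA."""
          return (a & b) | (b & c) | (a & c)

      def full_adder(a, b, cin):
          """Standard construction from three majority gates and two inverters."""
          cout = maj(a, b, cin)
          s = maj(1 - cout, cin, maj(a, b, 1 - cin))
          return s, cout

      def ripple_carry_add(a_bits, b_bits):
          """n-bit RCA: chain the carry; bit lists are least-significant first."""
          carry, out = 0, []
          for a, b in zip(a_bits, b_bits):
              s, carry = full_adder(a, b, carry)
              out.append(s)
          return out, carry

      # 13 + 11 = 24: sum bits 0001 (LSB first) with a final carry of 1.
      print(ripple_carry_add([1, 0, 1, 1], [1, 1, 0, 1]))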

  8. Design of Low-Complexity and High-Speed Coplanar Four-Bit Ripple Carry Adder in QCA Technology

    NASA Astrophysics Data System (ADS)

    Balali, Moslem; Rezai, Abdalhossein

    2018-03-01

    Quantum-dot Cellular Automata (QCA) technology is a suitable replacement for CMOS technology due to its low power consumption, high speed, and high device density. The full adder plays an important role in digital circuit design. This paper presents and evaluates a novel single-layer four-bit QCA Ripple Carry Adder (RCA) circuit. The developed four-bit QCA RCA circuit is based on a novel QCA full adder circuit. The developed circuits are simulated using the QCADesigner tool, version 2.0.3. The simulation results show that the developed circuits have advantages over existing single-layer and multilayer circuits in terms of cell count, area occupation, and circuit latency.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jager, Yetta; Efroymson, Rebecca Ann; Sublette, K.

    Quantitative tools are needed to evaluate the ecological effects of increasing petroleum production. In this article, we describe two stochastic models for simulating the spatial distribution of brine spills on a landscape. One model uses general assumptions about the spatial arrangement of spills and their sizes; the second model distributes spills by siting rectangular well complexes and conditioning spill probabilities on the configuration of pipes. We present maps of landscapes with spills produced by the two methods and compare the ability of the models to reproduce a specified spill area. A strength of the models presented here is their ability to extrapolate from the existing landscape to simulate landscapes with a higher (or lower) density of oil wells.

  10. MetaCRAST: reference-guided extraction of CRISPR spacers from unassembled metagenomes.

    PubMed

    Moller, Abraham G; Liang, Chun

    2017-01-01

    Clustered regularly interspaced short palindromic repeat (CRISPR) systems are the adaptive immune systems of bacteria and archaea against viral infection. While CRISPRs have been exploited as a tool for genetic engineering, their spacer sequences can also provide valuable insights into microbial ecology by linking environmental viruses to their microbial hosts. Despite this importance, metagenomic CRISPR detection remains a major challenge. Here we present a reference-guided CRISPR spacer detection tool (Metagenomic CRISPR Reference-Aided Search Tool, MetaCRAST) that constrains searches based on user-specified direct repeats (DRs). These DRs could be expected from assembly or taxonomic profiles of metagenomes. We compared the performance of MetaCRAST to those of two existing metagenomic CRISPR detection tools, Crass and MinCED, using both real and simulated acid mine drainage (AMD) and enhanced biological phosphorus removal (EBPR) metagenomes. Our evaluation shows MetaCRAST improves CRISPR spacer detection in real metagenomes compared to the de novo CRISPR detection methods Crass and MinCED. Evaluation on simulated metagenomes shows it performs better than de novo tools for Illumina metagenomes and comparably for 454 metagenomes. Its dependence on read length and community composition, its run time, and its accuracy are comparable to those of the other tools. MetaCRAST is implemented in Perl, parallelizable through the Many Core Engine (MCE), and takes metagenomic sequence reads and direct repeat queries (FASTA or FASTQ) as input. It is freely available for download at https://github.com/molleraj/MetaCRAST.
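
    The reference-guided idea is straightforward to sketch: given a user-specified direct repeat, keep the sequence between consecutive copies of the repeat within a read. The exact-match toy below is our illustration, not MetaCRAST's Perl implementation (which is parallelized and far more capable than exact matching):

      def extract_spacers(read, direct_repeat):
          """Candidate spacers between exact DR matches in a single read."""
          hits, start = [], read.find(direct_repeat)
          while start != -1:
              hits.append(start)
              start = read.find(direct_repeat, start + 1)
          n = len(direct_repeat)
          return [read[hits[i] + n:hits[i + 1]] for i in range(len(hits) - 1)]

      # Toy read: two copies of the DR "GTTTT" flank the spacer "ACGTACGT".
      print(extract_spacers("AAGTTTTACGTACGTGTTTTCC", "GTTTT"))
      # -> ['ACGTACGT']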

  11. Simulating the Performance of Ground-Based Optical Asteroid Surveys

    NASA Astrophysics Data System (ADS)

    Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.

    2014-11-01

    We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes), and compared against the actual performance of these surveys in order to validate the model’s performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into “discoverable” and “discovered” subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic “known” subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. These tools can also assist in quantifying the efficiency of novel yet unverified survey cadences (e.g. the baseline LSST cadence) that sparsely spread the observations required for detection over several days or weeks.

  12. Efficient Bayesian mixed model analysis increases association power in large cohorts

    PubMed Central

    Loh, Po-Ru; Tucker, George; Bulik-Sullivan, Brendan K; Vilhjálmsson, Bjarni J; Finucane, Hilary K; Salem, Rany M; Chasman, Daniel I; Ridker, Paul M; Neale, Benjamin M; Berger, Bonnie; Patterson, Nick; Price, Alkes L

    2014-01-01

    Linear mixed models are a powerful statistical tool for identifying genetic associations and avoiding confounding. However, existing methods are computationally intractable in large cohorts, and may not optimize power. All existing methods require time cost O(MN²) (where N = #samples and M = #SNPs) and implicitly assume an infinitesimal genetic architecture in which effect sizes are normally distributed, which can limit power. Here, we present a far more efficient mixed model association method, BOLT-LMM, which requires only a small number of O(MN)-time iterations and increases power by modeling more realistic, non-infinitesimal genetic architectures via a Bayesian mixture prior on marker effect sizes. We applied BOLT-LMM to nine quantitative traits in 23,294 samples from the Women’s Genome Health Study (WGHS) and observed significant increases in power, consistent with simulations. Theory and simulations show that the boost in power increases with cohort size, making BOLT-LMM appealing for GWAS in large cohorts. PMID:25642633
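
    For readers unfamiliar with the underlying model, the association test can be written in a standard mixed model form (textbook notation, not taken from the paper):

      \[
        \mathbf{y} = \mathbf{x}_{\mathrm{test}}\,\beta + \mathbf{g} + \mathbf{e},
        \qquad
        \mathbf{g} \sim \mathcal{N}\!\left(0,\, \sigma_g^2 \mathbf{K}\right),
        \qquad
        \mathbf{e} \sim \mathcal{N}\!\left(0,\, \sigma_e^2 \mathbf{I}_N\right),
      \]

    where K is the N x N genetic relationship matrix built from the M SNPs. Forming and decomposing K is what incurs the O(MN²) and O(N³) costs of earlier methods; BOLT-LMM instead uses a small number of O(MN)-time iterative passes over the genotypes and replaces the single Gaussian prior on SNP effects with a Bayesian mixture prior.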

  13. Computation in Classical Mechanics with Easy Java Simulations (EJS)

    NASA Astrophysics Data System (ADS)

    Cox, Anne J.

    2006-12-01

    Let your students enjoy creating animations and incorporating some computational physics into your Classical Mechanics course. This talk will demonstrate the use of an Open Source Physics package, Easy Java Simulations (EJS), in an already existing sophomore/junior level Classical Mechanics course. EJS allows for incremental introduction of computational physics into existing courses because it is easy to use (for instructors and students alike) and it is open source. Students can use this tool for numerical solutions to problems (as they can with commercial systems: Mathcad and Mathematica), but they can also generate their own animations. For example, students in Classical Mechanics use Lagrangian mechanics to solve a problem, and then use EJS not only to numerically solve the differential equations, but to show the associated motion (and check their answers). EJS, developed by Francisco Esquembre (http://fem.um.es/Ejs/), is built on the OpenSource Physics framework (http://www.opensourcephysics.org/) supported through NSF DUE0442581.
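
    Sketched in Python rather than in EJS/Java, the workflow described above (derive the equation of motion from the Lagrangian, then integrate it numerically) might look as follows for a simple pendulum, our assumed example:

      # From L = (1/2) m l^2 w^2 + m g l cos(theta), the Euler-Lagrange
      # equation gives theta'' = -(g / l) sin(theta); integrate it.
      import numpy as np
      from scipy.integrate import solve_ivp

      g, l = 9.81, 1.0

      def pendulum(t, y):
          theta, omega = y
          return [omega, -(g / l) * np.sin(theta)]

      sol = solve_ivp(pendulum, (0.0, 10.0), [0.5, 0.0])
      print(sol.y[0, -1])  # angle at t = 10 s; plotting sol shows the motion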

  14. OSI Network-layer Abstraction: Analysis of Simulation Dynamics and Performance Indicators

    NASA Astrophysics Data System (ADS)

    Lawniczak, Anna T.; Gerisch, Alf; Di Stefano, Bruno

    2005-06-01

    The Open Systems Interconnection (OSI) reference model provides a conceptual framework for communication among computers in a data communication network. The Network Layer of this model is responsible for the routing and forwarding of packets of data. We investigate the OSI Network Layer and develop an abstraction suitable for the study of various network performance indicators, e.g. throughput, average packet delay, average packet speed, average packet path-length, etc. We investigate how the network dynamics and the network performance indicators are affected by various routing algorithms and by the addition of randomly generated links into a regular network connection topology of fixed size. We observe that the network dynamics is not simply the sum of effects resulting from adding individual links to the connection topology but rather is governed nonlinearly by the complex interactions caused by the existence of all randomly added and already existing links in the network. Data for our study was gathered using Netzwerk-1, a C++ simulation tool that we developed for our abstraction.

  15. MAISIE: a multipurpose astronomical instrument simulator environment

    NASA Astrophysics Data System (ADS)

    O'Brien, Alan; Beard, Steven; Geers, Vincent; Klaassen, Pamela

    2016-07-01

    Astronomical instruments often need simulators to preview their data products and test their data reduction pipelines. Instrument simulators have tended to be purpose-built with a single instrument in mind, and attempting to reuse one of these simulators for a different purpose is often a slow and difficult task. MAISIE is a simulator framework designed for reuse on different instruments. An object-oriented design encourages reuse of functionality and structure, while offering the flexibility to create new classes with new functionality. MAISIE is a set of Python classes, interfaces and tools to help build instrument simulators. MAISIE can just as easily build simulators for single and multi-channel instruments, imagers and spectrometers, ground- and space-based instruments. To remain easy to use and to facilitate the sharing of simulators across teams, MAISIE is written in Python, a freely available and open-source language. New functionality can be created for MAISIE by creating new classes that represent optical elements. This approach allows new and novel instruments to add functionality and take advantage of the existing MAISIE classes. MAISIE has recently been used successfully to develop the simulator for the JWST/MIRI Medium Resolution Spectrometer.
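
    The element-based extension mechanism described above can be caricatured in a few lines (our illustration of the pattern, not MAISIE's actual class names or interfaces):

      # Each optical element transforms the signal; elements compose into an
      # instrument, and subclassing an element adds new functionality.
      class OpticalElement:
          def apply(self, flux):
              raise NotImplementedError

      class Mirror(OpticalElement):
          def __init__(self, reflectivity):
              self.reflectivity = reflectivity
          def apply(self, flux):
              return self.reflectivity * flux

      class Filter(OpticalElement):
          def __init__(self, transmission):
              self.transmission = transmission
          def apply(self, flux):
              return self.transmission * flux

      class Instrument:
          def __init__(self, elements):
              self.elements = elements
          def observe(self, flux):
              for element in self.elements:
                  flux = element.apply(flux)
              return flux

      print(Instrument([Mirror(0.95), Filter(0.6)]).observe(1000.0))  # 570.0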

  16. Acquaintance to Artificial Neural Networks and use of artificial intelligence as a diagnostic tool for tuberculosis: A review.

    PubMed

    Dande, Payal; Samant, Purva

    2018-01-01

    Tuberculosis [TB] has afflicted numerous nations in the world. As per a report by the World Health Organization [WHO], an estimated 1.4 million TB deaths occurred in 2015, with an additional 0.4 million deaths resulting from TB disease among people living with HIV. Most TB deaths could be prevented if the disease were detected at an early stage. The existing diagnostic processes, such as blood tests or sputum tests, are not only tedious but also take a long time to analyse and cannot differentiate between different drug-resistant stages of TB. The need for newer, prompt methods of disease detection has been aided by the latest Artificial Intelligence [AI] tools. The Artificial Neural Network [ANN] is one of the important tools being used widely in the diagnosis and evaluation of medical conditions. This review aims at providing a brief introduction to the various AI tools used in TB detection and gives a detailed description of the utilization of ANNs as an efficient diagnostic technique. The paper also provides a critical assessment of ANNs and the existing techniques for the diagnosis of TB. Researchers and practitioners in the field are looking to ANNs and other upcoming AI tools, such as fuzzy logic, genetic algorithms, and artificial intelligence simulation, as promising current and future technologies for tackling the global menace of tuberculosis. The latest advancements in the diagnostic field include the combined use of ANNs with other AI tools like fuzzy logic, which has led to an increase in the efficacy and specificity of diagnostic techniques.

  17. Simulation Data Management - Requirements and Design Specification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clay, Robert L.; Friedman-Hill, Ernest J.; Gibson, Marcus J.

    Simulation Data Management (SDM), the ability to securely organize, archive, and share analysis models and the artifacts used to create them, is a fundamental requirement for modern engineering analysis based on computational simulation. We have worked separately to provide secure, network SDM services to engineers and scientists at our respective laboratories for over a decade. We propose to leverage our experience and lessons learned to help develop and deploy a next-generation SDM service as part of a multi-laboratory team. This service will be portable across multiple sites and platforms, and will be accessible via a range of command-line tools and well-documented APIs. In this document, we’ll review our high-level and low-level requirements for such a system, review one existing system, and briefly discuss our proposed implementation.

  18. CMacIonize: Monte Carlo photoionisation and moving-mesh radiation hydrodynamics

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, Bert; Wood, Kenneth

    2018-02-01

    CMacIonize simulates the self-consistent evolution of HII regions surrounding young O and B stars, or other sources of ionizing radiation. The code combines a Monte Carlo photoionization algorithm that uses a complex mix of hydrogen, helium and several coolants in order to self-consistently solve for the ionization and temperature balance at any given time, with a standard first order hydrodynamics scheme. The code can be run as a post-processing tool to get the line emission from an existing simulation snapshot, but can also be used to run full radiation hydrodynamical simulations. Both the radiation transfer and the hydrodynamics are implemented in a general way that is independent of the grid structure that is used to discretize the system, allowing it to be run both as a standard fixed grid code and also as a moving-mesh code.

  19. Computer simulation of the human respiratory system for educational purposes.

    PubMed

    Botsis, Taxiarhis; Halkiotis, Stelios-Chris; Kourlaba, Georgia

    2004-01-01

    The main objective of this study was the development of a computer simulation system of the human respiratory system for educating nursing students. The approach was based on existing mathematical models and on specific functions that we constructed ourselves. For the development of this educational tool, appropriate software packages were used according to the specific demands of the process. The system, called ReSim (Respiratory Simulation), consists of two parts: the first part deals with pulmonary volumes and the second represents the mechanical behavior of the lungs. The target group evaluated ReSim. The outcomes of the evaluation process were positive and helped us identify the system characteristics that needed improvement. Our basic conclusion is that the extended use of such systems supports the educational process and offers new potential for learning.
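
    A hedged example of the kind of textbook lung mechanics such a teaching simulator can build on (a single-compartment model with assumed typical values, not ReSim's own equations): airway pressure is the sum of an elastic term and a resistive term.

      # Single-compartment model: P = V / C + R * Q, with compliance C,
      # resistance R, and airflow Q = dV/dt. Typical textbook values.
      C = 0.1   # L per cmH2O, respiratory system compliance
      R = 2.0   # cmH2O per (L/s), airway resistance

      def airway_pressure(volume_above_frc, flow):
          return volume_above_frc / C + R * flow

      # 0.5 L above functional residual capacity at 0.5 L/s inspiratory flow:
      print(airway_pressure(0.5, 0.5))  # -> 6.0 cmH2O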

  20. The plant leaf movement analyzer (PALMA): a simple tool for the analysis of periodic cotyledon and leaf movement in Arabidopsis thaliana.

    PubMed

    Wagner, Lucas; Schmal, Christoph; Staiger, Dorothee; Danisman, Selahattin

    2017-01-01

    The analysis of circadian leaf movement rhythms is a simple yet effective method to study effects of treatments or gene mutations on the circadian clock of plants. Currently, leaf movements are analysed using time-lapse photography and subsequent bioinformatic analysis. Programs used for this purpose either perform only one function (i.e., leaf tip detection or rhythm analysis) or are limited to specific computational environments. We developed a leaf movement analysis tool, PALMA, that works on the command line and combines image extraction with rhythm analysis using Fast Fourier transformation and non-linear least squares fitting. We validated PALMA in both simulated time series and in experiments using the known short-period mutant sensitivity to red light reduced 1 (srr1-1). We compared PALMA with two established leaf movement analysis tools and found it to perform equally well. Finally, we tested the effect of reduced iron conditions on the leaf movement rhythms of wild type plants. Here, we found that PALMA successfully detected period lengthening under reduced iron conditions. PALMA correctly estimated the period of both simulated and real-life leaf movement experiments. As a platform-independent console program that unites both functions needed for the analysis of circadian leaf movements, it is a valid alternative to existing leaf movement analysis tools.
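
    The two analysis stages named above, an FFT period estimate refined by non-linear least-squares fitting, can be sketched generically (this is not PALMA's source code, and the trace below is synthetic):

      import numpy as np
      from scipy.optimize import curve_fit

      def estimate_period(t, y):
          """t in hours, evenly sampled; returns (FFT period, fitted period)."""
          y = y - y.mean()
          freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])
          peak = np.argmax(np.abs(np.fft.rfft(y))[1:]) + 1  # skip the DC bin
          cosine = lambda t, a, f, phi: a * np.cos(2 * np.pi * f * t + phi)
          (a, f, phi), _ = curve_fit(cosine, t, y, p0=[y.std(), freqs[peak], 0.0])
          return 1.0 / freqs[peak], 1.0 / f

      # Synthetic 25.2 h rhythm sampled every 0.5 h for five days, plus noise.
      t = np.arange(0.0, 120.0, 0.5)
      rng = np.random.default_rng(0)
      y = np.cos(2 * np.pi * t / 25.2) + 0.1 * rng.normal(size=t.size)
      print(estimate_period(t, y))  # the coarse FFT bin is refined to ~25.2 h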

  1. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  2. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-04-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  3. A software tool for analyzing multichannel cochlear implant signals.

    PubMed

    Lai, Wai Kong; Bögli, Hans; Dillier, Norbert

    2003-10-01

    A useful and convenient means to analyze the radio frequency (RF) signals being sent by a speech processor to a cochlear implant would be to actually capture and display them with appropriate software. This is particularly useful for development or diagnostic purposes. sCILab (Swiss Cochlear Implant Laboratory) is such a PC-based software tool intended for the Nucleus family of Multichannel Cochlear Implants. Its graphical user interface provides a convenient and intuitive means for visualizing and analyzing the signals encoding speech information. Both numerical and graphic displays are available for detailed examination of the captured CI signals, as well as an acoustic simulation of these CI signals. sCILab has been used in the design and verification of new speech coding strategies, and has also been applied as an analytical tool in studies of how different parameter settings of existing speech coding strategies affect speech perception. As a diagnostic tool, it is also useful for troubleshooting problems with the external equipment of the cochlear implant systems.

  4. DUK - A Fast and Efficient Kmer Based Sequence Matching Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Mingkun; Copeland, Alex; Han, James

    2011-03-21

    A new tool, DUK, was developed to perform the matching task. Matching determines whether or not a query sequence partially or totally matches given reference sequences. Matching is similar to alignment; indeed, many traditional analysis tasks like contaminant removal use alignment tools. But for matching, there is no need to know which bases of a query sequence match which positions of a reference sequence; it is only necessary to know whether a match exists. This subtle difference can make matching much faster than alignment. DUK is accurate, versatile, fast, and has efficient memory usage. It uses a k-mer hashing method to index reference sequences and a Poisson model to calculate p-values. DUK is carefully implemented in C++ with an object-oriented design. The resulting classes can also be used to develop other tools quickly. DUK has been widely used at JGI for a wide range of applications such as contaminant removal, organelle genome separation, and assembly refinement. Many real applications and simulated datasets demonstrate its power.
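
    The k-mer matching idea is simple to sketch: index the reference's k-mers in a hash set and call a match when a query shares enough of them. The fixed count threshold below is a stand-in for DUK's Poisson p-value, and the sketch is ours, not DUK's C++ implementation:

      def kmers(seq, k):
          return {seq[i:i + k] for i in range(len(seq) - k + 1)}

      def matches(read, reference_index, k, min_hits=1):
          """True if the read shares at least min_hits k-mers with the index."""
          hits = sum(1 for km in kmers(read, k) if km in reference_index)
          return hits >= min_hits

      ref_index = kmers("ACGTACGTGGCCTTAA", k=8)
      print(matches("TTACGTACGTGG", ref_index, k=8))  # shared k-mers -> True
      print(matches("TTTTTTTTTTTT", ref_index, k=8))  # none shared  -> False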

  5. AN OVERVIEW OF REDUCED ORDER MODELING TECHNIQUES FOR SAFETY APPLICATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, D.; Alfonsi, A.; Talbot, P.

    2016-10-01

    The RISMC project is developing new advanced simulation-based tools to perform Computational Risk Analysis (CRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermal-hydraulic behavior of the reactor's primary and secondary systems, but also external event temporal evolution and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, seconds-hours-years). As part of the RISMC CRA approach, a large number of computationally expensive simulation runs may be required. An important aspect is that even though computational power is growing, the overall computational cost of a RISMC analysis using brute-force methods may not be viable for certain cases. A solution being evaluated to address the computational issue is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs; for this analysis improvement we used surrogate models instead of the actual simulation codes. This article focuses on the use of reduced order modeling techniques that can be applied to RISMC analyses in order to generate, analyze, and visualize data. In particular, we focus on surrogate models that approximate the simulation results in a much shorter time (microseconds instead of hours or days).
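
    The surrogate idea can be illustrated in a few lines, with a cheap toy function standing in for the expensive simulation code and a polynomial fit standing in for the reduced order modeling techniques the report evaluates:

      import numpy as np

      def expensive_code(x):
          """Stand-in for an hours-long simulation run."""
          return np.exp(-x) * np.sin(3.0 * x)

      x_train = np.linspace(0.0, 2.0, 12)   # a handful of costly runs
      y_train = expensive_code(x_train)
      surrogate = np.polynomial.Polynomial.fit(x_train, y_train, deg=6)

      # Query the surrogate instead of rerunning the code: close, and cheap.
      print(surrogate(1.37), expensive_code(1.37))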

  6. Application of the Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The aircraft engine design process seeks to achieve the best overall system-level performance, weight, and cost for a given engine design. This is achieved by a complex process known as systems analysis, where steady-state simulations are used to identify trade-offs that should be balanced to optimize the system. The steady-state simulations and data on which systems analysis relies may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic Systems Analysis provides the capability for assessing these trade-offs at an earlier stage of the engine design process. The concept of dynamic systems analysis and the type of information available from this analysis are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed. This tool aids a user in the design of a power management controller to regulate thrust, and a transient limiter to protect the engine model from surge at a single flight condition (defined by an altitude and Mach number). Results from simulation of the closed-loop system may be used to estimate the dynamic performance of the model. This enables evaluation of the trade-off between performance and operability, or safety, in the engine, which could not be done with steady-state data alone. A design study is presented to compare the dynamic performance of two different engine models integrated with the TTECTrA software.
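
    The two pieces TTECTrA helps design, a power-management controller and a transient limiter, can be caricatured with a PI loop whose command is clipped at a protection limit. This is our generic stand-in with assumed gains and plant, not the tool's implementation:

      def pi_with_limiter(error, integral, kp, ki, dt, u_max):
          """One PI update; the limiter clips the command (with anti-windup)."""
          integral += error * dt
          u = kp * error + ki * integral
          if u > u_max:
              u = u_max
              integral -= error * dt   # stop integrating while limited
          return u, integral

      # Regulate a first-order plant y' = (u - y) / tau toward setpoint 1.0.
      y, integral, tau, dt = 0.0, 0.0, 0.5, 0.01
      for _ in range(1000):
          u, integral = pi_with_limiter(1.0 - y, integral,
                                        kp=2.0, ki=4.0, dt=dt, u_max=1.5)
          y += dt * (u - y) / tau
      print(round(y, 3))  # settles near 1.0; u never exceeded the 1.5 limit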

  7. Nuclear Power Plant Cyber Security Discrete Dynamic Event Tree Analysis (LDRD 17-0958) FY17 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wheeler, Timothy A.; Denman, Matthew R.; Williams, R. A.

    Instrumentation and control of nuclear power is transforming from analog to modern digital assets. These control systems perform key safety and security functions. This transformation is occurring in new plant designs as well as in the existing fleet of plants as the operation of those plants is extended to 60 years. It introduces new and unknown issues involving both digital-asset-induced safety issues and security issues. Traditional nuclear power risk assessment tools and cyber security assessment methods have not been modified or developed to address the unique nature of cyber failure modes and cyber security threat vulnerabilities. This Lab-Directed Research and Development project has developed a dynamic cyber-risk informed tool to facilitate the analysis of unique cyber failure modes and the time sequencing of cyber faults, both malicious and non-malicious, and to impose those cyber exploits and cyber faults onto a nuclear power plant accident sequence simulator code to assess how cyber exploits and cyber faults could interact with a plant's digital instrumentation and control (DI&C) system and defeat or circumvent a plant's cyber security controls. This was achieved by coupling an existing Sandia National Laboratories nuclear accident dynamic simulator code with a cyber emulytics code to demonstrate real-time simulation of cyber exploits and their impact on automatic DI&C responses. Studying such potential time-sequenced cyber-attacks and their risks (i.e., the associated impact and the associated degree of difficulty in achieving the attack vector) on accident management establishes a technical, risk-informed framework for developing effective cyber security controls for nuclear power.

  8. Injury representation against ballistic threats using three novel numerical models.

    PubMed

    Breeze, Johno; Fryer, R; Pope, D; Clasper, J

    2017-06-01

    Injury modelling of ballistic threats is a valuable tool for informing policy on personal protective equipment and other injury mitigation methods. Currently, the Ministry of Defence (MoD) and the Centre for the Protection of National Infrastructure (CPNI) are focusing on the development of three interlinked numerical models, each of a different fidelity, to answer specific questions on current threats. High-fidelity models simulate the physical events most realistically and will be used in the future to test the medical effectiveness of personal armour systems. They are, however, generally computationally intensive and slow running, and much of the experimental data on which to base their algorithms does not yet exist. Medium-fidelity models, such as the personnel vulnerability simulation (PVS), generally use algorithms based on physical or engineering estimations of interaction. This enables a reasonable representation of reality and greatly speeds up runtime, allowing full assessments of the entire body area to be undertaken. Low-fidelity models such as the human injury predictor (HIP) tool generally use simplistic algorithms to make injury predictions. Individual scenarios can be run very quickly, enabling statistical casualty assessments of large groups where significant uncertainty concerning the threat and affected population exists. HIP is used to simulate the blast and penetrative fragmentation effects of a terrorist detonation of an improvised explosive device within crowds of people in metropolitan environments. This paper describes the collaboration between the MoD and CPNI using examples of all three fidelities of injury model and highlights future areas of research that are required. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  9. A flexible, interactive software tool for fitting the parameters of neuronal models.

    PubMed

    Friedrich, Péter; Vella, Michael; Gulyás, Attila I; Freund, Tamás F; Káli, Szabolcs

    2014-01-01

    The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problems of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential integrate-and-fire) neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting tool.
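
    The core loop that Optimizer wraps (simulate, compare to a target trace via a cost function, let a nonlinear optimizer adjust the parameters) can be sketched with generic tools. The leaky-membrane model and the SciPy call below are illustrative stand-ins, not Optimizer's interface to NEURON:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical stand-in for a neuron simulation: a leaky membrane's
    # voltage response to a current step, parameterized by (g_leak, C_m).
    def run_model(params, t, i_inj=0.1, e_leak=-70.0):
        g_leak, c_m = params
        v = np.empty_like(t)
        v[0] = e_leak
        dt = t[1] - t[0]
        for k in range(1, len(t)):
            dv = (-g_leak * (v[k - 1] - e_leak) + i_inj) / c_m
            v[k] = v[k - 1] + dt * dv
        return v

    t = np.linspace(0.0, 100.0, 1001)            # ms
    target = run_model((0.01, 1.0), t)           # surrogate "experimental" trace
    target += np.random.normal(0.0, 0.2, t.size)

    # Cost: mean squared error between model output and target trace.
    cost = lambda p: np.mean((run_model(p, t) - target) ** 2)

    fit = minimize(cost, x0=(0.05, 0.5), method="Nelder-Mead")
    print("fitted g_leak=%.4f, C_m=%.3f" % tuple(fit.x))
    ```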

  10. Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danon, Yaron; Nazarewicz, Witold; Talou, Patrick

    2013-02-18

    This project addresses three important gaps in existing evaluated nuclear data libraries that represent a significant hindrance to highly advanced modeling and simulation capabilities for the Advanced Fuel Cycle Initiative (AFCI). This project will: Develop advanced theoretical tools to compute prompt fission neutron and gamma-ray characteristics well beyond average spectra and multiplicity, and produce new evaluated files of U and Pu isotopes, along with some minor actinides; Perform state-of-the-art fission cross-section modeling and calculations using global and microscopic model input parameters, leading to truly predictive fission cross-section capabilities (consistent calculations for a suite of Pu isotopes will be performed); Implement innovative data assimilation tools, which will reflect the nuclear data evaluation process much more accurately and lead to a new generation of uncertainty quantification files (new covariance matrices will be obtained for Pu isotopes and compared to existing ones). The deployment of a fleet of safe and efficient advanced reactors that minimize radiotoxic waste and are proliferation-resistant is a clear and ambitious goal of AFCI. While in the past the design, construction and operation of a reactor were supported through empirical trials, this new phase in nuclear energy production is expected to rely heavily on advanced modeling and simulation capabilities. To be truly successful, a program for advanced simulations of innovative reactors will have to develop advanced multi-physics capabilities, to be run on massively parallel supercomputers, and to incorporate adequate and precise underlying physics; all of these areas have to be developed simultaneously to achieve those ambitious goals. Of particular interest are reliable fission cross-section uncertainty estimates (including important correlations) and evaluations of prompt fission neutron and gamma-ray spectra and uncertainties.

  11. A flexible, interactive software tool for fitting the parameters of neuronal models

    PubMed Central

    Friedrich, Péter; Vella, Michael; Gulyás, Attila I.; Freund, Tamás F.; Káli, Szabolcs

    2014-01-01

    The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problems of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential integrate-and-fire) neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting tool. PMID:25071540

  12. Simulating Pressure Profiles for the Free-Electron Laser Photoemission Gun Using Molflow+

    NASA Astrophysics Data System (ADS)

    Song, Diego; Hernandez-Garcia, Carlos

    2012-10-01

    The Jefferson Lab Free Electron Laser (FEL) generates tunable laser light by passing a relativistic electron beam, generated in a high-voltage DC electron gun with a semiconducting photocathode, through a magnetic undulator. The electron gun is kept under stringent vacuum conditions in order to guarantee photocathode longevity. In view of an upgrade of the electron gun, this project consists of simulating pressure profiles to determine whether the novel design meets the electron gun vacuum requirements. The simulations employ the software Molflow+, developed by R. Kersevan at the Organisation Européenne pour la Recherche Nucléaire (CERN), which uses the test-particle Monte Carlo method to simulate molecular flows in 3D structures. Pressure is obtained along specified chamber axes. Results are then compared to measured pressure values from the existing gun for validation. Outgassing rates, surface area, and pressure were found to be proportionally related. The simulations indicate that the upgrade gun vacuum chamber requires more pumping than its predecessor, while it holds similar vacuum conditions. The ability to simulate pressure profiles with tools like Molflow+ allows researchers to optimize vacuum systems during the engineering process.
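
    The test-particle Monte Carlo method named here can be illustrated on the textbook case of free molecular flow through a cylindrical tube, tracing particles between diffuse (cosine-law) wall bounces. This is a minimal self-contained sketch, not Molflow+ code; for L/R = 1 it should approach the classical Clausing transmission factor of roughly 0.67:

    ```python
    import numpy as np
    rng = np.random.default_rng(1)

    def cosine_dir(normal):
        """Sample a direction with a cosine (Lambertian) distribution about `normal`."""
        u, phi = rng.random(), 2 * np.pi * rng.random()
        ct, st = np.sqrt(u), np.sqrt(1 - u)
        a = np.array([1.0, 0, 0]) if abs(normal[0]) < 0.9 else np.array([0, 1.0, 0])
        t1 = np.cross(normal, a); t1 /= np.linalg.norm(t1)
        t2 = np.cross(normal, t1)
        return ct * normal + st * (np.cos(phi) * t1 + np.sin(phi) * t2)

    def transmission(R=1.0, L=1.0, n=20000):
        passed = 0
        for _ in range(n):
            r, ang = R * np.sqrt(rng.random()), 2 * np.pi * rng.random()
            p = np.array([r * np.cos(ang), r * np.sin(ang), 0.0])
            d = cosine_dir(np.array([0.0, 0.0, 1.0]))   # enter at z=0
            while True:
                # distance to the cylindrical wall (clamp c for roundoff)
                a = d[0]**2 + d[1]**2
                b = 2 * (p[0] * d[0] + p[1] * d[1])
                c = min(p[0]**2 + p[1]**2 - R**2, 0.0)
                t_wall = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a) if a > 0 else np.inf
                t_end = (L - p[2]) / d[2] if d[2] > 0 else (-p[2] / d[2] if d[2] < 0 else np.inf)
                if t_end < t_wall:            # leaves through an end plane
                    passed += d[2] > 0        # count only exits at z=L
                    break
                p = p + t_wall * d            # diffuse re-emission at the wall
                d = cosine_dir(np.array([-p[0], -p[1], 0.0]) / R)
        return passed / n

    print("Clausing factor for L/R=1: %.3f" % transmission())  # ~0.67 expected
    ```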

  13. PERFORM 60 - Prediction of the effects of radiation for reactor pressure vessel and in-core materials using multi-scale modelling - 60 years foreseen plant lifetime

    NASA Astrophysics Data System (ADS)

    Leclercq, Sylvain; Lidbury, David; Van Dyck, Steven; Moinereau, Dominique; Alamo, Ana; Mazouzi, Abdou Al

    2010-11-01

    In nuclear power plants, materials may undergo degradation due to severe irradiation conditions that may limit their operational life. Utilities that operate these reactors need to quantify the ageing and potential degradation of essential structures of the power plant to ensure safe and reliable plant operation. So far, the material databases needed to account for these degradations in the design and safe operation of installations rely mainly on long-term irradiation programs in test reactors, as well as on mechanical or corrosion testing in specialized hot cells. Continuous progress in the physical understanding of the phenomena involved in irradiation damage, together with continuous progress in computer sciences, has now made possible the development of multi-scale numerical tools able to simulate the effects of irradiation on material microstructures. A first step towards this goal was successfully reached through the development of the RPV-2 and Toughness Module numerical tools by the scientific community created around the FP6 PERFECT project. These tools make it possible to simulate irradiation effects on the constitutive behaviour of the reactor pressure vessel low alloy steel, and also on its failure properties. Building on the existing PERFECT Roadmap, the 4-year Collaborative Project PERFORM 60 has as its main objective the development of multi-scale tools aimed at predicting the combined effects of irradiation and corrosion on internals (austenitic stainless steels), as well as the improvement of existing tools for the RPV (bainitic steels). PERFORM 60 is based on two technical sub-projects: (i) RPV and (ii) internals. In addition to these technical sub-projects, the Users' Group and Training sub-project shall allow representatives of constructors, utilities, research organizations… from Europe, the USA and Japan to receive the information and training needed to form their own appraisal of the limits and potential of the developed tools. An important effort will also be made to train young researchers in the field of materials degradation. PERFORM 60 officially started on March 1st, 2009, with 20 European organizations and universities involved in the nuclear field.

  14. Atmospheric extinction in simulation tools for solar tower plants

    NASA Astrophysics Data System (ADS)

    Hanrieder, Natalie; Wilbert, Stefan; Schroedter-Homscheidt, Marion; Schnell, Franziska; Guevara, Diana Mancera; Buck, Reiner; Giuliano, Stefano; Pitz-Paal, Robert

    2017-06-01

    Atmospheric extinction causes significant radiation losses between the heliostat field and the receiver in solar tower plants. These losses vary with site and time. The state of the art is that, in ray-tracing and plant optimization tools, atmospheric extinction is included by choosing between a few constant standard atmospheric conditions. Even though some tools allow the consideration of site- and time-dependent extinction data, such data sets are almost never available. This paper summarizes and compares the most common model equations implemented in several ray-tracing tools. Several methods to measure extinction on-site have also been developed and published; an overview of the existing methods is given here. Ray-tracing simulations of an exemplary tower plant at the Plataforma Solar de Almería (PSA) are presented to estimate the plant yield deviations between simulations using standard model equations and simulations using extinction time series. For PSA, the effect of atmospheric extinction accounts for losses between 1.6 and 7%; this range depends on whether overload dumping is considered. Applying standard clear or hazy model equations instead of extinction time series leads to an underestimation of the annual plant yield at PSA. Any discussion of the effect of extinction in tower plants has to include overload dumping, since situations in which overload dumping occurs are mostly connected to high radiation levels and low atmospheric extinction. Project developers should therefore consider site- and time-dependent extinction data, especially at hazy sites. A reduced uncertainty of the plant yield prediction can significantly reduce costs through smaller risk margins for financing and EPCs. The generation of extinction data for several locations, in the form of representative yearly time series or geographical maps, should be further elaborated.
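
    As a hedged illustration of what a site- and time-dependent extinction correction does, the sketch below attenuates each heliostat's contribution with a simple Beer-Lambert term over its slant range to the receiver. The layout and the broadband extinction coefficient are invented placeholders, not the model equations compared in the paper:

    ```python
    import numpy as np

    def field_transmittance(xy, tower_height=100.0, beta=2e-4):
        """Beer-Lambert transmittance per heliostat; beta is a broadband
        extinction coefficient in 1/m (illustrative value, in practice
        tabulated per site and time from measurements)."""
        d = np.sqrt(xy[:, 0]**2 + xy[:, 1]**2 + tower_height**2)  # slant range, m
        return np.exp(-beta * d)

    rng = np.random.default_rng(0)
    xy = rng.uniform(-800, 800, size=(1000, 2))   # toy heliostat layout, m
    tau = field_transmittance(xy)
    print("mean attenuation loss: %.1f %%" % (100 * (1 - tau.mean())))
    ```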

  15. Facilitating hydrological data analysis workflows in R: the RHydro package

    NASA Astrophysics Data System (ADS)

    Buytaert, Wouter; Moulds, Simon; Skoien, Jon; Pebesma, Edzer; Reusser, Dominik

    2015-04-01

    The advent of new technologies such as web services and big data analytics holds great promise for hydrological data analysis and simulation. Driven by the need for better water management tools, it allows for the construction of much more complex workflows that integrate more, and potentially more heterogeneous, data sources with longer tool chains of algorithms and models. With the scientific challenge of designing the most adequate processing workflow comes the technical challenge of implementing the workflow with minimal risk of errors. A wide variety of new workbench technologies and other data handling systems are being developed. At the same time, the functionality of available data processing languages such as R and Python is increasing at an accelerating pace. Because of the large diversity of scientific questions and simulation needs in hydrology, it is unlikely that a single optimal method for constructing hydrological data analysis workflows will emerge. Nevertheless, languages such as R and Python are quickly gaining popularity because they combine a wide array of functionality with high flexibility and versatility. The object-oriented nature of high-level data processing languages makes them particularly suited to handling complex and potentially large datasets. In this paper, we explore how the handling and processing of hydrological data in R can be facilitated further by designing and implementing a set of relevant classes and methods in the experimental R package RHydro. We build upon existing efforts such as the sp and raster packages for spatial data and the spacetime package for spatiotemporal data to define classes for hydrological data (HydroST). To handle simulation data from hydrological models conveniently, an HM class is defined. Relevant methods are implemented to allow for optimal integration of the HM class with existing model fitting and simulation functionality in R. Lastly, we discuss some of the design challenges of the RHydro package, including integration with big data technologies, web technologies, and emerging data models in hydrology.

  16. DIMM-SC: a Dirichlet mixture model for clustering droplet-based single cell transcriptomic data.

    PubMed

    Sun, Zhe; Wang, Ting; Deng, Ke; Wang, Xiao-Feng; Lafyatis, Robert; Ding, Ying; Hu, Ming; Chen, Wei

    2018-01-01

    Single cell transcriptome sequencing (scRNA-Seq) has become a revolutionary tool to study cellular and molecular processes at single cell resolution. Among existing technologies, the recently developed droplet-based platform enables efficient parallel processing of thousands of single cells with direct counting of transcript copies using Unique Molecular Identifier (UMI). Despite the technology advances, statistical methods and computational tools are still lacking for analyzing droplet-based scRNA-Seq data. Particularly, model-based approaches for clustering large-scale single cell transcriptomic data are still under-explored. We developed DIMM-SC, a Dirichlet Mixture Model for clustering droplet-based Single Cell transcriptomic data. This approach explicitly models UMI count data from scRNA-Seq experiments and characterizes variations across different cell clusters via a Dirichlet mixture prior. We performed comprehensive simulations to evaluate DIMM-SC and compared it with existing clustering methods such as K-means, CellTree and Seurat. In addition, we analyzed public scRNA-Seq datasets with known cluster labels and in-house scRNA-Seq datasets from a study of systemic sclerosis with prior biological knowledge to benchmark and validate DIMM-SC. Both simulation studies and real data applications demonstrated that overall, DIMM-SC achieves substantially improved clustering accuracy and much lower clustering variability compared to other existing clustering methods. More importantly, as a model-based approach, DIMM-SC is able to quantify the clustering uncertainty for each single cell, facilitating rigorous statistical inference and biological interpretations, which are typically unavailable from existing clustering methods. DIMM-SC has been implemented in a user-friendly R package with a detailed tutorial available on www.pitt.edu/∼wec47/singlecell.html. wei.chen@chp.edu or hum@ccf.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
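
    The modeling idea described here, scoring each cell's UMI count vector under per-cluster Dirichlet-multinomial components, can be sketched as the E-step of such a mixture. This is an illustrative Python sketch on synthetic data, not the published R implementation:

    ```python
    import numpy as np
    from scipy.special import gammaln

    def dirmult_loglik(x, alpha):
        """Dirichlet-multinomial log-likelihood of count vector x under alpha."""
        A, N = alpha.sum(), x.sum()
        return (gammaln(A) - gammaln(A + N)
                + np.sum(gammaln(x + alpha) - gammaln(alpha)))

    def responsibilities(X, alphas, weights):
        """E-step: posterior cluster probabilities for each cell (row of X)."""
        ll = np.array([[np.log(w) + dirmult_loglik(x, a)
                        for a, w in zip(alphas, weights)] for x in X])
        ll -= ll.max(axis=1, keepdims=True)        # stabilize before exp
        post = np.exp(ll)
        return post / post.sum(axis=1, keepdims=True)

    rng = np.random.default_rng(0)
    G, K = 50, 2
    alphas = [rng.gamma(2.0, 1.0, G) for _ in range(K)]        # cluster profiles
    cells = np.array([rng.multinomial(1000, rng.dirichlet(alphas[k]))
                      for k in rng.integers(0, K, 200)])       # synthetic UMI counts
    post = responsibilities(cells, alphas, weights=[0.5, 0.5])
    print("hard labels of first 10 cells:", post.argmax(axis=1)[:10])
    ```

    The soft posterior per cell is the quantity that lets a model-based method report clustering uncertainty, which distance-based methods like K-means do not provide.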

  17. Modeling biochemical transformation processes and information processing with Narrator

    PubMed Central

    Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-01-01

    Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with, and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation, and biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. The combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as a Java software program and available as open source from . PMID:17389034

  18. Automating approximate Bayesian computation by local linear regression.

    PubMed

    Thornton, Kevin R

    2009-07-07

    In several biological contexts, parameter inference often relies on computationally intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC, based on using a linear regression to approximate the posterior distribution of the parameters conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: 1. The code is standalone and fully documented. 2. The program will automatically process multiple data sets and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data, or the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and is therefore a general tool for processing the results from any simulation. 6. The code is open-source and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and of testing the procedure on simulated data, are shown. In practice, ABCreg simplifies the implementation of ABC based on local linear regression.
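
    The regression-adjustment step that ABCreg automates is compact enough to sketch: simulate from the prior, keep the simulations whose summaries fall nearest the observed ones, regress parameters on summaries, and shift the accepted draws to the observed summary (theta* = theta - b(s - s_obs)). The sketch below is a plain least-squares version on a toy problem, without ABCreg's transformations or smooth weighting:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    s_obs = np.array([1.75])                         # observed summary statistic

    # 1. Simulate: draw parameters from the prior, compute noisy summaries.
    theta = rng.uniform(0.0, 5.0, 100_000)
    s = (theta + rng.normal(0.0, 0.3, theta.size)).reshape(-1, 1)

    # 2. Reject: keep the simulations whose summaries are nearest s_obs.
    keep = np.argsort(np.abs(s - s_obs).ravel())[:1000]
    th, sk = theta[keep], s[keep]

    # 3. Regress theta on the summaries, then adjust the accepted draws:
    #    theta* = theta - b * (s - s_obs)
    X = np.hstack([np.ones((sk.shape[0], 1)), sk - s_obs])
    coef, *_ = np.linalg.lstsq(X, th, rcond=None)
    theta_adj = th - (sk - s_obs).ravel() * coef[1]

    print("rejection-only posterior mean %.3f, regression-adjusted mean %.3f"
          % (th.mean(), theta_adj.mean()))
    ```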

  19. Simulated Seasonal Spatio-Temporal Patterns of Soil Moisture, Temperature, and Net Radiation in a Deciduous Forest

    NASA Technical Reports Server (NTRS)

    Ballard, Jerrell R., Jr.; Howington, Stacy E.; Cinnella, Pasquale; Smith, James A.

    2011-01-01

    The temperature and moisture regimes in a forest are key components of forest ecosystem dynamics. Observations and studies indicate that the internal temperature distribution and moisture content of a tree influence not only growth and development, but also the onset and cessation of cambial activity [1], resistance to insect predation [2], and even the population dynamics of the insects [3]. Moreover, temperature directly affects the uptake and metabolism of pollutants from the soil into the tree tissue [4]. Additional studies show that soil and atmospheric temperatures are significant parameters that limit the growth of trees and impose treeline elevation limits [5]. Directional thermal infrared radiance effects have long been observed in natural backgrounds [6]. In earlier work, we illustrated the use of physically based models to simulate directional effects in thermal imaging [7-8]. In this paper, we illustrate the use of physically based models to simulate directional effects in thermal and net radiation in a deciduous forest using our recently developed three-dimensional, macro-scale computational tool that simulates heat and mass transfer interactions in a soil-root-stem system (SRSS). The SRSS model couples existing heat and mass transport tools to simulate the diurnal internal and external temperatures, internal fluid flow and moisture distribution, and heat flow in the system.

  20. Sensitivity Observing System Experiment (SOSE)-a new effective NWP-based tool in designing the global observing system

    NASA Astrophysics Data System (ADS)

    Marseille, Gert-Jan; Stoffelen, Ad; Barkmeijer, Jan

    2008-03-01

    Lacking an established methodology to test the potential impact of prospective extensions to the global observing system (GOS) in real atmospheric cases, we developed such a method, called the Sensitivity Observing System Experiment (SOSE). For example, since the GOS is non-uniform, it is of interest to investigate the benefit of complementary observing systems filling its gaps. In a SOSE, adjoint sensitivity structures are used to define a pseudo-true atmospheric state for the simulation of the prospective observing system. Next, the synthetic observations are used together with real observations from the existing GOS in a state-of-the-art Numerical Weather Prediction (NWP) model to assess the potential added value of the new observing system. Unlike full observing system simulation experiments (OSSE), SOSE can be applied to real extreme events that were badly forecast operationally, and it only requires the simulation of the new instrument. As such, SOSE is an effective tool, for example, to define observation requirements for extensions to the GOS. These observation requirements may serve as input for the design of an operational network of prospective observing systems. In a companion paper we use SOSE to simulate potential future spaceborne Doppler Wind Lidar (DWL) scenarios and assess their capability to sample meteorologically sensitive areas not well captured by the current GOS, in particular over the Northern Hemisphere oceans.

  1. Computational modeling of muscular thin films for cardiac repair

    NASA Astrophysics Data System (ADS)

    Böl, Markus; Reese, Stefanie; Parker, Kevin Kit; Kuhl, Ellen

    2009-03-01

    Motivated by recent success in growing biohybrid material from engineered tissues on synthetic polymer films, we derive a computational simulation tool for muscular thin films in cardiac repair. In this model, the polydimethylsiloxane base layer is simulated in terms of microscopically motivated tetrahedral elements. Their behavior is characterized through a volumetric contribution and a chain contribution that explicitly accounts for the polymeric microstructure of networks of long chain molecules. Neonatal rat ventricular cardiomyocytes cultured on these polymeric films are modeled with actively contracting truss elements located on top of the sheet. The force-stretch response of these trusses is motivated by the cardiomyocyte force generated during active contraction as suggested by the filament sliding theory. In contrast to existing phenomenological models, all material parameters of this novel model have a clear biophysical interpretation. The predictive features of the model are demonstrated through the simulation of muscular thin films. First, the set of parameters is fitted for one particular experiment documented in the literature. This parameter set is then used to validate the model against various different experiments. Finally, we give an outlook on how the proposed simulation tool could be used to virtually predict the response of multi-layered muscular thin films. These three-dimensional constructs show tremendous regenerative potential in the repair of damaged cardiac tissue. The ability to understand, tune and optimize their structural response is thus of great interest in cardiovascular tissue engineering.

  2. Separator Reconnection at Earth's Dayside Magnetopause and the Tail: MMS Observations Compared to Global 3D Simulations

    NASA Astrophysics Data System (ADS)

    Buzulukova, N.; Dorelli, J.; Glocer, A.

    2017-12-01

    We present the results of global high-resolution resistive magnetohydrodynamics (MHD, BATS-R-US) simulations of Earth's magnetosphere. We extract the location of magnetic separators with the RECONX tool and compare the results with observations from the Magnetospheric Multiscale (MMS) mission. Several cases are analysed, including a southward IMF magnetopause crossing during October 16, 2015 that was previously identified as an electron diffusion region (EDR) event. The simulation predicts a complex time-dependent magnetic topology consisting of multiple separators and flux ropes. Despite the topological complexity, the predicted distance between MMS and the primary separator is less than 0.5 Earth radii. The simulation shows that the existence of IMF Bx results in a duskward shift of the location of the topological separator. The results are explained by the combined effect of solar wind draping and pile-up, which modifies the current density across the magnetopause and affects the location of the separator. The RECONX tool is also used to extract the separator location in the geomagnetic tail, and to relate transient tail structures (bursty bulk flows) to the location of the separator. These results suggest that global magnetic topology, rather than local magnetic geometry alone, determines the location of separator reconnection both at the dayside magnetopause and in the tail. We show that the resistive MHD model helps to understand the global context of local MMS observations.

  3. An approach to value-based simulator selection: The creation and evaluation of the simulator value index tool.

    PubMed

    Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M

    2018-04-01

    Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation of and procurement processes for simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified the top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and assessed the practical value of the SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity to predict a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. Full SVI variations successfully predicted the preferred simulator with good (87%) sensitivity, whereas the sensitivity of reduced variations (cost plus customer service, and cost plus technical stability) decreased (≤54%). The majority (73%) of participants agreed that the SVI was helpful at guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the simulator purchase process with a standardized framework. Sensitivity of the tool improved when factors extended beyond those traditionally targeted. We propose that the tool will facilitate discussion amongst simulation professionals, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and applications of the tool are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
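
    At its core, a value index of this kind reduces to a weighted score over ranked purchase factors. The sketch below shows only that mechanic; the factor names, weights, and scores are invented for illustration, not the published 17-factor instrument:

    ```python
    # Hypothetical factors and weights (the real SVI defines 17 expert-ranked factors).
    factors = {"technical stability": 0.30, "customer service": 0.25,
               "cost": 0.25, "curriculum fit": 0.20}
    candidates = {
        "Simulator A": {"technical stability": 4, "customer service": 3,
                        "cost": 2, "curriculum fit": 5},
        "Simulator B": {"technical stability": 5, "customer service": 4,
                        "cost": 3, "curriculum fit": 3},
    }

    def value_index(scores):
        """Weighted sum of 1-5 factor scores; higher is better."""
        return sum(weight * scores[name] for name, weight in factors.items())

    for name, scores in sorted(candidates.items(), key=lambda kv: -value_index(kv[1])):
        print("%s: SVI-style score %.2f" % (name, value_index(scores)))
    ```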

  4. Fault Diagnosis System of Wind Turbine Generator Based on Petri Net

    NASA Astrophysics Data System (ADS)

    Zhang, Han

    Petri nets are an important tool for modeling and analyzing discrete event dynamic systems, with a strong ability to handle concurrent and non-deterministic phenomena. To date, however, Petri nets used in wind turbine fault diagnosis have not been deployed in an actual system. This article combines existing fuzzy Petri net algorithms, builds a wind turbine control system simulation based on a Siemens S7-1200 PLC, and provides a MATLAB GUI interface so that the system can be migrated to different platforms.

  5. Launch team training system

    NASA Technical Reports Server (NTRS)

    Webb, J. T.

    1988-01-01

    A new approach to the training, certification, recertification, and proficiency maintenance of the Shuttle launch team is proposed. Previous training approaches are first reviewed. Short-term program goals include expanding current training methods, improving the existing simulation capability, and scheduling training exercises with the same priority as hardware tests. Long-term goals include developing user requirements which would take advantage of state-of-the-art tools and techniques. Training requirements for the different groups of people to be trained are identified, and future goals are outlined.

  6. Engine dynamic analysis with general nonlinear finite element codes

    NASA Technical Reports Server (NTRS)

    Adams, M. L.; Padovan, J.; Fertis, D. G.

    1991-01-01

    A general engine dynamic analysis, intended as a standard design-study computational tool, is described for the prediction and understanding of complex engine dynamic behavior. Improved definition of engine dynamic response provides valuable information and insights leading to reduced maintenance and overhaul costs on existing engine configurations. Application of advanced engine dynamic simulation methods provides a considerable cost reduction in the development of new engine designs by eliminating some of the trial-and-error process of engine hardware development.

  7. Research in Computational Astrobiology

    NASA Technical Reports Server (NTRS)

    Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.

    2003-01-01

    We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple-time-step algorithms, statistical methods for the analysis of astrophysical data via optimal partitioning methods, electronic structure calculations on water-nucleic acid complexes, incorporation of structural information into genomic sequence analysis methods, and calculations of shock-induced formation of polycyclic aromatic hydrocarbon compounds.

  8. Modeling Methodologies for Design and Control of Solid Oxide Fuel Cell APUs

    NASA Astrophysics Data System (ADS)

    Pianese, C.; Sorrentino, M.

    2009-08-01

    Among the existing fuel cell technologies, Solid Oxide Fuel Cells (SOFC) are particularly suitable for both stationary and mobile applications, due to their high energy conversion efficiencies, modularity, high fuel flexibility, and low emissions and noise. Moreover, their high working temperatures enable efficient cogeneration applications. SOFCs are entering a pre-industrial era, and interest in design tools has grown strongly in recent years. Optimal system configuration, component sizing, and control and diagnostic system design require computational tools that meet the conflicting needs of accuracy, affordable computational time, limited experimental effort, and flexibility. This paper gives an overview of control-oriented modeling of SOFCs at both the single-cell and stack level. Such an approach provides useful simulation tools for designing and controlling SOFC APUs intended for a wide range of applications, from automotive to marine and airplane APUs.
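
    A minimal example of the lumped, control-oriented modeling this overview discusses is a steady-state polarization curve: cell voltage as the open-circuit potential minus activation, ohmic, and concentration losses. All parameter values below are illustrative placeholders, not validated stack data:

    ```python
    import numpy as np

    R, F, T = 8.314, 96485.0, 1073.0   # gas constant, Faraday constant, temperature (K)

    def cell_voltage(j, E0=1.0, r_ohm=1.5e-5, j0=2000.0, j_lim=9000.0):
        """Terminal voltage (V) at current density j (A/m^2); parameters are
        illustrative: E0 open-circuit potential, r_ohm area-specific resistance,
        j0 exchange current density, j_lim limiting current density."""
        eta_act = (2 * R * T / F) * np.arcsinh(j / (2 * j0))    # activation loss
        eta_ohm = r_ohm * j                                     # ohmic loss
        eta_conc = -(R * T / (2 * F)) * np.log(1 - j / j_lim)   # concentration loss
        return E0 - eta_act - eta_ohm - eta_conc

    for j in (500, 2000, 6000):
        v = cell_voltage(j)
        print("j=%5d A/m^2  V=%.3f V  P=%.0f W/m^2" % (j, v, j * v))
    ```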

  9. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  10. Development of an Output-based Adaptive Method for Multi-Dimensional Euler and Navier-Stokes Simulations

    NASA Technical Reports Server (NTRS)

    Darmofal, David L.

    2003-01-01

    The use of computational simulations in the prediction of complex aerodynamic flows is becoming increasingly prevalent in the design process within the aerospace industry. Continuing advancements in both computing technology and algorithmic development are ultimately leading to attempts at simulating ever-larger, more complex problems. However, by increasing the reliance on computational simulations in the design cycle, we must also increase the accuracy of these simulations in order to maintain or improve the reliability and safety of the resulting aircraft. At the same time, large-scale computational simulations must be made more affordable so that their potential benefits can be fully realized within the design cycle. Thus, a continuing need exists for increasing the accuracy and efficiency of computational algorithms such that computational fluid dynamics can become a viable tool in the design of more reliable, safer aircraft. The objective of this research was the development of an error estimation and grid adaptive strategy for reducing simulation errors in integral outputs (functionals) such as lift or drag from multi-dimensional Euler and Navier-Stokes simulations. In this final report, we summarize our work during this grant.

  11. Mocking the weak lensing universe: The LensTools Python computing package

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-10-01

    We present a newly developed software package which implements a wide range of routines frequently used in Weak Gravitational Lensing (WL). With the continuously increasing size of the WL scientific community, we feel that easy-to-use Application Program Interfaces (APIs) for common calculations are a necessity to ensure efficiency and coordination across different working groups. Coupled with existing open-source codes, such as CAMB (Lewis et al., 2000) and Gadget2 (Springel, 2005), LensTools brings together a cosmic shear simulation pipeline which, complemented with a variety of WL feature measurement tools and parameter sampling routines, provides easy access to the numerics for theoretical studies of WL as well as for experiment forecasts. Being implemented in PYTHON (Rossum, 1995), LensTools takes full advantage of a range of state-of-the-art techniques developed by the large and growing open-source software community (Jones et al., 2001; McKinney, 2010; Astropy Collaboration, 2013; Pedregosa et al., 2011; Foreman-Mackey et al., 2013). We made the LensTools code available on the Python Package Index and published its documentation on http://lenstools.readthedocs.io.

  12. Analyzing asteroid reflectance spectra with numerical tools based on scattering simulations

    NASA Astrophysics Data System (ADS)

    Penttilä, Antti; Väisänen, Timo; Markkanen, Johannes; Martikainen, Julia; Gritsevich, Maria; Muinonen, Karri

    2017-04-01

    We are developing a set of numerical tools that can be used in analyzing the reflectance spectra of granular materials such as the regolith surface of atmosphereless Solar system objects. Our goal is to be able to explain, with realistic numerical scattering models, the spectral features arising when materials are intimately mixed together. We include space-weathering-type effects in our simulations, i.e., mixing the host mineral locally with small inclusions of another material in small proportions. Our motivation for this study comes from the present lack of such tools. The current common practice is to apply a semi-physical approximate model such as some variation of the Hapke models [e.g., 1] or the Shkuratov model [2]. These models are expressed in closed form, so they are relatively fast to apply, and they are based on simplifications of radiative transfer theory. The problem is that the validity of the model is not always guaranteed, and the derived physical properties related to particle scattering can be unrealistic [3]. We base our numerical tool on a chain of scattering simulations. Scattering properties of small inclusions inside an absorbing host matrix can be derived using exact methods that solve the Maxwell equations of the system. The next step, scattering by a single regolith grain, is solved using a geometrical optics method accounting for surface reflections, internal absorption, and possibly internal diffuse scattering. The third step involves radiative transfer simulations of these regolith grains in a macroscopic planar element [4]. The chain can be continued with shadowing simulation over the target surface elements, and finally by integrating the bidirectional reflectance distribution function over the object's shape. Most of the tools in the proposed chain already exist, and one practical task for us is to tie them together into an easy-to-use toolchain that can be publicly distributed. We plan to open the abovementioned toolchain as a web-based open service. Acknowledgments: The research is funded by the ERC Advanced Grant No. 320773 (SAEMPL) References: [1] B. Hapke, Icarus 195, 918-926, 2008. [2] Yu. Shkuratov et al, Icarus 137, 235-246, 1999. [3] Yu. Shkuratov et al, JQSRT 113, 2431-2456, 2012. [4] K. Muinonen et al, JQSRT 110, 1628-1639, 2009.

  13. Complexity Analysis of Traffic in Corridors-in-the-Sky

    NASA Technical Reports Server (NTRS)

    Xue, Min; Zelinski, Shannon Jean

    2010-01-01

    The corridors-in-the-sky concept imitates the highway system in ground transportation. The benefit expected from a corridor relies on its capability of handling high-density traffic with negligible controller workload, the acceptance of extra fuel or distance, and the complexity reduction in underlying sectors. This work evaluates a selected corridor from these perspectives through simulations. To examine traffic inside the corridor, a corridor traffic simulation tool that can resolve conflicts was developed in the C language. Prescribed conflict resolution maneuvers mimic corridor users' behavior, and conflict resolution counts measure complexity. Different lane options and operational policies are proposed to examine their impacts on complexity. Fuel consumption is calculated and compared for corridor traffic. To investigate the complexity of non-corridor traffic in underlying sectors, the existing Airspace Concept Evaluation System tool is utilized along with the Automated Airspace Concept tool; the number of conflict resolutions is examined and treated as the complexity measurement. The results show that heavy traffic can be managed with low complexity for a historical traffic schedule simulated with appropriate operational policies and lane options. For instance, with 608 flights and a peak aircraft count of 100, only 84 actions need to be taken in a 24-hour period to resolve the conflicts for an 8-lane corridor. Compared with the fuel consumption of great circle trajectories, the simulation of corridor traffic shows that the total extra fuel for corridor flights is 26,373 gallons, or 2.76%, which is 0.38% less than flying filed flight plans. Without taking the climb and descent portions of corridor traffic into account, the complexity of underlying sectors is reduced by 17.71%; however, the climb and descent portions eliminate this reduction, and the overall complexity of the sectors actually increases by 9.14%.

  14. 3-d finite element model development for biomechanics: a software demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollerbach, K.; Hollister, A.M.; Ashby, E.

    1997-03-01

    Finite element analysis is becoming an increasingly important part of biomechanics and orthopedic research, as computational resources become more powerful, and data handling algorithms become more sophisticated. Until recently, tools with sufficient power did not exist or were not accessible to adequately model complicated, three-dimensional, nonlinear biomechanical systems. In the past, finite element analyses in biomechanics have often been limited to two-dimensional approaches, linear analyses, or simulations of single tissue types. Today, we have the resources to model fully three-dimensional, nonlinear, multi-tissue, and even multi-joint systems. The authors will present the process of developing these kinds of finite element models, using human hand and knee examples, and will demonstrate their software tools.

  15. G2H--graphics-to-haptic virtual environment development tool for PC's.

    PubMed

    Acosta, E; Temkin, B; Krummel, T M; Heinrichs, W L

    2000-01-01

    Existing surgical virtual environments for training and preparation have shown great improvement, but mostly in their visual aspects. Incorporating haptics into virtual reality-based surgical simulations would greatly enhance their realism. To aid in the development of haptic surgical virtual environments we have created a graphics-to-haptics (G2H) virtual environment development tool. G2H transforms graphical virtual environments (created or imported) into haptic virtual environments without programming. The G2H capability has been demonstrated using the complex 3D pelvic model of Lucy 2.0, the Stanford Visible Female. The pelvis was made haptic using G2H without any further programming effort.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, Eric J

    The ResStock analysis tool is helping states, municipalities, utilities, and manufacturers identify which home upgrades save the most energy and money. Across the country there's a vast diversity in the age, size, construction practices, installed equipment, appliances, and resident behavior of the housing stock, not to mention the range of climates. These variations have hindered the accuracy of predicting savings for existing homes. Researchers at the National Renewable Energy Laboratory (NREL) developed ResStock, a versatile tool that takes a new approach to large-scale residential energy analysis by combining large public and private data sources, statistical sampling, detailed subhourly building simulations, and high-performance computing. This combination achieves unprecedented granularity and, most importantly, accuracy in modeling the diversity of the single-family housing stock.

  17. Development of a Space Radiation Monte-Carlo Computer Simulation Based on the FLUKA and ROOT Codes

    NASA Technical Reports Server (NTRS)

    Pinsky, L. S.; Wilson, T. L.; Ferrari, A.; Sala, Paola; Carminati, F.; Brun, R.

    2001-01-01

    The radiation environment in space is a complex problem to model. Trying to extrapolate the projections of that environment into all areas of the internal spacecraft geometry is even more daunting. With the support of our CERN colleagues, our research group in Houston is embarking on a project to develop a radiation transport tool that is tailored to the problem of taking the external radiation flux incident on any particular spacecraft and simulating the evolution of that flux through a geometrically accurate model of the spacecraft material. The output will be a prediction of the detailed nature of the resulting internal radiation environment within the spacecraft as well as its secondary albedo. Beyond doing the physics transport of the incident flux, the software tool we are developing will provide a self-contained stand-alone object-oriented analysis and visualization infrastructure. It will also include a graphical user interface and a set of input tools to facilitate the simulation of space missions in terms of nominal radiation models and mission trajectory profiles. The goal of this project is to produce a code that is considerably more accurate and user-friendly than existing Monte-Carlo-based tools for the evaluation of the space radiation environment. Furthermore, the code will be an essential complement to the currently existing analytic codes in the BRYNTRN/HZETRN family for the evaluation of radiation shielding. The code will be directly applicable to the simulation of environments in low earth orbit, on the lunar surface, on planetary surfaces (including the Earth) and in the interplanetary medium such as on a transit to Mars (and even in the interstellar medium). The software will include modules whose underlying physics base can continue to be enhanced and updated for physics content, as future data become available beyond the timeframe of the initial development now foreseen. This future maintenance will be available from the authors of FLUKA as part of their continuing efforts to support the users of the FLUKA code within the particle physics community. In keeping with the spirit of developing an evolving physics code, we are planning, as part of this project, to participate in the efforts to validate the core FLUKA physics in ground-based accelerator test runs. The emphasis of these test runs will be the physics of greatest interest in the simulation of the space radiation environment. Such a tool will be of great value to planners, designers and operators of future space missions, as well as for the design of the vehicles and habitats to be used on such missions. It will also be of aid to future experiments of various kinds that may be affected at some level by the ambient radiation environment, or in the analysis of hybrid experiment designs that have been discussed for space-based astronomy and astrophysics. The tool will be of value to the Life Sciences personnel involved in the prediction and measurement of radiation doses experienced by the crewmembers on such missions. In addition, the tool will be of great use to the planners of experiments to measure and evaluate the space radiation environment itself. It can likewise be useful in the analysis of safe havens and hazard mitigation plans, in NASA's call for new research in composites, and to NASA engineers modeling the radiation exposure of electronic circuits.
This code will provide an important complementary check on the predictions of analytic codes such as BRYNTRN/HZETRN that are presently used for many similar applications, and which have shortcomings that are more easily overcome with Monte Carlo-type simulations. Finally, it is acknowledged that there are similar efforts based around the use of the GEANT4 Monte-Carlo transport code currently under development at CERN. It is our intention to make our software modular and sufficiently flexible to allow the parallel use of either FLUKA or GEANT4 as the physics transport engine.

  18. ISSARS Aerosol Database: An Incorporation of Atmospheric Particles into a Universal Tool to Simulate Remote Sensing Instruments

    NASA Technical Reports Server (NTRS)

    Goetz, Michael B.

    2011-01-01

    The Instrument Simulator Suite for Atmospheric Remote Sensing (ISSARS) entered its third and final year of development with the overall goal of providing a unified tool to simulate active and passive spaceborne atmospheric remote sensing instruments. These simulations focus on the atmosphere from the UV to microwaves. ISSARS handles all assumptions and uses various scattering and microphysics models to fill the gaps left unspecified by the atmospheric models in creating each instrument's measurements. This will benefit mission design, reduce mission cost, enable efficient implementation of multi-instrument/platform Observing System Simulation Experiments (OSSE), and improve existing models as well as new advanced models in development. In this effort, various aerosol particles are incorporated into the system, and a simulation over the input wavelengths and spectral refractive indices of each spherical test particle generates its scattering properties and phase functions. The atmospheric particles being integrated into the system comprise those observed by the Multi-angle Imaging SpectroRadiometer (MISR) and by the Multiangle SpectroPolarimetric Imager (MSPI). In addition, a complex scattering database generated by Prof. Ping Yang (Texas A&M) is incorporated into this aerosol database. Future development with a radiative transfer code will generate a series of results that can be validated against results obtained by the MISR and MSPI instruments; meanwhile, test cases are simulated to determine the validity of the various plugin libraries used to determine or gather the scattering properties of particles studied by MISR and MSPI, or within the database of single-scattering properties of tri-axial ellipsoidal mineral dust particles created by Prof. Ping Yang.

  19. 3D FEM Simulation of Flank Wear in Turning

    NASA Astrophysics Data System (ADS)

    Attanasio, Aldo; Ceretti, Elisabetta; Giardini, Claudio

    2011-05-01

    This work deals with tool wear simulation. Studying the influence of tool wear on tool life, tool substitution policy, final part quality, surface integrity, cutting forces, and power consumption is important for reducing global process costs. Adhesion, abrasion, erosion, diffusion, corrosion, and fracture are some of the phenomena responsible for tool wear, depending on the selected cutting parameters: cutting velocity, feed rate, depth of cut, …. In some cases these wear mechanisms are described by analytical models as a function of process variables (temperature, pressure, and sliding velocity along the cutting surface). Such analytical models are suitable for implementation in FEM codes and can be used to simulate tool wear. In the present paper a commercial 3D FEM software package has been customized to simulate tool wear during turning operations when cutting AISI 1045 carbon steel with an uncoated tungsten carbide tip. The FEM software was extended with a subroutine that modifies the tool geometry on the basis of the estimated tool wear as the simulation proceeds. Since, for the considered tool-workpiece material pairing, the main wear-generating phenomena are abrasion and diffusion, the tool wear model implemented in the subroutine combines Usui's model with Takeyama and Murata's model. A comparison between experimental and simulated flank tool wear curves is reported, demonstrating that it is possible to simulate tool wear development.
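
    The Usui-type wear-rate law named in the abstract has the form dW/dt = A * sigma_n * V_s * exp(-B / T), which a wear subroutine integrates in time between tool-geometry updates. The sketch below performs that integration with invented constants; in practice A and B must be calibrated for the tool-workpiece pair, and a real subroutine would update the contact pressure and temperature at each step from the FEM solution:

    ```python
    import numpy as np

    A, B = 1.0e-8, 2.5e3   # illustrative model constants, not calibrated values

    def wear_depth(sigma_n, v_s, temp_K, t_end=300.0, dt=1.0):
        """Accumulated flank wear for contact pressure sigma_n (MPa),
        sliding speed v_s (m/min) and contact temperature temp_K (K),
        integrated over t_end seconds with step dt."""
        w = 0.0
        for _ in range(int(t_end / dt)):
            # Usui-type rate law; here the loads are held constant, whereas a
            # FEM subroutine would refresh them from the field solution.
            w += A * sigma_n * v_s * np.exp(-B / temp_K) * dt
        return w

    print("flank wear after 5 min: %.4f mm"
          % wear_depth(sigma_n=800.0, v_s=150.0, temp_K=1100.0))
    ```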

  20. Evaluating variability with atomistic simulations: the effect of potential and calculation methodology on the modeling of lattice and elastic constants

    NASA Astrophysics Data System (ADS)

    Hale, Lucas M.; Trautt, Zachary T.; Becker, Chandler A.

    2018-07-01

    Atomistic simulations using classical interatomic potentials are powerful investigative tools linking atomic structures to dynamic properties and behaviors. It is well known that different interatomic potentials produce different results, thus making it necessary to characterize potentials based on how they predict basic properties. Doing so makes it possible to compare existing interatomic models in order to select those best suited for specific use cases, and to identify any limitations of the models that may lead to unrealistic responses. While the methods for obtaining many of these properties are often thought of as simple calculations, there are many underlying aspects that can lead to variability in the reported property values. For instance, multiple methods may exist for computing the same property and values may be sensitive to certain simulation parameters. Here, we introduce a new high-throughput computational framework that encodes various simulation methodologies as Python calculation scripts. Three distinct methods for evaluating the lattice and elastic constants of bulk crystal structures are implemented and used to evaluate the properties across 120 interatomic potentials, 18 crystal prototypes, and all possible combinations of unique lattice site and elemental model pairings. Analysis of the results reveals which potentials and crystal prototypes are sensitive to the calculation methods and parameters, and it assists with the verification of potentials, methods, and molecular dynamics software. The results, calculation scripts, and computational infrastructure are self-contained and openly available to support researchers in performing meaningful simulations.
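
    One common methodology the paper evaluates, extracting the lattice constant and an elastic property from the energy curve of a bulk crystal, can be sketched as follows. The toy Morse-like energy function stands in for a call into a real interatomic-potential evaluator (e.g. through LAMMPS or ASE), and all numeric values are illustrative.

    import numpy as np

    def energy_per_atom(a):
        # Toy Morse-like cohesive energy curve [eV]; replace with a real potential call.
        a_eq, depth, width = 3.52, 4.45, 1.4
        return depth * (np.exp(-2 * width * (a - a_eq)) - 2 * np.exp(-width * (a - a_eq)))

    a_scan = np.linspace(3.45, 3.60, 31)   # narrow scan around the minimum [Angstrom]
    e_scan = energy_per_atom(a_scan)

    # Quadratic fit: the vertex gives a0, the curvature gives d2E/da2 = 2*c2.
    c2, c1, _ = np.polyfit(a_scan, e_scan, 2)
    a0 = -c1 / (2 * c2)

    # For an fcc cell (one atom per V = a^3/4), B = V d2E/dV2 reduces at the
    # minimum to B = (4 / (9 a0)) * d2E/da2; 1 eV/A^3 = 160.2176 GPa.
    B_GPa = (4.0 / (9.0 * a0)) * (2 * c2) * 160.2176
    print(f"a0 = {a0:.3f} A, B = {B_GPa:.0f} GPa")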

  1. LandCaRe DSS--an interactive decision support system for climate change impact assessment and the analysis of potential agricultural land use adaptation strategies.

    PubMed

    Wenkel, Karl-Otto; Berg, Michael; Mirschel, Wilfried; Wieland, Ralf; Nendel, Claas; Köstner, Barbara

    2013-09-01

    Decision support to develop viable climate change adaptation strategies for agriculture and regional land use management encompasses a wide range of options and issues. Up to now, only a few suitable tools and methods have existed for farmers and regional stakeholders that support the process of decision-making in this field. The interactive model-based spatial information and decision support system LandCaRe DSS attempts to close the existing methodological gap. This system supports interactive spatial scenario simulations, multi-ensemble and multi-model simulations at the regional scale, as well as the complex impact assessment of potential land use adaptation strategies at the local scale. The system is connected to a local geo-database and via the internet to a climate data server. LandCaRe DSS uses a multitude of scale-specific ecological impact models, which are linked in various ways. At the local scale (farm scale), biophysical models are directly coupled with a farm economy calculator. New or alternative simulation models can easily be added, thanks to the innovative architecture and design of the DSS. Scenario simulations can be conducted with a reasonable amount of effort. The interactive LandCaRe DSS prototype also offers a variety of data analysis and visualisation tools, a help system for users and a farmer information system for climate adaptation in agriculture. This paper presents the theoretical background, the conceptual framework, and the structure and methodology behind LandCaRe DSS. Scenario studies at the regional and local scale for the two Eastern German regions of Uckermark (dry lowlands, 2600 km²) and Weißeritz (humid mountain area, 400 km²) were conducted in close cooperation with stakeholders to test the functionality of the DSS prototype. The system is gradually being transformed into a web version (http://www.landcare-dss.de) to ensure the broadest possible distribution of LandCaRe DSS to the public. The system will be continuously developed, updated and used in different research projects and as a learning and knowledge-sharing tool for students. The main objective of LandCaRe DSS is to provide information on the complex long-term impacts of climate change and on potential management options for adaptation by answering "what-if" type questions. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. The perceived value of using BIM for energy simulation

    NASA Astrophysics Data System (ADS)

    Lewis, Anderson M.

    Building Information Modeling (BIM) is becoming an increasingly important tool in the Architectural, Engineering & Construction (AEC) industries. The benefits associated with BIM include, but are not limited to, cost and time savings through greater trade and design coordination and more accurate estimating take-offs. BIM software provides a virtual, parametric 3D design environment that stores information about the model and can serve as a communication platform between project stakeholders. Likewise, energy simulation is an integral tool for predicting and optimizing a building's performance during design. Creating energy models and running energy simulations can be time consuming due to the large number of parameters and assumptions that must be addressed to achieve reasonably accurate results. However, leveraging information embedded within Building Information Models (BIMs) has the potential to increase accuracy, reduce the time required to run energy simulations, and facilitate continuous energy simulation throughout the design process, thus optimizing building performance. Although some literature exists on how design stakeholders perceive the benefits associated with leveraging BIM for energy simulation, little is known about how these perceptions differ between various green design stakeholder user groups. Through an e-survey instrument, this study seeks to determine how perceptions of using BIMs to inform energy simulation differ among distinct design stakeholder groups: BIM-only users, energy simulation-only users, and users of both BIM and energy simulation. Additionally, this study seeks to determine what design stakeholders perceive as the main barriers to and benefits of implementing BIM-based energy simulation. Results from this study suggest that little to no correlation exists between green design stakeholders' perceptions of the value associated with using information from BIMs to inform energy simulation and their engagement level with BIM and/or energy simulation. However, these perceptions may differ between user groups (i.e. BIM-only users, energy simulation-only users, and BIM and energy simulation users). For example, the BIM-only user group appeared to show a strong positive correlation between perceptions of the value of using information from BIMs to inform energy simulation and engagement with BIM. Additionally, this study suggests that the top perceived benefits of using BIMs to inform energy simulations among green design stakeholders are facilitation of communication, reduction of process-related costs, and the ability to examine more design options. The main perceived barrier was a lack of BIM standards for model integration across multidisciplinary teams. Results from this study will help readers understand how to better implement BIM-based energy simulation while mitigating barriers and maximizing benefits. Additionally, examining discrepancies between user groups can lead to the identification and improvement of shortfalls in current BIM-based energy simulation processes. Understanding how perceptions and engagement levels differ among software user groups will help in developing strategies for implementing BIM-based energy simulation that are tailored to each specific user group.

  3. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov Websites

    NREL's vehicle technology simulation and analysis tools support the evaluation of vehicle technologies with the potential to achieve significant fuel savings and emission reductions. These include the Automotive Deployment Options Projection Tool (ADOPT), a modeling tool that estimates the deployment of vehicle technologies.

  4. Java-based Graphical User Interface for MAVERIC-II

    NASA Technical Reports Server (NTRS)

    Seo, Suk Jai

    2005-01-01

    A computer program entitled "Marshall Aerospace Vehicle Representation in C II (MAVERIC-II)" is a vehicle flight simulation program written primarily in the C programming language by James W. McCarter at NASA/Marshall Space Flight Center. The goal of the MAVERIC-II development effort is to provide a simulation tool that facilitates the rapid development of high-fidelity flight simulations for launch, orbital, and reentry vehicles of any user-defined configuration for all phases of flight. MAVERIC-II has proven invaluable in performing flight simulations for various Space Transportation Systems. The flexibility provided by MAVERIC-II has allowed several different launch vehicles, including the Saturn V, a Space Launch Initiative Two-Stage-to-Orbit concept and a Shuttle-derived launch vehicle, to be simulated during ascent and portions of on-orbit flight in an extremely efficient manner. It was found that MAVERIC-II provided the high-fidelity vehicle and flight environment models as well as the program modularity to allow efficient integration, modification and testing of advanced guidance and control algorithms. In addition to serving as an analysis tool for technology development, many researchers have found MAVERIC-II to be an efficient, powerful analysis tool for evaluating guidance, navigation, and control designs, vehicle robustness, and requirements. MAVERIC-II is currently designed to execute in a UNIX environment. The input to the program is composed of three segments: 1) the vehicle models, such as propulsion, aerodynamics, and guidance, navigation, and control; 2) the environment models, such as atmosphere and gravity; and 3) a simulation framework, which is responsible for executing the vehicle and environment models, propagating the vehicle's states forward in time, and handling user input/output. MAVERIC users prepare data files for the above models and run the simulation program. They can view the output on screen and/or store it in files for later examination, for example with a plotting program such as gnuplot. A typical usage scenario consists of three steps: editing existing input data files, running MAVERIC, and plotting the output results.
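
    The three-step scenario can be pictured with a small driver script. This is a hypothetical sketch: the executable name, input file names, and output column layout are assumptions for illustration, not documented MAVERIC-II conventions.

    import subprocess
    import numpy as np
    import matplotlib.pyplot as plt

    # 1. Edit an existing input data file (by hand or via a templating step).
    #    Here we assume 'vehicle_model.dat' has already been prepared.

    # 2. Run the simulation executable in a UNIX environment.
    subprocess.run(["./maveric", "vehicle_model.dat"], check=True)

    # 3. Load a stored time history and plot it (assumed two-column output).
    t, alt = np.loadtxt("trajectory.out", usecols=(0, 1), unpack=True)
    plt.plot(t, alt)
    plt.xlabel("time [s]")
    plt.ylabel("altitude [m]")
    plt.show()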

  5. The Met Office HadGEM3-ES chemistry-climate model: evaluation of stratospheric dynamics and its impact on ozone

    NASA Astrophysics Data System (ADS)

    Hardiman, Steven C.; Butchart, Neal; O'Connor, Fiona M.; Rumbold, Steven T.

    2017-03-01

    Free-running and nudged versions of a Met Office chemistry-climate model are evaluated and used to investigate the impact of dynamics versus transport and chemistry within the model on the simulated evolution of stratospheric ozone. Metrics of the dynamical processes relevant for simulating stratospheric ozone are calculated, and the free-running model is found to outperform the previous model version in 10 of the 14 metrics. In particular, large biases in stratospheric transport and tropical tropopause temperature, which existed in the previous model version, are substantially reduced, making the current model more suitable for the simulation of stratospheric ozone. The spatial structure of the ozone hole, the area of polar stratospheric clouds, and the increased ozone concentrations in the Northern Hemisphere winter stratosphere following sudden stratospheric warmings, were all found to be sensitive to the accuracy of the dynamics and were better simulated in the nudged model than in the free-running model. Whilst nudging can, in general, provide a useful tool for removing the influence of dynamical biases from the evolution of chemical fields, this study shows that issues can remain in the climatology of nudged models. Significant biases in stratospheric vertical velocities, age of air, water vapour, and total column ozone still exist in the Met Office nudged model. Further, these can lead to biases in the downward flux of ozone into the troposphere.
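
    The nudging referred to here is Newtonian relaxation: a term (x_ref - x)/tau added to the model tendency pulls the simulated state toward a reference analysis. A minimal sketch, with a stand-in tendency function and illustrative values:

    import numpy as np

    def model_tendency(x):
        # Stand-in for the free-running model dynamics.
        return -0.1 * x

    def step_nudged(x, x_ref, dt=900.0, tau=6 * 3600.0):
        """One forward-Euler step with a nudging term (x_ref - x) / tau."""
        return x + dt * (model_tendency(x) + (x_ref - x) / tau)

    x = np.array([250.0])        # e.g. a temperature value [K]
    x_ref = np.array([255.0])    # reference analysis value [K]
    for _ in range(96):          # one day of 15-minute steps
        x = step_nudged(x, x_ref)
    print(x)                     # state relaxed part-way toward the reference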

  6. An implicit higher-order spatially accurate scheme for solving time dependent flows on unstructured meshes

    NASA Astrophysics Data System (ADS)

    Tomaro, Robert F.

    1998-07-01

    The present research is aimed at developing a higher-order, spatially accurate scheme for both steady and unsteady flow simulations using unstructured meshes. The resulting scheme must work on a variety of general problems to ensure the creation of a flexible, reliable and accurate aerodynamic analysis tool. To calculate the flow around complex configurations, unstructured grids and the associated flow solvers have been developed. Efficient simulations require minimal computer memory and computation time. Unstructured flow solvers typically require more computer memory than structured flow solvers due to the indirect addressing of the cells. The approach taken in the present research was to modify an existing three-dimensional unstructured flow solver, first to decrease the computational time required for a solution and then to increase the spatial accuracy. The terms required to simulate flow involving non-stationary grids were also implemented. First, an implicit solution algorithm was implemented to replace the existing explicit procedure. Several test cases, including internal and external, inviscid and viscous, two-dimensional, three-dimensional and axisymmetric problems, were simulated for comparison between the explicit and implicit solution procedures. The increased efficiency and robustness of the modified code due to the implicit algorithm were demonstrated. Two unsteady test cases, a plunging airfoil and a wing undergoing bending and torsion, were simulated using the implicit algorithm modified to include the terms required for a moving and/or deforming grid. Second, a higher than second-order spatially accurate scheme was developed and implemented into the baseline code. Third- and fourth-order spatially accurate schemes were implemented and tested. The original dissipation was modified to include higher-order terms and was adapted near shock waves to limit pre- and post-shock oscillations. The unsteady cases were repeated using the higher-order spatially accurate code, and the new solutions were compared with those obtained using the second-order scheme. Finally, the increased efficiency of using an implicit solution algorithm in a production Computational Fluid Dynamics flow solver was demonstrated for steady and unsteady flows. A third- and fourth-order spatially accurate scheme has been implemented, creating a basis for a state-of-the-art aerodynamic analysis tool.
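
    The explicit-versus-implicit trade-off at the heart of this work can be illustrated on a scalar stiff model problem: the explicit step is cheap but stability-limited in time step, while the implicit step solves a nonlinear system (here by Newton iteration) and remains stable at larger steps. A minimal sketch, not the flow solver's actual algorithm:

    import numpy as np

    def f(u):
        return -50.0 * (u - np.cos(u))   # a mildly stiff scalar model problem

    def dfdu(u):
        return -50.0 * (1.0 + np.sin(u))

    def explicit_step(u, dt):
        return u + dt * f(u)

    def implicit_step(u, dt, iters=8):
        v = u
        for _ in range(iters):           # Newton iteration on g(v) = v - u - dt*f(v)
            g = v - u - dt * f(v)
            v -= g / (1.0 - dt * dfdu(v))
        return v

    u_exp, u_imp = 1.0, 1.0
    for _ in range(20):
        u_exp = explicit_step(u_exp, dt=0.05)   # diverges: dt exceeds stability limit
        u_imp = implicit_step(u_imp, dt=0.05)   # remains stable at the same dt
    print(u_exp, u_imp)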

  7. SLIVISU, an Interactive Visualisation Framework for Analysis of Geological Sea-Level Indicators

    NASA Astrophysics Data System (ADS)

    Klemann, V.; Schulte, S.; Unger, A.; Dransch, D.

    2011-12-01

    Supporting data analysis in the earth system sciences with advanced visualisation tools is increasingly important given the rising complexity, volume and variety of available data. With respect to sea-level indicators (SLIs), their analysis in earth-system applications, such as modelling and simulation on regional or global scales, demands the consideration of large amounts of data - on the order of thousands of SLIs - and thus must go beyond the analysis of single sea-level curves. On the other hand, a gross analysis by means of statistical methods is hindered by the often heterogeneous and individual character of the single SLIs: the spatio-temporal context and the heterogeneous accompanying information are difficult to handle or to represent in an objective way. Therefore a concept integrating automated analysis and visualisation is mandatory. This is provided by visual analytics. As an implementation of this concept, we present the visualisation framework SLIVISU, developed at GFZ, which is based on multiple linked views and provides a synoptic analysis of observational data, model configurations, model outputs and results of automated analysis in glacial isostatic adjustment. Having started as a visualisation tool for an existing database of SLIs, it now serves as an analysis tool for the evaluation of model simulations in studies of glacial-isostatic adjustment.

  8. The Use of a Block Diagram Simulation Language for Rapid Model Prototyping

    NASA Technical Reports Server (NTRS)

    Whitlow, Johnathan E.; Engrand, Peter

    1996-01-01

    The research performed this summer was a continuation of work performed during the 1995 NASA/ASEE Summer Fellowship. The focus of the work was to expand previously generated predictive models for liquid oxygen (LOX) loading into the external fuel tank of the shuttle. The models, which were developed using a block diagram simulation language known as VisSim, were evaluated on numerous shuttle flights and found to perform well in most cases. Once the models were refined and validated, the predictive methods were integrated into the existing Rockwell software propulsion advisory tool (PAT). Although time was not sufficient to completely integrate the models into PAT, the ability to predict flows and pressures in the orbiter section and graphically display the results was accomplished.

  9. Achromatic half-wave plate for submillimeter instruments in cosmic microwave background astronomy: modeling and simulation.

    PubMed

    Savini, Giorgio; Pisano, Giampaolo; Ade, Peter A R

    2006-12-10

    We adopted an existing formalism and modified it to simulate, with high precision, the transmission, reflection, and absorption of multiple-plate birefringent devices as a function of frequency. To validate the model, we use it to compare the measured properties of an achromatic five-plate device with a broadband antireflection coating to expectations derived from the material optical constants and its geometric configuration. The half-wave plate presented here is observed to perform well, with a phase shift variation of less than 2° from the ideal 180° over a bandwidth of Δν/ν ≈ 1 at millimeter wavelengths. This formalism represents a powerful design tool for birefringent polarization modulators and enables their optical properties to be specified with high accuracy.
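
    The Jones-matrix machinery underlying such multi-plate designs can be sketched as follows. The full formalism cited above also tracks reflection and absorption; this toy version keeps only retardance and uses the classic Pancharatnam three-plate angle set (0°, 58°, 0°) rather than the published five-plate design, so the numbers are illustrative only.

    import numpy as np

    def waveplate(delta, theta):
        """Jones matrix of a linear retarder: retardance delta, fast axis at theta."""
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, s], [-s, c]])
        return R.T @ np.diag([1.0, np.exp(1j * delta)]) @ R

    def stack(nu_ratio, angles):
        """Plate stack; each retardance (pi at band center) scales with frequency."""
        J = np.eye(2, dtype=complex)
        for theta in angles:
            J = waveplate(np.pi * nu_ratio, theta) @ J
        return J

    # Equivalent retardance of the stack = phase difference between the
    # eigenvalues of the (unitary) composite Jones matrix; for an achromatic
    # half-wave plate it stays near 180 degrees across the band.
    angles = np.radians([0.0, 58.0, 0.0])
    for nu_ratio in [0.6, 0.8, 1.0, 1.2, 1.4]:
        lam = np.linalg.eigvals(stack(nu_ratio, angles))
        ret = np.degrees(abs(np.angle(lam[0] / lam[1])))
        print(f"nu/nu0 = {nu_ratio:.1f}  retardance = {ret:6.1f} deg")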

  10. Functional specifications for AI software tools for electric power applications. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faught, W.S.

    1985-08-01

    The principal barrier to the introduction of artificial intelligence (AI) technology to the electric power industry has not been a lack of interest or appropriate problems, for the industry abounds in both. Like most others, however, the electric power industry lacks the personnel - knowledge engineers - with the special combination of training and skills that AI programming demands. Conversely, very few AI specialists are conversant with electric power industry problems and applications. The recent availability of sophisticated AI programming environments is doing much to alleviate this shortage. These products provide a set of powerful and usable software tools that enable even non-AI scientists to rapidly develop AI applications. The purpose of this project was to develop functional specifications for programming tools that, when integrated with existing general-purpose knowledge engineering tools, would expedite the production of AI applications for the electric power industry. Twelve potential applications, representative of major problem domains within the nuclear power industry, were analyzed in order to identify those tools that would be of greatest value in application development. Eight tools were specified, including facilities for power plant modeling, database inquiry, simulation and machine-machine interface.

  11. Tools for Evaluating Fault Detection and Diagnostic Methods for HVAC Secondary Systems

    NASA Astrophysics Data System (ADS)

    Pourarian, Shokouh

    Although modern buildings are using increasingly sophisticated energy management and control systems that have tremendous control and monitoring capabilities, building systems routinely fail to perform as designed. More advanced building control, operation, and automated fault detection and diagnosis (AFDD) technologies are needed to achieve the goal of net-zero energy commercial buildings. Much effort has been devoted to developing such technologies for primary heating, ventilating and air conditioning (HVAC) systems and for some secondary systems. However, secondary systems such as fan coil units and dual duct systems, although widely used in commercial, industrial, and multifamily residential buildings, have received very little attention. This research study aims at developing tools that provide the simulation capabilities needed to develop and evaluate advanced control, operation, and AFDD technologies for these less studied secondary systems. In this study, HVACSIM+ is selected as the simulation environment. Besides developing dynamic models for the above-mentioned secondary systems, two other issues related to the HVACSIM+ environment are also investigated. One issue is the nonlinear equation solver used in HVACSIM+ (Powell's Hybrid method in subroutine SNSQ). It has been found in several previous research projects (ASHRAE RP-825 and RP-1312) that SNSQ is especially unstable at the beginning of a simulation and is sometimes unable to converge to a solution. Another issue is related to the zone model in the HVACSIM+ library of components. Dynamic simulation of secondary HVAC systems unavoidably requires an interacting zone model that dynamically exchanges heat with the building's surroundings. The accuracy and reliability of the building zone model therefore affect the operational data generated by the developed dynamic tool for predicting the behavior of HVAC secondary systems. The available model does not simulate the impact of direct solar radiation entering a zone through glazing, and the zone model study is conducted in this direction to modify the existing model. In this research project, the following tasks are completed and summarized in this report: 1. Develop dynamic simulation models in the HVACSIM+ environment for common fan coil unit and dual duct system configurations. The developed simulation models are able to produce both fault-free and faulty operational data under a wide variety of faults and severity levels for advanced control, operation, and AFDD technology development and evaluation purposes; 2. Develop a model structure, which includes the grouping of blocks and superblocks, treatment of state variables, initial and boundary conditions, and selection of the equation solver, that can simulate a dual duct system efficiently with satisfactory stability; 3. Design and conduct a comprehensive and systematic validation procedure using collected experimental data to validate the developed simulation models under both fault-free and faulty operational conditions; 4. Conduct a numerical study to compare two solution techniques, Powell's Hybrid (PH) and Levenberg-Marquardt (LM), in terms of their robustness and accuracy; 5. Modify the thermal state calculation of the existing building zone model in the HVACSIM+ library of components, so that heat transmitted through glazing is treated as a heat source for transient building zone load prediction. In this report, the literature, including existing HVAC dynamic modeling environments and models, HVAC model validation methodologies, and fault modeling and validation methodologies, is reviewed. The overall methodologies used for fault-free and fault model development and validation are introduced. Detailed model development and validation results for the two secondary systems, i.e., the fan coil unit and the dual duct system, are summarized. Experimental data, mostly from the Iowa Energy Center Energy Resource Station, are used to validate the models developed in this project. Satisfactory model performance in both fault-free and fault simulation studies is observed for all studied systems.
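
    The solver comparison in task 4 can be reproduced in miniature with SciPy, whose root finder exposes both algorithms: method='hybr' is MINPACK's Powell Hybrid routine (the algorithm behind SNSQ) and method='lm' is Levenberg-Marquardt. The test system below is an illustrative stand-in, not an HVAC component network.

    import numpy as np
    from scipy.optimize import root

    def residuals(x):
        # Two coupled nonlinear equations, e.g. a toy pressure/flow balance.
        return [x[0] ** 2 + x[1] ** 2 - 4.0,
                np.exp(x[0]) + x[1] - 1.0]

    x0 = [1.0, 1.0]
    sol_ph = root(residuals, x0, method="hybr")   # Powell's Hybrid
    sol_lm = root(residuals, x0, method="lm")     # Levenberg-Marquardt
    print("PH:", sol_ph.x, sol_ph.success)
    print("LM:", sol_lm.x, sol_lm.success)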

  12. Virtual Plant Tissue: Building Blocks for Next-Generation Plant Growth Simulation

    PubMed Central

    De Vos, Dirk; Dzhurakhalov, Abdiravuf; Stijven, Sean; Klosiewicz, Przemyslaw; Beemster, Gerrit T. S.; Broeckhove, Jan

    2017-01-01

    Motivation: Computational modeling of plant developmental processes is becoming increasingly important. Cellular-resolution plant tissue simulators have been developed, yet they typically describe physiological processes in an isolated way, strongly delimited in space and time. Results: With plant systems biology moving toward an integrative perspective on development, we have built the Virtual Plant Tissue (VPTissue) package to couple functional modules or models within the same framework and across different frameworks. Multiple levels of model integration and coordination enable combining existing and new models from different sources, with diverse options in terms of input/output. Besides the core simulator, the toolset also comprises a tissue editor for manipulating tissue geometry and cell, wall, and node attributes in an interactive manner. A parameter exploration tool is available to study the parameter dependence of simulation results by distributing calculations over multiple systems. Availability: Virtual Plant Tissue is available as open source (EUPL license) on Bitbucket (https://bitbucket.org/vptissue/vptissue). The project has a website at https://vptissue.bitbucket.io. PMID:28523006

  13. SES cupola interactive display design environment

    NASA Technical Reports Server (NTRS)

    Vu, Bang Q.; Kirkhoff, Kevin R.

    1989-01-01

    The Systems Engineering Simulator, located at the Lyndon B. Johnson Space Center in Houston, Texas, is tasked with providing a real-time simulator for developing displays and controls targeted for the Space Station Freedom. These displays and controls will exist inside an enclosed workstation located on the space station. The simulation is currently providing the engineering analysis environment for NASA and contractor personnel to design, prototype, and test alternatives for graphical presentation of data to an astronaut while he performs specified tasks. A highly desirable aspect of this environment is to have the capability to rapidly develop and bring on-line a number of different displays for use in determining the best utilization of graphics techniques in achieving maximum efficiency of the test subject fulfilling his task. The Systems Engineering Simulator now has available a tool which assists in the rapid development of displays for these graphic workstations. The Display Builder was developed in-house to provide an environment which allows easy construction and modification of displays within minutes of receiving requirements for specific tests.

  14. Design and simulation of an articulated surgical arm for guiding stereotactic neurosurgery

    NASA Astrophysics Data System (ADS)

    Kadi, A. Majeed; Zamorano, Lucia J.; Frazer, Matthew P.; Lu, Yi

    1992-03-01

    In stereotactic surgery, the need exists for a means of relating, intraoperatively, the position and orientation of the surgical instrument used by the neurosurgeon to a known frame of reference. An articulated arm is proposed that would provide the neurosurgeon with on-line information on the position and orientation of the surgical tools being moved. The articulated arm has six degrees of freedom, with five revolute joints and one prismatic joint. The design features include an unobstructed field of view, low weight, good balance against gravity, an accuracy of 1 mm spherical error probability (SEP), and a solvable kinematic structure, making it capable of fitting into the operating room environment. The arm can be mounted on either the surgical table or the stereotactic frame. A graphical simulation of the arm was created using the IGRIP simulation package from Deneb Robotics. The simulation demonstrates the use of the arm, mounted at several positions on the ring, reaching various target points within the cranium.
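
    The arm's solvable kinematic structure amounts to composing one homogeneous transform per joint. A minimal forward-kinematics sketch for a five-revolute-plus-one-prismatic chain, with placeholder link geometry rather than the published design:

    import numpy as np

    def rot_z(q):
        # Homogeneous rotation about the local z axis by angle q [rad].
        c, s = np.cos(q), np.sin(q)
        return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

    def trans(x, y, z):
        T = np.eye(4)
        T[:3, 3] = [x, y, z]
        return T

    def forward_kinematics(q):
        """q[0:5]: revolute joint angles [rad]; q[5]: prismatic extension [m]."""
        T = np.eye(4)
        link_lengths = [0.10, 0.25, 0.25, 0.05, 0.05]   # placeholder values [m]
        for qi, L in zip(q[:5], link_lengths):
            T = T @ rot_z(qi) @ trans(L, 0.0, 0.0)
        return T @ trans(0.0, 0.0, q[5])                # prismatic joint along local z

    tip = forward_kinematics([0.1, -0.4, 0.7, 0.2, -0.1, 0.03])
    print(tip[:3, 3])   # tool-tip position in the base frame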

  15. Regional Climate Simulation and Data Assimilation with Variable-Resolution GCMs

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, Michael S.

    2002-01-01

    Variable-resolution GCMs using a global stretched grid (SG) with enhanced regional resolution over one or multiple areas of interest represent a viable new approach to regional climate/climate-change and data assimilation studies and applications. The multiple areas of interest, at least one within each global quadrant, include the major global mountains and major global monsoonal circulations over North America, South America, India-China, and Australia. They can also include the polar domains and the European and African regions. The SG approach provides an efficient regional downscaling to mesoscales, and it is an ideal tool for representing consistent interactions of global/large and regional/meso scales while preserving the high quality of the global circulation. Basically, SG-GCM simulations are no different from traditional uniform-grid GCM simulations apart from the use of a variable-resolution grid. Several existing SG-GCMs developed by major centers and groups are briefly described. The major discussion is based on the GEOS (Goddard Earth Observing System) SG-GCM regional climate simulations.
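
    The idea of a stretched grid can be illustrated in one dimension: grid spacing is refined near a region of interest and relaxes smoothly elsewhere. This generic Gaussian-weighted sketch is illustrative and is not the specific GEOS SG-GCM transformation.

    import numpy as np

    def stretched_latitudes(n=90, lat0=40.0, stretch=3.0, width=15.0):
        # Weight is large (fine spacing) near lat0 and ~1 far away.
        s = np.linspace(-90.0, 90.0, n)
        weight = 1.0 + (stretch - 1.0) * np.exp(-((s - lat0) / width) ** 2)
        spacing = 1.0 / weight
        lats = np.concatenate([[0.0], np.cumsum(spacing)])
        return -90.0 + 180.0 * lats / lats[-1]   # rescale to [-90, 90] degrees

    lats = stretched_latitudes()
    # Finest spacing near lat0 is ~3x finer than the coarsest spacing.
    print(np.diff(lats).min(), np.diff(lats).max())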

  16. Design of Scalable and Effective Earth Science Collaboration Tool

    NASA Astrophysics Data System (ADS)

    Maskey, M.; Ramachandran, R.; Kuo, K. S.; Lynnes, C.; Niamsuwan, N.; Chidambaram, C.

    2014-12-01

    Collaborative research is growing rapidly. Many tools, including IDEs, are now beginning to incorporate collaborative features. Software engineering research has shown the effectiveness of collaborative programming and analysis, highlighting in particular the drastic reduction in software development time and the resulting reduction in cost. Recently, we have witnessed the rise of applications that allow users to share their content. Most of these applications scale such collaboration using cloud technologies. Earth science research needs to adopt collaboration technologies to reduce redundancy, cut costs, expand the knowledge base, and scale research experiments. To address these needs, we developed the Earth science collaboration workbench (CWB). CWB provides researchers with various collaboration features by augmenting their existing analysis tools, minimizing the learning curve. During the development of the CWB, we came to understand that Earth science collaboration tasks are varied, and we concluded that it is not possible to design a tool that serves all collaboration purposes. We adopted a mix of synchronous and asynchronous sharing methods that can be used to perform collaboration across time and location dimensions. We have used cloud technology to scale the collaboration. The cloud has been a highly utilized and valuable tool for Earth science researchers: among other uses, it serves for sharing research results, Earth science data, and virtual machine images, allowing CWB to create and maintain research environments and networks that enhance collaboration between researchers. Furthermore, the collaborative versioning tool Git is integrated into CWB for versioning of science artifacts. In this paper, we present our experience in designing and implementing the CWB. We also discuss the integration of collaborative code development use cases for data search and discovery using a NASA DAAC and for simulation of satellite observations using the NASA Earth Observing System Simulation Suite (NEOS3).

  17. A review of virtual reality based training simulators for orthopaedic surgery.

    PubMed

    Vaughan, Neil; Dubey, Venketesh N; Wainwright, Thomas W; Middleton, Robert G

    2016-02-01

    This review presents current virtual reality based training simulators for hip, knee and other orthopaedic surgery, including elective and trauma surgical procedures. There have not been any previous reviews focussing on hip and knee orthopaedic simulators. A comparison of existing simulator features is provided to identify what is missing and what is required to improve upon current simulators. In total, 11 hip replacement pre-operative planning tools were analysed, plus 9 hip trauma fracture training simulators. Additionally, 9 knee arthroscopy simulators and 8 other orthopaedic simulators were included for comparison. The findings are that, for orthopaedic surgery simulators in general, there is increasing use of patient-specific virtual models, which reduce the learning curve. Modelling is also being used for patient-specific implant design and manufacture. Simulators are being increasingly validated for assessment as well as training. There are very few training simulators available for hip replacement, yet more advanced virtual reality is being used for other procedures such as hip trauma and drilling. Training simulators for hip replacement and orthopaedic surgery in general lag behind those for other surgical procedures, for which virtual reality has become more common. Further developments are required to bring hip replacement training simulation up to date with other procedures. This suggests there is a gap in the market for a new high-fidelity hip replacement and resurfacing training simulator. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.

  18. Proceedings of the 1987 conference on tools for the simulation profession

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hawkins, R.; Klukis, K.

    1987-01-01

    This book covers the proceedings of the 1987 conference on tools for the simulation profession. Some of the topics are: SIMULACT: a generic tool for simulating distributed systems; ESL language simulation of spacecraft batteries; and Trends in global cadmium levels from increased use of fossil fuels.

  19. Using micro-simulation to investigate the safety impacts of transit design alternatives at signalized intersections.

    PubMed

    Li, Lu; Persaud, Bhagwant; Shalaby, Amer

    2017-03-01

    This study investigates the use of crash prediction models and micro-simulation to develop an effective surrogate safety assessment measure at the intersection level. With the use of these tools, hypothetical scenarios can be developed and explored to evaluate the safety impacts of design alternatives in a controlled environment, in which factors not directly associated with the design alternatives can be fixed. Micro-simulation models are developed, calibrated, and validated. Traffic conflicts in the micro-simulation models are estimated and linked with observed crash frequency, which greatly reduces the time otherwise needed to collect sufficient crash data for evaluating alternatives, given the rare and infrequent nature of crash events. A set of generalized linear models with negative binomial error structure is developed to correlate the simulated conflicts with the observed crash frequency in Toronto, Ontario, Canada. Crash prediction models are also developed for crashes of different impact types and for transit-involved crashes. The resulting statistical significance and the goodness-of-fit of the models suggest adequate predictive ability. Based on the established correlation between simulated conflicts and observed crashes, scenarios are developed in the micro-simulation models to investigate the safety effects of individual transit line elements by making hypothetical modifications to such elements and estimating changes in crash frequency from the resulting changes in conflicts. The findings imply that the existing transit signal priority schemes can have a negative effect on safety performance, and that the existing near-side stop positioning and streetcar transit type can be safer at their current state than if they were to be replaced by their respective counterparts. Copyright © 2017 Elsevier Ltd. All rights reserved.
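
    The statistical linkage described here, a negative binomial GLM relating observed crash frequency to simulated conflicts, can be sketched with statsmodels. The data below are synthetic placeholders, not the Toronto dataset.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    conflicts = rng.uniform(10, 200, size=60)       # simulated conflict counts
    mu = np.exp(-1.0 + 0.9 * np.log(conflicts))     # assumed true relationship
    crashes = rng.poisson(mu)                       # stand-in observed crashes

    # Model: ln(E[crashes]) = a + b * ln(conflicts), NB error structure.
    X = sm.add_constant(np.log(conflicts))
    model = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5))
    fit = model.fit()
    print(fit.params)      # intercept and conflict elasticity
    print(fit.deviance)    # goodness-of-fit indicator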

  20. IgSimulator: a versatile immunosequencing simulator.

    PubMed

    Safonova, Yana; Lapidus, Alla; Lill, Jennie

    2015-10-01

    The recent introduction of next-generation sequencing technologies to antibody studies has resulted in a growing number of immunoinformatics tools for antibody repertoire analysis. However, benchmarking these newly emerging tools remains problematic, since the gold-standard datasets needed to validate them are typically not available. Since simulating antibody repertoires is often the only feasible way to benchmark new immunoinformatics tools, we developed the IgSimulator tool, which addresses various complications in generating realistic antibody repertoires. IgSimulator's code has a modular structure and can be easily adapted to new simulation requirements. IgSimulator is open source and freely available as a C++ and Python program running on all Unix-compatible platforms. The source code is available from yana-safonova.github.io/ig_simulator. safonova.yana@gmail.com Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  1. Robustness of atomistic Gō models in predicting native-like folding intermediates

    NASA Astrophysics Data System (ADS)

    Estácio, S. G.; Fernandes, C. S.; Krobath, H.; Faísca, P. F. N.; Shakhnovich, E. I.

    2012-08-01

    Gō models are exceedingly popular tools in computer simulations of protein folding. These models are native-centric, i.e., they are constructed directly from the protein's native structure. Therefore, it is important to understand to what extent the atomistic details of the native structure dictate the folding behavior exhibited by Gō models. Here we address this challenge by performing exhaustive discrete molecular dynamics simulations of a Gō potential combined with a full atomistic protein representation. In particular, we investigate the robustness of this particular type of Gō model in predicting the existence of intermediate states in protein folding. We focus on the N47G mutational form of the Spc-SH3 folding domain (x-ray structure) and compare its folding pathway with that of alternative native structures produced in silico. Our methodological strategy comprises equilibrium folding simulations, structural clustering, and principal component analysis.
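
    The native-centric construction can be made concrete with the common 12-10 contact potential, which is attractive only for residue pairs that are in contact in the native structure. A toy sketch of that ingredient, not the atomistic Spc-SH3 model used in the paper:

    import numpy as np

    def go_pair_energy(r, r_native, eps=1.0):
        """12-10 Go contact: well of depth eps centered at the native distance."""
        x = r_native / r
        return eps * (5.0 * x**12 - 6.0 * x**10)

    # Native contact list: (current distance, native distance), toy values [Angstrom].
    contacts = np.array([[5.1, 5.0], [6.3, 6.0], [4.8, 5.2]])
    E = go_pair_energy(contacts[:, 0], contacts[:, 1]).sum()
    print(E)   # approaches -3*eps when all contacts sit at their native distances

    # Sanity check: the well minimum lies at the native distance with depth -eps.
    r = np.linspace(4.0, 9.0, 200)
    e = go_pair_energy(r, 5.0)
    print(r[np.argmin(e)], e.min())   # ~5.0, ~-1.0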

  2. Simulation of a Cold Gas Thruster System and Test Data Correlation

    NASA Technical Reports Server (NTRS)

    Hauser, Daniel M.; Quinn, Frank D.

    2012-01-01

    During developmental testing of the Ascent Abort 1 (AA-1) cold gas thruster system, unexpected behavior was detected. Upon further review, the design as it existed might not have met the requirements. To determine the best approach for modifying the design, the system was modeled with a dynamic fluid analysis tool (EASY5). The system model consisted of the nitrogen storage tank, pressure regulator, thruster valve, nozzle, and the associated interconnecting line lengths. The regulator and thruster valves were modeled using a combination of the fluid and mechanical modules available in EASY5. The simulation results were then compared against actual system test data. The simulation results exhibited behaviors similar to the test results, such as the pressure regulator's response to thruster firings. Potential design solutions were investigated using the analytical model parameters, including increasing the volume downstream of the regulator and increasing the orifice area. Both were shown to improve the regulator response.
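
    The regulator/volume interaction investigated here can be caricatured with a lumped-parameter model: an isothermal downstream plenum whose pressure follows dP/dt = (mdot_in - mdot_out) RT/V. All values are illustrative placeholders, not AA-1 design data.

    import numpy as np

    R, T = 296.8, 293.0          # nitrogen gas constant [J/(kg K)], temperature [K]
    V = 2.0e-3                   # downstream plenum volume [m^3]
    P_set = 2.0e6                # regulator setpoint [Pa]

    def mdot_reg(P, gain=5.0e-8):
        """Regulator modeled as proportional flow versus setpoint error."""
        return max(0.0, gain * (P_set - P))

    def mdot_thruster(P, t, A_eff=1.0e-6):
        """Choked-orifice-like outflow, on between t = 0.1 s and 0.3 s."""
        return A_eff * P / np.sqrt(R * T) if 0.1 <= t <= 0.3 else 0.0

    # Integrate dP/dt = (mdot_in - mdot_out) * R * T / V with forward Euler.
    P, dt = P_set, 1.0e-4
    for i in range(int(0.5 / dt)):
        t = i * dt
        P += dt * (mdot_reg(P) - mdot_thruster(P, t)) * R * T / V
    print(P)   # pressure recovering toward the setpoint after the pulse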

  3. DualSPHysics: A numerical tool to simulate real breakwaters

    NASA Astrophysics Data System (ADS)

    Zhang, Feng; Crespo, Alejandro; Altomare, Corrado; Domínguez, José; Marzeddu, Andrea; Shang, Shao-ping; Gómez-Gesteira, Moncho

    2018-02-01

    The open-source code DualSPHysics is used in this work to compute the wave run-up on an existing dike on the Chinese coast using realistic dimensions, bathymetry and wave conditions. The GPU computing power of DualSPHysics allows simulating real engineering problems that involve complex geometries at high resolution in a reasonable computational time. The code is first validated by comparing the numerical free-surface elevation, the wave orbital velocities and the time series of the run-up with physical data from a wave flume. Those experiments include a smooth dike and an armored dike with two layers of cubic blocks. After validation, the code is applied to a real case to obtain the wave run-up under different incident wave conditions. In order to simulate the real open sea, spurious reflections from the wavemaker are removed by using an active wave absorption technique.
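
    The core numerical idea in SPH codes such as DualSPHysics is the kernel-weighted particle summation. A minimal 2D density summation with the standard cubic spline kernel, on toy particle data:

    import numpy as np

    def cubic_spline_W(r, h):
        """Cubic spline kernel in 2D (normalization 10 / (7 pi h^2))."""
        q = r / h
        sigma = 10.0 / (7.0 * np.pi * h * h)
        w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
            np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0))
        return sigma * w

    rng = np.random.default_rng(1)
    pos = rng.uniform(0.0, 1.0, size=(400, 2))   # particle positions [m]
    m, h = 2.5, 0.08                             # particle mass [kg], smoothing length [m]

    # Density at each particle: rho_i = sum_j m_j W(|r_i - r_j|, h).
    diff = pos[:, None, :] - pos[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    rho = (m * cubic_spline_W(r, h)).sum(axis=1)
    print(rho.mean())   # should approximate the bulk density of ~1000 kg/m^2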

  4. Risk Reduction and Training using Simulation Based Tools - 12180

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, Irin P.

    2012-07-01

    Process Modeling and Simulation (M and S) has been used for many years in manufacturing and similar domains, as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time and cost prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M and S based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation based tool, a fuel handling facility simulation based tool and a tool for dynamic radiation exposure tracking. The next generation of M and S applications includes expanding simulation based tools into immersive and interactive training. The applications discussed here take a tool box approach to creating simulation based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to understand every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques. For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition. Integrating these tools into a larger virtual system provides a tool for making larger strategic decisions. The key to integrating and creating these virtual environments is the software and the process used to build them. Although they are important steps toward the use of simulation based tools in the nuclear domain, the applications described here represent only a small cross section of the possible benefits. The next generation of applications will likely focus on situational awareness and adaptive planning. Situational awareness refers to the ability to visualize in real time the state of operations. Some useful tools in this area are Geographic Information Systems (GIS), which help monitor and analyze geographically referenced information. Combined with such situational awareness capability, simulation tools can serve as the platform for adaptive planning tools. These are the tools that allow the decision maker to react to the changing environment in real time by synthesizing massive amounts of data into easily understood information. For the nuclear domain, this may mean the creation of Virtual Nuclear Systems, from Virtual Waste Processing Plants to Virtual Nuclear Reactors. (authors)

  5. A methodological, task-based approach to Procedure-Specific Simulations training.

    PubMed

    Setty, Yaki; Salzman, Oren

    2016-12-01

    Procedure-Specific Simulations (PSS) are realistic 3D simulations that provide a platform for practicing complete surgical procedures in a virtual-reality environment. While PSS have the potential to improve surgeons' proficiency, there are no existing standards or guidelines for developing PSS in a structured manner. We employ a unique platform inspired by game design to develop three-dimensional virtual-reality simulations of urethrovesical anastomosis during radical prostatectomy. 3D visualization is supported by stereo vision, providing a fully realistic view of the simulation. The software can be executed for any robotic surgery platform; specifically, we tested the simulation under a Windows environment on the RobotiX Mentor. Using the urethrovesical anastomosis simulation as a representative example, we present a task-based methodological approach to PSS training. The methodology provides tasks at increasing levels of difficulty, from a novice level of basic anatomy identification to an expert level that permits testing new surgical approaches. The modular methodology presented here can be easily extended to support more complex tasks. We foresee this methodology being used to integrate PSS as a complementary training process for surgical procedures.

  6. Infrared imagery acquisition process supporting simulation and real image training

    NASA Astrophysics Data System (ADS)

    O'Connor, John

    2012-05-01

    The increasing use of infrared sensors requires the development of advanced infrared training and simulation tools to meet current Warfighter needs. To prepare the force effectively and avoid negative training, training and simulation images must be both realistic and consistent with each other. The US Army Night Vision and Electronic Sensors Directorate has addressed this deficiency by developing and implementing infrared image collection methods that meet the needs of both real-image trainers and real-time simulations. The author presents innovative methods for the collection of high-fidelity digital infrared images and the associated equipment and environmental standards. The collected images are the foundation for the US Army and USMC Recognition of Combat Vehicles (ROC-V) real-image combat ID training and also support simulations including the Night Vision Image Generator and Synthetic Environment Core. The characteristics, consistency, and quality of these images have contributed to the success of these and other programs. To date, this method has been employed to generate signature sets for over 350 vehicles. The needs of future physics-based simulations will also be met by this data. NVESD's ROC-V image database will support the development of training and simulation capabilities as Warfighter needs evolve.

  7. The MeqTrees software system and its use for third-generation calibration of radio interferometers

    NASA Astrophysics Data System (ADS)

    Noordam, J. E.; Smirnov, O. M.

    2010-12-01

    Context. The formulation of the radio interferometer measurement equation (RIME) for a generic radio telescope by Hamaker et al. has provided us with an elegant mathematical apparatus for better understanding, simulation and calibration of existing and future instruments. The calibration of the new radio telescopes (LOFAR, SKA) would be unthinkable without the RIME formalism, and new software to exploit it. Aims: The MeqTrees software system is designed to implement numerical models, and to solve for arbitrary subsets of their parameters. It may be applied to many problems, but was originally geared towards implementing Measurement Equations in radio astronomy for the purposes of simulation and calibration. The technical goal of MeqTrees is to provide a tool for rapid implementation of such models, while offering performance comparable to hand-written code. We are also pursuing the wider goal of increasing the rate of evolution of radio astronomical software, by offering a tool that facilitates rapid experimentation, and exchange of ideas (and scripts). Methods: MeqTrees is implemented as a Python-based front-end called the meqbrowser, and an efficient (C++-based) computational back-end called the meqserver. Numerical models are defined on the front-end via a Python-based Tree Definition Language (TDL), then rapidly executed on the back-end. The use of TDL facilitates an extremely short turn-around time (hours rather than weeks or months) for experimentation with new ideas. This is also helped by unprecedented visualization capabilities for all final and intermediate results. A flexible data model and a number of important optimizations in the back-end ensures that the numerical performance is comparable to that of hand-written code. Results: MeqTrees is already widely used as the simulation tool for new instruments (LOFAR, SKA) and technologies (focal plane arrays). It has demonstrated that it can achieve a noise-limited dynamic range in excess of a million, on WSRT data. It is the only package that is specifically designed to handle what we propose to call third-generation calibration (3GC), which is needed for the new generation of giant radio telescopes, but can also improve the calibration of existing instruments.
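
    The 2 x 2 RIME at the heart of MeqTrees predicts each visibility as V_pq = J_p B J_q^H, where B is the source brightness (coherency) matrix and J_p, J_q are per-antenna Jones chains. A minimal numpy sketch with a single complex-gain Jones term; the gain values are arbitrary illustrative numbers.

    import numpy as np

    def coherency(I, Q, U, V):
        """Brightness matrix for a point source in a linear polarization basis."""
        return 0.5 * np.array([[I + Q, U + 1j * V],
                               [U - 1j * V, I - Q]])

    B = coherency(I=1.0, Q=0.1, U=0.05, V=0.0)

    # Per-antenna Jones matrices, here a diagonal complex gain (G-Jones) term.
    J_p = np.array([[1.02 * np.exp(0.1j), 0.0], [0.0, 0.97 * np.exp(-0.05j)]])
    J_q = np.array([[0.99 * np.exp(-0.2j), 0.0], [0.0, 1.05 * np.exp(0.15j)]])

    # Predicted visibility on baseline pq; calibration solves for the Jones
    # terms that best reconcile such predictions with observed visibilities.
    V_pq = J_p @ B @ J_q.conj().T
    print(V_pq)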

  8. A flexible object-oriented software framework for developing complex multimedia simulations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sydelko, P. J.; Dolph, J. E.; Christiansen, J. H.

    Decision makers involved in brownfields redevelopment and long-term stewardship must consider environmental conditions, future-use potential, site ownership, area infrastructure, funding resources, cost recovery, regulations, risk and liability management, community relations, and expected return on investment in a comprehensive and integrated fashion to achieve desired results. Successful brownfields redevelopment requires the ability to assess the impacts of redevelopment options on multiple interrelated aspects of the ecosystem, both natural and societal. Computer-based tools, such as simulation models, databases, and geographical information systems (GISs) can be used to address brownfields planning and project execution. The transparent integration of these tools into a comprehensive and dynamic decision support system would greatly enhance the brownfields assessment process. Such a system needs to be able to adapt to shifting and expanding analytical requirements and contexts. The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-oriented framework for developing and maintaining complex multidisciplinary simulations of a wide variety of application domains. The modeling domain of a specific DIAS-based simulation is determined by (1) software objects that represent the real-world entities that comprise the problem space (atmosphere, watershed, human), and (2) simulation models and other data processing applications that express the dynamic behaviors of the domain entities. Models and applications used to express dynamic behaviors can be either internal or external to DIAS, including existing legacy models written in various languages (FORTRAN, C, etc.). The flexible design framework of DIAS makes the objects adjustable to the context of the problem without a great deal of recoding. The DIAS Spatial Data Set facility allows parameters to vary spatially depending on the simulation context according to any of a number of 1-D, 2-D, or 3-D topologies. DIAS is also capable of interacting with other GIS packages and can import many standard spatial data formats. DIAS simulation capabilities can also be extended by including societal process models. Models that implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations allow for interaction and feedback among natural and societal processes. The ability to simulate the complex interplay of multimedia processes makes DIAS a promising tool for constructing applications for comprehensive community planning, including the assessment of multiple development and redevelopment scenarios.

  9. The modelling of odour dispersion as a support tool for the improvements of high odours impact plants.

    PubMed

    Luciano, Antonella; Torretta, Vincenzo; Mancini, Giuseppe; Eleuteri, Andrea; Raboni, Massimo; Viotti, Paolo

    2017-03-01

    Two odour impact assessment scenarios were studied during the upgrading of an existing waste treatment plant; CALPUFF was used for the simulation of odour dispersion. Olfactometric measurements, carried out over different periods and at different positions in the plant, were used for model calibration. Results from the simulations were reported in terms of statistics of odour concentrations and isopleth maps of the 98th percentile of the hourly peak concentrations, as required by European legislation and standards. The exceedances of perception thresholds and the emissions were used to guide the plant upgrade options. An hourly evaluation of odours was performed to determine the most impacting period of the day. An inverse application of the numerical simulation, starting from a defined odour threshold at the receptor, was made to allow the definition of the required abatement efficiency at the odour source location. Results from the proposed approach confirmed the suitability of odour dispersion modelling, not only in the authorization phase, but also as a tool for guiding technical and management actions in plant upgrades, so as to reduce impacts and improve public acceptance. The upgrade actions needed to achieve the expected efficiency are reported as well.
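
    The regulatory statistic used here, the 98th percentile of hourly peak concentrations over a year, is straightforward to compute from a modelled time series. The concentration series and the peak-to-mean factor below are assumed illustrative values, not CALPUFF output.

    import numpy as np

    rng = np.random.default_rng(42)
    hourly_mean = rng.lognormal(mean=0.0, sigma=1.0, size=8760)   # ou_E/m^3
    peak_to_mean = 2.3                     # assumed peak-to-mean ratio
    hourly_peak = peak_to_mean * hourly_mean

    c98 = np.percentile(hourly_peak, 98)
    print(f"98th percentile of hourly peak concentration: {c98:.1f} ou_E/m^3")
    # Compliance check against an odour criterion; the threshold is
    # illustrative and varies with jurisdiction and land use.
    print("exceeds 3 ou_E/m^3 criterion:", c98 > 3.0)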

  10. Development and Experimental Benchmark of Simulations to Predict Used Nuclear Fuel Cladding Temperatures during Drying and Transfer Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greiner, Miles

    Radial hydride formation in high-burnup used fuel cladding has the potential to radically reduce its ductility and suitability for long-term storage and eventual transport. To avoid this formation, the maximum post-reactor temperature must remain sufficiently low to limit the cladding hoop stress, and so that hydrogen from the existing circumferential hydrides will not dissolve and become available to re-precipitate into radial hydrides under the slow cooling conditions during drying, transfer and early dry-cask storage. The objective of this research is to develop and experimentally benchmark computational fluid dynamics simulations of heat transfer in post-pool-storage drying operations, when high-burnup fuel cladding is likely to experience its highest temperature. These benchmarked tools can play a key role in evaluating dry cask storage systems for extended storage of high-burnup fuels and post-storage transportation, including fuel retrievability. The benchmarked tools will be used to aid the design of efficient drying processes, as well as estimate variations of surface temperatures as a means of inferring helium integrity inside the canister or cask. This work will be conducted effectively because the principal investigator has experience developing these types of simulations, and has constructed a test facility that can be used to benchmark them.

  11. Tools and Equipment Modeling for Automobile Interactive Assembling Operating Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu Dianliang; Zhu Hongmin; Shanghai Key Laboratory of Advance Manufacturing Environment

    Tools and equipment play an important role in the simulation of virtual assembly, especially in assembly process simulation and planning. Because of their variety in function and complexity in structure and manipulation, the simulation of tools and equipment remains a challenge for interactive assembly operation. Based on an analysis of the details and characteristics of interactive operations for automobile assembly, the functional requirements for tools and equipment in automobile assembly are given. Then, a unified modeling method for information expression and function realization of general tools and equipment is presented, and the handling methods for manual, semi-automatic, and automatic tools and equipment are discussed. Finally, an application to the assembly simulation of the rear and front suspensions of the Roewe 750 automobile is given. The result shows that the modeling and handling methods are applicable to the interactive simulation of various tools and equipment, and can also be used to support assembly process planning in a virtual environment.

  12. Extreme groundwater levels caused by extreme weather conditions - the highest ever measured groundwater levels in Middle Germany and their management

    NASA Astrophysics Data System (ADS)

    Reinstorf, F.; Kramer, S.; Koch, T.; Pfützner, B.

    2017-12-01

    Extreme weather conditions during the years 2009-2011, in combination with changes in regional water management, led to maximum groundwater levels across large areas of Germany in 2011. This resulted in extensive waterlogging, especially in urban areas near rivers, where it caused major problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analytical situation, the development of a management tool, and the implementation of a groundwater management concept are shown. The central tool is a coupled water budget - groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with intensive stakeholder interaction, combined with high-resolution simulations, enabled the development of a management concept for extreme groundwater situations that takes account of sustainable and environmentally sound solutions, mainly on the basis of passive measures.

  13. Patch-clamp recordings of rat neurons from acute brain slices of the somatosensory cortex during magnetic stimulation

    PubMed Central

    Pashut, Tamar; Magidov, Dafna; Ben-Porat, Hana; Wolfus, Shuki; Friedman, Alex; Perel, Eli; Lavidor, Michal; Bar-Gad, Izhar; Yeshurun, Yosef; Korngreen, Alon

    2014-01-01

    Although transcranial magnetic stimulation (TMS) is a popular tool for both basic research and clinical applications, its actions on nerve cells are only partially understood. We have previously predicted, using compartmental modeling, that magnetic stimulation of central nervous system neurons depolarizes the soma, followed by initiation of an action potential in the initial segment of the axon. The simulations also predict that neurons with low current threshold are more susceptible to magnetic stimulation. Here we tested these theoretical predictions by combining in vitro patch-clamp recordings from rat brain slices with magnetic stimulation and compartmental modeling. In agreement with the modeling, our recordings demonstrate the dependence of magnetic stimulation-triggered action potentials on the type and state of the neuron and its orientation within the magnetic field. Our results suggest that the observed effects of TMS are deeply rooted in the biophysical properties of single neurons in the central nervous system and provide a framework both for interpreting existing TMS data and for developing new simulation-based tools and therapies. PMID:24917788

  14. Cross-verification of the GENE and XGC codes in preparation for their coupling

    NASA Astrophysics Data System (ADS)

    Jenko, Frank; Merlo, Gabriele; Bhattacharjee, Amitava; Chang, Cs; Dominski, Julien; Ku, Seunghoe; Parker, Scott; Lanti, Emmanuel

    2017-10-01

    A high-fidelity Whole Device Model (WDM) of a magnetically confined plasma is a crucial tool for planning and optimizing the design of future fusion reactors, including ITER. Aiming at building such a tool, in the framework of the Exascale Computing Project (ECP) the two existing gyrokinetic codes GENE (Eulerian delta-f) and XGC (PIC full-f) will be coupled, thus enabling first-principles kinetic WDM simulations to be carried out. In preparation for this ultimate goal, a benchmark between the two codes is carried out looking at ITG modes in the adiabatic electron limit. This verification exercise is also joined by the global Lagrangian PIC code ORB5. Linear and nonlinear comparisons have been carried out, neglecting collisions and sources for simplicity. Very good agreement is recovered on the frequency, growth rate, and mode structure of linear modes. A similarly excellent agreement is observed when comparing the evolution of the heat flux and of the background temperature profile during nonlinear simulations. Work supported by the US DOE under the Exascale Computing Project (17-SC-20-SC).

  15. Using Velocity Anisotropy to Analyze Magnetohydrodynamic Turbulence in Giant Molecular Clouds

    NASA Astrophysics Data System (ADS)

    Madrid, Alecio; Hernandez, Audra

    2018-01-01

    Structure function (SF) analysis is a strong tool for gauging the Alfvénic properties of magnetohydrodynamic (MHD) simulations, yet there is a lack of literature rigorously investigating its limitations in the context of radio spectroscopy. This study takes an in-depth approach to studying the limitations of SF analysis for analyzing MHD turbulence in giant molecular cloud (GMC) spectroscopy data. MHD turbulence plays a critical role in the structure and evolution of GMCs as well as in the formation of the sub-structures known to spawn stellar progenitors. Existing methods of detection are neither economical nor robust (e.g. dust polarization), and nowhere is this clearer than in the theoretical-observational divide in the current literature. A significant limitation of GMC spectroscopy results from the large variation in methods used for extracting GMCs from survey data. Thus, a robust method for studying MHD turbulence must correctly gauge physical properties regardless of the data extraction method used. While SF analysis has demonstrated strong potential across a range of simulated conditions, this study finds significant concerns regarding its feasibility as a robust tool in GMC spectroscopy.
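
    For readers unfamiliar with the technique, the second-order structure function SF2(l) = <|v(x + l) - v(x)|^2> is the quantity whose scaling is compared across MHD regimes. The sketch below computes SF2 for a synthetic 1-D velocity field in plain numpy; it is a generic illustration of the statistic, not the study's analysis pipeline.

        import numpy as np

        def structure_function(v, max_lag):
            # Second-order structure function SF2(l) = <|v(x + l) - v(x)|^2>
            # for a 1-D field sampled on a uniform grid (illustrative).
            return np.array([np.mean((v[lag:] - v[:-lag]) ** 2)
                             for lag in range(1, max_lag + 1)])

        # Synthetic field standing in for spectroscopically derived velocity
        # centroids; a random walk gives a well-defined SF2 power law.
        rng = np.random.default_rng(1)
        v = np.cumsum(rng.normal(size=1024))
        sf2 = structure_function(v, max_lag=100)

        # The log-log slope of SF2 versus lag is the diagnostic compared
        # across sub- and super-Alfvenic simulation regimes.
        slope = np.polyfit(np.log(np.arange(1, 101)), np.log(sf2), 1)[0]
        print(f"SF2 power-law index: {slope:.2f}")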

  16. Shale Fracture Analysis using the Combined Finite-Discrete Element Method

    NASA Astrophysics Data System (ADS)

    Carey, J. W.; Lei, Z.; Rougier, E.; Knight, E. E.; Viswanathan, H.

    2014-12-01

    Hydraulic fracturing (hydrofrac) is a successful method used to extract oil and gas from low-permeability rocks such as shale. However, challenges exist: industry experts estimate that for a single $10 million lateral wellbore fracking operation, only 10% of the hydrocarbons contained in the rock are extracted. To better understand how to improve hydrofrac recovery efficiencies and to lower its costs, LANL recently funded the Laboratory Directed Research and Development (LDRD) project: "Discovery Science of Hydraulic Fracturing: Innovative Working Fluids and Their Interactions with Rocks, Fractures, and Hydrocarbons". Under the support of this project, the LDRD modeling team is working with the experimental team to understand fracture initiation and propagation in shale rocks. LANL's hybrid hydro-mechanical (HM) tool, the Hybrid Optimization Software Suite (HOSS), is being used to simulate the complex fracture and fragmentation processes under a variety of different boundary conditions. HOSS is based on the combined finite-discrete element method (FDEM) and has proven to be a superior computational tool for multi-fracturing problems. In this work, the comparison of HOSS simulation results to triaxial core flooding experiments will be presented.

  17. Dynamic Systems Analysis for Turbine Based Aero Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.

    2016-01-01

    The aircraft engine design process seeks to optimize the overall system-level performance, weight, and cost for a given concept. Steady-state simulations and data are used to identify trade-offs that should be balanced to optimize the system, in a process known as systems analysis. These systems analysis simulations and data may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic systems analysis provides the capability for assessing the dynamic trade-offs at an earlier stage of the engine design process. The dynamic systems analysis concept, the developed tools, and the potential benefits are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed to provide the user with an estimate of the closed-loop performance (response time) and operability (high pressure compressor surge margin) for a given engine design and set of control design requirements. TTECTrA, along with engine deterioration information, can be used to develop a more generic relationship between performance and operability that can inform the engine design constraints and potentially lead to a more efficient engine.

  18. An Update on Improvements to NiCE Support for PROTEUS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Andrew; McCaskey, Alexander J.; Billings, Jay Jay

    2015-09-01

    The Department of Energy Office of Nuclear Energy's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program has supported the development of the NEAMS Integrated Computational Environment (NiCE), a modeling and simulation workflow environment that provides services and plugins to facilitate tasks such as code execution, model input construction, visualization, and data analysis. This report details the development of workflows for the reactor core neutronics application, PROTEUS. This advanced neutronics application (primarily developed at Argonne National Laboratory) aims to improve nuclear reactor design and analysis by providing an extensible and massively parallel, finite-element solver for current and advanced reactor fuel neutronics modeling. The integration of PROTEUS-specific tools into NiCE is intended to make the advanced capabilities that PROTEUS provides more accessible to the nuclear energy research and development community. This report will detail the work done to improve existing PROTEUS workflow support in NiCE. We will demonstrate and discuss these improvements, including the development of flexible IO services, an improved interface for input generation, and the addition of advanced Fortran development tools natively in the platform.

  19. Challenges of NDE Simulation Tool

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.

    2015-01-01

    Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large scale, complex geometry composite components/vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper will also discuss examples of the use of simulation tools at NASA to develop new damage characterization methods, and associated challenges of validating those methods.

  20. NASA's Integrated Instrument Simulator Suite for Atmospheric Remote Sensing from Spaceborne Platforms (ISSARS) and Its Role for the ACE and GPM Missions

    NASA Technical Reports Server (NTRS)

    Tanelli, Simone; Tao, Wei-Kuo; Hostetler, Chris; Kuo, Kwo-Sen; Matsui, Toshihisa; Jacob, Joseph C.; Niamsuwam, Noppasin; Johnson, Michael P.; Hair, John; Butler, Carolyn

    2011-01-01

    Forward simulation is an indispensable tool for evaluation of precipitation retrieval algorithms as well as for studying snow/ice microphysics and their radiative properties. The main challenge of the implementation arises from the size of the problem domain. To overcome this hurdle, assumptions need to be made to simplify complex cloud microphysics. It is important that these assumptions are applied consistently throughout the simulation process. ISSARS addresses this issue by providing a computationally efficient and modular framework that can integrate currently existing models and is also capable of expanding for future development. ISSARS is designed to accommodate the simulation needs of the Aerosol/Clouds/Ecosystems (ACE) mission and the Global Precipitation Measurement (GPM) mission: radars, microwave radiometers, and optical instruments such as lidars and polarimeters. ISSARS's computation is performed in three stages: input reconditioning (IRM), electromagnetic properties (scattering/emission/absorption) calculation (SEAM), and instrument simulation (ISM). The computation is implemented as a web service, while its configuration can be accessed through a web-based interface.

  1. Ultra-Scale Computing for Emergency Evacuation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhaduri, Budhendra L; Nutaro, James J; Liu, Cheng

    2010-01-01

    Emergency evacuations are carried out in anticipation of a disaster such as hurricane landfall or flooding, and in response to a disaster that strikes without a warning. Existing emergency evacuation modeling and simulation tools are primarily designed for evacuation planning and are of limited value in operational support for real time evacuation management. In order to align with desktop computing, these models reduce the data and computational complexities through simple approximations and representations of real network conditions and traffic behaviors, which rarely represent real-world scenarios. With the emergence of high resolution physiographic, demographic, and socioeconomic data and supercomputing platforms, it is possible to develop micro-simulation based emergency evacuation models that can foster development of novel algorithms for human behavior and traffic assignments, and can simulate evacuation of millions of people over a large geographic area. However, such advances in evacuation modeling and simulations demand computational capacity beyond the desktop scales and can be supported by high performance computing platforms. This paper explores the motivation and feasibility of ultra-scale computing for increasing the speed of high resolution emergency evacuation simulations.

  2. Towards a genetics-based adaptive agent to support flight testing

    NASA Astrophysics Data System (ADS)

    Cribbs, Henry Brown, III

    Although the benefits of aircraft simulation have been known since the late 1960s, simulation almost always entails interaction with a human test pilot. This "pilot-in-the-loop" simulation process provides useful evaluative information to the aircraft designer and provides a training tool to the pilot. Emulation of a pilot during the early phases of the aircraft design process might provide designers a useful evaluative tool. Machine learning might emulate a pilot in a simulated aircraft/cockpit setting. Preliminary work in the application of machine learning techniques, such as reinforcement learning, to aircraft maneuvering has shown promise. These studies used simplified interfaces between the machine learning agent and the aircraft simulation, and the simulations employed low-order equivalent system models. High-fidelity aircraft simulations exist, such as the simulations developed by NASA at its Dryden Flight Research Center. To expand the application domain of reinforcement learning to aircraft design, this study presents a series of experiments that examine a reinforcement learning agent in the role of test pilot. The NASA X-31 and F-106 high-fidelity simulations provide realistic aircraft for the agent to maneuver. The approach of the study is to examine an agent possessing a genetic-based, artificial neural network to approximate long-term expected cost (Bellman value) in a basic maneuvering task. The experiments evaluate different learning methods based on a common feedback function and an identical task. The learning methods evaluated are: Q-learning, Q(lambda)-learning, SARSA learning, and SARSA(lambda) learning. Experimental results indicate that, while prediction errors remained quite high, similar, repeatable behaviors occurred in both aircraft. This similarity in behavior demonstrates the portability of the agent between aircraft with different handling qualities (dynamics). Besides the adaptive behavior aspects of the study, the genetic algorithm used in the agent is shown to play an additive role in the shaping of the artificial neural network to the prediction task.
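
    The four learning methods compared differ in their value-update rules. The tabular sketch below shows the core distinction between Q-learning (off-policy) and SARSA (on-policy); the study itself used a genetic neural-network approximator rather than a table, so this is a simplified illustration, and the state/action sizes and rates are placeholders.

        import numpy as np

        # Tabular sketch of the two core update rules compared in the study.
        n_states, n_actions = 16, 4
        alpha, gamma = 0.1, 0.99   # learning rate and discount (illustrative)
        Q = np.zeros((n_states, n_actions))

        def q_learning_update(s, a, r, s_next):
            # Off-policy: bootstrap from the greedy action in s_next.
            Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])

        def sarsa_update(s, a, r, s_next, a_next):
            # On-policy: bootstrap from the action actually taken in s_next.
            Q[s, a] += alpha * (r + gamma * Q[s_next, a_next] - Q[s, a])

        # The lambda variants additionally maintain eligibility traces e(s, a),
        # decayed each step by gamma * lambda, so one reward updates many
        # recently visited state-action pairs instead of just the last one.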

  3. BioJazz: in silico evolution of cellular networks with unbounded complexity using rule-based modeling.

    PubMed

    Feng, Song; Ollivier, Julien F; Swain, Peter S; Soyer, Orkun S

    2015-10-30

    Systems biologists aim to decipher the structure and dynamics of signaling and regulatory networks underpinning cellular responses; synthetic biologists can use this insight to alter existing networks or engineer de novo ones. Both tasks will benefit from an understanding of which structural and dynamic features of networks can emerge from evolutionary processes, through which intermediary steps these arise, and whether they embody general design principles. As natural evolution at the level of network dynamics is difficult to study, in silico evolution of network models can provide important insights. However, current tools used for in silico evolution of network dynamics are limited to ad hoc computer simulations and models. Here we introduce BioJazz, an extendable, user-friendly tool for simulating the evolution of dynamic biochemical networks. Unlike previous tools for in silico evolution, BioJazz allows for the evolution of cellular networks with unbounded complexity by combining rule-based modeling with an encoding of networks that is akin to a genome. We show that BioJazz can be used to implement biologically realistic selective pressures and allows exploration of the space of network architectures and dynamics that implement prescribed physiological functions. BioJazz is provided as an open-source tool to facilitate its further development and use. Source code and user manuals are available at: http://oss-lab.github.io/biojazz and http://osslab.lifesci.warwick.ac.uk/BioJazz.aspx. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  4. pulver: an R package for parallel ultra-rapid p-value computation for linear regression interaction terms.

    PubMed

    Molnos, Sophie; Baumbach, Clemens; Wahl, Simone; Müller-Nurasyid, Martina; Strauch, Konstantin; Wang-Sattler, Rui; Waldenberger, Melanie; Meitinger, Thomas; Adamski, Jerzy; Kastenmüller, Gabi; Suhre, Karsten; Peters, Annette; Grallert, Harald; Theis, Fabian J; Gieger, Christian

    2017-09-29

    Genome-wide association studies allow us to understand the genetics of complex diseases. Human metabolism provides information about the disease-causing mechanisms, so it is usual to investigate the associations between genetic variants and metabolite levels. However, only considering genetic variants and their effects on one trait ignores the possible interplay between different "omics" layers. Existing tools only consider single-nucleotide polymorphism (SNP)-SNP interactions, and no practical tool is available for large-scale investigations of the interactions between pairs of arbitrary quantitative variables. We developed an R package called pulver to compute p-values for the interaction term in a very large number of linear regression models. Comparisons based on simulated data showed that pulver is much faster than the existing tools. This is achieved by using the correlation coefficient to test the null hypothesis, which avoids the costly computation of matrix inversions. Further speed-ups come from rearranging the iteration order over the different "omics" layers and from implementing the algorithm in the fast programming language C++. Furthermore, we applied our algorithm to data from the German KORA study to investigate a real-world problem involving the interplay among DNA methylation, genetic variants, and metabolite levels. The pulver package is a convenient and rapid tool for screening huge numbers of linear regression models for significant interaction terms in arbitrary pairs of quantitative variables. pulver is written in R and C++, and can be downloaded freely from CRAN at https://cran.r-project.org/web/packages/pulver/ .
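
    For orientation, the brute-force computation that pulver accelerates is an ordinary least-squares fit per variable triple, reading off the p-value of the interaction term. A minimal Python sketch of that baseline follows (pulver itself is an R package; the statsmodels version here, with synthetic data, is only a stand-in for the model being screened).

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Synthetic stand-ins for one SNP, one methylation probe, one metabolite.
        rng = np.random.default_rng(2)
        df = pd.DataFrame({"snp": rng.integers(0, 3, 500).astype(float),
                           "meth": rng.normal(size=500)})
        df["metab"] = 0.2 * df.snp * df.meth + rng.normal(size=500)

        # Baseline approach: fit metab ~ snp + meth + snp:meth and read off
        # the interaction p-value; pulver obtains an equivalent test from a
        # correlation coefficient, avoiding one least-squares fit per triple.
        fit = smf.ols("metab ~ snp * meth", data=df).fit()
        print(fit.pvalues["snp:meth"])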

  5. Reduced Order Model Implementation in the Risk-Informed Safety Margin Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Smith, Curtis L.; Alfonsi, Andrea

    2015-09-01

    The RISMC project aims to develop new advanced simulation-based tools to perform Probabilistic Risk Analysis (PRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermo-hydraulic behavior of the reactor primary and secondary systems but also the temporal evolution of external events and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, ms-s-minutes-years). As part of the RISMC PRA approach, a large number of computationally expensive simulation runs are required. An important aspect is that even though computational power is steadily growing, the overall computational cost of a RISMC analysis may not be viable for certain cases. A solution being evaluated is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs to perform and by employing surrogate models instead of the actual simulation codes. This report focuses on the use of reduced order modeling techniques that can be applied to any RISMC analysis to generate, analyze, and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much shorter time (µs instead of hours/days). We apply reduced order and surrogate modeling techniques to several RISMC types of analyses using RAVEN and RELAP-7 and show the advantages that can be gained.
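
    The surrogate idea above reduces to: run the expensive code at a small number of design points, fit a cheap approximation, then sample the approximation at PRA scale. A minimal numpy sketch under that assumption follows; the "expensive simulation" is stubbed with an analytic function and the polynomial surrogate is one of many possible choices, not the report's specific method.

        import numpy as np

        def expensive_simulation(x):
            # Stand-in for a RELAP-7-class run that would take hours per call.
            return np.sin(3 * x) + 0.5 * x

        # Train a surrogate on a small design-of-experiments sample...
        x_train = np.linspace(0.0, 2.0, 12)
        y_train = expensive_simulation(x_train)
        coeffs = np.polyfit(x_train, y_train, deg=5)

        # ...then evaluate it at PRA scale (many thousands of samples) in
        # microseconds per call instead of hours, as the report describes.
        x_query = np.random.default_rng(3).uniform(0.0, 2.0, 100_000)
        y_surrogate = np.polyval(coeffs, x_query)

        # Training-point residual, a first sanity check on surrogate quality.
        print(np.max(np.abs(np.polyval(coeffs, x_train) - y_train)))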

  6. Face and construct validation of a next generation virtual reality (Gen2-VR) surgical simulator.

    PubMed

    Sankaranarayanan, Ganesh; Li, Baichun; Manser, Kelly; Jones, Stephanie B; Jones, Daniel B; Schwaitzberg, Steven; Cao, Caroline G L; De, Suvranu

    2016-03-01

    Surgical performance is affected by distractors and interruptions to surgical workflow that exist in the operating room. However, traditional surgical simulators are used to train surgeons in a skills laboratory that does not recreate these conditions. To overcome this limitation, we have developed a novel, immersive virtual reality (Gen2-VR) system to train surgeons in these environments. The aim of this study was to establish face and construct validity of our system. The study was a within-subjects design, with subjects repeating a virtual peg transfer task under three different conditions: Case I: traditional VR; Case II: Gen2-VR with no distractions; and Case III: Gen2-VR with distractions and interruptions. In Case III, to simulate the effects of distractions and interruptions, music was played intermittently, the camera lens was fogged for 10 s and tools malfunctioned for 15 s at random points in time during the simulation. At the completion of the study subjects filled in a 5-point Likert scale feedback questionnaire. A total of sixteen subjects participated in this study. Friedman test showed significant difference in scores between the three conditions (p < 0.0001). Post hoc analysis using Wilcoxon signed-rank tests with Bonferroni correction further showed that all the three conditions were significantly different from each other (Case I, Case II, p < 0.0001), (Case I, Case III, p < 0.0001) and (Case II, Case III, p = 0.009). Subjects rated that fog (mean 4.18) and tool malfunction (median 4.56) significantly hindered their performance. The results showed that Gen2-VR simulator has both face and construct validity and that it can accurately and realistically present distractions and interruptions in a simulated OR, in spite of limitations of the current HMD hardware technology.
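
    The statistical procedure reported here (a Friedman test across the three within-subject conditions, followed by pairwise Wilcoxon signed-rank tests with Bonferroni correction) is easy to reproduce with scipy. The sketch below uses synthetic scores in place of the study's data.

        import numpy as np
        from scipy.stats import friedmanchisquare, wilcoxon

        # Synthetic task scores for 16 subjects under the three conditions.
        rng = np.random.default_rng(4)
        case1 = rng.normal(80, 5, 16)          # traditional VR
        case2 = case1 - rng.normal(5, 2, 16)   # Gen2-VR, no distractions
        case3 = case2 - rng.normal(5, 2, 16)   # Gen2-VR with distractions

        stat, p = friedmanchisquare(case1, case2, case3)
        print(f"Friedman: chi2 = {stat:.2f}, p = {p:.4g}")

        # Post hoc pairwise Wilcoxon signed-rank tests, Bonferroni-corrected
        # (three comparisons, so each p-value is multiplied by 3, capped at 1).
        pairs = {"I-II": (case1, case2), "I-III": (case1, case3),
                 "II-III": (case2, case3)}
        for name, (a, b) in pairs.items():
            p_pair = min(wilcoxon(a, b).pvalue * 3, 1.0)
            print(name, f"corrected p = {p_pair:.4g}")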

  7. Face and Construct Validation of a Next Generation Virtual Reality (Gen2-VR©) Surgical Simulator

    PubMed Central

    Sankaranarayanan, Ganesh; Li, Baichun; Manser, Kelly; Jones, Stephanie B.; Jones, Daniel B.; Schwaitzberg, Steven; Cao, Caroline G. L.; De, Suvranu

    2015-01-01

    Introduction Surgical performance is affected by distractors and interruptions to surgical workflow that exist in the operating room. However, traditional surgical simulators are used to train surgeons in a skills lab that does not recreate these conditions. To overcome this limitation, we have developed a novel, immersive virtual reality (Gen2-VR©) system to train surgeons in these environments. The aim of this study was to establish face and construct validity of our system. Methods and Procedures The study was a within-subjects design, with subjects repeating a virtual peg transfer task under three different conditions: CASE I: traditional VR; CASE II: Gen2-VR© with no distractions; and CASE III: Gen2-VR© with distractions and interruptions. In Case III, to simulate the effects of distractions and interruptions, music was played intermittently, the camera lens was fogged for 10 seconds and tools malfunctioned for 15 seconds at random points in time during the simulation. At the completion of the study subjects filled in a 5-point Likert scale feedback questionnaire. A total of sixteen subjects participated in this study. Results Friedman test showed significant difference in scores between the three conditions (p < 0.0001). Post hoc analysis using Wilcoxon Signed Rank tests with Bonferroni correction further showed that all the three conditions were significantly different from each other (Case I, Case II, p < 0.001), (Case I, Case III, p < 0.001) and (Case II, Case III, p = 0.009). Subjects rated that fog (mean = 4.18) and tool malfunction (median = 4.56) significantly hindered their performance. Conclusion The results showed that the Gen2-VR© simulator has both face and construct validity and it can accurately and realistically present distractions and interruptions in a simulated OR, in spite of limitations of the current HMD hardware technology. PMID:26092010

  8. Modeling and Simulation of an UAS Collision Avoidance Systems

    NASA Technical Reports Server (NTRS)

    Oliveros, Edgardo V.; Murray, A. Jennifer

    2010-01-01

    This paper describes the modeling and simulation of an Unmanned Aircraft Systems (UAS) collision avoidance system, capable of representing different types of scenarios for UAS collision avoidance. Commercial and military piloted aircraft currently utilize various systems for collision avoidance, such as the Traffic Alert and Collision Avoidance System (TCAS), Automatic Dependent Surveillance-Broadcast (ADS-B), radar, and Electro-Optical and Infrared Sensors (EO-IR). The integration of information from these systems is done by the pilot in the aircraft to determine the best course of action. In order to operate optimally in the National Airspace System (NAS), UAS have to work in a similar or equivalent manner to a piloted aircraft by applying the principle of "detect-see and avoid" (DSA) to other air traffic. Hence, we have taken these existing sensor technologies into consideration in order to meet the challenge of researching the modeling and simulation of an approximated DSA system. A schematic model for a UAS Collision Avoidance System (CAS) has been developed in a closed-loop block diagram for that purpose. We have found that the most suitable software to carry out this task is the Satellite Tool Kit (STK) from Analytical Graphics Inc. (AGI). We have used the Aircraft Mission Modeler (AMM) for modeling and simulation of a scenario where a UAS is placed on a possible collision path with an initial intruder and then with a second intruder, but is able to avoid them by executing a right turn maneuver and then climbing. Radars have also been modeled with specific characteristics for the UAS and both intruders. The software provides analytical tools, graphical user interfaces, and data-control tools which allow the operator to simulate different conditions. Extensive simulations have been carried out which returned excellent results.

  9. A generalised individual-based algorithm for modelling the evolution of quantitative herbicide resistance in arable weed populations.

    PubMed

    Liu, Chun; Bridges, Melissa E; Kaundun, Shiv S; Glasgow, Les; Owen, Micheal Dk; Neve, Paul

    2017-02-01

    Simulation models are useful tools for predicting and comparing the risk of herbicide resistance in weed populations under different management strategies. Most existing models assume a monogenic mechanism governing herbicide resistance evolution. However, growing evidence suggests that herbicide resistance is often inherited in a polygenic or quantitative fashion. Therefore, we constructed a generalised modelling framework to simulate the evolution of quantitative herbicide resistance in summer annual weeds. Real-field management parameters based on Amaranthus tuberculatus (Moq.) Sauer (syn. rudis) control with glyphosate and mesotrione in Midwestern US maize-soybean agroecosystems demonstrated that the model can represent evolved herbicide resistance in realistic timescales. Sensitivity analyses showed that genetic and management parameters were impactful on the rate of quantitative herbicide resistance evolution, whilst biological parameters such as emergence and seed bank mortality were less important. The simulation model provides a robust and widely applicable framework for predicting the evolution of quantitative herbicide resistance in summer annual weed populations. The sensitivity analyses identified weed characteristics that would favour herbicide resistance evolution, including high annual fecundity, large resistance phenotypic variance and pre-existing herbicide resistance. Implications for herbicide resistance management and potential use of the model are discussed. © 2016 Society of Chemical Industry.
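
    The core of a quantitative-resistance model of this kind can be pictured as a polygenic trait under truncation selection: spraying removes plants below a dose-dependent trait cutoff, and the population mean responds according to the breeder's equation (response = heritability x selection differential). The toy loop below illustrates that mechanism only; all parameter values are placeholders, not the paper's calibrated inputs.

        import numpy as np

        # Toy generational loop: resistance is a normally distributed polygenic
        # trait; spraying kills plants whose trait value falls below a cutoff.
        rng = np.random.default_rng(5)
        mean_r, var_r, h2 = 0.0, 1.0, 0.3   # trait mean/variance, heritability
        kill_cutoff = 1.0                    # survivors need trait > cutoff

        for gen in range(10):
            trait = rng.normal(mean_r, np.sqrt(var_r), 50_000)
            survivors = trait[trait > kill_cutoff]
            if survivors.size == 0:
                break
            # Breeder's equation: response = heritability * selection differential.
            mean_r += h2 * (survivors.mean() - trait.mean())
            print(f"gen {gen}: mean resistance {mean_r:.3f}, "
                  f"survival {survivors.size / trait.size:.1%}")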

  10. Design and development of a unit element microstrip antenna for aircraft collision avoidance system

    NASA Astrophysics Data System (ADS)

    De, Debajit; Sahu, Prasanna Kumar

    2017-10-01

    Aircraft/traffic alert and collision avoidance system (ACAS/TCAS) is an airborne system designed to serve as last-line-of-defense equipment for avoiding mid-air collisions between aircraft. In the existing system, four monopole stub elements are used as the ACAS directional antenna and one blade-type element is used as the ACAS omnidirectional antenna. The existing ACAS antenna has some drawbacks, such as low gain, large beamwidth, and frequency and beam tuning/scanning issues. Antenna issues such as the reception of unwanted signals may make it difficult to identify possible threats. In this paper, the focus is on the design and development of a unit-element microstrip antenna which can be used for the ACAS application and can overcome the possible limitations associated with the existing techniques. Two proposed antenna models are presented here: a single-feed and a dual-feed microstrip dual-patch slotted antenna. These are designed and simulated in the CST Microwave Studio tool. The performance and other antenna characteristics have been explored from the simulation results, followed by antenna fabrication and measurement. A good reflection coefficient, Voltage Standing Wave Ratio (VSWR), narrow beamwidth, well-defined directional radiation pattern, and high gain and directivity make the proposed antenna a good candidate for this application.
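
    As generic background for the patch design described above, the standard rectangular microstrip sizing equations (Hammerstad's width, effective permittivity, and length-extension formulas) give a first-cut geometry for a target resonant frequency. The sketch below applies them near the TCAS band; the substrate values are assumptions for illustration, and the paper's slotted dual-patch geometry is considerably more elaborate.

        import numpy as np

        C = 299_792_458.0  # speed of light, m/s

        def patch_dimensions(f_r, eps_r, h):
            # Standard rectangular microstrip patch design equations
            # (Hammerstad). f_r in Hz, h (substrate height) in m.
            w = C / (2 * f_r) * np.sqrt(2 / (eps_r + 1))
            eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 * (1 + 12 * h / w) ** -0.5
            dl = 0.412 * h * ((eps_eff + 0.3) * (w / h + 0.264)
                              / ((eps_eff - 0.258) * (w / h + 0.8)))
            l = C / (2 * f_r * np.sqrt(eps_eff)) - 2 * dl
            return w, l

        # TCAS/ACAS operates near 1.03-1.09 GHz; FR-4-like substrate assumed.
        print(patch_dimensions(f_r=1.06e9, eps_r=4.4, h=1.6e-3))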

  11. Simulation Activity in Otolaryngology Residencies.

    PubMed

    Deutsch, Ellen S; Wiet, Gregory J; Seidman, Michael; Hussey, Heather M; Malekzadeh, Sonya; Fried, Marvin P

    2015-08-01

    Simulation has become a valuable tool in medical education, and several specialties accept or require simulation as a resource for resident training or assessment as well as for board certification or maintenance of certification. This study investigates current simulation resources and activities in US otolaryngology residency programs and examines interest in advancing simulation training and assessment within the specialty. Web-based survey. US otolaryngology residency training programs. An electronic web-based survey was disseminated to all US otolaryngology program directors to determine their respective institutional and departmental simulation resources, existing simulation activities, and interest in further simulation initiatives. Descriptive results are reported. Responses were received from 43 of 104 (43%) residency programs. Simulation capabilities and resources are available in most respondents' institutions (78.6% report onsite resources; 73.8% report availability of models, manikins, and devices). Most respondents (61%) report limited simulation activity within otolaryngology. Areas of simulation are broad, addressing technical and nontechnical skills related to clinical training (94%). Simulation is infrequently used for research, credentialing, or systems improvement. The majority of respondents (83.8%) expressed interest in participating in multicenter trials of simulation initiatives. Most respondents from otolaryngology residency programs have incorporated some simulation into their curriculum. Interest among program directors to participate in future multicenter trials appears high. Future research efforts in this area should aim to determine optimal simulators and simulation activities for training and assessment as well as how to best incorporate simulation into otolaryngology residency training programs. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2015.

  12. A web-based simulation of a longitudinal clinic used in a 4-week ambulatory rotation: a cohort study

    PubMed Central

    Wong, Rene WG; Lochnan, Heather A

    2009-01-01

    Background Residency training takes place primarily on inpatient wards. In the absence of a resident continuity clinic, internal medicine residents rely on block rotations to learn about continuity of care. Alternate methods to introduce continuity of care are needed. Methods A web-based tool, Continuity of Care Online Simulations (COCOS), was designed for use in a one-month, postgraduate clinical rotation in endocrinology. It is an interactive tool that simulates the continuing care of any patient with a chronic endocrine disease. Twenty-three residents in internal medicine participated in a study to investigate the effects of using COCOS during a clinical rotation in endocrinology on pre-post knowledge test scores and self-assessment of confidence. Results Compared to residents who did the rotation alone, residents who used COCOS during the rotation had significantly higher improvements in test scores (% increase in pre-post test scores +21.6 [standard deviation, SD, 8.0] vs. +5.9 [SD 6.8]; p < .001). Test score improvements were most pronounced for less commonly seen conditions. There were no significant differences in changes in confidence. Residents rated COCOS very highly, recommending its use as a standard part of the rotation and throughout residency. Conclusion A stand-alone web-based tool can be incorporated into an existing clinical rotation to help residents learn about continuity of care. It has the most potential to teach residents about topics that are less commonly seen during a clinical rotation. The adaptable, web-based format allows the creation of cases for most chronic medical conditions. PMID:19187554

  13. MVIAeval: a web tool for comprehensively evaluating the performance of a new missing value imputation algorithm.

    PubMed

    Wu, Wei-Sheng; Jhou, Meng-Jhun

    2017-01-13

    Missing value imputation is important for microarray data analyses because microarray data with missing values would significantly degrade the performance of the downstream analyses. Although many microarray missing value imputation algorithms have been developed, an objective and comprehensive performance comparison framework is still lacking. To solve this problem, we previously proposed a framework which can perform a comprehensive performance comparison of different existing algorithms. Also the performance of a new algorithm can be evaluated by our performance comparison framework. However, constructing our framework is not an easy task for the interested researchers. To save researchers' time and efforts, here we present an easy-to-use web tool named MVIAeval (Missing Value Imputation Algorithm evaluator) which implements our performance comparison framework. MVIAeval provides a user-friendly interface allowing users to upload the R code of their new algorithm and select (i) the test datasets among 20 benchmark microarray (time series and non-time series) datasets, (ii) the compared algorithms among 12 existing algorithms, (iii) the performance indices from three existing ones, (iv) the comprehensive performance scores from two possible choices, and (v) the number of simulation runs. The comprehensive performance comparison results are then generated and shown as both figures and tables. MVIAeval is a useful tool for researchers to easily conduct a comprehensive and objective performance evaluation of their newly developed missing value imputation algorithm for microarray data or any data which can be represented as a matrix form (e.g. NGS data or proteomics data). Thus, MVIAeval will greatly expedite the progress in the research of missing value imputation algorithms.
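
    A typical performance index in such benchmarks works by hiding known entries of a complete matrix, imputing them, and scoring the error; normalized RMSE is one common choice. The numpy skeleton below illustrates that evaluation loop with a trivial row-mean baseline (the indices and datasets actually used by MVIAeval may differ).

        import numpy as np

        def nrmse(imputed, truth, mask):
            # Normalized RMSE over the artificially masked entries, a common
            # performance index for missing-value imputation benchmarks.
            err = imputed[mask] - truth[mask]
            return np.sqrt(np.mean(err ** 2)) / np.std(truth[mask])

        # Benchmark skeleton: hide 5% of a complete matrix, impute, score.
        rng = np.random.default_rng(6)
        truth = rng.normal(size=(200, 50))       # complete expression matrix
        mask = rng.random(truth.shape) < 0.05    # entries to hide
        data = np.where(mask, np.nan, truth)

        # Baseline "imputation": replace each missing value with its row mean.
        row_means = np.nanmean(data, axis=1, keepdims=True)
        imputed = np.where(mask, row_means, data)
        print(f"NRMSE = {nrmse(imputed, truth, mask):.3f}")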

  14. Implementation of depolarization due to beam-beam effects in the beam-beam interaction simulation tool GUINEA-PIG++

    NASA Astrophysics Data System (ADS)

    Rimbault, C.; Le Meur, G.; Blampuy, F.; Bambade, P.; Schulte, D.

    2009-12-01

    Depolarization is a new feature in the beam-beam simulation tool GUINEA-PIG++ (GP++). The results of this simulation are studied and compared with another beam-beam simulation tool, CAIN, considering different beam parameters for the International Linear Collider (ILC) with a centre-of-mass energy of 500 GeV.

  15. Rule-based simulation models

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph L.; Seraphine, Kathleen M.

    1991-01-01

    Procedural modeling systems, rule-based modeling systems, and a method for converting a procedural model to a rule-based model are described. Simulation models are used to represent real-time engineering systems. A real-time system can be represented by a set of equations or functions connected so that they perform in the same manner as the actual system. Most modeling system languages are based on FORTRAN or some other procedural language; therefore, they must be enhanced with a reaction capability. Rule-based systems are reactive by definition. Once the engineering system has been decomposed into a set of calculations using only basic algebraic unary operations, a knowledge network of calculations and functions can be constructed. The knowledge network required by a rule-based system can be generated by a knowledge acquisition tool or a source-level compiler. The compiler would take an existing model source file, a syntax template, and a symbol table and generate the knowledge network. Thus, existing procedural models can be translated and executed by a rule-based system. Neural models can provide the high-capacity data manipulation required by the most complex real-time models.
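
    The knowledge-network idea above can be sketched concretely: each node is one calculation with explicit input dependencies, and evaluation resolves the dependency graph instead of running statements in a fixed order. The Python sketch below is illustrative only (hypothetical node names, not the compiler output the paper describes).

        # Minimal knowledge network: each node is one calculation with
        # explicit inputs; evaluation resolves dependencies recursively.
        network = {
            "dynamic_pressure": (lambda rho, v: 0.5 * rho * v ** 2,
                                 ["rho", "v"]),
            "lift": (lambda q, s, cl: q * s * cl,
                     ["dynamic_pressure", "s", "cl"]),
        }
        values = {"rho": 1.225, "v": 60.0, "s": 16.0, "cl": 0.9}

        def evaluate(name):
            # Resolve dependencies, memoizing results in `values`.
            if name not in values:
                func, deps = network[name]
                values[name] = func(*(evaluate(d) for d in deps))
            return values[name]

        print(evaluate("lift"))
        # A reactive engine would invalidate and recompute only the nodes
        # downstream of a changed input, rather than rerunning a procedural
        # model end to end.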

  16. Building Energy Simulation Test for Existing Homes (BESTEST-EX) (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.; Polly, B.

    2011-12-01

    This presentation discusses the goals of NREL Analysis Accuracy R&D; BESTEST-EX goals; what BESTEST-EX is; how it works; 'Building Physics' cases; 'Building Physics' reference results; 'utility bill calibration' cases; and limitations and potential future work. The goals of NREL Analysis Accuracy R&D are to: (1) provide industry with the tools and technical information needed to improve the accuracy and consistency of analysis methods; (2) reduce the risks associated with purchasing, financing, and selling energy efficiency upgrades; and (3) enhance software and input collection methods considering impacts on accuracy, cost, and time of energy assessments. The BESTEST-EX goals are to: (1) test software predictions of retrofit energy savings in existing homes; (2) ensure building physics calculations and utility bill calibration procedures perform up to a minimum standard; and (3) quantify the impact of uncertainties in input audit data and occupant behavior. BESTEST-EX is a repeatable procedure that tests how well audit software predictions compare to the current state of the art in building energy simulation. There is no direct truth standard; however, the reference software have been subjected to validation testing, including comparisons with empirical data.

  17. Using indirect comparisons to compare interventions within a Cochrane review: a tool for comparative effectiveness research.

    PubMed

    Agapova, Maria; Devine, Emily B; Nguyen, Hiep; Wolf, Fredric M; Inoue, Lurdes Y T

    2014-07-01

    Assessing relative performance among competing interventions is an important part of comparative effectiveness research. Bayesian indirect comparisons add information to existing Cochrane reviews, such as which intervention is likely to perform best. However, heterogeneity variance priors may influence results and, potentially, clinical guidance. We highlight the features of Bayesian indirect comparisons using a case study of a Cochrane review update in asthma care. The probability that one self-management educational intervention outperforms others is estimated. Simulation studies investigate the effect of heterogeneity variance prior distributions. Results suggest a 55% probability that individual education is best, followed by combination (39%) and group (6%). The intervention with few trials was sensitive to prior distributions. Bayesian indirect comparison updates of Cochrane reviews are valuable comparative effectiveness research tools.
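
    For context, the simplest (frequentist, Bucher-style) adjusted indirect comparison underlying these analyses differences two effects measured against a common comparator, with variances adding; the Bayesian version in the review additionally models heterogeneity through a prior. A minimal sketch with illustrative numbers (not the asthma review's data):

        import numpy as np

        def indirect_comparison(d_ac, se_ac, d_bc, se_bc):
            # Bucher-style indirect comparison of A vs B through a common
            # comparator C: effects subtract, variances add.
            d_ab = d_ac - d_bc
            se_ab = np.sqrt(se_ac ** 2 + se_bc ** 2)
            return d_ab, (d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab)

        # Illustrative effect sizes and standard errors only.
        effect, ci = indirect_comparison(d_ac=-0.40, se_ac=0.15,
                                         d_bc=-0.25, se_bc=0.12)
        print(f"A vs B: {effect:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")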

  18. Spectrum simulation in DTSA-II.

    PubMed

    Ritchie, Nicholas W M

    2009-10-01

    Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy to use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on φ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.

  19. Computational Assessment of a 3-Stage Axial Compressor Which Provides Airflow to the NASA 11- by 11-Foot Transonic Wind Tunnel, Including Design Changes for Increased Performance

    NASA Technical Reports Server (NTRS)

    Kulkarni, Sameer; Beach, Timothy A.; Jorgenson, Philip C.; Veres, Joseph P.

    2017-01-01

    A 24-foot diameter 3-stage axial compressor powered by variable-speed induction motors provides the airflow in the closed-return 11- by 11-Foot Transonic Wind Tunnel (11-Foot TWT) Facility at NASA Ames Research Center at Moffett Field, California. The facility is part of the Unitary Plan Wind Tunnel, which was completed in 1955. Since then, upgrades made to the 11-Foot TWT such as flow conditioning devices and instrumentation have increased blockage and pressure loss in the tunnel, somewhat reducing the peak Mach number capability of the test section. Due to erosion effects on the existing aluminum alloy rotor blades, fabrication of new steel rotor blades is planned. This presents an opportunity to increase the Mach number capability of the tunnel by redesigning the compressor for increased pressure ratio. Challenging design constraints exist for any proposed design, demanding the use of the existing driveline, rotor disks, stator vanes, and hub and casing flow paths, so as to minimize cost and installation time. The current effort was undertaken to characterize the performance of the existing compressor design using available design tools and computational fluid dynamics (CFD) codes and subsequently recommend a new compressor design to achieve higher pressure ratio, which directly correlates with increased test section Mach number. The constant cross-sectional area of the compressor leads to high diffusion factors, which presents a challenge in simulating the existing design. The CFD code APNASA was used to simulate the aerodynamic performance of the existing compressor. The simulations were compared to performance predictions from the HT0300 turbomachinery design and analysis code, and to compressor performance data taken during a 1997 facility test. It was found that the CFD simulations were sensitive to endwall leakages associated with stator buttons and, to a lesser degree, to under-stator-platform flow recirculation at the hub. When stator button leakages were modeled, pumping capability increased by over 20% of pressure rise at the design point due to a large reduction in aerodynamic blockage at the hub. Incorporating the stator button leakages was crucial to matching the test data. Under-stator-platform flow recirculation was thought to be large due to a lack of seals. The effect of this recirculation was assessed with APNASA simulations recirculating 0.5%, 1%, and 2% of inlet flow about stators 1 and 2, modeled as axisymmetric mass flux boundary conditions on the hub before and after the vanes. The injection of flow ahead of the stators tended to re-energize the boundary layer and reduce hub separations, resulting in about 3% increased stall margin per 1% of inlet flow recirculated. In order to assess the value of the flow recirculation, a mixing-plane simulation of the compressor which gridded the under-stator cavities was generated using the ADPAC CFD code. This simulation indicated that about 0.65% of the inlet flow is recirculated around each shrouded stator. This collective information was applied during the redesign of the compressor. A potential design was identified using HT0300 which improved overall pressure ratio by removing pre-swirl into rotor 1, replacing the existing NASA 65-series rotor sections with double circular arc sections, and re-staggering the rotors and the existing stators. The performance of the new design predicted by APNASA and HT0300 is compared to that of the existing design.
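
    The blade-loading metric mentioned above is, in standard turbomachinery practice, the Lieblein diffusion factor, D = 1 - w2/w1 + |Δw_θ|/(2σw1). A small helper illustrating the formula follows; the velocity-triangle numbers are placeholders, not values from the 11-Foot TWT design.

        import numpy as np

        def lieblein_diffusion_factor(w1, w2, dw_theta, solidity):
            # Lieblein diffusion factor, the standard blade-loading metric:
            # D = 1 - w2/w1 + |dw_theta| / (2 * solidity * w1).
            # Values much above ~0.6 indicate risk of boundary-layer separation.
            return 1.0 - w2 / w1 + abs(dw_theta) / (2.0 * solidity * w1)

        # Illustrative velocity-triangle numbers (m/s) and solidity.
        print(lieblein_diffusion_factor(w1=220.0, w2=160.0,
                                        dw_theta=90.0, solidity=1.2))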

  20. Numerical Investigation of Plasma Detachment in Magnetic Nozzle Experiments

    NASA Technical Reports Server (NTRS)

    Sankaran, Kamesh; Polzin, Kurt A.

    2008-01-01

    At present there exists no generally accepted theoretical model that provides a consistent physical explanation of plasma detachment from an externally-imposed magnetic nozzle. To make progress towards that end, simulation of plasma flow in the magnetic nozzle of an arcjet experiment is performed using a multidimensional numerical simulation tool that includes theoretical models of the various dispersive and dissipative processes present in the plasma. This is an extension of the simulation tool employed in previous work by Sankaran et al. The aim is to compare the computational results with various proposed magnetic nozzle detachment theories to develop an understanding of the physical mechanisms that cause detachment. An applied magnetic field topology is obtained using a magnetostatic field solver (see Fig. I), and this field is superimposed on the time-dependent magnetic field induced in the plasma to provide a self-consistent field description. The applied magnetic field and model geometry match those found in experiments by Kuriki and Okada. This geometry is modeled because there is a substantial amount of experimental data that can be compared to the computational results, allowing for validation of the model. In addition, comparison of the simulation results with the experimentally obtained plasma parameters will provide insight into the mechanisms that lead to plasma detachment, revealing how they scale with different input parameters. Further studies will focus on modeling literature experiments both for the purpose of additional code validation and to extract physical insight regarding the mechanisms driving detachment.

  1. NEAMS Update. Quarterly Report for October - December 2011.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, K.

    2012-02-16

    The Advanced Modeling and Simulation Office within the DOE Office of Nuclear Energy (NE) has been charged with revolutionizing the design tools used to build nuclear power plants during the next 10 years. To accomplish this, the DOE has brought together the national laboratories, U.S. universities, and the nuclear energy industry to establish the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program. The mission of NEAMS is to modernize computer modeling of nuclear energy systems and improve the fidelity and validity of modeling results using contemporary software environments and high-performance computers. NEAMS will create a set of engineering-level codes aimed at designing and analyzing the performance and safety of nuclear power plants and reactor fuels. The truly predictive nature of these codes will be achieved by modeling the governing phenomena at the spatial and temporal scales that dominate the behavior. These codes will be executed within a simulation environment that orchestrates code integration with respect to spatial meshing, computational resources, and execution to give the user a common 'look and feel' for setting up problems and displaying results. NEAMS is building upon a suite of existing simulation tools, including those developed by the federal Scientific Discovery through Advanced Computing and Advanced Simulation and Computing programs. NEAMS also draws upon existing simulation tools for materials and nuclear systems, although many of these are limited in terms of scale, applicability, and portability (their ability to be integrated into contemporary software and hardware architectures). NEAMS investments have directly and indirectly supported additional NE research and development programs, including those devoted to waste repositories, safeguarded separations systems, and long-term storage of used nuclear fuel. NEAMS is organized into two broad efforts, each comprising four elements. The quarterly highlights for October-December 2011 are: (1) Version 1.0 of AMP, the fuel assembly performance code, was tested on the JAGUAR supercomputer and released on November 1, 2011; a detailed discussion of this new simulation tool is given. (2) A coolant sub-channel model and a preliminary UO2 smeared-cracking model were implemented in BISON, the single-pin fuel code; more information on how these models were developed and benchmarked is given. (3) The Object Kinetic Monte Carlo model was implemented to account for nucleation events in meso-scale simulations, and a discussion of the significance of this advance is given. (4) The SHARP neutronics module, PROTEUS, was expanded to be applicable to all types of reactors, and a discussion of the importance of PROTEUS is given. (5) A plan has been finalized for integrating the high-fidelity, three-dimensional reactor code SHARP with both the systems-level code RELAP7 and the fuel assembly code AMP; this is a new initiative. (6) Work began to evaluate the applicability of AMP to the problem of dry storage of used fuel and to define a relevant problem to test the applicability. (7) A code to obtain phonon spectra from the force-constant matrix for a crystalline lattice has been completed; this important bridge between subcontinuum and continuum phenomena is discussed. (8) Benchmarking was begun on the meso-scale, finite-element fuels code MARMOT to validate its new variable splitting algorithm. (9) A very computationally demanding simulation of diffusion-driven nucleation of new microstructural features has been completed; an explanation of the difficulty of this simulation is given. (10) Experiments were conducted with deformed steel to validate a crystal plasticity finite-element code for body-centered cubic iron. (11) The Capability Transfer Roadmap was completed and published as an internal laboratory technical report. (12) The AMP fuel assembly code input generator was integrated into the NEAMS Integrated Computational Environment (NiCE); more details on the planned NEAMS computing environment are given. (13) The NEAMS program website (neams.energy.gov) is nearly ready to launch.

  2. Calibrating a forest landscape model to simulate frequent fire in Mediterranean-type shrublands

    USGS Publications Warehouse

    Syphard, A.D.; Yang, J.; Franklin, J.; He, H.S.; Keeley, J.E.

    2007-01-01

    In Mediterranean-type ecosystems (MTEs), fire disturbance influences the distribution of most plant communities, and altered fire regimes may be more important than climate factors in shaping future MTE vegetation dynamics. Models that simulate the high-frequency fire and post-fire response strategies characteristic of these regions will be important tools for evaluating potential landscape change scenarios. However, few existing models have been designed to simulate these properties over long time frames and broad spatial scales. We refined a landscape disturbance and succession (LANDIS) model to operate on an annual time step and to simulate altered fire regimes in a southern California Mediterranean landscape. After developing a comprehensive set of spatial and non-spatial variables and parameters, we calibrated the model to simulate very high fire frequencies and evaluated the simulations under several parameter scenarios representing hypotheses about system dynamics. The goal was to ensure that observed model behavior would simulate the specified fire regime parameters, and that the predictions were reasonable based on current understanding of community dynamics in the region. After calibration, the two dominant plant functional types responded realistically to different fire regime scenarios. Therefore, this model offers a new alternative for simulating altered fire regimes in MTE landscapes. © 2007 Elsevier Ltd. All rights reserved.

  3. Design and evaluation of an augmented reality simulator using leap motion.

    PubMed

    Wright, Trinette; de Ribaupierre, Sandrine; Eagleson, Roy

    2017-10-01

    Advances in virtual and augmented reality (AR) are having an impact on the medical field in areas such as surgical simulation. Improvements to surgical simulation will provide students and residents with additional training and evaluation methods. This is particularly important for procedures such as the endoscopic third ventriculostomy (ETV), which residents perform regularly. Simulators such as NeuroTouch have been designed to aid in training associated with this procedure. The authors have designed an affordable and easily accessible ETV simulator, and compare it with the existing NeuroTouch for usability and training effectiveness. The simulator was developed using Unity, Vuforia and the Leap Motion (LM) for an AR environment. The participants, 16 novices and two expert neurosurgeons, were asked to complete 40 targeting tasks. Participants used the NeuroTouch tool or a virtual hand controlled by the LM to select the position and orientation for these tasks. The time to complete each task was recorded, and the trajectory log files were used to calculate performance. The resulting speed and accuracy data from novices and experts are compared, and the authors discuss the objective performance of training in terms of targeting speed and accuracy for each system.

  4. Design and evaluation of an augmented reality simulator using leap motion

    PubMed Central

    de Ribaupierre, Sandrine; Eagleson, Roy

    2017-01-01

    Advances in virtual and augmented reality (AR) are having an impact on the medical field in areas such as surgical simulation. Improvements to surgical simulation will provide students and residents with additional training and evaluation methods. This is particularly important for procedures such as the endoscopic third ventriculostomy (ETV), which residents perform regularly. Simulators such as NeuroTouch have been designed to aid in training associated with this procedure. The authors have designed an affordable and easily accessible ETV simulator, and compare it with the existing NeuroTouch for usability and training effectiveness. The simulator was developed using Unity, Vuforia and the Leap Motion (LM) for an AR environment. The participants, 16 novices and two expert neurosurgeons, were asked to complete 40 targeting tasks. Participants used the NeuroTouch tool or a virtual hand controlled by the LM to select the position and orientation for these tasks. The time to complete each task was recorded, and the trajectory log files were used to calculate performance. The resulting speed and accuracy data from novices and experts are compared, and the authors discuss the objective performance of training in terms of targeting speed and accuracy for each system. PMID:29184667

  5. Technical Report: Benchmarking for Quasispecies Abundance Inference with Confidence Intervals from Metagenomic Sequence Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McLoughlin, K.

    2016-01-22

    The software application “MetaQuant” was developed by our group at Lawrence Livermore National Laboratory (LLNL). It is designed to profile microbial populations in a sample using data from whole-genome shotgun (WGS) metagenomic DNA sequencing. Several other metagenomic profiling applications have been described in the literature. We ran a series of benchmark tests to compare the performance of MetaQuant against that of a few existing profiling tools, using real and simulated sequence datasets. This report describes our benchmarking procedure and results.

  6. Vehicle Modeling for use in the CAFE model: Process description and modeling assumptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moawad, Ayman; Kim, Namdoo; Rousseau, Aymeric

    2016-06-01

    The objective of this project is to develop and demonstrate a process that, at a minimum, provides more robust information that can be used to calibrate inputs applicable under the CAFE model’s existing structure. The project will be more fully successful if a process can be developed that minimizes the need for decision trees and replaces the synergy factors by inputs provided directly from a vehicle simulation tool. The report provides a description of the process that was developed by Argonne National Laboratory and implemented in Autonomie.

  7. A Proof of Concept: Grizzly, the LWRS Program Materials Aging and Degradation Pathway Main Simulation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ben Spencer; Jeremy Busby; Richard Martineau

    2012-10-01

    Nuclear power currently provides a significant fraction of the United States' non-carbon-emitting power generation. In future years, nuclear power must continue to generate a significant portion of the nation's electricity to meet growing electricity demand, achieve clean energy goals, and ensure energy independence. New reactors will be an essential part of the expansion of nuclear power. However, given the limits on new builds imposed by economics and industrial capacity, extended service of the existing fleet will also be required.

  8. SS-mPMG and SS-GA: tools for finding pathways and dynamic simulation of metabolic networks.

    PubMed

    Katsuragi, Tetsuo; Ono, Naoaki; Yasumoto, Keiichi; Altaf-Ul-Amin, Md; Hirai, Masami Y; Sriyudthsak, Kansuporn; Sawada, Yuji; Yamashita, Yui; Chiba, Yukako; Onouchi, Hitoshi; Fujiwara, Toru; Naito, Satoshi; Shiraishi, Fumihide; Kanaya, Shigehiko

    2013-05-01

    Metabolomics analysis tools can provide quantitative information on the concentration of metabolites in an organism. In this paper, we propose the minimum pathway model generator tool for simulating the dynamics of metabolite concentrations (SS-mPMG) and a tool for parameter estimation by genetic algorithm (SS-GA). SS-mPMG can extract a subsystem of the metabolic network from the genome-scale pathway maps to reduce the complexity of the simulation model and automatically construct a dynamic simulator to evaluate the experimentally observed behavior of metabolites. Using this tool, we show that stochastic simulation can reproduce experimentally observed dynamics of amino acid biosynthesis in Arabidopsis thaliana. In this simulation, SS-mPMG extracts the metabolic network subsystem from published databases. The parameters needed for the simulation are determined using a genetic algorithm to fit the simulation results to the experimental data. We expect that SS-mPMG and SS-GA will help researchers to create relevant metabolic networks and carry out simulations of metabolic reactions derived from metabolomics data.
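
    The pipeline described above (extract a pathway subsystem, simulate its dynamics, fit the parameters to observed metabolite data) can be illustrated with a hedged sketch. The toy two-step pathway, its rate constants, and the use of SciPy's differential evolution in place of SS-GA's genetic algorithm are all assumptions for illustration, not the authors' code.

    ```python
    # Illustrative sketch (not SS-mPMG/SS-GA code): fit the rate constants of a
    # toy two-step pathway A -> B -> C to observed concentrations, using an
    # evolutionary search in place of the paper's genetic algorithm.
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import differential_evolution

    def pathway(t, y, k1, k2):
        a, b = y
        return [-k1 * a, k1 * a - k2 * b]   # dA/dt, dB/dt (C is implicit)

    t_obs = np.linspace(0.0, 10.0, 20)
    rng = np.random.default_rng(0)
    true = solve_ivp(pathway, (0, 10), [1.0, 0.0], args=(0.8, 0.3), t_eval=t_obs)
    obs = true.y + rng.normal(scale=0.01, size=true.y.shape)   # mock "measured" data

    def cost(params):
        k1, k2 = params
        sim = solve_ivp(pathway, (0, 10), [1.0, 0.0], args=(k1, k2), t_eval=t_obs)
        return np.sum((sim.y - obs) ** 2)   # least-squares misfit to the data

    result = differential_evolution(cost, bounds=[(0.01, 5.0), (0.01, 5.0)], seed=1)
    print("estimated rate constants:", result.x)
    ```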

  9. Multiscale Universal Interface: A concurrent framework for coupling heterogeneous solvers

    NASA Astrophysics Data System (ADS)

    Tang, Yu-Hang; Kudo, Shuhei; Bian, Xin; Li, Zhen; Karniadakis, George Em

    2015-09-01

    Concurrently coupled numerical simulations using heterogeneous solvers are powerful tools for modeling multiscale phenomena. However, major modifications to existing codes are often required to enable such simulations, posing significant difficulties in practice. In this paper we present a C++ library, the Multiscale Universal Interface (MUI), which is capable of facilitating the coupling effort for a wide range of multiscale simulations. The library adopts a header-only form with minimal external dependency and hence can be easily dropped into existing codes. A data sampler concept is introduced, combined with a hybrid dynamic/static typing mechanism, to create an easily customizable framework for solver-independent data interpretation. The library integrates MPI MPMD support and an asynchronous communication protocol to handle inter-solver information exchange irrespective of the solvers' own MPI awareness. Template metaprogramming is heavily employed to simultaneously improve runtime performance and code flexibility. We validated the library by solving three different multiscale problems, which also serve to demonstrate the flexibility of the framework in handling heterogeneous models and solvers. In the first example, a Couette flow was simulated using two concurrently coupled Smoothed Particle Hydrodynamics (SPH) simulations of different spatial resolutions. In the second example, we coupled the deterministic SPH method with the stochastic Dissipative Particle Dynamics (DPD) method to study the effect of surface grafting on the hydrodynamic properties of the surface. In the third example, we consider conjugate heat transfer between a solid domain and a fluid domain by coupling the particle-based energy-conserving DPD (eDPD) method with the Finite Element Method (FEM).
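
    MUI itself is a header-only C++ library; the following toy Python analogue (all names hypothetical) is meant only to illustrate the push/fetch-with-sampler idea described above, in which the sending solver deposits raw point data and the receiving solver decides how to interpret them.

    ```python
    # Toy Python analogue of a sampler-based coupling interface (an illustration
    # of the concept, not the MUI API): one solver pushes scattered point data,
    # the other fetches an interpolated value through a sampler of its choice.
    import math

    class Interface:
        def __init__(self):
            self.points = []                    # (position, value) pairs

        def push(self, x, value):
            self.points.append((x, value))      # sending solver deposits raw data

        def fetch(self, x, sampler):
            return sampler(x, self.points)      # receiving solver interprets it

    def gauss_sampler(x, points, h=0.5):
        """Shepard-style Gaussian-weighted average over nearby pushed points."""
        w = [math.exp(-((x - xi) / h) ** 2) for xi, _ in points]
        return sum(wi * v for wi, (_, v) in zip(w, points)) / sum(w)

    iface = Interface()
    for xi in [0.0, 0.5, 1.0, 1.5]:
        iface.push(xi, xi ** 2)                 # e.g. a coarse solver's field values
    print(iface.fetch(0.75, gauss_sampler))     # consumed at a fine solver's node
    ```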

  10. Multiscale Universal Interface: A concurrent framework for coupling heterogeneous solvers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Yu-Hang, E-mail: yuhang_tang@brown.edu; Kudo, Shuhei, E-mail: shuhei-kudo@outlook.jp; Bian, Xin, E-mail: xin_bian@brown.edu

    2015-09-15

    Concurrently coupled numerical simulations using heterogeneous solvers are powerful tools for modeling multiscale phenomena. However, major modifications to existing codes are often required to enable such simulations, posing significant difficulties in practice. In this paper we present a C++ library, the Multiscale Universal Interface (MUI), which is capable of facilitating the coupling effort for a wide range of multiscale simulations. The library adopts a header-only form with minimal external dependency and hence can be easily dropped into existing codes. A data sampler concept is introduced, combined with a hybrid dynamic/static typing mechanism, to create an easily customizable framework for solver-independent data interpretation. The library integrates MPI MPMD support and an asynchronous communication protocol to handle inter-solver information exchange irrespective of the solvers' own MPI awareness. Template metaprogramming is heavily employed to simultaneously improve runtime performance and code flexibility. We validated the library by solving three different multiscale problems, which also serve to demonstrate the flexibility of the framework in handling heterogeneous models and solvers. In the first example, a Couette flow was simulated using two concurrently coupled Smoothed Particle Hydrodynamics (SPH) simulations of different spatial resolutions. In the second example, we coupled the deterministic SPH method with the stochastic Dissipative Particle Dynamics (DPD) method to study the effect of surface grafting on the hydrodynamic properties of the surface. In the third example, we consider conjugate heat transfer between a solid domain and a fluid domain by coupling the particle-based energy-conserving DPD (eDPD) method with the Finite Element Method (FEM).

  11. Codifference as a practical tool to measure interdependence

    NASA Astrophysics Data System (ADS)

    Wyłomańska, Agnieszka; Chechkin, Aleksei; Gajda, Janusz; Sokolov, Igor M.

    2015-03-01

    Correlation and spectral analysis represent the standard tools to study interdependence in statistical data. However, for stochastic processes with heavy-tailed distributions such that the variance diverges, these tools are inadequate. Heavy-tailed processes are ubiquitous in nature and finance. We here discuss the codifference as a convenient measure of statistical interdependence, and we aim to give a short introductory review of its properties. Taking different known stochastic processes as generic examples, we present explicit formulas for their codifferences. We show that for Gaussian processes the codifference is equivalent to the covariance. For processes with finite variance these two measures behave similarly with time. For processes with infinite variance the covariance does not exist, but the codifference remains relevant. We demonstrate the practical importance of the codifference by extracting this function from simulated data as well as real data taken from the turbulent plasma of a fusion device and from a financial market. We conclude that the codifference serves as a convenient practical tool to study interdependence for stochastic processes with both finite and infinite variance.
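
    For reference, a commonly used definition of the codifference, consistent with the properties quoted above (a sketch for zero-mean processes; the paper's exact conventions may differ):

    ```latex
    % Codifference built from characteristic functions, hence finite even when
    % the variance diverges:
    \[
      \mathrm{CD}(X_t, X_s)
        = \ln \mathbb{E}\,e^{\,i(X_t - X_s)}
        - \ln \mathbb{E}\,e^{\,iX_t}
        - \ln \mathbb{E}\,e^{-iX_s}.
    \]
    % For a zero-mean Gaussian process, \ln\mathbb{E}\,e^{\,iY} = -\tfrac{1}{2}\mathrm{Var}(Y),
    % so the definition reduces to the covariance:
    \[
      \mathrm{CD}(X_t, X_s) = \operatorname{Cov}(X_t, X_s).
    \]
    ```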

  12. Alkemio: association of chemicals with biomedical topics by text and data mining

    PubMed Central

    Gijón-Correas, José A.; Andrade-Navarro, Miguel A.; Fontaine, Jean F.

    2014-01-01

    The PubMed® database of biomedical citations allows the retrieval of scientific articles studying the function of chemicals in biology and medicine. Mining millions of available citations to search reported associations between chemicals and topics of interest would require substantial human time. We have implemented the Alkemio text mining web tool and SOAP web service to help in this task. The tool uses biomedical articles discussing chemicals (including drugs), predicts their relatedness to the query topic with a naïve Bayesian classifier and ranks all chemicals by P-values computed from random simulations. Benchmarks on seven human pathways showed good retrieval performance (areas under the receiver operating characteristic curves ranged from 73.6 to 94.5%). Comparison with existing tools to retrieve chemicals associated with eight diseases showed the higher precision and recall of Alkemio when considering the top 10 candidate chemicals. Alkemio is a high-performing web tool that ranks chemicals for any biomedical topic and is free for non-commercial users. Availability: http://cbdm.mdc-berlin.de/∼medlineranker/cms/alkemio. PMID:24838570
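
    The ranking step described above, classifier scores turned into P-values by random simulations, can be sketched as follows; the helper name, corpus scores, and sampling scheme are hypothetical illustrations, not Alkemio's implementation.

    ```python
    # Hedged sketch of the general idea: score a chemical by a classifier, then
    # derive an empirical P-value by comparing the score against scores of
    # randomly drawn article sets of the same size.
    import random

    def empirical_p_value(chemical_score, article_scores, n_articles,
                          n_sim=10000, seed=0):
        """P-value = fraction of random same-size article samples whose mean
        classifier score reaches the chemical's observed score."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n_sim):
            sample = rng.sample(article_scores, n_articles)
            if sum(sample) / n_articles >= chemical_score:
                hits += 1
        return (hits + 1) / (n_sim + 1)   # small-sample correction

    # Hypothetical data: per-article classifier scores for the whole corpus, and
    # a chemical discussed in 12 articles with mean score 0.81.
    corpus_rng = random.Random(42)
    corpus_scores = [corpus_rng.random() for _ in range(5000)]
    print(empirical_p_value(0.81, corpus_scores, n_articles=12))
    ```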

  13. A better sequence-read simulator program for metagenomics.

    PubMed

    Johnson, Stephen; Trost, Brett; Long, Jeffrey R; Pittet, Vanessa; Kusalik, Anthony

    2014-01-01

    There are many programs available for generating simulated whole-genome shotgun sequence reads. The data generated by many of these programs follow predefined models, which limits their use to the authors' original intentions. For example, many models assume that read lengths follow a uniform or normal distribution. Other programs generate models from actual sequencing data, but are limited to reads from single-genome studies. To our knowledge, there are no programs that allow a user to generate simulated data following non-parametric read-length distributions and quality profiles based on empirically-derived information from metagenomics sequencing data. We present BEAR (Better Emulation for Artificial Reads), a program that uses a machine-learning approach to generate reads with lengths and quality values that closely match empirically-derived distributions. BEAR can emulate reads from various sequencing platforms, including Illumina, 454, and Ion Torrent. BEAR requires minimal user input, as it automatically determines appropriate parameter settings from user-supplied data. BEAR also uses a unique method for deriving run-specific error rates, and extracts useful statistics from the metagenomic data itself, such as quality-error models. Many existing simulators are specific to a particular sequencing technology; however, BEAR is not restricted in this way. Because of its flexibility, BEAR is particularly useful for emulating the behaviour of technologies like Ion Torrent, for which no dedicated sequencing simulators are currently available. BEAR is also the first metagenomic sequencing simulator program that automates the process of generating abundances, which can be an arduous task. BEAR is useful for evaluating data processing tools in genomics. It has many advantages over existing comparable software, such as generating more realistic reads and being independent of sequencing technology, and has features particularly useful for metagenomics work.
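
    The non-parametric idea at BEAR's core, drawing read lengths from an empirically derived distribution rather than assuming a uniform or normal model, can be sketched as follows (hypothetical helper and training data, not BEAR's code).

    ```python
    # Minimal sketch: build a sampler from observed read lengths and draw
    # simulated lengths that follow the same empirical distribution.
    import collections
    import random

    def empirical_length_sampler(observed_lengths, seed=0):
        counts = collections.Counter(observed_lengths)
        lengths = list(counts)
        weights = [counts[l] for l in lengths]
        rng = random.Random(seed)
        return lambda: rng.choices(lengths, weights=weights, k=1)[0]

    # Hypothetical training data, e.g. lengths parsed from a real FASTQ run.
    training = [98, 100, 100, 101, 150, 150, 150, 151, 250]
    draw = empirical_length_sampler(training)
    print([draw() for _ in range(5)])   # simulated read lengths
    ```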

  14. The validity of a professional competence tool for physiotherapy students in simulation-based clinical education: a Rasch analysis.

    PubMed

    Judd, Belinda K; Scanlan, Justin N; Alison, Jennifer A; Waters, Donna; Gordon, Christopher J

    2016-08-05

    Despite the recent widespread adoption of simulation in clinical education in physiotherapy, there is a lack of validated tools for assessment in this setting. The Assessment of Physiotherapy Practice (APP) is a comprehensive tool used in clinical placement settings in Australia to measure the professional competence of physiotherapy students. The aim of the study was to evaluate the validity of the APP for student assessment in simulation settings. A total of 1260 APPs were collected, 971 from students in simulation and 289 from students in clinical placements. Rasch analysis was used to examine the construct validity of the APP tool in three different simulation assessment formats: longitudinal assessment over 1 week of simulation; longitudinal assessment over 2 weeks; and a short-form (25 min) assessment of a single simulation scenario. Comparisons with APPs from 5-week clinical placements in hospital and clinic-based settings were also conducted. The APP demonstrated acceptable fit to the expectations of the Rasch model for the 1- and 2-week clinical simulations, exhibiting unidimensional properties that were able to distinguish different levels of student performance. For the short-form simulation, nine of the 20 items recorded greater than 25% of scores as 'not assessed' by clinical educators, which impacted the suitability of the APP tool in this simulation format. The APP was a valid assessment tool when used in longitudinal simulation formats. A revised APP may be required for assessment in short-form simulation scenarios.
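
    For context, a minimal statement of the dichotomous Rasch model that "fit to the expectations of the Rasch model" refers to; treating the APP's ordered rating items with a polytomous extension of this form is an assumption of this sketch.

    ```latex
    % Basic Rasch model: the probability that student n succeeds on item i
    % depends only on the gap between person ability \theta_n and item
    % difficulty \delta_i.
    \[
      P(X_{ni} = 1 \mid \theta_n, \delta_i)
        = \frac{e^{\theta_n - \delta_i}}{1 + e^{\theta_n - \delta_i}}
    \]
    ```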

  15. Fast Dynamic Simulation-Based Small Signal Stability Assessment and Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acharya, Naresh; Baone, Chaitanya; Veda, Santosh

    2014-12-31

    Power grid planning and operation decisions are made based on simulation of the dynamic behavior of the system. Enabling substantial energy savings while increasing the reliability of the aging North American power grid through improved utilization of existing transmission assets hinges on the adoption of wide-area measurement systems (WAMS) for power system stabilization. However, adoption of WAMS alone will not suffice if the power system is to reach its full entitlement in stability and reliability. It is necessary to enhance predictability with "faster than real-time" dynamic simulations that will enable assessment of dynamic stability margins, proactive real-time control, and improved grid resiliency to fast time-scale phenomena such as cascading network failures. Present-day dynamic simulations are performed only during offline planning studies, considering only worst-case conditions such as summer and winter peak days. With the widespread deployment of renewable generation, controllable loads, energy storage devices and plug-in hybrid electric vehicles expected in the near future, and greater integration of cyber infrastructure (communications, computation and control), monitoring and controlling the dynamic performance of the grid in real time will become increasingly important. State-of-the-art dynamic simulation tools have limited computational speed and are not suitable for real-time applications, given the large set of contingency conditions to be evaluated. These tools are optimized for best performance on single-processor computers, but the simulation is still several times slower than real time due to its computational complexity. With recent significant advances in numerical methods and computational hardware, expectations have been rising for more efficient and faster techniques to be implemented in power system simulators. This is a natural expectation, given that the core solution algorithms of most commercial simulators were developed decades ago, when high-performance computing (HPC) resources were not commonly available.

  16. LoRTE: Detecting transposon-induced genomic variants using low coverage PacBio long read sequences.

    PubMed

    Disdero, Eric; Filée, Jonathan

    2017-01-01

    Population genomic analysis of transposable elements has greatly benefited from recent advances in sequencing technologies. However, the short size of the reads and the propensity of transposable elements to nest in highly repeated regions of genomes limit the efficiency of bioinformatic tools when Illumina or 454 technologies are used. Fortunately, long-read sequencing technologies generating read lengths that may span the entire length of full transposons are now available. However, existing TE population genomics software was not designed to handle long reads, and the development of new dedicated tools is needed. LoRTE is the first tool able to use PacBio long-read sequences to identify transposon deletions and insertions between a reference genome and genomes of different strains or populations. Tested against simulated and genuine Drosophila melanogaster PacBio datasets, LoRTE appears to be a reliable and broadly applicable tool to study the dynamics and evolutionary impact of transposable elements using low-coverage, long-read sequences. LoRTE is an efficient and accurate tool to identify structural genomic variants caused by TE insertion or deletion. LoRTE is available for download at http://www.egce.cnrs-gif.fr/?p=6422.

  17. Population variability in animal health: Influence on dose-exposure-response relationships: Part II: Modelling and simulation.

    PubMed

    Martinez, Marilyn N; Gehring, Ronette; Mochel, Jonathan P; Pade, Devendra; Pelligand, Ludovic

    2018-05-28

    During the 2017 Biennial meeting, the American Academy of Veterinary Pharmacology and Therapeutics hosted a 1-day session on the influence of population variability on dose-exposure-response relationships. In Part I, we highlighted some of the sources of population variability. Part II provides a summary of discussions on modelling and simulation tools that utilize existing pharmacokinetic data; that can integrate drug physicochemical characteristics with species physiological characteristics and dosing information; or that combine observed, predicted, and in vitro information to explore and describe sources of variability that may influence the safe and effective use of veterinary pharmaceuticals. © 2018 John Wiley & Sons Ltd. This article has been contributed to by US Government employees and their work is in the public domain in the USA.

  18. FAST: A multi-processed environment for visualization of computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Merritt, Fergus J.; Plessel, Todd C.; Kelaita, Paul G.; Mccabe, R. Kevin

    1991-01-01

    Three-dimensional, unsteady, multi-zoned fluid dynamics simulations over full-scale aircraft are typical of the problems being investigated at NASA Ames' Numerical Aerodynamic Simulation (NAS) facility on CRAY-2 and CRAY Y-MP supercomputers. With multiple-processor workstations available in the 10-30 Mflop range, we feel that these new developments in scientific computing warrant a new approach to the design and implementation of analysis tools. These larger, more complex problems create a need for new visualization techniques not possible with the existing software or systems available as of this writing. The visualization techniques will change as the supercomputing environment, and hence the scientific methods employed, evolves even further. The Flow Analysis Software Toolkit (FAST), an implementation of a software system for fluid mechanics analysis, is discussed.

  19. The Harvest suite for rapid core-genome alignment and visualization of thousands of intraspecific microbial genomes.

    PubMed

    Treangen, Todd J; Ondov, Brian D; Koren, Sergey; Phillippy, Adam M

    2014-01-01

    Whole-genome sequences are now available for many microbial species and clades, however existing whole-genome alignment methods are limited in their ability to perform sequence comparisons of multiple sequences simultaneously. Here we present the Harvest suite of core-genome alignment and visualization tools for the rapid and simultaneous analysis of thousands of intraspecific microbial strains. Harvest includes Parsnp, a fast core-genome multi-aligner, and Gingr, a dynamic visual platform. Together they provide interactive core-genome alignments, variant calls, recombination detection, and phylogenetic trees. Using simulated and real data we demonstrate that our approach exhibits unrivaled speed while maintaining the accuracy of existing methods. The Harvest suite is open-source and freely available from: http://github.com/marbl/harvest.

  20. 10 CFR 434.606 - Simulation tool.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria...

  1. Landsat-7 Simulation and Testing Environments

    NASA Technical Reports Server (NTRS)

    Holmes, E.; Ha, K.; Hawkins, K.; Lombardo, J.; Ram, M.; Sabelhaus, P.; Scott, S.; Phillips, R.

    1999-01-01

    A spacecraft Attitude Control and Determination Subsystem (ACDS) is heavily dependent upon simulation throughout its entire development, implementation and ground test cycle. Engineering simulation tools are typically developed to design and analyze control systems and to validate the design, while software simulation tools are required to qualify the flight software. However, the need for simulation does not end there. Operating the ACDS of a spacecraft on the ground requires the simulation of spacecraft dynamics, disturbance modeling and celestial body motion. Sensor data must also be simulated and substituted for actual sensor data on the ground so that the spacecraft will respond by sending commands to the actuators as it will on orbit. Finally, the simulators are the primary training tools and test-beds for the Flight Operations Team. In this paper, the various ACDS simulations developed for or used by the Landsat 7 project are described. The paper includes a description of each tool, its unique attributes, and its role in the overall development and testing of the ACDS. Finally, a section is included which discusses how the coordinated use of these simulation tools can maximize the probability of uncovering software, hardware and operations errors during the ground test process.

  2. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    DTIC Science & Technology

    2005-02-01

    method, Model Order Reduction (MOR) tools, system-level, mixed-signal circuit synthesis and optimization tools, and parasitic extraction tools. A unique... Mission Area: Command and Control. Keywords: mixed-signal circuit simulation, parasitic extraction, time-domain simulation, IC design flow, model order reduction.

  3. [Existing laparoscopic simulators and their benefit for the surgeon].

    PubMed

    Kalvach, J; Ryska, O; Ryska, M

    2016-01-01

    Nowadays, laparoscopic operations are a common part of surgical practice. However, they have their own characteristics and require a specific method of preparation. Recently, simulation techniques have been increasingly used for the training of skills. The aim of this review is to provide a summary of available literature on the topic of laparoscopic simulators, to assess their contribution to the training of surgeons, and to identify the most effective type of simulation. The PubMed database, Web of Science and the Cochrane Library were used to search for relevant publications. The keywords "laparoscopy, simulator, surgery, assessment" were used in the search. The search was limited to prospective studies published in the last 5 years in the English language. From a total of 354 studies found, we included in the survey 26 that matched our criteria. Nine studies compared individual simulators to one another. Five studies evaluated "high fidelity" and "low fidelity" simulators (virtual vs. box trainers) as equally effective (EBM 2a). In three cases the "low fidelity" box simulator was found to be more efficient (EBM 2a-3b). Only one study preferred the virtual simulator (VR) (EBM 2b). Thirteen studies evaluated the benefits of simulators for practice. Twelve found training on a simulator to be an effective method of preparation (EBM 1b-3b). In contrast, one study did not find any difference between the training simulator and traditional preparation (EBM 3b). Nine studies directly evaluated one of the methods of assessing laparoscopic skills. Three studies evaluated the VR simulator as a useful assessment tool. Other studies found the GOALS-GH scoring system successful. The hand motion analysis model was successful in one case. Most studies were observational (EBM 3b) and only 2 studies were of higher quality (EBM 2b). Simulators are an effective tool for practicing laparoscopic techniques (EBM 1b). It cannot be determined based on available data which of the simulators is most effective. The virtual simulator, however, still remains the most self-sufficient unit suitable for teaching as well as evaluation of laparoscopic techniques (EBM 2b-3b). Further studies are needed to find an effective system and parameters for an objective evaluation of skills. Keywords: laparoscopy, simulator, surgery, assessment.

  4. Three-dimensional tool radius compensation for multi-axis peripheral milling

    NASA Astrophysics Data System (ADS)

    Chen, Youdong; Wang, Tianmiao

    2013-05-01

    Few functions for 3D tool radius compensation are available for generating executable motion control commands in existing computer numerical control (CNC) systems. Whenever the tool radius changes, notably as the tool wears during machining, a new NC program has to be created. A generic 3D tool radius compensation method for multi-axis peripheral milling in CNC systems is presented. The offset path is calculated by offsetting the tool path along the direction of the offset vector by a given distance. The offset vector is perpendicular to both the tangent vector of the tool path and the orientation vector of the tool axis relative to the workpiece. The orientation vector equations of the tool axis relative to the workpiece are obtained through the homogeneous coordinate transformation matrix and the forward kinematics of a generalized kinematics model of multi-axis machine tools. To avoid cutting into the corner formed by two adjacent tool paths, the coordinates of the offset path at the intersection point are calculated according to the transition type, which is determined by the angle between the two tool-path tangent vectors at the corner. Verification with the solid cutting simulation software VERICUT® using different tool radii on a table-tilting five-axis machine tool, and a real machining experiment producing a soup spoon on a five-axis machine tool with the developed CNC system, confirm the effectiveness of the proposed 3D tool radius compensation method. In its general form, the proposed compensation method is suitable for all kinds of three- to five-axis machine tools.
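
    A hedged numerical sketch of the offset construction described above (hypothetical helper, not the paper's implementation): the offset direction is the unit vector perpendicular to both the path tangent and the tool-axis orientation.

    ```python
    # Offset one tool-path point by the tool radius, following the geometric
    # rule stated in the abstract.
    import numpy as np

    def offset_point(p, t, a, radius):
        """p: point on the programmed path; t: path tangent there;
        a: tool-axis orientation relative to the workpiece."""
        o = np.cross(a, t)
        o = o / np.linalg.norm(o)          # unit offset vector, o ⟂ t and o ⟂ a
        return p + radius * o

    p = np.array([10.0, 5.0, 0.0])
    t = np.array([1.0, 0.0, 0.0])          # path heading +X
    a = np.array([0.0, 0.0, 1.0])          # vertical tool axis
    print(offset_point(p, t, a, radius=3.0))   # offset lands in the XY plane
    ```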

  5. Spatial-temporal consistency between gross primary productivity and solar-induced chlorophyll fluorescence of vegetation in China during 2007-2014

    NASA Astrophysics Data System (ADS)

    Ma, J.; Xiao, X.; Zhang, Y.; Chen, B.; Zhao, B.

    2017-12-01

    Accurately estimating the spatial-temporal patterns of gross primary production (GPP) is of great significance because of GPP's important role in the global carbon cycle. Satellite-based light use efficiency (LUE) models are regarded as an efficient tool for simulating spatially explicit GPP time series. However, estimating the accuracy of GPP simulations from LUE models at both spatial and temporal scales is still challenging. In this study, we simulated the GPP of vegetation in China during 2007-2014 using a LUE model (the Vegetation Photosynthesis Model, VPM) based on MODIS (Moderate Resolution Imaging Spectroradiometer) images of 8-day temporal and 500-m spatial resolution and NCEP (National Center for Environmental Prediction) climate data. Global Ozone Monitoring Experiment-2 (GOME-2) solar-induced chlorophyll fluorescence (SIF) data were compared with VPM-simulated GPP (GPPVPM) temporally and spatially using linear correlation analysis. Significant positive linear correlations exist between monthly GPPVPM and SIF data over both a single year (2010) and multiple years (2007-2014) in China. Annual GPPVPM is significantly positively correlated with SIF (R2>0.43) spatially for all years during 2007-2014 and for all seasons in 2010 (R2>0.37). GPP trends are highly heterogeneous in space and time across China during 2007-2014. The results of this study indicate that GPPVPM is temporally and spatially in line with SIF data, and that space-borne SIF data have great potential for validating and parameterizing GPP estimates from LUE-based models.

  6. A note on the kappa statistic for clustered dichotomous data.

    PubMed

    Zhou, Ming; Yang, Zhao

    2014-06-30

    The kappa statistic is widely used to assess the agreement between two raters. Motivated by a simulation-based cluster bootstrap method for calculating the variance of the kappa statistic for clustered physician-patient dichotomous data, we investigate its special correlation structure and develop a new simple and efficient data generation algorithm. For clustered physician-patient dichotomous data, based on the delta method and its special covariance structure, we propose a semi-parametric variance estimator for the kappa statistic. An extensive Monte Carlo simulation study is performed to evaluate the performance of the new proposal and five existing methods with respect to the empirical coverage probability, root-mean-square error, and average width of the 95% confidence interval for the kappa statistic. The variance estimator ignoring the dependence within a cluster is generally inappropriate, and the variance estimators from the new proposal, bootstrap-based methods, and the sampling-based delta method perform reasonably well for at least a moderately large number of clusters (e.g., K ⩾ 50). The new proposal and the sampling-based delta method provide convenient tools for efficient computation and non-simulation-based alternatives to the existing bootstrap-based methods. Moreover, the new proposal has acceptable performance even when the number of clusters is as small as K = 25. To illustrate the practical application of all the methods, one psychiatric research dataset and two simulated clustered physician-patient dichotomous datasets are analyzed. Copyright © 2014 John Wiley & Sons, Ltd.
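
    For reference, the kappa statistic whose variance is at issue, in its standard definition (the paper's contribution concerns the variance of this quantity under clustering, which is not reproduced here):

    ```latex
    % Observed agreement p_o corrected for the agreement p_e expected by chance
    % (p_e computed from the raters' marginal proportions):
    \[
      \hat\kappa = \frac{p_o - p_e}{1 - p_e}
    \]
    ```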

  7. Simulation of 20-channel, 50-GHz, Si3N4-based arrayed waveguide grating applying three different photonics tools

    NASA Astrophysics Data System (ADS)

    Gajdošová, Lenka; Seyringer, Dana

    2017-02-01

    We present the design and simulation of 20-channel, 50-GHz Si3N4 based AWG using three different commercial photonics tools, namely PHASAR from Optiwave Systems Inc., APSS from Apollo Photonics Inc. and RSoft from Synopsys Inc. For this purpose we created identical waveguide structures and identical AWG layouts in these tools and performed BPM simulations. For the simulations the same calculation conditions were used. These AWGs were designed for TM-polarized light with an AWG central wavelength of 850 nm. The output of all simulations, the transmission characteristics, were used to calculate the transmission parameters defining the optical properties of the simulated AWGs. These parameters were summarized and compared with each other. The results feature very good correlation between the tools and are comparable to the designed parameters in AWG-Parameters tool.

  8. ViSAPy: a Python tool for biophysics-based generation of virtual spiking activity for evaluation of spike-sorting algorithms.

    PubMed

    Hagen, Espen; Ness, Torbjørn V; Khosrowshahi, Amir; Sørensen, Christina; Fyhn, Marianne; Hafting, Torkel; Franke, Felix; Einevoll, Gaute T

    2015-04-30

    New, silicon-based multielectrodes comprising hundreds or more electrode contacts offer the possibility to record spike trains from thousands of neurons simultaneously. This potential cannot be realized unless accurate, reliable automated methods for spike sorting are developed, in turn requiring benchmarking data sets with known ground-truth spike times. We here present a general simulation tool for computing benchmarking data for evaluation of spike-sorting algorithms entitled ViSAPy (Virtual Spiking Activity in Python). The tool is based on a well-established biophysical forward-modeling scheme and is implemented as a Python package built on top of the neuronal simulator NEURON and the Python tool LFPy. ViSAPy allows for arbitrary combinations of multicompartmental neuron models and geometries of recording multielectrodes. Three example benchmarking data sets are generated, i.e., tetrode and polytrode data mimicking in vivo cortical recordings and microelectrode array (MEA) recordings of in vitro activity in salamander retinas. The synthesized example benchmarking data mimics salient features of typical experimental recordings, for example, spike waveforms depending on interspike interval. ViSAPy goes beyond existing methods as it includes biologically realistic model noise, synaptic activation by recurrent spiking networks, finite-sized electrode contacts, and allows for inhomogeneous electrical conductivities. ViSAPy is optimized to allow for generation of long time series of benchmarking data, spanning minutes of biological time, by parallel execution on multi-core computers. ViSAPy is an open-ended tool as it can be generalized to produce benchmarking data or arbitrary recording-electrode geometries and with various levels of complexity. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  9. Solvation Structure and Thermodynamic Mapping (SSTMap): An Open-Source, Flexible Package for the Analysis of Water in Molecular Dynamics Trajectories.

    PubMed

    Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom

    2018-01-09

    We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.

  10. Development of a sharp interface model that simulates coastal aquifer flow with the coupled use of GIS

    NASA Astrophysics Data System (ADS)

    Gemitzi, Alexandra; Tolikas, Demetrios

    A simulation program, which works seamlessly with GIS and simulates flows in coastal aquifers, is presented in the present paper. The model is based on the Galerkin finite element discretization scheme and it simulates both steady and transient freshwater and saltwater flow, assuming that the two fluids are separated by a sharp interface. The model has been verified in simple cases where analytical solutions exist. The simulation program works as a tool of the GIS program, which is the main database that stores and manages all the necessary data. The combined use of the simulation and the GIS program forms an integrated management tool offering a simpler way of simulating and studying saline intrusion in coastal aquifers. Application of the model to the Yermasogia aquifer illustrates the coupled use of modeling and GIS techniques for the examination of regional coastal aquifer systems.
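
    For context, the classical Ghyben-Herzberg relation that motivates sharp-interface models of this kind (an illustrative simplification assumed here; the paper's finite element formulation is more general):

    ```latex
    % With freshwater density \rho_f and seawater density \rho_s, the interface
    % depth z below sea level is proportional to the freshwater head h_f:
    \[
      z = \frac{\rho_f}{\rho_s - \rho_f}\, h_f \approx 40\, h_f
      \qquad (\rho_f = 1000,\ \rho_s = 1025\ \mathrm{kg/m^3}).
    \]
    ```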

  11. MI-Sim: A MATLAB package for the numerical analysis of microbial ecological interactions.

    PubMed

    Wade, Matthew J; Oakley, Jordan; Harbisher, Sophie; Parker, Nicholas G; Dolfing, Jan

    2017-01-01

    Food webs, and other classes of ecological network motifs, are a means of describing feeding relationships between consumers and producers in an ecosystem. They have application across scales, differing only in the underlying characteristics of the organisms and substrates describing the system. Mathematical modelling, using mechanistic approaches to describe the dynamic behaviour and properties of the system through sets of ordinary differential equations, has been used extensively in ecology. Models allow simulation of the dynamics of the various motifs, and their numerical analysis provides a greater understanding of the interplay between the system components and their intrinsic properties. We have developed the MI-Sim software for use with MATLAB to allow a rigorous and rapid numerical analysis of several common ecological motifs. MI-Sim contains a series of the most commonly used motifs such as cooperation, competition and predation. It does not require detailed knowledge of mathematical analytical techniques and is offered as a single graphical user interface containing all input and output options. The tools available in the current version of MI-Sim include model simulation, steady-state existence and stability analysis, and basin of attraction analysis. The software includes seven ecological interaction motifs and seven growth function models. Unlike other system analysis tools, MI-Sim is designed as a simple and user-friendly tool specific to ecological population-type models, allowing for rapid assessment of their dynamical and behavioural properties.
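
    One of the motifs named above, two-species competition, written out as the kind of ODE system such tools analyse. MI-Sim itself is MATLAB; this Python sketch and its parameter values are purely illustrative.

    ```python
    # Lotka-Volterra competition motif integrated to (near) steady state.
    import numpy as np
    from scipy.integrate import solve_ivp

    def competition(t, n, r1=1.0, r2=0.8, a12=0.6, a21=0.7, k1=10.0, k2=8.0):
        n1, n2 = n
        return [r1 * n1 * (1 - (n1 + a12 * n2) / k1),   # species 1 growth
                r2 * n2 * (1 - (n2 + a21 * n1) / k2)]   # species 2 growth

    sol = solve_ivp(competition, (0.0, 200.0), [0.1, 0.1])
    print("steady state:", sol.y[:, -1])   # stable coexistence for these values
    ```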

  12. Implementation of interconnect simulation tools in spice

    NASA Technical Reports Server (NTRS)

    Satsangi, H.; Schutt-Aine, J. E.

    1993-01-01

    Accurate computer simulation of high-speed digital computer circuits and communication circuits requires a multimode approach to simulate both the devices and the interconnects between devices. Classical (lumped-parameter) circuit analysis algorithms are needed for circuit devices and the network formed by the interconnected devices. The interconnects, however, have to be modeled as transmission lines, which incorporate electromagnetic field analysis. One approach to writing a multimode simulator is to take an existing software package that performs either lumped-parameter analysis or field analysis and add the missing type of analysis routines to the package. In this work, SPICE, traditionally a lumped-parameter simulator, is modified so that it will perform lossy transmission line analysis using a different modeling approach. Modifying SPICE3E2 or any other large software package is not a trivial task. An understanding of the programming conventions used, the simulation software, and the simulation algorithms is required. This thesis was written to clarify the procedure for installing a device into SPICE3E2. The installation of three devices is documented, and the installations of the first two provide a foundation for installation of the lossy line, which is the third device. The details of the discussion are specific to SPICE, but the concepts will be helpful when performing installations into other circuit analysis packages.
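
    For context, the field-level model a lossy interconnect device must implement, in contrast to the ordinary differential equations of SPICE's lumped elements: the telegrapher's equations in their standard form (the thesis's exact model approach may differ).

    ```latex
    % Per-unit-length resistance R, inductance L, conductance G, capacitance C:
    \[
      \frac{\partial v}{\partial x} = -R\,i - L\,\frac{\partial i}{\partial t},
      \qquad
      \frac{\partial i}{\partial x} = -G\,v - C\,\frac{\partial v}{\partial t}.
    \]
    ```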

  13. Simulations as a tool for higher mass resolution spectrometer: Lessons from existing observations

    NASA Astrophysics Data System (ADS)

    Nicolaou, Georgios; Yamauchi, Masatoshi; Nilsson, Hans; Wieser, Martin; Fedorov, Andrei

    2017-04-01

    The scientific requirements of each mission are crucial for an instrument's design. Ion tracing simulations of instruments can help characterize their performance, identify their limitations and improve the design for future missions. However, simulations describe performance in the ideal case, while the actual response is determined by many other factors. Therefore, simulations should be compared with observations whenever possible. Characterizing the actual response of a running instrument gives valuable lessons for the future design of instruments with the same detection principle, before resources are spent to build and calibrate them. In this study we use an ion tracing simulation of the Ion Composition Analyser (ICA) on board ROSETTA in order to characterize its response and to compare it with observations. It turned out that, due to the complicated, unexpected response of the running instrument, heavy cometary ions and molecules are sometimes difficult to resolve. However, a preliminary simulation of a slightly modified design predicts much higher mass resolution. Even after considering the complicated, unexpected response, we can safely expect that the modified design will resolve the most abundant heavy atomic ions (e.g., O^+) and molecular ions (e.g., N_2^+ and O_2^+). We show the simulation results for both designs together with ICA data.

  14. GEANT4 Tuning For pCT Development

    NASA Astrophysics Data System (ADS)

    Yevseyeva, Olga; de Assis, Joaquim T.; Evseev, Ivan; Schelin, Hugo R.; Paschuk, Sergei A.; Milhoretto, Edney; Setti, João A. P.; Díaz, Katherin S.; Hormaza, Joel M.; Lopes, Ricardo T.

    2011-08-01

    Proton beams in medical applications deal with relatively thick targets like the human head or trunk. Thus, the fidelity of proton computed tomography (pCT) simulations as a tool for proton therapy planning depends in the general case on the accuracy of results obtained for proton interaction with thick absorbers. As shown previously, GEANT4 simulations of proton energy spectra after passing through thick absorbers do not agree well with existing experimental data. Moreover, the spectra simulated for the Bethe-Bloch domain showed an unexpected sensitivity to the choice of low-energy electromagnetic models during code execution. These observations were made with GEANT4 version 8.2 during our simulations for pCT. This work describes in more detail the simulations of proton passage through aluminum absorbers of varied thickness. The simulations were done by modifying only the geometry in the Hadrontherapy Example, for all available choices of the Electromagnetic Physics Models. As the most probable reasons for these effects are some specific feature in the code or some implicit parameters described in the GEANT4 manual, we continued our study with version 9.2 of the code. Some improvements in comparison with our previous results were obtained. The simulations were performed with further applications to pCT development in mind.
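
    For context, the standard Bethe-Bloch mean stopping power that governs the energy-loss domain discussed above (textbook form, not GEANT4's exact implementation):

    ```latex
    % z: projectile charge; Z, A: absorber atomic and mass number; I: mean
    % excitation energy; \beta = v/c; \gamma: Lorentz factor; T_max: maximum
    % energy transfer to an electron; \delta: density correction;
    % K = 4\pi N_A r_e^2 m_e c^2 \approx 0.307\ \mathrm{MeV\,mol^{-1}\,cm^2}.
    \[
      -\left\langle \frac{dE}{dx} \right\rangle
      = K z^2 \frac{Z}{A} \frac{1}{\beta^2}
        \left[ \frac{1}{2}\ln\frac{2 m_e c^2 \beta^2 \gamma^2 T_{\max}}{I^2}
               - \beta^2 - \frac{\delta(\beta\gamma)}{2} \right]
    \]
    ```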

  15. A Micro-Grid Simulator Tool (SGridSim) using Effective Node-to-Node Complex Impedance (EN2NCI) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Udhay Ravishankar; Milos Manic

    2013-08-01

    This paper presents a micro-grid simulator tool (SGridSim) useful for implementing and testing multi-agent controllers. As a matter of common engineering practice, it is important to have a tool that simplifies the modeling of the salient features of a desired system. In electric micro-grids, these salient features are the voltage and power distributions within the micro-grid. Current simplified electric power grid simulator tools such as PowerWorld, PowerSim, Gridlab, etc., model only the power distribution features of a desired micro-grid. Other power grid simulators, such as Simulink, Modelica, etc., use detailed modeling to accommodate the voltage distribution features. This paper presents a SGridSim micro-grid simulator tool that simplifies the modeling of both the voltage and power distribution features in a desired micro-grid. The SGridSim tool accomplishes this simplified modeling by using Effective Node-to-Node Complex Impedance (EN2NCI) models of the components that typically make up a micro-grid. The term EN2NCI model means that the impedance-based components of a micro-grid are modeled as single impedances tied between their respective voltage nodes on the micro-grid. Hence the benefits of the presented SGridSim tool are: 1) simulation of a micro-grid is performed strictly in the complex domain; and 2) a micro-grid is simulated faster by avoiding the simulation of detailed transients. An example micro-grid model was built using the SGridSim tool and tested to simulate both the voltage and power distribution features with a total absolute relative error of less than 6%.
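
    What an EN2NCI-style solve reduces to can be sketched on a hypothetical two-bus example (illustrative values, not SGridSim code): every component is a single complex impedance between voltage nodes, so the grid state follows from a complex-domain nodal solve with no transient integration.

    ```python
    # Two-bus micro-grid: slack source feeding one load bus through a line.
    z_line = 0.5 + 2.0j          # source bus -> load bus impedance (ohms)
    z_load = 20.0 + 5.0j         # load-to-ground impedance (ohms)
    v_src = 120.0 + 0.0j         # slack-source voltage phasor (volts)

    # Single unknown node (the load bus): nodal admittance Y, injection i.
    y = 1.0 / z_line + 1.0 / z_load
    i_inj = v_src / z_line
    v_load = i_inj / y           # complex-domain solve, no transients simulated

    s_load = v_load * (v_load / z_load).conjugate()   # complex power (VA)
    print(f"|V| = {abs(v_load):.1f} V, P = {s_load.real:.0f} W, "
          f"Q = {s_load.imag:.0f} var")
    ```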

  16. Impact of Charge Degradation on the Life Cycle Climate Performance of a Residential Air-Conditioning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beshr, Mohamed; Aute, Vikrant; Abdelaziz, Omar

    2014-01-01

    Vapor compression systems continuously leak a small fraction of their refrigerant charge to the environment, whether during operation or servicing. As a result of the slow leak occurring during operation, the refrigerant charge decreases until the system is serviced and recharged. Beyond a certain limit, this charge degradation begins to have a detrimental effect on system capacity, energy consumption, and coefficient of performance (COP). This paper presents a literature review and a summary of previous experimental work on the effect of undercharging, or charge degradation, in different vapor compression systems, especially those without a receiver. These systems include residential air conditioning and heat pump systems utilizing different components and refrigerants, and water chiller systems. Most of these studies show similar trends for the effect of charge degradation on system performance. However, although much experimental work exists on the effect of charge degradation on system performance, no correlation between charge degradation and system performance yet exists. Thus, based on the literature review, three different correlations that characterize the effect of charge on system capacity and energy consumption are developed: one for air-conditioning systems, one for vapor compression water-to-water chiller systems, and one for heat pumps. These correlations can be implemented in vapor compression cycle simulation tools to obtain a better prediction of system performance throughout its lifetime. In this paper, the correlations are implemented in an open-source tool for life cycle climate performance (LCCP) based design of vapor compression systems. The LCCP of a residential air-source heat pump is evaluated using the tool, and the effect of charge degradation on the results is studied. The heat pump is simulated using a validated component-based vapor compression system model, and the LCCP results obtained using the three charge degradation correlations are compared.
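
    For context, LCCP in a commonly used simplified form (a sketch of the metric, not the tool's exact implementation; full LCCP accounting also includes terms, such as refrigerant manufacturing emissions, omitted here):

    ```latex
    % Direct emissions from refrigerant releases plus indirect emissions from
    % energy use, summed over the equipment lifetime:
    \[
      \mathrm{LCCP} =
        \underbrace{\bigl(L\,r_{\mathrm{ann}} + m_{\mathrm{EOL}}\bigr)\,
                    \mathrm{GWP}}_{\text{direct}}
        +
        \underbrace{L\,E_{\mathrm{ann}}\,\alpha_{\mathrm{CO_2}}}_{\text{indirect}}
    \]
    % L: lifetime (yr); r_ann: annual leak rate (kg/yr); m_EOL: end-of-life
    % loss (kg); E_ann: annual energy use (kWh/yr); \alpha_{CO_2}: grid emission
    % factor (kg CO2-eq/kWh). Charge degradation enters through E_ann, which the
    % correlations above make a function of the instantaneous charge level.
    ```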

  17. Modeling and control of flow during impregnation of heterogeneous porous media, with application to composite mold-filling processes

    NASA Astrophysics Data System (ADS)

    Bickerton, Simon

    Liquid Composite Molding (LCM) encompasses a growing list of composite material manufacturing techniques. These processes have provided the promise of complex fiber-reinforced plastic parts manufactured in a single molding step. In recent years a significant research effort has been invested in the development of process simulations, providing tools that have advanced current LCM technology and broadened the range of applications. The requirement to manufacture larger, more complex parts has motivated investigation of active control of LCM processes. Due to the unlimited variety of part geometries that can be produced, finite element based process simulations will be used to some extent in the design of actively controlled processes. Ongoing efforts are being made to improve material parameter specification for process simulations, increasing their value as design tools. Several phenomena occurring during mold filling have been addressed through flow visualization experiments and analysis of manufactured composite parts. The influence of well-defined air channels within a mold cavity is investigated, incorporating their effects within existing filling simulations. Three different flow configurations have been addressed, testing the application of 'equivalent permeabilities', effectively approximating air channels as representative porous media. LCM parts having doubly curved regions require preform fabrics to undergo significant and varying deformation throughout a mold cavity. Existing methods for predicting preform deformation, and the resulting permeability distribution, have been applied to a conical mold geometry. Comparisons between experiment and simulation are promising, although the geometry studied required large deformation over much of the part, shearing the preform fabric beyond the scope of the models applied. An investigational study was performed to determine the magnitude of any effect on mold filling caused by corners within LCM mold cavities. The molds applied in this study required careful consideration of cavity thickness variations. Any effects on mold filling due to corner radii were overshadowed by those due to preform compression. While numerical tools are available to study actively controlled mold filling in a virtual environment, some development of the physical equipment is required to implement this in practice. A versatile, multiple-line fluid injection system is developed here. The equipment and control algorithms employed have provided servo control of flow rate or injection pressure, and have been tested under very challenging conditions. The single injection line developed is expanded to a multiple-line system, and shows great potential for application to actual resin systems. A case study is presented, demonstrating the design and implementation of a simple actively controlled injection scheme. The experimental facility developed provides an excellent testbed for the application of actively controlled mold filling concepts, an area that shows great promise for the advancement of LCM processes.

  18. Combining Simulation Tools for End-to-End Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacobs; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min

    2015-01-01

    Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Due to the need for complex models, simulations are often highly tailored to the needs of the particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable to assess individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. In order to obtain end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with the SLS's ascent tool (POST2), and a new tool was created to optimize the full problem by operating both simulations simultaneously.
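
    The abstract does not name the coupling interface, so the following is only a schematic sketch of how such an ascent/in-space hand-off could be iterated; `propagate`, `optimize`, and `requested_insertion` are hypothetical placeholders, not the actual POST2 or Copernicus APIs.

    ```python
    # Hypothetical co-simulation driver: fixed-point iteration on the state
    # vector shared between an ascent tool and an in-space optimizer.
    import numpy as np

    def run_end_to_end(ascent_sim, inspace_sim, guess, tol=1e-6, max_iter=50):
        """Iterate until the ascent and in-space tools agree on insertion."""
        state = np.asarray(guess, dtype=float)
        for _ in range(max_iter):
            insertion = ascent_sim.propagate(target_state=state)    # launch phase
            result = inspace_sim.optimize(initial_state=insertion)  # in-space phase
            # The in-space optimizer may prefer a different insertion state;
            # feed it back to the ascent tool and repeat until they agree.
            if np.linalg.norm(result.requested_insertion - insertion) < tol:
                return result
            state = result.requested_insertion
        raise RuntimeError("ascent/in-space interface did not converge")
    ```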

  19. Applied Virtual Reality Research and Applications at NASA/Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P.

    1995-01-01

    A Virtual Reality (VR) applications program has been under development at NASA/Marshall Space Flight Center (MSFC) since 1989. The objectives of the MSFC VR Applications Program are to develop, assess, validate, and utilize VR in hardware development, operations development and support, mission operations training and science training. Before this technology can be utilized with confidence in these applications, it must be validated for each particular class of application. That is, the precision and reliability with which it maps onto real settings and scenarios, representative of a class, must be calculated and assessed. The approach of the MSFC VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies. These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems. Specific validation studies for selected classes of applications have been completed or are currently underway. These include macro-ergonomic "control-room class" design analysis, Spacelab stowage reconfiguration training, a full-body micro-gravity functional reach simulator, and a gross anatomy teaching simulator. This paper describes the MSFC VR Applications Program and the validation studies.

  20. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
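
    As an illustration of the kind of source-independent plotting SEE IT automates, the sketch below overlays measured and simulated series and draws time-series and scatter views from a common resampled frame. The CSV layout and column names are assumptions for the example, not SEE IT's actual data model.

    ```python
    # Illustrative sketch (not the SEE IT code): compare measured vs.
    # simulated data for one variable, independent of the data source.
    import pandas as pd
    import matplotlib.pyplot as plt

    def compare(measured_csv, simulated_csv, variable="chiller_power_kW"):
        m = pd.read_csv(measured_csv, parse_dates=["timestamp"], index_col="timestamp")
        s = pd.read_csv(simulated_csv, parse_dates=["timestamp"], index_col="timestamp")
        # Resample both sources to a common hourly grid so they are comparable.
        m, s = m[variable].resample("1h").mean(), s[variable].resample("1h").mean()
        df = pd.concat({"measured": m, "simulated": s}, axis=1).dropna()

        fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
        df.plot(ax=ax1, title=f"Time series: {variable}")   # time-series view
        ax2.scatter(df["measured"], df["simulated"], s=4)   # scatter view
        ax2.set(xlabel="measured", ylabel="simulated", title="Scatter")
        fig.tight_layout()
        return df

    # usage: compare("meter_data.csv", "simulation_output.csv")
    ```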

  1. The value of SPaCE in delivering patient feedback.

    PubMed

    Clapham, Laura; Allan, Laura; Stirling, Kevin

    2016-02-01

    The use of simulated patients (SPs) within undergraduate medical curricula is an established and valued learning opportunity. Within the context of simulation, it is imperative to capture feedback from all participants within the simulation activity. The Simulated Patient Candidate Evaluation (SPaCE) tool was developed to deliver SP feedback following a simulation activity. SPaCE is a closed feedback tool that allows SPs to rate a student's performance, using a five-point Likert scale, in three domains: attitude; interaction skills; and management. This research study examined the value of the SPaCE tool and how it contributes to the overall feedback that a student receives. Classical test theory was used to determine the reliability of the SPaCE tool. An evaluation of all SP responses was conducted to observe trends in scoring patterns for each question. Qualitative data were collected via a free-text questionnaire and subsequent focus group discussion. Classical test theory determined that the SPaCE tool had a reliability coefficient of 0.89. A total of 13 SPs replied to the questionnaire. A thematic analysis of all questionnaire data identified that the SPaCE tool provides a structure that allows patient feedback to be given effectively following a simulation activity. These themes were discussed further with six SPs who attended the subsequent focus group session. The SPaCE tool has been shown to be a reliable closed feedback tool that allows SPs to discriminate between students, based on their performance. The next stage in the development of the SPaCE tool is to test the wider applicability of this feedback tool. © 2015 John Wiley & Sons Ltd.
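
    The classical-test-theory reliability coefficient reported above is conventionally Cronbach's alpha. A minimal sketch of that computation on invented five-point ratings (one row per SP encounter, one column per item):

    ```python
    # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of
    # total scores). The ratings below are synthetic stand-ins.
    import numpy as np

    def cronbach_alpha(ratings):
        r = np.asarray(ratings, dtype=float)
        k = r.shape[1]                           # number of items
        item_vars = r.var(axis=0, ddof=1).sum()  # sum of per-item variances
        total_var = r.sum(axis=1).var(ddof=1)    # variance of summed scores
        return k / (k - 1) * (1 - item_vars / total_var)

    rng = np.random.default_rng(0)
    base = rng.integers(2, 5, size=(30, 1))                      # latent ability
    items = np.clip(base + rng.integers(-1, 2, size=(30, 9)), 1, 5)
    print(f"alpha = {cronbach_alpha(items):.2f}")
    ```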

  2. Going glass to digital: virtual microscopy as a simulation-based revolution in pathology and laboratory science.

    PubMed

    Nelson, Danielle; Ziv, Amitai; Bandali, Karim S

    2012-10-01

    The recent technological advance of digital high resolution imaging has allowed the field of pathology and medical laboratory science to undergo a dramatic transformation with the incorporation of virtual microscopy as a simulation-based educational and diagnostic tool. This transformation has correlated with an overall increase in the use of simulation in medicine in an effort to address dwindling clinical resource availability and patient safety issues currently facing the modern healthcare system. Virtual microscopy represents one such simulation-based technology that has the potential to enhance student learning and readiness to practice while revolutionising the ability to clinically diagnose pathology collaboratively across the world. While understanding that a substantial amount of literature already exists on virtual microscopy, much more research is still required to elucidate the full capabilities of this technology. This review explores the use of virtual microscopy in medical education and disease diagnosis with a unique focus on key requirements needed to take this technology to the next level in its use in medical education and clinical practice.

  3. Republished: going glass to digital: virtual microscopy as a simulation-based revolution in pathology and laboratory science.

    PubMed

    Nelson, Danielle; Ziv, Amitai; Bandali, Karim S

    2013-10-01

    The recent technological advance of digital high resolution imaging has allowed the field of pathology and medical laboratory science to undergo a dramatic transformation with the incorporation of virtual microscopy as a simulation-based educational and diagnostic tool. This transformation has correlated with an overall increase in the use of simulation in medicine in an effort to address dwindling clinical resource availability and patient safety issues currently facing the modern healthcare system. Virtual microscopy represents one such simulation-based technology that has the potential to enhance student learning and readiness to practice while revolutionising the ability to clinically diagnose pathology collaboratively across the world. While understanding that a substantial amount of literature already exists on virtual microscopy, much more research is still required to elucidate the full capabilities of this technology. This review explores the use of virtual microscopy in medical education and disease diagnosis with a unique focus on key requirements needed to take this technology to the next level in its use in medical education and clinical practice.

  4. Modelling of electronic excitation and radiation in the Direct Simulation Monte Carlo Macroscopic Chemistry Method

    NASA Astrophysics Data System (ADS)

    Goldsworthy, M. J.

    2012-10-01

    One of the most useful tools for modelling rarefied hypersonic flows is the Direct Simulation Monte Carlo (DSMC) method. Simulator particle movement and collision calculations are combined with statistical procedures to model thermal non-equilibrium flow-fields described by the Boltzmann equation. The Macroscopic Chemistry Method for DSMC simulations was developed to simplify the inclusion of complex thermal non-equilibrium chemistry. The macroscopic approach uses statistical information which is calculated during the DSMC solution process in the modelling procedures. Here it is shown how inclusion of macroscopic information in models of chemical kinetics, electronic excitation, ionization, and radiation can enhance the capabilities of DSMC to model flow-fields where a range of physical processes occur. The approach is applied to the modelling of a 6.4 km/s nitrogen shock wave and results are compared with those from existing shock-tube experiments and continuum calculations. Reasonable agreement between the methods is obtained. The quality of the comparison is highly dependent on the set of vibrational relaxation and chemical kinetic parameters employed.

  5. On Parallelizing Single Dynamic Simulation Using HPC Techniques and APIs of Commercial Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diao, Ruisheng; Jin, Shuangshuang; Howell, Frederic

    Time-domain simulations are heavily used in today’s planning and operation practices to assess power system transient stability and post-transient voltage/frequency profiles following severe contingencies to comply with industry standards. Because of the increased modeling complexity, it is several times slower than real time for state-of-the-art commercial packages to complete a dynamic simulation for a large-scale model. With the growing stochastic behavior introduced by emerging technologies, the power industry has seen a growing need for performing security assessment in real time. This paper presents a parallel implementation framework to speed up a single dynamic simulation by leveraging the existing stability model library in commercial tools through their application programming interfaces (APIs). Several high performance computing (HPC) techniques are explored, such as parallelizing the calculation of generator current injection, identifying fast linear solvers for network solution, and parallelizing data outputs when interacting with APIs in the commercial package TSAT. The proposed method has been tested on a WECC planning base case with detailed synchronous generator models and exhibits outstanding scalable performance with sufficient accuracy.
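
    As a conceptual illustration of the first HPC technique named above, the sketch below evaluates per-generator current injections in parallel with a process pool. The one-line machine model (a classical source behind a transient reactance) is a toy stand-in; the paper's implementation works through TSAT's APIs and its full model library.

    ```python
    # Toy sketch of parallel generator current-injection evaluation.
    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def injection(args):
        """Current injection of one generator: I = (E - V) / jX'd."""
        emf, v_bus, x_d = args
        return (emf - v_bus) / (1j * x_d)

    def all_injections(emfs, v_buses, x_ds, workers=4):
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return np.array(list(pool.map(injection, zip(emfs, v_buses, x_ds))))

    if __name__ == "__main__":   # guard required for process pools
        emfs = np.array([1.05 + 0.10j, 1.02 + 0.05j])
        v_bus = np.array([1.00 + 0.00j, 0.99 + 0.02j])
        x_d = np.array([0.30, 0.25])
        print(all_injections(emfs, v_bus, x_d))
    ```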

  6. Gas network model allows full reservoir coupling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Methnani, M.M.

    The gas-network flow model (Gasnet), developed for and added to an existing Qatar General Petroleum Corp. (QGPC) in-house reservoir simulator, allows improved modeling of the interaction among the reservoir, wells, and pipeline networks. Gasnet is a three-phase model that is modified to handle gas-condensate systems. The numerical solution is based on a control volume scheme that uses the concept of cells and junctions, whereby pressure and phase densities are defined in cells, while phase flows are defined at junction links. The model features common numerical equations for the reservoir, the well, and the pipeline components and an efficient state-variable solution method in which all primary variables including phase flows are solved directly. Both steady-state and transient flow events can be simulated with the same tool. Three test cases show how the model runs. One case simulates flow redistribution in a simple two-branch gas network. The second simulates a horizontal gas well in a waterflooded gas reservoir. The third involves an export gas pipeline coupled to a producing reservoir.

  7. To Keep or Not to Keep? The Question of Crystallographic Waters for Enzyme Simulations in Organic Solvent

    PubMed Central

    Dahanayake, Jayangika N.; Gautam, Devaki N.; Verma, Rajni; Mitchell-Koch, Katie R.

    2016-01-01

    The use of enzymes in non-aqueous solvents expands the use of biocatalysts to hydrophobic substrates, with the ability to tune selectivity of reactions through solvent selection. Non-aqueous enzymology also allows for fundamental studies on the role of water and other solvents in enzyme structure, dynamics, and function. Molecular dynamics simulations serve as a powerful tool in this area, providing detailed atomic information about the effect of solvents on enzyme properties. However, a common protocol for non-aqueous enzyme simulations does not exist. If you want to simulate enzymes in non-aqueous solutions, how many and which crystallographic waters do you keep? In the present work, this question is addressed by determining which crystallographic water molecules lead most quickly to an equilibrated protein structure. Five different methods of selecting and keeping crystallographic waters are used in order to discover which crystallographic waters lead the protein structure to reach an equilibrated structure more rapidly in organic solutions. It is found that buried waters contribute most to rapid equilibration in organic solvent, with slow-diffusing waters giving similar results. PMID:27403032
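
    One plausible burial criterion, counting how densely protein atoms cage each water oxygen, can be sketched as follows. This is an illustration of the idea, not the paper's exact selection protocol, and the cutoff and neighbor-count thresholds are invented.

    ```python
    # Heuristic sketch: flag "buried" crystallographic waters by how many
    # protein atoms lie within a cutoff of each water oxygen.
    import numpy as np

    def buried_waters(water_O, protein_atoms, cutoff=5.0, min_neighbors=30):
        """Return indices of water oxygens densely caged by protein atoms.

        water_O       : (n_waters, 3) coordinates in Angstrom
        protein_atoms : (n_atoms, 3) coordinates in Angstrom
        """
        keep = []
        for i, w in enumerate(water_O):
            d = np.linalg.norm(protein_atoms - w, axis=1)
            if (d < cutoff).sum() >= min_neighbors:  # densely caged -> buried
                keep.append(i)
        return keep
    ```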

  8. The use of a photoionization detector to detect harmful volatile chemicals by emergency personnel

    PubMed Central

    Patel, Neil D; Fales, William D; Farrell, Robert N

    2009-01-01

    Objective The objective of this investigation was to determine if a photoionization detector (PID) could be used to detect the presence of a simulated harmful chemical on simulated casualties of a chemical release. Methods A screening protocol, based on existing radiation screening protocols, was developed for the purposes of the investigation. Three simulated casualties were contaminated with a simulated chemical agent and two groups of emergency responders were involved in the trials. The success–failure ratio of the participants was used to judge the performance of the PID in this application. Results A high success rate was observed when the screening protocol was properly adhered to (97.67%). Conversely, the success rate suffered when participants deviated from the protocol (86.31%). With one exception, all failures were noted to have been the result of a failure to correctly observe the established screening protocol. Conclusions The results of this investigation indicate that the PID may be an effective screening tool for emergency responders. However, additional study is necessary to both confirm the effectiveness of the PID and refine the screening protocol if necessary. PMID:27147829

  9. New Tooling System for Forming Aluminum Beverage Can End Shell

    NASA Astrophysics Data System (ADS)

    Yamazaki, Koetsu; Otsuka, Takayasu; Han, Jing; Hasegawa, Takashi; Shirasawa, Taketo

    2011-08-01

    This paper proposes a new tooling system for forming shells of aluminum beverage can ends. First, the forming process of a conventional tooling system has been simulated using three-dimensional finite element models. Simulation results have been confirmed to be consistent with those of axisymmetric models, so simulations for further study have been performed using axisymmetric models to save computational time. A comparison shows that thinning of the shell formed by the proposed tooling system has been improved by about 3.6%. Influences of the tool upmost surface profiles and tool initial positions in the new tooling system have been investigated, and a design optimization method based on the numerical simulations has then been applied to search for optimum design points, in order to minimize thinning subject to constraints on the geometrical dimensions of the shell. Finally, the performance of the shell subjected to internal pressure has been confirmed to meet design requirements.

  10. Lightweight Object Oriented Structure analysis: Tools for building Tools to Analyze Molecular Dynamics Simulations

    PubMed Central

    Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan

    2014-01-01

    LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784

  11. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    PubMed

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.

  12. Web-based, GPU-accelerated, Monte Carlo simulation and visualization of indirect radiation imaging detector performance.

    PubMed

    Dong, Han; Sharma, Diksha; Badano, Aldo

    2014-12-01

    Monte Carlo simulations play a vital role in the understanding of the fundamental limitations, design, and optimization of existing and emerging medical imaging systems. Efforts in this area have resulted in the development of a wide variety of open-source software packages. One such package, hybridmantis, uses a novel hybrid concept to model indirect scintillator detectors by balancing the computational load using dual CPU and graphics processing unit (GPU) processors, obtaining computational efficiency with reasonable accuracy. In this work, the authors describe two open-source visualization interfaces, webmantis and visualmantis, to facilitate the setup of computational experiments via hybridmantis. The visualization tools visualmantis and webmantis enable the user to control simulation properties through a user interface. In the case of webmantis, control via a web browser allows access through mobile devices such as smartphones or tablets. webmantis acts as a server back-end and communicates with an NVIDIA GPU computing cluster that can support multiuser environments where users can execute different experiments in parallel. The output consists of the point response, pulse-height spectrum, and optical transport statistics generated by hybridmantis. Users can download the output images and statistics as a zip file for future reference. In addition, webmantis provides a visualization window that displays a few selected optical photon paths as they are transported through the detector columns, allowing the user to trace the history of the optical photons. The visualization tools visualmantis and webmantis provide features such as on-the-fly generation of pulse-height spectra and response functions for microcolumnar x-ray imagers while allowing users to save simulation parameters and results from prior experiments. The graphical interfaces simplify the simulation setup and allow the user to go directly from specifying input parameters to receiving visual feedback for the model predictions.

  13. Development of a numerical methodology for flowforming process simulation of complex geometry tubes

    NASA Astrophysics Data System (ADS)

    Varela, Sonia; Santos, Maite; Arroyo, Amaia; Pérez, Iñaki; Puigjaner, Joan Francesc; Puigjaner, Blanca

    2017-10-01

    Nowadays, the incremental flowforming process is widely explored because the usage of complex tubular products is increasing due to the light-weighting trend and the use of expensive materials. The enhanced mechanical properties of finished parts, combined with the process efficiency in terms of raw material and energy consumption, are the key factors for its competitiveness and sustainability, which is consistent with EU industry policy. As a promising technology, additional steps for extending the existing flowforming limits in the production of tubular products are required. The objective of the present research is to further expand the current state of the art regarding limitations on tube thickness and diameter, exploring the feasibility of flowforming complex geometries such as tubes with thicknesses of up to 60 mm. In this study, the analysis of the backward flowforming process of a 7075 aluminum tubular preform is carried out as a demonstration case to define the optimum process parameters, machine requirements and tooling geometry. Numerical simulation studies on flowforming of thin-walled tubular components have been considered to increase the knowledge of the technology. The calculation of the rotational movement of the mesh preform, the high thickness-to-length ratio and the thermomechanical conditions significantly increase the computation time of the numerical simulation model. This means that efficient and reliable tools able to predict the forming loads and the quality of flowformed thick tubes are not available. This paper aims to overcome this situation by developing a simulation methodology based on an FEM simulation code and including new strategies. Material characterization has also been performed through tensile tests to be able to design the process. Finally, to check the reliability of the model, flowforming tests in an industrial environment have been performed.

  14. Small Engine Technology (SET) - Task 14 Axisymmetric Engine Simulation Environment

    NASA Technical Reports Server (NTRS)

    Miller, Max J.

    1999-01-01

    As part of the NPSS (Numerical Propulsion Simulation System) project, NASA Lewis has a goal of developing a U.S. industry standard for an axisymmetric engine simulation environment. In this program, AlliedSignal Engines (AE) contributed to this goal by evaluating the ENG20 software and developing support tools. ENG20 is a NASA-developed axisymmetric engine simulation tool. The project was divided into six subtasks, which are summarized below: Evaluate the capabilities of the ENG20 code using an existing test case to see how this procedure can capture the component interactions for a full engine. Link AE's compressor and turbine axisymmetric streamline curvature codes (UD0300M and TAPS) with ENG20, which will provide the necessary boundary conditions for an ENG20 engine simulation. Evaluate GE's Global Data System (GDS) and attempt to use GDS to do the linking of codes described in Subtask 2 above. Use a turbofan engine test case to evaluate various aspects of the system, including the linkage of UD0300M and TAPS with ENG20 and the GE data storage system; also, compare the solution results with cycle deck results, axisymmetric solutions (UD0300M and TAPS), and test data to determine the accuracy of the solution, and evaluate the order of accuracy and the convergence time for the solution. Provide a monthly status report and a final formal report documenting AE's evaluation of ENG20. Provide the developed interfaces that link UD0300M and TAPS with ENG20 to NASA. The interface that links UD0300M with ENG20 will be compatible with the industry version of UD0300M.

  15. The medical educator, the discourse analyst, and the phonetician: a collaborative feedback methodology for clinical communication.

    PubMed

    Woodward-Kron, Robyn; Stevens, Mary; Flynn, Eleanor

    2011-05-01

    Frameworks for clinical communication assist educators in making explicit the principles of good communication and providing feedback to medical trainees. However, existing frameworks rarely take into account the roles of culture and language in communication, which can be important for international medical graduates (IMGs) whose first language is not English. This article describes the collaboration by a medical educator, a discourse analyst, and a phonetician to develop a communication and language feedback methodology to assist IMG trainees at a Victorian hospital in Australia with developing their doctor-patient communication skills. The Communication and Language Feedback (CaLF) methodology incorporates a written tool and video recording of role-plays of doctor-patient interactions in a classroom setting or in an objective structured clinical examination (OSCE) practice session with a simulated patient. IMG trainees receive verbal feedback from their hospital-based medical clinical educator, the simulated patient, and linguists. The CaLF tool was informed by a model of language in context, observation of IMG communication training, and process evaluation by IMG participants during January to August 2009. The authors provided participants with a feedback package containing their practice video (which included verbal feedback) and the completed CaLF tool. The CaLF methodology provides a tool for medical educators and language practitioners to work collaboratively with IMGs to enhance communication and language skills. The ongoing interdisciplinary collaboration also provides much-needed applied research opportunities in intercultural health communication, an area the authors believe cannot be adequately addressed from the perspective of one discipline alone. Copyright © by the Association of American Medical Colleges.

  16. In-silico wear prediction for knee replacements--methodology and corroboration.

    PubMed

    Strickland, M A; Taylor, M

    2009-07-22

    The capability to predict in-vivo wear of knee replacements is a valuable pre-clinical analysis tool for implant designers. Traditionally, time-consuming experimental tests provided the principal means of investigating wear. Today, computational models offer an alternative. However, the validity of these models has not been demonstrated across a range of designs and test conditions, and several different formulas are in contention for estimating wear rates, limiting confidence in the predictive power of these in-silico models. This study collates and retrospectively simulates a wide range of experimental wear tests using fast rigid-body computational models with extant wear prediction algorithms, to assess the performance of current in-silico wear prediction tools. The number of tests corroborated gives a broader, more general assessment of the performance of these wear-prediction tools, and provides better estimates of the wear 'constants' used in computational models. High-speed rigid-body modelling allows a range of alternative algorithms to be evaluated. Whilst most cross-shear (CS)-based models perform comparably, the 'A/A+B' wear model appears to offer the best predictive power amongst existing wear algorithms. However, the range and variability of experimental data leaves considerable uncertainty in the results. More experimental data with reduced variability and more detailed reporting of studies will be necessary to corroborate these models with greater confidence. With simulation times reduced to only a few minutes, these models are ideally suited to large-volume 'design of experiment' or probabilistic studies (which are essential if pre-clinical assessment tools are to begin addressing the degree of variation observed clinically and in explanted components).
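
    The cross-shear formulations compared in the study share an Archard-like structure in which the wear factor scales with a cross-shear ratio; in the 'A/(A+B)' variant, A is the frictional work component perpendicular to the principal molecular orientation and B the parallel component. A hedged sketch of one wear-depth update, with placeholder units and coefficient:

    ```python
    # Hedged sketch of a cross-shear (CS) based wear update for one contact
    # element over one gait cycle; coefficients and the exact CS definition
    # vary between published formulations, so treat these as placeholders.
    import numpy as np

    def wear_depth_increment(pressure, sliding, work_perp, work_par, k=1.0e-9):
        """Archard-type increment with a CS-dependent wear factor.

        pressure  : contact pressure history [MPa]
        sliding   : incremental sliding distances [mm]
        work_perp : frictional work perpendicular to principal orientation (A)
        work_par  : frictional work parallel to principal orientation (B)
        """
        cs = work_perp / (work_perp + work_par)      # the 'A/(A+B)' ratio
        return k * cs * np.sum(pressure * sliding)   # depth increment [mm]
    ```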

  17. Quantifying the Economic and Grid Reliability Impacts of Improved Wind Power Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Martinez-Anido, Carlo Brancucci; Wu, Hongyu

    Wind power forecasting is an important tool in power system operations to address variability and uncertainty. Accurately doing so is important to reducing the occurrence and length of curtailment, enhancing market efficiency, and improving the operational reliability of the bulk power system. This research quantifies the value of wind power forecasting improvements in the IEEE 118-bus test system as modified to emulate the generation mixes of Midcontinent, California, and New England independent system operator balancing authority areas. To measure the economic value, a commercially available production cost modeling tool was used to simulate the multi-timescale unit commitment (UC) and economic dispatch process for calculating the cost savings and curtailment reductions. To measure the reliability improvements, an in-house tool, FESTIV, was used to calculate the system's area control error and the North American Electric Reliability Corporation Control Performance Standard 2. The approach allowed scientific reproducibility of results and cross-validation of the tools. A total of 270 scenarios were evaluated to accommodate the variation of three factors: generation mix, wind penetration level, and wind forecasting improvements. The modified IEEE 118-bus systems utilized 1 year of data at multiple timescales, including the day-ahead UC, 4-hour-ahead UC, and 5-min real-time dispatch. The value of improved wind power forecasting was found to be strongly tied to the conventional generation mix, existence of energy storage devices, and the penetration level of wind energy. The simulation results demonstrate that wind power forecasting brings clear benefits to power system operations.
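
    The CPS2 score mentioned above is, in essence, the share of 10-minute windows whose average area control error (ACE) stays within a bound (L10). A minimal sketch, assuming 5-second ACE samples and an illustrative L10 value:

    ```python
    # Sketch of the NERC CPS2 metric used to score reliability in the study.
    import numpy as np

    def cps2(ace, samples_per_window=120, l10=50.0):
        """ace: ACE series (5-s samples, MW); returns CPS2 in percent.

        120 samples of 5 s = one 10-minute averaging window.
        """
        n = len(ace) // samples_per_window * samples_per_window
        windows = np.asarray(ace[:n]).reshape(-1, samples_per_window)
        passed = np.abs(windows.mean(axis=1)) <= l10
        return 100.0 * passed.mean()

    rng = np.random.default_rng(0)
    print(f"CPS2 = {cps2(rng.normal(0, 40, 17280)):.1f} %")  # one synthetic day
    ```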

  18. A study with ESI PAM-STAMP® on the influence of tool deformation on final part quality during a forming process

    NASA Astrophysics Data System (ADS)

    Vrolijk, Mark; Ogawa, Takayuki; Camanho, Arthur; Biasutti, Manfredi; Lorenz, David

    2018-05-01

    As a result of the ever-increasing demand to produce lighter vehicles, more and more advanced high-strength materials are used in the automotive industry. Focusing on sheet metal cold forming processes, these materials require high pressing forces and exhibit large springback after forming. Due to the high pressing forces, deformations occur in the tooling geometry, introducing dimensional inaccuracies in the blank and potentially impacting the final springback behavior. As a result, the tool deformations can have an impact on the final assembly or introduce cosmetic defects. Often several iterations are required in try-out to obtain the required tolerances, with costs going up to as much as 30% of the entire product development cost. To investigate sheet metal part feasibility and quality, CAE tools are widely used in the automotive industry. However, in current practice the influence of the tool deformations on the final part quality is generally neglected, and simulations are carried out with rigid tools to avoid drastically increased calculation times. If the tool deformation is analyzed through simulation, it is normally done at the end of the drawing process, when contact conditions are mapped on the die structure and a static analysis is performed to check the deflections of the tool. But this method does not predict the influence of these deflections on the final quality of the part. In order to take tool deformations into account during drawing simulations, ESI has developed the ability to couple solvers efficiently, in a way that tool deformations can be included in real time in the drawing simulation without a large increase in simulation time compared to simulations with rigid tools. In this paper a study is presented which demonstrates the effect of tool deformations on the final part quality.

  19. Multiplatform Mission Planning and Operations Simulation Environment for Adaptive Remote Sensors

    NASA Astrophysics Data System (ADS)

    Smith, G.; Ball, C.; O'Brien, A.; Johnson, J. T.

    2017-12-01

    We report on the design and development of mission simulator libraries to support the emerging field of adaptive remote sensors. We will outline the current state of the art in adaptive sensing, provide analysis of how the current approach to performing observing system simulation experiments (OSSEs) must be changed to enable adaptive sensors for remote sensing, and present an architecture to enable their inclusion in future OSSEs. The growing potential of sensors capable of real-time adaptation of their operational parameters calls for a new class of mission planning and simulation tools. Existing simulation tools used in OSSEs assume a fixed set of sensor parameters in terms of observation geometry, frequencies used, resolution, or observation time, which allows simplifications to be made in the simulation and allows sensor observation errors to be characterized a priori. Adaptive sensors may vary these parameters depending on the details of the scene observed, so that sensor performance is not simple to model without conducting OSSE simulations that include sensor adaptation in response to the varying observational environment. Adaptive sensors are of significance to resource-constrained, small satellite platforms because they enable the management of power and data volumes while providing methods for multiple sensors to collaborate. The new class of OSSEs required to utilize adaptive sensors located on multiple platforms must answer the question: if the physical act of sensing has a cost, how does the system determine whether the science value of a measurement is worth the cost, and how should that cost be shared among the collaborating sensors? Here we propose to answer this question using an architecture structured around three modules: ADAPT, MANAGE and COLLABORATE. The ADAPT module is a set of routines to facilitate modeling of adaptive sensors, the MANAGE module will implement a set of routines to facilitate simulations of sensor resource management when power and data volume are constrained, and the COLLABORATE module will support simulations of coordination among multiple platforms with adaptive sensors. When used together, these modules will form a simulation framework for OSSEs that can enable both the design of adaptive algorithms to support remote sensing and the prediction of sensor performance.
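
    The cost/value question above can be illustrated with a toy decision rule in which an observation is triggered only when its expected science value exceeds a resource cost split among collaborators. The value model, weights, and numbers are invented for illustration:

    ```python
    # Toy illustration of the sensing cost/value trade described above.
    def should_observe(science_value, power_cost_j, data_cost_mb,
                       n_collaborators, power_weight=0.01, data_weight=0.05):
        """Return True if the shared cost is worth the expected science value."""
        shared_cost = (power_weight * power_cost_j +
                       data_weight * data_cost_mb) / max(n_collaborators, 1)
        return science_value > shared_cost

    # e.g., a high-value scene observed by a three-platform constellation:
    print(should_observe(science_value=2.0, power_cost_j=150,
                         data_cost_mb=8, n_collaborators=3))   # True
    ```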

  20. The Java Image Science Toolkit (JIST) for rapid prototyping and publishing of neuroimaging software.

    PubMed

    Lucas, Blake C; Bogovic, John A; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L; Pham, Dzung L; Landman, Bennett A

    2010-03-01

    Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUIs, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC).

  1. The Java Image Science Toolkit (JIST) for Rapid Prototyping and Publishing of Neuroimaging Software

    PubMed Central

    Lucas, Blake C.; Bogovic, John A.; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L.; Pham, Dzung

    2010-01-01

    Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUIs, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC). PMID:20077162

  2. Dark matter search in a Beam-Dump eXperiment (BDX) at Jefferson Lab: an update on PR12-16-001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Battaglieri, M.

    This document is an update to the proposal PR12-16-001 Dark matter search in a Beam-Dump eXperiment (BDX) at Jefferson Lab, submitted to JLab-PAC44 in 2016, reporting progress in addressing questions raised regarding the beam-on backgrounds. The concerns are addressed by adopting a new simulation tool, FLUKA, and planning measurements of muon fluxes from the dump with its existing shielding. First, we have implemented the detailed BDX experimental geometry into a FLUKA simulation, in consultation with experts from the JLab Radiation Control Group. The FLUKA simulation has been compared directly to our GEANT4 simulations and shown to agree in regions of validity. The FLUKA interaction package, with a tuned set of biasing weights, is naturally able to generate reliable particle distributions with very small probabilities and therefore predict rates at the detector location beyond the planned shielding around the beam dump. Second, we have developed a plan to conduct measurements of the muon flux from the Hall-A dump in its current configuration to validate our simulations.

  3. Data and Tools | Hydrogen and Fuel Cells | NREL

    Science.gov Websites

    Collection of NREL data and tools for researchers, developers, investors, and others interested in the viability, analysis, and development of hydrogen and fuel cell technologies, covering topics such as energy use and emissions. Includes the Alternative Fuels Data Center tools collection (calculators and tools to help decision-makers reduce petroleum use) and FASTSim, the Future Automotive Systems Technology Simulator, a simulation tool.

  4. Simulation Based Optimization of Complex Monolithic Composite Structures Using Cellular Core Technology

    NASA Astrophysics Data System (ADS)

    Hickmott, Curtis W.

    Cellular core tooling is a new technology which has the capability to manufacture complex integrated monolithic composite structures. This novel tooling method utilizes thermoplastic cellular cores as inner tooling. The semi-rigid nature of the cellular cores makes them convenient for lay-up, and under autoclave temperature and pressure they soften and expand providing uniform compaction on all surfaces including internal features such as ribs and spar tubes. This process has the capability of developing fully optimized aerospace structures by reducing or eliminating assembly using fasteners or bonded joints. The technology is studied in the context of evaluating its capabilities, advantages, and limitations in developing high quality structures. The complex nature of these parts has led to development of a model using the Finite Element Analysis (FEA) software Abaqus and the plug-in COMPRO Common Component Architecture (CCA) provided by Convergent Manufacturing Technologies. This model utilizes a "virtual autoclave" technique to simulate temperature profiles, resin flow paths, and ultimately deformation from residual stress. A model has been developed simulating the temperature profile during curing of composite parts made with the cellular core technology. While modeling of composites has been performed in the past, this project will look to take this existing knowledge and apply it to this new manufacturing method capable of building more complex parts and develop a model designed specifically for building large, complex components with a high degree of accuracy. The model development has been carried out in conjunction with experimental validation. A double box beam structure was chosen for analysis to determine the effects of the technology on internal ribs and joints. Double box beams were manufactured and sectioned into T-joints for characterization. Mechanical behavior of T-joints was performed using the T-joint pull-off test and compared to traditional tooling methods. Components made with the cellular core tooling method showed an improved strength at the joints. It is expected that this knowledge will help optimize the processing of complex, integrated structures and benefit applications in aerospace where lighter, structurally efficient components would be advantageous.

  5. The FluxCompensator: Making Radiative Transfer Models of Hydrodynamical Simulations Directly Comparable to Real Observations

    NASA Astrophysics Data System (ADS)

    Koepferl, Christine M.; Robitaille, Thomas P.

    2017-11-01

    When modeling astronomical objects throughout the universe, it is important to correctly treat the limitations of the data, for instance finite resolution and sensitivity. In order to simulate these effects, and to make radiative transfer models directly comparable to real observations, we have developed an open-source Python package called the FluxCompensator that enables the post-processing of the output of 3D Monte Carlo radiative transfer codes, such as Hyperion. With the FluxCompensator, realistic synthetic observations can be generated by modeling the effects of convolution with arbitrary point-spread functions, transmission curves, finite pixel resolution, noise, and reddening. Pipelines can be applied to compute synthetic observations that simulate observatories, such as the Spitzer Space Telescope or the Herschel Space Observatory. Additionally, this tool can read in existing observations (e.g., FITS format) and use the same settings for the synthetic observations. In this paper, we describe the package as well as present examples of such synthetic observations.
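
    The post-processing chain described above (PSF convolution, finite pixels, noise) is easy to sketch generically; the snippet below uses plain numpy/scipy rather than the FluxCompensator's own API, and all instrument parameters are illustrative.

    ```python
    # Generic sketch of synthetic-observation post-processing steps.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def synthetic_observation(model_image, psf_sigma_pix=2.0,
                              rebin=4, noise_rms=1e-3, seed=1):
        img = gaussian_filter(model_image, psf_sigma_pix)    # instrument PSF
        # Rebin to coarser detector pixels by block-averaging.
        ny, nx = (s // rebin * rebin for s in img.shape)
        img = img[:ny, :nx].reshape(ny // rebin, rebin,
                                    nx // rebin, rebin).mean(axis=(1, 3))
        rng = np.random.default_rng(seed)
        return img + rng.normal(0.0, noise_rms, img.shape)   # detector noise
    ```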

  6. Adaptive sparse grid approach for the efficient simulation of pulsed eddy current testing inspections

    NASA Astrophysics Data System (ADS)

    Miorelli, Roberto; Reboud, Christophe

    2018-04-01

    Pulsed Eddy Current Testing (PECT) is a popular NonDestructive Testing (NDT) technique for applications like corrosion monitoring in the oil and gas industry, or rivet inspection in the aeronautic area. Its particularity is to use a transient excitation, which allows more information to be retrieved from the piece than conventional harmonic ECT, in a simpler and cheaper way than multi-frequency ECT setups. Efficient modeling tools prove, as usual, very useful for optimizing experimental sensors and devices or evaluating their performance, for instance. This paper proposes an efficient simulation of PECT signals based on standard time harmonic solvers and use of an Adaptive Sparse Grid (ASG) algorithm. An adaptive sampling of the ECT signal spectrum is performed with this algorithm; the complete spectrum is then interpolated from this sparse representation, and PECT signals are finally synthesized by means of an inverse Fourier transform. Simulation results corresponding to existing industrial configurations are presented, and the performance of the strategy is discussed by comparison to reference results.
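
    The synthesis step can be sketched end to end: call a harmonic solver at a sparse set of frequencies, interpolate the full spectrum, and inverse-FFT against the excitation spectrum. In the sketch below the 'solver' is a stand-in single-pole response and the frequency sampling is plain logarithmic rather than adaptive, so only the overall pipeline mirrors the paper.

    ```python
    # Sketch: sparse frequency sampling -> interpolated spectrum -> transient.
    import numpy as np

    def harmonic_solver(f):                      # stand-in for the ECT solver
        return 1.0 / (1.0 + 1j * f / 5.0e3)      # single-pole response, fc = 5 kHz

    n, dt = 4096, 1e-6                           # time grid: 4096 samples, 1 us
    freqs = np.fft.rfftfreq(n, dt)               # full spectrum grid
    sparse_f = np.geomspace(1.0, freqs[-1], 25)  # 25 solver calls instead of 2049
    sparse_vals = harmonic_solver(sparse_f)

    # Interpolate real and imaginary parts separately onto the full grid.
    spectrum = (np.interp(freqs, sparse_f, sparse_vals.real) +
                1j * np.interp(freqs, sparse_f, sparse_vals.imag))
    pulse = np.where(np.arange(n) * dt < 1e-3, 1.0, 0.0)       # 1 ms excitation
    signal = np.fft.irfft(spectrum * np.fft.rfft(pulse), n)    # transient PECT signal
    ```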

  7. The FluxCompensator: Making Radiative Transfer Models of Hydrodynamical Simulations Directly Comparable to Real Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koepferl, Christine M.; Robitaille, Thomas P., E-mail: koepferl@usm.lmu.de

    When modeling astronomical objects throughout the universe, it is important to correctly treat the limitations of the data, for instance finite resolution and sensitivity. In order to simulate these effects, and to make radiative transfer models directly comparable to real observations, we have developed an open-source Python package called the FluxCompensator that enables the post-processing of the output of 3D Monte Carlo radiative transfer codes, such as Hyperion. With the FluxCompensator, realistic synthetic observations can be generated by modeling the effects of convolution with arbitrary point-spread functions, transmission curves, finite pixel resolution, noise, and reddening. Pipelines can be applied to compute synthetic observations that simulate observatories, such as the Spitzer Space Telescope or the Herschel Space Observatory. Additionally, this tool can read in existing observations (e.g., FITS format) and use the same settings for the synthetic observations. In this paper, we describe the package as well as present examples of such synthetic observations.

  8. A computational workflow for designing silicon donor qubits

    DOE PAGES

    Humble, Travis S.; Ericson, M. Nance; Jakowski, Jacek; ...

    2016-09-19

    Developing devices that can reliably and accurately demonstrate the principles of superposition and entanglement is an on-going challenge for the quantum computing community. Modeling and simulation offer attractive means of testing early device designs and establishing expectations for operational performance. However, the complex integrated material systems required by quantum device designs are not captured by any single existing computational modeling method. We examine the development and analysis of a multi-staged computational workflow that can be used to design and characterize silicon donor qubit systems with modeling and simulation. Our approach integrates quantum chemistry calculations with electrostatic field solvers to perform detailed simulations of a phosphorus dopant in silicon. We show how atomistic details can be synthesized into an operational model for the logical gates that define quantum computation in this particular technology. In conclusion, the resulting computational workflow realizes a design tool for silicon donor qubits that can help verify and validate current and near-term experimental devices.

  9. The Monte Carlo photoionization and moving-mesh radiation hydrodynamics code CMACIONIZE

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, B.; Wood, K.

    2018-04-01

    We present the public Monte Carlo photoionization and moving-mesh radiation hydrodynamics code CMACIONIZE, which can be used to simulate the self-consistent evolution of HII regions surrounding young O and B stars, or other sources of ionizing radiation. The code combines a Monte Carlo photoionization algorithm, which uses a complex mix of hydrogen, helium and several coolants in order to self-consistently solve for the ionization and temperature balance at any given time, with a standard first order hydrodynamics scheme. The code can be run as a post-processing tool to get the line emission from an existing simulation snapshot, but can also be used to run full radiation hydrodynamical simulations. Both the radiation transfer and the hydrodynamics are implemented in a general way that is independent of the grid structure that is used to discretize the system, allowing the code to be run both as a standard fixed grid code and as a moving-mesh code.

  10. Exploring protein kinase conformation using swarm-enhanced sampling molecular dynamics.

    PubMed

    Atzori, Alessio; Bruce, Neil J; Burusco, Kepa K; Wroblowski, Berthold; Bonnet, Pascal; Bryce, Richard A

    2014-10-27

    Protein plasticity, while often linked to biological function, also provides opportunities for rational design of selective and potent inhibitors of their function. The application of computational methods to the prediction of concealed protein concavities is challenging, as the motions involved can be significant and occur over long time scales. Here we introduce the swarm-enhanced sampling molecular dynamics (sesMD) method as a tool to improve sampling of conformational landscapes. In this approach, a swarm of replica simulations interact cooperatively via a set of pairwise potentials incorporating attractive and repulsive components. We apply the sesMD approach to explore the conformations of the DFG motif in the protein p38α mitogen-activated protein kinase. In contrast to multiple MD simulations, sesMD trajectories sample a range of DFG conformations, some of which map onto existing crystal structures. Simulated structures intermediate between the DFG-in and DFG-out conformations are predicted to have druggable pockets of interest for structure-based ligand design.
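
    The swarm coupling can be illustrated with a pairwise potential in collective-variable (CV) space that is repulsive at short range (spreading the swarm) and attractive at longer range (keeping it coherent). The Morse-like form and constants below are illustrative, not the published sesMD potentials:

    ```python
    # Illustrative swarm forces between replica simulations in CV space.
    import numpy as np

    def swarm_forces(cvs, eps=1.0, r0=1.0):
        """cvs: (n_replicas, n_cv) array; returns per-replica forces in CV space."""
        cvs = np.asarray(cvs, dtype=float)
        f = np.zeros_like(cvs)
        n = len(cvs)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                rij = cvs[j] - cvs[i]
                r = np.linalg.norm(rij) + 1e-12
                # Morse-like pair force: repulsive for r < r0, attractive for
                # r > r0, vanishing at large separation.
                mag = 2 * eps * (np.exp(-2 * (r - r0)) - np.exp(-(r - r0)))
                f[i] -= mag * rij / r
        return f
    ```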

  11. Non-invasive localization of atrial ectopic beats by using simulated body surface P-wave integral maps

    PubMed Central

    Godoy, Eduardo J.; Lozano, Miguel; Martínez-Mateu, Laura; Atienza, Felipe; Saiz, Javier; Sebastian, Rafael

    2017-01-01

    Non-invasive localization of continuous atrial ectopic beats remains a cornerstone for the treatment of atrial arrhythmias. The lack of accurate tools to guide electrophysiologists leads to an increase in the recurrence rate of ablation procedures. Existing approaches are based on the analysis of the P-waves main characteristics and the forward body surface potential maps (BSPMs) or on the inverse estimation of the electric activity of the heart from those BSPMs. These methods have not provided an efficient and systematic tool to localize ectopic triggers. In this work, we propose the use of machine learning techniques to spatially cluster and classify ectopic atrial foci into clearly differentiated atrial regions by using the body surface P-wave integral map (BSPiM) as a biomarker. Our simulated results show that ectopic foci with similar BSPiM naturally cluster into differentiated non-intersected atrial regions and that new patterns could be correctly classified with an accuracy of 97% when considering 2 clusters and 96% for 4 clusters. Our results also suggest that an increase in the number of clusters is feasible at the cost of decreasing accuracy. PMID:28704537
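
    The cluster-then-classify pipeline described above can be sketched with standard scikit-learn components. The stand-in random 'BSPiM' vectors below merely exercise the pipeline; in the study the features come from forward-simulated ectopic beats:

    ```python
    # Sketch: cluster simulated BSPiM vectors, then classify new maps.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_foci, n_leads = 400, 64
    maps = rng.normal(size=(n_foci, n_leads))        # stand-in BSPiM vectors

    # Step 1: group foci whose surface maps look alike.
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(maps)

    # Step 2: train a classifier to assign new maps to those regions.
    Xtr, Xte, ytr, yte = train_test_split(maps, labels, test_size=0.25,
                                          random_state=0)
    clf = RandomForestClassifier(random_state=0).fit(Xtr, ytr)
    print(f"held-out accuracy: {clf.score(Xte, yte):.2f}")
    ```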

  12. Stability Limits of Circumbinary Planets: Is There a Pile-up in the Kepler CBPs?

    NASA Astrophysics Data System (ADS)

    Quarles, B.; Satyal, S.; Kostov, V.; Kaib, N.; Haghighipour, N.

    2018-04-01

    The stability limit for circumbinary planets (CBPs) is not well defined and can depend on initial parameters defining either the planetary orbit and/or the inner binary orbit. We expand on the work of Holman & Wiegert (1999) to develop numerical tools for quick, easy, and accurate determination of the stability limit. The results of our simulations, as well as our numerical tools, are available to the community through Zenodo and GitHub, respectively. We employ a grid interpolation method based on ∼150 million full N-body simulations of initially circular, coplanar systems and compare to the nine known Kepler CBP systems. Using a formalism from planet packing studies, we find that 55% of the Kepler CBP systems allow for an additional equal-mass planet to potentially exist on an interior orbit relative to the observed planet. Therefore, we do not find strong evidence for a pile-up in the Kepler CBP systems and more detections are needed to adequately characterize the formation mechanisms for the CBP population. Observations from the Transiting Exoplanet Survey Satellite are expected to substantially increase the number of detections using the unique geometry of CBP systems, where multiple transits can occur during a single conjunction.
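
    For reference, the Holman & Wiegert (1999) fit that this work builds on gives the critical semimajor axis for circumbinary (P-type) orbits, in units of the binary separation, as a function of binary eccentricity e and mass ratio mu = m2/(m1+m2):

    ```python
    # Holman & Wiegert (1999) circumbinary stability fit; coefficients as
    # published. Orbits with a < a_c (in binary separations) are unstable.
    def hw99_critical_sma(e, mu):
        return (1.60 + 5.10 * e - 2.22 * e**2 + 4.12 * mu
                - 4.27 * e * mu - 5.09 * mu**2 + 4.61 * e**2 * mu**2)

    # e.g., a Kepler-16-like binary (e ~ 0.16, mu ~ 0.23):
    print(hw99_critical_sma(0.16, 0.23))   # ~2.9 binary separations
    ```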

  13. Cell lineage tracing in the developing enteric nervous system: superstars revealed by experiment and simulation

    PubMed Central

    Cheeseman, Bevan L.; Zhang, Dongcheng; Binder, Benjamin J.; Newgreen, Donald F.; Landman, Kerry A.

    2014-01-01

    Cell lineage tracing is a powerful tool for understanding how proliferation and differentiation of individual cells contribute to population behaviour. In the developing enteric nervous system (ENS), enteric neural crest (ENC) cells move and undergo massive population expansion by cell division within self-growing mesenchymal tissue. We show that single ENC cells labelled to follow clonality in the intestine reveal extraordinary and unpredictable variation in number and position of descendant cells, even though ENS development is highly predictable at the population level. We use an agent-based model to simulate ENC colonization and obtain agent lineage tracing data, which we analyse using econometric data analysis tools. In all realizations, a small proportion of identical initial agents accounts for a substantial proportion of the total final agent population. We term these individuals superstars. Their existence is consistent across individual realizations and is robust to changes in model parameters. This inequality of outcome is amplified at elevated proliferation rate. The experiments and model suggest that stochastic competition for resources is an important concept when understanding biological processes which feature high levels of cell proliferation. The results have implications for cell-fate processes in the ENS. PMID:24501272
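
    The 'superstar' effect can be reproduced with a few lines of agent-based simulation: identical founders proliferate stochastically, growth is proportional to current clone size (a rich-get-richer dynamic), and a handful of clones come to dominate. Parameters are illustrative, not fitted to the ENS data:

    ```python
    # Minimal agent-based sketch of clonal inequality among identical founders.
    import numpy as np

    rng = np.random.default_rng(42)
    n_founders, capacity, p_divide = 20, 5000, 0.4

    clones = list(range(n_founders))         # clone label of every live cell
    while len(clones) < capacity:
        i = rng.integers(len(clones))        # pick a random live cell
        if rng.random() < p_divide:
            clones.append(clones[i])         # daughter inherits the clone label

    sizes = np.bincount(clones, minlength=n_founders)
    share = np.sort(sizes)[::-1][:3].sum() / sizes.sum()
    print(f"top 3 of {n_founders} founders hold {share:.0%} of the population")
    ```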

  14. Extreme groundwater levels caused by extreme weather conditions - the highest ever measured groundwater levels in Middle Germany and their management

    NASA Astrophysics Data System (ADS)

    Reinstorf, F.

    2016-12-01

    Extreme weather conditions during the years 2009-2011, in combination with changes in regional water management and possible impacts of climate change, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive waterlogging, especially in urban areas near rivers, where it caused severe problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analytical situation, the development of a management tool and the implementation of a groundwater management concept are shown. The central tool is a coupled water budget - groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with very intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations that takes into consideration sustainable and environmentally sound solutions, mainly on the basis of passive measures.

  15. Extreme groundwater levels caused by extreme weather conditions - the highest ever measured groundwater levels in Middle Germany and their management

    NASA Astrophysics Data System (ADS)

    Reinstorf, Frido; Kramer, Stefanie; Koch, Thomas; Seifert, Sven; Monninkhoff, Bertram; Pfützner, Bernd

    2017-04-01

    Extreme weather conditions during the years 2009-2011, in combination with changes in regional water management and possible impacts of climate change, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive waterlogging, especially in urban areas near rivers, where it caused severe problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analytical situation, the development of a management tool and the implementation of a groundwater management concept are shown. The central tool is a coupled water budget - groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with very intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations that takes into consideration sustainable and environmentally sound solutions, mainly on the basis of passive measures.

  16. Capturing remote mixing due to internal tides using multi-scale modeling tool: SOMAR-LES

    NASA Astrophysics Data System (ADS)

    Santilli, Edward; Chalamalla, Vamsi; Scotti, Alberto; Sarkar, Sutanu

    2016-11-01

    Internal tides that are generated during the interaction of an oscillating barotropic tide with the bottom bathymetry dissipate only a fraction of their energy near the generation region. The rest is radiated away in the form of low- and high-mode internal tides. These internal tides dissipate energy at remote locations when they interact with the upper ocean pycnocline, continental slope, and large-scale eddies. Capturing the wide range of length and time scales involved during the life cycle of internal tides is computationally very expensive. A recently developed multi-scale modeling tool called SOMAR-LES combines the adaptive grid refinement features of SOMAR with the turbulence modeling features of a Large Eddy Simulation (LES) to capture multi-scale processes at a reduced computational cost. Numerical simulations of internal tide generation at idealized bottom bathymetries are performed to demonstrate this multi-scale modeling technique. Although each of the remote mixing phenomena has been considered independently in previous studies, this work aims to capture remote mixing processes during the life cycle of an internal tide in more realistic settings, by allowing multi-level (coarse and fine) grids to co-exist and exchange information during the time-stepping process.
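
    A toy one-dimensional illustration of the multi-level idea, assuming simple upwind advection: a fine patch co-exists with the coarse grid, subcycles in time, takes its boundary value from the coarse solution and restricts its result back. The actual SOMAR-LES scheme is far more sophisticated; this only shows the coarse-fine exchange pattern.

    ```python
    # 1D linear advection with a refined patch over part of the domain.
    import numpy as np

    L, nc, r = 1.0, 100, 4              # domain, coarse cells, refinement ratio
    u, cfl, steps = 1.0, 0.5, 60        # speed, CFL number, coarse time steps
    dxc = L / nc
    x = np.linspace(0.0, L, nc, endpoint=False)
    coarse = np.exp(-200.0 * (x - 0.2) ** 2)    # pulse left of the fine patch

    lo, hi = 40, 60                     # fine patch spans coarse cells [lo, hi)
    fine = np.repeat(coarse[lo:hi], r)  # prolongation: coarse -> fine

    def upwind(q, inflow, c):
        """One explicit upwind step; 'inflow' is the left ghost value."""
        qm = np.concatenate(([inflow], q[:-1]))
        return q - c * (q - qm)

    for _ in range(steps):
        ghost = coarse[lo - 1]          # coarse solution feeds the patch boundary
        coarse = upwind(coarse, coarse[-1], cfl)
        for _ in range(r):              # the fine grid subcycles r times per step
            fine = upwind(fine, ghost, cfl)
        coarse[lo:hi] = fine.reshape(-1, r).mean(axis=1)   # restriction

    print(f"pulse peak now at x = {x[np.argmax(coarse)]:.2f} (inside the patch)")
    ```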

  17. Assured communications and combat resiliency: the relationship between effective national communications and combat efficiency

    NASA Astrophysics Data System (ADS)

    Allgood, Glenn O.; Kuruganti, Phani Teja; Nutaro, James; Saffold, Jay

    2009-05-01

    Combat resiliency is the ability of a commander to prosecute, control, and consolidate his or her sphere of influence in adverse and changing conditions. To support this, an infrastructure must exist that allows the commander to view the world in varying degrees of granularity with sufficient levels of detail to permit confidence estimates to be levied against decisions and courses of action. Such an infrastructure will include the ability to effectively communicate context and relevance within and across the battle space. Achieving this will require careful thought, planning, and understanding of a network and its capacity limitations in post-event command and control. Relevance and impact on any existing infrastructure must be fully understood prior to deployment to exploit the system's full capacity and capabilities. In this view, the combat communication network is considered an integral part of our national communication network and infrastructure. This paper describes an analytical tool set developed at ORNL and RNI incorporating complexity theory, advanced communications modeling, simulation, and visualization technologies that could be used as a pre-planning tool or post-event reasoning application to support response and containment.
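
    A hedged sketch of one pre-planning question such a tool set can address, assuming a random graph as a stand-in network topology: how much of a communication network remains mutually reachable as nodes fail after an event.

    ```python
    # Measure the largest connected component as nodes are removed at random.
    import random
    import networkx as nx

    random.seed(42)
    G = nx.erdos_renyi_graph(n=200, p=0.03, seed=42)    # stand-in topology
    n_total = G.number_of_nodes()

    def reachable_fraction(graph):
        """Fraction of original nodes in the largest connected component."""
        if graph.number_of_nodes() == 0:
            return 0.0
        return max(len(c) for c in nx.connected_components(graph)) / n_total

    nodes = list(G.nodes)
    random.shuffle(nodes)
    for frac in (0.0, 0.1, 0.2, 0.3, 0.4):              # escalating failures
        H = G.copy()
        H.remove_nodes_from(nodes[: int(frac * n_total)])
        print(f"{frac:4.0%} of nodes failed -> {reachable_fraction(H):.0%} "
              "of the network still mutually reachable")
    ```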

  18. A combined geostatistical-optimization model for the optimal design of a groundwater quality monitoring network

    NASA Astrophysics Data System (ADS)

    Kolosionis, Konstantinos; Papadopoulou, Maria P.

    2017-04-01

    Monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation due to extensive agricultural activities. In this work, a simulation-optimization framework is developed based on heuristic optimization methodologies and geostatistical modeling approaches to obtain an optimal design for a groundwater quality monitoring network. Groundwater quantity and quality data obtained from 43 existing observation locations at 3 different hydrological periods in the Mires basin in Crete, Greece are used in the proposed framework, via regression kriging, to develop the spatial distribution of nitrate concentration in the aquifer of interest. Based on the existing groundwater quality mapping, the proposed optimization tool determines a cost-effective observation-well network that contributes significant information to water managers and authorities. Observation wells that add little or no beneficial information to the groundwater level and quality mapping of the area can be eliminated, using estimation uncertainty and statistical error metrics, without affecting the assessment of the groundwater quality. Given the high maintenance cost of groundwater monitoring networks, the proposed tool could be used by water regulators in the decision-making process to obtain an efficient network design.
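
    A minimal sketch of the network-reduction idea under stated assumptions: synthetic well data, a Gaussian process standing in for regression kriging, and greedy elimination standing in for the paper's heuristic optimization.

    ```python
    # Greedily drop the observation well whose removal least increases the
    # mean prediction uncertainty of the mapped field.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(3)
    wells = rng.uniform(0.0, 10.0, (43, 2))     # 43 wells, as in the study area
    nitrate = np.sin(wells[:, 0]) + 0.1 * rng.normal(size=43)  # synthetic field
    grid = np.column_stack([g.ravel() for g in
                            np.meshgrid(np.linspace(0, 10, 15),
                                        np.linspace(0, 10, 15))])

    def mean_uncertainty(idx):
        """Mean predictive std over the map when only wells 'idx' are kept."""
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0))
        gp.fit(wells[idx], nitrate[idx])
        return gp.predict(grid, return_std=True)[1].mean()

    keep = list(range(len(wells)))
    while len(keep) > 30:                       # target network size
        scores = [mean_uncertainty([i for i in keep if i != j]) for j in keep]
        keep.pop(int(np.argmin(scores)))        # drop the least informative well
    print(f"{len(keep)} wells retained; map std = {mean_uncertainty(keep):.3f}")
    ```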

  19. SimulCAT: Windows Software for Simulating Computerized Adaptive Test Administration

    ERIC Educational Resources Information Center

    Han, Kyung T.

    2012-01-01

    Most, if not all, computerized adaptive testing (CAT) programs use simulation techniques to develop and evaluate CAT program administration and operations, but such simulation tools are rarely available to the public. Up to now, several software tools have been available to conduct CAT simulations for research purposes; however, these existing…

  20. Review of Real-Time Simulator and the Steps Involved for Implementation of a Model from MATLAB/SIMULINK to Real-Time

    NASA Astrophysics Data System (ADS)

    Mikkili, Suresh; Panda, Anup Kumar; Prattipati, Jayanthi

    2015-06-01

    Nowadays researchers want to develop their models in a real-time environment. Simulation tools have been widely used for the design and improvement of electrical systems since the mid-twentieth century, and their evolution has progressed in step with the evolution of computing technologies. In recent years, computing technologies have improved dramatically in performance and become widely available at a steadily decreasing cost. Consequently, simulation tools have also seen dramatic performance gains and steady cost decreases. Researchers and engineers now have access to affordable, high-performance simulation tools that were previously cost-prohibitive for all but the largest manufacturers. This work introduces a specific class of digital simulator known as a real-time simulator by answering the questions "what is real-time simulation", "why is it needed" and "how does it work". The latest trend in real-time simulation consists of exporting simulation models to FPGAs. In this article, the steps involved in implementing a model from MATLAB/SIMULINK in real time are described in detail.

  1. A Multiple-Sessions Interactive Computer-Based Learning Tool for Ability Cultivation in Circuit Simulation

    ERIC Educational Resources Information Center

    Xu, Q.; Lai, L. L.; Tse, N. C. F.; Ichiyanagi, K.

    2011-01-01

    An interactive computer-based learning tool with multiple sessions is proposed in this paper, which teaches students to think and helps them recognize the merits and limitations of simulation tools so as to improve their practical abilities in electrical circuit simulation based on the case of a power converter with progressive problems. The…

  2. ScintSim1: A new Monte Carlo simulation code for transport of optical photons in 2D arrays of scintillation detectors

    PubMed Central

    Mosleh-Shirazi, Mohammad Amin; Zarrini-Monfared, Zinat; Karbasi, Sareh; Zamani, Ali

    2014-01-01

    Two-dimensional (2D) arrays of thick segmented scintillators are of interest as X-ray detectors for both 2D and 3D image-guided radiotherapy (IGRT). Their detection process involves ionizing radiation energy deposition followed by production and transport of optical photons. Only a very limited number of optical Monte Carlo simulation models exist, which has limited the number of modeling studies that have considered both stages of the detection process. We present ScintSim1, an in-house optical Monte Carlo simulation code for 2D arrays of scintillation crystals, developed in the MATLAB programming environment. The code was rewritten and revised based on an existing program for single-element detectors, with the additional capability to model 2D arrays of elements with configurable dimensions, materials, etc. The code generates and follows each optical photon history through the detector element (and, in the case of cross-talk, the surrounding ones) until it reaches a configurable receptor or is attenuated. The new model was verified by testing against relevant theoretically known behaviors or quantities and against the results of a validated single-element model. For both sets of comparisons, the discrepancies in the calculated quantities were all <1%. The results validate the accuracy of the new code, which is a useful tool in scintillation detector optimization. PMID:24600168
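
    A minimal sketch of the optical-photon transport idea: isotropic photons born in a scintillator element random-walk to a bottom receptor, with exponential bulk attenuation and lossy specular walls. The 2D geometry and parameter values are illustrative, not ScintSim1's.

    ```python
    # Follow each photon history until it reaches the receptor or is attenuated.
    import numpy as np

    rng = np.random.default_rng(7)
    W, H = 1.0, 5.0          # element width and height (mm)
    att_len = 50.0           # bulk attenuation length (mm)
    wall_refl = 0.95         # reflectivity of side and top walls

    def simulate(n_photons=20000):
        hits = 0
        for _ in range(n_photons):
            x, z = rng.uniform(0, W), rng.uniform(0, H)   # emission point
            th = rng.uniform(0, 2 * np.pi)                # emission direction
            dx, dz = np.cos(th), np.sin(th)
            budget = rng.exponential(att_len)             # absorption path length
            while budget > 0:
                # distance to each boundary along the current direction
                tx = ((W - x) / dx) if dx > 0 else (-x / dx) if dx < 0 else np.inf
                tz = ((H - z) / dz) if dz > 0 else (-z / dz) if dz < 0 else np.inf
                t = min(tx, tz)
                if t > budget:                            # absorbed in the bulk
                    break
                x, z, budget = x + t * dx, z + t * dz, budget - t
                if tz <= tx and dz < 0:                   # bottom receptor reached
                    hits += 1
                    break
                if rng.random() > wall_refl:              # lost at a lossy wall
                    break
                if tx < tz:
                    dx = -dx                              # specular side reflection
                else:
                    dz = -dz                              # specular top reflection
        return hits / n_photons

    print(f"detection efficiency ~ {simulate():.2%}")
    ```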

  3. ScintSim1: A new Monte Carlo simulation code for transport of optical photons in 2D arrays of scintillation detectors.

    PubMed

    Mosleh-Shirazi, Mohammad Amin; Zarrini-Monfared, Zinat; Karbasi, Sareh; Zamani, Ali

    2014-01-01

    Two-dimensional (2D) arrays of thick segmented scintillators are of interest as X-ray detectors for both 2D and 3D image-guided radiotherapy (IGRT). Their detection process involves ionizing radiation energy deposition followed by production and transport of optical photons. Only a very limited number of optical Monte Carlo simulation models exist, which has limited the number of modeling studies that have considered both stages of the detection process. We present ScintSim1, an in-house optical Monte Carlo simulation code for 2D arrays of scintillation crystals, developed in the MATLAB programming environment. The code was rewritten and revised based on an existing program for single-element detectors, with the additional capability to model 2D arrays of elements with configurable dimensions, materials, etc. The code generates and follows each optical photon history through the detector element (and, in the case of cross-talk, the surrounding ones) until it reaches a configurable receptor or is attenuated. The new model was verified by testing against relevant theoretically known behaviors or quantities and against the results of a validated single-element model. For both sets of comparisons, the discrepancies in the calculated quantities were all <1%. The results validate the accuracy of the new code, which is a useful tool in scintillation detector optimization.

  4. Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models.

    PubMed

    Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A

    2014-01-01

    Multiple software programs are available for designing and running large-scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which can increase model development time, IT costs and so on. It is therefore desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time, the full model specifics are preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database stores and manages all aspects of the system, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.
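
    A hedged sketch of the execution pattern described above, in which every model parameter is an input to a self-contained executable so a generic platform can dispatch many runs without the original modeling tool. The executable name, flags and output convention here are hypothetical.

    ```python
    # Dispatch parameterized runs of a compiled, self-contained model.
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    def run_simulation(params: dict) -> str:
        """Launch one model run; returns the path to its output file."""
        out = f"run_dose{params['dose_mg']}.csv"        # hypothetical naming
        args = ["./model_executable", "--out", out] + [
            f"--{name}={value}" for name, value in params.items()]
        subprocess.run(args, check=True)                # all parameters as inputs
        return out

    # A small virtual-patient sweep, as the platform back-end would submit it.
    sweep = [{"dose_mg": d, "body_weight_kg": 70} for d in (250, 500, 1000)]
    with ThreadPoolExecutor(max_workers=3) as pool:
        outputs = list(pool.map(run_simulation, sweep))
    print(outputs)
    ```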

  5. gPKPDSim: a SimBiology®-based GUI application for PKPD modeling in drug development.

    PubMed

    Hosseini, Iraj; Gajjala, Anita; Bumbaca Yadav, Daniela; Sukumaran, Siddharth; Ramanujan, Saroja; Paxson, Ricardo; Gadkar, Kapil

    2018-04-01

    Modeling and simulation (M&S) is increasingly used in drug development to characterize pharmacokinetic-pharmacodynamic (PKPD) relationships and support various efforts such as target feasibility assessment, molecule selection, human PK projection, and preclinical and clinical dose and schedule determination. While model development typically requires mathematical modeling expertise, model exploration and simulations could in many cases be performed by scientists in various disciplines to support the design, analysis and interpretation of experimental studies. To this end, we have developed a versatile graphical user interface (GUI) application to enable easy use of any model constructed in SimBiology® to execute various common PKPD analyses. The MATLAB®-based GUI application, called gPKPDSim, has a single-screen interface and provides functionalities including simulation, data fitting (parameter estimation), population simulation (exploring the impact of parameter variability on the outputs of interest), and non-compartmental PK analysis. Further, gPKPDSim is a user-friendly tool with capabilities including interactive visualization, exporting of results and generation of presentation-ready figures. gPKPDSim was designed primarily for use in preclinical and translational drug development, although broader applications exist. gPKPDSim is a MATLAB®-based open-source application and is publicly available to download from MATLAB® Central™. We illustrate the use and features of gPKPDSim using multiple PKPD models to demonstrate the wide applications of this tool in pharmaceutical sciences. Overall, gPKPDSim provides an integrated, multi-purpose, user-friendly GUI application to enable efficient use of PKPD models by scientists from various disciplines, regardless of their modeling expertise.
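
    A minimal sketch of two gPKPDSim-style analyses, simulation and population simulation, using a hypothetical one-compartment oral-dosing PK model rather than an actual SimBiology® project.

    ```python
    # One-compartment oral PK model; population run adds variability on CL.
    import numpy as np
    from scipy.integrate import solve_ivp

    def pk_rhs(t, y, ka, cl, v):
        gut, central = y                     # amounts in gut and plasma
        return [-ka * gut, ka * gut - (cl / v) * central]

    def concentration(t_eval, dose=100.0, ka=1.0, cl=5.0, v=30.0):
        sol = solve_ivp(pk_rhs, (0, t_eval[-1]), [dose, 0.0],
                        t_eval=t_eval, args=(ka, cl, v))
        return sol.y[1] / v                  # plasma concentration

    t = np.linspace(0, 24, 97)

    # Population simulation: log-normal variability on clearance.
    rng = np.random.default_rng(0)
    pop = np.array([concentration(t, cl=5.0 * np.exp(0.3 * rng.normal()))
                    for _ in range(200)])
    p5, p50, p95 = np.percentile(pop, [5, 50, 95], axis=0)
    k = p50.argmax()
    print(f"median Cmax ~ {p50[k]:.2f}; 90% interval [{p5[k]:.2f}, {p95[k]:.2f}]")
    ```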

  6. A knowledge platform to inform on the effects of trawling on benthic communities

    NASA Astrophysics Data System (ADS)

    Muntadas, Alba; Lample, Michel; Demestre, Montserrat; Ballé-Béganton, Johanna; de Juan, Silvia; Maynou, Francesc; Bailly, Denis

    2018-02-01

    For a successful implementation of an Ecosystem Approach to Fisheries (EAF) management, it is necessary that all stakeholders involved in fisheries management are aware of the implications of fishing impacts on ecosystems and agree with the adopted measures to mitigate these impacts. In this context, there is a need for tools to share knowledge on the ecosystem effects of fisheries among these stakeholders. When managing bottom trawl fisheries under an EAF framework, one of the main concerns is the direct and indirect consequences of trawling impacts on benthic ecosystems. We developed a platform, using the ExtendSim® software with a user-friendly interface, that combines a simulation model based on existing knowledge, data collection and representation of predicted trawling impacts on the seabed. The platform aims to be a deliberation-support tool for fisheries stakeholders and, simultaneously, to raise public awareness of the need for good benthic community knowledge to appropriately inform EAF management plans. The simulation procedure assumes that trawling affects benthic communities with an intensity that depends on the level of fishing effort exerted on them and on the habitat characteristics (i.e. sediment grain size). The data behind the simulation come from epifaunal samples from 18 study sites located on Mediterranean continental shelves subjected to different levels of fishing effort. In this work, we present the simulation outputs of a 50% fishing effort increase (and decrease) at four of the study sites, which cover different habitats and different levels of fishing effort. We discuss the platform's strengths and weaknesses and potential future developments.
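
    A hedged sketch of the kind of rule such a simulation can encode, assuming illustrative parameters only: community state declines with trawling intensity, with sensitivity and recovery rate depending on habitat (sediment grain size).

    ```python
    # Toy effort-vs-habitat benthic state model; not the ExtendSim platform.
    def simulate_state(effort, sensitivity, recovery, years=20, state0=1.0):
        """Relative benthic state in [0, 1] under constant annual effort."""
        s = state0
        for _ in range(years):
            s += recovery * s * (1.0 - s) - sensitivity * effort * s
            s = max(s, 0.0)
        return s

    habitats = {"coarse sand": (0.05, 0.4), "fine mud": (0.12, 0.2)}
    for name, (sens, rec) in habitats.items():
        for effort in (0.5, 1.0, 1.5):      # 50% decrease / baseline / increase
            print(f"{name}, effort x{effort}: "
                  f"state = {simulate_state(effort, sens, rec):.2f}")
    ```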

  7. Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models

    PubMed Central

    Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A.

    2014-01-01

    Multiple software programs are available for designing and running large-scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which can increase model development time, IT costs and so on. It is therefore desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time, the full model specifics are preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database stores and manages all aspects of the system, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients. PMID:25374542

  8. Computer-aided system design

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  9. Monte Carlo simulation of proton track structure in biological matter

    DOE PAGES

    Quinto, Michele A.; Monti, Juan M.; Weck, Philippe F.; ...

    2017-05-25

    Here, understanding the radiation-induced effects at the cellular and subcellular levels remains crucial for predicting the evolution of irradiated biological matter. In this context, Monte Carlo track-structure simulations have rapidly emerged among the most suitable and powerful tools. However, most existing Monte Carlo track-structure codes rely heavily on the use of semi-empirical cross sections as well as water as a surrogate for biological matter. In the current work, we report on the up-to-date version of our homemade Monte Carlo code TILDA-V – devoted to the modeling of the slowing-down of 10 keV–100 MeV protons in both water and DNA – where the main collisional processes are described by means of an extensive set of ab initio differential and total cross sections.

  10. Molecular Dynamics Simulation of the Thermophysical Properties of Quantum Liquid Helium Using the Feynman-Hibbs Potential

    NASA Astrophysics Data System (ADS)

    Liu, J.; Lu, W. Q.

    2010-03-01

    This paper presents detailed MD simulations of the properties, including thermal conductivity and viscosity, of quantum liquid helium at different state points. The molecular interactions are represented by Lennard-Jones pair potentials supplemented by quantum corrections following the Feynman-Hibbs approach, and the properties are calculated using the Green-Kubo equations. A comparison among the numerical results using the LJ and QFH potentials and the existing database shows that the LJ model is not quantitatively correct for supercritical liquid helium; therefore, the quantum effect must be taken into account when quantum fluid helium is studied. Thermal conductivity is also compared as a function of temperature and pressure, and the results show that the quantum-effect correction is an efficient tool for obtaining thermal conductivities.
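
    For reference, the quadratic Feynman-Hibbs (QFH) correction adds a temperature-dependent term to the pair potential, commonly written as U_QFH(r) = U_LJ(r) + hbar^2 / (24 mu kB T) * (U_LJ''(r) + 2 U_LJ'(r) / r), with mu = m/2 the reduced mass of a pair of identical atoms. A sketch with typical literature LJ parameters for helium follows; the values are illustrative, not necessarily those used in the paper.

    ```python
    # QFH-corrected Lennard-Jones potential for helium-4.
    import numpy as np
    from scipy.constants import hbar, k as kB, atomic_mass

    eps = 10.22 * kB          # LJ well depth for He (J), ~10.22 K
    sigma = 2.556e-10         # LJ size parameter for He (m)
    m = 4.0026 * atomic_mass  # helium-4 mass (kg)

    def u_lj(r):
        return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

    def u_qfh(r, T):
        du = 4 * eps * (-12 * sigma**12 / r**13 + 6 * sigma**6 / r**7)
        d2u = 4 * eps * (156 * sigma**12 / r**14 - 42 * sigma**6 / r**8)
        mu = m / 2.0          # reduced mass of an identical pair
        return u_lj(r) + hbar**2 / (24 * mu * kB * T) * (d2u + 2 * du / r)

    r = 2.97e-10              # near the LJ minimum, r_min = 2**(1/6) * sigma
    for T in (10.0, 50.0, 300.0):   # the correction fades at high temperature
        print(f"T={T:5.0f} K: U_LJ={u_lj(r)/kB:7.3f} K, "
              f"U_QFH={u_qfh(r, T)/kB:7.3f} K")
    ```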

  11. Monte Carlo simulation of proton track structure in biological matter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinto, Michele A.; Monti, Juan M.; Weck, Philippe F.

    Here, understanding the radiation-induced effects at the cellular and subcellular levels remains crucial for predicting the evolution of irradiated biological matter. In this context, Monte Carlo track-structure simulations have rapidly emerged among the most suitable and powerful tools. However, most existing Monte Carlo track-structure codes rely heavily on the use of semi-empirical cross sections as well as water as a surrogate for biological matter. In the current work, we report on the up-to-date version of our homemade Monte Carlo code TILDA-V – devoted to the modeling of the slowing-down of 10 keV–100 MeV protons in both water and DNA – where the main collisional processes are described by means of an extensive set of ab initio differential and total cross sections.

  12. Impact of tool wear on cross wedge rolling process stability and on product quality

    NASA Astrophysics Data System (ADS)

    Gutierrez, Catalina; Langlois, Laurent; Baudouin, Cyrille; Bigot, Régis; Fremeaux, Eric

    2017-10-01

    Cross wedge rolling (CWR) is a metal forming process used in the automotive industry. One of its applications is in the manufacturing process of connecting rods. CWR transforms a cylindrical billet into a complex axisymmetric shape with an accurate distribution of material; this preform is then forged into shape in a forging die. In order to improve CWR tool lifecycle and product quality, it is essential to understand tool wear evolution and the physical phenomena that change in the CWR process as the tool geometry wears. Numerical simulations are necessary to understand CWR tool wear behavior; nevertheless, if the simulations are performed with the nominal CAD geometry of the tool, the results are limited. To address this difficulty, two numerical simulations with FORGE® were performed using the real geometry of the tools (both upper and lower rolls) at two different states: (1) before starting the lifecycle and (2) at the end of the lifecycle. The tools were measured in 3D with an ATOS triple scan by GOM® using optical 3D measuring techniques, yielding a high-resolution point cloud of the entire geometry of each tool. Each 3D point cloud was digitized and converted into STL format, which served as the input geometry for the 3D simulations. The two simulations were compared, and the product defects obtained in simulation were compared to the main defects found industrially. The two main defects are: (a) surface defects on the preform that are not corrected in the die-forging operation; and (b) a bent (no longer straight) preform, with two possible impacts: the robot cannot grab it to take it to the forging stage, or a section remains unfilled in the forging operation.

  13. SolarPILOT | Concentrating Solar Power | NREL

    Science.gov Websites

    Unlike exclusively ray-tracing tools, SolarPILOT runs an analytical simulation engine alongside a ray-tracing core (the SolTrace simulation engine) for more detailed simulations.

  14. Hydrologic cost-effectiveness ratio favors switchgrass production on marginal croplands over existing grasslands

    PubMed Central

    Yimam, Yohannes Tadesse; Ochsner, Tyson E.; Fox, Garey A.

    2017-01-01

    Switchgrass (Panicum virgatum L.) has attracted attention as a promising second generation biofuel feedstock. Both existing grasslands and marginal croplands have been suggested as targets for conversion to switchgrass, but the resulting production potentials and hydrologic impacts are not clear. The objectives of this study were to model switchgrass production on existing grasslands (scenario-I) and on marginal croplands that have severe to very severe limitations for crop production (scenario-II) and to evaluate the effects on evapotranspiration (ET) and streamflow. The Soil and Water Assessment Tool (SWAT) was applied to the 1063 km2 Skeleton Creek watershed in north-central Oklahoma, a watershed dominated by grasslands (35%) and winter wheat cropland (47%). The simulated average annual yield (2002–2011) for rainfed Alamo switchgrass for both scenarios was 12 Mg ha-1. Yield varied spatially under scenario-I from 6.1 to 15.3 Mg ha-1, while under scenario-II the range was from 8.2 to 13.8 Mg ha-1. Comparison of average annual ET and streamflow between the baseline simulation (existing land use) and scenario-I showed that scenario-I had 5.6% (37 mm) higher average annual ET and 27.7% lower streamflow, representing a 40.7 million m3 yr-1 streamflow reduction. Compared to the baseline, scenario-II had only 0.5% higher ET and 3.2% lower streamflow, but some monthly impacts were larger. In this watershed, the water yield reduction per ton of biomass production (i.e. hydrologic cost-effectiveness ratio) was more than 5X greater under scenario-I than under scenario-II. These results suggest that, from a hydrologic perspective, it may be preferable to convert marginal cropland to switchgrass production rather than converting existing grasslands. PMID:28792541

  15. Hydrologic cost-effectiveness ratio favors switchgrass production on marginal croplands over existing grasslands.

    PubMed

    Yimam, Yohannes Tadesse; Ochsner, Tyson E; Fox, Garey A

    2017-01-01

    Switchgrass (Panicum virgatum L.) has attracted attention as a promising second generation biofuel feedstock. Both existing grasslands and marginal croplands have been suggested as targets for conversion to switchgrass, but the resulting production potentials and hydrologic impacts are not clear. The objectives of this study were to model switchgrass production on existing grasslands (scenario-I) and on marginal croplands that have severe to very severe limitations for crop production (scenario-II) and to evaluate the effects on evapotranspiration (ET) and streamflow. The Soil and Water Assessment Tool (SWAT) was applied to the 1063 km2 Skeleton Creek watershed in north-central Oklahoma, a watershed dominated by grasslands (35%) and winter wheat cropland (47%). The simulated average annual yield (2002-2011) for rainfed Alamo switchgrass for both scenarios was 12 Mg ha-1. Yield varied spatially under scenario-I from 6.1 to 15.3 Mg ha-1, while under scenario-II the range was from 8.2 to 13.8 Mg ha-1. Comparison of average annual ET and streamflow between the baseline simulation (existing land use) and scenario-I showed that scenario-I had 5.6% (37 mm) higher average annual ET and 27.7% lower streamflow, representing a 40.7 million m3 yr-1 streamflow reduction. Compared to the baseline, scenario-II had only 0.5% higher ET and 3.2% lower streamflow, but some monthly impacts were larger. In this watershed, the water yield reduction per ton of biomass production (i.e. hydrologic cost-effectiveness ratio) was more than 5X greater under scenario-I than under scenario-II. These results suggest that, from a hydrologic perspective, it may be preferable to convert marginal cropland to switchgrass production rather than converting existing grasslands.

  16. Coupling the Multizone Airflow and Contaminant Transport Software CONTAM with EnergyPlus Using Co-Simulation.

    PubMed

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-08-01

    Building modelers need simulation tools capable of simultaneously considering building energy use, airflow and indoor air quality (IAQ) to design and evaluate the ability of buildings and their systems to meet today's demanding energy efficiency and IAQ performance requirements. CONTAM is a widely used multizone building airflow and contaminant transport simulation tool that requires indoor temperatures as input values. EnergyPlus is a prominent whole-building energy simulation program capable of performing heat transfer calculations that require interzone and infiltration airflows as input values. On its own, each tool is limited in its ability to account for thermal processes upon which building airflow may be significantly dependent, and vice versa. This paper describes the initial phase of coupling CONTAM with EnergyPlus to capture the interdependencies between airflow and heat transfer using co-simulation, which allows data to be shared between independently executing simulation tools. The coupling is accomplished based on the Functional Mock-up Interface (FMI) for Co-simulation specification, which provides for integration between independently developed tools. A three-zone combined heat transfer/airflow analytical BESTEST case was simulated to verify that the co-simulation functions as expected, and a two-zone natural ventilation case designed to challenge the coupled thermal/airflow solution methods was investigated.
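
    A toy illustration of the data exchange being coupled, with two simple stand-in models in place of EnergyPlus and CONTAM; in the real system the FMI for Co-simulation specification and a master algorithm orchestrate this loop.

    ```python
    # The thermal model needs airflows; the airflow model needs temperatures.
    # A master loop passes each tool's outputs to the other every time step.
    def thermal_step(T_zone, airflow, T_out=0.0, dt=60.0):
        """Zone temperature update given infiltration airflow (kg/s)."""
        C, cp, q_int = 5.0e6, 1005.0, 1000.0      # capacitance, cp, gains (W)
        dT = (q_int + airflow * cp * (T_out - T_zone)) * dt / C
        return T_zone + dT

    def airflow_step(T_zone, T_out=0.0):
        """Buoyancy-driven infiltration grows with the indoor-outdoor dT."""
        return 0.02 * abs(T_zone - T_out) ** 0.5  # kg/s, toy power law

    T, flow = 20.0, 0.0
    for step in range(60):                        # one hour at dt = 60 s
        flow = airflow_step(T)                    # CONTAM-like: needs T
        T = thermal_step(T, flow)                 # EnergyPlus-like: needs flow
    print(f"after 1 h: T = {T:.2f} degC, infiltration = {flow:.3f} kg/s")
    ```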

  17. Creation and Delphi-method refinement of pediatric disaster triage simulations.

    PubMed

    Cicero, Mark X; Brown, Linda; Overly, Frank; Yarzebski, Jorge; Meckler, Garth; Fuchs, Susan; Tomassoni, Anthony; Aghababian, Richard; Chung, Sarita; Garrett, Andrew; Fagbuyi, Daniel; Adelgais, Kathleen; Goldman, Ran; Parker, James; Auerbach, Marc; Riera, Antonio; Cone, David; Baum, Carl R

    2014-01-01

    There is a need for rigorously designed pediatric disaster triage (PDT) training simulations for paramedics. First, we sought to design three multiple-patient incidents for EMS provider training simulations. Our second objective was to determine the appropriate interventions and triage level for each victim in each of the simulations and to develop evaluation instruments for each simulation. The final objective was to ensure that each simulation and evaluation tool was free of bias toward any specific PDT strategy. We created mixed-methods disaster simulation scenarios with pediatric victims: a school shooting, a school bus crash, and a multiple-victim house fire. Standardized patients, high-fidelity manikins, and low-fidelity manikins were used to portray the victims. Each simulation had 10 victims with similar acuity of injuries; examples include children with special health-care needs, gunshot wounds, and smoke inhalation. Checklist-based evaluation tools and behaviorally anchored global assessments of function were created for each simulation. Eight physicians and paramedics from areas with differing PDT strategies were recruited as Subject Matter Experts (SMEs) for a modified Delphi iterative critique of the simulations and evaluation tools. The modified Delphi was managed with an online survey tool, and the SMEs provided an expected triage category for each patient. The target for modified Delphi consensus was ≥85%. Using Likert scales and free text, the SMEs assessed the validity of the simulations, including instances of bias toward a specific PDT strategy, the clarity of the learning objectives, and the correlation of the evaluation tools to the learning objectives and scenarios. After two rounds of the modified Delphi, consensus for the expected triage level was >85% for 28 of 30 victims, with the remaining two achieving >85% consensus after three Delphi iterations. To achieve consensus, we amended 11 instances of bias toward a specific PDT strategy and corrected 10 instances of noncorrelation between the evaluation tools and the simulations. The modified Delphi process, used to derive novel PDT simulation and evaluation tools, yielded a high degree of consensus among the SMEs and eliminated biases toward specific PDT strategies in the evaluations. The simulations and evaluation tools may now be tested for reliability and validity as part of a prehospital PDT curriculum.

  18. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    PubMed

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables easy design of control systems and strategies applied to wastewater treatment plants (WWTPs). Although the control systems are developed and evaluated by simulation, this tool aims to facilitate direct implementation of the designed control system on the PC of the full-scale WWTP. The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the DSC incorporates an OPC server (OLE for Process Control), which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through a full-scale application: an aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, control system performance can be easily evaluated by simulation, and once developed and tuned, the control systems can be applied directly to the full-scale WWTP.
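
    A hedged sketch of the design-by-simulation workflow: tune a PI controller for dissolved oxygen (DO) against a toy aeration model before any full-scale deployment. The process model and constants are illustrative, not DSC's.

    ```python
    # PI control of dissolved oxygen against a simple aeration process model:
    # dDO/dt = u * kLa * (DO_sat - DO) - OUR, with u the air-valve opening.
    def simulate_do_control(kp, ki, setpoint=2.0, hours=6.0, dt=1.0 / 60.0):
        do, kla, integ = 0.5, 60.0, 0.0      # DO (mg/L), max kLa (1/h), integral
        do_sat, our = 9.0, 80.0              # saturation DO, O2 uptake (mg/L/h)
        trace = []
        steps = int(hours / dt)
        for k in range(steps):
            err = setpoint - do
            integ += err * dt
            u = min(max(kp * err + ki * integ, 0.0), 1.0)   # valve opening 0..1
            load = 1.5 if k > steps // 2 else 1.0           # load step mid-run
            do += (u * kla * (do_sat - do) - our * load) * dt
            trace.append(do)
        return trace

    trace = simulate_do_control(kp=0.8, ki=2.0)
    print(f"final DO = {trace[-1]:.2f} mg/L (setpoint 2.0)")
    ```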

  19. A simulator tool set for evaluating HEVC/SHVC streaming

    NASA Astrophysics Data System (ADS)

    Al Hadhrami, Tawfik; Nightingale, James; Wang, Qi; Grecos, Christos; Kehtarnavaz, Nasser

    2015-02-01

    Video streaming and other multimedia applications account for an ever-increasing proportion of all network traffic. The recent adoption of High Efficiency Video Coding (HEVC) as the H.265 standard provides many opportunities for new and improved multimedia services and applications in the consumer domain. Since the delivery of version one of H.265, the Joint Collaborative Team on Video Coding has been working towards standardisation of a scalable extension (SHVC) to the H.265 standard and a series of range extensions and new profiles. As these enhancements are added to the standard, the range of potential applications and research opportunities will expand. The use of video is also growing rapidly in other sectors such as safety, security, defence and health, with real-time, high-quality video transmission playing an important role in areas like critical infrastructure monitoring and disaster management, each of which may benefit from enhanced HEVC/H.265 and SHVC capabilities. The majority of existing research into HEVC/H.265 transmission has focussed on the consumer domain, addressing issues such as broadcast transmission and delivery to mobile devices, with the lack of freely available tools widely cited as an obstacle to conducting this type of research. In this paper we present a toolset which facilitates the transmission and evaluation of HEVC/H.265 and SHVC encoded video on the popular open-source NCTUns simulator. Our toolset provides researchers with a modular, easy-to-use platform for evaluating video transmission and adaptation proposals on large-scale wired, wireless and hybrid architectures. The toolset consists of pre-processing, transmission, SHVC adaptation and post-processing tools to gather and analyse statistics, and has been implemented using HM15 and SHM5, the latest versions of the HEVC and SHVC reference software implementations, to ensure that currently adopted proposals for scalable and range extensions to the standard can be investigated. We demonstrate the effectiveness and usability of our toolset by evaluating SHVC streaming and adaptation to meet terminal constraints and network conditions in a range of wired, wireless, and large-scale wireless mesh network scenarios, each designed to simulate a realistic environment. Our results are compared to those for H.264/SVC, the scalable extension to the existing H.264/AVC advanced video coding standard.
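
    A hedged sketch of the adaptation decision such a toolset evaluates, assuming an illustrative SHVC layer table: transmit up to the highest layer whose cumulative bitrate and resolution fit the measured bandwidth and the terminal's constraints.

    ```python
    # Scalable-layer selection under network and terminal constraints.
    LAYERS = [  # (layer id, cumulative bitrate kbps, frame height px)
        (0, 1200, 540),     # base layer (always transmitted)
        (1, 3500, 1080),    # spatial enhancement layer
        (2, 9000, 2160),    # further enhancement layer
    ]

    def select_layer(bandwidth_kbps: float, max_height_px: int) -> int:
        """Highest decodable layer; falls back to the base layer."""
        best = 0
        for layer, rate, height in LAYERS:
            if rate <= bandwidth_kbps and height <= max_height_px:
                best = layer
        return best

    # A mobile terminal (1080p cap) on a degrading wireless link:
    for bw in (12000, 6000, 2500, 800):
        print(f"{bw:>5} kbps -> transmit up to layer {select_layer(bw, 1080)}")
    ```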

  20. ISTAR: Intelligent System for Telemetry Analysis in Real-time

    NASA Technical Reports Server (NTRS)

    Simmons, Charles

    1994-01-01

    The intelligent system for telemetry analysis in real-time (ISTAR) is an advanced vehicle monitoring environment incorporating expert systems, analysis tools, and on-line hypermedia documentation. The system was developed for the Air Force Space and Missile Systems Center (SMC) in Los Angeles, California, in support of the inertial upper stage (IUS) booster vehicle. Over a five-year period the system progressed from rapid prototype to operational system. ISTAR has been used to support five IUS missions and countless mission simulations. Many lessons were learned about integrating an expert-system capability into an existing ground system.
