Requirements analysis for a hardware, discrete-event, simulation engine accelerator
NASA Astrophysics Data System (ADS)
Taylor, Paul J., Jr.
1991-12-01
An analysis of a general Discrete Event Simulation (DES), executing on the distributed architecture of an eight-node Intel iPSC/2 hypercube, was performed. The most time-consuming portions of the general DES algorithm were determined to be the functions associated with message passing of required simulation data between processing nodes of the hypercube architecture. A behavioral description, using the IEEE-standard VHSIC Hardware Description Language (VHDL), for a general DES hardware accelerator is presented. The behavioral description specifies the operational requirements for a DES coprocessor to augment the hypercube's execution of DES simulations. The DES coprocessor design implements the functions necessary to perform distributed discrete event simulations using a conservative time synchronization protocol.
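The core that such an accelerator offloads is the time-ordered event loop; under a conservative protocol, a node may only process events it can prove are safe. As a rough illustration only (not the paper's VHDL design, and with all names hypothetical), a minimal sequential event loop with a conservative-style safe-time bound can be sketched in Python:

```python
import heapq

class Simulator:
    """Minimal discrete-event simulation core: a time-ordered event queue.

    In a conservative distributed DES, a node processes an event only once
    no remote event with a smaller timestamp can still arrive; `safe_time`
    stands in for that guarantee (a hypothetical simplification).
    """
    def __init__(self):
        self.clock = 0.0
        self.queue = []   # heap of (timestamp, seq, handler, payload)
        self.seq = 0      # tie-breaker for events with equal timestamps

    def schedule(self, delay, handler, payload=None):
        heapq.heappush(self.queue, (self.clock + delay, self.seq, handler, payload))
        self.seq += 1

    def run(self, safe_time=float("inf")):
        """Process events with timestamps up to safe_time (the conservative bound)."""
        while self.queue and self.queue[0][0] <= safe_time:
            self.clock, _, handler, payload = heapq.heappop(self.queue)
            handler(self, payload)

log = []
def arrival(sim, n):
    log.append((sim.clock, n))
    if n > 0:
        sim.schedule(1.5, arrival, n - 1)  # each arrival schedules the next

sim = Simulator()
sim.schedule(0.0, arrival, 2)
sim.run()
```

In a distributed setting, the message-passing functions the abstract identifies as the bottleneck would sit around `run`, exchanging timestamps between nodes to advance `safe_time`.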
The use of cognitive task analysis to improve instructional descriptions of procedures.
Clark, Richard E; Pugh, Carla M; Yates, Kenneth A; Inaba, Kenji; Green, Donald J; Sullivan, Maura E
2012-03-01
Surgical training relies heavily on the ability of expert surgeons to provide complete and accurate descriptions of a complex procedure. However, research from a variety of domains suggests that experts often omit critical information about the judgments, analysis, and decisions they make when solving a difficult problem or performing a complex task. In this study, we compared three methods for capturing surgeons' descriptions of how to perform the procedure for inserting a femoral artery shunt (unaided free recall, unaided free recall with simulation, and cognitive task analysis methods) to determine which method produced more accurate and complete results. Cognitive task analysis was approximately 70% more complete and accurate than either unaided free recall or free recall during a simulation of the procedure. Ten expert trauma surgeons at a major urban trauma center were interviewed separately and asked to describe how to perform an emergency shunt procedure. Four surgeons provided an unaided free-recall description of the shunt procedure, five surgeons provided an unaided free-recall description of the procedure using visual aids and surgical instruments (simulation), and one (chosen randomly) was interviewed using cognitive task analysis (CTA) methods. An 11th vascular surgeon approved the final CTA protocol. The CTA interview with only one expert surgeon resulted in significantly greater accuracy and completeness of the descriptions compared with the unaided free-recall interviews with multiple expert surgeons. Surgeons in the unaided group omitted nearly 70% of necessary decision steps. In the free-recall group, heavy use of simulation improved surgeons' completeness when describing the steps of the procedure. CTA significantly increases the completeness and accuracy of surgeons' instructional descriptions of surgical procedures. In addition, simulation during unaided free-recall interviews may improve the completeness of interview data. Copyright © 2012 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Ostroff, Aaron J.; Hoffler, Keith D.; Proffitt, Melissa S.; Brown, Philip W.; Phillips, Michael R.; Rivers, Robert A.; Messina, Michael D.; Carzoo, Susan W.; Bacon, Barton J.; Foster, John F.
1994-01-01
This paper describes the design, analysis, and nonlinear simulation results (batch and piloted) for a longitudinal controller which is scheduled to be flight-tested on the High-Alpha Research Vehicle (HARV). The HARV is an F-18 airplane modified for and equipped with multi-axis thrust vectoring. The paper includes a description of the facilities, a detailed review of the feedback controller design, linear analysis results of the feedback controller, a description of the feed-forward controller design, nonlinear batch simulation results, and piloted simulation results. Batch simulation results include maximum pitch stick agility responses, angle-of-attack (alpha) captures, and alpha regulation for full lateral stick rolls at several alphas. Piloted simulation results include task descriptions for several types of maneuvers, task guidelines, the corresponding Cooper-Harper ratings from three test pilots, and some pilot comments. The ratings show that desirable criteria are achieved for almost all of the piloted simulation tasks.
Simulation loop between CAD systems, Geant4, and GeoModel: implementation and results
NASA Astrophysics Data System (ADS)
Sharmazanashvili, A.; Tsutskiridze, Niko
2016-09-01
Comparative analysis of simulated and as-built geometry descriptions of the detector is an important field of study for data-vs-Monte-Carlo discrepancies. Consistency and detail of shapes are less important, while adequacy of the volumes and weights of detector components is essential for tracking. There are two main sources of faults in simulation geometry descriptions: (1) differences between the simulated and as-built geometry descriptions; and (2) internal inaccuracies in geometry transformations introduced by the simulation software infrastructure itself. The Georgian engineering team developed a hub based on the CATIA platform, together with several tools for reading into CATIA the different descriptions used by simulation packages: XML->CATIA, VP1->CATIA, GeoModel->CATIA, and Geant4->CATIA. As a result, it becomes possible to compare the different descriptions with each other using the full power of CATIA and to investigate both classes of geometry-description faults. The paper presents the results of case studies of the ATLAS coils and end-cap toroid structures.
Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas
2011-12-15
The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. 
Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research can be accurately described and combined.
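To give a feel for the kind of structure SED-ML encodes (which model, which simulation, which task binds them), the following Python sketch assembles a schematic, heavily simplified document. The element and attribute names are abbreviated from the Level 1 Version 1 specification, the model file name is a hypothetical placeholder, and the result is not a complete or validating SED-ML file:

```python
import xml.etree.ElementTree as ET

# Schematic SED-ML-like document: one model, one uniform time-course
# simulation, and one task binding them together (simplified sketch).
root = ET.Element("sedML", level="1", version="1")

models = ET.SubElement(root, "listOfModels")
ET.SubElement(models, "model", id="model1",
              language="urn:sedml:language:sbml",
              source="oscillator.xml")           # hypothetical model file

sims = ET.SubElement(root, "listOfSimulations")
ET.SubElement(sims, "uniformTimeCourse", id="sim1",
              initialTime="0", outputStartTime="0",
              outputEndTime="100", numberOfPoints="1000")

tasks = ET.SubElement(root, "listOfTasks")
ET.SubElement(tasks, "task", id="task1",
              modelReference="model1", simulationReference="sim1")

document = ET.tostring(root, encoding="unicode")
```

A real SED-ML file additionally carries the XML namespace, data generators, and output descriptions; the point here is only the model/simulation/task separation the abstract describes.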
Kazeroonian, Atefeh; Fröhlich, Fabian; Raue, Andreas; Theis, Fabian J; Hasenauer, Jan
2016-01-01
Gene expression, signal transduction and many other cellular processes are subject to stochastic fluctuations. The analysis of these stochastic chemical kinetics is important for understanding cell-to-cell variability and its functional implications, but it is also challenging. A multitude of exact and approximate descriptions of stochastic chemical kinetics have been developed; however, tools to automatically generate the descriptions and compare their accuracy and computational efficiency are missing. In this manuscript we introduce CERENA, a toolbox for the analysis of stochastic chemical kinetics using approximations of the chemical master equation solution statistics. CERENA implements stochastic simulation algorithms and the finite state projection for microscopic descriptions of processes, the system size expansion and moment equations for meso- and macroscopic descriptions, as well as the novel conditional moment equations for a hybrid description. This unique collection of descriptions in a single toolbox facilitates the selection of appropriate modeling approaches. Unlike other software packages, the implementation of CERENA is completely general and allows, e.g., for time-dependent propensities and non-mass-action kinetics. By providing SBML import, symbolic model generation and simulation using MEX-files, CERENA is user-friendly and computationally efficient. The availability of forward and adjoint sensitivity analyses allows for further studies such as parameter estimation and uncertainty analysis. The MATLAB code implementing CERENA is freely available from http://cerenadevelopers.github.io/CERENA/.
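The stochastic simulation algorithms CERENA implements for the microscopic description are typically variants of Gillespie's direct method. A minimal, self-contained Python sketch of that method for a simple birth-death process (an illustration of the algorithm, not CERENA's MATLAB implementation) is:

```python
import math
import random

def gillespie_birth_death(k_birth, k_death, x0, t_end, rng):
    """Gillespie direct method for the birth-death process
    0 -> X (rate k_birth), X -> 0 (rate k_death * x)."""
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_end:
        a_birth = k_birth          # propensity of a birth event
        a_death = k_death * x      # propensity of a death event
        a_total = a_birth + a_death
        if a_total == 0.0:
            break
        # exponential waiting time until the next reaction
        t += -math.log(rng.random()) / a_total
        if t >= t_end:
            break
        # choose which reaction fires, proportional to its propensity
        x += 1 if rng.random() * a_total < a_birth else -1
        times.append(t)
        states.append(x)
    return times, states

rng = random.Random(0)
times, states = gillespie_birth_death(k_birth=10.0, k_death=1.0,
                                      x0=0, t_end=50.0, rng=rng)
```

For this process the stationary mean is k_birth/k_death, which is one of the quantities the moment-based approximations in the toolbox compute without sampling.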
Utilizing traffic simulation tools with MOVES and AERMOD
DOT National Transportation Integrated Search
2011-01-01
Overview: quantify the emissions and fuel consumption associated with traffic congestion from Commercial Motor Vehicle (CMV) crashes. Project description: traffic simulation; emissions analysis; future use with dispersion analysis.
DOT National Transportation Integrated Search
1981-09-01
This report presents a description of a vehicle simulation program, which can determine the fuel economy and performance of a specified motor vehicle over a defined route as it executes a given driving schedule. Vehicle input accommodated by HEVSIM i...
DOT National Transportation Integrated Search
1981-01-01
This report presents an updated description of a vehicle simulation program, VEHSIM, which can determine the fuel economy and performance of a specified vehicle over a defined route as it executes a given driving schedule. Vehicle input accommodated ...
DOT National Transportation Integrated Search
1981-10-01
This report presents an updated description of a vehicle simulation program, VEHSIM, which can determine the fuel economy and performance of a specified vehicle over a defined route as it executes a given driving schedule. Vehicle input accommodated ...
NASA Technical Reports Server (NTRS)
Davidson, John B.; Murphy, Patrick C.; Lallman, Frederick J.; Hoffler, Keith D.; Bacon, Barton J.
1998-01-01
This report contains a description of a lateral-directional control law designed for the NASA High-Alpha Research Vehicle (HARV). The HARV is an F/A-18 aircraft modified to include a research flight computer, spin chute, and thrust-vectoring in the pitch and yaw axes. Two separate design tools, CRAFT and Pseudo Controls, were integrated to synthesize the lateral-directional control law. This report contains a description of the lateral-directional control law, analyses, and nonlinear simulation (batch and piloted) results. Linear analysis results include closed-loop eigenvalues, stability margins, robustness to changes in various plant parameters, and servo-elastic frequency responses. Step time responses from nonlinear batch simulation are presented and compared to design guidelines. Piloted simulation task scenarios, task guidelines, and pilot subjective ratings for the various maneuvers are discussed. Linear analysis shows that the control law meets the stability margin guidelines and is robust to stability and control parameter changes. Nonlinear batch simulation analysis shows the control law exhibits good performance and meets most of the design guidelines over the entire range of angle-of-attack. This control law (designated NASA-1A) was flight tested during the summer of 1994 at NASA Dryden Flight Research Center.
Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.
Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar
2015-09-04
The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.
Inventory of File nam.t00z.goes24300.tm00.grib2
Number of records: 4
Number  Level/Layer        Parameter  Forecast Valid  Description
001     top of atmosphere  SBT122     analysis        Simulated Brightness Temperature for GOES 12, Channel 2 [K]
002     top of atmosphere  SBT123     analysis        Simulated Brightness Temperature for GOES 12, Channel 3 [K]
003     top of atmosphere  SBT124     analysis        Simulated ...
Inventory of File nam.t00z.goes21800.tm00.grib2
Number of records: 4
Number  Level/Layer        Parameter  Forecast Valid  Description
001     top of atmosphere  SBT122     analysis        Simulated Brightness Temperature for GOES 12, Channel 2 [K]
002     top of atmosphere  SBT123     analysis        Simulated Brightness Temperature for GOES 12, Channel 3 [K]
003     top of atmosphere  SBT124     analysis        Simulated ...
Inventory of File nam.t00z.goes22100.tm00.grib2
Number of records: 4
Number  Level/Layer        Parameter  Forecast Valid  Description
001     top of atmosphere  SBT122     analysis        Simulated Brightness Temperature for GOES 12, Channel 2 [K]
002     top of atmosphere  SBT123     analysis        Simulated Brightness Temperature for GOES 12, Channel 3 [K]
003     top of atmosphere  SBT124     analysis        Simulated ...
Utilization of a CRT display light pen in the design of feedback control systems
NASA Technical Reports Server (NTRS)
Thompson, J. G.; Young, K. R.
1972-01-01
A hierarchical structure of the interlinked programs was developed to provide a flexible computer-aided design tool. A graphical input technique and a data structure are considered which provide the capability of entering the control system model description into the computer in block diagram form. An information storage and retrieval system was developed to keep track of the system description and of analysis and simulation results, and to provide them to the correct routines for further manipulation or display. Error analysis and diagnostic capabilities are discussed, and a technique was developed to reduce a transfer function to a set of nested integrals suitable for digital simulation. A general, automated block diagram reduction procedure was set up to prepare the system description for the analysis routines.
DOT National Transportation Integrated Search
2013-06-01
As part of the Federal Highway Administration's (FHWA) Active Transportation and Demand Management (ATDM) Foundational Research, this ATDM Analysis, Modeling and Simulation (AMS) Concept of Operations (CONOPS) provides the description of the ATDM A...
The Exponential Expansion of Simulation: How Simulation has Grown as a Research Tool
2012-09-01
exponential growth of computing power. Although other analytic approaches also benefit from this trend, keyword searches of several scholarly search engines reveal that the reliance on simulation is increasing more rapidly. A descriptive analysis paints a compelling picture: simulation is frequently
2017-05-23
Systems and the NRL Code 5763 Radio Frequency (RF) Stimulator. It includes and covers system descriptions, setup, data collection, and test goals that... 4. Test Asset Descriptions... 4.1. Description of FOXTROT Anti-ship Missile (ASM) Simulator
ERIC Educational Resources Information Center
Moore, John W., Ed.
1981-01-01
Provides short descriptions of chemists' applications of computers in instruction: an interactive instructional program for Instrumental-Qualitative Organic Analysis; question-and-answer exercises in organic chemistry; computerized organic nomenclature drills; integration of theoretical and descriptive materials; acid-base titration simulation;…
The Exponential Expansion of Simulation in Research
2012-12-01
exponential growth of computing power. Although other analytic approaches also benefit from this trend, keyword searches of several scholarly search engines reveal that the reliance on simulation is increasing more rapidly. A descriptive analysis paints a compelling picture: simulation is frequently
NASA Astrophysics Data System (ADS)
Matsuzaki, F.; Yoshikawa, N.; Tanaka, M.; Fujimaki, A.; Takai, Y.
2003-10-01
Recently, many single flux quantum (SFQ) logic circuits containing several thousand Josephson junctions have been designed successfully by using digital-domain simulation based on a hardware description language (HDL). In the present HDL-based design of SFQ circuits, a structure-level HDL description has been used, where circuits are made up of basic gate cells. However, in order to analyze large-scale SFQ digital systems, such as a microprocessor, higher-level circuit abstraction is necessary to reduce the circuit simulation time. In this paper we have investigated how to describe the functionality of large-scale SFQ digital circuits with a behavior-level HDL description. In this method, the functionality and the timing of a circuit block are defined directly by describing their behavior in the HDL. Using this method, we can dramatically reduce the simulation time of large-scale SFQ digital circuits.
LES, DNS and RANS for the analysis of high-speed turbulent reacting flows
NASA Technical Reports Server (NTRS)
Givi, Peyman; Taulbee, Dale B.; Adumitroaie, Virgil; Sabini, George J.; Shieh, Geoffrey S.
1994-01-01
The purpose of this research is to continue our efforts in advancing the state of knowledge in large eddy simulation (LES), direct numerical simulation (DNS), and Reynolds averaged Navier Stokes (RANS) methods for the computational analysis of high-speed reacting turbulent flows. In the second phase of this work, covering the period 1 Sep. 1993 - 1 Sep. 1994, we have focused our efforts on two research problems: (1) developments of 'algebraic' moment closures for statistical descriptions of nonpremixed reacting systems, and (2) assessments of the Dirichlet frequency in presumed scalar probability density function (PDF) methods in stochastic description of turbulent reacting flows. This report provides a complete description of our efforts during this past year as supported by the NASA Langley Research Center under Grant NAG1-1122.
WEST-3 wind turbine simulator development
NASA Technical Reports Server (NTRS)
Hoffman, J. A.; Sridhar, S.
1985-01-01
The software developed for WEST-3, a new, all-digital, and fully programmable wind turbine simulator, is given. The process of wind turbine simulation on WEST-3 is described in detail. The major steps are the processing of the mathematical models, the preparation of the constant data, and the use of system-software-generated executable code for running on WEST-3. The mechanics of reformulation, normalization, and scaling of the mathematical models are discussed in detail, in particular the significance of reformulation, which leads to accurate simulations. Descriptions are given for the preprocessor computer programs which are used to prepare the constant data needed in the simulation. These programs, in addition to scaling and normalizing all the constants, relieve the user from having to generate a large number of constants used in the simulation. Also given are brief descriptions of the components of the WEST-3 system software: Translator, Assembler, Linker, and Loader. Also included are details of the aeroelastic rotor analysis, which is the center of a wind turbine simulation model; an analysis of the gimbal subsystem; and listings of the variables, constants, and equations used in the simulation.
Fault-Sensitivity and Wear-Out Analysis of VLSI Systems.
1995-06-01
Mixed-mode hierarchical fault description; fault simulation; fault type: transient/stuck-at, location/time; automatic fault injection trace... 4219-4224, December 1985. [15] J. Sosnowski, "Evaluation of transient hazards in microprocessor controllers," Digest, FTCS-16, The Sixteenth
Simulation Higher Order Language Requirements Study.
ERIC Educational Resources Information Center
Goodenough, John B.; Braun, Christine L.
The definitions provided for high order language (HOL) requirements for programming flight training simulators are based on the analysis of programs written for a variety of simulators. Examples drawn from these programs are used to justify the need for certain HOL capabilities. A description of the general structure and organization of the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brink, A.; Kilpinen, P.; Hupa, M.
1996-01-01
Two methods to improve the modeling of NOx emissions in numerical flow simulation of combustion are investigated. The models used are a reduced mechanism for nitrogen chemistry in methane combustion and a new model based on regression analysis of perfectly stirred reactor simulations using detailed comprehensive reaction kinetics. The applicability of the methods to numerical flow simulation of practical furnaces, especially in the near burner region, is tested against experimental data from a pulverized coal fired single burner furnace. The results are also compared to those obtained using a commonly used description for the overall reaction rate of NO.
The Community Multiscale Air Quality (CMAQ) modeling system has recently been adapted to simulate the emission, transport, transformation and deposition of atmospheric mercury in three distinct forms; elemental mercury gas, reactive gaseous mercury, and particulate mercury. Emis...
User's manual for the Simulated Life Analysis of Vehicle Elements (SLAVE) model
NASA Technical Reports Server (NTRS)
Paul, D. D., Jr.
1972-01-01
The simulated life analysis of vehicle elements model was designed to perform statistical simulation studies for any constant loss rate. The outputs of the model consist of the total number of stages required, stages successfully completing their lifetime, and average stage flight life. This report contains a complete description of the model. Users' instructions and interpretation of input and output data are presented such that a user with little or no prior programming knowledge can successfully implement the program.
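The kind of constant-loss-rate statistical simulation the report describes can be illustrated in Python (rather than the original implementation, and with entirely hypothetical parameter values): each flight loses a stage with fixed probability, a surviving stage retires at its design life, and the model reports the outputs the abstract lists — stages required, stages completing their lifetime, and average stage flight life.

```python
import random

def simulate_stages(loss_rate, design_life, flights_needed, rng):
    """Monte Carlo sketch of a constant-loss-rate fleet model (illustrative only).

    Each flight loses the stage with probability `loss_rate`; a surviving
    stage retires after `design_life` flights. Returns (stages_used,
    stages_retired, average flights per stage) for a program that must
    accumulate `flights_needed` flights.
    """
    stages_used = 0
    stages_retired = 0
    flights_flown = 0
    while flights_flown < flights_needed:
        stages_used += 1
        life = 0
        lost = False
        while life < design_life and flights_flown < flights_needed:
            life += 1
            flights_flown += 1
            if rng.random() < loss_rate:   # stage lost on this flight
                lost = True
                break
        if not lost and life == design_life:
            stages_retired += 1            # completed its full lifetime
    return stages_used, stages_retired, flights_flown / stages_used

rng = random.Random(42)
used, retired, avg_life = simulate_stages(loss_rate=0.02, design_life=10,
                                          flights_needed=200, rng=rng)
```

Averaging `used`, `retired`, and `avg_life` over many seeded runs gives the statistical outputs the model was designed to study.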
Instrumental resolution of the chopper spectrometer 4SEASONS evaluated by Monte Carlo simulation
NASA Astrophysics Data System (ADS)
Kajimoto, Ryoichi; Sato, Kentaro; Inamura, Yasuhiro; Fujita, Masaki
2018-05-01
We performed simulations of the resolution function of the 4SEASONS spectrometer at J-PARC by using the Monte Carlo simulation package McStas. The simulations showed reasonably good agreement with analytical calculations of energy and momentum resolutions by using a simplified description. We implemented new functionalities in Utsusemi, the standard data analysis tool used in 4SEASONS, to enable visualization of the simulated resolution function and predict its shape for specific experimental configurations.
Employing Simulation to Evaluate Designs: The APEX Approach
NASA Technical Reports Server (NTRS)
Freed, Michael A.; Shafto, Michael G.; Remington, Roger W.; Null, Cynthia H. (Technical Monitor)
1998-01-01
The key innovations of APEX are its integrated approaches to task analysis, procedure definition, and intelligent, resource-constrained multi-tasking. This paper presents a step-by-step description of how APEX is used, from scenario development through trace analysis.
Simulating Silvicultural Treatments Using FIA Data
Christopher W. Woodall; Carl E. Fiedler
2005-01-01
Potential uses of the Forest Inventory and Analysis Database (FIADB) extend far beyond descriptions and summaries of current forest resources. Silvicultural treatments, although typically conducted at the stand level, may be simulated using the FIADB for predicting future forest conditions and resources at broader scales. In this study, silvicultural prescription...
Trajectory Reconstruction Program Milestone 2/3 Report. Volume 1. Description and Overview
1974-12-16
Keywords: simulation data generation; missile trajectory error analysis; modularized program; guidance and targeting; multiple vehicle simulation; IBM 360/370; numerical... The program consists of vehicle simulation subprograms designed and written in FORTRAN for CDC 6600/7600, IBM 360/370, and UNIVAC 1108/1110 series computers. The overall
Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Y.J.; Reich, M.
Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.
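A Monte Carlo procedure of the kind described (sample a wind field, inject a missile, solve its flight, check the impact) can be sketched in Python under grossly simplified ballistics; every distribution, coefficient, and threshold below is a hypothetical placeholder, not the paper's models.

```python
import math
import random

def missile_impact_probability(n_trials, rng):
    """Toy Monte Carlo for tornado-missile hazard (illustrative only).

    Samples a wind speed, converts it to a missile injection speed, flies
    a ballistic trajectory on flat ground, and counts impacts at or beyond
    a target standoff distance.
    """
    g = 9.81                 # m/s^2
    target_distance = 50.0   # m, hypothetical target standoff
    hits = 0
    for _ in range(n_trials):
        wind = rng.weibullvariate(40.0, 2.0)   # m/s, hypothetical wind model
        v0 = 0.5 * wind                        # crude injection-speed model
        angle = rng.uniform(0.0, math.pi / 4)  # launch elevation, radians
        # ballistic range on flat ground: v0^2 * sin(2*angle) / g
        flight_range = v0 * v0 * math.sin(2 * angle) / g
        if flight_range >= target_distance:
            hits += 1
    return hits / n_trials

p = missile_impact_probability(10000, random.Random(1))
```

A real analysis replaces the one-line range formula with the solution of the missile flight equations and folds in the uncertainties the paper discusses (wind-field modeling, injection conditions, impact response).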
Learning and Learning-to-Learn by Doing: Simulating Corporate Practice in Law School.
ERIC Educational Resources Information Center
Okamoto, Karl S.
1995-01-01
A law school course in advanced corporate legal practice is described. The course, a series of simulated lawyering tasks centered on a hypothetical leveraged buyout transaction, is designed to go beyond basic legal analysis to develop professional expertise in legal problem solving. The course description includes goals, syllabus design,…
The Lake Tahoe Basin Land Use Simulation Model
Forney, William M.; Oldham, I. Benson
2011-01-01
This U.S. Geological Survey Open-File Report describes the final modeling product for the Tahoe Decision Support System project for the Lake Tahoe Basin funded by the Southern Nevada Public Land Management Act and the U.S. Geological Survey's Geographic Analysis and Monitoring Program. This research was conducted by the U.S. Geological Survey Western Geographic Science Center. The purpose of this report is to describe the basic elements of the novel Lake Tahoe Basin Land Use Simulation Model, publish samples of the data inputs, basic outputs of the model, and the details of the Python code. The results of this report include a basic description of the Land Use Simulation Model, descriptions and summary statistics of model inputs, two figures showing the graphical user interface from the web-based tool, samples of the two input files, seven tables of basic output results from the web-based tool and descriptions of their parameters, and the fully functional Python code.
Simulation of a Canard in Fluid Flow Driven by a Piezoelectric Beam with a Software Control Loop
2014-04-01
The canard is actuated by a piezoelectric beam that bends as voltage is applied. The voltage is controlled by a software subroutine that measures... Keywords: dynamic system; modeling; co-simulation; simulation; Abaqus; finite element analysis (FEA); finite element method (FEM); computational...is unlimited. Contents: Introduction; Model Description; Fluid Model; Structural Model; Control Subroutine; Results.
MHDL CAD tool with fault circuit handling
NASA Astrophysics Data System (ADS)
Espinosa Flores-Verdad, Guillermo; Altamirano Robles, Leopoldo; Osorio Roque, Leticia
2003-04-01
Behavioral modeling and simulation with analog and mixed-signal hardware description languages (MHDLs) have driven the development of diverse simulation tools that can handle the requirements of modern designs. These systems embed millions of transistors and differ radically from one another. This trend is exemplified by the development of languages for modeling and simulation whose applications include the reuse of complete systems, the construction of virtual prototypes, and test and synthesis. This paper presents the general architecture of a mixed hardware description language CAD tool based on IEEE Std 1076.1-1999, the VHDL Analog and Mixed-Signal Extensions known as VHDL-AMS. The architecture is novel in that it supports the modeling and simulation of faults. The main modules of the CAD tool are briefly described to establish the information flow and its transformations, from the description of a circuit model, through lexical analysis, mathematical model generation, and the simulation core, to the collection of the circuit's behavior as simulation data. In addition, the mechanisms incorporated into the simulation core to handle faults in circuit models are explained. Currently, the CAD tool works with algebraic and differential descriptions of circuit models; nevertheless, the language design is open to other model types: fuzzy models, differential equations, transfer functions, and tables. This applies to fault models as well; in this sense the CAD tool considers the inclusion of mutants and saboteurs. To illustrate the results obtained so far, the simulated behavior of a circuit is shown when it is fault-free and when it has been modified by the inclusion of a fault as a mutant or a saboteur. These results enable virtual diagnosis of mixed-signal circuits.
The tool runs on UNIX systems; it was developed with an object-oriented methodology and programmed in C++.
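The two fault-injection mechanisms mentioned, mutants and saboteurs, can be illustrated abstractly. This sketch uses hypothetical Python stand-ins for circuit models rather than VHDL-AMS; `resistor`, `mutant_resistor`, and `saboteur` are invented names:

```python
def resistor(v, r=100.0):
    """Nominal component model: current through an ideal resistor."""
    return v / r

def mutant_resistor(v, r=100.0):
    """Mutant fault: the component model itself is replaced (here, a
    defect that halves conduction)."""
    return v / (2.0 * r)

def saboteur(signal, offset=0.5):
    """Saboteur fault: an extra element inserted on the signal path that
    perturbs the value travelling between otherwise-unmodified models."""
    return signal + offset

i_nominal = resistor(5.0)               # fault-free behavior
i_mutant = mutant_resistor(5.0)         # fault injected inside the model
i_sabotaged = resistor(saboteur(5.0))   # fault injected on the interconnect
```

The distinction is where the fault lives: a mutant changes a model's body, a saboteur sits between unmodified models.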
Aircraft/Air Traffic Management Functional Analysis Model: Technical Description. 2.0
NASA Technical Reports Server (NTRS)
Etheridge, Melvin; Plugge, Joana; Retina, Nusrat
1998-01-01
The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a technical description of FAM 2.0 and its computer files to enable the modeler and programmer to make enhancements or modifications to the model. Those interested in a guide for using the model in analysis should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Users Manual.
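FAM 2.0 itself is not reproduced here, but the discrete event simulation pattern such models rest on (a clock plus a time-ordered event queue) can be sketched generically; the scenario and names below are invented:

```python
import heapq

class DiscreteEventSimulator:
    """Minimal event-queue engine of the kind discrete event models build on.
    Illustrative only; not the actual FAM 2.0 design."""
    def __init__(self):
        self.clock = 0.0
        self._queue = []   # heap of (time, seq, action)
        self._seq = 0      # tie-breaker keeps scheduling deterministic

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.clock + delay, self._seq, action))
        self._seq += 1

    def run(self):
        # Pop events in time order; each action may schedule further events.
        while self._queue:
            self.clock, _, action = heapq.heappop(self._queue)
            action(self)

# Toy scenario: two aircraft hand-offs logged in simulated-time order.
log = []
sim = DiscreteEventSimulator()
sim.schedule(5.0, lambda s: log.append(("AC1 handoff", s.clock)))
sim.schedule(2.0, lambda s: log.append(("AC2 handoff", s.clock)))
sim.run()
```

Events execute in timestamp order regardless of scheduling order, which is the defining property of the technique.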
DOE Office of Scientific and Technical Information (OSTI.GOV)
Syring, R.P.; Grubb, R.L.
1979-09-30
This document reports on the following: (1) experimental determination of the response of 16 basic structural elements and 7 B-52 components to simulated nuclear overpressure environments (utilizing Sandia Corporation's Thunderpipe Shock Tube), (2) analysis of these test specimens utilizing the NOVA-2 computer program, and (3) correlation of test and analysis results.
VHDL simulation with access to transistor models
NASA Technical Reports Server (NTRS)
Gibson, J.
1991-01-01
Hardware description languages such as VHDL have evolved to aid in the design of systems with large numbers of elements and a wide range of electronic and logical abstractions. For high performance circuits, behavioral models may not be able to efficiently include enough detail to give designers confidence in a simulation's accuracy. One option is to provide a link between the VHDL environment and a transistor level simulation environment. The coupling of the Vantage Analysis Systems VHDL simulator and the NOVA simulator provides the combination of VHDL modeling and transistor modeling.
Hybrid water immersion simulation of manual IVA performance in weightlessness
NASA Technical Reports Server (NTRS)
Loats, H. L., Jr.; Mattingly, G. S.
1971-01-01
A description is given of the development, testing, and analysis of a manual simulator. The simulator was developed to test mass handling and translation by a test subject under weightless conditions. The system is a hybrid simulator combining water immersion with mechanical ("Peter Pan") suspension. The concept operates on the equivalence principle, with the subject and the cargo remaining quasi-stationary. Movement is effected through a moving device controlled by force inputs from the subject, and motion response is determined by computing the inertial movement under such conditions.
1979-02-02
R2 = 1.8 nmi (10,940 ft). An analysis of a CAS employing range and range rate indicated that the form of the equation used in ANTC-117 was valid ...interrogations per second. Preliminary analysis of flight data indicated the system is capable of tracking successfully through garbled situations...ATC simulation, Monte Carlo simulation of 12 mid-airs, and analysis of ARTS III data for ATC interaction. The results of the effort point to the need
Lemonade's the Name, Simulation's the Game.
ERIC Educational Resources Information Center
Friel, Susan
1983-01-01
Provides a detailed description of Lemonade, a business game designed to introduce elementary and secondary students to the basics of business; i.e., problem solving strategies, hypothesis formulation and testing, trend analysis, prediction, comparative analysis, and effects of such factors as advertising and climatic conditions on sales and…
Tjiam, Irene M; Schout, Barbara M A; Hendrikx, Ad J M; Scherpbier, Albert J J M; Witjes, J Alfred; van Merriënboer, Jeroen J G
2012-01-01
Most studies of simulator-based surgical skills training have focused on the acquisition of psychomotor skills, but surgical procedures are complex tasks requiring both psychomotor and cognitive skills. As skills training is modelled on expert performance consisting partly of unconscious automatic processes that experts are not always able to explicate, simulator developers should collaborate with educational experts and physicians in developing efficient and effective training programmes. This article presents an approach to designing simulator-based skill training comprising cognitive task analysis integrated with instructional design according to the four-component instructional design (4C/ID) model. This theory-driven approach is illustrated by a description of how it was used in the development of simulator-based training for the nephrostomy procedure.
An application of sedimentation simulation in Tahe oilfield
NASA Astrophysics Data System (ADS)
Tingting, He; Lei, Zhao; Xin, Tan; Dongxu, He
2017-12-01
A braided river delta developed in the Triassic lower oil formation in block 9 of the Tahe oilfield, but its sedimentation evolution process is unclear. Using sedimentation simulation technology, the sedimentation process and distribution of the braided river delta are studied on the basis of geological parameters including sequence stratigraphic division, the initial sedimentation environment, relative lake-level and accommodation change, source supply, and sediment transport pattern. The simulation shows that the error between simulated and actual strata thickness is small, and the single-well analysis of the simulation is highly consistent with the actual analysis, which demonstrates that the model is reliable. The study area records a braided river delta retrogradation process, which provides a favorable basis for fine reservoir description and prediction.
NASA Astrophysics Data System (ADS)
Vogler, Marcel; Horiuchi, Michio; Bessler, Wolfgang G.
A detailed computational model of a direct-flame solid oxide fuel cell (DFFC) is presented. The DFFC is based on a fuel-rich methane-air flame stabilized on a flat-flame burner and coupled to a solid oxide fuel cell (SOFC). The model consists of an elementary kinetic description of the premixed methane-air flame, a stagnation-point flow description of the coupled heat and mass transport within the gas phase, and an elementary kinetic description of the electrochemistry, as well as heat, mass, and charge transport within the SOFC. Simulated current-voltage characteristics show excellent agreement with experimental data published earlier (Kronemayer et al., 2007 [10]). The model-based analysis of loss processes reveals that ohmic resistance in the current-collection wires dominates the polarization losses, while electronic leakage currents in the mixed-conducting electrolyte have only a minor influence on the polarized cell. The model was used to propose an optimized cell design; based on this analysis, power densities above 200 mW cm^-2 can be expected.
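The dominant-ohmic-loss finding admits a back-of-the-envelope check. Assuming a fully linearized polarization curve V(i) = OCV - i*R (a drastic simplification of the paper's detailed model, with invented parameter values):

```python
def cell_voltage(i, ocv=1.0, r_ohm=1.0):
    """Linearized polarization curve V(i) = OCV - i*R, with R dominated by
    the current-collection wiring per the loss analysis above. Values are
    illustrative (V, ohm cm^2, A/cm^2), not fitted model parameters."""
    return ocv - i * r_ohm

def peak_power_density(ocv=1.0, r_ohm=1.0):
    """For a linear V(i), the power density i*V(i) peaks at i = OCV/(2R)."""
    i_opt = ocv / (2.0 * r_ohm)
    return i_opt * cell_voltage(i_opt, ocv, r_ohm)
```

With these toy numbers the peak is 0.25 W/cm^2, i.e. 250 mW cm^-2 — the same order as the >200 mW cm^-2 the authors project, though the coincidence is an artifact of the chosen values.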
Description of the GMAO OSSE for Weather Analysis Software Package: Version 3
NASA Technical Reports Server (NTRS)
Koster, Randal D. (Editor); Errico, Ronald M.; Prive, Nikki C.; Carvalho, David; Sienkiewicz, Meta; El Akkraoui, Amal; Guo, Jing; Todling, Ricardo; McCarty, Will; Putman, William M.;
2017-01-01
The Global Modeling and Assimilation Office (GMAO) at the NASA Goddard Space Flight Center has developed software and products for conducting observing system simulation experiments (OSSEs) for weather analysis applications. Such applications include estimations of potential effects of new observing instruments or data assimilation techniques on improving weather analysis and forecasts. The GMAO software creates simulated observations from nature run (NR) data sets and adds simulated errors to those observations. The algorithms employed are much more sophisticated, adding a much greater degree of realism, compared with OSSE systems currently available elsewhere. The algorithms employed, software designs, and validation procedures are described in this document. Instructions for using the software are also provided.
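The core OSSE step — sampling a nature run and adding simulated observation error — can be sketched minimally. The bias-plus-Gaussian error model below is a placeholder; the GMAO software uses much more realistic, correlated error algorithms:

```python
import random

def simulate_observations(nature_run, bias=0.1, sigma=0.5, seed=0):
    """Simplified OSSE step: sample the nature run at observation points and
    add simulated instrument error (bias + Gaussian noise). Real OSSE error
    models are far more sophisticated (correlated, flow-dependent, etc.)."""
    rng = random.Random(seed)
    return [truth + bias + rng.gauss(0.0, sigma) for truth in nature_run]

truth = [280.0, 281.5, 279.8]   # e.g. temperatures from a nature-run field, K
obs = simulate_observations(truth)
```

An assimilation system fed these observations can then be scored against the known nature-run truth, which is the point of an OSSE.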
Effective Management Selection: The Analysis of Behavior by Simulation Techniques.
ERIC Educational Resources Information Center
Jaffee, Cabot L.
This book presents a system by which feedback might be generated and used as a basis for organizational change. The major areas covered consist of the development of a rationale for the use of simulation in the selection of supervisors, a description of actual techniques, and a method for training individuals in the use of the material. The…
NASA Astrophysics Data System (ADS)
Pond, Mark J.; Errington, Jeffrey R.; Truskett, Thomas M.
2011-09-01
Partial pair-correlation functions of colloidal suspensions with continuous polydispersity can be challenging to characterize from optical microscopy or computer simulation data due to inadequate sampling. As a result, it is common to adopt an effective one-component description of the structure that ignores the differences between particle types. Unfortunately, whether this kind of simplified description preserves or averages out information important for understanding the behavior of the fluid depends on the degree of polydispersity and can be difficult to assess, especially when the corresponding multicomponent description of the pair correlations is unavailable for comparison. Here, we present a computer simulation study that examines the implications of adopting an effective one-component structural description of a polydisperse fluid. The square-well model that we investigate mimics key aspects of the experimental behavior of suspended colloids with short-range, polymer-mediated attractions. To characterize the partial pair-correlation functions and thermodynamic excess entropy of this system, we introduce a Monte Carlo sampling strategy appropriate for fluids with a large number of pseudo-components. The data from our simulations at high particle concentrations, as well as exact theoretical results for dilute systems, show how qualitatively different trends between structural order and particle attractions emerge from the multicomponent and effective one-component treatments, even with systems characterized by moderate polydispersity. We examine consequences of these differences for excess-entropy based scalings of shear viscosity, and we discuss how use of the multicomponent treatment reveals similarities between the corresponding dynamic scaling behaviors of attractive colloids and liquid water that the effective one-component analysis does not capture.
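The effective one-component description discussed above amounts to concentration-weighting the partial pair correlations. A minimal sketch of that averaging, with toy g_ij arrays rather than the paper's square-well data:

```python
def effective_gr(partial_gr, mole_fractions):
    """Collapse partial pair correlations g_ij(r) into the effective
    one-component g(r) = sum_ij x_i x_j g_ij(r). Illustrative of the
    averaging the paper warns about, not their exact estimator."""
    n = len(mole_fractions)
    npts = len(next(iter(partial_gr.values())))
    g = [0.0] * npts
    for i in range(n):
        for j in range(n):
            gij = partial_gr[(i, j)]
            for k in range(npts):
                g[k] += mole_fractions[i] * mole_fractions[j] * gij[k]
    return g

# Two pseudo-components with different like-pair correlations:
partials = {(0, 0): [0.0, 2.0, 1.0], (1, 1): [0.0, 0.5, 1.0],
            (0, 1): [0.0, 1.0, 1.0], (1, 0): [0.0, 1.0, 1.0]}
g_eff = effective_gr(partials, [0.5, 0.5])
```

The strong like-pair structure of component 0 and the weak structure of component 1 are blended into one intermediate curve — exactly the information loss the multicomponent treatment avoids.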
A program code generator for multiphysics biological simulation using markup languages.
Amano, Akira; Kawabata, Masanari; Yamashita, Yoshiharu; Rusty Punzalan, Florencio; Shimayoshi, Takao; Kuwabara, Hiroaki; Kunieda, Yoshitoshi
2012-01-01
To cope with the complexity of biological function simulation models, model representation in a description language is becoming popular. However, the simulation software itself becomes complex in these environments, making it difficult to modify the simulation conditions, the target computational resources, or the calculation methods. Complex biological function simulation software involves (1) model equations, (2) boundary conditions, and (3) calculation schemes. A model description file helps with the first point and partly with the second, but the third is difficult to handle because simulation models built from two or more elementary models require a variety of calculation schemes. We introduce a simulation software generation system that uses a language-based description of the coupling calculation scheme together with a cell model description file. With this software, we can easily generate biological simulation code with a variety of coupling calculation schemes. To show the efficiency of our system, an example coupling calculation scheme with three elementary models is presented.
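A "coupling calculation scheme" for elementary models can be illustrated with the simplest case: a sequential (Gauss-Seidel-style) update of two coupled ODE models. This is a generic sketch, not output of the authors' generator:

```python
def coupled_step(a, b, f_a, f_b, dt):
    """One sequential coupling step: advance model A using B's current value,
    then advance model B using A's already-updated value. A generic sketch of
    one possible coupling calculation scheme."""
    a_new = a + dt * f_a(a, b)
    b_new = b + dt * f_b(a_new, b)
    return a_new, b_new

# Toy pair of coupled relaxation models: da/dt = b - a, db/dt = a - b.
a, b = 1.0, 0.0
for _ in range(1000):
    a, b = coupled_step(a, b, lambda a, b: b - a, lambda a, b: a - b, 0.01)
```

Other schemes (fully parallel Jacobi updates, sub-stepped or implicit couplings) slot into the same interface, which is why generating the scheme from a description is attractive.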
Analysis of Waves in Space Plasma (WISP) near field simulation and experiment
NASA Technical Reports Server (NTRS)
Richie, James E.
1992-01-01
The WISP payload, scheduled for a 1995 Space Transportation System (shuttle) flight, will include a large power transmitter operating over a wide range of frequencies. The levels of electromagnetic interference/electromagnetic compatibility (EMI/EMC) must be addressed to ensure the safety of the shuttle crew. This report is concerned with the simulation and experimental verification of EMI/EMC for the WISP payload in the shuttle cargo bay. The simulations have been carried out using the method of moments for both thin wires and patches to simulate closed solids. Data obtained from simulation are compared with experimental results, and the accuracy of the modeling approach is investigated. The report begins with a description of the WISP experiment, followed by a description of the model used to simulate the cargo bay. The results of the simulation are compared with experimental data on the input impedance of the WISP antenna with the cargo bay present. The methods used to verify the accuracy of the model are discussed to illustrate appropriate ways of obtaining this information. Finally, suggestions for future work are provided.
Atmospheric turbulence simulation for Shuttle orbiter
NASA Technical Reports Server (NTRS)
Tatom, F. B.; Smith, S. R.
1979-01-01
An improved non-recursive model for atmospheric turbulence along the flight path of the Shuttle Orbiter is developed which provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model, the time series for both gusts and gust gradients are generated and stored on a series of magnetic tapes. Section 2 describes the various technical considerations associated with the turbulence simulation model, including the digital filter simulation model, the von Karman spectra with finite upper limits, and the final non-recursive turbulence simulation model used to generate the time series. Section 3 describes the time series as currently recorded on magnetic tape. Conclusions and recommendations are presented in Section 4.
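The non-recursive (digital-filter) idea is that each gust sample is a finite weighted sum of white-noise draws. A toy version with an arbitrary low-pass stencil, not the report's von Karman-matched coefficients:

```python
import random

def generate_gusts(n, weights, sigma=1.0, seed=7):
    """Non-recursive (moving-average) gust generation: each output sample is
    a weighted sum of independent Gaussian draws, the core of a digital-filter
    turbulence model. The weights here are an arbitrary smoothing stencil,
    not coefficients fitted to a von Karman spectrum."""
    rng = random.Random(seed)
    m = len(weights)
    noise = [rng.gauss(0.0, sigma) for _ in range(n + m - 1)]
    return [sum(w * noise[i + k] for k, w in enumerate(weights))
            for i in range(n)]

gusts = generate_gusts(500, weights=[0.25, 0.5, 0.25])
mean = sum(gusts) / len(gusts)
```

Choosing the weights to match a target spectrum (here, von Karman with a finite upper limit) is what turns this generic filter into the turbulence model the report describes.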
2010-09-01
analysis process is to categorize the goal according to Gagné's (2005) domains of learning. These domains are: verbal information, intellectual...to terrain features. The ability to provide a clear verbal description of a unique feature is a learned task that may be separate from the...and experts differently. The process of verbally encoding information on location and providing this description may detract from the primary task of
NASA Technical Reports Server (NTRS)
Lunsford, Myrtis Leigh
1998-01-01
The Army-NASA Virtual Innovations Laboratory (ANVIL) was recently created to provide virtual reality tools for performing human engineering and operations analysis for both NASA and the Army. The author's summer research project consisted of developing and refining these tools for NASA's Reusable Launch Vehicle (RLV) program. Several general simulations were developed for the ANVIL's evaluation of the X34 engine changeout procedure. These simulations were developed with the software tool dVISE 4.0.0 produced by Division Inc. All software was run on an SGI Indigo2 High Impact. This paper describes the simulations, various problems encountered with them, other summer activities, and possible future work. We begin with a brief description of virtual reality systems.
User's manual for the Composite HTGR Analysis Program (CHAP-1)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, J.S.; Secker, P.A. Jr.; Vigil, J.C.
1977-03-01
CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework.
Space Ultrareliable Modular Computer (SUMC) instruction simulator
NASA Technical Reports Server (NTRS)
Curran, R. T.
1972-01-01
The design principles, description, functional operation, and recommended expansion and enhancements are presented for the Space Ultrareliable Modular Computer interpretive simulator. Included as appendices are the user's manual, program module descriptions, target instruction descriptions, simulator source program listing, and a sample program printout. In discussing the design and operation of the simulator, the key problems involving host computer independence and target computer architectural scope are brought into focus.
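An interpretive instruction simulator of the kind described follows a fetch-decode-execute loop over target instructions. A generic toy with invented opcodes, not the SUMC instruction set:

```python
def run_program(program, memory):
    """Minimal interpretive instruction simulator: fetch, decode, execute.
    A generic illustration of the technique; the SUMC simulator's target
    instruction set and architecture are far larger."""
    acc, pc = 0, 0                     # single accumulator, program counter
    while pc < len(program):
        op, arg = program[pc]          # fetch
        pc += 1
        if op == "LOAD":               # decode + execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            break
        else:
            raise ValueError(f"unknown opcode {op!r}")
    return memory

mem = {0: 2, 1: 3, 2: 0}
run_program([("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)], mem)
```

Host-computer independence, one of the design problems the report highlights, corresponds here to keeping the target semantics entirely inside the interpreter loop rather than leaning on host arithmetic quirks.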
Solar energy system economic evaluation: Fern Tunkhannock, Tunkhannock, Pennsylvania
NASA Astrophysics Data System (ADS)
1980-09-01
The economic performance of an Operational Test Site (OTS) is described. The long term economic performance of the system at its installation site and extrapolation to four additional selected locations to demonstrate the viability of the design over a broad range of environmental and economic conditions is reported. Topics discussed are: system description, study approach, economic analysis and system optimization, and technical and economical results of analysis. Data for the economic analysis are generated through evaluation of the OTS. The simulation is based on the technical results of the seasonal report simulation. In addition localized and standard economic parameters are used for economic analysis.
Solar energy system economic evaluation: Fern Tunkhannock, Tunkhannock, Pennsylvania
NASA Technical Reports Server (NTRS)
1980-01-01
The economic performance of an Operational Test Site (OTS) is described. The long term economic performance of the system at its installation site and extrapolation to four additional selected locations to demonstrate the viability of the design over a broad range of environmental and economic conditions is reported. Topics discussed are: system description, study approach, economic analysis and system optimization, and technical and economical results of analysis. Data for the economic analysis are generated through evaluation of the OTS. The simulation is based on the technical results of the seasonal report simulation. In addition localized and standard economic parameters are used for economic analysis.
Machine learning of fault characteristics from rocket engine simulation data
NASA Technical Reports Server (NTRS)
Ke, Min; Ali, Moonis
1990-01-01
Transformation of data into knowledge through conceptual induction has been the focus of our research described in this paper. We have developed a Machine Learning System (MLS) to analyze rocket engine simulation data. MLS provides its users with fault analyses, fault characteristics, conceptual descriptions of faults, and the relationships among attributes and sensors. All of these results are critically important in identifying faults.
Modeling and simulation of biological systems using SPICE language
Lallement, Christophe; Haiech, Jacques
2017-01-01
The article deals with BB-SPICE (SPICE for Biochemical and Biological Systems), an extension of the well-known Simulation Program with Integrated Circuit Emphasis (SPICE). The BB-SPICE environment is composed of three modules: a new textual, compact description formalism for biological systems; a converter that processes this description and generates the SPICE netlist of the equivalent electronic circuit; and NGSPICE, an open-source SPICE simulator. In addition, the environment provides back-and-forth interfaces with SBML (Systems Biology Markup Language), a very common description language used in systems biology. BB-SPICE has been developed to bridge the gap between the simulation of biological systems on the one hand and electronic circuits on the other. It is thus suitable for applications at the interface between both domains, such as the development of design tools for synthetic biology and the virtual prototyping of biosensors and labs-on-chip. Simulation results obtained with BB-SPICE and COPASI (an open-source package for the simulation of biochemical systems) have been compared on a benchmark of models commonly used in systems biology. The results agree quantitatively, but BB-SPICE outperforms COPASI by one to three orders of magnitude in computation time. Moreover, as our software is based on NGSPICE, it can benefit from upcoming updates such as the GPU implementation, from coupling with powerful analysis and verification tools, and from integration into design automation tools for synthetic biology. PMID:28787027
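The electrical analogy this kind of tool builds on maps concentrations to node voltages and reaction fluxes to currents, so a first-order reaction A -> B decays like an RC circuit with time constant 1/k. A conceptual sketch with explicit Euler integration and invented parameters, not actual BB-SPICE output:

```python
import math

def reaction_trace(c0=1.0, k=2.0, dt=1e-3, t_end=1.0):
    """First-order reaction A -> B integrated by explicit Euler. Under the
    electrical analogy, [A] maps to a node voltage and the flux k*[A] to a
    discharge current, so the decay constant is 1/k, as for an RC circuit.
    Conceptual sketch only; BB-SPICE works at the netlist level."""
    a, t = c0, 0.0
    while t < t_end:
        a -= dt * k * a   # d[A]/dt = -k [A]
        t += dt
    return a

a_end = reaction_trace()
expected = math.exp(-2.0)   # analytic c0 * exp(-k t) at t = 1
```

The numerical trace matches the analytic exponential closely, which is the behavior a SPICE engine reproduces once the reaction is expressed as an equivalent circuit.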
Visualizing human communication in business process simulations
NASA Astrophysics Data System (ADS)
Groehn, Matti; Jalkanen, Janne; Haho, Paeivi; Nieminen, Marko; Smeds, Riitta
1999-03-01
In this paper a description of business process simulation is given. A crucial part of simulating business processes is the analysis of social contacts between the participants. We introduce a tool for collecting log data and show how this log data can be effectively analyzed using two different kinds of methods: discussion flow charts and self-organizing maps. Discussion flow charts reveal the communication patterns, and self-organizing maps are a very effective way of clustering the participants into development groups.
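A self-organizing map reduced to its winner-take-all core (no neighborhood function, so effectively online k-means) shows how participants could be clustered from log-derived feature vectors. The features, data, and parameters here are invented, not the authors' tool:

```python
def train_som(data, n_units=2, epochs=50, lr=0.5):
    """Tiny one-dimensional self-organizing map, stripped to the
    winner-take-all update (no neighborhood function, so effectively online
    k-means), for clustering participants by communication-log features."""
    units = [list(data[i]) for i in range(n_units)]   # deterministic init
    for _ in range(epochs):
        for x in data:
            # Find the best-matching unit by squared Euclidean distance.
            w = min(range(n_units),
                    key=lambda u: sum((a - b) ** 2 for a, b in zip(units[u], x)))
            # Move the winner toward the sample.
            units[w] = [a + lr * (b - a) for a, b in zip(units[w], x)]
    return units

# Invented feature vectors per participant: (messages sent, messages received).
logs = [(1.0, 1.0), (1.2, 0.9), (8.0, 7.5), (7.8, 8.1)]
units = train_som(logs)
```

The two units settle near the two activity clusters, splitting quiet and talkative participants into separate groups.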
Vermeulen, Joeri; Beeckman, Katrien; Turcksin, Rivka; Van Winkel, Lies; Gucciardo, Léonardo; Laubach, Monika; Peersman, Wim; Swinnen, Eva
2017-06-01
Simulation training is a powerful and evidence-based teaching method in healthcare. It allows students to develop essential competences that are often difficult to achieve during internships. High-Fidelity Perinatal Simulation exposes them to real-life scenarios in a safe environment. Although student midwives' experiences need to be considered to make the simulation training work, these have been overlooked so far. To explore the experiences of last-year student midwives with High-Fidelity Perinatal Simulation training. A qualitative descriptive study, using three focus group conversations with last-year student midwives (n=24). Audio tapes were transcribed and a thematic content analysis was performed. The entire data set was coded according to recurrent or common themes. To achieve investigator triangulation and confirm themes, discussions among the researchers were incorporated in the analysis. Students found High-Fidelity Perinatal Simulation training to be a positive learning method that increased both their competence and confidence. Their experiences varied over the different phases of the High-Fidelity Perinatal Simulation training. Although uncertainty, tension, confusion and disappointment were experienced throughout the simulation trajectory, they reported that this did not affect their learning and confidence-building. As High-Fidelity Perinatal Simulation training constitutes a helpful learning experience in midwifery education, it could have a positive influence on maternal and neonatal outcomes. In the long term, it could therefore enhance the midwifery profession in several ways. The present study is an important first step in opening up the debate about the pedagogical use of High-Fidelity Perinatal Simulation training within midwifery education. Copyright © 2017 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.
Battista, Alexis
2017-01-01
The dominant frameworks for describing how simulations support learning emphasize increasing access to structured practice and the provision of feedback which are commonly associated with skills-based simulations. By contrast, studies examining student participants' experiences during scenario-based simulations suggest that learning may also occur through participation. However, studies directly examining student participation during scenario-based simulations are limited. This study examined the types of activities student participants engaged in during scenario-based simulations and then analyzed their patterns of activity to consider how participation may support learning. Drawing from Engeström's first-, second-, and third-generation activity systems analysis, an in-depth descriptive analysis was conducted. The study drew from multiple qualitative methods, namely narrative, video, and activity systems analysis, to examine student participants' activities and interaction patterns across four video-recorded simulations depicting common motivations for using scenario-based simulations (e.g., communication, critical patient management). The activity systems analysis revealed that student participants' activities encompassed three clinically relevant categories, including (a) use of physical clinical tools and artifacts, (b) social interactions, and (c) performance of structured interventions. Role assignment influenced participants' activities and the complexity of their engagement. Importantly, participants made sense of the clinical situation presented in the scenario by reflexively linking these three activities together. Specifically, student participants performed structured interventions, relying upon the use of physical tools, clinical artifacts, and social interactions together with interactions between students, standardized patients, and other simulated participants to achieve their goals. 
When multiple student participants were present, such as in a team-based scenario, they distributed the workload to achieve their goals. The findings suggest that student participants learned as they engaged in these scenario-based simulations when they worked to make sense of the patient's clinical presentation. The findings may provide insight into how student participants' meaning-making efforts are mediated by the cultural artifacts (e.g., physical clinical tools) they access, the social interactions they engage in, the structured interventions they perform, and the roles they are assigned. The findings also highlight the complex and emergent properties of scenario-based simulations as well as how activities are nested. Implications for learning, instructional design, and assessment are discussed.
Space Trajectories Error Analysis (STEAP) Programs. Volume 1: Analytic manual, update
NASA Technical Reports Server (NTRS)
1971-01-01
Manual revisions are presented for the modified and expanded STEAP series. The STEAP 2 is composed of three independent but related programs: NOMAL for the generation of n-body nominal trajectories performing a number of deterministic guidance events; ERRAN for the linear error analysis and generalized covariance analysis along specific targeted trajectories; and SIMUL for testing the mathematical models used in the navigation and guidance process. The analytic manual provides general problem description, formulation, and solution and the detailed analysis of subroutines. The programmers' manual gives descriptions of the overall structure of the programs as well as the computational flow and analysis of the individual subroutines. The user's manual provides information on the input and output quantities of the programs. These are updates to N69-36472 and N69-36473.
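The covariance propagation at the heart of ERRAN-style linear error analysis is P' = F P F^T + Q, applied step by step along the nominal trajectory. A generic sketch with a toy constant-velocity state, not STEAP's actual dynamics:

```python
def propagate_covariance(P, F, Q):
    """One step of linear covariance analysis along a nominal trajectory:
    P' = F P F^T + Q, with F the state transition matrix and Q the process
    noise. Generic sketch; STEAP's state and noise models are far richer."""
    n = len(P)
    FP = [[sum(F[i][k] * P[k][j] for k in range(n)) for j in range(n)]
          for i in range(n)]
    FPFt = [[sum(FP[i][k] * F[j][k] for k in range(n)) for j in range(n)]
            for i in range(n)]
    return [[FPFt[i][j] + Q[i][j] for j in range(n)] for i in range(n)]

# 1-D constant-velocity example: state = [position, velocity], dt = 1.
F = [[1.0, 1.0], [0.0, 1.0]]
P = [[1.0, 0.0], [0.0, 1.0]]   # initial state covariance
Q = [[0.0, 0.0], [0.0, 0.1]]   # process noise on velocity
P1 = propagate_covariance(P, F, Q)
```

Iterating this update between measurements, and contracting P at each observation, is what a generalized covariance analysis along a targeted trajectory amounts to.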
A Descriptive Guide to Trade Space Analysis
2015-09-01
The guide uses response surface equations (RSEs) as surrogate models, applying Monte Carlo simulation to the RSEs to quantitatively explore changes across the response surfaces.
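The trade-space workflow described (a response surface equation as a surrogate model, swept with Monte Carlo sampling) can be sketched as follows; the quadratic coefficients and design-variable ranges are illustrative assumptions, not values from the guide.

```python
import random

# Hypothetical second-order response surface equation (RSE) fit to a
# designed experiment: a cost response over two normalized design variables.
# The coefficients are invented for illustration.
def rse_cost(x1, x2):
    return 10.0 + 2.0 * x1 - 1.5 * x2 + 0.5 * x1 * x2 + 0.8 * x1 ** 2

def monte_carlo_explore(n_samples=10000, seed=42):
    """Sample the design space uniformly and summarize the surrogate's response."""
    rng = random.Random(seed)
    costs = []
    for _ in range(n_samples):
        x1 = rng.uniform(-1.0, 1.0)   # normalized design variable 1
        x2 = rng.uniform(-1.0, 1.0)   # normalized design variable 2
        costs.append(rse_cost(x1, x2))
    costs.sort()
    return {
        "mean": sum(costs) / len(costs),
        "p05": costs[int(0.05 * len(costs))],   # 5th percentile of cost
        "p95": costs[int(0.95 * len(costs))],   # 95th percentile of cost
    }

stats = monte_carlo_explore()
```

Because the RSE is cheap to evaluate, thousands of Monte Carlo sweeps like this are feasible where the underlying high-fidelity model would not be.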
Snowmass Computing Frontier: Computing for the Cosmic Frontier, Astrophysics, and Cosmology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connolly, A.; Habib, S.; Szalay, A.
2013-11-12
This document presents (off-line) computing requirements and challenges for Cosmic Frontier science, covering the areas of data management, analysis, and simulations. We invite contributions to extend the range of covered topics and to enhance the current descriptions.
SIMREL: Software for Coefficient Alpha and Its Confidence Intervals with Monte Carlo Studies
ERIC Educational Resources Information Center
Yurdugul, Halil
2009-01-01
This article describes SIMREL, a software program designed for the simulation of alpha coefficients and the estimation of their confidence intervals. SIMREL runs in one of two modes. In the first, SIMREL is run on a single data file and performs descriptive statistics, principal components analysis, and variance analysis of the item scores…
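The quantity SIMREL simulates can be illustrated with a minimal sketch: computing coefficient alpha from a respondents-by-items score matrix and attaching a percentile-bootstrap confidence interval (one of several interval methods such a program might implement; the data and function names here are hypothetical, not SIMREL's API).

```python
import random

def cronbach_alpha(scores):
    """Coefficient alpha for a respondents-by-items score matrix."""
    k = len(scores[0])                      # number of items
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1.0 - sum(item_vars) / total_var)

def bootstrap_ci(scores, n_boot=1000, seed=1):
    """Percentile-bootstrap 95% confidence interval for alpha."""
    rng = random.Random(seed)
    alphas = []
    for _ in range(n_boot):
        sample = [rng.choice(scores) for _ in scores]  # resample respondents
        alphas.append(cronbach_alpha(sample))
    alphas.sort()
    return alphas[int(0.025 * n_boot)], alphas[int(0.975 * n_boot)]

# Illustrative data: 60 respondents x 5 items driven by one latent trait.
rng = random.Random(0)
data = []
for _ in range(60):
    trait = rng.gauss(0.0, 1.0)
    data.append([trait + rng.gauss(0.0, 0.5) for _ in range(5)])
alpha = cronbach_alpha(data)
low, high = bootstrap_ci(data)
```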
NASA Astrophysics Data System (ADS)
Soti, G.; Wauters, F.; Breitenfeldt, M.; Finlay, P.; Kraev, I. S.; Knecht, A.; Porobić, T.; Zákoucký, D.; Severijns, N.
2013-11-01
Geant4 simulations play a crucial role in the analysis and interpretation of experiments providing low energy precision tests of the Standard Model. This paper focuses on the accuracy of the description of the electron processes in the energy range between 100 and 1000 keV. The effect of the different simulation parameters and multiple scattering models on the backscattering coefficients is investigated. Simulations of the response of HPGe and passivated implanted planar Si detectors to β particles are compared to experimental results. An overall good agreement is found between Geant4 simulations and experimental data.
NASA Astrophysics Data System (ADS)
Donà, G.; Faletra, M.
2015-09-01
This paper presents the TT&C performance simulator toolkit developed internally at Thales Alenia Space Italia (TAS-I) to support the design of TT&C subsystems for space exploration and scientific satellites. The simulator has a modular architecture and has been designed with a model-based approach using standard engineering tools such as MATLAB/SIMULINK and mission analysis tools (e.g. STK). The simulator is easily reconfigurable to fit different types of satellites, different mission requirements and different scenario parameters. This paper provides a brief description of the simulator architecture together with two examples of applications used to demonstrate some of the simulator’s capabilities.
An analysis of simulated and observed storm characteristics
NASA Astrophysics Data System (ADS)
Benestad, R. E.
2010-09-01
A calculus-based cyclone identification (CCI) method has been applied to the most recent re-analysis (ERAINT) from the European Centre for Medium-range Weather Forecasts and to results from regional climate model (RCM) simulations. The storm frequency for events with central pressure below threshold values of 960-990 hPa was examined, and the gradient wind from the simulated storm systems was compared with corresponding estimates from the re-analysis. The analysis also yielded estimates of the spatial extent of the storm systems, which were included in the regional climate model cyclone evaluation. A comparison is presented between a number of RCMs and the ERAINT re-analysis in terms of their description of the gradient winds, number of cyclones, and spatial extent. Furthermore, a comparison between geostrophic winds estimated through triangles of interpolated or station measurements of SLP is presented. Wind still represents one of the more challenging variables to model realistically.
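The triangle-based geostrophic wind estimate mentioned above can be sketched by fitting a plane to sea-level pressure at three stations and applying the geostrophic relation; the air density and Coriolis parameter below are illustrative assumptions.

```python
RHO = 1.25        # air density (kg m^-3), illustrative
F = 1.2e-4        # Coriolis parameter at high latitude (s^-1), illustrative

def geostrophic_wind(stations):
    """Estimate the geostrophic wind from sea-level pressure at three
    stations by fitting a plane p(x, y) = a + b*x + c*y (x, y in metres,
    p in Pa) and applying u_g = -(1/(rho*f)) dp/dy, v_g = (1/(rho*f)) dp/dx."""
    (x1, y1, p1), (x2, y2, p2), (x3, y3, p3) = stations
    # Solve for the plane gradient (b, c) from pressure differences.
    det = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
    b = ((p2 - p1) * (y3 - y1) - (p3 - p1) * (y2 - y1)) / det
    c = ((x2 - x1) * (p3 - p1) - (x3 - x1) * (p2 - p1)) / det
    u_g = -c / (RHO * F)
    v_g = b / (RHO * F)
    return u_g, v_g

# Example: pressure rising 1 hPa per 100 km eastward gives a northward wind.
u_g, v_g = geostrophic_wind(
    [(0.0, 0.0, 100000.0), (100000.0, 0.0, 100100.0), (0.0, 100000.0, 100000.0)]
)
```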
Bucknall, Tracey K; Forbes, Helen; Phillips, Nicole M; Hewitt, Nicky A; Cooper, Simon; Bogossian, Fiona
2016-10-01
The aim of this study was to examine the decision-making of nursing students during team-based simulations on patient deterioration to determine the sources of information, the types of decisions made and the influences underpinning their decisions. Missed, misinterpreted or mismanaged physiological signs of deterioration in hospitalized patients lead to costly serious adverse events. Not surprisingly, an increased focus on clinical education and graduate nurse work readiness has resulted. A descriptive exploratory design was used. Clinical simulation laboratories in three Australian universities were used to run team-based simulations with a patient actor. A convenience sample of 97 final-year nursing students completed simulations, with three students forming a team. Four teams from each university were randomly selected for detailed analysis. Cued-recall interviews during video review of the team-based simulation exercises, eliciting descriptions of individual and team-based decision-making and reflections on performance, were audio-recorded post-simulation (2012) and transcribed. Students recalled 11 types of decisions, including: information seeking; patient assessment; diagnostic; intervention/treatment; evaluation; escalation; prediction; planning; collaboration; communication and reflective. Patient distress, uncertainty and a lack of knowledge were frequently recalled influences on decisions. Incomplete information, premature diagnosis and a failure to consider alternatives when caring for patients are likely to lead to poor quality decisions. All health professionals have a responsibility in recognizing and responding to clinical deterioration within their scope of practice. A typology of nursing students' decision-making in teams, in this context, highlights the importance of individual knowledge, leadership and communication. © 2016 John Wiley & Sons Ltd.
Lui, Justin T; Hoy, Monica Y
2017-06-01
Background The increasing prevalence of virtual reality simulation in temporal bone surgery warrants an investigation to assess training effectiveness. Objectives To determine if temporal bone simulator use improves mastoidectomy performance. Data Sources Ovid Medline, Embase, and PubMed databases were systematically searched per the PRISMA guidelines. Review Methods Inclusion criteria were peer-reviewed publications that utilized quantitative data of mastoidectomy performance following the use of a temporal bone simulator. The search was restricted to human studies published in English. Studies were excluded if they were in non-peer-reviewed format, were descriptive in nature, or failed to provide surgical performance outcomes. Meta-analysis calculations were then performed. Results A meta-analysis based on the random-effects model revealed an improvement in overall mastoidectomy performance following training on the temporal bone simulator. A standardized mean difference of 0.87 (95% CI, 0.38-1.35) was generated in the setting of a heterogeneous study population (I² = 64.3%, P < .006). Conclusion In the context of a diverse population of virtual reality simulation temporal bone surgery studies, meta-analysis calculations demonstrate an improvement in trainee mastoidectomy performance with virtual simulation training.
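The pooled standardized mean difference, confidence interval, and I² reported above are the outputs of a standard random-effects calculation (DerSimonian-Laird), which can be sketched as follows; the per-study effects and variances below are made-up inputs, not the review's data.

```python
import math

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling of standardized mean
    differences. Returns (pooled SMD, 95% CI, I^2 in percent)."""
    k = len(effects)
    w = [1.0 / v for v in variances]                     # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                   # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]       # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    i2 = max(0.0, (q - (k - 1)) / q) * 100.0 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# Hypothetical four-study input: per-study SMDs and their variances.
pooled, ci, i2 = random_effects_pool([0.5, 1.2, 0.9, 0.4], [0.04, 0.09, 0.05, 0.06])
```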
Simulation of CIFF (Centralized IFF) remote control displays
NASA Astrophysics Data System (ADS)
Tucker, D. L.; Leibowitz, L. M.
1986-06-01
This report presents the software simulation of the Remote Control Display (RCD) proposed for use in the Centralized IFF (CIFF) system. A description of the simulation programs along with simulated menu formats is presented. A sample listing of the simulation programs and a brief description of the program operation are also included.
Efficient generation of connectivity in neuronal networks from simulator-independent descriptions
Djurfeldt, Mikael; Davison, Andrew P.; Eppler, Jochen M.
2014-01-01
Simulator-independent descriptions of connectivity in neuronal networks promise greater ease of model sharing, improved reproducibility of simulation results, and reduced programming effort for computational neuroscientists. However, until now, enabling the use of such descriptions in a given simulator in a computationally efficient way has entailed considerable work for simulator developers, which must be repeated for each new connectivity-generating library that is developed. We have developed a generic connection generator interface that provides a standard way to connect a connectivity-generating library to a simulator, such that one library can easily be replaced by another, according to the modeler's needs. We have used the connection generator interface to connect C++ and Python implementations of the previously described connection-set algebra to the NEST simulator. We also demonstrate how the simulator-independent modeling framework PyNN can transparently take advantage of this, passing a connection description through to the simulator layer for rapid processing in C++ where a simulator supports the connection generator interface, and falling back to slower iteration in Python otherwise. A set of benchmarks demonstrates the good performance of the interface. PMID:24795620
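The core idea of the interface (a connectivity library that yields connections through a standard call, and a simulator that consumes any such library interchangeably) can be sketched in miniature; the class and method names below are illustrative, not the actual NEST or connection-set algebra API.

```python
class OneToOneGenerator:
    """Connectivity-library side: generates a one-to-one mapping.
    Any other library exposing .connections() could be swapped in."""
    def __init__(self, n, weight=1.0):
        self.n, self.weight = n, weight
    def connections(self):
        # Yield (source, target, weight) tuples, one per connection.
        for i in range(self.n):
            yield (i, i, self.weight)

class ToySimulator:
    """Simulator side: builds its internal synapse table from any
    generator implementing the interface, without knowing which
    connectivity library produced it."""
    def __init__(self):
        self.synapses = []
    def connect(self, generator):
        for src, tgt, w in generator.connections():
            self.synapses.append((src, tgt, w))

sim = ToySimulator()
sim.connect(OneToOneGenerator(4, weight=0.5))
```

The benefit claimed in the abstract comes from letting the simulator iterate the generator in its fast native layer rather than looping in Python.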
Lunar drill and test apparatus
NASA Technical Reports Server (NTRS)
Norrington, David W.; Ardoin, Didier C.; Alexander, Stephen G.; Rowland, Philip N.; Vastakis, Frank N.; Linsey, Steven L.
1988-01-01
The design of an experimental lunar drill and a facility to test the drill under simulated lunar conditions is described. The drill utilizes a polycrystalline diamond compact drag bit and an auger to mechanically remove cuttings from the hole. The drill will be tested in a vacuum chamber and powered through a vacuum seal by a drive mechanism located above the chamber. A general description of the design is provided followed by a detailed description and analysis of each component. Recommendations for the further development of the design are included.
Proceedings of the 3rd Annual Conference on Aerospace Computational Control, volume 1
NASA Technical Reports Server (NTRS)
Bernard, Douglas E. (Editor); Man, Guy K. (Editor)
1989-01-01
Conference topics included definition of tool requirements, advanced multibody component representation descriptions, model reduction, parallel computation, real time simulation, control design and analysis software, user interface issues, testing and verification, and applications to spacecraft, robotics, and aircraft.
Pilot/vehicle model analysis of visually guided flight
NASA Technical Reports Server (NTRS)
Zacharias, Greg L.
1991-01-01
Information is given in graphical and outline form on a pilot/vehicle model description, control of altitude with simple terrain clues, simulated flight with visual scene delays, model-based in-cockpit display design, and some thoughts on the role of pilot/vehicle modeling.
Biosafety Level 3 Recon Training
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickens, Brian Scott; Chavez, Melanie Ann; Heimer, Donovan J.
The Biosafety Level 3 Recon training is a 3D virtual tool developed for the Counter WMD Analysis Cell (CWAC) and the Asymmetric Warfare Group (AWG) by the Application Modeling and Development Team within the NEN-3 International Threat Reduction Group. The training simulates a situation where friendly forces have secured from hostile forces a suspected bioweapons development laboratory. The trainee is a squad member tasked to investigate the facility, locate laboratories within the facility, and identify hazards to entrants and the surrounding area. Before beginning the 3D simulation, the trainee must select the appropriate MOPP level for entering the facility. The items in the simulation, both inside and outside the bioweapon facility, are items commonly used by scientists in Biosafety Level (BSL) laboratories. Each item has clickable red tags that, when activated, give the trainee a brief description of the item and a controllable turn-around view. The descriptions also contain information about potential hazards the item can present. Trainees must find all tagged items in order to complete the simulation, but can also reference descriptions and turn-around views of the items in a glossary menu. The training is intended to familiarize individuals who have little or no biology or chemistry background with technical equipment used in BSL laboratories. The revised edition of this simulation (Biosafety Level 3 Virtual Lab) changes the trainee into an investigator instead of a military combatant. Many doors now require a virtual badge swipe to open. Airlock doors may come in sets such that the open door must be closed before the next door in the set can be opened. A user interface was added so that the instructor can edit the information about the items (the brief descriptions mentioned above) using the simulation software instead of the previous method of manually entering the material in XML settings files.
Facility labels, such as "No Parking" and "Men's room", were changed from Korean into English. No other changes were made.
An analysis of the 70-meter antenna hydrostatic bearing by means of computer simulation
NASA Technical Reports Server (NTRS)
Bartos, R. D.
1993-01-01
Recently, the computer program 'A Computer Solution for Hydrostatic Bearings with Variable Film Thickness,' used to design the hydrostatic bearing of the 70-meter antennas, was modified to improve the accuracy with which it predicts the film height profile and oil pressure distribution between the hydrostatic bearing pad and the runner. This article presents a description of the modified computer program, the theory upon which its computations are based, and computer simulation results together with a discussion of those results.
NASA Technical Reports Server (NTRS)
Davidson, Frederic M.; Sun, Xiaoli; Field, Christopher T.
1994-01-01
This interim report consists of two reports: 'Space Radiation Effects on Si APDs for GLAS' and 'Computer Simulation of Avalanche Photodiode and Preamplifier Output for Laser Altimeters.' The former contains a detailed description of our proton radiation test of Si APD's performed at the Brookhaven National Laboratory. The latter documents the computer program subroutines which were written for the upgrade of NASA's GLAS simulator.
P.L. Tedder; R.N. La Mont; J.C. Kincaid
1987-01-01
TRIM (Timber Resource Inventory Model) is a yield table projection system developed for timber supply projections and policy analysis. TRIM simulates timber growth, inventories, management and area changes, and removals over the projection period. Programs in the TRIM system, card-by-card descriptions of required inputs, table formats, and sample results are presented...
Money-center structures in dynamic banking systems
NASA Astrophysics Data System (ADS)
Li, Shouwei; Zhang, Minghui
2016-10-01
In this paper, we propose a dynamic model for banking systems based on the description of balance sheets. It generates some features identified through empirical analysis. Through simulation analysis of the model, we find that banking systems have the feature of money-center structures, that bank asset distributions are power-law distributions, and that contract size distributions are log-normal distributions.
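A toy version of such a balance-sheet-driven dynamic can be sketched: new interbank contracts attach to lenders in proportion to their current assets, a mechanism that concentrates links on a few money-center banks; all parameters below are illustrative assumptions, not the paper's model.

```python
import random

def simulate_interbank(n_banks=50, n_contracts=400, seed=7):
    """Toy dynamic banking system: each new contract selects a lender with
    probability proportional to current assets (preferential attachment),
    with log-normally distributed contract sizes."""
    rng = random.Random(seed)
    assets = [1.0] * n_banks      # initial balance sheets
    degree = [0] * n_banks        # number of contracts per bank
    for _ in range(n_contracts):
        # Roulette-wheel selection: richer banks attract more contracts.
        total = sum(assets)
        r = rng.uniform(0.0, total)
        lender, acc = 0, 0.0
        for i, a in enumerate(assets):
            acc += a
            if r <= acc:
                lender = i
                break
        size = rng.lognormvariate(0.0, 1.0)   # log-normal contract size
        assets[lender] += size
        degree[lender] += 1
    return assets, degree

assets, degree = simulate_interbank()
```

Inspecting `degree` shows a few heavily connected banks emerging, the qualitative signature of a money-center structure.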
Zhan, Ping; Tian, Honglei; Zhang, Xiaoming; Wang, Liping
2013-03-15
Changes in the aroma characteristics of mutton process flavors (MPFs) prepared from sheep bone protein hydrolysates (SBPHs) with different degrees of hydrolysis (DH) were evaluated using gas chromatography-mass spectrometry (GC-MS), gas chromatography-olfactometry (GC-O), and descriptive sensory analysis (DSA). Five attributes (muttony, meaty, roasted, mouthful, and simulate) were selected to assess MPFs. The results of DSA showed a distinct difference among the control sample MPF0 and other MPF samples with added SBPHs for different DHs of almost all sensory attributes. MPF5 (DH 25.92%) was the strongest in the muttony, meaty, and roasted attributes, whereas MPF6 (DH 30.89%) was the strongest in the simulate and roasted attributes. Thirty-six compounds were identified as odor-active compounds for the evaluation of the sensory characteristics of MPFs via GC-MS-O analysis. The results of correlation analysis among odor-active compounds, molecular weight, and DSA further confirmed that the SBPH with a DH range of 25.92-30.89% may be a desirable precursor for the sensory characteristics of MPF. Copyright © 2013 Elsevier B.V. All rights reserved.
Pharmacist. Occupational Simulation Kit.
ERIC Educational Resources Information Center
Parsley, Nancy
This career exploration instructional booklet on the pharmacist's occupation is one of several resulting from the rural southwestern Colorado CEPAC Project (Career Education Process of Attitude Change). Based on a job analysis and utilizing a programed instructional format, the following content is included: A brief description of two real…
NASA Technical Reports Server (NTRS)
Andrisani, D., II; Daughaday, H.; Dittenhauser, J.; Rynaski, E.
1978-01-01
The aerodynamics, control system, instrumentation complement, and recording system of the USAF Total In-Flight Simulator (TIFS) airplane are described. A control system that would allow the ailerons to be operated collectively as well as differentially, to enhance the ability of the vehicle to perform the dual function of maneuver load control and gust alleviation, is emphasized. Mathematical prediction of the rigid body and the flexible equations of longitudinal motion using the level 2.01 FLEXSTAB program is included, along with a definition of the vehicle geometry, the mass and stiffness distribution, the calculated mode frequencies and mode shapes, and the resulting aerodynamic equations of motion of the flexible vehicle. A complete description of the control and instrumentation system of the aircraft is presented, including analysis, ground test, and flight data comparisons of the performance and bandwidth of the aerodynamic surface servos. Proposed modifications for improved performance of the servos are also presented.
NASA Technical Reports Server (NTRS)
Walker, Carrie K.
1991-01-01
A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.
Teaching Workflow Analysis and Lean Thinking via Simulation: A Formative Evaluation
Campbell, Robert James; Gantt, Laura; Congdon, Tamara
2009-01-01
This article presents the rationale for the design and development of a video simulation used to teach lean thinking and workflow analysis to health services and health information management students enrolled in a course on the management of health information. The discussion includes a description of the design process, a brief history of the use of simulation in healthcare, and an explanation of how video simulation can be used to generate experiential learning environments. Based on the results of a survey given to 75 students as part of a formative evaluation, the video simulation was judged effective because it allowed students to visualize a real-world process (concrete experience), contemplate the scenes depicted in the video along with the concepts presented in class in a risk-free environment (reflection), develop hypotheses about why problems occurred in the workflow process (abstract conceptualization), and develop solutions to redesign a selected process (active experimentation). PMID:19412533
LES, DNS and RANS for the analysis of high-speed turbulent reacting flows
NASA Technical Reports Server (NTRS)
Adumitroaie, V.; Colucci, P. J.; Taulbee, D. B.; Givi, P.
1995-01-01
The purpose of this research is to continue our efforts in advancing the state of knowledge in large eddy simulation (LES), direct numerical simulation (DNS), and Reynolds averaged Navier Stokes (RANS) methods for the computational analysis of high-speed reacting turbulent flows. In the second phase of this work, covering the period 1 Aug. 1994 - 31 Jul. 1995, we have focused our efforts on two programs: (1) developments of explicit algebraic moment closures for statistical descriptions of compressible reacting flows and (2) development of Monte Carlo numerical methods for LES of chemically reacting flows.
Tools for 3D scientific visualization in computational aerodynamics
NASA Technical Reports Server (NTRS)
Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val
1989-01-01
The purpose is to describe the tools and techniques in use at the NASA Ames Research Center for performing visualization of computational aerodynamics, for example, visualization of flow fields from computer simulations of fluid dynamics about vehicles such as the Space Shuttle. The hardware used for visualization is a high-performance graphics workstation connected to a supercomputer with a high-speed channel. At present, the workstation is a Silicon Graphics IRIS 3130, the supercomputer is a CRAY2, and the high-speed channel is a hyperchannel. The three techniques used for visualization are post-processing, tracking, and steering. Post-processing analysis is done after the simulation. Tracking analysis is done during a simulation but is not interactive, whereas steering analysis involves modifying the simulation interactively during the simulation. Using post-processing methods, a flow simulation is executed on a supercomputer and, after the simulation is complete, the results are processed for viewing. The software in use and under development at NASA Ames Research Center for performing these types of tasks in computational aerodynamics is described. Workstation performance issues, benchmarking, and high-performance networks for this purpose are also discussed, as well as descriptions of other hardware for digital video and film recording.
NASA Technical Reports Server (NTRS)
Houck, J. A.; Markos, A. T.
1980-01-01
This paper describes the work being done at the National Aeronautics and Space Administration's (NASA) Langley Research Center on the development of a multi-media crew-training program for the Terminal Configured Vehicle (TCV) Mission Simulator. Brief descriptions of the goals and objectives of the TCV Program and of the TCV Mission Simulator are presented. A detailed description of the training program is provided along with a description of the performance of the first group of four commercial pilots to be qualified in the TCV Mission Simulator.
Ballangrud, Randi; Hall-Lord, Marie Louise; Persenius, Mona; Hedelin, Birgitta
2014-08-01
To describe intensive care nurses' perceptions of simulation-based team training for building patient safety in intensive care. Failures in team processes are found to be contributory factors to incidents in an intensive care environment. Simulation-based training is recommended as a method to make health-care personnel aware of the importance of team working and to improve their competencies. The study uses a qualitative descriptive design. Individual qualitative interviews were conducted with 18 intensive care nurses from May to December 2009, all of whom had attended a simulation-based team training programme. The interviews were analysed by qualitative content analysis. One main category emerged to illuminate the intensive care nurses' perception: "training increases awareness of clinical practice and acknowledges the importance of structured work in teams". Three generic categories were found: "realistic training contributes to safe care", "reflection and openness motivates learning" and "finding a common understanding of team performance". Simulation-based team training makes intensive care nurses more prepared to care for severely ill patients. Team training creates a common understanding of how to work in teams with regard to patient safety. Copyright © 2014 Elsevier Ltd. All rights reserved.
DOT National Transportation Integrated Search
1982-06-01
This volume provides a general description of the Airport Landside Simulation Model. A summary of simulated passenger and vehicular processing through the landside is presented. Program operating characteristics and assumptions are documented and a c...
Method for simulating discontinuous physical systems
Baty, Roy S.; Vaughn, Mark R.
2001-01-01
The mathematical foundations of conventional numerical simulation of physical systems provide no consistent description of the behavior of such systems when subjected to discontinuous physical influences. As a result, the numerical simulation of such problems requires ad hoc encoding of specific experimental results in order to address the behavior of such discontinuous physical systems. In the present invention, these foundations are replaced by a new combination of generalized function theory and nonstandard analysis. The result is a class of new approaches to the numerical simulation of physical systems which allows the accurate and well-behaved simulation of discontinuous and other difficult physical systems, as well as simpler physical systems. Applications of this new class of numerical simulation techniques to process control, robotics, and apparatus design are outlined.
Generalized simulation technique for turbojet engine system analysis
NASA Technical Reports Server (NTRS)
Seldner, K.; Mihaloew, J. R.; Blaha, R. J.
1972-01-01
A nonlinear analog simulation of a turbojet engine was developed. The purpose of the study was to establish simulation techniques applicable to propulsion system dynamics and controls research. A schematic model was derived from a physical description of a J85-13 turbojet engine. Basic conservation equations were applied to each component along with their individual performance characteristics to derive a mathematical representation. The simulation was mechanized on an analog computer. The simulation was verified in both steady-state and dynamic modes by comparing analytical results with experimental data obtained from tests performed at the Lewis Research Center with a J85-13 engine. In addition, comparison was also made with performance data obtained from the engine manufacturer. The comparisons established the validity of the simulation technique.
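The component-wise conservation approach described above can be illustrated with a minimal single-spool power balance integrated in time; the inertia and compressor-power constants are invented for this sketch and do not describe the J85-13.

```python
def simulate_spool(p_turbine, i_spool=0.01, omega0=500.0, dt=0.01, steps=2000):
    """Euler integration of a single-spool power balance,
        I * omega * d(omega)/dt = P_turbine - P_compressor,
    with a toy compressor power law P_c = k * omega**3.
    All constants are illustrative, not engine data."""
    k_comp = 2.0e-6          # compressor power coefficient (illustrative)
    omega = omega0           # spool speed (rad/s)
    for _ in range(steps):
        p_comp = k_comp * omega ** 3
        domega = (p_turbine - p_comp) / (i_spool * omega)
        omega += dt * domega
    return omega

# At steady state P_turbine = k * omega**3, i.e. omega = (P/k)**(1/3);
# for P = 2000 W and k = 2e-6 that is 1000 rad/s.
omega_ss = simulate_spool(2000.0)
```

The same pattern (a conservation equation per component plus component performance maps, marched forward in time) scales up to the full multi-component engine model described in the abstract.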
The Propulsive Small Expendable Deployer System (ProSEDS)
NASA Technical Reports Server (NTRS)
Lorenzini, Enrico C.; Cosmo, Mario L.; Estes, Robert D.; Sanmartin, Juan; Pelaez, Jesus; Ruiz, Manuel
2003-01-01
This Final Report covers the following main topics: 1) Brief Description of ProSEDS; 2) Mission Analysis; 3) Dynamics Reference Mission; 4) Dynamics Stability; 5) Deployment Control; 6) Updated System Performance; 7) Updated Mission Analysis; 8) Updated Dynamics Reference Mission; 9) Updated Deployment Control Profiles and Simulations; 10) Updated Reference Mission; 11) Evaluation of Power Delivered by the Tether; 12) Deployment Control Profile Ref. #78 and Simulations; 13) Kalman Filters for Mission Estimation; 14) Analysis/Estimation of Deployment Flight Data; 15) Comparison of ED Tethers and Electrical Thrusters; 16) Dynamics Analysis for Mission Starting at a Lower Altitude; 17) Deployment Performance at a Lower Altitude; 18) Satellite Orbit after a Tether Cut; 19) Deployment with Shorter Dyneema Tether Length; 20) Interactive Software for ED Tethers.
Space radiator simulation manual for computer code
NASA Technical Reports Server (NTRS)
Black, W. Z.; Wulff, W.
1972-01-01
A computer program that simulates the performance of a space radiator is presented. The program basically consists of a rigorous analysis, which analyzes a symmetrical fin panel, and an approximate analysis that predicts system characteristics for cases of non-symmetrical operation. The rigorous analysis accounts for both transient and steady-state performance, including aerodynamic and radiant heating of the radiator system. The approximate analysis considers only steady-state operation with no aerodynamic heating. A description of the radiator system and instructions to the user for program operation are included. The input required for the execution of all program options is described, and several examples of program output are provided, including the radiator performance during ascent, reentry, and orbit.
NASA Technical Reports Server (NTRS)
1977-01-01
A preliminary design for a helicopter/VSTOL wide angle simulator image generation display system is studied. The visual system is to become part of a simulator capability to support Army aviation systems research and development within the near term. As required for the Army to simulate a wide range of aircraft characteristics, versatility and ease of changing cockpit configurations were primary considerations of the study. Due to the Army's interest in low altitude flight and descents into and landing in constrained areas, particular emphasis is given to wide field of view, resolution, brightness, contrast, and color. The visual display study includes a preliminary design, demonstrated feasibility of advanced concepts, and a plan for subsequent detail design and development. Analysis and tradeoff considerations for various visual system elements are outlined and discussed.
SYSTID - A flexible tool for the analysis of communication systems.
NASA Technical Reports Server (NTRS)
Dawson, C. T.; Tranter, W. H.
1972-01-01
Description of the System Time Domain Simulation (SYSTID) computer-aided analysis program which is specifically structured for communication systems analysis. The SYSTID program is user oriented so that very little knowledge of computer techniques and very little programming ability are required for proper application. The program is designed so that the user can go from a system block diagram to an accurate simulation by simply programming a single English language statement for each block in the system. The mathematical and functional models available in the SYSTID library are presented. An example problem is given which illustrates the ease of modeling communication systems. Examples of the outputs available are presented, and proposed improvements are summarized.
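The one-statement-per-block idea can be sketched as a chain of block functions applied sample-by-sample in the time domain; the block names and API below are illustrative, not SYSTID's actual syntax.

```python
import math

def source(t):
    return math.sin(2 * math.pi * 5 * t)       # 5 Hz test tone (source block)

def amplifier(x):
    return 2.0 * x                              # ideal gain block

def hard_limiter(x):
    return max(-1.0, min(1.0, x))               # nonlinear channel block

def simulate(chain, dt=0.001, n=1000):
    """Run the block chain over n time samples; the first block is the
    source (a function of time), the rest transform the signal."""
    out = []
    for k in range(n):
        x = chain[0](k * dt)
        for block in chain[1:]:
            x = block(x)
        out.append(x)
    return out

# One "statement" per block: source -> amplifier -> hard limiter.
samples = simulate([source, amplifier, hard_limiter])
```

The amplified tone exceeds the limiter threshold, so the output is clipped to ±1, the kind of nonlinear time-domain behavior a communication-system simulator must capture.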
A user's guide to the Flexible Spacecraft Dynamics and Control Program
NASA Technical Reports Server (NTRS)
Fedor, J. V.
1984-01-01
A guide to the use of the Flexible Spacecraft Dynamics Program (FSD) is presented covering input requirements, control words, orbit generation, spacecraft description and simulation options, and output definition. The program can be used in dynamics and control analysis as well as in orbit support of deployment and control of spacecraft. The program is applicable to inertially oriented spinning, Earth oriented or gravity gradient stabilized spacecraft. Internal and external environmental effects can be simulated.
Optics of exciton-plasmon nanomaterials
NASA Astrophysics Data System (ADS)
Sukharev, Maxim; Nitzan, Abraham
2017-11-01
This review provides a brief introduction to the physics of coupled exciton-plasmon systems, the theoretical description and experimental manifestation of such phenomena, followed by an account of the state-of-the-art methodology for the numerical simulations of such phenomena and supplemented by a number of FORTRAN codes, by which the interested reader can introduce himself/herself to the practice of such simulations. Applications to CW light scattering as well as transient response and relaxation are described. Particular attention is given to so-called strong coupling limit, where the hybrid exciton-plasmon nature of the system response is strongly expressed. While traditional descriptions of such phenomena usually rely on analysis of the electromagnetic response of inhomogeneous dielectric environments that individually support plasmon and exciton excitations, here we explore also the consequences of a more detailed description of the molecular environment in terms of its quantum density matrix (applied in a mean field approximation level). Such a description makes it possible to account for characteristics that cannot be described by the dielectric response model: the effects of dephasing on the molecular response on one hand, and nonlinear response on the other. It also highlights the still missing important ingredients in the numerical approach, in particular its limitation to a classical description of the radiation field and its reliance on a mean field description of the many-body molecular system. We end our review with an outlook to the near future, where these limitations will be addressed and new novel applications of the numerical approach will be pursued.
NASA Technical Reports Server (NTRS)
Merrill, W. C.
1986-01-01
A hypothetical turbofan engine simplified simulation with a multivariable control and sensor failure detection, isolation, and accommodation logic (HYTESS II) is presented. The digital program, written in FORTRAN, is self-contained, efficient, realistic and easily used. Simulated engine dynamics were developed from linearized operating point models. However, essential nonlinear effects are retained. The simulation is representative of the hypothetical, low bypass ratio turbofan engine with an advanced control and failure detection logic. Included is a description of the engine dynamics, the control algorithm, and the sensor failure detection logic. Details of the simulation including block diagrams, variable descriptions, common block definitions, subroutine descriptions, and input requirements are given. Example simulation results are also presented.
LibKiSAO: a Java library for Querying KiSAO.
Zhukova, Anna; Adams, Richard; Laibe, Camille; Le Novère, Nicolas
2012-09-24
The Kinetic Simulation Algorithm Ontology (KiSAO) supplies information about existing algorithms available for the simulation of Systems Biology models, their characteristics, parameters and inter-relationships. KiSAO enables the unambiguous identification of algorithms from simulation descriptions. Information about analogous methods having similar characteristics and about algorithm parameters incorporated into KiSAO is desirable for simulation tools. To retrieve this information programmatically an application programming interface (API) for KiSAO is needed. We developed libKiSAO, a Java library to enable querying of the KiSA Ontology. It implements methods to retrieve information about simulation algorithms stored in KiSAO, their characteristics and parameters, and methods to query the algorithm hierarchy and search for similar algorithms providing comparable results for the same simulation set-up. Using libKiSAO, simulation tools can make logical inferences based on this knowledge and choose the most appropriate algorithm to perform a simulation. LibKiSAO also enables simulation tools to handle a wider range of simulation descriptions by determining which of the available methods are similar and can be used instead of the one indicated in the simulation description if that one is not implemented. LibKiSAO enables Java applications to easily access information about simulation algorithms, their characteristics and parameters stored in the OWL-encoded Kinetic Simulation Algorithm Ontology. LibKiSAO can be used by simulation description editors and simulation tools to improve reproducibility of computational simulation tasks and facilitate model re-use.
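libKiSAO itself is a Java library; the sketch below merely illustrates, in Python, the two kinds of query described above, walking the algorithm hierarchy and finding substitutable methods with matching characteristics. The term IDs and characteristic labels are simplified illustrations, not authoritative KiSAO content:

```python
# Toy sketch (not the real Java libKiSAO API) of hierarchy and similarity
# queries over a simulation-algorithm ontology. IDs and characteristics
# below are illustrative stand-ins for KiSAO content.
HIERARCHY = {  # child term -> parent term
    "KISAO:0000029": "KISAO:0000241",  # e.g. a Gillespie-type method
    "KISAO:0000027": "KISAO:0000241",  # e.g. a next-reaction method
    "KISAO:0000241": "KISAO:0000000",  # stochastic family -> root
    "KISAO:0000019": "KISAO:0000000",  # e.g. a deterministic ODE solver
}
CHARACTERISTICS = {
    "KISAO:0000029": {"stochastic", "discrete", "exact"},
    "KISAO:0000027": {"stochastic", "discrete", "exact"},
    "KISAO:0000019": {"deterministic", "continuous"},
}

def ancestors(term):
    """All ancestors of a term, nearest first."""
    out = []
    while term in HIERARCHY:
        term = HIERARCHY[term]
        out.append(term)
    return out

def similar_algorithms(term):
    """Terms covering every characteristic of `term` (substitution candidates)."""
    wanted = CHARACTERISTICS[term]
    return sorted(t for t, c in CHARACTERISTICS.items()
                  if t != term and wanted <= c)
```

A simulation tool could fall back on `similar_algorithms` when the algorithm named in a simulation description is not implemented.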
NASA Technical Reports Server (NTRS)
Evers, Ken H.; Bachert, Robert F.
1987-01-01
The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.
NASA Technical Reports Server (NTRS)
Rummler, D. R.
1976-01-01
The results of investigations applying regression techniques to the development of methodology for creep-rupture data analysis are presented. Regression analysis techniques are applied to the explicit description of the creep behavior of materials for Space Shuttle thermal protection systems. A regression analysis technique is compared with five parametric methods for analyzing three simulated and twenty real data sets, and a computer program for the evaluation of creep-rupture data is presented.
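A minimal sketch of what such a regression looks like (this is not the program described above; the power-law life model and the synthetic data are assumptions): creep-rupture life is fitted as a straight line in log-log coordinates by ordinary least squares.

```python
# Sketch of creep-rupture regression: fit log10(t_r) = a + b*log10(stress)
# by ordinary least squares on log-transformed data. Model form is an
# illustrative assumption, not the method of the report above.
import math

def fit_power_law(stresses, rupture_times):
    """Return (a, b) of the log-log linear fit."""
    xs = [math.log10(s) for s in stresses]
    ys = [math.log10(t) for t in rupture_times]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    a = ybar - b * xbar
    return a, b

def predict_rupture_time(a, b, stress):
    """Predicted rupture time at a given stress."""
    return 10 ** (a + b * math.log10(stress))
```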
Travel Agent. Occupational Simulation Kit.
ERIC Educational Resources Information Center
Peterson, Wayne
This career exploration instructional booklet on the travel agent's occupation is one of several resulting from the rural southwestern Colorado CEPAC Project (Career Education Process of Attitude Change). Based on a job analysis and utilizing a programed instructional format, the following content is included: A brief description of what a travel…
Social Worker. Occupational Simulation Kit.
ERIC Educational Resources Information Center
Brandt, Joy
This career exploration instructional booklet on the occupation of the social worker is one of several resulting from the rural southwestern Colorado CEPAC Project (Career Education Process of Attitude Change). Based on a job analysis and utilizing a programed instructional format, the following content is included: A brief description of what a…
NASA Technical Reports Server (NTRS)
Houck, J. A.
1980-01-01
This paper describes the work being done at the National Aeronautics and Space Administration's Langley Research Center on the development of a mission simulator for use in the Terminal Configured Vehicle Program. A brief description of the goals and objectives of the Terminal Configured Vehicle Program is presented. A more detailed description of the Mission Simulator, in its present configuration, and its components is provided. Finally, a description of the first research study conducted in the Mission Simulator is presented along with a discussion of some preliminary results from this study.
NASA Technical Reports Server (NTRS)
Kubat, Gregory
2016-01-01
This report provides a description and performance characterization of the large-scale, relay-architecture UAS communications simulation capability developed for the NASA GRC UAS in the NAS Project. The system uses a validated model of the GRC Gen5 CNPC flight-test radio. The report contains a description of the simulation system and its model components, recent changes made to the system to improve performance, descriptions and objectives of sample simulations used for test and verification, and a sampling of results and performance data with observations.
NASA Astrophysics Data System (ADS)
Christian, Paul M.; Wells, Randy
2001-09-01
This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks MATLAB/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost-reduction goals in development programs. Part I of this paper provides a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through detailed design, analysis, and testing. Attributes addressed include its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics covered in this part include flight vehicle models and algorithms and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this paper, to be published at a later date, will conclude with a description of how the Aerospace Toolbox is an integral part of developing embedded code directly from the simulation models by using the MathWorks Real-Time Workshop and optimization tools.
It will also address how the Toolbox can be used as a design hub for Internet based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).
A Multirater Instrument for the Assessment of Simulated Pediatric Crises
Calhoun, Aaron W; Boone, Megan; Miller, Karen H; Taulbee, Rebecca L; Montgomery, Vicki L; Boland, Kimberly
2011-01-01
Background Few validated instruments exist to measure pediatric code team skills. The goal of this study was to develop an instrument for the assessment of resuscitation competency and self-appraisal using multirater and gap analysis methodologies. Methods Multirater assessment with gap analysis is a robust methodology that enables the measurement of self-appraisal as well as competency, offering faculty the ability to provide enhanced feedback. The Team Performance during Simulated Crises Instrument (TPDSCI) was grounded in the Accreditation Council for Graduate Medical Education competencies. The instrument contains 5 competencies, each assessed by a series of descriptive rubrics. It was piloted during a series of simulation-based interdisciplinary pediatric crisis resource management education sessions. Course faculty assessed participants, who also did self-assessments. Internal consistency and interrater reliability were analyzed using Cronbach α and intraclass correlation (ICC) statistics. Gap analysis results were examined descriptively. Results Cronbach α for the instrument was between 0.69 and 0.72. The overall ICC was 0.82. ICC values for the medical knowledge, clinical skills, communication skills, and systems-based practice competencies were between 0.72 and 0.87. The ICC for the professionalism domain was 0.22. Further examination of the professionalism competency revealed a positive skew. Forty-three simulated sessions (98%) had significant gaps for at least one of the competencies, 38 sessions (86%) had gaps indicating self-overappraisal, and 15 sessions (34%) had gaps indicating self-underappraisal. Conclusions The TPDSCI possesses good measures of internal consistency and interrater reliability with respect to medical knowledge, clinical skills, communication skills, systems-based practice, and overall competence in the context of simulated interdisciplinary pediatric medical crises. Professionalism remains difficult to assess.
These results provide an encouraging first step toward instrument validation. Gap analysis reveals disparities between faculty and self-assessments that indicate inadequate participant self-reflection. Identifying self-overappraisal can facilitate focused interventions. PMID:22379528
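The gap-analysis step itself is simple arithmetic, sketched below. The rating scale, field names, and the threshold for a "significant" gap are assumptions for illustration, not taken from the TPDSCI:

```python
# Hedged sketch of multirater gap analysis: for each competency, the gap is
# the self-rating minus the faculty rating; positive gaps indicate
# self-overappraisal, negative gaps self-underappraisal. Threshold and
# competency names are illustrative assumptions.
GAP_THRESHOLD = 1  # minimum absolute difference treated as a significant gap

def gap_analysis(self_scores, faculty_scores):
    """Classify each competency's gap for one simulated session."""
    report = {}
    for competency, self_score in self_scores.items():
        gap = self_score - faculty_scores[competency]
        if gap >= GAP_THRESHOLD:
            label = "self-overappraisal"
        elif gap <= -GAP_THRESHOLD:
            label = "self-underappraisal"
        else:
            label = "concordant"
        report[competency] = (gap, label)
    return report
```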
The development of a simulation model of primary prevention strategies for coronary heart disease.
Babad, Hannah; Sanderson, Colin; Naidoo, Bhash; White, Ian; Wang, Duolao
2002-11-01
This paper describes the present state of development of a discrete-event micro-simulation model for coronary heart disease prevention. The model is intended to support health policy makers in assessing the impacts on health care resources of different primary prevention strategies. For each person, a set of times to disease events, conditional on the individual's risk factor profile, is sampled from a set of probability distributions that are derived from a new analysis of the Framingham cohort study on coronary heart disease. Methods used to model changes in behavioural and physiological risk factors are discussed and a description of the simulation logic is given. The model incorporates POST (Patient Oriented Simulation Technique) simulation routines.
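The core micro-simulation step, sampling a time to each disease event conditional on an individual's risk profile and processing the earliest event first, can be sketched as follows. The exponential distributions, base rates, and risk multiplier below are illustrative assumptions, not the Framingham-derived distributions used by the model:

```python
# Illustrative discrete-event micro-simulation step (assumed hazards, not
# Framingham-derived): sample a time to each event from an exponential
# distribution whose rate scales with the person's risk factors, then take
# the earliest event from a priority queue.
import heapq
import random

BASE_RATES = {"angina": 0.004, "mi": 0.002, "stroke": 0.001}  # events/year

def sample_event_times(risk_multiplier, rng):
    """Times to each event for one person, conditional on the risk profile."""
    return {event: rng.expovariate(rate * risk_multiplier)
            for event, rate in BASE_RATES.items()}

def first_event(risk_multiplier, rng):
    """Return (time, event) of the earliest sampled event."""
    times = sample_event_times(risk_multiplier, rng)
    heap = [(t, event) for event, t in times.items()]
    heapq.heapify(heap)
    return heapq.heappop(heap)

rng = random.Random(42)
t, event = first_event(risk_multiplier=2.0, rng=rng)
```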
Domain of validity of the perturbative approach to femtosecond optical spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gelin, Maxim F.; Rao, B. Jayachander; Nest, Mathias
2013-12-14
We have performed numerical nonperturbative simulations of transient absorption pump-probe responses for a series of molecular model systems. The resulting signals as a function of the laser field strength and the pump-probe delay time are compared with those obtained in the perturbative response function formalism. The simulations and their theoretical analysis indicate that the perturbative description remains valid up to moderately strong laser pulses, corresponding to a rather substantial depopulation (population) of the initial (final) electronic states.
Quantitative description of charge-carrier transport in a white organic light-emitting diode
NASA Astrophysics Data System (ADS)
Schober, M.; Anderson, M.; Thomschke, M.; Widmer, J.; Furno, M.; Scholz, R.; Lüssem, B.; Leo, K.
2011-10-01
We present a simulation model for the analysis of charge-carrier transport in organic thin-film devices, and apply it to a three-color white hybrid organic light-emitting diode (OLED) with fluorescent blue and phosphorescent red and green emission. We simulate a series of single-carrier devices, which reconstruct the OLED layer sequence step by step. Thereby, we determine the energy profiles for hole and electron transport, show how to discern bulk from interface limitation, and identify trap states.
Inelastic vibrational bulk and surface losses of swift electrons in ionic nanostructures
NASA Astrophysics Data System (ADS)
Hohenester, Ulrich; Trügler, Andreas; Batson, Philip E.; Lagos, Maureen J.
2018-04-01
In a recent paper [Lagos et al., Nature (London) 543, 533 (2017), 10.1038/nature21699] we used electron energy loss spectroscopy with sub-10 meV energy and atomic spatial resolution to map optical and acoustic, bulk and surface vibrational modes in magnesium oxide nanocubes. We found that a local dielectric description works well for the simulation of aloof geometries, similar to related work for surface plasmons and surface plasmon polaritons, whereas for intersecting geometries such a description fails to reproduce the rich spectral features associated with excitation of bulk acoustic and optical phonons. To account for scattering with finite momentum exchange, in this paper we perform molecular and lattice dynamics simulations of bulk losses in magnesium oxide nanocubes using a rigid-ion description and compute the loss spectra for intersecting electron beams. From our analysis we evaluate the capability of electron energy loss spectroscopy for the investigation of phonon modes at the nanoscale, and we discuss shortcomings of our simplified approach as well as directions for future investigations.
Virtual reality based surgery simulation for endoscopic gynaecology.
Székely, G; Bajka, M; Brechbühler, C; Dual, J; Enzler, R; Haller, U; Hug, J; Hutter, R; Ironmonger, N; Kauer, M; Meier, V; Niederer, P; Rhomberg, A; Schmid, P; Schweitzer, G; Thaler, M; Vuskovic, V; Tröster, G
1999-01-01
Virtual reality (VR) based surgical simulator systems offer very elegant possibilities to both enrich and enhance traditional education in endoscopic surgery. However, while a wide range of VR simulator systems have been proposed and realized in the past few years, most of these systems are far from able to provide a reasonably realistic surgical environment. We explore the basic approaches taken to date, the current limits of realism, and ways to extend those limits, based on a description and analysis of the most important components of a VR-based endoscopic simulator. The feasibility of the proposed techniques is demonstrated on a first modular prototype system implementing the basic algorithms for VR training in gynaecologic laparoscopy.
Aircraft/Air Traffic Management Functional Analysis Model. Version 2.0; User's Guide
NASA Technical Reports Server (NTRS)
Etheridge, Melvin; Plugge, Joana; Retina, Nusrat
1998-01-01
The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a guide for using the model in analysis. Those interested in making enhancements or modifications to the model should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Technical Description.
"I got it on Ebay!": cost-effective approach to surgical skills laboratories.
Schneider, Ethan; Schenarts, Paul J; Shostrom, Valerie; Schenarts, Kimberly D; Evans, Charity H
2017-01-01
Surgical education is witnessing a surge in the use of simulation. However, implementation of simulation is often cost-prohibitive. Online shopping offers a low budget alternative. The aim of this study was to implement cost-effective skills laboratories and analyze online versus manufacturers' prices to evaluate for savings. Four skills laboratories were designed for the surgery clerkship from July 2014 to June 2015. Skills laboratories were implemented using hand-built simulation and instruments purchased online. Trademarked simulation was priced online and instruments priced from a manufacturer. Costs were compiled, and a descriptive cost analysis of online and manufacturers' prices was performed. Learners rated their level of satisfaction for all educational activities, and levels of satisfaction were compared. A total of 119 third-year medical students participated. Supply lists and costs were compiled for each laboratory. A descriptive cost analysis of online and manufacturers' prices showed online prices were substantially lower than manufacturers, with a per laboratory savings of: $1779.26 (suturing), $1752.52 (chest tube), $2448.52 (anastomosis), and $1891.64 (laparoscopic), resulting in a year 1 savings of $47,285. Mean student satisfaction scores for the skills laboratories were 4.32, with statistical significance compared to live lectures at 2.96 (P < 0.05) and small group activities at 3.67 (P < 0.05). A cost-effective approach for implementation of skills laboratories showed substantial savings. By using hand-built simulation boxes and online resources to purchase surgical equipment, surgical educators overcome financial obstacles limiting the use of simulation and provide learning opportunities that medical students perceive as beneficial. Copyright © 2016 Elsevier Inc. All rights reserved.
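The per-laboratory figures above sum as follows (a simple reconstruction of the descriptive cost analysis; the reported year-1 total of $47,285 presumably also reflects how many times each laboratory was run during the year, which is why it is not this single-cycle sum):

```python
# Reconstruction of the per-laboratory savings arithmetic reported above.
# The dollar figures are taken from the abstract; everything else is a
# trivial illustrative computation.
LAB_SAVINGS = {
    "suturing": 1779.26,
    "chest tube": 1752.52,
    "anastomosis": 2448.52,
    "laparoscopic": 1891.64,
}

def savings_per_cycle(lab_savings):
    """Savings (USD) from running each skills laboratory once."""
    return round(sum(lab_savings.values()), 2)
```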
ERIC Educational Resources Information Center
Showanasai, Parinya; Lu, Jiafang; Hallinger, Philip
2013-01-01
Purpose: The extant literature on school leadership development is dominated by conceptual analysis, descriptive studies of current practice, critiques of current practice, and prescriptions for better ways to approach practice. Relatively few studies have examined impact of leadership development using experimental methods, among which even fewer…
NOVA HIGH SCHOOL--DESCRIPTION OF TENTH-GRADE SOCIAL STUDIES COURSE.
ERIC Educational Resources Information Center
COGSWELL, JOHN F.
SYSTEMS ANALYSIS AND COMPUTER SIMULATION TECHNIQUES WERE APPLIED IN A STUDY OF INNOVATION FOR A 10TH-GRADE SOCIAL STUDIES COURSE. THE COURSE CONTENT WAS AMERICAN HISTORY, WHICH WAS DIVIDED INTO 10 CONTENT AREAS SUCH AS COLONIAL, REVOLUTIONARY, AND CONSTITUTIONAL AMERICA. THE ACTIVITIES OF THE COURSE INCLUDED TEAM TEACHING, LECTURES, MEDIA…
Karlsen, Marte-Marie Wallander; Gabrielsen, Anita Kristin; Falch, Anne Lise; Stubberud, Dag-Gunnar
2017-10-01
The aim of this study was to explore intensive care nursing students' experiences with confirming communication skills training in a simulation-based environment. The study has a qualitative, exploratory, and descriptive design. The participants were students in a post-graduate program in intensive care nursing who had attended a one-day confirming communication course. Three focus group interviews lasting between 60 and 80 minutes were conducted with 14 participants. The interviews were transcribed verbatim. Thematic analysis was performed using Braun & Clarke's seven steps. The analysis resulted in three main themes: "awareness", "ice-breaker", and "challenging learning environment". The participants felt that it was a challenge to see themselves on the video recordings afterwards; however, receiving feedback resulted in better self-confidence in mastering complex communication. The main finding of the study is that the students reported improved communication skills after the confirming communication course. However, it is uncertain how these skills can be transferred to clinical practice to improve patient outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.
The Chemistry of Shocked High-energy Materials: Connecting Atomistic Simulations to Experiments
NASA Astrophysics Data System (ADS)
Islam, Md Mahbubul; Strachan, Alejandro
2017-06-01
A comprehensive atomistic-level understanding of the physics and chemistry of shocked high-energy (HE) materials is crucial for designing safe and efficient explosives. Advances in ultrafast spectroscopy and laser shock experiments have enabled the study of shock-induced chemistry at extreme conditions occurring on picosecond timescales. Despite this progress, experiments are not without limitations and do not enable a direct characterization of chemical reactions. At the same time, large-scale reactive molecular dynamics (MD) simulations are capable of providing a description of the shock-induced chemistry, but the uncertainties resulting from the use of approximate descriptions of atomistic interactions remain poorly quantified. We use ReaxFF MD simulations to investigate the shock- and temperature-induced chemical decomposition mechanisms of polyvinyl nitrate, RDX, and nitromethane. The effect of various shock pressures on reaction initiation mechanisms is investigated for all three materials. We performed spectral analysis of atomistic velocities at different shock pressures to enable direct comparison with experiments. The simulations predict volume-increasing reactions at the shock-to-detonation transitions, and the shock vs. particle velocity data are in good agreement with available experimental data. The ReaxFF MD simulations validated against experiments enabled prediction of reaction kinetics of shocked materials and interpretation of experimental spectroscopy data via assignment of the spectral peaks to various reaction pathways at extreme conditions.
Analysis of multimode fiber bundles for endoscopic spectral-domain optical coherence tomography
Risi, Matthew D.; Makhlouf, Houssine; Rouse, Andrew R.; Gmitro, Arthur F.
2016-01-01
A theoretical analysis of the use of a fiber bundle in spectral-domain optical coherence tomography (OCT) systems is presented. The fiber bundle enables a flexible endoscopic design and provides fast, parallelized acquisition of the OCT data. However, the multimode characteristic of the fibers in the fiber bundle affects the depth sensitivity of the imaging system. A description of light interference in a multimode fiber is presented along with numerical simulations and experimental studies to illustrate the theoretical analysis. PMID:25967012
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weber, E.R.
1983-09-01
The appendixes for the Saguaro Power Plant include the following: receiver configuration selection report; operating modes and transitions; failure modes analysis; control system analysis; computer codes and simulation models; procurement package scope descriptions; responsibility matrix; solar system flow diagram component purpose list; thermal storage component and system test plans; solar steam generator tube-to-tubesheet weld analysis; pipeline listing; management control schedule; and system list and definitions.
Contact dynamics recording and analysis system using an optical fiber sensor approach
NASA Astrophysics Data System (ADS)
Anghel, F.; Pavelescu, D.; Grattan, K. T. V.; Palmer, A. W.
1997-09-01
A contact dynamics recording and analysis system based on an optical fiber sensor has been developed, designed in particular for the accurate, time-resolved description of the moving contact during electrical arc breaking, in an experimental platform simulating the operation of a vacuum circuit breaker. The system combines dynamic displacement measurement and data recording with post-process data analysis to reveal the dynamic speed and acceleration of the equipment.
Ares-I-X Vehicle Preliminary Range Safety Malfunction Turn Analysis
NASA Technical Reports Server (NTRS)
Beaty, James R.; Starr, Brett R.; Gowan, John W., Jr.
2008-01-01
Ares-I-X is the designation given to the flight test version of the Ares-I rocket (also known as the Crew Launch Vehicle, CLV) being developed by NASA. As part of the preliminary flight plan approval process for the test vehicle, a range safety malfunction turn analysis was performed to support the launch area risk assessment and vehicle destruct criteria development processes. Several vehicle failure scenarios were identified which could cause the vehicle trajectory to deviate from its normal flight path, and the effects of these failures were evaluated with an Ares-I-X six-degree-of-freedom (6-DOF) digital simulation, using the Program to Optimize Simulated Trajectories Version 2 (POST2) simulation framework. The Ares-I-X simulation analysis provides output files containing vehicle state information, which are used by other risk assessment and vehicle debris trajectory simulation tools to determine the risk to personnel and facilities in the vicinity of the launch area at Kennedy Space Center (KSC), and to develop the vehicle destruct criteria used by the flight test range safety officer. The simulation analysis approach used for this study is described, including the failure modes considered and the underlying assumptions and ground rules of the study. Preliminary results, determined by analyzing the trajectory deviations of the failure cases relative to the expected vehicle trajectory, are presented.
Chivukula, V; Mousel, J; Lu, J; Vigmostad, S
2014-12-01
The current research presents a novel method in which blood particulates - biconcave red blood cells (RBCs) and spherical cells are modeled using isogeometric analysis, specifically Non-Uniform Rational B-Splines (NURBS) in 3-D. The use of NURBS ensures that even with a coarse representation, the geometry of the blood particulates maintains an accurate description when subjected to large deformations. The fundamental advantage of this method is the coupling of the geometrical description and the stress analysis of the cell membrane into a single, unified framework. Details on the modeling approach, implementation of boundary conditions and the membrane mechanics analysis using isogeometric modeling are presented, along with validation cases for spherical and biconcave cells. Using NURBS - based isogeometric analysis, the behavior of individual cells in fluid flow is presented and analyzed in different flow regimes using as few as 176 elements for a spherical cell and 220 elements for a biconcave RBC. This work provides a framework for modeling a large number of 3-D deformable biological cells, each with its own geometric description and membrane properties. To the best knowledge of the authors, this is the first application of the NURBS - based isogeometric analysis to model and simulate blood particulates in flow in 3D. Copyright © 2014 John Wiley & Sons, Ltd.
A simulation model for wind energy storage systems. Volume 3: Program descriptions
NASA Technical Reports Server (NTRS)
Warren, A. W.; Edsinger, R. W.; Burroughs, J. D.
1977-01-01
Program descriptions, flow charts, and program listings for the SIMWEST model generation program, the simulation program, the file maintenance program, and the printer plotter program are given. For Vol 2, see .
Equation-oriented specification of neural models for simulations
Stimberg, Marcel; Goodman, Dan F. M.; Benichoux, Victor; Brette, Romain
2013-01-01
Simulating biological neuronal networks is a core method of research in computational neuroscience. A full specification of such a network model includes a description of the dynamics and state changes of neurons and synapses, as well as the synaptic connectivity patterns and the initial values of all parameters. A standard approach in neuronal modeling software is to build network models based on a library of pre-defined components and mechanisms; if a model component does not yet exist, it has to be defined in a special-purpose or general low-level language and potentially be compiled and linked with the simulator. Here we propose an alternative approach that allows flexible definition of models by writing textual descriptions based on mathematical notation. We demonstrate that this approach allows the definition of a wide range of models with minimal syntax. Furthermore, such explicit model descriptions allow the generation of executable code for various target languages and devices, since the description is not tied to an implementation. Finally, this approach also has advantages for readability and reproducibility, because the model description is fully explicit, and because it can be automatically parsed and transformed into formatted descriptions. The presented approach has been implemented in the Brian2 simulator. PMID:24550820
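A toy sketch of the equation-oriented approach (far simpler than Brian2's actual machinery, and not its API): the model is a textual differential equation, and the "simulator" parses it and generates the update step, rather than requiring a pre-compiled component:

```python
# Toy equation-oriented simulator: build an Euler update step directly from
# a textual 'dX/dt = expr' model description. Illustrative sketch only; the
# real Brian2 code-generation pipeline is far more sophisticated.
def make_stepper(equation, params):
    """Parse 'dX/dt = expr' and return (state_var_name, euler_step)."""
    lhs, rhs = (side.strip() for side in equation.split("="))
    state_var = lhs[1:lhs.index("/")]  # 'dv/dt' -> 'v'
    code = compile(rhs, "<model>", "eval")
    def step(state, dt):
        # eval of a user-supplied expression: fine for a sketch, not for
        # untrusted input.
        derivative = eval(code, {}, {**params, state_var: state})
        return state + dt * derivative
    return state_var, step

var, step = make_stepper("dv/dt = (I - v)/tau", {"I": 1.0, "tau": 10.0})
v = 0.0
for _ in range(1000):
    v = step(v, 0.01)  # integrate 10 time units; v relaxes toward I
```

Because the model is plain text, the same description could equally be translated into code for other target languages, which is the point of the approach.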
Genome Scale Modeling in Systems Biology: Algorithms and Resources
Najafi, Ali; Bidkhori, Gholamreza; Bozorgmehr, Joseph H.; Koch, Ina; Masoudi-Nejad, Ali
2014-01-01
In recent years, in silico studies and trial simulations have complemented experimental procedures. A model is a description of a system, and a system is any collection of interrelated objects; an object, moreover, is some elemental unit upon which observations can be made but whose internal structure either does not exist or is ignored. Therefore, any network analysis approach is critical for successful quantitative modeling of biological systems. This review highlights some of the most popular and important modeling algorithms, tools, and emerging standards for representing, simulating, and analyzing cellular networks, in five sections. We also illustrate these concepts by means of simple examples and appropriate images and graphs. Overall, systems biology aims for a holistic description and understanding of biological processes by integrating analytical experimental approaches with synthetic computational models. In fact, biological networks have been developed as a platform for integrating information from high- to low-throughput experiments for the analysis of biological systems. We provide an overview of all processes used in modeling and simulating biological networks in such a way that they can become easily understandable for researchers with both biological and mathematical backgrounds. Consequently, given the complexity of generated experimental data and cellular networks, it is no surprise that researchers have turned to computer simulation and the development of more theory-based approaches to augment and assist in the development of a fully quantitative understanding of cellular dynamics. PMID:24822031
Transition to an aging Japan: public pension, savings, and capital taxation.
Kato, R
1998-09-01
This study examined options for compensating for the shortage of public pension funds caused by population aging in Japan: increases in pension contributions, consumption pension taxes, interest-income pension taxes, and inheritance pension taxes. The analysis relied on simulation in an expanded life-cycle growth model. Data were obtained from the 1992 population estimates of the Institute of Population Problems of the Ministry of Health and Welfare. This study is unique in its use of real population data for the simulations and in its use of transition states. The analysis begins with a description of the modified overlapping generations model of Auerbach and Kotlikoff (1983). The model accounts for lifetime uncertainty, liquidity constraints, and ordinary budget constraints; reproduces the consumption-savings profiles of older people; and incorporates wage income taxation and other forms of taxation. Income includes wage and interest income. The analysis includes a description of the simulation method, the assumptions, and an evaluation of the effects of population aging. It is assumed that narrowly defined government sector spending on general expenditures per worker will increase by 1% every year. It is concluded that national saving rates will probably decrease due to population aging. The lowest levels of capital stock and savings would result from higher pension contributions; the highest level of capital stock would result from higher consumption pension taxes during 1990-2015. Preferred policies should focus on increasing interest income rates.
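The crowding-out mechanism behind the paper's capital-stock result can be shown with a minimal two-period life-cycle calculation. This is a toy illustration under simple assumptions (log utility, no discounting, hypothetical rates), not the paper's full overlapping-generations model:

```python
# Toy two-period life-cycle: a worker smooths consumption between work and
# retirement; a pay-as-you-go pension contribution crowds out private saving.
def private_saving(wage, contrib_rate, interest, replacement):
    net_wage = wage * (1 - contrib_rate)      # take-home pay while working
    pension = wage * replacement              # retirement benefit (hypothetical)
    lifetime = net_wage + pension / (1 + interest)
    c_young = lifetime / 2                    # consume half of lifetime resources
    return net_wage - c_young                 # private saving while working

s_low = private_saving(wage=1.0, contrib_rate=0.10, interest=0.05, replacement=0.2)
s_high = private_saving(wage=1.0, contrib_rate=0.25, interest=0.05, replacement=0.5)
# s_high < s_low: a more generous pay-as-you-go pension lowers private saving,
# and hence the capital stock, which is the direction of the paper's finding.
```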
Research of TREETOPS Structural Dynamics Controls Simulation Upgrade
NASA Technical Reports Server (NTRS)
Yates, Rose M.
1996-01-01
Under the provisions of contract number NAS8-40194, entitled 'TREETOPS Structural Dynamics and Controls Simulation System Upgrade', Oakwood College contracted to produce an upgrade to the existing TREETOPS suite of analysis tools. This suite includes the main simulation program, TREETOPS; two interactive preprocessors, TREESET and TREEFLX; an interactive postprocessor, TREEPLOT; and an adjunct program, TREESEL. A 'Software Design Document', which provides descriptions of the argument lists and internal variables for each subroutine in the TREETOPS suite, was produced. Additionally, installation guides for both DOS and UNIX platforms were developed. Finally, updated User's Manuals, as well as a Theory Manual, were generated.
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1985-01-01
The mathematical model that has been a cornerstone for the systems analysis of space-flight physiological studies is the Guyton model describing circulatory, fluid and electrolyte regulation. The model and the modifications that are made to permit simulation and analysis of the stress of weightlessness are described.
Design Evaluation for Personnel, Training and Human Factors (DEPTH) Final Report.
1998-01-17
human activity was primarily intended to facilitate man-machine design analyses of complex systems. By importing computer aided design (CAD) data, the human figure models and analysis algorithms can help to ensure components can be seen, reached, lifted and removed by most maintainers. These simulations are also useful for logistics data capture, training, and task analysis. DEPTH was also found to be useful in obtaining task descriptions for technical
NASA Technical Reports Server (NTRS)
1974-01-01
The feasibility of evolving a single-axis gimbal star tracker from prior two-axis gimbal star tracker based system applications is evaluated. A detailed evaluation of the star tracker's gimbal encoder is included. A brief system description is given, covering both the tracker evolution and the encoder evaluation. The system analysis includes an evaluation of star availability and mounting constraints for the geosynchronous-orbit application, and a covariance simulation analysis to evaluate performance potential. Digital computer programs for star availability and covariance analysis are included.
iTOUGH2: A multiphysics simulation-optimization framework for analyzing subsurface systems
NASA Astrophysics Data System (ADS)
Finsterle, S.; Commer, M.; Edmiston, J. K.; Jung, Y.; Kowalsky, M. B.; Pau, G. S. H.; Wainwright, H. M.; Zhang, Y.
2017-11-01
iTOUGH2 is a simulation-optimization framework for the TOUGH suite of nonisothermal multiphase flow models and related simulators of geophysical, geochemical, and geomechanical processes. After appropriate parameterization of subsurface structures and their properties, iTOUGH2 runs simulations for multiple parameter sets and analyzes the resulting output for parameter estimation through automatic model calibration, local and global sensitivity analyses, data-worth analyses, and uncertainty propagation analyses. Development of iTOUGH2 is driven by scientific challenges and user needs, with new capabilities continually added to both the forward simulator and the optimization framework. This review article provides a summary description of methods and features implemented in iTOUGH2, and discusses the usefulness and limitations of an integrated simulation-optimization workflow in support of the characterization and analysis of complex multiphysics subsurface systems.
First experiences of high-fidelity simulation training in junior nursing students in Korea.
Lee, Suk Jeong; Kim, Sang Suk; Park, Young-Mi
2015-07-01
This study was conducted to explore first experiences of high-fidelity simulation training in Korean nursing students, in order to develop and establish more effective guidelines for future simulation training in Korea. Thirty-three junior nursing students participated in high-fidelity simulation training for the first time. Using both qualitative and quantitative methods, data were collected from reflective journals and questionnaires of simulation effectiveness after simulation training. Descriptive statistics were used to analyze simulation effectiveness and content analysis was performed with the reflective journal data. Five dimensions and 31 domains, both positive and negative experiences, emerged from qualitative analysis: (i) machine-human interaction in a safe environment; (ii) perceived learning capability; (iii) observational learning; (iv) reconciling practice with theory; and (v) follow-up debriefing effect. More than 70% of students scored high on increased ability to identify changes in the patient's condition, critical thinking, decision-making, effectiveness of peer observation, and debriefing in effectiveness of simulation. This study reported both positive and negative experiences of simulation. The results of this study could be used to set the level of task difficulty in simulation. Future simulation programs can be designed by reinforcing the positive experiences and modifying the negative results. © 2014 The Authors. Japan Journal of Nursing Science © 2014 Japan Academy of Nursing Science.
A Computational Approach for Probabilistic Analysis of Water Impact Simulations
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.
2009-01-01
NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many challenges similar to those addressed in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time-consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations in models that allow users to interpolate solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis-of-variance approach used in the sensitivity studies, the equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
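The surrogate idea at the core of the abstract, fitting a cheap response surface through a few expensive simulation samples and interpolating elsewhere, can be sketched as follows. The sample values are hypothetical, not LS-DYNA output:

```python
# Sketch of the response-surface idea: fit a quadratic surrogate through three
# expensive "simulation runs", then query it at new inputs at negligible cost.
def quadratic_surrogate(samples):
    # samples: three (x, y) pairs; returns the Lagrange-form interpolant
    (x0, y0), (x1, y1), (x2, y2) = samples
    def surrogate(x):
        l0 = (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
        l1 = (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
        l2 = (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1))
        return y0 * l0 + y1 * l1 + y2 * l2
    return surrogate

# hypothetical data: peak acceleration (g) vs. impact velocity (m/s)
f = quadratic_surrogate([(5.0, 8.0), (7.5, 15.0), (10.0, 26.0)])
estimate = f(8.0)   # interpolated response, no new simulation needed
```

Real response-surface work uses least-squares fits over many runs and multiple inputs; the three-point exact fit above is the smallest case that shows the mechanism.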
Kinetic description of cyclotron-range oscillations of a non-neutral plasma column
NASA Astrophysics Data System (ADS)
Neu, S. C.; Morales, G. J.
1998-04-01
The kinetic analysis introduced by Prasad, Morales, and Fried [Prasad et al., Phys. Fluids 30, 3093 (1987)] is used to derive damping conditions and a differential equation for azimuthally propagating waves in a non-neutral plasma column in the limits rl/L≪1 and krl≪1 (where rl is the Larmor radius, k is the wave number, and L is the density scale length). The predictions of the kinetic analysis are verified using a two-dimensional particle-in-cell simulation of Bernstein modes in a thermal rigid-rotor equilibrium. Differences between modes in a strongly magnetized limit and near the Brillouin limit are studied in the simulation.
NAVSIM 2: A computer program for simulating aided-inertial navigation for aircraft
NASA Technical Reports Server (NTRS)
Bjorkman, William S.
1987-01-01
NAVSIM II, a computer program for analytical simulation of aided-inertial navigation for aircraft, is described. The description is supported by a discussion of the program's application to the design and analysis of aided-inertial navigation systems as well as instructions for utilizing the program and for modifying it to accommodate new models, constraints, algorithms and scenarios. NAVSIM II simulates an airborne inertial navigation system built around a strapped-down inertial measurement unit and aided in its function by GPS, Doppler radar, altimeter, airspeed, and position-fix measurements. The measurements are incorporated into the navigation estimate via a UD-form Kalman filter. The simulation was designed and implemented using structured programming techniques and with particular attention to user-friendly operation.
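The aiding step the abstract describes (blending a drifting inertial estimate with a GPS or other fix via a Kalman filter) reduces, in the scalar case, to a one-line gain computation. This is a single-state illustration with hypothetical numbers; NAVSIM II's actual filter is a multi-state UD-factorized Kalman filter:

```python
# Scalar illustration of measurement aiding: blend an inertial estimate x
# (variance p) with an aiding measurement z (variance r).
def kalman_update(x, p, z, r):
    k = p / (p + r)               # Kalman gain: trust ratio of the two sources
    x_new = x + k * (z - x)       # corrected state, pulled toward the fix
    p_new = (1 - k) * p           # uncertainty shrinks after the update
    return x_new, p_new

x, p = 105.0, 100.0               # drifted inertial position estimate (m, m^2)
z, r = 100.0, 25.0                # GPS position fix and its variance
x, p = kalman_update(x, p, z, r)  # x = 101.0, p = 20.0
```

The UD form used in NAVSIM II factors the covariance as U D Uᵀ for numerical stability; the gain and update above are mathematically the same operation.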
A field study of wind over a simulated block building
NASA Technical Reports Server (NTRS)
Frost, W.; Shahabi, A. M.
1977-01-01
A full-scale field study of the wind over a simulated two-dimensional building is reported. The study develops an experiment to investigate the structure and magnitude of the wind fields. A description of the experimental arrangement, the type and expected accuracy of the data, and the range of the data are given. The data are expected to provide a fundamental understanding of mean wind and turbulence structure of the wind field around the bluff body. Preliminary analysis of the data demonstrates the reliability and completeness of the data in this regard.
A simplified fuel control approach for low cost aircraft gas turbines
NASA Technical Reports Server (NTRS)
Gold, H.
1973-01-01
Reduction in the complexity of gas turbine fuel controls, without loss of control accuracy, reliability, or effectiveness, is discussed as a method for reducing engine costs. A description and analysis of the hydromechanical approach are presented. A computer simulation of the control mechanism is given, and the performance of a physical model in engine tests is reported.
NASA Technical Reports Server (NTRS)
Kung, E. C.
1984-01-01
Energetics characteristics of the Goddard Laboratory for Atmospheric Sciences (GLAS) general circulation model (GCM), as reflected in the First GARP Global Experiment (FGGE) analysis data set, are discussed. Energetics descriptions of GLAS GCM forecast experiments are discussed, as well as the energetics response of GLAS GCM climate simulation experiments.
Human Performance Modeling and Simulation for Launch Team Applications
NASA Technical Reports Server (NTRS)
Peaden, Cary J.; Payne, Stephen J.; Hoblitzell, Richard M., Jr.; Chandler, Faith T.; LaVine, Nils D.; Bagnall, Timothy M.
2006-01-01
This paper describes ongoing research into modeling and simulation of humans for launch team analysis, training, and evaluation. The initial research is sponsored by the National Aeronautics and Space Administration's (NASA)'s Office of Safety and Mission Assurance (OSMA) and NASA's Exploration Program and is focused on current and future launch team operations at Kennedy Space Center (KSC). The paper begins with a description of existing KSC launch team environments and procedures. It then describes the goals of new Simulation and Analysis of Launch Teams (SALT) research. The majority of this paper describes products from the SALT team's initial proof-of-concept effort. These products include a nominal case task analysis and a discrete event model and simulation of launch team performance during the final phase of a shuttle countdown; and a first proof-of-concept training demonstration of launch team communications in which the computer plays most roles, and the trainee plays a role of the trainee's choice. This paper then describes possible next steps for the research team and provides conclusions. This research is expected to have significant value to NASA's Exploration Program.
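The discrete-event model of launch-team performance mentioned above rests on a standard mechanism: a time-ordered event queue from which events are popped and processed, possibly scheduling new events. A minimal skeleton of that mechanism (event names and durations are hypothetical, not from the SALT model):

```python
# Minimal discrete-event simulation skeleton: a priority queue keyed on
# simulated time; processing an event may schedule further events.
import heapq

def run(events):
    """events: list of (time, name) pairs; returns events in simulated-time order."""
    queue = list(events)
    heapq.heapify(queue)
    log = []
    while queue:
        t, name = heapq.heappop(queue)    # always the earliest pending event
        log.append((t, name))
        if name == "go_for_launch":       # an event can schedule new events
            heapq.heappush(queue, (t + 31, "liftoff"))
    return log

log = run([(0, "poll_start"), (120, "go_for_launch"), (60, "status_check")])
# processed in time order: poll_start, status_check, go_for_launch, liftoff
```

A launch-team model layers task durations, communication delays, and human-performance moderators on top of exactly this queue-driven loop.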
Geomega: MEGAlib's Uniform Geometry and Detector Description Tool for Geant3, MGGPOD, and Geant4
NASA Astrophysics Data System (ADS)
Zoglauer, Andreas C.; Andritschke, R.; Schopper, F.; Wunderer, C. B.
2006-09-01
The Medium Energy Gamma-ray Astronomy library MEGAlib is a set of software tools for the analysis of low- to medium-energy gamma-ray telescopes, especially Compton telescopes. It comprises all necessary data-analysis steps from simulation/measurement via event reconstruction to image reconstruction, and enables detailed performance assessments. In the energy range of Compton telescopes (with energy deposits from a few keV up to hundreds of MeV), the Geant Monte Carlo software packages (Geant3 with its MGGPOD extension as well as Geant4) are widely used. Since each tool has its unique advantages, MEGAlib contains a geometry and detector description library, called Geomega, which allows these tools to be used in a uniform way. It incorporates the versatile 3D display facilities of the ROOT libraries. The same geometry, material, trigger, and detector description can be used for all simulation tools as well as for the later event analysis in the MEGAlib framework. This is done by converting the MEGAlib geometry into the Geant3 or MGGPOD format, or by directly linking the Geomega library into Geant4. The geometry description can handle most (and can be extended to handle all) volumes common to Geant3, Geant4, and ROOT. Geomega implements a list of features that are especially useful for optimizing detector geometries: it allows constants to be defined, handles mathematical operations, enables volume scaling, checks for overlaps between detector volumes, performs mass calculations, etc. Used in combination with MEGAlib, Geomega enables discretization and the application of detector noise, thresholds, various trigger conditions, defective pixels, etc. The highly modular and completely object-oriented library is written in C++ and based on ROOT. It was originally developed for the tracking Compton-scattering and pair-creation telescope MEGA and has since been successfully applied to a wide variety of telescopes, such as ACT, NuSTAR, and GRI.
Ostermeir, Katja; Zacharias, Martin
2014-12-01
Coarse-grained elastic network models (ENM) of proteins offer a low-resolution representation of protein dynamics and directions of global mobility. A Hamiltonian-replica exchange molecular dynamics (H-REMD) approach has been developed that combines information extracted from an ENM analysis with atomistic explicit solvent MD simulations. Based on a set of centers representing rigid segments (centroids) of a protein, a distance-dependent biasing potential is constructed by means of an ENM analysis to promote and guide centroid/domain rearrangements. The biasing potentials are added with different magnitude to the force field description of the MD simulation along the replicas with one reference replica under the control of the original force field. The magnitude and the form of the biasing potentials are adapted during the simulation based on the average sampled conformation to reach a near constant biasing in each replica after equilibration. This allows for canonical sampling of conformational states in each replica. The application of the methodology to a two-domain segment of the glycoprotein 130 and to the protein cyanovirin-N indicates significantly enhanced global domain motions and improved conformational sampling compared with conventional MD simulations. © 2014 Wiley Periodicals, Inc.
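The exchange step in Hamiltonian replica exchange follows a standard Metropolis criterion on the cross-evaluated potentials. The sketch below shows that generic acceptance test only; the ENM-derived biasing potentials and their adaptive scaling from the paper are not reproduced, and the energy values are hypothetical:

```python
# Generic H-REMD exchange test: swap configurations between replicas i and j
# with probability min(1, exp(-delta)), where delta compares the two
# Hamiltonians evaluated at the exchanged coordinates.
import math
import random

def accept_swap(u_ii, u_jj, u_ij, u_ji, beta=1.0, rng=random.random):
    """u_ab = potential of replica a's Hamiltonian at replica b's coordinates."""
    delta = beta * ((u_ij + u_ji) - (u_ii + u_jj))
    return delta <= 0 or rng() < math.exp(-delta)

# swap is always accepted when exchanging configurations lowers the total energy
ok = accept_swap(u_ii=10.0, u_jj=12.0, u_ij=9.0, u_ji=11.0)
```

In the paper's scheme, replicas differ only in the magnitude of the ENM bias added to the force field, with an unbiased reference replica, so accepted swaps let strongly biased domain rearrangements percolate down to the canonical reference ensemble.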
Segmental Analysis of Cardiac Short-Axis Views Using Lagrangian Radial and Circumferential Strain.
Ma, Chi; Wang, Xiao; Varghese, Tomy
2016-11-01
Accurate description of myocardial deformation in the left ventricle is a three-dimensional problem, requiring three normal strain components along its natural axes, that is, longitudinal, radial, and circumferential strain. Although longitudinal strain is best estimated from long-axis views, radial and circumferential strains are best depicted in short-axis views. An algorithm previously developed in our laboratory that utilizes a polar grid on short-axis views for a Lagrangian description of tissue deformation is applied to radial and circumferential displacement and strain estimation. Deformation of the myocardial wall, obtained from numerical simulations with ANSYS and a finite-element-analysis-based canine heart model, was used as the input to a frequency-domain ultrasound simulation program to generate radiofrequency echo signals. Clinical in vivo data were also acquired from a healthy volunteer. Local displacements estimated along and perpendicular to the ultrasound beam propagation direction are then transformed into radial and circumferential displacements and strains using the polar grid, based on a pre-determined centroid location. The Lagrangian strain variations demonstrate good agreement with the ideal strain when compared with Eulerian results. Lagrangian radial and circumferential strain estimation results are also demonstrated for experimental data from a healthy volunteer. Lagrangian radial and circumferential strain tracking provides accurate results with the assistance of the polar grid, as demonstrated using both numerical simulations and an in vivo study. © The Author(s) 2015.
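The coordinate-transform step described above, projecting beam-aligned displacements onto radial and circumferential directions defined by the ventricular centroid, is a plane rotation per point. A minimal sketch with hypothetical values:

```python
# Project a Cartesian displacement (dx, dy) measured at `point` onto the
# radial/circumferential axes defined by the angle from `centroid` to `point`.
import math

def polar_components(point, centroid, disp):
    theta = math.atan2(point[1] - centroid[1], point[0] - centroid[0])
    ur, uc = math.cos(theta), math.sin(theta)        # unit radial vector
    radial = disp[0] * ur + disp[1] * uc             # outward component
    circumferential = -disp[0] * uc + disp[1] * ur   # 90-degree-rotated component
    return radial, circumferential

# a point due "east" of the centroid moving outward has purely radial motion
r, c = polar_components(point=(2.0, 0.0), centroid=(0.0, 0.0), disp=(0.5, 0.0))
```

Strains then follow by differencing these components between neighboring polar-grid nodes while tracking the same material points frame to frame, which is what makes the description Lagrangian rather than Eulerian.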
Stochastic Simulation Tool for Aerospace Structural Analysis
NASA Technical Reports Server (NTRS)
Knight, Norman F.; Moore, David F.
2006-01-01
Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.
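The cause-and-effect interrogation the abstract describes, scattering design inputs per their tolerances and seeing which input drives the response, can be sketched in a few lines. The response function and tolerances below are toy values, unrelated to the actual panel model:

```python
# Toy Monte Carlo tolerance study: scatter two inputs, compute a response,
# and rank input influence by correlation with the output.
import random

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs) ** 0.5
    vy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (vx * vy)

random.seed(1)
thickness = [random.gauss(2.0, 0.10) for _ in range(2000)]   # +/- 5% scatter
modulus = [random.gauss(70.0, 0.5) for _ in range(2000)]     # +/- 0.7% scatter
# toy response: deflection ~ 1/(E * t^3), far more sensitive to thickness
deflect = [1.0 / (e * t ** 3) for t, e in zip(thickness, modulus)]

corr_t = pearson(thickness, deflect)
corr_e = pearson(modulus, deflect)
# |corr_t| >> |corr_e| identifies thickness as the dominant input variable
```

This mirrors the tool's "cause and effect" use: the ranking guides where tightening a tolerance actually pays off, without any probabilistic failure claim.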
A Hybrid Model for Multiscale Laser Plasma Simulations with Detailed Collisional Physics
2017-06-23
…the effects of inelastic collisions on the multi-fluid description of plasmas. SUBJECT TERMS: electric propulsion; plasma; collisional modeling. This work has been recognized in two workshops. …A challenge encountered during simulation was to define when breakdown occurred and to correlate the results to the experimentally determined…
dos Santos, Mateus Casanova; Leite, Maria Cecília Lorea; Heck, Rita Maria
2010-12-01
This is an investigative case study of descriptive and participative character, based on an educational experience with a simulation-based learning trigger in nursing. It was carried out during the second semester of the first cycle of the Faculdade de Enfermagem (FEN), Universidade Federal de Pelotas (UFPel). The aim is to study the recontextualization of the pedagogic practice of simulation in light of the theories developed by the sociologist of education Basil Bernstein, and to contribute to the improvement of educational planning and, especially, the evaluation of the learning trigger. The research shows that Bernstein's theory is a powerful semiotic tool for the analysis of pedagogical practices, one that contributes to the planning and analysis of the curricular educational device.
NASA Technical Reports Server (NTRS)
Mukhopadhyay, A. K.
1978-01-01
A description is presented of six simulation cases investigating the effect of the variation of static and dynamic Coulomb friction on servo system stability and performance. The upper and lower levels of dynamic Coulomb friction that allowed operation within requirements were determined to be roughly three times and 50%, respectively, of the nominal values considered in a table. A useful application of the nonlinear time-response simulation is sensitivity analysis of the final hardware design with respect to system parameters that cannot be varied realistically or easily in the actual hardware. The parameters of static/dynamic Coulomb friction fall into this category.
Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 3 (L1V3).
Bergmann, Frank T; Cooper, Jonathan; König, Matthias; Moraru, Ion; Nickerson, David; Le Novère, Nicolas; Olivier, Brett G; Sahle, Sven; Smith, Lucian; Waltemath, Dagmar
2018-03-19
The creation of computational simulation experiments to inform modern biological research poses challenges to reproduce, annotate, archive, and share such experiments. Efforts such as SBML or CellML standardize the formal representation of computational models in various areas of biology. The Simulation Experiment Description Markup Language (SED-ML) describes what procedures the models are subjected to, and the details of those procedures. These standards, together with further COMBINE standards, describe models sufficiently well for the reproduction of simulation studies among users and software tools. The Simulation Experiment Description Markup Language (SED-ML) is an XML-based format that encodes, for a given simulation experiment, (i) which models to use; (ii) which modifications to apply to models before simulation; (iii) which simulation procedures to run on each model; (iv) how to post-process the data; and (v) how these results should be plotted and reported. SED-ML Level 1 Version 1 (L1V1) implemented support for the encoding of basic time course simulations. SED-ML L1V2 added support for more complex types of simulations, specifically repeated tasks and chained simulation procedures. SED-ML L1V3 extends L1V2 by means to describe which datasets and subsets thereof to use within a simulation experiment.
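The five ingredients SED-ML encodes can be illustrated by assembling a skeletal document with the standard library. Element and attribute names below follow SED-ML conventions (model, uniformTimeCourse, task with model/simulation references), but this is an abridged sketch, not a schema-valid SED-ML file:

```python
# Abridged sketch of a SED-ML-style experiment description: which model to
# use, which simulation to run, and a task binding the two together.
import xml.etree.ElementTree as ET

root = ET.Element("sedML", {"level": "1", "version": "3"})

models = ET.SubElement(root, "listOfModels")
ET.SubElement(models, "model", {"id": "m1", "source": "model.xml"})  # (i) the model

sims = ET.SubElement(root, "listOfSimulations")
ET.SubElement(sims, "uniformTimeCourse", {                           # (iii) the procedure
    "id": "s1", "initialTime": "0",
    "outputEndTime": "100", "numberOfPoints": "1000"})

tasks = ET.SubElement(root, "listOfTasks")
ET.SubElement(tasks, "task", {"id": "t1",                            # run s1 on m1
    "modelReference": "m1", "simulationReference": "s1"})

document = ET.tostring(root, encoding="unicode")
```

A complete L1V3 document would add model changes (ii), data generators for post-processing (iv), and outputs (v), plus the L1V2 repeated-task machinery when needed.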
Bespamyatnov, Igor O; Rowan, William L; Granetz, Robert S
2008-10-01
Charge exchange recombination spectroscopy on Alcator C-Mod relies on the use of the diagnostic neutral beam injector as a source of neutral particles which penetrate deep into the plasma. It employs the emission resulting from the interaction of the beam atoms with fully ionized impurity ions. To interpret the emission from a given point in the plasma as the density of emitting impurity ions, the density of beam atoms must be known. Here, an analysis of beam propagation is described which yields the beam density profile throughout the beam trajectory from the neutral beam injector to the core of the plasma. The analysis includes the effects of beam formation, attenuation in the neutral gas surrounding the plasma, and attenuation in the plasma. In the course of this work, a numerical simulation and an analytical approximation for beam divergence are developed. The description is made sufficiently compact to yield accurate results in a time consistent with between-shot analysis.
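The attenuation part of the beam-propagation analysis reduces to an exponential in the line integral of density times stopping cross section along the trajectory. The sketch below uses a hypothetical density profile and cross section, not the C-Mod analysis itself:

```python
# Surviving beam fraction after traversing sampled densities along the path:
# N/N0 = exp(-sigma * integral of n ds), integrated numerically.
import math

def beam_fraction(path_density, sigma, ds):
    line_integral = sum(n * ds for n in path_density)   # particles/cm^2
    return math.exp(-sigma * line_integral)

# density ramps from the plasma edge to the core over 20 steps of 1 cm
profile = [1.0e13 * (i / 20.0) for i in range(21)]      # particles/cm^3
frac = beam_fraction(profile, sigma=2.0e-15, ds=1.0)    # sigma in cm^2, ds in cm
# frac is the fraction of injected neutrals surviving to the end of the path
```

The full analysis layers beam formation and divergence on top of this attenuation integral, which is why a compact analytic divergence approximation matters for between-shot turnaround.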
Direct Method Transcription for a Human-Class Translunar Injection Trajectory Optimization
NASA Technical Reports Server (NTRS)
Witzberger, Kevin E.; Zeiler, Tom
2012-01-01
This paper presents a new trajectory optimization software package developed in the framework of a low-to-high fidelity 3 degrees-of-freedom (DOF)/6-DOF vehicle simulation program named Mission Analysis Simulation Tool in Fortran (MASTIF) and its application to a translunar trajectory optimization problem. The functionality of the developed optimization package is implemented as a new "mode" in generalized settings to make it applicable for a general trajectory optimization problem. In doing so, a direct optimization method using collocation is employed for solving the problem. Trajectory optimization problems in MASTIF are transcribed to a constrained nonlinear programming (NLP) problem and solved with SNOPT, a commercially available NLP solver. A detailed description of the optimization software developed is provided as well as the transcription specifics for the translunar injection (TLI) problem. The analysis includes a 3-DOF trajectory TLI optimization and a 3-DOF vehicle TLI simulation using closed-loop guidance.
NASA Astrophysics Data System (ADS)
Bazilevs, Yuri; Hsu, M.-C.; Benson, D. J.; Sankaran, S.; Marsden, A. L.
2009-12-01
The Fontan procedure is a surgery that is performed on single-ventricle heart patients, and, due to the wide range of anatomies and variations among patients, lends itself nicely to study by advanced numerical methods. We focus on a patient-specific Fontan configuration, and perform a fully coupled fluid-structure interaction (FSI) analysis of hemodynamics and vessel wall motion. To enable physiologically realistic simulations, a simple approach to constructing a variable-thickness blood vessel wall description is proposed. Rest and exercise conditions are simulated and rigid versus flexible vessel wall simulation results are compared. We conclude that flexible wall modeling plays an important role in predicting quantities of hemodynamic interest in the Fontan connection. To the best of our knowledge, this paper presents the first three-dimensional patient-specific fully coupled FSI analysis of a total cavopulmonary connection that also includes large portions of the pulmonary circulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yung-Chen Andrew; Engelhard, Mark H.; Baer, Donald R.
2016-03-07
Spectral modeling of photoelectrons can serve as a valuable tool when combined with X-ray photoelectron spectroscopy (XPS) analysis. Herein, a new version of the NIST Simulation of Electron Spectra for Surface Analysis (SESSA 2.0) software, capable of directly simulating spherical multilayer NPs, was applied to model citrate-stabilized Au/Ag-core/shell nanoparticles (NPs). The NPs were characterized using XPS and scanning transmission electron microscopy (STEM) to determine their composition and morphology. The Au/Ag-core/shell NPs were observed to be polydispersed in size, non-circular, and to contain off-centered Au cores. Using the average NP dimensions determined from STEM analysis, SESSA spectral modeling indicated that washed Au/Ag-core/shell NPs were stabilized with a 0.8 nm l…
Effective description of a 3D object for photon transportation in Monte Carlo simulation
NASA Astrophysics Data System (ADS)
Suganuma, R.; Ogawa, K.
2000-06-01
Photon transport simulation by means of the Monte Carlo method is an indispensable technique for examining scatter and absorption correction methods in SPECT and PET. The authors have developed a method for object description with maximum-size regions (maximum rectangular regions: MRRs) to speed up photon transport simulation, and compared its computation time with those of conventional object description methods, a voxel-based (VB) method and an octree method, in simulations of two kinds of phantoms. The simulation results showed that the computation time with the proposed method was about 50% of that with the VB method and about 70% of that with the octree method for a high-resolution MCAT phantom. Here, details of the expansion of the MRR method to three dimensions are given. Moreover, the effectiveness of the proposed method was compared with that of the VB and octree methods.
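The source of the speedup can be seen in a toy one-dimensional comparison: a voxel-based tracker tests a boundary at every voxel, while a region-based tracker jumps each maximal homogeneous run in one step. This illustrates the idea only, not the published 3D MRR algorithm:

```python
# Toy 1D traversal count: boundary tests per photon path, voxel-by-voxel
# stepping versus jumping whole homogeneous regions.
def crossings_voxel(regions):
    """regions: lengths (in voxels) of homogeneous runs along the photon path."""
    return sum(regions)                 # one boundary test per voxel

def crossings_mrr(regions):
    return len(regions)                 # one test per maximal homogeneous region

path = [40, 8, 40]                      # e.g. air / tissue / air runs (hypothetical)
saving = 1 - crossings_mrr(path) / crossings_voxel(path)
# large homogeneous runs mean region-based traversal skips most boundary tests
```

An octree gains similarly but must still descend through its hierarchy at region boundaries, which is consistent with MRR landing between the VB and octree timings reported above.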
Expedited Systems Engineering for Rapid Capability and Urgent Needs
2012-12-31
…rapid organizations start to differ from traditional ones, and there is a shift in energy, commitment, and knowledge. These findings are motivated by an analysis of effective… C.7.1 Description: Integration of Modeling and Simulation, Software Design, and…
A COST-EFFECTIVENESS MODEL FOR THE ANALYSIS OF TITLE I ESEA PROJECT PROPOSALS, PART I-VII.
ERIC Educational Resources Information Center
Abt, Clark C.
Seven separate reports describe an overview of a cost-effectiveness model and five submodels for evaluating the effectiveness of Elementary and Secondary Education Act Title I proposals. The design for the model attempts a quantitative description of education systems which may be programmed as a computer simulation to indicate the impact of a Title I…
Optimization of MLS receivers for multipath environments
NASA Technical Reports Server (NTRS)
Mcalpine, G. A.; Highfill, J. H., III; Tzeng, C. P. J.; Koleyni, G.
1978-01-01
An analysis of a reduced-order (suboptimal) receiver in multipath environments is presented. The origin and objective of the MLS are described briefly. Signal modeling in the MLS and the optimum receiver are also covered, and a description is provided of a computer-oriented technique used in the simulation study of the suboptimal receiver. Results and conclusions obtained from the research on the suboptimal receiver are reported.
SCEC Earthquake System Science Using High Performance Computing
NASA Astrophysics Data System (ADS)
Maechling, P. J.; Jordan, T. H.; Archuleta, R.; Beroza, G.; Bielak, J.; Chen, P.; Cui, Y.; Day, S.; Deelman, E.; Graves, R. W.; Minster, J. B.; Olsen, K. B.
2008-12-01
The SCEC Community Modeling Environment (SCEC/CME) collaboration performs basic scientific research using high performance computing with the goal of developing a predictive understanding of earthquake processes and seismic hazards in California. SCEC/CME research areas include dynamic rupture modeling, wave propagation modeling, probabilistic seismic hazard analysis (PSHA), and full 3D tomography. SCEC/CME computational capabilities are organized around the development and application of robust, reusable, well-validated simulation systems we call computational platforms. The SCEC earthquake system science research program includes a wide range of numerical modeling efforts, and we continue to extend our numerical modeling codes to include more realistic physics and to run at higher and higher resolution. During this year, the SCEC/USGS OpenSHA PSHA computational platform was used to calculate PSHA hazard curves and hazard maps using the new UCERF2.0 ERF and new 2008 attenuation relationships. Three SCEC/CME modeling groups ran 1 Hz ShakeOut simulations using different codes and computer systems and carefully compared the results. The DynaShake Platform was used to calculate several dynamic rupture-based source descriptions equivalent in magnitude and final surface slip to the ShakeOut 1.2 kinematic source description. A SCEC/CME modeler produced 10 Hz synthetic seismograms for the ShakeOut 1.2 scenario rupture by combining 1 Hz deterministic simulation results with 10 Hz stochastic seismograms. SCEC/CME modelers ran an ensemble of seven ShakeOut-D simulations to investigate the variability of ground motions produced by dynamic rupture-based source descriptions. The CyberShake Platform was used to calculate more than 15 new probabilistic seismic hazard analysis (PSHA) hazard curves using full 3D waveform modeling and the new UCERF2.0 ERF. The SCEC/CME group has also produced significant computer science results this year.
Large-scale SCEC/CME high performance codes were run on NSF TeraGrid sites, including simulations that used the full PSC Big Ben supercomputer (4096 cores) and simulations that ran on more than 10K cores at TACC Ranger. The SCEC/CME group used scientific workflow tools and grid computing to run more than 1.5 million jobs at NCSA for the CyberShake project. Visualizations produced by a SCEC/CME researcher of the 10Hz ShakeOut 1.2 scenario simulation data were used by USGS in ShakeOut publications and public outreach efforts. OpenSHA was ported onto an NSF supercomputer and was used to produce very high-resolution PSHA hazard maps that contained more than 1.6 million hazard curves.
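The hazard curves mentioned above combine per-rupture occurrence rates with ground-motion exceedance probabilities. A minimal sketch of that calculation, with hypothetical rupture parameters and a lognormal ground-motion model (illustrative only, not OpenSHA or CyberShake code):

```python
import math

def hazard_curve(ruptures, im_levels):
    """ruptures: list of (annual_rate, median_im, sigma_ln) tuples.
    Returns P(exceedance in 1 yr) for each intensity-measure level,
    assuming lognormal ground-motion variability."""
    def p_exceed(level, median, sigma):
        # P(IM > level) under a lognormal distribution
        z = (math.log(level) - math.log(median)) / sigma
        return 0.5 * math.erfc(z / math.sqrt(2.0))

    curve = []
    for level in im_levels:
        # total annual rate of exceedance, summed over all sources
        rate = sum(r * p_exceed(level, m, s) for r, m, s in ruptures)
        curve.append(1.0 - math.exp(-rate))  # Poisson rate -> annual probability
    return curve

ruptures = [(0.01, 0.2, 0.6), (0.002, 0.5, 0.6)]  # hypothetical sources
curve = hazard_curve(ruptures, [0.1, 0.3, 0.5])
```

Real platforms replace the closed-form attenuation term with empirical relationships or, in CyberShake's case, full 3D waveform simulations, but the rate-combination step has this shape.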
NASA Technical Reports Server (NTRS)
daSilva, Arlinda
2012-01-01
A model-based Observing System Simulation Experiment (OSSE) is a framework for numerical experimentation in which observables are simulated from fields generated by an earth system model, including a parameterized description of observational error characteristics. Simulated observations can be used for sampling studies, for quantifying errors in analysis or retrieval algorithms, and ultimately as a planning tool for designing new observing missions. While this framework has traditionally been used to assess the impact of observations on numerical weather prediction, it has a much broader applicability, in particular to aerosols and chemical constituents. In this talk we will give a general overview of Observing System Simulation Experiment (OSSE) activities at NASA's Global Modeling and Assimilation Office, with a focus on its emerging atmospheric composition component.
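The core of such an OSSE is simple: sample a "nature run" field at instrument locations and add a parameterized observational error. A minimal sketch with hypothetical numbers (the field, bias, and noise level are illustrative, not GMAO values):

```python
import math
import random

def simulate_obs(nature_run, obs_locations, bias=0.0, noise_sd=0.1, seed=0):
    """Sample a 'nature run' field at observation locations and add a
    parameterized observational error (constant bias + Gaussian noise)."""
    rng = random.Random(seed)
    return [nature_run(x) + bias + rng.gauss(0.0, noise_sd) for x in obs_locations]

# Stand-in for a model aerosol field along a satellite track
truth = lambda x: 0.2 + 0.1 * math.sin(x)
obs = simulate_obs(truth, [0.0, 0.5, 1.0, 1.5], bias=0.02, noise_sd=0.01)
```

Because the truth is known exactly, retrieval or analysis errors can be measured directly by differencing against `truth`, which is what makes the framework useful for mission planning.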
Portable Life Support Subsystem Thermal Hydraulic Performance Analysis
NASA Technical Reports Server (NTRS)
Barnes, Bruce; Pinckney, John; Conger, Bruce
2010-01-01
This paper presents the current state of the thermal hydraulic modeling efforts being conducted for the Constellation Space Suit Element (CSSE) Portable Life Support Subsystem (PLSS). The goal of these efforts is to provide realistic simulations of the PLSS under various modes of operation. The PLSS thermal hydraulic model simulates the thermal, pressure, and flow characteristics of the PLSS, as well as the associated human thermal comfort. This paper presents modeling approaches and assumptions as well as component model descriptions. Results from the models are presented that show PLSS operations at steady-state and transient conditions. Finally, conclusions and recommendations are offered that summarize results, identify PLSS design weaknesses uncovered during review of the analysis results, and propose areas for improvement to increase model fidelity and accuracy.
Shuttle operations simulation model programmers'/users' manual
NASA Technical Reports Server (NTRS)
Porter, D. G.
1972-01-01
The prospective user of the shuttle operations simulation (SOS) model is given sufficient information to enable him to perform simulation studies of the space shuttle launch-to-launch operations cycle. The procedures used for modifying the SOS model to meet user requirements are described. The various control card sequences required to execute the SOS model are given. The report is written for users with varying computer simulation experience. A description of the components of the SOS model is included that presents both an explanation of the logic involved in the simulation of the shuttle operations cycle and a description of the routines used to support the actual simulation.
Vertical motion simulator familiarization guide
NASA Technical Reports Server (NTRS)
Danek, George L.
1993-01-01
The Vertical Motion Simulator Familiarization Guide provides a synoptic description of the Vertical Motion Simulator (VMS) and descriptions of the various simulation components and systems. The intended audience is the community of scientists and engineers who employ the VMS for research and development. The concept of a research simulator system is introduced, and the building-block nature of the VMS is emphasized. Individual sections describe all the hardware elements in terms of general properties and capabilities. Also included are an example of a typical VMS simulation, which graphically illustrates the composition of the system and shows the signal flow among the elements, and a glossary of specialized terms, abbreviations, and acronyms.
PICASSO: an end-to-end image simulation tool for space and airborne imaging systems
NASA Astrophysics Data System (ADS)
Cota, Steve A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Chris J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.
2008-08-01
The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
PICASSO: an end-to-end image simulation tool for space and airborne imaging systems
NASA Astrophysics Data System (ADS)
Cota, Stephen A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Christopher J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.
2010-06-01
The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
The Use of Computer Simulation Techniques in Educational Planning.
ERIC Educational Resources Information Center
Wilson, Charles Z.
Computer simulations provide powerful models for establishing goals, guidelines, and constraints in educational planning. They are dynamic models that allow planners to examine logical descriptions of organizational behavior over time as well as permitting consideration of the large and complex systems required to provide realistic descriptions of…
Chapter 4: Variant descriptions
Duncan C. Lutes; Donald C. E. Robinson
2003-01-01
The Fire and Fuels Extension (FFE) to the Forest Vegetation Simulator (FVS) simulates fuel dynamics and potential fire behavior over time, in the context of stand development and management. This report documents differences between geographic variants of the FFE. It is a companion document to the FFE "Model Description" and "User's Guide."...
A hadron-nucleus collision event generator for simulations at intermediate energies
NASA Astrophysics Data System (ADS)
Ackerstaff, K.; Bisplinghoff, J.; Bollmann, R.; Cloth, P.; Diehl, O.; Dohrmann, F.; Drüke, V.; Eisenhardt, S.; Engelhardt, H. P.; Ernst, J.; Eversheim, P. D.; Filges, D.; Fritz, S.; Gasthuber, M.; Gebel, R.; Greiff, J.; Gross, A.; Gross-Hardt, R.; Hinterberger, F.; Jahn, R.; Lahr, U.; Langkau, R.; Lippert, G.; Maschuw, R.; Mayer-Kuckuk, T.; Mertler, G.; Metsch, B.; Mosel, F.; Paetz gen. Schieck, H.; Petry, H. R.; Prasuhn, D.; von Przewoski, B.; Rohdjeß, H.; Rosendaal, D.; Roß, U.; von Rossen, P.; Scheid, H.; Schirm, N.; Schulz-Rojahn, M.; Schwandt, F.; Scobel, W.; Sterzenbach, G.; Theis, D.; Weber, J.; Wellinghausen, A.; Wiedmann, W.; Woller, K.; Ziegler, R.; EDDA-Collaboration
2002-10-01
Several available codes for hadronic event generation and shower simulation are discussed and their predictions are compared to experimental data in order to obtain a satisfactory description of hadronic processes in Monte Carlo studies of detector systems for medium energy experiments. The most reasonable description is found for the intra-nuclear-cascade (INC) model of Bertini, which employs a microscopic description of the INC, taking into account elastic and inelastic pion-nucleon and nucleon-nucleon scattering. The isobar model of Sternheimer and Lindenbaum is used to simulate the inelastic elementary collisions inside the nucleus via formation and decay of the Δ33-resonance, which, however, limits the model at higher energies. To overcome this limitation, the INC model has been extended by using the resonance model of the HADRIN code, considering all resonances in elementary collisions contributing more than 2% to the total cross-section up to kinetic energies of 5 GeV. In addition, angular distributions based on phase shift analysis are used for elastic nucleon-nucleon as well as elastic and charge exchange pion-nucleon scattering. Kaons and antinucleons can also be treated as projectiles. Good agreement with experimental data is found predominantly for lower projectile energies, i.e. in the regime of the Bertini code. The original as well as the extended Bertini model have been implemented as shower codes in the high energy detector simulation package GEANT-3.14, now allowing its use also in full Monte Carlo studies of detector systems at intermediate energies. GEANT-3.14 has been used here mainly for its powerful geometry and analysis packages, owing to the complexity of the EDDA detector system.
An Astrobiological Experiment to Explore the Habitability of Tidally Locked M-Dwarf Planets
NASA Astrophysics Data System (ADS)
Angerhausen, Daniel; Sapers, Haley; Simoncini, Eugenio; Lutz, Stefanie; Alexandre, Marcelo da Rosa; Galante, Douglas
2014-04-01
We present a summary of a three-year academic research proposal drafted during the Sao Paulo Advanced School of Astrobiology (SPASA) to prepare for upcoming observations of tidally locked planets orbiting M-dwarf stars. The primary experimental goal of the suggested research is to expose extremophiles from analogue environments to a modified space simulation chamber reproducing the environmental parameters of a tidally locked planet in the habitable zone of a late-type star. Here we focus on a description of the astronomical analysis used to define the parameters for this climate simulation.
On verifying a high-level design. [cost and error analysis
NASA Technical Reports Server (NTRS)
Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.
1993-01-01
An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages that are capable of adequately expressing the design specifications have been developed, but some time will be required before they can have the expressive power needed to be used in real applications. Simulation-based approaches are more useful in finding errors in designs than they are in proving the correctness of a certain design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.
Sequentially Simulated Outcomes: Kind Experience versus Nontransparent Description
ERIC Educational Resources Information Center
Hogarth, Robin M.; Soyer, Emre
2011-01-01
Recently, researchers have investigated differences in decision making based on description and experience. We address the issue of when experience-based judgments of probability are more accurate than are those based on description. If description is well understood ("transparent") and experience is misleading ("wicked"), it…
Parallelizing Timed Petri Net simulations
NASA Technical Reports Server (NTRS)
Nicol, David M.
1993-01-01
The possibility of using parallel processing to accelerate the simulation of Timed Petri Nets (TPN's) was studied. It was recognized that complex system development tools often transform system descriptions into TPN's or TPN-like models, which are then simulated to obtain information about system behavior. Viewed this way, it was important that the parallelization of TPN's be as automatic as possible, to admit the possibility of the parallelization being embedded in the system design tool. Later years of the grant were devoted to examining the problem of joint performance and reliability analysis, to explore whether both types of analysis could be accomplished within a single framework. In this final report, the results of our studies are summarized. We believe that the problem of automatically parallelizing TPN's for MIMD architectures has been almost completely solved for a large and important class of problems. Our initial investigations into joint performance/reliability analysis are two-fold: it was shown that Monte Carlo simulation with importance sampling offers promise of joint analysis in the context of a single tool, and methods were developed for the parallel simulation of general Continuous Time Markov Chains, a model framework within which joint performance/reliability models can be cast. However, much more work is needed to determine the scope and generality of these approaches. The results obtained in our two studies, future directions for this type of work, and a list of publications are included.
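A timed Petri net of the kind described can be simulated with a small discrete-event loop: an enabled transition consumes its input tokens immediately and deposits output tokens after a firing delay. A minimal sequential sketch (the parallel MIMD version studied in the report partitions this loop across processors; the two-place net below is hypothetical):

```python
import heapq

def simulate_tpn(marking, transitions, horizon):
    """Minimal timed Petri net simulator (a sketch, not the grant's tool).
    marking: {place: token_count}. transitions: list of
    (name, input_places, output_places, delay). A transition fires as soon
    as it is enabled; tokens are consumed at firing and deposited after
    `delay` time units. Returns a log of (completion_time, name) firings."""
    clock, events, log = 0.0, [], []

    def try_fire():
        for name, ins, outs, delay in transitions:
            while all(marking[p] > 0 for p in ins):   # fire while enabled
                for p in ins:
                    marking[p] -= 1
                heapq.heappush(events, (clock + delay, name, outs))

    try_fire()
    while events and events[0][0] <= horizon:
        clock, name, outs = heapq.heappop(events)     # next completion
        for p in outs:
            marking[p] += 1
        log.append((clock, name))
        try_fire()                                    # new tokens may enable firings
    return log

# Two-place loop: a token cycles A -> B -> A with delays 1.0 and 2.0
marking = {"A": 1, "B": 0}
net = [("t1", ["A"], ["B"], 1.0), ("t2", ["B"], ["A"], 2.0)]
log = simulate_tpn(marking, net, horizon=10.0)
```

The event queue is the point of contention that automatic parallelization must partition across processors.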
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galassi, Mark C.
Diorama is written as a collection of modules that can run in separate threads or in separate processes. This defines a clear interface between the modules and also allows concurrent processing of different parts of the pipeline. The pipeline is determined by a description in a scenario file [Norman and Tornga, 2012, Tornga and Norman, 2014]. The scenario manager parses the XML scenario and sets up the sequence of modules which will generate an event, propagate the signal to a set of sensors, and then run processing modules on the results provided by those sensor simulations. During a run a variety of “observer” and “processor” modules can be invoked to do interim analysis of results. Observers do not modify the simulation results, while processors may affect the final result. At the end of a run results are collated and final reports are put out. A detailed description of the scenario file and how it puts together a simulation is given in [Tornga and Norman, 2014]. The processing pipeline and how to program it with the Diorama API is described in Tornga et al. [2015] and Tornga and Wakeford [2015]. In this report I describe the communications infrastructure that is used.
NASA Technical Reports Server (NTRS)
Follen, Gregory; auBuchon, M.
2000-01-01
Within NASA's High Performance Computing and Communication (HPCC) program, NASA Glenn Research Center is developing an environment for the analysis/design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structures, and heat transfer along with the concept of numerical zooming between zero-dimensional to one-, two-, and three-dimensional component engine codes. In addition, the NPSS is refining the computing and communication technologies necessary to capture complex physical processes in a timely and cost-effective manner. The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Of the different technology areas that contribute to the development of the NPSS Environment, the subject of this paper is a discussion on numerical zooming between a NPSS engine simulation and higher fidelity representations of the engine components (fan, compressor, burner, turbines, etc.). What follows is a description of successfully zooming one-dimensional (row-by-row) high-pressure compressor analysis results back to a zero-dimensional NPSS engine simulation and a discussion of the results illustrated using an advanced data visualization tool. This type of high fidelity system-level analysis, made possible by the zooming capability of the NPSS, will greatly improve the capability of the engine system simulation and increase the level of virtual test conducted prior to committing the design to hardware.
Ares I-X Malfunction Turn Range Safety Analysis
NASA Technical Reports Server (NTRS)
Beaty, J. R.
2011-01-01
Ares I-X was the designation given to the flight test version of the Ares I rocket which was developed by NASA (also known as the Crew Launch Vehicle (CLV) component of the Constellation Program). The Ares I-X flight test vehicle achieved a successful flight test on October 28, 2009, from Pad LC-39B at Kennedy Space Center, Florida (KSC). As part of the flight plan approval for the test vehicle, a range safety malfunction turn analysis was performed to support the risk assessment and vehicle destruct criteria development processes. Several vehicle failure scenarios were identified which could have caused the vehicle trajectory to deviate from its normal flight path. The effects of these failures were evaluated with an Ares I-X 6 degrees-of-freedom (6-DOF) digital simulation, using the Program to Optimize Simulated Trajectories Version II (POST2) simulation tool. The Ares I-X simulation analysis provided output files containing vehicle trajectory state information. These were used by other risk assessment and vehicle debris trajectory simulation tools to determine the risk to personnel and facilities in the vicinity of the launch area at KSC, and to develop the vehicle destruct criteria used by the flight test range safety officer in the event of a flight test anomaly of the vehicle. The simulation analysis approach used for this study is described, including descriptions of the failure modes which were considered and the underlying assumptions and ground rules of the study.
Development and validation of the Simulation Learning Effectiveness Inventory.
Chen, Shiah-Lian; Huang, Tsai-Wei; Liao, I-Chen; Liu, Chienchi
2015-10-01
To develop and psychometrically test the Simulation Learning Effectiveness Inventory. High-fidelity simulation helps students develop clinical skills and competencies. Yet, reliable instruments measuring learning outcomes are scant. A descriptive cross-sectional survey was used to validate psychometric properties of the instrument measuring students' perception of simulation learning effectiveness. A purposive sample of 505 nursing students who had taken simulation courses was recruited from a department of nursing of a university in central Taiwan from January 2010 to June 2010. The study was conducted in two phases. In Phase I, question items were developed based on the literature review and the preliminary psychometric properties of the inventory were evaluated using exploratory factor analysis. Phase II was conducted to evaluate the reliability and validity of the finalized inventory using confirmatory factor analysis. The results of exploratory and confirmatory factor analyses revealed the instrument was composed of seven factors, named course arrangement, equipment resource, debriefing, clinical ability, problem-solving, confidence and collaboration. A further second-order analysis showed comparable fits between a three second-order factor (preparation, process and outcome) and the seven first-order factor models. Internal consistency was supported by adequate Cronbach's alphas and composite reliability. Convergent and discriminant validities were also supported by confirmatory factor analysis. The study provides evidence that the Simulation Learning Effectiveness Inventory is reliable and valid for measuring student perception of learning effectiveness. The instrument is helpful in building the evidence-based knowledge of the effect of simulation teaching on students' learning outcomes. © 2015 John Wiley & Sons Ltd.
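The internal-consistency evidence reported above rests on Cronbach's alpha, which compares the sum of per-item score variances to the variance of respondents' total scores. A minimal sketch with hypothetical item ratings (not the study's data):

```python
import statistics

def cronbach_alpha(item_scores):
    """item_scores: one list per scale item, each holding all respondents'
    scores for that item (equal lengths). Returns Cronbach's alpha:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(item_scores)
    totals = [sum(resp) for resp in zip(*item_scores)]          # per-respondent totals
    item_var = sum(statistics.pvariance(col) for col in item_scores)
    return k / (k - 1) * (1 - item_var / statistics.pvariance(totals))

# Three hypothetical Likert items rated by five respondents
alpha = cronbach_alpha([[4, 5, 3, 5, 4], [4, 4, 3, 5, 4], [3, 5, 3, 4, 4]])
```

Values near 1 indicate the items move together; population variance is used consistently in both numerator and denominator, so the ratio is unaffected by the variance convention.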
Simulation of FIB-SEM images for analysis of porous microstructures.
Prill, Torben; Schladitz, Katja
2013-01-01
Focused ion beam-scanning electron microscopy (FIB-SEM) tomography yields high-quality three-dimensional images of materials microstructures at the nanometer scale by combining serial sectioning using a focused ion beam with SEM imaging. However, FIB-SEM tomography of highly porous media leads to shine-through artifacts preventing automatic segmentation of the solid component. We simulate the SEM process in order to generate synthetic FIB-SEM image data for developing and validating segmentation methods. Monte Carlo techniques yield accurate results, but are too slow for the simulation of FIB-SEM tomography, which requires hundreds of SEM images for one dataset alone. Nevertheless, a quasi-analytic description of the specimen and various acceleration techniques, including a track compression algorithm and an acceleration for the simulation of secondary electrons, cut down the computing time by orders of magnitude, allowing for the first time the simulation of FIB-SEM tomography. © Wiley Periodicals, Inc.
Cazade, Pierre-André; Berezovska, Ganna; Meuwly, Markus
2015-05-01
The nature of ligand motion in proteins is difficult to characterize directly by experiment. Specifically, it is unclear to what degree these motions are coupled. All-atom simulations are used to sample ligand motion in truncated Hemoglobin N. A transition network analysis including both ligand and protein degrees of freedom is used to analyze the microscopic dynamics. Clustering of two different subsets of MD trajectories highlights the importance of a diverse and exhaustive description to define the macrostates for a ligand-migration network. Monte Carlo simulations on the transition matrices from one particular clustering are able to faithfully capture the atomistic simulations. Contrary to clustering by ligand positions only, including a protein degree of freedom yields considerably improved coarse-grained dynamics. Analyses with and without imposing detailed balance agree closely, which suggests that the underlying atomistic simulations are converged with respect to sampling transitions between neighboring sites. Protein and ligand dynamics are not independent of each other, and ligand migration through globular proteins is not passive diffusion. Transition network analysis is a powerful tool to analyze and characterize the microscopic dynamics of complex systems. This article is part of a Special Issue entitled Recent developments of molecular dynamics. Copyright © 2014 Elsevier B.V. All rights reserved.
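The Monte Carlo step described, regenerating dynamics from a transition matrix estimated over macrostates, can be sketched as follows (the two-state ligand trajectory below is hypothetical, not data from the paper):

```python
import random
from collections import Counter

def transition_matrix(traj):
    """Row-normalized transition counts from a discrete macrostate trajectory."""
    counts = Counter(zip(traj, traj[1:]))
    states = sorted(set(traj))
    T = {}
    for a in states:
        row_total = sum(counts[(a, b)] for b in states)
        T[a] = {b: counts[(a, b)] / row_total for b in states}
    return T

def monte_carlo(T, start, n_steps, seed=0):
    """Regenerate a trajectory by repeatedly sampling the transition matrix."""
    rng = random.Random(seed)
    state, traj = start, [start]
    for _ in range(n_steps):
        r, acc = rng.random(), 0.0
        for nxt, p in T[state].items():   # inverse-CDF sampling of the row
            acc += p
            if r < acc:
                state = nxt
                break
        traj.append(state)
    return traj

# Hypothetical two-macrostate ligand trajectory (e.g. two binding sites)
traj = [0, 0, 1, 1, 1, 0, 0, 1, 0, 0, 0, 1, 1, 0]
T = transition_matrix(traj)
sampled = monte_carlo(T, 0, 1000)
```

If the regenerated trajectory reproduces the occupancies and transition statistics of the atomistic data, the macrostate definition has captured the slow dynamics, which is the test the paper applies to its clusterings.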
Magnetosphere-Ionosphere Coupling and Associated Ring Current Energization Processes
NASA Technical Reports Server (NTRS)
Liemohn, M. W.; Khazanov, G. V.
2004-01-01
Adiabatic processes in the ring current are examined. In particular, an analysis of the factors that parameterize the net adiabatic energy gain in the inner magnetosphere during magnetic storms is presented. A single storm was considered, that of April 17, 2002. Three simulations were conducted with similar boundary conditions but with different electric field descriptions. It is concluded that the best parameter for quantifying the net adiabatic energy gain in the inner magnetosphere during storms is the instantaneous value of the product of the maximum westward electric field at the outer simulation boundary with the nightside plasma sheet density. However, all of the instantaneous magnetospheric quantities considered in this study produced large correlation coefficients. Therefore, they all could be considered useful predictors of the net adiabatic energy gain of the ring current. Long integration times over the parameters lessen the significance of the correlation. Finally, some significant differences exist in the correlation coefficients depending on the electric field description.
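The correlation analysis described above reduces to computing linear correlation coefficients between candidate driver quantities and the net adiabatic energy gain. A minimal sketch with hypothetical time series (the values are illustrative, not storm data):

```python
import math

def pearson_r(x, y):
    """Linear (Pearson) correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical hourly values: (max westward E-field x plasma sheet density)
# driver versus net adiabatic energy gain
driver = [1.0, 2.5, 0.8, 3.1, 2.0, 4.2]
gain = [10.0, 24.0, 9.5, 30.0, 21.0, 40.0]
r = pearson_r(driver, gain)
```

Ranking candidate parameters by |r| against the simulated energy gain is the comparison the study performs across its three electric field descriptions.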
Coupling all-atom molecular dynamics simulations of ions in water with Brownian dynamics.
Erban, Radek
2016-02-01
Molecular dynamics (MD) simulations of ions (K+, Na+, Ca2+ and Cl-) in aqueous solutions are investigated. Water is described using the SPC/E model. A stochastic coarse-grained description for ion behaviour is presented and parametrized using MD simulations. It is given as a system of coupled stochastic and ordinary differential equations, describing the ion position, velocity and acceleration. The stochastic coarse-grained model provides an intermediate description between all-atom MD simulations and Brownian dynamics (BD) models. It is used to develop a multiscale method which uses all-atom MD simulations in parts of the computational domain and (less detailed) BD simulations in the remainder of the domain.
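The paper's coarse-grained model couples stochastic and ordinary differential equations for ion position, velocity, and acceleration, parametrized from MD. As a simpler illustration of such an intermediate description, here is an underdamped Langevin (velocity-level) integrator for a free particle; the parameters are hypothetical, not the fitted ones:

```python
import math
import random

def langevin(n_steps, dt, gamma, kT, mass, seed=0):
    """Underdamped Langevin dynamics for one coordinate, integrated with
    Euler-Maruyama: dv = -(gamma/m) v dt + (sqrt(2 gamma kT)/m) dW.
    A sketch of a stochastic description intermediate between all-atom MD
    and overdamped Brownian dynamics (not the paper's fitted model)."""
    rng = random.Random(seed)
    x, v = 0.0, 0.0
    sigma = math.sqrt(2.0 * gamma * kT * dt) / mass  # noise increment scale
    xs = []
    for _ in range(n_steps):
        v += -(gamma / mass) * v * dt + sigma * rng.gauss(0.0, 1.0)
        x += v * dt
        xs.append(x)
    return xs

path = langevin(n_steps=5000, dt=0.01, gamma=1.0, kT=1.0, mass=1.0)
```

Dropping the inertial term recovers the Brownian dynamics limit used in the coarser parts of the multiscale domain.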
NASA Astrophysics Data System (ADS)
Dave, Eshan V.
Asphalt concrete pavements are inherently graded viscoelastic structures, with oxidative aging of the asphalt binder and temperature cycling due to climatic conditions being the major causes of non-homogeneity. Current pavement analysis and simulation procedures rely on a layered approach to account for these non-homogeneities. The conventional finite-element modeling (FEM) technique discretizes the problem domain into smaller elements, each with a unique constitutive property. However, the assignment of a unique material property description to each element makes the conventional FEM approach an unattractive choice for simulating problems with material non-homogeneities. Specialized elements known as "graded elements" allow for non-homogeneous material property definitions within an element. This dissertation describes the development of a graded viscoelastic finite element analysis method and its application to the analysis of asphalt concrete pavements. Results show that the present research improves the efficiency and accuracy of simulations for asphalt pavement systems. Practical implications of this work include the new technique's capability for accurate analysis and design of asphalt pavements and overlay systems and for the determination of pavement performance under varying climatic conditions and in-service age. Other application areas include simulation of functionally graded fiber-reinforced concrete, geotechnical materials, metals and metal composites at high temperatures, polymers, and several other naturally occurring and engineered materials.
The Mesoscale Predictability of Terrain Induced Flows
2009-09-30
simulations, we focus on assessing the predictability of winds, mountain waves and clear air turbulence (CAT) in the lee of the Sierra Nevada ... complete description of the sensitivity of mountain waves, CAT and downslope winds to small variations in the initial conditions. WORK COMPLETED We ... completed the analysis of the sensitivity of mountain waves, CAT and downslope winds to small perturbations in the upstream conditions. We also
Ion Channel Conductance Measurements on a Silicon-Based Platform
2006-01-01
calculated using the molecular dynamics code GROMACS. Reasonable agreement is obtained in the simulated versus measured conductance over the range of ... measurements of the lipid giga-seal characteristics have been performed, including AC conductance measurements and statistical analysis in order to ... Dynamics kernel self-consistently coupled to Poisson equations using a P3M force field scheme and the GROMACS description of protein structure and
NASA Technical Reports Server (NTRS)
Miller, R. S.; Bellan, J.
1997-01-01
An investigation of the statistical description of binary mixing and/or reaction between a carrier gas and an evaporated vapor species in two-phase gas-liquid turbulent flows is performed through both theoretical analysis and comparisons with results from direct numerical simulations (DNS) of a two-phase mixing layer.
Team Training for Command and Control Systems: Status.
1982-04-01
in order to develop this set of C2 systems, including a project listing for the Electronic Systems Division (ESD) of the Air ... do the job. Task analysis results in a detailed description of tasks and task steps, and associated environmental and equipment conditions and ... simulation exercises match the projected threat either in terms of numbers or capabilities. Live exercises are even less satisfactory because
Selected Urban Simulations and Games. IFF Working Paper WP-4.
ERIC Educational Resources Information Center
Nagelberg, Mark; Little, Dennis L.
Summary descriptions of selected urban simulations and games that have been developed outside the Institute For The Future are presented. The operating characteristics and potential applications of each model are described. These include (1) the history of development, (2) model and player requirements, (3) a description of the environment being…
BRENDA: a dynamic simulator for a sodium-cooled fast reactor power plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hetrick, D.L.; Sowers, G.W.
1978-06-01
This report is a users' manual for one version of BRENDA (Breeder Reactor Nuclear Dynamic Analysis), which is a digital program for simulating the dynamic behavior of a sodium-cooled fast reactor power plant. This version, which contains 57 differential equations, represents a simplified model of the Clinch River Breeder Reactor Project (CRBRP). BRENDA is an input deck for DARE P (Differential Analyzer Replacement, Portable), which is a continuous-system simulation language developed at the University of Arizona. This report contains brief descriptions of DARE P and BRENDA, instructions for using BRENDA in conjunction with DARE P, and some sample output. A list of variable names and a listing for BRENDA are included as appendices.
NASA Technical Reports Server (NTRS)
Galante, Joseph M.; Eepoel, John Van; Strube, Matt; Gill, Nat; Gonzalez, Marcelo; Hyslop, Andrew; Patrick, Bryan
2012-01-01
Argon is a flight-ready sensor suite with two visual cameras, a flash LIDAR, an on-board flight computer, and associated electronics. Argon was designed to provide sensing capabilities for relative navigation during proximity, rendezvous, and docking operations between spacecraft. A rigorous ground test campaign assessed the capability of the Argon navigation suite to measure the relative pose of high-fidelity satellite mock-ups during a variety of simulated rendezvous and proximity maneuvers, facilitated by robot manipulators, in a variety of lighting conditions representative of the orbital environment. A brief description of the Argon suite and test setup is given, as well as an analysis of the performance of the system in simulated proximity and rendezvous operations.
The transition of a real-time single-rotor helicopter simulation program to a supercomputer
NASA Technical Reports Server (NTRS)
Martinez, Debbie
1995-01-01
This report presents the conversion effort and results of a real-time flight simulation application transition to a CONVEX supercomputer. Enclosed is a detailed description of the conversion process and a brief description of the Langley Research Center's (LaRC) flight simulation application program structure. Currently, this simulation program may be configured to represent the Sikorsky S-61 helicopter (a five-blade, single-rotor, commercial passenger-type helicopter) or an Army Cobra helicopter (either the AH-1G or AH-1S model). This report refers to the Sikorsky S-61 simulation program since it is the most frequently used configuration.
Propulsion simulator for magnetically-suspended wind tunnel models
NASA Technical Reports Server (NTRS)
Joshi, Prakash B.; Goldey, C. L.; Sacco, G. P.; Lawing, Pierce L.
1991-01-01
The objective of phase two of a current investigation sponsored by NASA Langley Research Center is to demonstrate the measurement of aerodynamic forces/moments, including the effects of exhaust gases, in magnetic suspension and balance system (MSBS) wind tunnels. Two propulsion simulator models are being developed: a small-scale and a large-scale unit, both employing compressed, liquefied carbon dioxide as propellant. The small-scale unit was designed, fabricated, and statically tested at Physical Sciences Inc. (PSI). The large-scale simulator is currently in the preliminary design stage. The small-scale simulator design/development is presented, and the data from its static firing on a thrust stand are discussed. The analysis of these data provides important information for the design of the large-scale unit. A description of the preliminary design of the device is also presented.
Network visualization of conformational sampling during molecular dynamics simulation.
Ahlstrom, Logan S; Baker, Joseph Lee; Ehrlich, Kent; Campbell, Zachary T; Patel, Sunita; Vorontsov, Ivan I; Tama, Florence; Miyashita, Osamu
2013-11-01
Effective data reduction methods are necessary for uncovering the inherent conformational relationships present in large molecular dynamics (MD) trajectories. Clustering algorithms provide a means to interpret the conformational sampling of molecules during simulation by grouping trajectory snapshots into a few subgroups, or clusters, but the relationships between the individual clusters may not be readily understood. Here we show that network analysis can be used to visualize the dominant conformational states explored during simulation as well as the connectivity between them, providing a more coherent description of conformational space than traditional clustering techniques alone. We compare the results of network visualization against 11 clustering algorithms and principal component conformer plots. Several MD simulations of proteins undergoing different conformational changes demonstrate the effectiveness of networks in reaching functional conclusions. Copyright © 2013 Elsevier Inc. All rights reserved.
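The transition-network idea the abstract describes can be sketched in a few lines, assuming cluster labels have already been assigned to trajectory snapshots (the function name is hypothetical; the paper's own pipeline is not specified here):

```python
from collections import Counter

def conformational_network(labels):
    """Build a transition network from per-snapshot cluster labels:
    node weights are cluster occupancies, edge weights count how often
    the trajectory hops between two clusters in consecutive snapshots."""
    nodes = Counter(labels)
    edges = Counter()
    for a, b in zip(labels, labels[1:]):
        if a != b:
            edges[tuple(sorted((a, b)))] += 1
    return nodes, edges

# e.g. a short trajectory that samples clusters A, B, and C
nodes, edges = conformational_network(list("AAABBAAC"))
```

Plotting the resulting weighted graph shows both the dominant states (heavy nodes) and their connectivity (heavy edges), which is the extra information a flat cluster listing lacks.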
Contact Kinetics in Fractal Macromolecules.
Dolgushev, Maxim; Guérin, Thomas; Blumen, Alexander; Bénichou, Olivier; Voituriez, Raphaël
2015-11-13
We consider the kinetics of first contact between two monomers of the same macromolecule. Relying on a fractal description of the macromolecule, we develop an analytical method to compute the mean first contact time for various molecular sizes. In our theoretical description, the non-Markovian feature of monomer motion, arising from the interactions with the other monomers, is captured by accounting for the nonequilibrium conformations of the macromolecule at the very instant of first contact. This analysis reveals a simple scaling relation for the mean first contact time between two monomers, which involves only their equilibrium distance and the spectral dimension of the macromolecule, independently of its microscopic details. Our theoretical predictions are in excellent agreement with numerical stochastic simulations.
A multiscale physical model for the transient analysis of PEM water electrolyzer anodes.
Oliveira, Luiz Fernando L; Laref, Slimane; Mayousse, Eric; Jallut, Christian; Franco, Alejandro A
2012-08-07
Polymer electrolyte membrane water electrolyzers (PEMWEs) are electrochemical devices that can be used for the production of hydrogen. In a PEMWE the anode is the most complex electrode to study because of the high overpotential of the oxygen evolution reaction (OER), which is not widely understood. A physical bottom-up multiscale transient model describing the operation of a PEMWE anode is proposed here. This model includes a detailed description of the elementary OER kinetics in the anode, a description of the non-equilibrium behavior of the nanoscale catalyst-electrolyte interface, and a microstructurally resolved description of the transport of charges and O(2) at the micro and mesoscales along the whole anode. The impact of different catalyst materials on the performance of the PEMWE anode and a study of sensitivity to the operating conditions are evaluated from numerical simulations, and the results are discussed in comparison with experimental data.
NASA Astrophysics Data System (ADS)
Fan, Daidu; Tu, Junbiao; Cai, Guofu; Shang, Shuai
2015-06-01
Grain-size analysis is a basic routine in sedimentology and related fields, but diverse methods of sample collection, processing and statistical analysis often make direct comparisons and interpretations difficult or even impossible. In this paper, 586 published grain-size datasets from the Qiantang Estuary (East China Sea) sampled and analyzed by the same procedures were merged and their textural parameters calculated by a percentile and two moment methods. The aim was to explore which of the statistical procedures performed best in the discrimination of three distinct sedimentary units on the tidal flats of the middle Qiantang Estuary. A Gaussian curve-fitting method served to simulate mixtures of two normal populations having different modal sizes, sorting values and size distributions, enabling a better understanding of the impact of finer tail components on textural parameters, as well as the proposal of a unifying descriptive nomenclature. The results show that percentile and moment procedures yield almost identical results for mean grain size, and that sorting values are also highly correlated. However, more complex relationships exist between percentile and moment skewness (kurtosis), changing from positive to negative correlations when the proportions of the finer populations decrease below 35% (10%). This change results from the overweighting of tail components in moment statistics, which stands in sharp contrast to the underweighting or complete amputation of small tail components by the percentile procedure. Intercomparisons of bivariate plots suggest an advantage of the Friedman & Johnson moment procedure over the McManus moment method in terms of the description of grain-size distributions, and over the percentile method by virtue of a greater sensitivity to small variations in tail components. 
The textural parameter scalings of Folk & Ward were translated into their Friedman & Johnson moment counterparts by application of mathematical functions derived by regression analysis of measured and modeled grain-size data, or by determining the abscissa values of intersections between auxiliary lines running parallel to the x-axis and vertical lines corresponding to the descriptive percentile limits along the ordinate of representative bivariate plots. Twofold limits were extrapolated for the moment statistics in relation to single descriptive terms in the cases of skewness and kurtosis by considering both positive and negative correlations between percentile and moment statistics. The extrapolated descriptive scalings were further validated by examining entire size-frequency distributions simulated by mixing two normal populations of designated modal size and sorting values, but varying in mixing ratios. These were found to match well in most of the proposed scalings, although platykurtic and very platykurtic categories were questionable when the proportion of the finer population was below 5%. Irrespective of the statistical procedure, descriptive nomenclatures should therefore be cautiously used when tail components contribute less than 5% to grain-size distributions.
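The contrast drawn above between percentile and moment procedures can be illustrated with a minimal sketch of the two kinds of statistics for a grain-size sample in phi units (standard Folk & Ward graphic formulas; the paper's full skewness/kurtosis comparison is not reproduced):

```python
def percentile(sorted_phi, p):
    """Linearly interpolated p-th percentile of sorted phi values."""
    idx = p / 100 * (len(sorted_phi) - 1)
    lo = int(idx)
    hi = min(lo + 1, len(sorted_phi) - 1)
    frac = idx - lo
    return sorted_phi[lo] * (1 - frac) + sorted_phi[hi] * frac

def folk_ward(phi):
    """Folk & Ward graphic (percentile) mean and sorting."""
    s = sorted(phi)
    mean = (percentile(s, 16) + percentile(s, 50) + percentile(s, 84)) / 3
    sorting = ((percentile(s, 84) - percentile(s, 16)) / 4
               + (percentile(s, 95) - percentile(s, 5)) / 6.6)
    return mean, sorting

def moment_stats(phi):
    """First two moment statistics: mean and standard deviation."""
    n = len(phi)
    mean = sum(phi) / n
    std = (sum((x - mean) ** 2 for x in phi) / n) ** 0.5
    return mean, std

# for a symmetric distribution both procedures agree on the mean,
# as the paper reports for its 586 datasets
phi = [1.0, 2.0, 3.0, 4.0, 5.0]
```

The percentile statistics ignore everything outside the 5th-95th percentile range, which is exactly the tail-amputation effect the paper identifies as the source of the diverging skewness and kurtosis values.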
NASA Technical Reports Server (NTRS)
Gramling, C. J.; Long, A. C.; Lee, T.; Ottenstein, N. A.; Samii, M. V.
1991-01-01
A Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) is currently being developed by NASA to provide a high-accuracy autonomous navigation capability for users of TDRSS and its successor, the Advanced TDRSS (ATDRSS). The fully autonomous user onboard navigation system will support orbit determination, time determination, and frequency determination, based on observation of a continuously available, unscheduled navigation beacon signal. A TONS experiment will be performed in conjunction with the Explorer Platform (EP) Extreme Ultraviolet Explorer (EUVE) mission to flight qualify TONS Block 1. An overview is presented of TONS and a preliminary analysis of the navigation accuracy anticipated for the TONS experiment. Descriptions of the TONS experiment and the associated navigation objectives, as well as a description of the onboard navigation algorithms, are provided. The accuracy of the selected algorithms is evaluated based on the processing of realistic simulated TDRSS one-way forward-link Doppler measurements. The analysis process is discussed and the associated navigation accuracy results are presented.
NASA Astrophysics Data System (ADS)
Konnik, Mikhail V.; Welsh, James
2012-09-01
Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing code base of a numerical simulator makes the code itself increasingly difficult to support. Inadequate documentation of astronomical software for adaptive optics simulators can complicate development, since the documentation must contain up-to-date schemes and the mathematical descriptions implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation facilities, these are often insufficient for describing a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LATEX, Mercurial, Doxygen, and Perl. Using a Perl script that translates MATLAB M-file comments into C-like syntax, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, along with guidelines for the framework's deployment. Examples of code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
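The heart of such a filter is simply rewriting MATLAB's `%` comment markers into C-style `//` so Doxygen's parser can consume them. A minimal sketch of the idea (in Python rather than the paper's Perl; the function name is hypothetical):

```python
def matlab_comments_to_doxygen(src):
    """Rewrite leading MATLAB '%' comment markers as C++-style '//'
    comments so a C-oriented parser such as Doxygen can read them.
    Code lines are passed through unchanged."""
    out = []
    for line in src.splitlines():
        stripped = line.lstrip()
        if stripped.startswith('%'):
            indent = line[:len(line) - len(stripped)]
            out.append(indent + '//' + stripped[1:])
        else:
            out.append(line)
    return '\n'.join(out)

converted = matlab_comments_to_doxygen("% Computes x\nx = 1;")
```

Such a filter can be wired in via Doxygen's input-filter mechanism so the M-files themselves never need to change; the real script presumably also handles Doxygen tags and formulas embedded in the comments.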
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rougelot, Thomas; Burlion, Nicolas, E-mail: nicolas.burlion@polytech-lille.f; Bernard, Dominique
2010-02-15
Chemical shock of cement-based materials leads to significant degradation of their physical properties. A typical scenario is calcium leaching due to water with a very low pH compared with that of the pore fluid. The main objective of this paper is to evaluate, mainly from an experimental point of view, the evolution of the microstructure induced by leaching of a cementitious composite, using synchrotron X-ray microtomography. In this particular case, it was possible to identify cracking induced by leaching. After a description of the degradation mechanism and the synchrotron X-ray microtomographic analysis, numerical simulations are performed in order to show that cracking is induced by an initial pre-stressing of the composite, coupled with decalcification shrinkage and a dramatic decrease in tensile strength during leaching. The X-ray microtomography analysis provided clear evidence of induced microcracking in a cementitious material subjected to leaching.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Amy Cha-Tien; Downes, Paula Sue; Heinen, Russell
Analysis of chemical supply chains is an inherently complex task, given the dependence of these supply chains on multiple infrastructure systems (e.g., the petroleum sector, transportation, etc.). This effort requires data and information at various levels of resolution, ranging from network-level distribution systems to individual chemical reactions. Sandia National Laboratories (Sandia) has integrated its existing simulation and infrastructure analysis capabilities with chemical data models to analyze the chemical supply chains of several nationally critical chemical commodities. This paper describes how Sandia models the ethylene supply chain, that is, the supply chain for the most widely used raw material for plastics production, including a description of the types of data and modeling capabilities that are required to represent it. The paper concludes with a description of Sandia's use of the model to project how the supply chain would be affected by, and adapt to, a disruptive hurricane scenario.
Sensitivity of fire behavior simulations to fuel model variations
Lucy A. Salazar
1985-01-01
Stylized fuel models, or numerical descriptions of fuel arrays, are used as inputs to fire behavior simulation models. These fuel models are often chosen on the basis of generalized fuel descriptions, which are related to field observations. Site-specific observations of fuels or fire behavior in the field are not readily available or necessary for most fire management...
Booth, Richard G; Scerbo, Christina Ko; Sinclair, Barbara; Hancock, Michele; Reid, David; Denomy, Eileen
2017-04-01
Little research has been completed exploring knowledge development and transfer from and between simulated and clinical practice settings in nurse education. This study sought to explore the content learned, and the knowledge transferred, in a hybrid mental health clinical course consisting of simulated and clinical setting experiences. A qualitative, interpretive descriptive study design was used. Clinical practice consisted of six 10-hour shifts in a clinical setting combined with six two-hour simulations. Twelve baccalaureate nursing students enrolled in a compressed-time-frame program at a large, urban, Canadian university participated. Document analysis and a focus group were used to draw thematic representations of content and knowledge transfer between clinical environments (i.e., simulated and clinical settings) using the constant comparative data analysis technique. Four major themes arose: (a) professional nursing behaviors; (b) understanding of the mental health nursing role; (c) confidence gained in interview skills; and (d) unexpected learning. Nurse educators should further explore the intermingling of simulation and clinical practice in terms of knowledge development and transfer, with the goal of preparing students to function within the mental health nursing specialty. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Cordier, P.; Sun, X.; Fressengeas, C.; Taupin, V.
2015-12-01
A crossover between an atomistic description and a continuous representation of grain boundaries in polycrystals is set up to model the periodic arrays of structural units by using dislocation and disclination dipole arrays along grain boundaries. Continuous modeling of the boundary is built by bottom-up processing, meaning that the strain, rotation, curvature, disclination and dislocation density fields are calculated by using the discrete atomic positions generated by molecular dynamics simulations. Continuous modeling of an 18.9° symmetric tilt boundary in copper [1] is conducted as a benchmark case. Its accuracy is validated by comparison with a similar recent technique [2]. Then, results on the 60.8° Mg2SiO4 tilt boundary [3-4] are presented. By linking the atomistic description with continuum mechanics representations, they provide new insights into the structure of the grain boundary. [1] Fressengeas, C., Taupin, V., Capolungo, L., 2014. Continuous modelling of the structure of symmetric tilt boundaries. Int. J. Solids Struct. 51, 1434-1441. [2] Zimmerman, J.A., Bammann, D.J., Gao, H., 2009. Deformation gradients for continuum mechanical analysis of atomistic simulations. Int. J. Solids Struct. 46, 238-253. [3] Cordier, P., Demouchy, S., Beausir, B., Taupin, V., Barou, F., Fressengeas, C., 2014. Disclinations provide the missing mechanism for deforming olivine-rich rocks in the mantle. Nature 507, 51-56. [4] Adjaoud, O., Marquardt, K., Jahn, S., 2012. Atomic structures and energies of grain boundaries in Mg2SiO4 forsterite from atomistic modeling. Phys. Chem. Miner. 39, 749-760.
Reliability database development for use with an object-oriented fault tree evaluation program
NASA Technical Reports Server (NTRS)
Heger, A. Sharif; Harringtton, Robert J.; Koen, Billy V.; Patterson-Hine, F. Ann
1989-01-01
A description is given of the development of a fault-tree analysis method using object-oriented programming. In addition, the authors discuss the programs that have been developed, or are under development, to connect a fault-tree analysis routine to a reliability database. To assess the performance of the routines, a relational database simulating one of the nuclear power industry databases has been constructed. For a realistic assessment of the results of this project, the use of one of the existing nuclear power reliability databases is planned.
Modal analysis for Liapunov stability of rotating elastic bodies. Ph.D. Thesis. Final Report
NASA Technical Reports Server (NTRS)
Colin, A. D.
1973-01-01
This study consisted of four parallel efforts: (1) modal analyses of elastic continua for Liapunov stability analysis of flexible spacecraft; (2) development of general-purpose simulation equations for arbitrary spacecraft; (3) evaluation of alternative mathematical models for elastic components of spacecraft; and (4) examination of the influence of vehicle flexibility on spacecraft attitude control system performance. A complete record is given of achievements under tasks (1) and (3), in the form of technical appendices, and a summary description of progress under tasks (2) and (4).
NASA Technical Reports Server (NTRS)
Bielawa, R. L.
1982-01-01
Mathematical development is presented for the expanded capabilities of the United Technologies Research Center (UTRC) G400 Rotor Aeroelastic Analysis. This expanded analysis, G400PA, simulates the dynamics of teetered rotors, blade pendulum vibration absorbers, and the higher harmonic excitations resulting from prescribed vibratory hub motions and higher harmonic blade pitch control. Formulations are also presented for calculating the rotor impedance matrix appropriate to these higher harmonic blade excitations. This impedance matrix and the associated vibratory hub loads are intended as the rotor blade characteristics elements for use in the Simplified Coupled Rotor/Fuselage Vibration Analysis (SIMVIB). Sections are included presenting updates to the development of the original G400 theory, and material appropriate to the user of the G400PA computer program. This material includes: (1) a general description of the structuring of the G400PA FORTRAN coding, (2) a detailed description of the required input data and other useful information for successfully running the program, and (3) a detailed description of the output results.
Electro-optical system for gunshot detection: analysis, concept, and performance
NASA Astrophysics Data System (ADS)
Kastek, M.; Dulski, R.; Madura, H.; Trzaskawka, P.; Bieszczad, G.; Sosnowski, T.
2011-08-01
The paper discusses the technical possibilities of building an effective electro-optical sensor unit for sniper detection using infrared cameras. This unit, comprising thermal and daylight cameras, can operate as a standalone device, but its primary application is in a multi-sensor sniper and shot detection system. First, an analysis is presented of three distinct phases of sniper activity: before, during, and after the shot. On the basis of experimental data, the parameters defining the relevant sniper signatures were determined; these are essential in assessing the capability of an infrared camera to detect sniper activity. A sniper's body and muzzle flash were analyzed as targets, and descriptions of the phenomena that make it possible to detect sniper activities in the infrared spectrum, as well as an analysis of physical limitations, were performed. The analyzed infrared systems were simulated using NVTherm software. Calculations were performed for several cameras equipped with different lenses and detector types. The simulation of detection ranges was performed for selected scenarios of sniper detection tasks. After analysis of the simulation results, the technical specifications of an infrared sniper detection system required to provide the assumed detection range were discussed. Finally, an infrared camera setup was proposed that can detect a sniper from a range of 1000 meters.
Development of weight/sizing design synthesis computer program. Volume 3: User Manual
NASA Technical Reports Server (NTRS)
Garrison, J. M.
1973-01-01
The user manual for the weight/sizing design synthesis program is presented. The program is applied to an analysis of the basic weight relationships for the space shuttle which contribute significant portions of the inert weight. The relationships measure the parameters of load, geometry, material, and environment. A verbal description of the processes simulated, data input procedures, output data, and values present in the program is included.
Construction of Interaction Layer on Socio-Environmental Simulation
NASA Astrophysics Data System (ADS)
Torii, Daisuke; Ishida, Toru
In this study, we propose a method to construct a system based on a legacy socio-environmental simulator, which enables the design of more realistic interaction models in socio-environmental simulations. First, to provide a computational model suitable for agent interactions, an interaction layer is constructed and connected from outside the legacy socio-environmental simulator. Next, to configure the agents' interaction abilities, a connection description for controlling the flow of information in the connection area is provided. As a concrete example, we realized an interaction layer in Q, a scenario description language, and connected it to CORMAS, a socio-environmental simulator. Finally, we discuss the capability of our method, using this system, in the Fire-Fighter domain.
Homogeneous buoyancy-generated turbulence
NASA Technical Reports Server (NTRS)
Batchelor, G. K.; Canuto, V. M.; Chasnov, J. R.
1992-01-01
Using a theoretical analysis of fundamental equations and a numerical simulation of the flow field, the statistically homogeneous motion that is generated by buoyancy forces after the creation of homogeneous random fluctuations in the density of an infinite fluid at an initial instant is examined. It is shown that the analytical results, together with the numerical results, provide a comprehensive description of the 'birth, life, and death' of buoyancy-generated turbulence. The numerical simulations yielded the mean-square density and mean-square velocity fluctuations and the associated spectra as functions of time for various initial conditions, and the time required for the mean-square density fluctuation to fall to a specified small value was estimated.
NASA Astrophysics Data System (ADS)
Cronkite-Ratcliff, C.; Phelps, G. A.; Boucher, A.
2011-12-01
In many geologic settings, the pathways of groundwater flow are controlled by geologic heterogeneities which have complex geometries. Models of these geologic heterogeneities, and consequently, their effects on the simulated pathways of groundwater flow, are characterized by uncertainty. Multiple-point geostatistics, which uses a training image to represent complex geometric descriptions of geologic heterogeneity, provides a stochastic approach to the analysis of geologic uncertainty. Incorporating multiple-point geostatistics into numerical models provides a way to extend this analysis to the effects of geologic uncertainty on the results of flow simulations. We present two case studies to demonstrate the application of multiple-point geostatistics to numerical flow simulation in complex geologic settings with both static and dynamic conditioning data. Both cases involve the development of a training image from a complex geometric description of the geologic environment. Geologic heterogeneity is modeled stochastically by generating multiple equally-probable realizations, all consistent with the training image. Numerical flow simulation for each stochastic realization provides the basis for analyzing the effects of geologic uncertainty on simulated hydraulic response. The first case study is a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. The SNESIM algorithm is used to stochastically model geologic heterogeneity conditioned to the mapped surface geology as well as vertical drill-hole data. Numerical simulation of groundwater flow and contaminant transport through geologic models produces a distribution of hydraulic responses and contaminant concentration results. From this distribution of results, the probability of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary. 
The second case study considers a characteristic lava-flow aquifer system in Pahute Mesa, Nevada. A 3D training image is developed by using object-based simulation of parametric shapes to represent the key morphologic features of rhyolite lava flows embedded within ash-flow tuffs. In addition to vertical drill-hole data, transient pressure head data from aquifer tests can be used to constrain the stochastic model outcomes. The use of both static and dynamic conditioning data allows the identification of potential geologic structures that control hydraulic response. These case studies demonstrate the flexibility of the multiple-point geostatistics approach for considering multiple types of data and for developing sophisticated models of geologic heterogeneities that can be incorporated into numerical flow simulations.
Analysis of post-mining excavations as places for municipal waste
NASA Astrophysics Data System (ADS)
Górniak-Zimroz, Justyna
2018-01-01
Waste management planning is an interdisciplinary task covering a wide range of issues, including costs, legal requirements, spatial planning, environmental protection, geography, demographics, and the techniques used in collecting, transporting, processing and disposing of waste. Designing and analyzing such a system is difficult and requires the advanced analysis methods and tools available in GIS geographic information systems, which contain readily available graphical and descriptive databases, data analysis tools providing expert decision support when selecting the best-designed alternative, and simulation models that allow the user to simulate many variants of waste management, together with graphical visualization of the results of the performed analyses. As part of this research study, work has been undertaken on the use of multi-criteria data analysis in waste management in areas located in southwestern Poland. This work proposes including in waste management post-mining excavations as places for the final or temporary collection of waste, assessed in terms of their suitability with the tools available in GIS systems.
Development and psychometric testing of the satisfaction with Cultural Simulation Experience Scale.
Courtney-Pratt, Helen; Levett-Jones, Tracy; Lapkin, Samuel; Pitt, Victoria; Gilligan, Conor; Van der Riet, Pamela; Rossiter, Rachel; Jones, Donovan; Everson, Naleya
2015-11-01
Decreasing the numbers of adverse health events experienced by people from culturally diverse backgrounds rests, in part, on the ability of education providers to provide quality learning experiences that support nursing students in developing cultural competence, an essential professional attribute. This paper reports on the implementation and evaluation of an immersive 3D cultural empathy simulation. The Satisfaction with Cultural Simulation Experience Scale used in this study was adapted and validated as the first stage of the study. Exploratory factor analysis and confirmatory factor analysis were undertaken to investigate the psychometric properties of the scale using two randomly split sub-samples. Cronbach's alpha was used to examine internal consistency reliability. Descriptive statistics were used for the analysis of mean satisfaction scores, and qualitative comments in response to open-ended questions were analysed and coded. A purposive sample (n = 497) of second-year nursing students participated in the study. The overall Cronbach's alpha for the scale was 0.95, and each subscale demonstrated high internal consistency (0.92, 0.92, and 0.72, respectively). The mean satisfaction score was 4.64 (SD 0.51) out of a maximum of 5, indicating a high level of participant satisfaction with the simulation. Three themes emerged from the qualitative analysis: "Becoming culturally competent", "Learning from the debrief" and "Reflecting on practice". The cultural simulation was highly regarded by students. Psychometric testing of the Satisfaction with Cultural Simulation Experience Scale demonstrated that it is a reliable instrument. However, there is room for improvement, and further testing in other contexts is therefore recommended. Copyright © 2015 Elsevier Ltd. All rights reserved.
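The internal-consistency statistic reported above, Cronbach's alpha, can be computed directly from a respondents-by-items score matrix; a minimal sketch (function and variable names hypothetical, not the study's own analysis code):

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items score matrix,
    using sample variances (ddof = 1):
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
    k = len(scores[0])  # number of items

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# perfectly correlated items give the maximum alpha of 1
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```

Values near 1, like the 0.95 reported for the overall scale, indicate that the items covary strongly and are plausibly measuring a single construct.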
Adiabatic coarse-graining and simulations of stochastic biochemical networks
Sinitsyn, N. A.; Hengartner, Nicolas; Nemenman, Ilya
2009-01-01
We propose a universal approach for analysis and fast simulations of stiff stochastic biochemical networks, which rests on elimination of fast chemical species without a loss of information about mesoscopic, non-Poissonian fluctuations of the slow ones. Our approach is similar to the Born–Oppenheimer approximation in quantum mechanics and follows from the stochastic path integral representation of the cumulant generating function of reaction events. In applications with a small number of chemical reactions, it produces analytical expressions for cumulants of chemical fluxes between the slow variables. This allows for a low-dimensional, interpretable representation and can be used for high-accuracy, low-complexity coarse-grained numerical simulations. As an example, we derive the coarse-grained description for a chain of biochemical reactions and show that the coarse-grained and the microscopic simulations agree, but the former is 3 orders of magnitude faster. PMID:19525397
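The microscopic baseline that such coarse-graining accelerates is exact stochastic simulation of the reaction network. A minimal Gillespie-style sketch (this is the standard SSA, not the paper's path-integral elimination method) for a generic network:

```python
import math
import random

def gillespie(x, propensities, stoich, t_end, rng=None):
    """Exact stochastic simulation: draw an exponential waiting time
    from the total propensity, then pick the firing reaction with
    probability proportional to its individual propensity."""
    rng = rng or random.Random(1)
    t, traj = 0.0, [(0.0, list(x))]
    while t < t_end:
        a = propensities(x)
        a0 = sum(a)
        if a0 == 0.0:          # no reaction can fire; system is absorbed
            break
        t += -math.log(1.0 - rng.random()) / a0
        r, i, acc = rng.random() * a0, 0, a[0]
        while acc < r:
            i += 1
            acc += a[i]
        x = [xi + si for xi, si in zip(x, stoich[i])]
        traj.append((t, list(x)))
    return traj

# pure death process A -> 0: every molecule eventually decays
traj = gillespie([5], lambda x: [1.0 * x[0]], [[-1]], t_end=1e6)
```

For stiff networks this loop spends nearly all its steps on the fast reactions, which is precisely the cost the paper's adiabatic elimination removes.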
VLP Simulation: An Interactive Simple Virtual Model to Encourage Geoscience Skill about Volcano
NASA Astrophysics Data System (ADS)
Hariyono, E.; Liliasari; Tjasyono, B.; Rosdiana, D.
2017-09-01
The purpose of this study was to describe physics students' predicting skills after geoscience learning using the VLP (Volcano Learning Project) simulation. The research was conducted with 24 physics students at a state university in East Java, Indonesia. The method used is descriptive analysis based on students' answers related to predicting skills about volcanic activity. The results showed that learning with the VLP simulation has great potential to develop physics students' predicting skills. Students were able to explain volcanic activity logically and to predict potential eruptions based on visualization of real data. It can be concluded that the VLP simulation is well suited to physics students' requirements for developing geoscience skills and is recommended as an alternative medium for educating society about volcanic phenomena.
NASA Technical Reports Server (NTRS)
Geyser, L. C.
1978-01-01
A digital computer program, DYGABCD, was developed that generates linearized dynamic models of simulated turbofan and turbojet engines. DYGABCD is based on an earlier computer program, DYNGEN, which is capable of calculating simulated nonlinear steady-state and transient performance of one- and two-spool turbojet engines or two- and three-spool turbofan engines. Most control design techniques require linear system descriptions. For multiple-input/multiple-output systems such as turbine engines, state-space matrix descriptions of the system are often desirable. DYGABCD computes the state-space matrices, commonly referred to as the A, B, C, and D matrices, required for a linear system description. The report discusses the analytical approach and provides a user's manual, FORTRAN listings, and a sample case.
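The linearization step can be illustrated with a generic finite-difference sketch (not DYGABCD's actual algorithm) that perturbs each state and input about an operating point to build the A, B, C, and D matrices:

```python
def jacobian(fun, z, eps=1e-6):
    """Numerical Jacobian of fun at z: rows are outputs, columns inputs."""
    fz = fun(z)
    cols = []
    for i in range(len(z)):
        zp = list(z)
        zp[i] += eps
        cols.append([(a - b) / eps for a, b in zip(fun(zp), fz)])
    return [list(row) for row in zip(*cols)]

def linearize(f, g, x0, u0):
    """State-space matrices about (x0, u0) for xdot = f(x, u), y = g(x, u):
    dxdot ~ A dx + B du,  dy ~ C dx + D du."""
    A = jacobian(lambda x: f(x, u0), x0)
    B = jacobian(lambda u: f(x0, u), u0)
    C = jacobian(lambda x: g(x, u0), x0)
    D = jacobian(lambda u: g(x0, u), u0)
    return A, B, C, D

# first-order lag as a one-state example: xdot = -2x + u, y = x
A, B, C, D = linearize(lambda x, u: [-2 * x[0] + u[0]],
                       lambda x, u: [x[0]], [0.0], [0.0])
```

For an engine model, f and g would be the nonlinear steady-state and output relations evaluated by the simulation (DYNGEN's role here), and the operating point (x0, u0) a trimmed engine condition.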
Reactive transport codes for subsurface environmental simulation
Steefel, C. I.; Appelo, C. A. J.; Arora, B.; ...
2014-09-26
A general description of the mathematical and numerical formulations used in modern numerical reactive transport codes relevant for subsurface environmental simulations is presented. The formulations are followed by short descriptions of commonly used and available subsurface simulators that consider continuum representations of flow, transport, and reactions in porous media. These formulations are applicable to most of the subsurface environmental benchmark problems included in this special issue. The list of codes described briefly here includes PHREEQC, HPx, PHT3D, OpenGeoSys (OGS), HYTEC, ORCHESTRA, TOUGHREACT, eSTOMP, HYDROGEOCHEM, CrunchFlow, MIN3P, and PFLOTRAN. The descriptions include a high-level list of capabilities for each of the codes, along with a selective list of applications that highlight their capabilities and historical development.
LISP based simulation generators for modeling complex space processes
NASA Technical Reports Server (NTRS)
Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing
1987-01-01
The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.
ST-analyzer: a web-based user interface for simulation trajectory analysis.
Jeong, Jong Cheol; Jo, Sunhwan; Wu, Emilia L; Qi, Yifei; Monje-Galvan, Viviana; Yeom, Min Sun; Gorenstein, Lev; Chen, Feng; Klauda, Jeffery B; Im, Wonpil
2014-05-05
Molecular dynamics (MD) simulation has become one of the key tools to obtain deeper insights into biological systems using various levels of descriptions such as all-atom, united-atom, and coarse-grained models. Recent advances in computing resources and MD programs have significantly accelerated the simulation time and thus increased the amount of trajectory data. Although many laboratories routinely perform MD simulations, analyzing MD trajectories is still time consuming and often a difficult task. ST-analyzer, http://im.bioinformatics.ku.edu/st-analyzer, is a standalone graphical user interface (GUI) toolset to perform various trajectory analyses. ST-analyzer has several outstanding features compared to other existing analysis tools: (i) handling various formats of trajectory files from MD programs, such as CHARMM, NAMD, GROMACS, and Amber, (ii) intuitive web-based GUI environment--minimizing administrative load and reducing burdens on the user from adapting new software environments, (iii) platform independent design--working with any existing operating system, (iv) easy integration into job queuing systems--providing options of batch processing either on the cluster or in an interactive mode, and (v) providing independence between foreground GUI and background modules--making it easier to add personal modules or to recycle/integrate pre-existing scripts utilizing other analysis tools. The current ST-analyzer contains nine main analysis modules that together contain 18 options, including density profile, lipid deuterium order parameters, surface area per lipid, and membrane hydrophobic thickness. This article introduces ST-analyzer with its design, implementation, and features, and also illustrates practical analysis of lipid bilayer simulations. Copyright © 2014 Wiley Periodicals, Inc.
Flow of GE90 Turbofan Engine Simulated
NASA Technical Reports Server (NTRS)
Veres, Joseph P.
1999-01-01
The objective of this task was to create and validate a three-dimensional model of the GE90 turbofan engine (General Electric) using the APNASA (average passage) flow code. This was a joint effort between GE Aircraft Engines and the NASA Lewis Research Center. The goal was to perform an aerodynamic analysis of the engine primary flow path, in under 24 hours of CPU time, on a parallel distributed workstation system. Enhancements were made to the APNASA Navier-Stokes code to make it faster and more robust and to allow for the analysis of more arbitrary geometry. The resulting simulation exploited the use of parallel computations by using two levels of parallelism, with extremely high efficiency. The primary flow path of the GE90 turbofan consists of a nacelle and inlet, 49 blade rows of turbomachinery, and an exhaust nozzle. Secondary flows entering and exiting the primary flow path, such as bleed, purge, and cooling flows, were modeled macroscopically as source terms to accurately simulate the engine. The information on these source terms came from detailed descriptions of the cooling flow and from thermodynamic cycle system simulations. These provided boundary condition data to the three-dimensional analysis. A simplified combustor was used to feed boundary conditions to the turbomachinery. Flow simulations of the fan, high-pressure compressor, and high- and low-pressure turbines were completed with the APNASA code.
Buslaev, Pavel; Gordeliy, Valentin; Grudinin, Sergei; Gushchin, Ivan
2016-03-08
Molecular dynamics simulations of lipid bilayers are ubiquitous nowadays. Usually, either global properties of the bilayer or some particular characteristics of each lipid molecule are evaluated in such simulations, but the structural properties of the molecules as a whole are rarely studied. Here, we show how a comprehensive quantitative description of conformational space and dynamics of a single lipid molecule can be achieved via the principal component analysis (PCA). We illustrate the approach by analyzing and comparing simulations of DOPC bilayers obtained using eight different force fields: all-atom generalized AMBER, CHARMM27, CHARMM36, Lipid14, and Slipids and united-atom Berger, GROMOS43A1-S3, and GROMOS54A7. Similarly to proteins, most of the structural variance of a lipid molecule can be described by only a few principal components. These major components are similar in different simulations, although there are notable distinctions between the older and newer force fields and between the all-atom and united-atom force fields. The DOPC molecules in the simulations generally equilibrate on the time scales of tens to hundreds of nanoseconds. The equilibration is the slowest in the GAFF simulation and the fastest in the Slipids simulation. Somewhat unexpectedly, the equilibration in the united-atom force fields is generally slower than in the all-atom force fields. Overall, there is a clear separation between the more variable previous generation force fields and significantly more similar new generation force fields (CHARMM36, Lipid14, Slipids). We expect that the presented approaches will be useful for quantitative analysis of conformations and dynamics of individual lipid molecules in other simulations of lipid bilayers.
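The per-molecule PCA workflow this abstract describes can be sketched compactly. The following NumPy snippet is an illustrative reconstruction on synthetic coordinate data, not the authors' code; the function and variable names are hypothetical.

```python
import numpy as np

def lipid_pca(conformations):
    """PCA of single-lipid conformations.

    conformations: (n_frames, n_coords) array, each row one flattened,
    aligned coordinate snapshot of a lipid molecule.
    Returns per-component variances and principal axes, sorted by
    decreasing variance.
    """
    X = conformations - conformations.mean(axis=0)  # center the snapshots
    cov = X.T @ X / (len(X) - 1)                    # covariance matrix
    evals, evecs = np.linalg.eigh(cov)              # eigh returns ascending order
    order = np.argsort(evals)[::-1]                 # re-sort: largest variance first
    return evals[order], evecs[:, order]

# Synthetic "trajectory": 200 frames, 30 coordinates, one dominant mode,
# mimicking the observation that a few components capture most variance.
rng = np.random.default_rng(0)
mode = rng.normal(size=30)
frames = np.outer(rng.normal(size=200), mode) + rng.normal(0, 0.1, (200, 30))
variances, axes = lipid_pca(frames)
explained = variances[0] / variances.sum()
```

In the synthetic data, most of the variance falls on the first component, mirroring the paper's finding that a lipid molecule's structural variance is dominated by a few principal components.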
Computers for real time flight simulation: A market survey
NASA Technical Reports Server (NTRS)
Bekey, G. A.; Karplus, W. J.
1977-01-01
An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.
Survey of factors influencing learner engagement with simulation debriefing among nursing students.
Roh, Young Sook; Jang, Kie In
2017-12-01
Simulation-based education has expanded worldwide, yet few studies have rigorously explored predictors of learner engagement with simulation debriefing. The purpose of this cross-sectional, descriptive survey was to identify factors that determine learner engagement with simulation debriefing among nursing students. A convenience sample of 296 Korean nursing students enrolled in a simulation-based course completed the survey. A total of five instruments were used: (i) Characteristics of Debriefing; (ii) Debriefing Assessment for Simulation in Healthcare - Student Version; (iii) the Korean version of the Simulation Design Scale; (iv) Communication Skills Scale; and (v) Clinical-Based Stress Scale. Multiple regression analysis was performed to investigate the influencing factors. The results indicated that the factors influencing learner engagement with simulation debriefing were simulation design, confidentiality, stress, and number of students, with simulation design being the most important. Video-assisted debriefing was not a significant factor affecting learner engagement. Educators should organize and conduct debriefing activities with these factors in mind to effectively induce learner engagement. Further study is needed to identify the effects on learning outcomes of debriefing sessions that target learners' needs and account for situational factors. © 2017 John Wiley & Sons Australia, Ltd.
Transient hydrodynamic finite-size effects in simulations under periodic boundary conditions
NASA Astrophysics Data System (ADS)
Asta, Adelchi J.; Levesque, Maximilien; Vuilleumier, Rodolphe; Rotenberg, Benjamin
2017-06-01
We use lattice-Boltzmann and analytical calculations to investigate transient hydrodynamic finite-size effects induced by the use of periodic boundary conditions. These effects are inevitable in simulations at the molecular, mesoscopic, or continuum levels of description. We analyze the transient response to a local perturbation in the fluid and obtain the local velocity correlation function via linear response theory. This approach is validated by comparing the finite-size effects on the steady-state velocity with the known results for the diffusion coefficient. We next investigate the full time dependence of the local velocity autocorrelation function. We find at long times a crossover between the expected t^{-3/2} hydrodynamic tail and an oscillatory exponential decay, and study the scaling with the system size of the crossover time, exponential rate and amplitude, and oscillation frequency. We interpret these results from the analytic solution of the compressible Navier-Stokes equation for the slowest modes, which are set by the system size. The present work not only provides a comprehensive analysis of hydrodynamic finite-size effects in bulk fluids, which arise regardless of the level of description and simulation algorithm, but also establishes the lattice-Boltzmann method as a suitable tool to investigate such effects in general.
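The central quantity here, the velocity autocorrelation function whose long-time behavior crosses over from the hydrodynamic t^{-3/2} tail to an oscillatory exponential decay, can be estimated from any particle trajectory. The sketch below is a generic NumPy estimator run on synthetic data, not the authors' lattice-Boltzmann code.

```python
import numpy as np

def velocity_autocorrelation(v, max_lag):
    """Normalized autocorrelation C(t) = <v(0).v(t)> / <v(0).v(0)>.

    v: array of shape (n_steps, n_particles, 3) of particle velocities.
    """
    n = len(v)
    c0 = np.mean(np.sum(v * v, axis=-1))        # zero-lag normalization
    acf = np.empty(max_lag)
    for lag in range(max_lag):
        # average the dot product over all time origins and particles
        acf[lag] = np.mean(np.sum(v[: n - lag] * v[lag:], axis=-1)) / c0
    return acf

# Synthetic check: an exponentially correlated (Ornstein-Uhlenbeck-like)
# velocity signal should give a smoothly decaying autocorrelation.
rng = np.random.default_rng(1)
v = np.zeros((2000, 10, 3))
for t in range(1, 2000):
    v[t] = 0.9 * v[t - 1] + rng.normal(0, 0.1, size=(10, 3))
acf = velocity_autocorrelation(v, 50)
```

Fitting the measured tail of such an estimator against a t^{-3/2} power law versus a damped oscillation is one way to locate the crossover time the paper studies.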
Inference for Stochastic Chemical Kinetics Using Moment Equations and System Size Expansion.
Fröhlich, Fabian; Thomas, Philipp; Kazeroonian, Atefeh; Theis, Fabian J; Grima, Ramon; Hasenauer, Jan
2016-07-01
Quantitative mechanistic models are valuable tools for disentangling biochemical pathways and for achieving a comprehensive understanding of biological systems. However, to be quantitative the parameters of these models have to be estimated from experimental data. In the presence of significant stochastic fluctuations this is a challenging task as stochastic simulations are usually too time-consuming and a macroscopic description using reaction rate equations (RREs) is no longer accurate. In this manuscript, we therefore consider moment-closure approximation (MA) and the system size expansion (SSE), which approximate the statistical moments of stochastic processes and tend to be more precise than macroscopic descriptions. We introduce gradient-based parameter optimization methods and uncertainty analysis methods for MA and SSE. Efficiency and reliability of the methods are assessed using simulation examples as well as by an application to data for Epo-induced JAK/STAT signaling. The application revealed that even if merely population-average data are available, MA and SSE improve parameter identifiability in comparison to RRE. Furthermore, the simulation examples revealed that the resulting estimates are more reliable for an intermediate volume regime. In this regime the estimation error is reduced and we propose methods to determine the regime boundaries. These results illustrate that inference using MA and SSE is feasible and possesses a high sensitivity.
Physics-based statistical learning approach to mesoscopic model selection.
Taverniers, Søren; Haut, Terry S; Barros, Kipton; Alexander, Francis J; Lookman, Turab
2015-11-01
In materials science and many other research areas, models are frequently inferred without considering their generalization to unseen data. We apply statistical learning using cross-validation to obtain an optimally predictive coarse-grained description of a two-dimensional kinetic nearest-neighbor Ising model with Glauber dynamics (GD) based on the stochastic Ginzburg-Landau equation (sGLE). The latter is learned from GD "training" data using a log-likelihood analysis, and its predictive ability at various model complexities is tested on GD "test" data independent of the data used to train the model. Using two different error metrics, we perform a detailed analysis of the error between magnetization time trajectories simulated using the learned sGLE coarse-grained description and those obtained using the GD model. We show that both for equilibrium and out-of-equilibrium GD training trajectories, the standard phenomenological description using a quartic free energy does not always yield the most predictive coarse-grained model. Moreover, increasing the amount of training data can shift the optimal model complexity to higher values. Our results are promising in that they pave the way for the use of statistical learning as a general tool for materials modeling and discovery.
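The cross-validation logic used here for model selection can be illustrated generically: fit candidate models of increasing complexity on training folds and score them on held-out folds. The snippet below is a toy polynomial version of that idea, not the sGLE/Glauber-dynamics machinery of the paper.

```python
import numpy as np

def cv_error(x, y, degree, k=5, seed=0):
    """Mean k-fold cross-validation MSE of a polynomial fit of given degree."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))           # shuffle so folds interleave in x
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)     # all indices not in the held-out fold
        coeffs = np.polyfit(x[train], y[train], degree)
        errs.append(np.mean((np.polyval(coeffs, x[fold]) - y[fold]) ** 2))
    return float(np.mean(errs))

# Toy "training" data: a cubic signal plus noise. Cross-validation should
# reject the underfitting linear model; analogously, the paper finds the
# standard quartic free energy is not automatically the most predictive.
rng = np.random.default_rng(4)
x = np.linspace(-1.0, 1.0, 120)
y = x**3 - 0.5 * x + rng.normal(0, 0.05, size=x.size)
errors = {d: cv_error(x, y, d) for d in (1, 3, 9)}
```

The held-out error, rather than the training fit, is what ranks the candidate complexities.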
Airport-Noise Levels and Annoyance Model (ALAMO) user's guide
NASA Technical Reports Server (NTRS)
Deloach, R.; Donaldson, J. L.; Johnson, M. J.
1986-01-01
A guide for the use of the Airport-Noise Levels and Annoyance MOdel (ALAMO) at the Langley Research Center computer complex is provided. This document is divided into five primary sections: the introduction, the purpose of the model, and in-depth descriptions of the following subsystems: baseline, noise reduction simulation, and track analysis. For each subsystem, the user is provided with a description of the architecture, an explanation of subsystem use, sample results, and a case runner's checklist. It is assumed that the user is familiar with operations at the Langley Research Center (LaRC) computer complex, the Network Operating System (NOS 1.4), and CYBER Control Language. Incorporated within the ALAMO model is a census database system called SITE II.
Kinetics of propagation of the lattice excitation in a swift heavy ion track
NASA Astrophysics Data System (ADS)
Lipp, V. P.; Volkov, A. E.; Sorokin, M. V.; Rethfeld, B.
2011-05-01
In this research we verify the applicability of temperature and heat diffusion concepts to the description of subpicosecond lattice excitations in nanometric tracks of swift heavy ions (SHI) decelerated in solids in the electronic stopping regime. The method is based on molecular dynamics (MD) analysis of the temporal evolution of the local kinetic and configurational temperatures of a lattice. We used solid argon as the model system. MD simulations demonstrated that in an SHI track (a) thermalization of lattice excitations takes several picoseconds, and (b) application of the parabolic heat diffusion equation to the description of spatial and temporal propagation of lattice excitations is questionable at least up to 10 ps after the ion passage.
NASA Astrophysics Data System (ADS)
Hardie, Russell C.; Power, Jonathan D.; LeMaster, Daniel A.; Droege, Douglas R.; Gladysz, Szymon; Bose-Pillai, Santasri
2017-07-01
We present a numerical wave propagation method for simulating imaging of an extended scene under anisoplanatic conditions. While isoplanatic simulation is relatively common, few tools are specifically designed for simulating the imaging of extended scenes under anisoplanatic conditions. We provide a complete description of the proposed simulation tool, including the wave propagation method used. Our approach computes an array of point spread functions (PSFs) for a two-dimensional grid on the object plane. The PSFs are then used in a spatially varying weighted sum operation, with an ideal image, to produce a simulated image with realistic optical turbulence degradation. The degradation includes spatially varying warping and blurring. To produce the PSF array, we generate a series of extended phase screens. Simulated point sources are numerically propagated from an array of positions on the object plane, through the phase screens, and ultimately to the focal plane of the simulated camera. Note that the optical path for each PSF is different and thus passes through a different portion of the extended phase screens. These different paths give rise to a spatially varying PSF and produce anisoplanatic effects. We use a method for defining the individual phase screen statistics that we have not seen used in previous anisoplanatic simulations. We also present a validation analysis. In particular, we compare simulated outputs with the theoretical anisoplanatic tilt correlation and a derived differential tilt variance statistic, in addition to comparing the long- and short-exposure PSFs and the isoplanatic angle. We believe this analysis represents the most thorough validation of an anisoplanatic simulation to date. The current work is also unique in that we simulate and validate both constant and varying Cn^2(z) profiles. Furthermore, we simulate sequences with both temporally independent and temporally correlated turbulence effects. Temporal correlation is introduced by generating even larger extended phase screens and translating this block of screens in front of the propagation area. Our validation analysis shows an excellent match between the simulation statistics and the theoretical predictions. We therefore believe this tool can be used effectively to study anisoplanatic optical turbulence and to aid in the development of image restoration methods.
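The degradation step described above, combining an ideal image with a grid of PSFs in a spatially varying weighted sum, can be sketched generically. This NumPy snippet is an illustrative stand-in for the paper's tool (direct small-kernel correlation, which equals convolution for symmetric PSFs); all names are hypothetical.

```python
import numpy as np

def blur_with_psf_grid(image, psfs, weights):
    """Apply a spatially varying blur as a weighted sum of filtered images.

    image:   (H, W) ideal image.
    psfs:    list of K small odd-sized (k, k) normalized PSFs, one per
             object-plane grid point.
    weights: (K, H, W) interpolation weights, summing to 1 at every pixel.
    """
    H, W = image.shape
    out = np.zeros((H, W))
    for psf, w in zip(psfs, weights):
        k = psf.shape[0] // 2
        padded = np.pad(image, k, mode="edge")
        filtered = np.zeros((H, W))
        for dy in range(-k, k + 1):          # direct correlation, small kernels
            for dx in range(-k, k + 1):
                filtered += psf[k + dy, k + dx] * padded[k + dy : k + dy + H,
                                                         k + dx : k + dx + W]
        out += w * filtered                  # per-pixel interpolation weight
    return out

# Example: sharp (delta) PSF on the left of the frame, box blur on the right.
rng = np.random.default_rng(2)
img = rng.random((16, 16))
delta = np.zeros((3, 3)); delta[1, 1] = 1.0
box = np.full((3, 3), 1 / 9)
w_left = np.tile(np.linspace(1, 0, 16), (16, 1))
blurred = blur_with_psf_grid(img, [delta, box], np.stack([w_left, 1 - w_left]))
```

A real anisoplanatic simulation would populate `psfs` from wave propagation through the phase screens and derive `weights` from bilinear interpolation over the object-plane grid; the combining step itself is this simple.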
Langenderfer, Joseph E; Rullkoetter, Paul J; Mell, Amy G; Laz, Peter J
2009-04-01
An accurate assessment of shoulder kinematics is useful for understanding healthy normal and pathological mechanics. Small variability in identifying and locating anatomical landmarks (ALs) has the potential to affect reported shoulder kinematics. The objectives of this study were to quantify the effect of landmark location variability on scapular and humeral kinematic descriptions for multiple subjects using probabilistic analysis methods, and to evaluate the consistency in results across multiple subjects. Data from 11 healthy subjects performing humeral elevation in the scapular plane were used to calculate Euler angles describing humeral and scapular kinematics. Probabilistic analyses were performed for each subject to simulate uncertainty in the locations of 13 upper-extremity ALs. For standard deviations of 4 mm in landmark location, the analysis predicted Euler angle envelopes between the 1st and 99th percentile bounds of up to 16.6 degrees. While absolute kinematics varied with the subject, the average 1-99% kinematic ranges for the motion were consistent across subjects, and sensitivity factors showed no statistically significant differences between subjects. The description of humeral kinematics was most sensitive to the location of landmarks on the thorax, while landmarks on the scapula had the greatest effect on the description of scapular elevation. The findings of this study can provide a better understanding of kinematic variability, which can aid in making accurate clinical diagnoses and refining kinematic measurement techniques.
Sniper detection using infrared camera: technical possibilities and limitations
NASA Astrophysics Data System (ADS)
Kastek, M.; Dulski, R.; Trzaskawka, P.; Bieszczad, G.
2010-04-01
The paper discusses the technical possibilities of building an effective system for sniper detection using infrared cameras. The phenomena that make it possible to detect sniper activity in the infrared spectrum are described, and the physical limitations are analyzed. Cooled and uncooled detectors were considered. Three phases of sniper activity were taken into consideration: before, during, and after the shot. On the basis of experimental data, the target-defining parameters essential for assessing the capability of an infrared camera to detect sniper activity were determined. A sniper's body and muzzle flash were analyzed as targets. Detection ranges were simulated for an assumed sniper-detection scenario. An infrared sniper detection system capable of fulfilling these requirements is discussed, and the results of the analysis and simulations are presented.
Zhang, Jun
To explore the subjective learning experiences of baccalaureate nursing students participating in simulation sessions in a Chinese nursing school. This was a qualitative descriptive study. We used semi-structured interviews to explore students' perceptions of simulation-assisted learning. Each interview was audio-taped and transcribed verbatim. Thematic analysis was used to identify the major themes or categories from the transcripts and the field notes. Only 10 students were needed to achieve theoretical saturation, due to high group homogeneity. Three main themes were found: (1) students' positive views of the new educational experience of simulation; (2) factors currently making simulation less attractive to students; and (3) the teacher's role in ensuring a positive learning experience. Simulation-assisted teaching was a positive experience for the majority of nursing students. Further efforts are needed in developing a quality simulation-based course curriculum and in planning and structuring its teaching process. This pedagogical approach requires close collaboration between faculty and students. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Gherghel-Lascu, A.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bekk, K.; Bertaina, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Engler, J.; Fuchs, B.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Kang, D.; Klages, H. O.; Link, K.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Petcu, M.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Zabierowski, J.
2015-02-01
In previous studies of KASCADE-Grande data, a Monte Carlo simulation code based on the GEANT3 program has been developed to describe the energy deposited by EAS particles in the detector stations. In an attempt to decrease the simulation time and ensure compatibility with the geometry description in standard KASCADE-Grande analysis software, several structural elements have been neglected in the implementation of the Grande station geometry. To improve the agreement between experimental and simulated data, a more accurate simulation of the response of the KASCADE-Grande detector is necessary. A new simulation code has been developed based on the GEANT4 program, including a realistic geometry of the detector station with structural elements that have not been considered in previous studies. The new code is used to study the influence of a realistic detector geometry on the energy deposited in the Grande detector stations by particles from EAS events simulated by CORSIKA. Lateral Energy Correction Functions are determined and compared with previous results based on GEANT3.
NASA Astrophysics Data System (ADS)
Olivieri, Giorgia; Parry, Krista M.; Powell, Cedric J.; Tobias, Douglas J.; Brown, Matthew A.
2016-04-01
Over the past decade, energy-dependent ambient pressure X-ray photoelectron spectroscopy (XPS) has emerged as a powerful analytical probe of the ion spatial distributions at the vapor (vacuum)-aqueous electrolyte interface. These experiments are often paired with complementary molecular dynamics (MD) simulations in an attempt to provide a complete description of the liquid interface. There is, however, no systematic protocol that permits a straightforward comparison of the two sets of results. XPS is an integrated technique that averages signals from multiple layers in a solution even at the lowest photoelectron kinetic energies routinely employed, whereas MD simulations provide a microscopic layer-by-layer description of the solution composition near the interface. Here, we use the National Institute of Standards and Technology database for the Simulation of Electron Spectra for Surface Analysis (SESSA) to quantitatively interpret atom-density profiles from MD simulations for XPS signal intensities using sodium and potassium iodide solutions as examples. We show that electron inelastic mean free paths calculated from a semi-empirical formula depend strongly on solution composition, varying by up to 30% between pure water and concentrated NaI. The XPS signal thus arises from different information depths in different solutions for a fixed photoelectron kinetic energy. XPS signal intensities are calculated using SESSA as a function of photoelectron kinetic energy (probe depth) and compared with a widely employed ad hoc method. SESSA simulations illustrate the importance of accounting for elastic-scattering events at low photoelectron kinetic energies (<300 eV) where the ad hoc method systematically underestimates the preferential enhancement of anions over cations. Finally, some technical aspects of applying SESSA to liquid interfaces are discussed.
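The "ad hoc" depth-weighting that SESSA is compared against reduces to integrating the MD density profile against exp(-z/λ), with λ the inelastic mean free path. The sketch below is a minimal illustration of that straight-line attenuation model only (no elastic scattering, which is SESSA's advantage), using made-up density profiles.

```python
import numpy as np

def xps_intensity(z, density, imfp):
    """Depth-weighted XPS signal from a simulated density profile.

    z:       depths below the surface (nm), uniformly spaced, increasing
             into the liquid.
    density: atom number density at those depths (e.g. from MD).
    imfp:    inelastic mean free path lambda (nm) at the photoelectron
             kinetic energy of interest.
    Straight-line attenuation: signal ~ integral of n(z) exp(-z/lambda) dz.
    Elastic scattering (which SESSA accounts for) is neglected here.
    """
    dz = z[1] - z[0]
    return np.sum(density * np.exp(-z / imfp)) * dz

z = np.linspace(0.0, 5.0, 501)
anion = 1.0 + 0.5 * np.exp(-z / 0.3)   # hypothetical surface-enhanced species
cation = np.ones_like(z)               # hypothetical homogeneous species
# Anion/cation intensity ratio at two probe depths: short imfp is more
# surface sensitive, so it reports a larger apparent surface enhancement.
ratio_short = xps_intensity(z, anion, 0.5) / xps_intensity(z, cation, 0.5)
ratio_long = xps_intensity(z, anion, 2.0) / xps_intensity(z, cation, 2.0)
```

Scanning the photoelectron kinetic energy (and hence λ) and watching this ratio change is the qualitative content of the energy-dependent XPS experiments the abstract refers to.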
Simulation-Based Abdominal Ultrasound Training - A Systematic Review.
Østergaard, M L; Ewertsen, C; Konge, L; Albrecht-Beste, E; Bachmann Nielsen, M
2016-06-01
The aim is to provide a complete overview of the different simulation-based training options for abdominal ultrasound and to explore the evidence of their effect. This systematic review was performed according to the PRISMA guidelines, and Medline, Embase, Web of Science, and the Cochrane Library were searched. Articles were divided into three categories based on study design (randomized controlled trials, before-and-after studies, and descriptive studies) and assessed for level of evidence using the Oxford Centre for Evidence Based Medicine (OCEBM) system and for bias using the Cochrane Collaboration risk of bias assessment tool. Seventeen studies were included in the analysis: four randomized controlled trials, eight before-and-after studies with pre- and post-test evaluations, and five descriptive studies. No studies scored the highest level of evidence, and 14 had the lowest level. Bias was high for 11 studies, low for four, and unclear for two. No studies used a test with established evidence of validity or examined the correlation between skills obtained on the simulators and real-life clinical skills. Only one study used blinded assessors. The included studies were heterogeneous in the choice of simulator, study design, participants, and outcome measures, and the level of evidence for effect was inadequate. In all studies, simulation training was at least as beneficial as other instruction or no instruction. Study designs had significant built-in bias and confounding issues; therefore, further research should be based on randomized controlled trials using tests with validity evidence and blinded assessors. © Georg Thieme Verlag KG Stuttgart · New York.
Engineering design and integration simulation utilization manual
NASA Technical Reports Server (NTRS)
Hirsch, G. N.
1976-01-01
A description of the Engineering Design Integration (EDIN) Simulation System as it exists at Johnson Space Center is provided. A discussion of the EDIN Simulation System capabilities and applications is presented.
Extensions and Adjuncts to the BRL-COMGEOM Program
1974-08-01
Keywords: MAGIC Code, GIFT Code, Computer Simulation, Target Description, Geometric Modeling Techniques, Vulnerability Analysis. The tasks completed under this contract and described in the report concern extensions and adjuncts to the BRL-GIFT code, including the addition of new body types (such as the arbitrary quadric surface) to the list of available body types, and BRITL, a geometry preprocessor program for input to the GIFT system.
McCarty, J; Clark, A J; Copperman, J; Guenza, M G
2014-05-28
Structural and thermodynamic consistency of coarse-graining models across multiple length scales is essential for the predictive role of multi-scale modeling and molecular dynamic simulations that use mesoscale descriptions. Our approach is a coarse-grained model based on integral equation theory, which can represent polymer chains at variable levels of chemical details. The model is analytical and depends on molecular and thermodynamic parameters of the system under study, as well as on the direct correlation function in the k → 0 limit, c0. A numerical solution to the PRISM integral equations is used to determine c0, by adjusting the value of the effective hard sphere diameter, dHS, to agree with the predicted equation of state. This single quantity parameterizes the coarse-grained potential, which is used to perform mesoscale simulations that are directly compared with atomistic-level simulations of the same system. We test our coarse-graining formalism by comparing structural correlations, isothermal compressibility, equation of state, Helmholtz and Gibbs free energies, and potential energy and entropy using both united atom and coarse-grained descriptions. We find quantitative agreement between the analytical formalism for the thermodynamic properties, and the results of Molecular Dynamics simulations, independent of the chosen level of representation. In the mesoscale description, the potential energy of the soft-particle interaction becomes a free energy in the coarse-grained coordinates which preserves the excess free energy from an ideal gas across all levels of description. The structural consistency between the united-atom and mesoscale descriptions means the relative entropy between descriptions has been minimized without any variational optimization parameters. The approach is general and applicable to any polymeric system in different thermodynamic conditions.
On the Need for Multidimensional Stirling Simulations
NASA Technical Reports Server (NTRS)
Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako
2005-01-01
Given the cost and complication of simulating Stirling convertors, do we really need multidimensional modeling when one-dimensional capabilities exist? This paper provides a comprehensive description of when and why multidimensional simulation is needed.
Equilibration and analysis of first-principles molecular dynamics simulations of water
NASA Astrophysics Data System (ADS)
Dawson, William; Gygi, François
2018-03-01
First-principles molecular dynamics (FPMD) simulations based on density functional theory are becoming increasingly popular for the description of liquids. In view of the high computational cost of these simulations, the choice of an appropriate equilibration protocol is critical. We assess two methods of estimation of equilibration times using a large dataset of first-principles molecular dynamics simulations of water. The Gelman-Rubin potential scale reduction factor [A. Gelman and D. B. Rubin, Stat. Sci. 7, 457 (1992)] and the marginal standard error rule heuristic proposed by White [Simulation 69, 323 (1997)] are evaluated on a set of 32 independent 64-molecule simulations of 58 ps each, amounting to a combined cumulative time of 1.85 ns. The availability of multiple independent simulations also allows for an estimation of the variance of averaged quantities, both within MD runs and between runs. We analyze atomic trajectories, focusing on correlations of the Kohn-Sham energy, pair correlation functions, number of hydrogen bonds, and diffusion coefficient. The observed variability across samples provides a measure of the uncertainty associated with these quantities, thus facilitating meaningful comparisons of different approximations used in the simulations. We find that the computed diffusion coefficient and average number of hydrogen bonds are affected by a significant uncertainty in spite of the large size of the dataset used. A comparison with classical simulations using the TIP4P/2005 model confirms that the variability of the diffusivity is also observed after long equilibration times. Complete atomic trajectories and simulation output files are available online for further analysis.
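The Gelman-Rubin diagnostic evaluated above is compact enough to sketch. A minimal implementation for m independent chains of equal length, with a value near 1 suggesting the chains have equilibrated:

```python
import math
import random

def gelman_rubin(chains):
    """Potential scale reduction factor (R-hat) for m chains of length n.

    W        = average within-chain variance
    B        = n * variance of the chain means
    var_plus = (n-1)/n * W + B/n   (pooled variance estimate)
    R-hat    = sqrt(var_plus / W); values near 1 suggest equilibration.
    """
    m = len(chains)
    n = len(chains[0])
    means = [sum(c) / n for c in chains]
    grand = sum(means) / m
    # Unbiased within-chain variance, averaged over chains.
    W = sum(sum((x - mu) ** 2 for x in c) / (n - 1)
            for c, mu in zip(chains, means)) / m
    # Between-chain variance of the chain means, scaled by n.
    B = n * sum((mu - grand) ** 2 for mu in means) / (m - 1)
    var_plus = (n - 1) / n * W + B / n
    return math.sqrt(var_plus / W)

# Demo: chains drawn from the same distribution should give R-hat near 1.
random.seed(0)
mixed = [[random.gauss(0.0, 1.0) for _ in range(2000)] for _ in range(4)]
print(round(gelman_rubin(mixed), 2))
```

In the study, the scalar fed to this diagnostic would be a physical observable such as the Kohn-Sham energy along each trajectory.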
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ross, Kyle W.; Gauntt, Randall O.; Cardoni, Jeffrey N.
2013-11-01
Data, a brief description of key boundary conditions, and results of Sandia National Laboratories' ongoing MELCOR analysis of the Fukushima Unit 2 accident are given for the reactor core isolation cooling (RCIC) system. Important assumptions and related boundary conditions in the current analysis that are additional to, or different from, those assumed or imposed in SAND2012-6173 are identified. This work is for the U.S. Department of Energy's Nuclear Energy University Programs fiscal year 2014 Reactor Safety Technologies Research and Development Program RC-7: RCIC Performance under Severe Accident Conditions.
Development and validation of the Simulation Learning Effectiveness Scale for nursing students.
Pai, Hsiang-Chu
2016-11-01
To develop and validate the Simulation Learning Effectiveness Scale, which is based on Bandura's social cognitive theory. A simulation programme is a significant teaching strategy for nursing students. Nevertheless, there are few evidence-based instruments that validate the effectiveness of simulation learning in Taiwan. This study used a quantitative descriptive design. In Study 1, a nonprobability convenience sample of 151 student nurses completed the Simulation Learning Effectiveness Scale. Exploratory factor analysis was used to examine the factor structure of the instrument. In Study 2, which involved 365 student nurses, confirmatory factor analysis and structural equation modelling were used to analyse the construct validity of the Simulation Learning Effectiveness Scale. In Study 1, exploratory factor analysis yielded three components: self-regulation, self-efficacy and self-motivation. The three factors explained 29·09, 27·74 and 19·32% of the variance, respectively. The final 12-item instrument with the three factors explained 76·15% of variance. Cronbach's alpha was 0·94. In Study 2, confirmatory factor analysis identified a second-order factor termed the Simulation Learning Effectiveness Scale. Goodness-of-fit indices showed an acceptable fit overall with the full model (χ²/df(51) = 3·54, comparative fit index = 0·96, Tucker-Lewis index = 0·95 and standardised root-mean-square residual = 0·035). In addition, teacher's competence was found to encourage learning, and self-reflection and insight were significantly and positively associated with the Simulation Learning Effectiveness Scale. Teacher's competence in encouraging learning was also significantly and positively associated with self-reflection and insight. Overall, these variables explained 21·9% of the variance in students' learning effectiveness. The Simulation Learning Effectiveness Scale is a reliable and valid means to assess simulation learning effectiveness for nursing students. 
The Simulation Learning Effectiveness Scale can be used to examine nursing students' learning effectiveness and serve as a basis to improve student's learning efficiency through simulation programmes. Future implementation research that focuses on the relationship between learning effectiveness and nursing competence in nursing students is recommended. © 2016 John Wiley & Sons Ltd.
Numerical aerodynamic simulation facility. Preliminary study extension
NASA Technical Reports Server (NTRS)
1978-01-01
The production of an optimized design of key elements of the candidate facility was the primary objective of this report. This was accomplished by effort in the following tasks: (1) to further develop, optimize, and describe the functional description of the custom hardware; (2) to delineate trade-off areas between performance, reliability, availability, serviceability, and programmability; (3) to develop metrics and models for validation of the candidate system's performance; (4) to conduct a functional simulation of the system design; (5) to perform a reliability analysis of the system design; and (6) to develop the software specifications, including a user-level high-level programming language, a correspondence between the programming language and the instruction set, and an outline of the operating system requirements.
JAMS - a software platform for modular hydrological modelling
NASA Astrophysics Data System (ADS)
Kralisch, Sven; Fischer, Christian
2015-04-01
Current challenges of understanding and assessing the impacts of climate and land use changes on environmental systems demand an ever-increasing integration of data and process knowledge in corresponding simulation models. Software frameworks that allow for a seamless creation of integrated models based on less complex components (domain models, process simulation routines) have therefore gained increasing attention during the last decade. JAMS is an Open-Source software framework that has been especially designed to cope with the challenges of eco-hydrological modelling. This is reflected by (i) its flexible approach for representing time and space, (ii) a strong separation of process simulation components from the declarative description of more complex models using domain-specific XML, (iii) powerful analysis and visualization functions for spatial and temporal input and output data, and (iv) parameter optimization and uncertainty analysis functions commonly used in environmental modelling. Based on JAMS, different hydrological and nutrient-transport simulation models have been implemented and successfully applied in recent years. We will present the JAMS core concepts and give an overview of models, simulation components, and support tools available for the framework. Sample applications will be used to underline the advantages of component-based model designs and to show how JAMS can be used to address the challenges of integrated hydrological modelling.
BFS Simulation and Experimental Analysis of the Effect of Ti Additions on the Structure of NiAl
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Noebe, Ronald D.; Ferrante, John; Garg, Anita; Honecy, Frank S.; Amador, Carlos
1999-01-01
The Bozzolo-Ferrante-Smith (BFS) method for alloy energetics is applied to the study of ternary additions to NiAl. A description of the method and its application to alloy design is given. Two different approaches are used in the analysis of the effect of Ti additions to NiAl. First, a thorough analytical study is performed, where the energy of formation, lattice parameter and bulk modulus are calculated for a large number of possible atomic distributions of Ni, Al and Ti. Substitutional site preference schemes and formation of precipitates are thus predicted and analyzed. The second approach used consists of the determination of temperature effects on the final results, as obtained by performing a number of large scale numerical simulations using the Monte Carlo-Metropolis procedure and BFS for the calculation of the energy at every step in the simulation. The results indicate a sharp preference of Ti for Al sites in Ni-rich NiAl alloys and the formation of ternary Heusler precipitates beyond the predicted solubility limit of 5 at. % Ti. Experimental analysis of three Ni-Al-Ti alloys confirms the theoretical predictions.
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Noebe, Ronald D.; Ferrante, John; Garg, Anita; Amador, Carlos
1997-01-01
The Bozzolo-Ferrante-Smith (BFS) semiempirical method for alloy energetics is applied to the study of ternary additions to NiAl alloys. A detailed description of the method and its application to alloy design is given. Two different approaches are used in the analysis of the effect of Ti additions to NiAl. First, a thorough analytical study is performed, where the energy of formation, lattice parameter and bulk modulus are calculated for hundreds of possible atomic distributions of Ni, Al and Ti. Substitutional site preference schemes and formation of precipitates are thus predicted and analyzed. The second approach used consists of the determination of temperature effects on the final results, as obtained by performing a number of large scale numerical simulations using the Monte Carlo - Metropolis procedure and BFS for the calculation of the energy at every step in the simulation. The results indicate a sharp preference of Ti for Al sites in Ni-rich NiAl alloys and the formation of ternary Heusler precipitates beyond the predicted solubility limit of 5 at. % Ti. Experimental analysis of three NiAl+Ti alloys confirms the theoretical predictions.
Static friction between rigid fractal surfaces
NASA Astrophysics Data System (ADS)
Alonso-Marroquin, Fernando; Huang, Pengyu; Hanaor, Dorian A. H.; Flores-Johnson, E. A.; Proust, Gwénaëlle; Gan, Yixiang; Shen, Luming
2015-09-01
Using spheropolygon-based simulations and contact slope analysis, we investigate the effects of surface topography and atomic scale friction on the macroscopically observed friction between rigid blocks with fractal surface structures. From our mathematical derivation, the angle of macroscopic friction is the result of the sum of the angle of atomic friction and the slope angle between the contact surfaces. The latter is obtained from the determination of all possible contact slopes between the two surface profiles through an alternative signature function. Our theory is validated through numerical simulations of spheropolygons with fractal Koch surfaces and is applied to the description of frictional properties of Weierstrass-Mandelbrot surfaces. The agreement between simulations and theory suggests that for interpreting macroscopic frictional behavior, the descriptors of surface morphology should be defined from the signature function rather than from the slopes of the contacting surfaces.
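The additive relation above (macroscopic friction angle = atomic friction angle + contact slope angle) can be illustrated numerically. The sketch below builds a Weierstrass-Mandelbrot profile and takes the steepest sampled slope as the contact slope angle; this direct scan is a simplification standing in for the paper's signature-function construction, and the parameter values are assumptions.

```python
import math

def wm_profile(x, D=1.5, gamma=1.5, n_max=20, G=1e-3):
    """Weierstrass-Mandelbrot profile height at x, fractal dimension 1 < D < 2.
    Truncated to n_max modes for numerical evaluation."""
    return sum(G ** (D - 1) * gamma ** ((D - 2) * n) *
               math.cos(2 * math.pi * gamma ** n * x)
               for n in range(n_max))

# Sample the profile on a fine grid and find the steepest local slope.
xs = [i * 1e-3 for i in range(1001)]
hs = [wm_profile(x) for x in xs]
slopes = [(hs[i + 1] - hs[i]) / (xs[i + 1] - xs[i]) for i in range(len(xs) - 1)]
theta_slope = math.degrees(math.atan(max(abs(s) for s in slopes)))

theta_atomic = 5.0                     # assumed atomic-scale friction angle (deg)
theta_macro = theta_atomic + theta_slope   # additive relation from the paper
print(theta_slope, theta_macro)
```

The grid-based maximum slope converges from below as the sampling is refined, which is one reason the paper's signature-function treatment of all possible contact slopes is preferable to a naive scan.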
Dynamical analysis of surface-insulated planar wire array Z-pinches
NASA Astrophysics Data System (ADS)
Li, Yang; Sheng, Liang; Hei, Dongwei; Li, Xingwen; Zhang, Jinhai; Li, Mo; Qiu, Aici
2018-05-01
The ablation and implosion dynamics of planar wire array Z-pinches with and without surface insulation are compared and discussed in this paper. We first present a phenomenological model named the ablation and cascade snowplow implosion (ACSI) model, which accounts for the ablation and implosion phases of a planar wire array Z-pinch in a single simulation. The comparison between experimental data and simulation results shows that the ACSI model gives a fairly good description of the dynamical characteristics of planar wire array Z-pinches. Surface insulation introduces notable differences in the ablation phase. The ablation phase is divided into two stages: insulation layer ablation and tungsten wire ablation. The two-stage ablation process of insulated wires is simulated in the ACSI model by updating the formulas describing the ablation process.
Equbal, Asif; Leskes, Michal; Nielsen, Niels Chr; Madhu, P K; Vega, Shimon
2016-02-01
We present a bimodal Floquet analysis of the recently introduced refocused continuous wave (rCW) solid-state NMR heteronuclear dipolar decoupling method and compare it with the similar looking X-inverse X (XiX) scheme. The description is formulated in the rf interaction frame and is valid for both finite and ideal π pulse rCW irradiation that forms the refocusing element in the rCW scheme. The effective heteronuclear dipolar coupling Hamiltonian up to first order is described. The analysis delineates the difference between the two sequences to different orders of their Hamiltonians for both diagonal and off-diagonal parts. All the resonance conditions observed in experiments and simulations have been characterised and their influence on residual line broadening is highlighted. The theoretical comparison substantiates the numerical simulations and experimental results to a large extent. Copyright © 2016 Elsevier Inc. All rights reserved.
Preparing for in situ processing on upcoming leading-edge supercomputers
Kress, James; Churchill, Randy Michael; Klasky, Scott; ...
2016-10-01
High performance computing applications are producing increasingly large amounts of data and placing enormous stress on current capabilities for traditional post-hoc visualization techniques. Because of the growing compute and I/O imbalance, data reductions, including in situ visualization, are required. These reduced data are used for analysis and visualization in a variety of different ways. Many of the visualization and analysis requirements are known a priori, but when they are not, scientists are dependent on the reduced data to accurately represent the simulation in post hoc analysis. The contribution of this paper is a description of the directions we are pursuing to assist a large-scale fusion simulation code succeed on the next generation of supercomputers. These directions include the role of in situ processing for performing data reductions, as well as the tradeoffs between data size and data integrity within the context of complex operations in a typical scientific workflow.
The 3D model of debriefing: defusing, discovering, and deepening.
Zigmont, Jason J; Kappus, Liana J; Sudikoff, Stephanie N
2011-04-01
The experiential learning process involves participation in key experiences and analysis of those experiences. In health care, these experiences can occur through high-fidelity simulation or in the actual clinical setting. The most important component of this process is the postexperience analysis or debriefing. During the debriefing, individuals must reflect upon the experience, identify the mental models that led to behaviors or cognitive processes, and then build or enhance new mental models to be used in future experiences. On the basis of adult learning theory, the Kolb Experiential Learning Cycle, and the Learning Outcomes Model, we structured a framework for facilitators of debriefings entitled "the 3D Model of Debriefing: Defusing, Discovering, and Deepening." It incorporates common phases prevalent in the debriefing literature, including description of and reactions to the experience, analysis of behaviors, and application or synthesis of new knowledge into clinical practice. It can be used to enhance learning after real or simulated events. Copyright © 2011 Elsevier Inc. All rights reserved.
System analysis through bond graph modeling
NASA Astrophysics Data System (ADS)
McBride, Robert Thomas
2005-07-01
Modeling and simulation form an integral role in the engineering design process. An accurate mathematical description of a system provides the design engineer the flexibility to perform trade studies quickly and accurately to expedite the design process. Most often, the mathematical model of the system contains components of different engineering disciplines. A modeling methodology that can handle these types of systems might be used in an indirect fashion to extract added information from the model. This research examines the ability of a modeling methodology to provide added insight into system analysis and design. The modeling methodology used is bond graph modeling. An investigation into the creation of a bond graph model using the Lagrangian of the system is provided. Upon creation of the bond graph, system analysis is performed. To aid in the system analysis, an object-oriented approach to bond graph modeling is introduced. A framework is provided to simulate the bond graph directly. Through object-oriented simulation of a bond graph, the information contained within the bond graph can be exploited to create a measurement of system efficiency. A definition of system efficiency is given. This measurement of efficiency is used in the design of different controllers of varying architectures. Optimal control of a missile autopilot is discussed within the framework of the calculated system efficiency.
NASA Astrophysics Data System (ADS)
Christian, Paul M.
2002-07-01
This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provided a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program: from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed included its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics covered in Part I included flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this series will take a more in-depth look at the analysis and simulation capability and provide an update on the toolbox enhancements. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).
1987-03-01
model is one in which words or numerical descriptions are used to represent an entity or process. An example of a symbolic model is a mathematical ... are the third type of model used in modeling combat attrition. Analytical models are symbolic models which use mathematical symbols and equations to ... simplicity and the ease of tracing through the mathematical computations. In this section I will discuss some of the shortcomings which have been
Towards interactive narrative medicine.
Cavazza, Marc; Charles, Fred
2013-01-01
Interactive Storytelling technologies have attracted significant interest in the field of simulation and serious gaming for their potential to provide a principled approach to improve user engagement in training scenarios. In this paper, we explore the use of Interactive Storytelling to support Narrative Medicine as a reflective practice. We describe a workflow for the generation of virtual narratives from high-level descriptions of patients' experiences as perceived by physicians, which can help to objectivize such perceptions and support various forms of analysis.
NASA Technical Reports Server (NTRS)
1975-01-01
Results are presented of preliminary trade-off studies of operational SEASAT systems. The trade-off studies were used as the basis for the estimation of costs and net benefits of the operational SEASAT system. Also presented are the preliminary results of simulation studies that were designed to lead to a measure of the impact of SEASAT data through the use of numerical weather forecast models.
Shin, Jae Hyuk; Lee, Boreom; Park, Kwang Suk
2011-05-01
In this study, we developed an automated behavior analysis system using infrared (IR) motion sensors to assist the independent living of elderly people who live alone and to improve the efficiency of their healthcare. An IR motion-sensor-based activity-monitoring system was installed in the houses of the elderly subjects to collect motion signals, from which three feature values were calculated: activity level, mobility level, and nonresponse interval (NRI). The support vector data description (SVDD) method was used to classify normal behavior patterns and to detect abnormal behavior patterns based on these three feature values. Simulation data and real data were used to verify the proposed method in the individual analysis. A robust scheme is presented for optimally selecting parameter values, in particular the scale parameter of the Gaussian kernel function involved in training the SVDD and the window length T of the circadian rhythmic approach, with the aim of applying the SVDD to daily behavior patterns calculated over 24 h. Accuracies by positive predictive value (PPV) were 95.8% and 90.5% for the simulation and real data, respectively. The results suggest that the monitoring system utilizing IR motion sensors and abnormal-behavior-pattern detection with SVDD are effective methods for home healthcare of elderly people living alone.
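The SVDD idea — enclose the normal training data in a minimal ball in kernel feature space and flag points falling outside it — can be sketched compactly. The version below is the hard-margin case (a minimum enclosing ball, no slack) trained with Frank-Wolfe updates on the dual; the paper's parameter-selection scheme and slack-variable formulation are not reproduced, and all constants are assumptions.

```python
import math
import random

def rbf(a, b, s=1.0):
    """Gaussian kernel with scale parameter s."""
    return math.exp(-sum((x - y) ** 2 for x, y in zip(a, b)) / (2 * s * s))

def svdd_fit(X, s=1.0, iters=300):
    """Hard-margin SVDD: minimise alpha^T K alpha over the probability
    simplex (minimum enclosing ball in feature space), via Frank-Wolfe."""
    m = len(X)
    K = [[rbf(X[i], X[j], s) for j in range(m)] for i in range(m)]
    alpha = [1.0 / m] * m
    for t in range(iters):
        grad = [2 * sum(K[i][j] * alpha[j] for j in range(m)) for i in range(m)]
        j = min(range(m), key=lambda i: grad[i])   # best simplex vertex
        gamma = 2.0 / (t + 2)                      # standard FW step size
        alpha = [(1 - gamma) * a for a in alpha]
        alpha[j] += gamma
    aKa = sum(alpha[i] * K[i][j] * alpha[j] for i in range(m) for j in range(m))

    def dist2(z):
        """Squared feature-space distance of z from the ball centre."""
        return 1.0 - 2 * sum(alpha[i] * rbf(X[i], z, s) for i in range(m)) + aKa

    r2 = max(dist2(x) for x in X)   # radius covering all training points
    return dist2, r2

# Train on a cluster of "normal" activity vectors, then test two points.
random.seed(1)
X = [(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(40)]
dist2, r2 = svdd_fit(X)
print(dist2((0.0, 0.1)) <= r2)   # central point: inside the ball (normal)
print(dist2((5.0, 5.0)) > r2)    # far point: outside the ball (abnormal)
```

In the monitoring application, the input vectors would be the daily (activity level, mobility level, NRI) features rather than synthetic Gaussians.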
Allain, Ariane; Chauvot de Beauchêne, Isaure; Langenfeld, Florent; Guarracino, Yann; Laine, Elodie; Tchertanov, Luba
2014-01-01
Allostery is a universal phenomenon that couples the information induced by a local perturbation (effector) in a protein to spatially distant regulated sites. Such an event can be described in terms of a large scale transmission of information (communication) through a dynamic coupling between structurally rigid (minimally frustrated) and plastic (locally frustrated) clusters of residues. To elaborate a rational description of allosteric coupling, we propose an original approach - MOdular NETwork Analysis (MONETA) - based on the analysis of inter-residue dynamical correlations to localize the propagation of both structural and dynamical effects of a perturbation throughout a protein structure. MONETA uses inter-residue cross-correlations and commute times computed from molecular dynamics simulations and a topological description of a protein to build a modular network representation composed of clusters of residues (dynamic segments) linked together by chains of residues (communication pathways). MONETA provides a brand new direct and simple visualization of protein allosteric communication. A GEPHI module implemented in the MONETA package allows the generation of 2D graphs of the communication network. An interactive PyMOL plugin permits drawing of the communication pathways between chosen protein fragments or residues on a 3D representation. MONETA is a powerful tool for on-the-fly display of communication networks in proteins. We applied MONETA for the analysis of communication pathways (i) between the main regulatory fragments of receptors tyrosine kinases (RTKs), KIT and CSF-1R, in the native and mutated states and (ii) in proteins STAT5 (STAT5a and STAT5b) in the phosphorylated and the unphosphorylated forms. 
The description of the physical support for allosteric coupling by MONETA allowed a comparison of the mechanisms of (a) constitutive activation induced by equivalent mutations in two RTKs and (b) allosteric regulation in the activated and non-activated STAT5 proteins. Our theoretical prediction based on results obtained with MONETA was validated for KIT by in vitro experiments. MONETA is a versatile analytical and visualization tool entirely devoted to the understanding of the functioning/malfunctioning of allosteric regulation in proteins - a crucial basis to guide the discovery of next-generation allosteric drugs.
NASA Astrophysics Data System (ADS)
Schneider, Sébastien; Jacques, Diederik; Mallants, Dirk
2010-05-01
Numerical models are of precious help for predicting water fluxes in the vadose zone and more specifically in Soil-Vegetation-Atmosphere (SVA) systems. For such simulations, robust models and representative soil hydraulic parameters are required. Calibration of unsaturated hydraulic properties is known to be a difficult optimization problem due to the high non-linearity of the water flow equations. Therefore, robust methods are needed to prevent the optimization process from converging to non-optimal parameters. Evolutionary algorithms, and specifically genetic algorithms (GAs), are very well suited for such complex parameter optimization problems. Additionally, GAs offer the opportunity to assess the confidence in the hydraulic parameter estimates, because of the large number of model realizations. The SVA system in this study concerns a pine stand on a heterogeneous sandy soil (podzol) in the Campine region in the north of Belgium. Throughfall and other meteorological data and water contents at different soil depths were recorded for one year at a daily time step in two lysimeters. The water table level, which varies between 95 and 170 cm, was recorded at 0.5-hour intervals. The leaf area index was also measured at selected times during the year in order to evaluate the energy reaching the soil and to deduce the potential evaporation. Water contents at several depths were recorded. Based on the profile description, five soil layers were distinguished in the podzol. Two models were used for simulating water fluxes: (i) a mechanistic model, HYDRUS-1D, which solves the Richards equation, and (ii) a compartmental model, which treats the soil profile as a bucket into which water flows until its maximum capacity is reached. A global sensitivity analysis (Morris' one-at-a-time method) was run prior to the calibration in order to check the sensitivity within the chosen parameter search space. 
For the inversion procedure a genetic algorithm (GA) was used. Specific features such as elitism, a roulette-wheel selection operator, and island theory were implemented. Optimization was based on the water content measurements recorded at several depths. Ten scenarios were elaborated and applied to the two lysimeters in order to investigate the impact of the conceptual model, in terms of process description (mechanistic or compartmental) and geometry (number of horizons in the profile description), on the calibration accuracy. Calibration leads to good agreement with the measured water contents. The most critical parameters for improving the goodness of fit are the number of horizons and the type of process description. The best fits are found for a mechanistic model with 5 horizons, resulting in absolute differences between observed and simulated water contents of less than 0.02 cm³ cm⁻³ on average. Parameter estimate analysis shows that layer thicknesses are poorly constrained, whereas hydraulic parameters are much better defined.
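The GA machinery named above — elitism plus roulette-wheel selection — is generic enough to sketch. The toy below minimises a quadratic stand-in for the water-content misfit (the HYDRUS-1D forward model is of course not reproduced); every constant and the misfit function itself are assumptions for illustration.

```python
import random

def roulette_select(pop, fitness):
    """Roulette-wheel selection: pick an individual with probability
    proportional to its (positive) fitness."""
    total = sum(fitness)
    r = random.uniform(0, total)
    acc = 0.0
    for ind, f in zip(pop, fitness):
        acc += f
        if acc >= r:
            return ind
    return pop[-1]

def ga_calibrate(misfit, bounds, pop_size=40, gens=60, elite=2,
                 mut_rate=0.2, mut_sigma=0.1):
    """Minimise `misfit` over box `bounds` with a GA using elitism,
    roulette-wheel selection, arithmetic crossover, Gaussian mutation."""
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        fit = [1.0 / (1e-12 + misfit(ind)) for ind in pop]  # higher = better
        order = sorted(range(pop_size), key=lambda i: fit[i], reverse=True)
        new_pop = [pop[i][:] for i in order[:elite]]        # elitism
        while len(new_pop) < pop_size:
            p1 = roulette_select(pop, fit)
            p2 = roulette_select(pop, fit)
            w = random.random()
            child = [w * a + (1 - w) * b for a, b in zip(p1, p2)]  # crossover
            for k, (lo, hi) in enumerate(bounds):                  # mutation
                if random.random() < mut_rate:
                    child[k] = min(hi, max(lo, child[k] + random.gauss(0, mut_sigma)))
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=misfit)

# Toy misfit: "true" parameters (0.3, 1.5), quadratic error surface.
random.seed(2)
best = ga_calibrate(lambda p: (p[0] - 0.3) ** 2 + (p[1] - 1.5) ** 2,
                    bounds=[(0.0, 1.0), (1.0, 3.0)])
print(best)
```

In the study's setting, `misfit` would run the forward water-flux model and return the sum of squared deviations from the measured water contents.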
Equation-free multiscale computation: algorithms and applications.
Kevrekidis, Ioannis G; Samaey, Giovanni
2009-01-01
In traditional physicochemical modeling, one derives evolution equations at the (macroscopic, coarse) scale of interest; these are used to perform a variety of tasks (simulation, bifurcation analysis, optimization) using an arsenal of analytical and numerical techniques. For many complex systems, however, although one observes evolution at a macroscopic scale of interest, accurate models are only given at a more detailed (fine-scale, microscopic) level of description (e.g., lattice Boltzmann, kinetic Monte Carlo, molecular dynamics). Here, we review a framework for computer-aided multiscale analysis, which enables macroscopic computational tasks (over extended spatiotemporal scales) using only appropriately initialized microscopic simulation on short time and length scales. The methodology bypasses the derivation of macroscopic evolution equations when these equations conceptually exist but are not available in closed form, hence the term equation-free. We selectively discuss basic algorithms and underlying principles and illustrate the approach through representative applications. We also discuss potential difficulties and outline areas for future research.
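The core equation-free construct, coarse projective integration, fits in a few lines: run a short burst of the fine-scale simulator, estimate the coarse time derivative from the end of the burst, then take a large projective step without ever writing down the macroscopic equation. The fine-scale model below is a deliberately trivial stand-in (small explicit-Euler steps of dx/dt = -x) for an expensive microscopic code.

```python
import math

def fine_step(x, dt=1e-3):
    """Fine-scale simulator: one small explicit-Euler step of dx/dt = -x.
    Stands in for an expensive microscopic code (toy assumption)."""
    return x + dt * (-x)

def projective_step(x, burst=20, dt=1e-3, DT=0.05):
    """One equation-free step: short microscopic burst, estimate the coarse
    derivative from its last two states, then project forward to time DT."""
    traj = [x]
    for _ in range(burst):
        traj.append(fine_step(traj[-1], dt))
    slope = (traj[-1] - traj[-2]) / dt          # estimated coarse dx/dt
    return traj[-1] + (DT - burst * dt) * slope  # large extrapolation step

x, t = 1.0, 0.0
while t < 1.0:
    x = projective_step(x)
    t += 0.05
print(x, math.exp(-1.0))   # projective result tracks the exact exp(-t) decay
```

The payoff is that only a fraction of each coarse interval (here 20 of 50 fine steps) is actually simulated at the fine scale; the rest is extrapolated.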
Computer Series, 17: Bits and Pieces, 5.
ERIC Educational Resources Information Center
Moore, John W., Ed.
1981-01-01
Contains short descriptions of computer programs or hardware that simulate laboratory instruments or the results of kinetics experiments, including programs that incorporate experimental error, numerical simulation, first-order kinetic mechanisms, a decision-making game, and simulated mass spectrometers. (CS)
Discrete-Event Simulation in Chemical Engineering.
ERIC Educational Resources Information Center
Schultheisz, Daniel; Sommerfeld, Jude T.
1988-01-01
Gives examples, descriptions, and uses for various types of simulation systems, including the Flowtran, Process, Aspen Plus, Design II, GPSS, Simula, and Simscript. Explains similarities in simulators, terminology, and a batch chemical process. Tables and diagrams are included. (RT)
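The core of a discrete-event simulator of the GPSS/Simscript kind is a time-ordered event queue. A minimal sketch, with an invented batch-process example:

```python
import heapq

class DiscreteEventSim:
    """Minimal discrete-event engine: events fire in timestamp order."""
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0   # tie-breaker keeps simultaneous events FIFO

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, action = heapq.heappop(self._queue)
            action(self)

# Toy batch chemical process: fill a reactor, react, then drain.
log = []
def fill(sim):
    log.append(("fill done", sim.now))
    sim.schedule(5.0, react)
def react(sim):
    log.append(("reaction done", sim.now))
    sim.schedule(2.0, drain)
def drain(sim):
    log.append(("drain done", sim.now))

sim = DiscreteEventSim()
sim.schedule(3.0, fill)
sim.run()
```

Each event handler schedules its successors, so simulated time jumps directly from event to event rather than ticking uniformly.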
A precision device needs precise simulation: Software description of the CBM Silicon Tracking System
NASA Astrophysics Data System (ADS)
Malygina, Hanna; Friese, Volker;
2017-10-01
Precise modelling of detectors in simulations is the key to the understanding of their performance, which, in turn, is a prerequisite for the proper design choice and, later, for the achievement of valid physics results. In this report, we describe the implementation of the Silicon Tracking System (STS), the main tracking device of the CBM experiment, in the CBM software environment. The STS makes use of double-sided silicon micro-strip sensors with double metal layers. We present a description of transport and detector response simulation, including all relevant physical effects like charge creation and drift, charge collection, cross-talk and digitization. Of particular importance and novelty is the description of the time behaviour of the detector, since its readout will not be externally triggered but continuous. We also cover some aspects of local reconstruction, which in the CBM case has to be performed in real time and thus requires high-speed algorithms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allan, M.E.; Wilson, M.L.; Wightman, J.
1996-12-31
The Elk Hills giant oilfield, located in the southern San Joaquin Valley of California, has produced 1.1 billion barrels of oil from Miocene and shallow Pliocene reservoirs. 65% of the current 64,000 BOPD production is from the pressure-supported, deeper Miocene turbidite sands. In the turbidite sands of the 31 S structure, large porosity and permeability variations in the Main Body B and Western 31 S sands cause problems with the efficiency of the waterflooding. These variations have now been quantified and visualized using geostatistics. The end result is a more detailed reservoir characterization for simulation. Traditional reservoir descriptions based on marker correlations, cross-sections and mapping do not provide enough detail to capture the short-scale stratigraphic heterogeneity needed for adequate reservoir simulation. These deterministic descriptions are inadequate to tie with production data, as the thinly bedded sand/shale sequences blur into a falsely homogeneous picture. By studying the variability of the geologic and petrophysical data vertically within each wellbore and spatially from well to well, a geostatistical reservoir description has been developed. It captures the natural variability of the sands and shales that was lacking from earlier work. These geostatistical studies allow the geologic and petrophysical characteristics to be considered in a probabilistic model. The end product is a reservoir description that captures the variability of the reservoir sequences and can be used as a more realistic starting point for history matching and reservoir simulation.
Aeroelastic analysis for propellers - mathematical formulations and program user's manual
NASA Technical Reports Server (NTRS)
Bielawa, R. L.; Johnson, S. A.; Chi, R. M.; Gangwani, S. T.
1983-01-01
Mathematical development is presented for a specialized propeller dedicated version of the G400 rotor aeroelastic analysis. The G400PROP analysis simulates aeroelastic characteristics particular to propellers such as structural sweep, aerodynamic sweep and high subsonic unsteady airloads (both stalled and unstalled). Formulations are presented for these expanded propeller related methodologies. Results of limited application of the analysis to realistic blade configurations and operating conditions which include stable and unstable stall flutter test conditions are given. Sections included for enhanced program user efficiency and expanded utilization include descriptions of: (1) the structuring of the G400PROP FORTRAN coding; (2) the required input data; and (3) the output results. General information to facilitate operation and improve efficiency is also provided.
In-Flight Stability Analysis of the X-48B Aircraft
NASA Technical Reports Server (NTRS)
Regan, Christopher D.
2008-01-01
This report presents the system description, methods, and sample results of the in-flight stability analysis for the X-48B, Blended Wing Body Low-Speed Vehicle. The X-48B vehicle is a dynamically scaled, remotely piloted vehicle developed to investigate the low-speed control characteristics of a full-scale blended wing body. Initial envelope clearance was conducted by analyzing the stability margin estimation resulting from the rigid aircraft response during flight and comparing it to simulation data. Short duration multisine signals were commanded onboard to simultaneously excite the primary rigid body axes. In-flight stability analysis has proven to be a critical component of the initial envelope expansion.
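A multisine excitation of the kind described, several sinusoids summed so that the rigid-body axes can be excited simultaneously, can be sketched as below. The Schroeder phase choice used here to limit the peak factor is a common option, not necessarily the one flown on the X-48B:

```python
import math

def multisine(freqs_hz, duration_s, fs_hz, amplitude=1.0):
    """Sum-of-sines excitation with Schroeder phases to limit the peak factor."""
    n = len(freqs_hz)
    # Schroeder phases: phi_k = -pi * k * (k - 1) / n
    phases = [-math.pi * k * (k - 1) / n for k in range(1, n + 1)]
    out = []
    for i in range(int(duration_s * fs_hz)):
        t = i / fs_hz
        s = sum(math.sin(2 * math.pi * f * t + p)
                for f, p in zip(freqs_hz, phases))
        out.append(amplitude * s / n)   # normalise so |signal| <= amplitude
    return out

sig = multisine([0.5, 1.0, 1.5, 2.0], duration_s=10.0, fs_hz=100.0)
```

Distinct frequency sets per axis let a frequency-domain analysis separate each axis's response from a single short maneuver.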
NASA Astrophysics Data System (ADS)
Bechtold, S.; Höfle, B.
2016-06-01
In many technical domains of modern society, there is a growing demand for fast, precise and automatic acquisition of digital 3D models of a wide variety of physical objects and environments. Laser scanning is a popular and widely used technology to cover this demand, but it is also expensive and complex to use to its full potential. However, there might exist scenarios where the operation of a real laser scanner could be replaced by a computer simulation, in order to save time and costs. This includes scenarios like teaching and training of laser scanning, development of new scanner hardware and scanning methods, or generation of artificial scan data sets to support the development of point cloud processing and analysis algorithms. To test the feasibility of this idea, we have developed a highly flexible laser scanning simulation framework named Heidelberg LiDAR Operations Simulator (HELIOS). HELIOS is implemented as a Java library and split up into a core component and multiple extension modules. Extensible Markup Language (XML) is used to define scanner, platform and scene models and to configure the behaviour of modules. Modules were developed and implemented for (1) loading of simulation assets and configuration (i.e. 3D scene models, scanner definitions, survey descriptions etc.), (2) playback of XML survey descriptions, (3) TLS survey planning (i.e. automatic computation of recommended scanning positions) and (4) interactive real-time 3D visualization of simulated surveys. As a proof of concept, we show the results of two experiments: First, a survey planning test in a scene that was specifically created to evaluate the quality of the survey planning algorithm. Second, a simulated TLS scan of a crop field in a precision farming scenario. The results show that HELIOS fulfills its design goals.
TASS Model Application for Testing the TDWAP Model
NASA Technical Reports Server (NTRS)
Switzer, George F.
2009-01-01
One of the operational modes of the Terminal Area Simulation System (TASS) model simulates the three-dimensional interaction of wake vortices within turbulent domains in the presence of thermal stratification. The model allows the investigation of the effects of turbulence and stratification on vortex transport and decay. The model simulations for this work all assumed fully periodic boundary conditions to remove the effects of any surface interaction. During the Base Period of this contract, NWRA completed generation of these datasets but presented analysis only for the neutral-stratification runs of that set (Task 3.4.1). Phase 1 work began with the analysis of the remaining stratification datasets, and in that analysis we discovered discrepancies in the vortex time-to-link predictions. This finding necessitated investigating the source of the anomaly, and we found a problem with the background turbulence. Using the most up-to-date version of TASS with some important defect fixes, we regenerated a larger turbulence domain and verified the vortex time-to-link with a few cases before proceeding to regenerate the entire 25-case set (Task 3.4.2). The effort of Phase 2 (Task 3.4.3) concentrated on analysis of several scenarios investigating the effects of closely spaced aircraft. The objective was to quantify the minimum aircraft separations necessary to avoid vortex interactions between neighboring aircraft. The results consist of spreadsheets of wake data and presentation figures prepared for NASA technical exchanges. For these formation cases, NASA carried out the actual TASS simulations and NWRA performed the analysis of the results by making animations, line plots, and other presentation figures. This report contains a description of the work performed during this final phase of the contract, the analysis procedures adopted, and sample plots of the results from the analysis performed.
Thermal Modeling and Analysis of a Cryogenic Tank Design Exposed to Extreme Heating Profiles
NASA Technical Reports Server (NTRS)
Stephens, Craig A.; Hanna, Gregory J.
1991-01-01
A cryogenic test article, the Generic Research Cryogenic Tank, was designed to qualitatively simulate the thermal response of transatmospheric vehicle fuel tanks exposed to the environment of hypersonic flight. One-dimensional and two-dimensional finite-difference thermal models were developed to simulate the thermal response and assist in the design of the Generic Research Cryogenic Tank. The one-dimensional thermal analysis determined the required insulation thickness to meet the thermal design criteria and located the purge jacket to eliminate the liquefaction of air. The two-dimensional thermal analysis predicted the temperature gradients developed within the pressure-vessel wall, estimated the cryogen boiloff, and showed the effects the ullage condition has on pressure-vessel temperatures. The degree of ullage mixing, location of the applied high-temperature profile, and the purge gas influence on insulation thermal conductivity had significant effects on the thermal behavior of the Generic Research Cryogenic Tank. In addition to analysis results, a description of the Generic Research Cryogenic Tank and the role it will play in future thermal structures and transatmospheric vehicle research at the NASA Dryden Flight Research Facility is presented.
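The one-dimensional finite-difference thermal analysis mentioned above can be illustrated with an explicit conduction scheme; the slab geometry and material values below are placeholders, not Generic Research Cryogenic Tank design numbers:

```python
def heat_1d(t_hot, t_cold, nx, alpha, dx, dt, steps):
    """Explicit 1-D finite-difference conduction with fixed end temperatures."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme stability limit violated"
    T = [t_cold] * nx
    T[0] = t_hot          # heated outer surface
    T[-1] = t_cold        # cryogen-side surface
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, nx - 1):
            Tn[i] = T[i] + r * (T[i + 1] - 2.0 * T[i] + T[i - 1])
        T = Tn
    return T

# Placeholder insulation slab: 10 cm thick, alpha = 1e-5 m^2/s.
profile = heat_1d(t_hot=100.0, t_cold=0.0, nx=11, alpha=1e-5,
                  dx=0.01, dt=4.0, steps=2000)
```

Run long enough, the profile relaxes to the linear steady-state gradient between the two fixed boundary temperatures.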
MEANS: python package for Moment Expansion Approximation, iNference and Simulation
Fan, Sisi; Geissmann, Quentin; Lakatos, Eszter; Lukauskas, Saulius; Ale, Angelique; Babtie, Ann C.; Kirk, Paul D. W.; Stumpf, Michael P. H.
2016-01-01
Motivation: Many biochemical systems require stochastic descriptions. Unfortunately, these can be solved only for the simplest cases, and their direct simulation can become prohibitively expensive, precluding thorough analysis. As an alternative, moment closure approximation methods generate equations for the time evolution of the system's moments and apply a closure ansatz to obtain a closed set of differential equations, which can become the basis for deterministic analysis of the moments of the outputs of stochastic systems. Results: We present a free, user-friendly tool implementing an efficient moment expansion approximation with parametric closures that integrates well with the IPython interactive environment. Our package enables the analysis of complex stochastic systems without any constraints on the number of species and moments studied or the type of rate laws in the system. In addition to the approximation method, our package provides numerous tools to help non-expert users in stochastic analysis. Availability and implementation: https://github.com/theosysbio/means Contacts: m.stumpf@imperial.ac.uk or e.lakatos13@imperial.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153663
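The moment-closure idea can be illustrated on a single nonlinear reaction, 2A -> 0 with rate constant c: the mean equation involves the second moment, the second-moment equation involves the third, and a normal (Gaussian) closure M3 = 3*M2*mu - 2*mu^3 truncates the hierarchy. This is a generic sketch, not MEANS code:

```python
def closed_moments_2A(mu0, m2_0, c, dt, steps):
    """Euler integration of the first two moment equations for 2A -> 0
    (propensity c*n*(n-1)/2, jump -2), closed with the normal ansatz."""
    mu, m2 = mu0, m2_0
    for _ in range(steps):
        m3 = 3.0 * m2 * mu - 2.0 * mu ** 3                 # normal closure
        dmu = -c * (m2 - mu)                               # d<n>/dt
        dm2 = -2.0 * c * m3 + 4.0 * c * m2 - 2.0 * c * mu  # d<n^2>/dt
        mu += dt * dmu
        m2 += dt * dm2
    return mu, m2 - mu ** 2   # mean and variance

# Poisson initial condition with mean 50: m2 = var + mean^2 = 50 + 2500.
mean, var = closed_moments_2A(mu0=50.0, m2_0=2550.0, c=0.01, dt=0.001, steps=2000)
```

At this population size the closed mean tracks the mean-field solution mu0/(1 + c*mu0*t), which gives roughly 25 at t = 2 for these parameters.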
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valkenburg, Wessel; Hu, Bin, E-mail: valkenburg@lorentz.leidenuniv.nl, E-mail: hu@lorentz.leidenuniv.nl
2015-09-01
We present a description for setting initial particle displacements and field values for simulations of arbitrary metric theories of gravity, for perfect and imperfect fluids with arbitrary characteristics. We extend the Zel'dovich Approximation to nontrivial theories of gravity, and show how scale dependence implies curved particle paths, even in the entirely linear regime of perturbations. For a viable choice of Effective Field Theory of Modified Gravity, initial conditions set at high redshifts are affected at the level of up to 5% at Mpc scales, which exemplifies the importance of going beyond Λ-Cold Dark Matter initial conditions for modifications of gravity outside of the quasi-static approximation. In addition, we show initial conditions for a simulation where a scalar modification of gravity is modelled in a Lagrangian particle-like description. Our description paves the way for simulations and mock galaxy catalogs under theories of gravity beyond the standard model, crucial for progress towards precision tests of gravity and cosmology.
Segmental Analysis of Cardiac Short-Axis Views Using Lagrangian Radial and Circumferential Strain
Ma, Chi; Wang, Xiao; Varghese, Tomy
2016-01-01
Accurate description of myocardial deformation in the left ventricle is a three-dimensional problem, requiring three normal strain components along its natural axes, that is, longitudinal, radial, and circumferential strains. Although longitudinal strains are best estimated from long-axis views, radial and circumferential strains are best depicted in short-axis views. An algorithm previously developed in our laboratory that utilizes a polar grid for short-axis views, providing a Lagrangian description of tissue deformation, is used for radial and circumferential displacement and strain estimation. Deformations of the myocardial wall, obtained from numerical simulations with ANSYS and from a finite-element-analysis-based canine heart model, were used as input to a frequency-domain ultrasound simulation program to generate radiofrequency echo signals. Clinical in vivo data were also acquired from a healthy volunteer. Local displacements estimated along and perpendicular to the ultrasound beam propagation direction are then transformed into radial and circumferential displacements and strains using the polar grid, based on a pre-determined centroid location. Lagrangian strain variations demonstrate good agreement with the ideal strain when compared with Eulerian results. Lagrangian radial and circumferential strain estimation results are also demonstrated for experimental data from a healthy volunteer. Lagrangian radial and circumferential strain tracking provides accurate results with the assistance of the polar grid, as demonstrated using both numerical simulations and an in vivo study. PMID:26578642
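The projection at the heart of the polar-grid approach, resolving a Cartesian displacement into radial and circumferential components about a centroid, is a short calculation. A generic sketch, not the authors' implementation:

```python
import math

def radial_circumferential(x, y, ux, uy, cx, cy):
    """Project displacement (ux, uy) at point (x, y) onto the radial and
    circumferential (counter-clockwise) directions about centroid (cx, cy)."""
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy)
    er = (dx / r, dy / r)        # unit radial vector
    ec = (-dy / r, dx / r)       # unit circumferential vector
    u_r = ux * er[0] + uy * er[1]
    u_c = ux * ec[0] + uy * ec[1]
    return u_r, u_c
```

Strains then follow from differencing these projected displacements between neighboring polar-grid points along the radial and circumferential directions.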
Econ Simulation Cited as Success
ERIC Educational Resources Information Center
Workman, Robert; Maher, John
1973-01-01
A brief description of a computerized economics simulation model which provides students with an opportunity to apply microeconomic principles along with elementary accounting and statistical techniques. (Author/AK)
Ofaim, Shany; Ofek-Lalzar, Maya; Sela, Noa; Jinag, Jiandong; Kashi, Yechezkel; Minz, Dror; Freilich, Shiri
2017-01-01
Advances in metagenomics enable high-resolution description of complex bacterial communities in their natural environments. Consequently, conceptual approaches for community-level functional analysis are in high demand. Here, we introduce a framework for a metagenomics-based analysis of community functions. Environment-specific gene catalogs, derived from metagenomes, are processed into a metabolic-network representation. By applying established ecological conventions, network edges (metabolic functions) are assigned taxonomic annotations according to the dominance level of specific groups. Once a function-taxonomy link is established, the impact of dominant taxa on overall community performance is predicted by simulating the removal or addition of edges (taxa-associated functions). This approach is demonstrated on metagenomic data describing the microbial communities from the root environment of two crop plants, wheat and cucumber. Predictions for environment-dependent effects revealed differences between treatments (root vs. soil), corresponding to documented observations. Metabolism of specific plant exudates (e.g., organic acids, flavonoids) was linked with distinct taxonomic groups in simulated root, but not soil, environments. These dependencies point to the impact of these metabolite families as determinants of community structure. Simulations of the activity of pairwise combinations of taxonomic groups (order level) predicted the possible production of complementary metabolites. Complementation profiles allow formulating a possible metabolic role for observed co-occurrence patterns. For example, production of tryptophan-associated metabolites through complementary interactions is unique to the tryptophan-deficient cucumber root environment. Our approach enables the formulation of testable predictions for species contributions to community activity and exploration of the functional outcome of structural shifts in complex bacterial communities.
Understanding community-level metabolism is an essential step toward the manipulation and optimization of microbial function. Here, we introduce an analysis framework addressing three key challenges of such data: producing quantified links between taxonomy and function; contextualizing discrete functions into communal networks; and simulating environmental impact on community performance. New technologies will soon provide a high-coverage description of biotic and abiotic aspects of complex microbial communities such as those found in gut and soil. This framework was designed to allow the integration of high-throughput metabolomic and metagenomic data toward tackling the intricate associations between community structure, community function, and metabolic inputs. PMID:28878756
Aerodynamic preliminary analysis system. Part 2: User's manual and program description
NASA Technical Reports Server (NTRS)
Divan, P.; Dunn, K.; Kojima, J.
1978-01-01
A comprehensive aerodynamic analysis program based on linearized potential theory is described. The solution treats thickness and attitude problems at subsonic and supersonic speeds. Three-dimensional configurations, with or without jet flaps, having multiple nonplanar surfaces of arbitrary planform and open or closed slender bodies of noncircular contour are analyzed. Longitudinal and lateral-directional static and rotary derivative solutions are generated. The analysis is implemented on a time-sharing system in conjunction with an input tablet digitizer and an interactive graphics input/output display and editing terminal to maximize its responsiveness to the preliminary analysis problem. A nominal case computation time of 45 CPU seconds on the CDC 175 for a 200-panel simulation indicates that the program provides an efficient analysis for systematically performing various aerodynamic configuration tradeoff and evaluation studies.
NASA Technical Reports Server (NTRS)
1992-01-01
The purpose of QASE RT is to enable system analysts and software engineers to evaluate the performance and reliability implications of design alternatives. The program resulted from two Small Business Innovation Research (SBIR) projects. After receiving a description of the system architecture and workload from the user, QASE RT translates the system description into simulation models and executes them. Simulation provides detailed performance evaluation. The results of the evaluations are service and response times, offered load, device utilizations, and functional availability.
Index of Nuclear Weapon Effects Simulators. Sanitized
1983-06-01
TRESTLE Facility ... Vertical EMP Simulator (VEMPS) ... SIMULATOR: Vertical EMP Simulator (VEMPS). TYPE: EMP. AGENCY: US Army. LOCATION: Woodbridge Research Facility, VA. POINT OF CONTACT: ... DESCRIPTION: The VEMPS facility is a radiating electromagnetic pulse (EMP) simulator used to expose test objects to the simulated effects of high-altitude EMP.
Ignition sensitivity study of an energetic train configuration using experiments and simulation
NASA Astrophysics Data System (ADS)
Kim, Bohoon; Yu, Hyeonju; Yoh, Jack J.
2018-06-01
A full-scale hydrodynamic simulation intended for the accurate description of shock-induced detonation transition was conducted as part of an ignition sensitivity analysis of an energetic component system. The system is composed of an exploding foil initiator (EFI), a donor explosive unit, a stainless steel gap, and an acceptor explosive. A series of velocity interferometer system for any reflector (VISAR) measurements was used to validate the hydrodynamic simulations based on the reactive flow model that describes the initiation of energetic materials arranged in a train configuration. A numerical methodology with ignition and growth mechanisms for tracking multi-material boundary interactions as well as severely transient fluid-structure coupling between high explosive charges and the metal gap is described. The free-surface velocity measurement is used to evaluate the sensitivity of energetic components that are subjected to strong pressure waves. Then, the full-scale hydrodynamic simulation is performed on the flyer-impacted initiation of an EFI-driven pyrotechnical system.
SAFSIM theory manual: A computer program for the engineering simulation of flow systems
NASA Astrophysics Data System (ADS)
Dobranich, Dean
1993-12-01
SAFSIM (System Analysis Flow SIMulator) is a FORTRAN computer program for simulating the integrated performance of complex flow systems. SAFSIM provides sufficient versatility to allow the engineering simulation of almost any system, from a backyard sprinkler system to a clustered nuclear reactor propulsion system. In addition to versatility, speed and robustness are primary SAFSIM development goals. SAFSIM contains three basic physics modules: (1) a fluid mechanics module with flow network capability; (2) a structure heat transfer module with multiple convection and radiation exchange surface capability; and (3) a point reactor dynamics module with reactivity feedback and decay heat capability. Any or all of the physics modules can be implemented, as the problem dictates. SAFSIM can be used for compressible and incompressible, single-phase, multicomponent flow systems. Both the fluid mechanics and structure heat transfer modules employ a one-dimensional finite element modeling approach. This document contains a description of the theory incorporated in SAFSIM, including the governing equations, the numerical methods, and the overall system solution strategies.
NASA Technical Reports Server (NTRS)
Tartabini, Paul V.; Munk, Michelle M.; Powell, Richard W.
2002-01-01
The Mars 2001 Odyssey Orbiter successfully completed the aerobraking phase of its mission on January 11, 2002. This paper discusses the support provided by NASA's Langley Research Center to the navigation team at the Jet Propulsion Laboratory in the planning and operational support of Mars Odyssey Aerobraking. Specifically, the development of a three-degree-of-freedom aerobraking trajectory simulation and its application to pre-flight planning activities as well as operations is described. The importance of running the simulation in a Monte Carlo fashion to capture the effects of mission and atmospheric uncertainties is demonstrated, and the utility of including predictive logic within the simulation that could mimic operational maneuver decision-making is shown. A description is also provided of how the simulation was adapted to support flight operations as both a validation and risk reduction tool and as a means of obtaining a statistical basis for maneuver strategy decisions. This latter application was the first use of Monte Carlo trajectory analysis in an aerobraking mission.
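The value of Monte Carlo trajectory analysis can be sketched with a toy aerobraking model in which each drag pass removes an apoapsis increment scaled by an uncertain (lognormal) atmospheric density; all quantities are invented for illustration:

```python
import math
import random

def final_apoapsis(apo0_km, n_passes, nominal_decay_km, density_sigma, rng):
    """Toy aerobraking pass model with lognormal density dispersion."""
    apo = apo0_km
    for _ in range(n_passes):
        density_factor = math.exp(rng.gauss(0.0, density_sigma))
        apo -= nominal_decay_km * density_factor
    return apo

def monte_carlo(n_runs=2000, seed=1):
    """Dispersed runs give a statistical basis for maneuver decisions."""
    rng = random.Random(seed)
    finals = [final_apoapsis(10000.0, 50, 100.0, 0.3, rng)
              for _ in range(n_runs)]
    mean = sum(finals) / n_runs
    std = (sum((f - mean) ** 2 for f in finals) / (n_runs - 1)) ** 0.5
    return mean, std

mean, std = monte_carlo()
```

A single nominal run would report only the mean; the dispersion across runs is what quantifies the risk that drives maneuver strategy.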
NASA Technical Reports Server (NTRS)
Green, F. M.; Resnick, D. R.
1979-01-01
An FMP (Flow Model Processor) was designed for use in the Numerical Aerodynamic Simulation Facility (NASF). The NASF was developed to simulate fluid flow over three-dimensional bodies in wind tunnel environments and in free space. The facility is applicable to studying aerodynamic and aircraft body designs. The following general topics are discussed in this volume: (1) FMP functional computer specifications; (2) FMP instruction specification; (3) standard product system components; (4) loosely coupled network (LCN) specifications/description; and (5) three appendices: performance of trunk allocation contention elimination (trace) method, LCN channel protocol and proposed LCN unified second level protocol.
NASA Technical Reports Server (NTRS)
1974-01-01
The manual for the use of the computer program SYSTID under the Univac operating system is presented. The computer program is used in the simulation and evaluation of the space shuttle orbiter electric power supply. The models described in the handbook are those which were available in the original versions of SYSTID. The subjects discussed are: (1) program description, (2) input language, (3) node typing, (4) problem submission, and (5) basic and power system SYSTID libraries.
1983-02-01
... successfully modeled to enhance future computer design simulations; (2) a new methodology for conducting dynamic analysis of vehicle mechanics was ... to preliminary design methodology for tilt rotors, advancing blade concept configuration helicopters, and compound helicopters in conjunction with ... The feasibility of low-level personnel parachutes has been demonstrated. A study was begun to design a free-fall water container. An experimental program to ...
Flight dynamics analysis and simulation of heavy lift airships. Volume 3: User's manual
NASA Technical Reports Server (NTRS)
Emmen, R. D.; Tischler, M. B.
1982-01-01
The User's Manual provides the basic information necessary to run the programs. This includes descriptions of the various data files necessary for the program, the various outputs from the program and the options available to the user when executing the program. Additional data file information is contained in the three appendices to the manual. These appendices list all input variables and their permissible values, an example listing of these variables, and all output variables available to the user.
The Use of a Gyroless Wheel-Tach Controller in SDO Safehold Mode
NASA Technical Reports Server (NTRS)
Bourkland, Kristin L.; Starin, Scott R.; Mangus, David J.; Starin, Scott (Technical Monitor)
2005-01-01
This paper describes the progression of the Safehold mode design on the Solar Dynamics Observatory satellite. Safehold uses coarse Sun sensors and reaction wheel tachometers to keep the spacecraft in a thermally safe and power-positive attitude. The control algorithm is described, and simulation results shown. Specific control issues arose when the spacecraft entered eclipse, and a description of the trade study which added gyroscopes to the mode is included. The paper concludes with the results from the linear and nonlinear stability analysis.
NASA Technical Reports Server (NTRS)
Fields, Chris
1989-01-01
Continuous dynamical systems intuitively seem capable of more complex behavior than discrete systems. If analyzed in the framework of the traditional theory of computation, a continuous dynamical system with countably many quasistable states has at least the computational power of a universal Turing machine. Such an analysis assumes, however, the classical notion of measurement. If measurement is viewed nonclassically, a continuous dynamical system cannot, even in principle, exhibit behavior that cannot be simulated by a universal Turing machine.
Analytical solutions for coagulation and condensation kinetics of composite particles
NASA Astrophysics Data System (ADS)
Piskunov, Vladimir N.
2013-04-01
The formation of composite particles consisting of a mixture of different materials is central to many practical problems: analysis of the consequences of accidental releases into the atmosphere, simulation of precipitation formation in clouds, and description of multi-phase processes in chemical reactors and industrial facilities. Computer codes developed for numerical simulation of these processes require optimization of computational methods and verification of numerical programs. Kinetic equations of composite particle formation are given in this work in a concise form (impurity integrated). Coagulation, condensation and external sources associated with nucleation are taken into account. Analytical solutions were obtained in a number of model cases. The general laws for the redistribution of impurity fractions were defined. The results can be applied to develop numerical algorithms that considerably reduce the simulation effort, as well as to verify numerical programs for calculating the formation kinetics of composite particles in problems of practical importance.
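For a constant coagulation kernel, the total number density has the classical analytic solution N(t) = N0/(1 + K*N0*t/2), which makes a convenient target for verifying a numerical scheme, the kind of code verification the abstract calls for. A minimal sketch:

```python
def smoluchowski_constant_kernel(n0, K, kmax, dt, steps):
    """Explicit Euler integration of the discrete Smoluchowski coagulation
    equation with a constant kernel K; n[k] is the number density of k-mers."""
    n = [0.0] * (kmax + 1)
    n[1] = n0                      # monodisperse initial condition
    for _ in range(steps):
        N = sum(n)
        dn = [0.0] * (kmax + 1)
        for k in range(1, kmax + 1):
            # Gain: i-mer + (k-i)-mer collisions; loss: k-mer with anything.
            gain = 0.5 * K * sum(n[i] * n[k - i] for i in range(1, k))
            loss = K * n[k] * N
            dn[k] = gain - loss
        for k in range(1, kmax + 1):
            n[k] += dt * dn[k]
    return n

n = smoluchowski_constant_kernel(n0=1.0, K=1.0, kmax=30, dt=0.001, steps=1000)
total = sum(n)                         # analytic: 1 / (1 + 0.5 * t) at t = 1
mass = sum(k * nk for k, nk in enumerate(n))
```

Total number should match the analytic decay, and total mass should be conserved up to truncation at kmax, which is negligible at this time for a monodisperse start.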
Molecular-dynamics simulations of urea nucleation from aqueous solution
Salvalaglio, Matteo; Perego, Claudio; Giberti, Federico; Mazzotti, Marco; Parrinello, Michele
2015-01-01
Despite its ubiquitous character and relevance in many branches of science and engineering, nucleation from solution remains elusive. In this framework, molecular simulations represent a powerful tool to provide insight into nucleation at the molecular scale. In this work, we combine theory and molecular simulations to describe urea nucleation from aqueous solution. Taking advantage of well-tempered metadynamics, we compute the free-energy change associated to the phase transition. We find that such a free-energy profile is characterized by significant finite-size effects that can, however, be accounted for. The description of the nucleation process emerging from our analysis differs from classical nucleation theory. Nucleation of crystal-like clusters is in fact preceded by large concentration fluctuations, indicating a predominant two-step process, whereby embryonic crystal nuclei emerge from dense, disordered urea clusters. Furthermore, in the early stages of nucleation, two different polymorphs are seen to compete. PMID:25492932
Human strength simulations for one and two-handed tasks in zero gravity
NASA Technical Reports Server (NTRS)
1972-01-01
A description is given of a three-dimensional hand force capability model for the seated operator and a biomechanical model for analysis of symmetric sagittal plane activities. The models are used to simulate and study human strengths for one- and two-handed tasks in zero gravity. Specific conditions considered include: (1) one hand active, (2) both hands active but with different force directions on each, (3) body bracing situations provided by a portable foot restraint when standing and a lap belt when seated, (4) static or slow movement tasks with a maximum length of 4 seconds and a minimum rest of 5 minutes between exertions, and (5) a wide range of hand positions relative to either the feet or the bisection of a line connecting the hip centers. Simulations were also made for shirt-sleeved individuals and for male population strengths with anthropometry matching that of astronauts.
Estimation of Lightning Levels on a Launcher Using a BEM-Compressed Model
NASA Astrophysics Data System (ADS)
Silly, J.; Chaigne, B.; Aspas-Puertolas, J.; Herlem, Y.
2016-05-01
As development cycles in the space industry are being considerably reduced, it seems mandatory to deploy fast analysis methods for engineering purposes in parallel, without sacrificing accuracy. In this paper we present the application of such methods to the early Phase A-B [1] evaluation of lightning constraints on a launch vehicle. A complete 3D parametric model of a launcher has thus been developed and simulated with a Boundary Element Method (BEM) frequency-domain simulator (equipped with a low-frequency algorithm). The time domain values of the observed currents and fields are obtained by post-treatment using an inverse discrete Fourier transform (IDFT). This model is used for lightning studies; in particular, the simulations are useful to analyse the influence of lightning injected currents on the resulting currents circulating on external cable raceways. The description of the model and some of those results are presented in this article.
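The frequency-to-time post-treatment can be sketched as follows: a frequency sweep yields a transfer function, its product with the spectrum of the injected waveform is transformed back via an inverse DFT. The double-exponential injection current and the first-order transfer function below are illustrative assumptions standing in for the actual BEM sweep:

```python
import numpy as np

# Frequency-to-time post-treatment: multiply the spectrum of the injected
# lightning current by a transfer function H(f) from a frequency sweep,
# then recover the time-domain response with an inverse DFT.

fs = 1.0e8                      # sampling rate, Hz (assumption)
n = 4096
t = np.arange(n) / fs

# Generic double-exponential lightning injection current (illustrative)
i_inj = 30e3 * (np.exp(-t / 50e-6) - np.exp(-t / 1e-6))

I = np.fft.rfft(i_inj)          # spectrum of the injected current
f = np.fft.rfftfreq(n, d=1 / fs)

# Hypothetical first-order low-pass standing in for the BEM-computed sweep
H = 1.0 / (1.0 + 1j * f / 1e6)

i_out = np.fft.irfft(I * H, n)  # time-domain response via inverse DFT
print(i_out.shape)              # (4096,)
```

Using the real-input transforms (`rfft`/`irfft`) guarantees a real time-domain current, matching the physical quantity being post-processed.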
Low, Diana H P; Motakis, Efthymios
2013-10-01
Binding free energy calculations obtained through molecular dynamics simulations reflect intermolecular interaction states through a series of independent snapshots. Typically, the free energies of multiple simulated series (each with slightly different starting conditions) need to be estimated. Previous approaches carry out this task by moving averages at certain decorrelation times, assuming that the system comes from a single conformation description of binding events. Here, we discuss a more general approach that uses statistical modeling, wavelets denoising and hierarchical clustering to estimate the significance of multiple statistically distinct subpopulations, reflecting potential macrostates of the system. We present the deltaGseg R package that performs macrostate estimation from multiple replicated series and allows molecular biologists/chemists to gain physical insight into the molecular details that are not easily accessible by experimental techniques. deltaGseg is a Bioconductor R package available at http://bioconductor.org/packages/release/bioc/html/deltaGseg.html.
Krzysztof, Naus; Aleksander, Nowak
2016-01-01
The article presents a study of the accuracy of estimating the position coordinates of BAUV (Biomimetic Autonomous Underwater Vehicle) by the extended Kalman filter (EKF) method. The fusion of movement parameters measurements and position coordinates fixes was applied. The movement parameters measurements are carried out by on-board navigation devices, while the position coordinates fixes are done by the USBL (Ultra Short Base Line) system. The problem of underwater positioning and the conceptual design of the BAUV navigation system constructed at the Naval Academy (Polish Naval Academy—PNA) are presented in the first part of the paper. The second part consists of description of the evaluation results of positioning accuracy, the genesis of the problem of selecting method for underwater positioning, and the mathematical description of the method of estimating the position coordinates using the EKF method by the fusion of measurements with on-board navigation and measurements obtained with the USBL system. The main part contains a description of experimental research. It consists of a simulation program of navigational parameter measurements carried out during the BAUV passage along the test section. Next, the article covers the determination of position coordinates on the basis of simulated parameters, using EKF and DR methods and the USBL system, which are then subjected to a comparative analysis of accuracy. The final part contains systemic conclusions justifying the desirability of applying the proposed fusion method of navigation parameters for the BAUV positioning. PMID:27537884
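The fusion of dead-reckoned motion with sparse USBL position fixes can be illustrated with a minimal linear Kalman filter; the paper's EKF additionally linearizes the nonlinear BAUV kinematics, and all noise levels below are assumptions:

```python
import numpy as np

# Minimal constant-velocity Kalman filter fusing dead-reckoned motion
# (predict step) with sparse USBL-style position fixes (update step).
# A linear sketch of the fusion idea only; noise levels are illustrative.

dt = 1.0
F = np.array([[1, dt], [0, 1]])       # state: [position, velocity]
H = np.array([[1.0, 0.0]])            # USBL measures position only
Q = np.diag([0.01, 0.01])             # process noise (assumption)
R = np.array([[4.0]])                 # USBL fix variance (assumption)

x = np.array([0.0, 1.0])              # initial state estimate
P = np.eye(2)

rng = np.random.default_rng(0)
truth = 0.0
for k in range(50):
    truth += 1.0 * dt                 # true vehicle moves at 1 m/s
    # predict (dead reckoning)
    x = F @ x
    P = F @ P @ F.T + Q
    if k % 5 == 0:                    # a USBL fix every 5th step
        z = np.array([truth + rng.normal(0, 2.0)])
        y = z - H @ x                 # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P

print(round(x[0], 2))                 # estimate stays near the true position
```

Between fixes the covariance P grows, so the filter weights the next USBL fix more heavily, which is exactly the behavior that bounds the dead-reckoning drift discussed in the article.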
Skrivanek, Zachary; Berry, Scott; Berry, Don; Chien, Jenny; Geiger, Mary Jane; Anderson, James H.; Gaydos, Brenda
2012-01-01
Background Dulaglutide (dula, LY2189265), a long-acting glucagon-like peptide-1 analog, is being developed to treat type 2 diabetes mellitus. Methods To foster the development of dula, we designed a two-stage adaptive, dose-finding, inferentially seamless phase 2/3 study. The Bayesian theoretical framework is used to adaptively randomize patients in stage 1 to 7 dula doses and, at the decision point, to either stop for futility or to select up to 2 dula doses for stage 2. After dose selection, patients continue to be randomized to the selected dula doses or comparator arms. Data from patients assigned the selected doses will be pooled across both stages and analyzed with an analysis of covariance model, using baseline hemoglobin A1c and country as covariates. The operating characteristics of the trial were assessed by extensive simulation studies. Results Simulations demonstrated that the adaptive design would identify the correct doses 88% of the time, compared to as low as 6% for a fixed-dose design (the latter value based on frequentist decision rules analogous to the Bayesian decision rules for adaptive design). Conclusions This article discusses the decision rules used to select the dula dose(s); the mathematical details of the adaptive algorithm—including a description of the clinical utility index used to mathematically quantify the desirability of a dose based on safety and efficacy measurements; and a description of the simulation process and results that quantify the operating characteristics of the design. PMID:23294775
Bansal, Artee; Valiya Parambathu, Arjun; Asthagiri, D; Cox, Kenneth R; Chapman, Walter G
2017-04-28
We present a theory to predict the structure and thermodynamics of mixtures of colloids of different diameters, building on our earlier work [A. Bansal et al., J. Chem. Phys. 145, 074904 (2016)] that considered mixtures with all particles constrained to have the same size. The patchy, solvent particles have short-range directional interactions, while the solute particles have short-range isotropic interactions. The hard-sphere mixture without any association site forms the reference fluid. An important ingredient within the multi-body association theory is the description of clustering of the reference solvent around the reference solute. Here we account for the physical, multi-body clusters of the reference solvent around the reference solute in terms of occupancy statistics in a defined observation volume. These occupancy probabilities are obtained from enhanced sampling simulations, but we also present statistical mechanical models to estimate these probabilities with limited simulation data. Relative to an approach that describes only up to three-body correlations in the reference, incorporating the complete reference information better predicts the bonding state and thermodynamics of the physical solute for a wide range of system conditions. Importantly, analysis of the residual chemical potential of the infinitely dilute solute from molecular simulation and theory shows that whereas the chemical potential is somewhat insensitive to the description of the structure of the reference fluid, the energetic and entropic contributions are not, with the results from the complete reference approach being in better agreement with particle simulations.
NASA Astrophysics Data System (ADS)
Paganini, Michela; de Oliveira, Luke; Nachman, Benjamin
2018-01-01
Physicists at the Large Hadron Collider (LHC) rely on detailed simulations of particle collisions to build expectations of what experimental data may look like under different theoretical modeling assumptions. Petabytes of simulated data are needed to develop analysis techniques, though they are expensive to generate using existing algorithms and computing resources. The modeling of detectors and the precise description of particle cascades as they interact with the material in the calorimeter are the most computationally demanding steps in the simulation pipeline. We therefore introduce a deep neural network-based generative model to enable high-fidelity, fast, electromagnetic calorimeter simulation. There are still challenges for achieving precision across the entire phase space, but our current solution can reproduce a variety of particle shower properties while achieving speedup factors of up to 100 000 × . This opens the door to a new era of fast simulation that could save significant computing time and disk space, while extending the reach of physics searches and precision measurements at the LHC and beyond.
Curve Boxplot: Generalization of Boxplot for Ensembles of Curves.
Mirzargar, Mahsa; Whitaker, Ross T; Kirby, Robert M
2014-12-01
In simulation science, computational scientists often study the behavior of their simulations by repeated solutions with variations in parameters and/or boundary values or initial conditions. Through such simulation ensembles, one can try to understand or quantify the variability or uncertainty in a solution as a function of the various inputs or model assumptions. In response to a growing interest in simulation ensembles, the visualization community has developed a suite of methods for allowing users to observe and understand the properties of these ensembles in an efficient and effective manner. An important aspect of visualizing simulations is the analysis of derived features, often represented as points, surfaces, or curves. In this paper, we present a novel, nonparametric method for summarizing ensembles of 2D and 3D curves. We propose an extension of a method from descriptive statistics, data depth, to curves. We also demonstrate a set of rendering and visualization strategies for showing rank statistics of an ensemble of curves, which is a generalization of traditional whisker plots or boxplots to multidimensional curves. Results are presented for applications in neuroimaging, hurricane forecasting and fluid dynamics.
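The depth-based ranking underlying a curve boxplot can be sketched with the modified band depth, a common data-depth notion for curves (used here as an illustrative stand-in for the paper's exact formulation): a curve is deep if, on average over all pairs of other curves, a large fraction of its points lies inside the band the pair spans.

```python
import numpy as np

# Modified band depth for an ensemble of 1D curves sampled on a common
# grid: for each curve, average over all pairs (j, k) the fraction of
# sample points lying inside the pointwise band [min, max] of that pair.

def modified_band_depth(curves):
    n, m = curves.shape            # n curves sampled at m points
    depth = np.zeros(n)
    n_pairs = n * (n - 1) / 2
    for i in range(n):
        for j in range(n):
            for k in range(j + 1, n):
                lo = np.minimum(curves[j], curves[k])
                hi = np.maximum(curves[j], curves[k])
                depth[i] += ((curves[i] >= lo) & (curves[i] <= hi)).mean()
    return depth / n_pairs

# Nine vertically offset sine curves; the middle offset is the median.
x = np.linspace(0, 1, 50)
offsets = np.linspace(-0.4, 0.4, 9)
ensemble = np.array([np.sin(2 * np.pi * x) + c for c in offsets])
depths = modified_band_depth(ensemble)
print("deepest (median) curve:", int(np.argmax(depths)))  # → deepest (median) curve: 4
```

Sorting curves by depth yields the rank statistics (median curve, central 50% band, whiskers, outliers) that the curve boxplot renders.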
Dynamic Model of the BIO-Plex Air Revitalization System
NASA Technical Reports Server (NTRS)
Finn, Cory; Meyers, Karen; Duffield, Bruce; Luna, Bernadette (Technical Monitor)
2000-01-01
The BIO-Plex facility will need to support the testing and evaluation of a variety of life support system designs and operation strategies. An important goal of the life support program is to identify designs that best meet all size and performance constraints for a variety of possible future missions. Integrated human testing is a necessary step in reaching this goal, and system modeling and analysis will also play an important role in this endeavor. Currently, simulation studies are being used to estimate air revitalization buffer and storage requirements in order to develop the infrastructure requirements of the BIO-Plex facility. Simulation studies are also being used to verify that the envisioned operation strategy will be able to meet all performance criteria. In this paper, a simulation study is presented for a nominal BIO-Plex scenario with a high level of crop growth. A general description of the dynamic mass flow model is provided, along with some simulation results. The paper also discusses sizing and operations issues and describes plans for future simulation studies.
Led, Santiago; Azpilicueta, Leire; Aguirre, Erik; de Espronceda, Miguel Martínez; Serrano, Luis; Falcone, Francisco
2013-01-01
In this work, a novel ambulatory ECG monitoring device developed in-house, called HOLTIN, is analyzed when operating in complex indoor scenarios. The HOLTIN system is described, from the technological platform level to its functional model. In addition, the wireless channel behavior that enables ubiquitous operation is analyzed using an in-house 3D ray launching simulation code. The effect of human body presence is taken into account by a novel simplified model embedded within the 3D ray launching code. Simulation as well as measurement results are presented, showing good agreement. These results may aid in the adequate deployment of this novel device to automate conventional medical processes, increasing the coverage radius and optimizing energy consumption. PMID:23584122
Building and occupant characteristics as determinants of residential energy consumption
NASA Astrophysics Data System (ADS)
Nieves, L. A.; Nieves, A. L.
1981-10-01
The probable effects of building energy performance standards on energy consumption were studied. Observations of actual residential energy consumption that could affirm or disaffirm consumption estimates of the Department of Energy's DOE 2.0A simulation model were obtained. Homeowners' conservation investments and home purchase decisions were investigated. The investigation of determinants of household energy consumption is described. The underlying economic theory and its implications are given, as well as a description of the data collection procedures, the formulation of variables, and the data analysis and findings. The assumptions and limitations of the energy use projections generated by the DOE 2.0A model are discussed. Actual electricity data for the houses are then compared with results of the simulation.
Numerical Simulation of the "Fluid Mechanical Sewing Machine"
NASA Astrophysics Data System (ADS)
Brun, Pierre-Thomas; Audoly, Basile; Ribe, Neil
2011-11-01
A thin thread of viscous fluid falling onto a moving conveyor belt generates a wealth of complex "stitch" patterns depending on the belt speed and the fall height. To understand the rich nonlinear dynamics of this system, we have developed a new numerical code for simulating unsteady viscous threads, based on a discrete description of the geometry and a variational formulation for the viscous stresses. The code successfully reproduces all major features of the experimental state diagram of Morris et al. (Phys. Rev. E 2008). Fourier analysis of the motion of the thread's contact point with the belt suggests a new classification of the observed patterns, and reveals that the system behaves as a nonlinear oscillator coupling the pendulum modes of the thread.
Novel probabilistic neuroclassifier
NASA Astrophysics Data System (ADS)
Hong, Jiang; Serpen, Gursel
2003-09-01
This paper proposes a novel probabilistic potential function neural network classifier algorithm to deal with classes that are multi-modally distributed and formed from sets of disjoint pattern clusters. The proposed classifier has a number of desirable properties that distinguish it from other neural network classifiers. A complete description of the algorithm, in terms of its architecture and pseudocode, is presented. A simulation analysis of the newly proposed neuro-classifier algorithm on a set of benchmark problems is presented. Benchmark problems tested include IRIS, Sonar, Vowel Recognition, Two-Spiral, Wisconsin Breast Cancer, Cleveland Heart Disease and Thyroid Gland Disease. Simulation results indicate that the proposed neuro-classifier performs consistently better on a subset of problems for which other neural classifiers perform relatively poorly.
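The flavor of a potential-function classifier that copes with disjoint pattern clusters can be sketched with generic Parzen-window machinery; this is not the paper's algorithm, and the bandwidth sigma is a tuning assumption:

```python
import numpy as np

# Parzen-window probabilistic classifier in the spirit of potential
# function networks: each class score is the mean of Gaussian kernels
# centered on that class's training patterns, so multi-modal classes
# (disjoint clusters) are handled naturally.

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    preds = []
    for x in X_test:
        scores = {}
        for c in np.unique(y_train):
            pts = X_train[y_train == c]
            d2 = np.sum((pts - x) ** 2, axis=1)
            scores[c] = np.mean(np.exp(-d2 / (2 * sigma**2)))
        preds.append(max(scores, key=scores.get))
    return np.array(preds)

# Two disjoint clusters form class 0; one central cluster forms class 1.
X = np.array([[-3, 0], [-3.1, 0.2], [3, 0], [3.2, -0.1], [0, 0], [0.1, 0.2]])
y = np.array([0, 0, 0, 0, 1, 1])
print(pnn_predict(X, y, np.array([[-2.9, 0.1], [0.05, 0.0], [3.1, 0.0]])))
# → [0 1 0]
```

Because the class score is a sum of local kernels rather than a single prototype, class 0 is recognized in both of its disjoint clusters, which is the property the abstract highlights.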
NASA Technical Reports Server (NTRS)
Segal, M.; Pielke, R. A.; Mcnider, R. T.; Mcdougal, D. S.
1982-01-01
The mesoscale numerical model of the University of Virginia (UVMM) has been applied to the greater Chesapeake Bay area in order to provide a detailed description of the air pollution meteorology during a typical summer day. This model provides state-of-the-art simulations for land-sea thermally induced circulations. The model-predicted results agree favorably with available observed data. The effects of synoptic flow and sea breeze coupling on air pollution meteorological characteristics in this region are demonstrated by a spatial and temporal presentation of various model-predicted fields. A transport analysis based on predicted wind velocities indicated possible recirculation of pollutants back onto the Atlantic coast due to the sea breeze circulation.
OpenFOAM Modeling of Particle Heating and Acceleration in Cold Spraying
NASA Astrophysics Data System (ADS)
Leitz, K.-H.; O'Sullivan, M.; Plankensteiner, A.; Kestler, H.; Sigl, L. S.
2018-01-01
In cold spraying, a powder material is accelerated and heated in the gas flow of a supersonic nozzle to velocities and temperatures that are sufficient to obtain cohesion of the particles to a substrate. The deposition efficiency of the particles is significantly determined by their velocity and temperature. Particle velocity correlates with the amount of kinetic energy that is converted to plastic deformation and thermal heating. The initial particle temperature significantly influences the mechanical properties of the particle. Velocity and temperature of the particles depend nonlinearly on the pressure and temperature of the gas at the nozzle entrance. In this contribution, a simulation model based on the reactingParcelFoam solver of OpenFOAM is presented and applied for an analysis of particle velocity and temperature in the cold spray nozzle. The model combines a compressible description of the gas flow in the nozzle with Lagrangian particle tracking. The predictions of the simulation model are verified against an analytical description of the gas flow and of the particle acceleration and heating in the nozzle. Based on experimental data, the drag model according to Plessis and Masliyah is identified as best suited for OpenFOAM modeling of particle heating and acceleration in cold spraying.
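The particle acceleration and heating can be sketched with one-way coupled Lagrangian tracking of a single particle in a uniform gas stream; the gas state, the copper-like particle properties, and the Schiller-Naumann drag correlation below are illustrative stand-ins for the nozzle-resolved flow and the Plessis and Masliyah drag model used in the paper:

```python
import math

# One-way coupled Lagrangian tracking of a single particle in a uniform
# gas stream: drag acceleration plus convective heating (Ranz-Marshall
# Nusselt correlation), integrated with explicit Euler. All property
# values are illustrative assumptions.

rho_g, mu_g, cp_g, k_g = 0.5, 2.0e-5, 1005.0, 0.03   # gas properties (assumed)
u_g, T_g = 800.0, 600.0                               # gas velocity [m/s], temp [K]
d_p, rho_p, cp_p = 20e-6, 8900.0, 385.0               # copper-like particle (assumed)

m_p = rho_p * math.pi * d_p**3 / 6                    # particle mass
A_p = math.pi * d_p**2                                # particle surface area

v, T = 0.0, 300.0
dt = 1e-7
for _ in range(20000):                                # 2 ms of flight
    rel = u_g - v
    Re = rho_g * abs(rel) * d_p / mu_g
    Cd = 24 / Re * (1 + 0.15 * Re**0.687) if Re > 1e-9 else 0.0
    drag = 0.75 * Cd * rho_g / (rho_p * d_p) * abs(rel) * rel
    Pr = mu_g * cp_g / k_g
    Nu = 2 + 0.6 * math.sqrt(Re) * Pr**(1 / 3)        # Ranz-Marshall
    h = Nu * k_g / d_p
    v += dt * drag
    T += dt * h * A_p * (T_g - T) / (m_p * cp_p)

print(round(v), round(T))  # particle approaches gas velocity and temperature
```

The particle relaxes toward, but never exceeds, the gas velocity and temperature, which is why the gas state at the nozzle entrance sets the upper bound on what impact conditions can be reached.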
NASA Astrophysics Data System (ADS)
Bourke, Jason Michael
This study seeks to restore the internal anatomy within the nasal passages of dinosaurs via comparative anatomical methods along with computational fluid dynamic simulations. Nasal airway descriptions and airflow simulations are described for extant birds, crocodylians, and lizards. These descriptions served as a baseline for airflow within the nasal passages of diapsids. Shared airflow and soft-tissue properties found in the nasal passages of extant diapsids were used to restore soft tissues within the airways of dinosaurs, under the assumption that biologically unfeasible airflow patterns (e.g., lack of air movement in the olfactory recess) can serve as signals for missing soft tissues. This methodology was tested on several dinosaur taxa. Restored airways in some taxa revealed the potential presence and likely shape of nasal turbinates. Heat transfer efficiency was tested in two dinosaur species with elaborated nasal passages. Results of that analysis revealed that dinosaur noses were efficient heat exchangers that likely played an integral role in maintaining cephalic thermoregulation. Brain cooling via nasal expansion appears to have been necessary for dinosaurs to have achieved their immense body sizes without overheating their brains.
ERIC Educational Resources Information Center
Abrams, Macy L.; And Others
A prototype arc welding training simulator was designed to provide immediate, discriminative feedback and the capacity for concentrated practice. Two randomly selected groups of welding trainees were compared to evaluate the simulator, one group being trained using the simulator and the other using conventional practice. Preliminary data indicated…
Meaningful Use of Simulation as an Educational Method in Nursing Programs
ERIC Educational Resources Information Center
Thompson, Teri L.
2011-01-01
The purpose of this descriptive study was to examine the use of simulation technology within nursing programs leading to licensure as registered nurses. In preparation for this study, the Use of Simulation Technology Inventory (USTI) was developed, based on the structure-processes-outcomes model and the current literature on simulation. The…
Description and performance of the Langley differential maneuvering simulator
NASA Technical Reports Server (NTRS)
Ashworth, B. R.; Kahlbaum, W. M., Jr.
1973-01-01
The differential maneuvering simulator for simulating two aircraft or spacecraft operating in a differential mode is described. Tests made to verify that the system could provide the required simulated aircraft motions are given. The mathematical model which converts computed aircraft motions into the required motions of the various projector gimbals is described.
NASA Technical Reports Server (NTRS)
Burgin, G. H.; Owens, A. J.
1975-01-01
A detailed description of the computer programs is presented in order to provide an understanding of the mathematical and geometrical relationships as implemented in the programs. The individual subroutines and their underlying mathematical relationships are described, and the required input data and the output provided by the program are explained. The relationship of the adaptive maneuvering logic program with the program to drive the differential maneuvering simulator is discussed.
Materials by numbers: Computations as tools of discovery
Landman, Uzi
2005-01-01
Current issues pertaining to theoretical simulations of materials, with a focus on systems of nanometer-scale dimensions, are discussed. The use of atomistic simulations as high-resolution numerical experiments, enabling and guiding formulation and testing of analytic theoretical descriptions, is demonstrated through studies of the generation and breakup of nanojets, which have led to the derivation of a stochastic hydrodynamic description. Subsequently, I illustrate the use of computations and simulations as tools of discovery, with examples that include the self-organized formation of nanowires, the surprising nanocatalytic activity of small aggregates of gold that, in the bulk form, is notorious for being chemically inert, and the emergence of rotating electron molecules in two-dimensional quantum dots. I conclude with a brief discussion of some key challenges in nanomaterials simulations. PMID:15870210
NASA Astrophysics Data System (ADS)
Brown, R.; Pasternack, G. B.
2011-12-01
The description of fluvial form has evolved from anecdotal descriptions to artistic renderings to 2D plots of cross sections or longitudinal profiles and, more recently, 3D digital models. Synthetic river valleys, artificial 3D topographic models of river topography, have a plethora of potential applications in fluvial geomorphology, and the earth sciences in general, as well as in computer science and ecology. Synthetic river channels have existed implicitly since approximately the 1970s and can be simulated from a variety of approaches spanning the artistic and numerical. An objective method of synthesizing 3D stream topography based on reach scale attributes would be valuable for sizing 3D flumes in the physical and numerical realms, as initial input topography for morphodynamic models, stream restoration design, historical reconstruction, and mechanistic testing of interactions of channel geometric elements. Quite simply, simulation of synthetic channel geometry under prescribed conditions allows systematic evaluation of the dominant relationships between river flow and geometry. A new model, the control curve method, is presented that uses hierarchically scaled parametric curves in over-lapping 2D planes to create synthetic river valleys. The approach is able to simulate 3D stream geometry from paired 2D descriptions and can allow experimental insight into form-process relationships in addition to visualizing past measurements of channel form that are limited to two-dimensional descriptions. Results are presented that illustrate the model's ability to simulate fluvial topography representative of real-world rivers as well as how channel geometric elements can be adjusted.
The testing of synthetic river valleys would open up a wealth of knowledge as to why some 3D attributes of river channels are more prevalent than others, as well as bridge the gap between the 2D descriptions that have dominated fluvial geomorphology for the past century and modern, more complete, 3D treatments.
Novel 3D/VR interactive environment for MD simulations, visualization and analysis.
Doblack, Benjamin N; Allis, Tim; Dávila, Lilian P
2014-12-18
The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced.
Potter, Julie Elizabeth; Gatward, Jonathan J; Kelly, Michelle A; McKay, Leigh; McCann, Ellie; Elliott, Rosalind M; Perry, Lin
2017-12-01
The approach, communication skills, and confidence of clinicians responsible for raising the option of deceased organ donation may influence families' donation decisions. The aim of this study was to increase the preparedness and confidence of intensive care clinicians allocated to work in a "designated requester" role. We conducted a posttest evaluation of an innovative simulation-based training program. Simulation-based training enabled clinicians to rehearse the "balanced approach" to family donation conversations (FDCs) in the designated requester role. Professional actors played family members in simulated clinical settings using authentic scenarios, with video-assisted reflective debriefing. Participants completed an evaluation after the workshop. Simple descriptive statistical analysis and content analysis were performed. Between January 2013 and July 2015, 25 workshops were undertaken with 86 participants; 82 (95.3%) returned evaluations. Respondents were registered practicing clinicians; over half (44/82; 53.7%) were intensivists. Most attended a single workshop. Evaluations were overwhelmingly positive, with the majority rating workshops as outstanding (64/80; 80%). Scenario fidelity, competence of the actors, opportunity to practice and receive feedback on performance, and feedback from actors, both in and out of character, were particularly valued. Most (76/78; 97.4%) reported feeling more confident about their designated requester role. Simulation-based communication training for the designated requester role in FDCs increased the knowledge and confidence of clinicians to raise the topic of donation.
Ievlev, Anton V; Jesse, Stephen; Cochell, Thomas J; Unocic, Raymond R; Protopopescu, Vladimir A; Kalinin, Sergei V
2015-12-22
Recent advances in liquid cell (scanning) transmission electron microscopy (S)TEM have enabled in situ nanoscale investigations of controlled nanocrystal growth mechanisms. Here, we experimentally and quantitatively investigated the nucleation and growth mechanisms of Pt nanostructures from an aqueous solution of K2PtCl6. Averaged statistical, network, and local approaches have been used for the data analysis and the description of both collective particle dynamics and local growth features. In particular, interaction between neighboring particles has been revealed and attributed to reduction of the platinum concentration in the vicinity of the particle boundary. The local approach for solving the inverse problem showed that particle dynamics can be simulated by a stationary diffusional model. The obtained results are important for understanding nanocrystal formation and growth processes and for optimization of synthesis conditions.
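The stationary diffusional growth picture invoked above has a simple closed form: if the flux of solute to a particle is diffusion-limited, then r·dr/dt = D·Ω·ΔC, giving r(t) = sqrt(r0² + 2·D·Ω·ΔC·t). A short sketch in Python; the symbols and parameter values are generic illustrations, not quantities fitted to the paper's Pt data.

```python
import math

def radius_diffusion_limited(t, r0, D, omega, dC):
    """Stationary diffusion-limited growth: r dr/dt = D * omega * dC
    integrates to r(t) = sqrt(r0^2 + 2 * D * omega * dC * t)."""
    return math.sqrt(r0 * r0 + 2.0 * D * omega * dC * t)

# Forward-Euler integration of dr/dt = D*omega*dC / r should track the
# closed form (all values below are illustrative assumptions).
r, dt, D, omega, dC = 1.0, 1e-3, 2.0, 0.5, 0.1
for _ in range(10000):
    r += dt * D * omega * dC / r
exact = radius_diffusion_limited(10.0, 1.0, 2.0, 0.5, 0.1)
assert abs(r - exact) < 1e-3
```

The characteristic square-root-of-time radius growth is the signature used to identify diffusion-limited (as opposed to reaction-limited) regimes in such experiments.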
Li, Xianfeng; Hassan, Sergio A.; Mehler, Ernest L.
2006-01-01
Long dynamics simulations were carried out on the B1 immunoglobulin-binding domain of streptococcal protein G (ProtG) and bovine pancreatic trypsin inhibitor (BPTI) using atomistic descriptions of the proteins and a continuum representation of solvent effects. To mimic frictional and random collision effects, Langevin dynamics (LD) were used. The main goal of the calculations was to explore the stability of tens-of-nanosecond trajectories as generated by this molecular mechanics approximation and to analyze in detail structural and dynamical properties. Conformational fluctuations, order parameters, cross correlation matrices, residue solvent accessibilities, pKa values of titratable groups, and hydrogen-bonding (HB) patterns were calculated from all of the trajectories and compared with available experimental data. The simulations comprised over 40 ns per trajectory for ProtG and over 30 ns per trajectory for BPTI. For comparison, explicit water molecular dynamics simulations (EW/MD) of 3 ns and 4 ns, respectively, were also carried out. Two continuum simulations were performed on each protein using the CHARMM program, one with the all-atom PAR22 representation of the protein force field (here referred to as PAR22/LD simulations) and the other with the modifications introduced by the recently developed CMAP potential (CMAP/LD simulations). The explicit solvent simulations were performed with PAR22 only. Solvent effects are described by a continuum model based on screened Coulomb potentials (SCP) reported earlier, i.e., the SCP-based implicit solvent model (SCP–ISM). For ProtG, both the PAR22/LD and the CMAP/LD 40-ns trajectories were stable, yielding Cα root mean square deviations (RMSD) of about 1.0 and 0.8 Å respectively along the entire simulation time, compared to 0.8 Å for the EW/MD simulation. 
For BPTI, only the CMAP/LD trajectory was stable for the entire 30-ns simulation, with a Cα RMSD of ≈ 1.4 Å, while the PAR22/LD trajectory became unstable early in the simulation, reaching a Cα RMSD of about 2.7 Å and remaining at this value until the end of the simulation; the Cα RMSD of the EW/MD simulation was about 1.5 Å. The source of the instabilities of the BPTI trajectories in the PAR22/LD simulations was explored by an analysis of the backbone torsion angles. To further validate the findings from this analysis of BPTI, a 35-ns SCP–ISM simulation of Ubiquitin (Ubq) was carried out. For this protein, the CMAP/LD simulation was stable for the entire simulation time (Cα RMSD of ≈1.0 Å), while the PAR22/LD trajectory showed a trend similar to that in BPTI, reaching a Cα RMSD of ≈1.5 Å at 7 ns. All the calculated properties were found to be in agreement with the corresponding experimental values, although local deviations were also observed. HB patterns were also well reproduced by all the continuum solvent simulations with the exception of solvent-exposed side chain–side chain (sc–sc) HB in ProtG, where several of the HB interactions observed in the crystal structure and in the EW/MD simulation were lost. The overall analysis reported in this work suggests that the combination of an atomistic representation of a protein with a CMAP/CHARMM force field and a continuum representation of solvent effects such as the SCP–ISM provides a good description of structural and dynamic properties obtained from long computer simulations. Although the SCP–ISM simulations (CMAP/LD) reported here were shown to be stable and the properties well reproduced, further refinement is needed to attain a level of accuracy suitable for more challenging biological applications, particularly the study of protein–protein interactions. PMID:15959866
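The Cα RMSD values quoted throughout are root-mean-square deviations over matched Cα positions. A minimal sketch of the metric, assuming the structures are already superposed (production analyses first align the frames, e.g. with the Kabsch algorithm):

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two matched 3D coordinate sets,
    assumed already superposed onto a common frame."""
    if len(coords_a) != len(coords_b):
        raise ValueError("coordinate sets must have equal length")
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# Three hypothetical C-alpha positions shifted rigidly by 1 A along z:
ref = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (3.0, 0.0, 0.0)]
mov = [(0.0, 0.0, 1.0), (1.5, 0.0, 1.0), (3.0, 0.0, 1.0)]
assert abs(rmsd(ref, mov) - 1.0) < 1e-12
```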
LArSoft: toolkit for simulation, reconstruction and analysis of liquid argon TPC neutrino detectors
NASA Astrophysics Data System (ADS)
Snider, E. L.; Petrillo, G.
2017-10-01
LArSoft is a set of detector-independent software tools for the simulation, reconstruction and analysis of data from liquid argon (LAr) neutrino experiments. The common features of LAr time projection chambers (TPCs) enable sharing of algorithm code across detectors of very different size and configuration. LArSoft is currently used in production simulation and reconstruction by the ArgoNeuT, DUNE, LArIAT, MicroBooNE, and SBND experiments. The software suite offers a wide selection of algorithms and utilities, including those for associated photo-detectors and the handling of auxiliary detectors outside the TPCs. Available algorithms cover the full range of simulation and reconstruction, from raw waveforms to high-level reconstructed objects, event topologies and classification. The common code within LArSoft is contributed by adopting experiments, which also provide detector-specific geometry descriptions, and code for the treatment of electronic signals. LArSoft is also a collaboration of experiments, Fermilab and associated software projects which cooperate in setting requirements, priorities, and schedules. In this talk, we outline the general architecture of the software and the interaction with external libraries and detector-specific code. We also describe the dynamics of LArSoft software development between the contributing experiments, the projects supporting the software infrastructure LArSoft relies on, and the core LArSoft support project.
Kinetic theory-based numerical modeling and analysis of bi-disperse segregated mixture fluidized bed
DOE Office of Scientific and Technical Information (OSTI.GOV)
Konan, N. A.; Huckaby, E. D.
2017-06-21
We discuss a series of continuum Euler-Euler simulations of an initially mixed bi-disperse fluidized bed which segregates under certain operating conditions. The simulations use the multi-phase kinetic theory-based description of the momentum and energy exchanges between the phases by Simonin’s Group [see e.g. Gourdel, Simonin and Brunier (1999). Proceedings of 6th International Conference on Circulating Fluidized Beds, Germany, pp. 205-210]. The discussion and analysis of the results focus on the fluid-particle momentum exchange (i.e. drag). Simulations using mono- and poly-disperse fluid-particle drag correlations are analyzed for the Geldart D-type size bi-disperse gas-solid experiments performed by Goldschmidt et al. [Powder Tech., pp. 135-159 (2003)]. The poly-disperse gas-particle drag correlations account for the local particle size distribution by using an effective mixture diameter when calculating the Reynolds number and then correcting the resulting force coefficient. Simulation results show very good predictions of the segregation index for bi-disperse beds with the mono-disperse drag correlations, in contrast to the poly-disperse drag correlations, for which the segregation rate is systematically under-predicted. The statistical analysis of the results shows a clear separation in the distributions of the gas-particle mean relaxation times of the small and large particles in simulations using the mono-disperse drag. In contrast, the poly-disperse drag simulations show a significant overlap and a smaller difference in the mean particle relaxation times. This causes the small and large particles in the bed to respond to the gas similarly, without enough relative time lag. The results suggest that the difference in particle response times induces flow dynamics favorable to a force imbalance, which drives the segregation.
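The particle relaxation time at the center of this analysis has, in the low-Reynolds (Stokes) limit, the closed form τp = ρp·dp²/(18·μg). Geldart D particles sit well outside the Stokes regime, so the sketch below is only an order-of-magnitude illustration of why a size-bidisperse bed has well-separated response times; the numerical values are assumptions, not the Goldschmidt et al. conditions.

```python
def stokes_relaxation_time(rho_p, d_p, mu_g):
    """Stokes-regime particle relaxation time: tau_p = rho_p * d_p^2 / (18 * mu_g).
    Valid only at low particle Reynolds number; used here purely for scaling."""
    return rho_p * d_p ** 2 / (18.0 * mu_g)

# Illustrative glass beads in air (assumed values): doubling the diameter
# quadruples the relaxation time, so small and large particles respond to
# the gas on well-separated time scales.
mu_air = 1.8e-5      # gas viscosity, Pa*s
rho_glass = 2500.0   # particle density, kg/m^3
tau_small = stokes_relaxation_time(rho_glass, 1.5e-3, mu_air)
tau_large = stokes_relaxation_time(rho_glass, 3.0e-3, mu_air)
assert abs(tau_large / tau_small - 4.0) < 1e-12
```

The quadratic dependence on diameter is exactly why collapsing both species onto a single effective mixture diameter, as in the poly-disperse drag closure discussed above, compresses the spread of response times.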
Toward Theory Building in the Field of Instructional Games and Simulations
ERIC Educational Resources Information Center
Cruickshank, Donald R.; Mager, Gerald M.
1976-01-01
Three suggestions are made for improving on the present uncoordinated state of games and simulations: establish precise vocabulary, understand the relationships between simulation/gaming and other instructional alternatives, and instigate systematic research based on the descriptive--correlational--experimental loop model. (Author/LS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Provost, G.; Stone, H.; McClintock, M.
2008-01-01
To meet the growing demand for education and experience with the analysis, operation, and control of commercial-scale Integrated Gasification Combined Cycle (IGCC) plants, the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL) is leading a collaborative R&D project with participants from government, academia, and industry. One of the goals of this project is to develop a generic, full-scope, real-time IGCC dynamic plant simulator for use in establishing a world-class research and training center, as well as to promote and demonstrate the technology to power industry personnel. The NETL IGCC dynamic plant simulator will combine, for the first time, a process/gasification simulator and a power/combined-cycle simulator in a single dynamic simulation framework for use in training applications as well as engineering studies. As envisioned, the simulator will have the following features and capabilities: a high-fidelity, real-time, dynamic model of the process side (gasification and gas cleaning with CO2 capture) and the power-block side (combined cycle) of a generic IGCC plant fueled by coal and/or petroleum coke; full-scope training simulator capabilities including startup, shutdown, load following and shedding, response to fuel and ambient condition variations, control strategy analysis (turbine vs. gasifier lead, etc.), representative malfunctions/trips, alarms, scenarios, trending, snapshots, data historian, and trainee performance monitoring; and the ability to enhance and modify the plant model to facilitate studies of changes in plant configuration and equipment and to support future R&D efforts. To support this effort, process descriptions and control strategies were developed for key sections of the plant as part of the detailed functional specification, which will form the basis of the simulator development.
These plant sections include: slurry preparation; air separation unit; gasifiers; syngas scrubbers; shift reactors; gas cooling, medium-pressure (MP) and low-pressure (LP) steam generation, and knockout; sour water stripper; mercury removal; Selexol™ acid gas removal system; CO2 compression; syngas reheat and expansion; Claus plant; hydrogenation reactor and gas cooler; combustion turbine (CT)-generator assemblies; and heat recovery steam generators (HRSGs) and steam turbine (ST)-generator. In this paper, process descriptions, control strategies, and Process & Instrumentation Diagram (P&ID) drawings for key sections of the generic IGCC plant are presented, along with discussions of some of the operating procedures and representative faults that the simulator will cover. Some of the intended future applications for the simulator are discussed, including plant operation and control demonstrations as well as education and training services such as IGCC familiarization courses.
A Computational Approach for Probabilistic Analysis of LS-DYNA Water Impact Simulations
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.
2010-01-01
NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many challenges similar to those worked in the 1960s during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time-consuming, and computationally intensive simulations. Because of the computational cost, these tools are often used to evaluate specific conditions and rarely used for statistical analysis. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. For this problem, response surface models are used to predict the system time responses to a water landing as a function of capsule speed, direction, attitude, water speed, and water direction. Furthermore, these models can also be used to ascertain the adequacy of the design in terms of probability measures. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
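The surrogate idea is simple: fit an inexpensive response surface to a handful of costly LS-DYNA runs, then interpolate. The paper's surfaces are multivariate (capsule speed, direction, attitude, water speed, water direction); the sketch below shrinks this to a one-variable quadratic least-squares fit via the normal equations, purely to show the mechanics.

```python
def fit_quadratic(xs, ys):
    """Least-squares quadratic surrogate y ~ c0 + c1*x + c2*x^2 via the 3x3
    normal equations, solved by Gaussian elimination with partial pivoting."""
    n = 3
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for row in range(col + 1, n):
            f = A[row][col] / A[col][col]
            for k in range(col, n):
                A[row][k] -= f * A[col][k]
            b[row] -= f * b[col]
    coeffs = [0.0] * n
    for i in reversed(range(n)):
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j] for j in range(i + 1, n))) / A[i][i]
    return coeffs

# Pretend each y is one expensive simulation run; here the "runs" follow an
# exact quadratic so the fit should recover the coefficients.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0 + 0.5 * x - 0.1 * x * x for x in xs]
c0, c1, c2 = fit_quadratic(xs, ys)
assert abs(c0 - 2.0) < 1e-9 and abs(c1 - 0.5) < 1e-9 and abs(c2 + 0.1) < 1e-9
```

Once fitted, evaluating the surrogate costs microseconds, which is what makes the probabilistic (Monte Carlo style) assessment described above affordable.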
Roberts, Fiona E; Goodhand, Kate
2018-03-01
The most memorable learning occurs during placement. Simulated interprofessional learning is a logical learning opportunity to help healthcare professionals work beyond their professional silos. In this qualitative study, we investigated the perceived learning of students from six health professions (adult nursing, diagnostic radiography, occupational therapy, physiotherapy, dietetics, and pharmacy) from their participation in a 45 min interprofessional ward simulation. Semistructured focus groups were undertaken, and data were analyzed using framework analysis. Two overarching themes were evident, each of which had subthemes: (i) the ward simulation as an interprofessional education opportunity (subthemes: reality of situations and interactions); and (ii) the perceived learning achieved (subthemes: professional roles, priorities, respect, communication, teamwork, and quality of care). The results indicated that a short interprofessional ward simulation, unsupported by additional learning opportunities or directed study, is a useful and engaging interprofessional learning opportunity. Students appear to have learnt important key messages central to the interprofessional education curricula to help develop practitioners who can effectively work together as an interprofessional team, and that this learning is partly due to simulation allowing things to go wrong. © 2017 John Wiley & Sons Australia, Ltd.
A comparative analysis of errors in long-term econometric forecasts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tepel, R.
1986-04-01
The growing body of literature that documents forecast accuracy falls generally into two parts. The first is prescriptive and is carried out by modelers who use simulation analysis as a tool for model improvement. These studies are ex post, that is, they make use of known values for exogenous variables and generate an error measure wholly attributable to the model. The second type of analysis is descriptive and seeks to measure errors, identify patterns among errors and variables, and compare forecasts from different sources. Most descriptive studies use an ex ante approach, that is, they evaluate model outputs based on estimated (or forecasted) exogenous variables. In this case, it is the forecasting process, rather than the model, that is under scrutiny. This paper uses an ex ante approach to measure errors in forecast series prepared by Data Resources Incorporated (DRI), Wharton Econometric Forecasting Associates (Wharton), and Chase Econometrics (Chase) and to determine if systematic patterns of errors can be discerned between services, types of variables (by degree of aggregation), length of forecast and time at which the forecast is made. Errors are measured as the percent difference between actual and forecasted values for the historical period of 1971 to 1983.
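The error measure described, the percent difference between actual and forecasted values, can be sketched directly; the series below are hypothetical, not DRI, Wharton, or Chase data.

```python
def percent_errors(actual, forecast):
    """Ex ante error per period: 100 * (actual - forecast) / actual."""
    return [100.0 * (a - f) / a for a, f in zip(actual, forecast)]

def mean_abs_percent_error(actual, forecast):
    """Mean absolute percent error (MAPE) over the forecast horizon."""
    errs = percent_errors(actual, forecast)
    return sum(abs(e) for e in errs) / len(errs)

# Hypothetical three-year series (illustrative values only):
actual = [100.0, 110.0, 121.0]
forecast = [98.0, 112.0, 115.0]
mape = mean_abs_percent_error(actual, forecast)
assert abs(mape - (2.0 + 200.0 / 110.0 + 600.0 / 121.0) / 3.0) < 1e-9
```

Keeping the signed per-period errors (rather than only the aggregate) is what allows the kind of pattern analysis across services, variables, and forecast horizons that the paper performs.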
[3-dimensional cephalometry in orthodontics. The current possibilities of Cepha 3DT software].
Faure, Jacques; Treil, Jacques; Borianne, Philippe; Casteigt, Jean; Baron, Pascal
2002-03-01
A 3D cephalometric analysis method from a scanner acquisition has been developed through a long collaboration between the CIRAD modeling laboratory and Jacques Treil. The model of skeletal description is based on eight landmarks related to the neuromatrical axis of facial growth (heads of the mallei, supraorbital, suborbital, and submental points); it has been described extensively elsewhere. The purpose of this work is to present the dentoalveolar level of the analysis. The description and marking of the arches and the teeth rest mainly on the systematic use of a mathematical tool, the calculation of the central matrix of inertia, and on three fundamental choices: the identification of the dental arches from their constituent teeth, leaving aside any alveolar marking; the marking of each tooth relative to the arch, as it is observed by the orthodontist's eye, rather than relative to the craniofacial architecture; and the definition of the position of each tooth by the orientation of its coronoradicular axis rather than its buccal face alone. Its uses in orthodontics are numerous: diagnosis, choice of mechanics, therapeutic simulation, therapeutic follow-up, analysis of the findings... Clinical applications illustrate the theoretical presentation.
The Azimuth Structure of Nuclear Collisions — I
NASA Astrophysics Data System (ADS)
Trainor, Thomas A.; Kettler, David T.
We describe azimuth structure commonly associated with elliptic and directed flow in the context of 2D angular autocorrelations for the purpose of precise separation of so-called nonflow (mainly minijets) from flow. We extend the Fourier-transform description of azimuth structure to include power spectra and autocorrelations related by the Wiener-Khintchine theorem. We analyze several examples of conventional flow analysis in that context and question the relevance of reaction plane estimation to flow analysis. We introduce the 2D angular autocorrelation with examples from data analysis and describe a simulation exercise which demonstrates precise separation of flow and nonflow using the 2D autocorrelation method. We show that an alternative correlation measure based on Pearson's normalized covariance provides a more intuitive measure of azimuth structure.
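The Wiener-Khintchine relation used above says the autocorrelation of a signal equals the inverse transform of its power spectrum. A small pure-Python check on an azimuth distribution with an elliptic-flow-like cos(2φ) modulation (naive O(n²) DFT, illustrative only, not the paper's analysis code):

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (O(n^2); fine for a short demo signal)."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def autocorr_via_power_spectrum(x):
    """Wiener-Khintchine: circular autocorrelation = inverse transform of |X_j|^2."""
    n = len(x)
    power = [abs(c) ** 2 for c in dft(x)]
    return [(sum(power[j] * cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)) / n).real
            for k in range(n)]

def autocorr_direct(x):
    """Direct circular autocorrelation, for comparison."""
    n = len(x)
    return [sum(x[i] * x[(i + k) % n] for i in range(n)) for k in range(n)]

# An azimuth distribution with a cos(2*phi) ("elliptic") modulation:
n = 16
signal = [1.0 + 0.3 * math.cos(2 * (2 * math.pi * i / n)) for i in range(n)]
a_wk = autocorr_via_power_spectrum(signal)
a_dir = autocorr_direct(signal)
assert all(abs(p - q) < 1e-9 for p, q in zip(a_wk, a_dir))
```

Working in autocorrelation space, as the paper advocates, sidesteps event-by-event reaction-plane estimation entirely: the cos(2φ) modulation survives in the autocorrelation regardless of the (unknown) event orientation.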
NASA Technical Reports Server (NTRS)
Burgin, G. H.; Fogel, L. J.; Phelps, J. P.
1975-01-01
A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch processing version, the flight paths of two aircraft engaged in interactive aerial combat and controlled by the same logic are computed. The real-time version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.
Digital systems design language
NASA Technical Reports Server (NTRS)
Shiva, S. G.
1979-01-01
Digital Systems Design Language (DDL) is implemented on the SEL-32 Computer Systems. Details of the language, the translator, and the simulator programs are given. Several example descriptions and a tutorial on hardware description languages are provided to guide the user.
Software Geometry in Simulations
NASA Astrophysics Data System (ADS)
Alion, Tyler; Viren, Brett; Junk, Tom
2015-04-01
The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other Liquid Argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it be a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way which allows multiple authors to easily collaborate. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett Viren is the author of the GGD framework discussed here.
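For readers unfamiliar with the target format: GDML is an XML schema, so generating it programmatically reduces to emitting the right elements. A toy sketch below; the solid name and dimensions are invented, and this is not GGD's actual interface, only an illustration of the general task.

```python
import xml.etree.ElementTree as ET

def make_box_gdml(name, dx, dy, dz):
    """Emit a minimal GDML fragment containing a single box solid.
    The <gdml>/<solids>/<box> element and attribute names follow the GDML
    schema; the solid name and dimensions here are illustrative assumptions."""
    gdml = ET.Element("gdml")
    solids = ET.SubElement(gdml, "solids")
    ET.SubElement(solids, "box", name=name, lunit="cm",
                  x=str(dx), y=str(dy), z=str(dz))
    return ET.tostring(gdml, encoding="unicode")

doc = make_box_gdml("cryostat", 200.0, 200.0, 500.0)
assert "<solids>" in doc and 'name="cryostat"' in doc
```

Generating the XML from code rather than editing it by hand is precisely what makes version tracking and multi-author collaboration on a geometry tractable.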
Cafferkey, Aine; Coyle, Elizabeth; Greaney, David; Harte, Sinead; Hayes, Niamh; Langdon, Miriam; Straub, Birgitt; Burlacu, Crina
2018-03-20
Simulation-based education is a modern training modality that allows healthcare professionals to develop knowledge and practice skills in a safe learning environment. The College of Anaesthetists of Ireland (CAI) was the first Irish postgraduate medical training body to introduce mandatory simulation training into its curriculum. Extensive quality assurance and improvement data have been collected on all simulation courses to date. Our aim was to describe the College of Anaesthetists of Ireland Simulation Training (CAST) programme and report the analysis of course participants' feedback. A retrospective review of feedback forms from four simulation courses from March 2010 to August 2016 took place. Qualitative and quantitative data from 1069 participants who attended 112 courses were analysed. Feedback was overall very positive. Course content and delivery were deemed to be appropriate. Participants agreed that course participation would influence their future practice. A statistically significant difference (P < 0.001) between self-reported pre- and post-course confidence scores was observed in 19 out of 25 scenarios. The learning environment, learning method and debrief were highlighted as aspects of the courses that participants liked most. The mandatory integration of CAST has been welcomed with widespread enthusiasm among specialist anaesthesia trainees. Intuitively, course participation instils confidence in trainees and better equips them to manage anaesthesia emergencies in the clinical setting. It remains to be seen if translational outcomes result from this increase in confidence. Nevertheless, the findings of this extensive review have cemented the place of mandatory simulation training in specialist anaesthesia training in Ireland.
NASA Technical Reports Server (NTRS)
Plitau, Denis; Prasad, Narasimha S.
2012-01-01
The Active Sensing of CO2 Emissions over Nights Days and Seasons (ASCENDS) mission recommended by the NRC Decadal Survey has a desired accuracy of 0.3% in carbon dioxide mixing ratio (XCO2) retrievals, requiring careful selection and optimization of the instrument parameters. NASA Langley Research Center (LaRC) is investigating the 1.57 micron carbon dioxide band as well as the 1.26-1.27 micron oxygen bands for the proposed ASCENDS mission requirements investigation. Simulation studies are underway for these bands to select optimum instrument parameters. The simulations are based on a multi-wavelength lidar modeling framework being developed at NASA LaRC to predict the performance of CO2 and O2 sensing from space and airborne platforms. The modeling framework consists of a lidar simulation module and a line-by-line calculation component with interchangeable lineshape routines to test the performance of alternative lineshape models in the simulations. As an option, the line-by-line radiative transfer model (LBLRTM) program may also be used for line-by-line calculations. The modeling framework is being used to perform error analysis, establish optimum measurement wavelengths, and identify the best lineshape models to be used in CO2 and O2 retrievals. Several additional programs for HITRAN database management and related simulations are planned to be included in the framework. The description of the modeling framework with selected results of the simulation studies for CO2 and O2 sensing is presented in this paper.
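The line-by-line idea can be illustrated with a toy single-line Beer-Lambert transmission using a pressure-broadened (Lorentzian) lineshape. The line parameters below are hypothetical stand-ins, not HITRAN values, and real retrievals would use Voigt or more refined lineshape models, as the abstract notes.

```python
import math

def lorentzian(nu, nu0, gamma):
    """Pressure-broadened (Lorentzian) lineshape, normalized to unit area in nu."""
    return (gamma / math.pi) / ((nu - nu0) ** 2 + gamma ** 2)

def transmission(nu, lines, column):
    """Beer-Lambert transmission exp(-tau) for a set of (center, strength,
    halfwidth) lines; a toy stand-in for a full line-by-line code such as LBLRTM."""
    tau = column * sum(s * lorentzian(nu, nu0, g) for nu0, s, g in lines)
    return math.exp(-tau)

# One hypothetical line near 6361 cm^-1, i.e. in the 1.57 micron CO2 region
# (line center, strength, and width are invented for the demonstration):
lines = [(6361.25, 1.0e-3, 0.07)]
t_center = transmission(6361.25, lines, 100.0)
t_wing = transmission(6362.25, lines, 100.0)
assert t_center < t_wing < 1.0
```

The on-line versus off-line transmission contrast computed this way is the basic quantity behind differential-absorption wavelength selection in such instrument studies.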
NASA Astrophysics Data System (ADS)
Tamazian, A.; Nguyen, V. D.; Markelov, O. A.; Bogachev, M. I.
2016-07-01
We suggest a universal phenomenological description for the collective access patterns in the Internet traffic dynamics, both at local and wide area network levels, that takes into account erratic fluctuations imposed by cooperative user behaviour. Our description is based on the superstatistical approach and leads to q-exponential inter-session time and session size distributions that are also in perfect agreement with empirical observations. The validity of the proposed description is confirmed explicitly by the analysis of complete 10-day traffic traces from the WIDE backbone link and from the local campus area network downlink from the Internet Service Provider. Remarkably, the same functional forms have been observed in the historic access patterns from single WWW servers. The suggested approach effectively accounts for the complex interplay of both “calm” and “bursty” user access patterns within a single-model setting. It also provides average sojourn time estimates with reasonable accuracy, as indicated by the queuing system performance simulation, thereby largely overcoming the failure of Poisson modelling of the Internet traffic dynamics.
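The q-exponential distributions mentioned arise naturally from the superstatistical approach: for 1 ≤ q < 2 the density (2−q)·λ·e_q(−λx) is normalized on x ≥ 0, has a power-law tail for q > 1, and recovers the ordinary exponential as q → 1. A minimal sketch (parameter values are illustrative, not fitted to the WIDE traces):

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential e_q(x) = [1 + (1-q)x]_+^(1/(1-q)); -> exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

def q_exp_pdf(x, q, lam):
    """q-exponential density (2-q)*lam*e_q(-lam*x) on x >= 0, valid for 1 <= q < 2."""
    return (2.0 - q) * lam * q_exp(-lam * x, q)

# q -> 1 recovers the ordinary exponential; q > 1 produces the heavy tail
# characteristic of bursty inter-session times and session sizes.
assert abs(q_exp(-1.0, 1.000001) - math.exp(-1.0)) < 1e-4
assert q_exp_pdf(10.0, 1.5, 2.0) > 2.0 * math.exp(-2.0 * 10.0)
```

The second assertion makes the qualitative point: at large x the q-exponential tail dominates the exponential one by many orders of magnitude, which is exactly where Poisson-style modelling fails.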
NASA Technical Reports Server (NTRS)
Bodley, C. S.; Devers, D. A.; Park, C. A.
1975-01-01
A theoretical development and associated digital computer program system is presented. The dynamic system (spacecraft) is modeled as an assembly of rigid and/or flexible bodies not necessarily in a topological tree configuration. The computer program system may be used to investigate total system dynamic characteristics, including interaction effects between rigid and/or flexible bodies, control systems, and a wide range of environmental loadings. Additionally, the program system may be used for design of attitude control systems and for evaluation of total dynamic system performance, including time domain response and frequency domain stability analyses. Volume 1 presents the theoretical developments, including a description of the physical system, the equations of dynamic equilibrium, discussion of kinematics and system topology, a complete treatment of momentum wheel coupling, and a discussion of gravity gradient and environmental effects. Volume 2 is a program users' guide and includes a description of the overall digital program code, individual subroutines, and a description of required program input and generated program output. Volume 3 presents the results of selected demonstration problems that illustrate all program system capabilities.
Heat Flow Measurement and Analysis of Thermal Vacuum Insulation
NASA Astrophysics Data System (ADS)
Laa, C.; Hirschl, C.; Stipsitz, J.
2008-03-01
A new kind of calorimeter has been developed at Austrian Aerospace to measure specific material parameters needed for the analysis of thermal vacuum insulation. A detailed description of the measuring device and the measurement results is given in this paper. This calorimeter facility makes it possible to measure the heat flow through the insulation under vacuum conditions in a wide temperature range from liquid nitrogen to ambient. Both boundary temperatures can be chosen within this range. Furthermore, the insulation can be characterized at high vacuum or under degraded vacuum; the latter is simulated using helium or nitrogen gas. The mechanisms of heat transfer have been investigated, namely infrared radiation between the reflective layers of the insulation and conduction through the interleaving spacer material. A mathematical description of the heat flow through the insulation has been derived. Based on this, the heat flow for a typical insulation material has been calculated by finite element analysis using the software tool Ansys®. Such a transient calculation is needed to determine the time to reach thermal equilibrium, which is mandatory for a proper interpretation and evaluation of the measurement. The new insulation measurement results combined with the proposed type of analysis can be applied to better understand the thermal behavior of any kind of cryogenic system.
Data Association Algorithms for Tracking Satellites
2013-03-27
validation of the new tools. The description provided here includes the mathematical background and description of the models implemented, as well as a...simulation development. This work includes the addition of higher-fidelity models in CU-TurboProp and validation of the new tools. The description...ode45(), used in Ananke, and (3) provide the necessary inputs to the bidirectional reflectance distribution function (BRDF) model provided by Pacific
Information system analysis of an e-learning system used for dental restorations simulation.
Bogdan, Crenguţa M; Popovici, Dorin M
2012-09-01
The goal of using virtual and augmented reality technologies for simulating therapeutic interventions in fixed prosthodontics (the VirDenT project) is to increase the quality of the educational process in dental faculties, by assisting students in learning how to prepare teeth for all-ceramic restorations. Its main component is an e-learning virtual reality-based software system that will be used for developing the tooth-grinding skills needed in all-ceramic restorations. The complexity of the domain problem that the software system deals with made an analysis of the information system supported by VirDenT necessary. The analysis comprises the following activities: identification and classification of the system stakeholders, description of the business processes, formulation of the business rules, and modelling of business objects. During this stage, we constructed the context diagram, the business use case diagram, the activity diagrams and the class diagram of the domain model. These models are useful for the further development of the software system that implements the VirDenT information system. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Physiological Environment Induces Quick Response – Slow Exhaustion Reactions
Hiroi, Noriko; Lu, James; Iba, Keisuke; Tabira, Akito; Yamashita, Shuji; Okada, Yasunori; Flamm, Christoph; Oka, Kotaro; Köhler, Gottfried; Funahashi, Akira
2011-01-01
In vivo environments are highly crowded and inhomogeneous, which may affect reaction processes in cells. In this study we examined the effects of intracellular crowding and inhomogeneity on the behavior of in vivo reactions by calculating the spectral dimension (ds), which can be translated into the reaction rate function. We compared estimates of anomaly parameters obtained from fluorescence correlation spectroscopy (FCS) data with fractal dimensions derived from transmission electron microscopy (TEM) image analysis. FCS analysis indicated that the anomalous property was linked to physiological structure. Subsequent TEM analysis provided an in vivo illustration; soluble molecules likely percolate between intracellular clusters, which are constructed in a self-organizing manner. We estimated the cytoplasmic spectral dimension ds to be 1.39 ± 0.084. This result suggests that in vivo reactions initially run faster than the same reactions in a homogeneous space; this conclusion is consistent with the anomalous character indicated by FCS analysis. We further showed that these results were compatible with our Monte Carlo simulation, in which the anomalous behavior of mobile molecules correlates with the intracellular environment, leading to its description as a percolation cluster, as demonstrated using TEM analysis. We confirmed by the simulation that the above-mentioned in vivo-like properties differ from those of homogeneously concentrated environments. Additionally, simulation results indicated that the crowding level of an environment might affect the diffusion rate of reactants. Such spatial information enables us to construct realistic models for in vivo diffusion and reaction systems. PMID:21960972
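The link between a percolation-like environment and slowed transport can be illustrated with a toy Monte Carlo model (not the authors' simulation): a blind-ant random walk on a 2D site-percolation lattice subdiffuses relative to the same walk on a full lattice, which is the qualitative signature behind a spectral dimension ds < 2. The lattice size, occupation probability, and walker counts below are arbitrary choices.

```python
import random

def percolation_msd(p, n_walkers=300, n_steps=500, size=256, seed=1):
    """Blind-ant random walk on a 2D site-percolation lattice with periodic
    boundaries; returns the mean squared displacement after n_steps."""
    rng = random.Random(seed)
    occupied = [[rng.random() < p for _ in range(size)] for _ in range(size)]
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    msd = 0.0
    placed = 0
    while placed < n_walkers:
        x0, y0 = rng.randrange(size), rng.randrange(size)
        if not occupied[x0][y0]:
            continue                      # walkers start on occupied sites
        placed += 1
        x, y = x0, y0                     # unwrapped coordinates
        for _ in range(n_steps):
            dx, dy = rng.choice(moves)
            nx, ny = x + dx, y + dy
            if occupied[nx % size][ny % size]:
                x, y = nx, ny             # step onto an occupied neighbour
            # otherwise the blind ant stays put for this step
        msd += (x - x0) ** 2 + (y - y0) ** 2
    return msd / n_walkers

msd_free = percolation_msd(1.0)           # fully occupied lattice: normal diffusion
msd_dilute = percolation_msd(0.65)        # above the 2D threshold p_c ~ 0.593
```

On the full lattice the MSD grows linearly with time; on the diluted lattice, dead ends and finite clusters suppress it, the same qualitative effect FCS detects as anomalous diffusion.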
78 FR 66099 - Petition for Exemption; Summary of Petition Received
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-04
..., Office of Rulemaking. Petition for Exemption Docket No.: FAA-2013-0818. Petitioner: ELITE Simulation Solutions. Section of 14 CFR Affected: Sec. 61.65(i) Description of Relief Sought: ELITE Simulation...
Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V
2009-01-01
The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties, while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are considered as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standard Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S(NH)) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S(NO)) uncertainty, at the cost of increased economic expense and variability as a trade-off. Finally, the maximum specific autotrophic growth rate (μ(A)) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification-related parameters, e.g. η(g) (anoxic growth rate correction factor) and η(h) (anoxic hydrolysis rate correction factor), becomes less important when a S(NO) controller manipulating an external carbon source addition is implemented.
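The SRC step can be sketched as follows. The toy linear "model" standing in for BSM1 is an assumption for illustration only; the SRC recipe itself is standard: regress standardized outputs on standardized inputs, and the squared coefficients apportion the output variance when the model is near-linear (their sum approaches R²).

```python
import numpy as np

def src(X, y):
    """Standardized regression coefficients from Monte Carlo samples:
    fit a linear model after standardizing inputs and output."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (y - y.mean()) / y.std()
    coef, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    return coef

rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 3))                    # Monte Carlo input samples
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.3 * X[:, 2]     # toy stand-in for the model
beta = src(X, y)                                      # one SRC per input parameter
```

Ranking |beta| identifies the dominant input, exactly the way μ(A) is singled out in the paper; here the first input dominates by construction.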
ERIC Educational Resources Information Center
Akiba, Y.; And Others
This user's manual for the simulation program Graphical Evaluation and Review Technique (GERT) GQ contains sections on nodes, branches, program input description and format, and program output, as well as examples. Also included is a programmer's manual which contains information on scheduling, subroutine descriptions, COMMON Variables, and…
Digital systems design language. Design synthesis of digital systems
NASA Technical Reports Server (NTRS)
Shiva, S. G.
1979-01-01
The Digital Systems Design Language (DDL) is implemented on the SEL-32 computer systems. The details of the language, translator and simulator programs are included. Several example descriptions and a tutorial on hardware description languages are provided, to guide the user.
Software quality and process improvement in scientific simulation codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ambrosiano, J.; Webster, R.
1997-11-01
This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. This study is descriptive rather than scientific.
Transport properties at fluid interfaces: a molecular study for a macroscopic modelling
NASA Astrophysics Data System (ADS)
Russo, Antonio; Morciano, Matteo; Sibley, David N.; Nold, Andreas; Goddard, Benjamin D.; Asinari, Pietro; Kalliadasis, Serafim
2017-11-01
Rapid developments in the field of micro- and nano-fluidics require detailed analysis of the properties of matter at the molecular level. But despite numerous works in the literature, appropriate macroscopic relations able to integrate a microscopic description of fluid and soft matter properties at liquid-vapour and multi-fluid interfaces are missing. As a consequence, studies on interfacial phenomena and micro-device designs often rely on oversimplified assumptions, e.g. that the viscosities can be considered constant across interfaces. In our work, we present non-equilibrium MD simulations to scrutinise efficiently and systematically, through the tools of statistical mechanics, the anisotropic properties of fluids, namely density variations, stress tensor, and shear viscosity, at the fluid interfaces between liquid and vapour and between two partially miscible fluids. Our analysis has led to the formulation of a general relation between shear viscosity and density variations, validated for a wide spectrum of interfacial fluid problems. In addition, it provides a rational description of other interfacial quantities of interest, including surface tension and its origins, and more generally, it offers valuable insight into molecular transport phenomena at interfaces.
Ravik, Monika; Havnes, Anton; Bjørk, Ida Torunn
2017-12-01
To explore, describe and compare learning actions that nursing students used during peripheral vein cannulation training on a latex arm or on each other's arms in a clinical skills centre. Simulation-based training is thought to enhance learning and transfer of learning from simulation to the clinical setting and is commonly recommended in nursing education. What students actually do during simulation-based training is, however, less explored. The analysis of learning actions used during simulation-based training could contribute to the development and improvement of simulation as a learning strategy in nursing education. A qualitative explorative and descriptive research design, involving content analysis of video recordings, was used. Video-supported observation of nine nursing students practicing vein cannulation was conducted in a clinical skills centre in late 2012. The students engaged in various learning actions. Students training on a latex arm used a considerably higher number of learning actions relative to those training on each other's arms. In both groups, students' learning actions consisted mainly of seeking and giving support. The teacher provided students training on each other's arms with detailed feedback regarding insertion of the cannula into the vein, while those training on a latex arm received sparse feedback from the teacher and fellow students. The teacher played an important role in facilitating nursing students' practical skill learning during simulation. The provision of support from both teachers and students should be emphasised to ensure that nursing students' learning needs are met. This study suggests that student nurses may be prepared differently, and perhaps inadequately, in peripheral vein cannulation by the two simulation modalities used in the academic setting: training on a latex arm and training on each other's arms. © 2017 John Wiley & Sons Ltd.
A trial of e-simulation of sudden patient deterioration (FIRST2ACT WEB) on student learning.
Bogossian, Fiona E; Cooper, Simon J; Cant, Robyn; Porter, Joanne; Forbes, Helen
2015-10-01
High-fidelity simulation pedagogy is of increasing importance in health professional education; however, face-to-face simulation programs are resource intensive and impractical to implement across large numbers of students. To investigate undergraduate nursing students' theoretical and applied learning in response to the e-simulation program FIRST2ACT WEB™, and explore predictors of virtual clinical performance. Multi-center trial of FIRST2ACT WEB™, accessible to students in five Australian universities and colleges across 8 campuses. A population of 489 final-year nursing students in programs of study leading to license to practice. Participants proceeded through three phases: (i) pre-simulation briefing and assessment of clinical knowledge and experience; (ii) e-simulation, comprising three interactive e-simulation clinical scenarios which included video recordings of patients with deteriorating conditions, interactive clinical tasks, pop-up responses to tasks, and timed performance; and (iii) post-simulation feedback and evaluation. Descriptive statistics were followed by bivariate analysis to detect any associations, which were further tested using standard regression analysis. Of 409 students who commenced the program (83% response rate), 367 undergraduate nursing students completed the web-based program in its entirety, yielding a completion rate of 89.7%; 38.1% of students achieved passing clinical performance across three scenarios, and the proportion achieving passing clinical knowledge increased from 78.15% pre-simulation to 91.6% post-simulation. Knowledge was the main independent predictor of clinical performance in responding to a virtual deteriorating patient (R² = 0.090, F(7, 352) = 4.962, p < 0.001). The use of web-based technology allows simulation activities to be accessible to a large number of participants, and completion rates indicate that 'Net Generation' nursing students were highly engaged with this mode of learning.
The web-based e-simulation program FIRST2ACT™ effectively enhanced knowledge, virtual clinical performance, and self-assessed knowledge, skills, confidence, and competence in final-year nursing students. Copyright © 2015 Elsevier Ltd. All rights reserved.
Learning from avatars: Learning assistants practice physics pedagogy in a classroom simulator
NASA Astrophysics Data System (ADS)
Chini, Jacquelyn J.; Straub, Carrie L.; Thomas, Kevin H.
2016-06-01
[This paper is part of the Focused Collection on Preparing and Supporting University Physics Educators.] Undergraduate students are increasingly being used to support course transformations that incorporate research-based instructional strategies. While such students are typically selected based on strong content knowledge and possible interest in teaching, they often do not have previous pedagogical training. Current training models use real students, or classmates role-playing as students, as the test subjects. We present a new environment for facilitating the practice of physics pedagogy skills, a highly immersive mixed-reality classroom simulator, and assess its effectiveness for undergraduate physics learning assistants (LAs). LAs prepared, taught, and reflected on a lesson about motion graphs for five highly interactive computer-generated student avatars in the mixed-reality classroom simulator. To assess the effectiveness of the simulator for this population, we analyzed the pedagogical skills LAs intended to practice and exhibited during their lessons and explored LAs' descriptions of their experiences with the simulator. Our results indicate that the classroom simulator created a safe, effective environment for LAs to practice a variety of skills, such as questioning styles and wait time. Additionally, our analysis revealed areas for improvement in our preparation of LAs and use of the simulator. We conclude with a summary of research questions this environment could facilitate.
A software framework for pipelined arithmetic algorithms in field programmable gate arrays
NASA Astrophysics Data System (ADS)
Kim, J. B.; Won, E.
2018-03-01
Pipelined algorithms implemented in field programmable gate arrays are extensively used for hardware triggers in modern experimental high energy physics, and the complexity of such algorithms increases rapidly. For the development of such hardware triggers, algorithms are developed in C++, ported to a hardware description language for synthesizing firmware, and then ported back to C++ for simulating the firmware response down to the single-bit level. We present a C++ software framework which automatically simulates and generates hardware description language code for pipelined arithmetic algorithms.
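The bit-level simulation idea can be sketched generically (this is not the presented framework, and the design below is invented for illustration): a pipelined multiply-accumulate unit simulated cycle by cycle in software, with fixed bit widths enforced by masking, so the software result reproduces the registered hardware behaviour including pipeline latency.

```python
MASK16 = (1 << 16) - 1
MASK32 = (1 << 32) - 1

class PipelinedMAC:
    """Three-stage pipeline: multiply -> register -> accumulate.
    All arithmetic is truncated to fixed bit widths, as in firmware."""
    def __init__(self):
        self.stage1 = 0   # product register, pipeline stage 1
        self.stage2 = 0   # product register, pipeline stage 2
        self.acc = 0      # 32-bit accumulator

    def tick(self, a, b):
        # Emulate simultaneous (nonblocking) register updates on the clock
        # edge: each register reads the *old* value of the one before it.
        self.acc = (self.acc + self.stage2) & MASK32
        self.stage2 = self.stage1
        self.stage1 = ((a & MASK16) * (b & MASK16)) & MASK32
        return self.acc
```

Feeding a data stream and then flushing with zeros for the two cycles of latency yields the same 32-bit sum a reference computation produces, which is the property a firmware/software cross-check verifies.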
Papaleo, Elena; Mereghetti, Paolo; Fantucci, Piercarlo; Grandori, Rita; De Gioia, Luca
2009-01-01
Several molecular dynamics (MD) simulations were used to sample conformations in the neighborhood of the native structure of holo-myoglobin (holo-Mb), collecting trajectories spanning 0.22 μs at 300 K. Principal component (PCA) and free-energy landscape (FEL) analyses, integrated by cluster analysis performed on the positions and structures of the individual helices of the globin fold, were carried out. The coherence between the different structural clusters and the basins of the FEL, together with the convergence of parameters derived by PCA, indicates that an accurate description of the Mb conformational space around the native state was achieved by multiple MD trajectories spanning at least 0.14 μs. The integration of FEL, PCA, and structural clustering was shown to be a very useful approach to gain an overall view of the conformational landscape accessible to a protein and to identify representative protein substates. This method could also be used to investigate the conformational and dynamical properties of apo, mutant, or deletion forms of Mb, in which greater conformational variability is expected and, therefore, identification of representative substates from the simulations is relevant to disclose structure-function relationships.
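The PCA/FEL machinery can be sketched on a toy trajectory (the myoglobin trajectories themselves are not reproducible here); the two-basin synthetic data and all dimensions below are illustrative assumptions.

```python
import numpy as np

def pca(traj):
    """PCA of a (frames x coordinates) trajectory: eigendecomposition of
    the covariance matrix, components sorted by decreasing variance."""
    centered = traj - traj.mean(axis=0)
    cov = centered.T @ centered / (len(traj) - 1)
    evals, evecs = np.linalg.eigh(cov)
    order = np.argsort(evals)[::-1]
    return evals[order], evecs[:, order], centered @ evecs[:, order]

def free_energy_landscape(pc1, pc2, bins=30):
    """FEL in units of kT over the first two PCs: F = -ln P,
    shifted so the deepest basin sits at zero."""
    hist, _, _ = np.histogram2d(pc1, pc2, bins=bins, density=True)
    with np.errstate(divide="ignore"):
        fel = -np.log(hist)
    return fel - fel.min()

# Toy trajectory: two anisotropic "conformational substates"
rng = np.random.default_rng(3)
a = rng.standard_normal((2000, 6)) * [3.0, 1.5, 1.0, 0.5, 0.3, 0.2]
b = a.copy()
b[:, 0] += 8.0                     # second basin shifted along the softest mode
traj = np.vstack([a, b])
evals, evecs, proj = pca(traj)
fel = free_energy_landscape(proj[:, 0], proj[:, 1])
```

The two basins separate along the first principal component, and the FEL shows two minima, which is the kind of picture used in the paper to identify representative substates.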
A Facility and Architecture for Autonomy Research
NASA Technical Reports Server (NTRS)
Pisanich, Greg; Clancy, Daniel (Technical Monitor)
2002-01-01
Autonomy is a key enabling factor in the advancement of remote robotic exploration. There is currently a large gap between autonomy software at the research level and software that is ready for insertion into near-term space missions. The Mission Simulation Facility (MSF) will bridge this gap by providing a simulation framework and suite of simulation tools to support research in autonomy for remote exploration. This system will allow developers of autonomy software to test their models in a high-fidelity simulation and evaluate their system's performance against a set of integrated, standardized simulations. The Mission Simulation ToolKit (MST) uses a distributed architecture with a communication layer that is built on top of the standardized High Level Architecture (HLA). This architecture enables the use of existing high-fidelity models, allows the mixing of simulation components from various computing platforms, and enforces the use of a standardized high-level interface among components. The components needed to achieve a realistic simulation can be grouped into four categories: environment generation (terrain, environmental features), robotic platform behavior (robot dynamics), instrument models (camera, spectrometer, etc.), and data analysis. The MST will provide basic components in these areas but allows users to easily plug in any refined model by means of a communication protocol. Finally, a description file defines the robot and environment parameters for easy configuration and ensures that all the simulation models share the same information.
Kurkal-Siebert, Vandana; Smith, Jeremy C
2006-02-22
An understanding of low-frequency, collective protein dynamics at low temperatures can furnish valuable information on functional protein energy landscapes, on the origins of the protein glass transition and on protein-protein interactions. Here, molecular dynamics (MD) simulations and normal-mode analyses are performed on various models of crystalline myoglobin in order to characterize intra- and interprotein vibrations at 150 K. Principal component analysis of the MD trajectories indicates that the Boson peak, a broad peak in the dynamic structure factor centered at approximately 2-2.5 meV, originates from roughly 10² collective, harmonic vibrations. An accurate description of the environment is found to be essential in reproducing the experimental Boson peak form and position. At lower energies other strong peaks are found in the calculated dynamic structure factor. Characterization of these peaks shows that they arise from harmonic vibrations of proteins relative to each other. These vibrations are likely to furnish valuable information on the physical nature of protein-protein interactions.
Revealing spatially heterogeneous relaxation in a model nanocomposite.
Cheng, Shiwang; Mirigian, Stephen; Carrillo, Jan-Michael Y; Bocharova, Vera; Sumpter, Bobby G; Schweizer, Kenneth S; Sokolov, Alexei P
2015-11-21
The detailed nature of spatially heterogeneous dynamics of glycerol-silica nanocomposites is unraveled by combining dielectric spectroscopy with atomistic simulation and statistical mechanical theory. Analysis of the spatial mobility gradient shows no "glassy" layer, but the α-relaxation time near the nanoparticle grows with cooling faster than the α-relaxation time in the bulk and is ∼20 times longer at low temperatures. The interfacial layer thickness increases from ∼1.8 nm at higher temperatures to ∼3.5 nm upon cooling to near bulk Tg. A real space microscopic description of the mobility gradient is constructed by synergistically combining high temperature atomistic simulation with theory. Our analysis suggests that the interfacial slowing down arises mainly due to an increase of the local cage scale barrier for activated hopping induced by enhanced packing and densification near the nanoparticle surface. The theory is employed to predict how local surface densification can be manipulated to control layer dynamics and shear rigidity over a wide temperature range.
Schroeder, Indra
2015-01-01
A main ingredient for the understanding of structure/function correlates of ion channels is the quantitative description of single-channel gating and conductance. However, a wealth of information provided by fast current fluctuations beyond the temporal resolution of the recording system is often ignored, even though it is close to the time window accessible to molecular dynamics simulations. These current fluctuations pose a special technical challenge, because individual opening/closing or blocking/unblocking events cannot be resolved, and the resulting averaging over undetected events decreases the apparent single-channel current. Here, I briefly summarize the history of fast-current fluctuation analysis and focus on the so-called “beta distributions.” This tool exploits characteristics of fluctuation-induced excess noise on the current amplitude histograms to reconstruct the true single-channel current and kinetic parameters. A guideline for the analysis and recent applications demonstrate that a construction of theoretical beta distributions by Markov model simulations offers maximum flexibility as compared to analytical solutions. PMID:26368656
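The averaging artifact that beta-distribution analysis corrects for can be demonstrated with a minimal two-state Markov simulation; the switching probability, amplitude, and the moving-average stand-in for the recording bandwidth are all illustrative choices (a real analysis would fit theoretical beta distributions to the resulting amplitude histogram).

```python
import random

def simulate_channel(n, p_switch, amp, seed=7):
    """Two-state (open/closed) Markov gating sampled at the ADC rate."""
    rng = random.Random(seed)
    state, trace = 0, []
    for _ in range(n):
        if rng.random() < p_switch:
            state = 1 - state
        trace.append(state * amp)
    return trace

def moving_average(trace, window):
    """Crude stand-in for the recording system's limited bandwidth."""
    out = []
    acc = sum(trace[:window])
    out.append(acc / window)
    for i in range(window, len(trace)):
        acc += trace[i] - trace[i - window]
        out.append(acc / window)
    return out

raw = simulate_channel(50_000, 0.4, 1.0)   # gating much faster than the filter
filt = moving_average(raw, 20)
```

After filtering, the histogram collapses toward the occupancy-weighted mean and essentially never reaches the true unitary amplitude; reconstructing that amplitude from the excess-noise shape is exactly what the beta-distribution method does.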
NASA Astrophysics Data System (ADS)
Erokhin, Sergey; Berkov, Dmitry; Ito, Masaaki; Kato, Akira; Yano, Masao; Michels, Andreas
2018-03-01
We demonstrate how micromagnetic simulations can be employed in order to characterize and analyze the magnetic microstructure of nanocomposites. For the example of nanocrystalline Nd-Fe-B, which is a potential material for future permanent-magnet applications, we have compared three different models for the micromagnetic analysis of this material class: (i) a description of the nanocomposite microstructure in terms of Stoner-Wohlfarth particles with and without the magnetodipolar interaction; (ii) a model based on the core-shell representation of the nanograins; (iii) the latter model including a contribution of superparamagnetic clusters. The relevant parameter spaces have been systematically scanned with the aim of establishing which micromagnetic approach can most adequately describe experimental data for this material. According to our results, only the last, most sophisticated model is able to provide an excellent agreement with the measured hysteresis loop. The presented methodology is generally applicable to multiphase magnetic nanocomposites and it highlights the complex interrelationship between the microstructure, magnetic interactions, and the macroscopic magnetic properties.
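The simplest of the three models, a non-interacting Stoner-Wohlfarth particle, can be sketched for a single grain; the reduced units, the 45° easy-axis angle, and the overdamped relaxation scheme are illustrative choices, not the authors' implementation.

```python
import math

def relax(theta, h, psi, iters=3000, step=0.05):
    """Overdamped relaxation to a local minimum of the reduced
    Stoner-Wohlfarth energy u = 0.5*sin^2(theta - psi) - h*cos(theta)."""
    for _ in range(iters):
        grad = 0.5 * math.sin(2.0 * (theta - psi)) + h * math.sin(theta)
        theta -= step * grad
    return theta

def hysteresis_branch(h_values, psi, theta0):
    """Follow one field-sweep branch; returns m = cos(theta) at each field."""
    theta, m = theta0, []
    for h in h_values:
        theta = relax(theta, h, psi)
        m.append(math.cos(theta))
    return m

psi = math.pi / 4                              # easy axis at 45 degrees
down = [1.5 - 0.01 * i for i in range(301)]    # reduced field swept +1.5 -> -1.5
m_down = hysteresis_branch(down, psi, 0.1)
```

Tracking the local minimum as the field sweeps reproduces the hysteretic branch: the magnetization stays positive past zero field and switches abruptly near the reduced switching field h_sw = 0.5 for a 45° easy axis.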
Revealing spatially heterogeneous relaxation in a model nanocomposite
Cheng, Shiwang; Mirigian, Stephen; Carrillo, Jan-Michael Y.; ...
2015-11-18
The detailed nature of spatially heterogeneous dynamics of glycerol-silica nanocomposites is unraveled by combining dielectric spectroscopy with atomistic simulation and statistical mechanical theory. Analysis of the spatial mobility gradient shows no glassy layer, but the α-relaxation time near the nanoparticle grows with cooling faster than the α-relaxation time in the bulk and is ~20 times longer at low temperatures. The interfacial layer thickness increases from ~1.8 nm at higher temperatures to ~3.5 nm upon cooling to near bulk Tg. A real space microscopic description of the mobility gradient is constructed by synergistically combining high temperature atomistic simulation with theory. Our analysis suggests that the interfacial slowing down arises mainly due to an increase of the local cage scale barrier for activated hopping induced by enhanced packing and densification near the nanoparticle surface. As a result, the theory is employed to predict how local surface densification can be manipulated to control layer dynamics and shear rigidity over a wide temperature range.
Analytical investigation of the dynamics of tethered constellations in Earth orbit, phase 2
NASA Technical Reports Server (NTRS)
Lorenzini, Enrico C.; Gullahorn, Gordon E.; Cosmo, Mario L.; Estes, Robert D.; Grossi, Mario D.
1994-01-01
This final report covers nine years of research on future tether applications and on the actual flights of the Small Expendable Deployment System (SEDS). Topics covered include: (1) a description of numerical codes used to simulate the orbital and attitude dynamics of tethered systems during station keeping and deployment maneuvers; (2) a comparison of various tethered system simulators; (3) dynamics analysis, conceptual design, potential applications and propagation of disturbances and isolation from noise of a variable gravity/microgravity laboratory tethered to the Space Station; (4) stability of a tethered space centrifuge; (5) various proposed two-dimensional tethered structures for low Earth orbit for use as planar array antennas; (6) tethered high gain antennas; (7) numerical calculation of the electromagnetic wave field on the Earth's surface on an electrodynamically tethered satellite; (8) reentry of tethered capsules; (9) deployment dynamics of SEDS-1; (10) analysis of SEDS-1 flight data; and (11) dynamics and control of SEDS-2.
Mean-field theory of active electrolytes: Dynamic adsorption and overscreening
NASA Astrophysics Data System (ADS)
Frydel, Derek; Podgornik, Rudolf
2018-05-01
We investigate active electrolytes within the mean-field level of description. The focus is on how the double-layer structure of passive, thermalized charges is affected by the active dynamics of the constituent ions. One feature of active dynamics is that particles adhere to hard surfaces, regardless of the chemical properties of the surface and specifically in the complete absence of any chemisorption or physisorption. To carry out the mean-field analysis of this out-of-equilibrium system, we develop the "mean-field simulation" technique, where the simulated system consists of charged parallel sheets moving on a line and obeying active dynamics, with the interaction strength rescaled by the number of sheets. The mean-field limit becomes exact in the limit of an infinite number of movable sheets.
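The "mean-field simulation" idea can be sketched as follows; all parameters (wall charge, tumble rate, sheet number, box size) are invented for illustration, and the 1/N rescaling of the sheet-sheet force is the essential ingredient named in the abstract. In 1D electrostatics the field on a sheet depends only on the net charge to its left and right, which keeps the force evaluation simple.

```python
import random

def simulate_sheets(n_sheets=100, sigma_wall=-1.0, v=1.0, alpha=0.1,
                    dt=0.05, steps=3000, box=10.0, seed=11):
    """Run-and-tumble dynamics of charged sheets on a line next to a charged
    wall at x = 0. The sheet-sheet interaction is rescaled by 1/N so the
    mean-field limit is approached as N grows. Half the sheets carry +1
    (counterions for a negative wall), half carry -1 (coions)."""
    rng = random.Random(seed)
    q = [+1.0] * (n_sheets // 2) + [-1.0] * (n_sheets // 2)
    x = [rng.uniform(0.0, box) for _ in range(n_sheets)]
    s = [rng.choice((-1.0, 1.0)) for _ in range(n_sheets)]   # swim direction
    for _ in range(steps):
        order = sorted(range(n_sheets), key=lambda i: x[i])
        field = [0.0] * n_sheets
        total = sum(q)
        left = 0.0
        for i in order:
            right = total - left - q[i]
            # wall field plus rescaled field from all other sheets
            field[i] = sigma_wall / 2.0 + (left - right) / (2.0 * n_sheets)
            left += q[i]
        for i in range(n_sheets):
            if rng.random() < alpha * dt:
                s[i] = -s[i]                                  # tumble event
            x[i] += (q[i] * field[i] + v * s[i]) * dt
            x[i] = min(max(x[i], 0.0), box)                   # hard walls
    return x, q

x, q = simulate_sheets()
```

Even in this crude sketch the counterions end up, on average, closer to the charged wall than the coions, the dynamic-adsorption effect the paper analyzes.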
A dynamic model of the human postural control system.
NASA Technical Reports Server (NTRS)
Hill, J. C.
1971-01-01
Description of a digital simulation of the pitch axis dynamics of a stick man. The difficulties encountered in linearizing the equations of motion are discussed; the conclusion reached is that a completely linear simulation is of such restricted validity that only a nonlinear simulation is of any practical use. Typical simulation results obtained from the full nonlinear model are illustrated.
NASA Technical Reports Server (NTRS)
Hoelzer, H. D.; Fourroux, K. A.; Rickman, D. L.; Schrader, C. M.
2011-01-01
Figures of Merit (FoMs) and the FoM software provide a method for quantitatively evaluating the quality of a regolith simulant by comparing the simulant to a reference material. FoMs may be used to compare a simulant to actual regolith material, to specify the values a simulant's FoMs must attain to be suitable for a given application, and to compare simulants from different vendors or production runs. FoMs may even be used to compare different simulants to each other. A single FoM is conceptually an algorithm that computes a single number quantifying the similarity or difference of a single characteristic of a simulant material and a reference material, providing a clear measure of how well the simulant and reference material match. FoMs have been constructed to lie between 0 and 1, with 0 indicating a poor match or no match and 1 indicating a perfect match. FoMs are defined for modal composition, particle size distribution, particle shape distribution (aspect ratio and angularity), and density. This TM covers the mathematics, use, installation, and licensing of the existing FoM code in detail.
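The TM defines the precise FoM algorithms; as an illustrative stand-in that follows the same 0-to-1 convention, an overlap-style FoM for a binned particle size distribution might look like this (the bin labels and fractions are invented for the example).

```python
def figure_of_merit(simulant, reference):
    """Overlap-style figure of merit between two binned distributions
    (e.g. particle size fraction per sieve bin): 1 means a perfect match,
    0 means no overlap. Each input maps a bin label to a fraction."""
    bins = set(simulant) | set(reference)
    s_total = sum(simulant.values()) or 1.0
    r_total = sum(reference.values()) or 1.0
    diff = sum(abs(simulant.get(b, 0.0) / s_total - reference.get(b, 0.0) / r_total)
               for b in bins)
    return 1.0 - 0.5 * diff   # half the total variation distance, inverted

regolith = {"<45um": 0.30, "45-250um": 0.45, "250-1000um": 0.20, ">1000um": 0.05}
simulant = {"<45um": 0.22, "45-250um": 0.50, "250-1000um": 0.23, ">1000um": 0.05}
fom = figure_of_merit(simulant, regolith)   # 0.92 for these invented numbers
```

Identical distributions score exactly 1 and disjoint distributions score exactly 0, matching the bounded 0-to-1 behaviour the TM specifies for its FoMs.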
Demonstration Advanced Avionics System (DAAS), Phase 1
NASA Technical Reports Server (NTRS)
Bailey, A. J.; Bailey, D. G.; Gaabo, R. J.; Lahn, T. G.; Larson, J. C.; Peterson, E. M.; Schuck, J. W.; Rodgers, D. L.; Wroblewski, K. A.
1981-01-01
Demonstration advanced avionics system (DAAS) functional description, hardware description, operational evaluation, and failure mode and effects analysis (FMEA) are provided. Projected advanced avionics system (PAAS) description, reliability analysis, cost analysis, maintainability analysis, and modularity analysis are discussed.
NASA Astrophysics Data System (ADS)
Packard, Corey D.; Klein, Mark D.; Viola, Timothy S.; Hepokoski, Mark A.
2016-10-01
The ability to predict electro-optical (EO) signatures of diverse targets against cluttered backgrounds is paramount for signature evaluation and/or management. Knowledge of target and background signatures is essential for a variety of defense-related applications. While there is no substitute for measured target and background signatures to determine contrast and detection probability, the capability to simulate any mission scenario with desired environmental conditions is a tremendous asset for defense agencies. In this paper, a systematic process for the thermal and visible-through-infrared simulation of camouflaged human dismounts in cluttered outdoor environments is presented. This process, utilizing the thermal and EO/IR radiance simulation tool TAIThermIR (and MuSES), provides a repeatable and accurate approach for analyzing contrast, signature and detectability of humans in multiple wavebands. The engineering workflow required to combine natural weather boundary conditions and the human thermoregulatory module developed by ThermoAnalytics is summarized. The procedure includes human geometry creation, human segmental physiology description and transient physical temperature prediction using environmental boundary conditions and active thermoregulation. Radiance renderings, which use Sandford-Robertson BRDF optical surface property descriptions and are coupled with MODTRAN for the calculation of atmospheric effects, are demonstrated. Sensor effects such as optical blurring and photon noise can be optionally included, increasing the accuracy of detection probability outputs that accompany each rendering. This virtual evaluation procedure has been extensively validated and provides a flexible evaluation process that minimizes the difficulties inherent in human-subject field testing. Defense applications such as detection probability assessment, camouflage pattern evaluation, conspicuity tests and automatic target recognition are discussed.
Phase walk analysis of leptokurtic time series.
Schreiber, Korbinian; Modest, Heike I; Räth, Christoph
2018-06-01
The Fourier phase information plays a key role in the quantified description of nonlinear data. We present a novel tool for time series analysis that identifies nonlinearities by sensitively detecting correlations among the Fourier phases. The method, called phase walk analysis, is based on well-established measures from random walk analysis, which are here applied to the unwrapped Fourier phases of time series. We provide an analytical description of its functionality and demonstrate its capabilities on systematically controlled leptokurtic noise. In doing so, we investigate the properties of leptokurtic time series and their influence on the Fourier phases. The phase walk analysis is applied to measured and simulated intermittent time series whose probability density distributions are approximated by power laws. We use the day-to-day returns of the Dow Jones industrial average, a synthetic time series with tailored nonlinearities mimicking the power-law behavior of the Dow Jones, and the acceleration of the wind at an Atlantic offshore site. Testing for nonlinearities by means of surrogates shows that the new method yields strong significances for nonlinear behavior. Owing to the drastically decreased computing time compared with embedding-space methods, the number of surrogate realizations can be increased by orders of magnitude. Thereby, the probability distribution of the test statistics can be derived and parameterized very accurately, which allows for much more precise tests for nonlinearities.
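The core construction described above (unwrap the Fourier phases, then treat their increments as steps of a walk) can be sketched as follows; this is an illustrative reading of the method, not the authors' implementation.

```python
import numpy as np

def phase_walk(series):
    """Sketch of a phase walk: take the FFT phases of a real time series,
    unwrap them across frequency, and accumulate their increments as a
    random-walk-like trajectory. (Illustrative, not the published code.)"""
    phases = np.angle(np.fft.rfft(series))   # one phase per frequency bin
    unwrapped = np.unwrap(phases)            # remove 2*pi jumps
    increments = np.diff(unwrapped)          # "steps" of the walk
    return np.cumsum(increments)             # cumulative phase walk

rng = np.random.default_rng(0)
walk = phase_walk(rng.normal(size=1024))
print(walk.shape)  # (512,)
```

For uncorrelated Gaussian noise the phases are independent, so the walk behaves diffusively; phase correlations induced by nonlinearities would show up as systematic deviations from random-walk statistics.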
Optical communication for space missions
NASA Technical Reports Server (NTRS)
Fitzmaurice, M.
1991-01-01
Activities performed at NASA/GSFC (Goddard Space Flight Center) related to direct detection optical communications for space applications are discussed. The following subject areas are covered: (1) requirements for optical communication systems (data rates and channel quality; spatial acquisition; fine tracking and pointing; and transmit point-ahead correction); (2) component testing and development (laser diodes performance characterization and life testing; and laser diode power combining); (3) system development and simulations (The GSFC pointing, acquisition and tracking system; hardware description; preliminary performance analysis; and high data rate transmitter/receiver systems); and (4) proposed flight demonstration of optical communications.
Description, Analysis and Simulation of a New Realization of Digital Filters.
1987-09-01
List-of-figures excerpt (in place of an abstract): the RDC LPF transfer function when Td includes 2, 6, or 8 zeroes of hc(t), including cases with rectangular and Hamming windows, together with the staircase representation of hc(t) and the input z(t).
Roux-Rouquié, Magali; Caritey, Nicolas; Gaubert, Laurent; Rosenthal-Sabroux, Camille
2004-07-01
One of the main issues in Systems Biology is dealing with semantic data integration. Previously, we examined the requirements for a reference conceptual model to guide semantic integration based on systemic principles. In the present paper, we examine the usefulness of the Unified Modelling Language (UML) for describing and specifying biological systems and processes. This yields unambiguous representations of biological systems that are suitable for translation into mathematical and computational formalisms, enabling analysis, simulation and prediction of these systems' behaviours.
Equations of state for detonation products of high energy PBX explosives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, E. L.; Helm, F. H.; Finger, M.
1977-08-01
It has become apparent that the accumulated changes in the analysis of cylinder test data, in the material specifications, and in the hydrodynamic code simulation of the cylinder test necessitated an update of the detonation product EOS description for explosives in common use at LLL. The explosives reviewed are PBX-9404-3, LX-04-1, LX-10-1, LX-14-0 and LX-09-1. In order to maintain the proper relation of predicted performance of these standard explosives, they have been revised as a single set.
A two-scale model for dynamic damage evolution
NASA Astrophysics Data System (ADS)
Keita, Oumar; Dascalu, Cristian; François, Bertrand
2014-03-01
This paper presents a new micro-mechanical damage model accounting for inertial effect. The two-scale damage model is fully deduced from small-scale descriptions of dynamic micro-crack propagation under tensile loading (mode I). An appropriate micro-mechanical energy analysis is combined with homogenization based on asymptotic developments in order to obtain the macroscopic evolution law for damage. Numerical simulations are presented in order to illustrate the ability of the model to describe known behaviors like size effects for the structural response, strain-rate sensitivity, brittle-ductile transition and wave dispersion.
The ACP Special Issue is being organized to draw together analysis of a set of cooperative modeling experiments (referred to as HTAP2). The purpose of this technical note is to provide a common description of the experimental design and set up for HTAP2 that can be referred to b...
NASA Astrophysics Data System (ADS)
Zhang, L.; van Eersel, H.; Bobbert, P. A.; Coehoorn, R.
2016-10-01
Using a novel method for analyzing transient photoluminescence (PL) experiments, a microscopic description is obtained for the dye concentration dependence of triplet-triplet annihilation (TTA) in phosphorescent host-guest systems. It is demonstrated that the TTA-mechanism, which could be a single-step dominated process or a diffusion-mediated multi-step process, can be deduced for any given dye concentration from a recently proposed PL intensity analysis. A comparison with the results of kinetic Monte Carlo simulations provides the TTA-Förster radius and shows that the TTA enhancement due to triplet diffusion can be well described in a microscopic manner assuming Förster- or Dexter-type energy transfer.
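For reference, Förster-type energy transfer of the kind invoked above is conventionally characterized by a rate that falls off with the sixth power of the donor-acceptor separation (generic notation, not the paper's symbols):

```latex
k_{\mathrm{F}}(r) \;=\; \frac{1}{\tau} \left( \frac{R_{\mathrm{F}}}{r} \right)^{6}
```

where $\tau$ is the excited-state (triplet) lifetime, $R_{\mathrm{F}}$ the Förster radius at which transfer and intrinsic decay are equally likely, and $r$ the donor-acceptor distance. Extracting $R_{\mathrm{F}}$ from kinetic Monte Carlo comparisons, as done above, pins down this single length scale governing the single-step TTA channel.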
Reconstruction of bar {p}p events in PANDA
NASA Astrophysics Data System (ADS)
Spataro, S.
2012-08-01
The PANDA experiment will study anti-proton proton and anti-proton nucleus collisions at the HESR complex of the FAIR facility, in a beam momentum range from 2 GeV/c up to 15 GeV/c. In preparation for the experiment, a software framework based on ROOT (PandaRoot) is being developed for the simulation, reconstruction and analysis of physics events, running also on a GRID infrastructure. Detailed geometry descriptions and different realistic reconstruction algorithms are implemented, currently used for the realization of the Technical Design Reports. This contribution reports on the reconstruction capabilities of the PANDA spectrometer, focusing mainly on the performance of the tracking system and the results of the analysis of physics benchmark channels.
The new Langley Research Center advanced real-time simulation (ARTS) system
NASA Technical Reports Server (NTRS)
Crawford, D. J.; Cleveland, J. I., II
1986-01-01
Based on a survey of current local area network technology with special attention paid to high bandwidth and very low transport delay requirements, NASA's Langley Research Center designed a new simulation subsystem using the computer automated measurement and control (CAMAC) network. This required significant modifications to the standard CAMAC system and development of a network switch, a clocking system, new conversion equipment, new consoles, supporting software, etc. This system is referred to as the advanced real-time simulation (ARTS) system. It is presently being built at LaRC. This paper provides a functional and physical description of the hardware and a functional description of the software. The requirements which drove the design are presented as well as present performance figures and status.
A new theoretical approach to analyze complex processes in cytoskeleton proteins.
Li, Xin; Kolomeisky, Anatoly B
2014-03-20
Cytoskeleton proteins are filament structures that support a large number of important biological processes. These dynamic biopolymers exist in nonequilibrium conditions stimulated by hydrolysis chemical reactions in their monomers. Current theoretical methods provide a comprehensive picture of biochemical and biophysical processes in cytoskeleton proteins. However, the description is only qualitative under biologically relevant conditions because utilized theoretical mean-field models neglect correlations. We develop a new theoretical method to describe dynamic processes in cytoskeleton proteins that takes into account spatial correlations in the chemical composition of these biopolymers. Our approach is based on analysis of probabilities of different clusters of subunits. It allows us to obtain exact analytical expressions for a variety of dynamic properties of cytoskeleton filaments. By comparing theoretical predictions with Monte Carlo computer simulations, it is shown that our method provides a fully quantitative description of complex dynamic phenomena in cytoskeleton proteins under all conditions.
Analysis of Mathematical Modelling on Potentiometric Biosensors
Mehala, N.; Rajendran, L.
2014-01-01
A mathematical model of potentiometric enzyme electrodes for a nonsteady condition has been developed. The model is based on the system of two coupled nonlinear time-dependent reaction diffusion equations for Michaelis-Menten formalism that describes the concentrations of substrate and product within the enzymatic layer. Analytical expressions for the concentration of substrate and product and the corresponding flux response have been derived for all values of parameters using the new homotopy perturbation method. Furthermore, the complex inversion formula is employed in this work to solve the boundary value problem. The analytical solutions obtained allow a full description of the response curves for only two kinetic parameters (unsaturation/saturation parameter and reaction/diffusion parameter). Theoretical descriptions are given for the two limiting cases (zero and first order kinetics) and relatively simple approaches for general cases are presented. All the analytical results are compared with simulation results using Scilab/Matlab program. The numerical results agree with the appropriate theories. PMID:25969765
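In one spatial dimension, the coupled reaction-diffusion system described above takes the standard Michaelis-Menten form (notation assumed here; the paper works with nondimensionalized versions of these equations):

```latex
\frac{\partial s}{\partial t} \;=\; D_{s}\,\frac{\partial^{2} s}{\partial x^{2}} \;-\; \frac{V_{\max}\, s}{K_{M} + s},
\qquad
\frac{\partial p}{\partial t} \;=\; D_{p}\,\frac{\partial^{2} p}{\partial x^{2}} \;+\; \frac{V_{\max}\, s}{K_{M} + s}
```

where $s$ and $p$ are the substrate and product concentrations in the enzymatic layer, $D_{s}$ and $D_{p}$ their diffusion coefficients, and $V_{\max}$, $K_{M}$ the Michaelis-Menten kinetic parameters. The saturation and reaction/diffusion parameters mentioned above arise from nondimensionalizing this pair: the zero-order limit corresponds to $s \gg K_{M}$ and the first-order limit to $s \ll K_{M}$.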
VS2DI: Model use, calibration, and validation
Healy, Richard W.; Essaid, Hedeff I.
2012-01-01
VS2DI is a software package for simulating water, solute, and heat transport through soils or other porous media under conditions of variable saturation. The package contains a graphical preprocessor for constructing simulations, a postprocessor for displaying simulation results, and numerical models that solve for flow and solute transport (VS2DT) and flow and heat transport (VS2DH). Flow is described by the Richards equation, and solute and heat transport are described by advection-dispersion equations; the finite-difference method is used to solve these equations. Problems can be simulated in one, two, or three (assuming radial symmetry) dimensions. This article provides an overview of calibration techniques that have been used with VS2DI; included is a detailed description of calibration procedures used in simulating the interaction between groundwater and a stream fed by drainage from agricultural fields in central Indiana. Brief descriptions of VS2DI and the various types of problems that have been addressed with the software package are also presented.
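The two governing equations named above have standard forms; written here in generic notation (symbols assumed, not taken from the VS2DI documentation), the Richards equation in its mixed form and a generic advection-dispersion equation are:

```latex
\frac{\partial \theta(h)}{\partial t} \;=\; \nabla \cdot \bigl[ K(h)\, \nabla (h + z) \bigr],
\qquad
\frac{\partial c}{\partial t} \;=\; \nabla \cdot \bigl( D\, \nabla c \bigr) \;-\; \nabla \cdot \bigl( \mathbf{v}\, c \bigr)
```

where $\theta$ is the volumetric moisture content, $h$ the pressure head, $K(h)$ the unsaturated hydraulic conductivity, $z$ elevation, $c$ the transported solute concentration (or temperature, for heat transport), $D$ the dispersion tensor, and $\mathbf{v}$ the pore velocity from the flow solution. The nonlinearity of $\theta(h)$ and $K(h)$ is what makes calibration of variably saturated models, as discussed in the article, nontrivial.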
Simulations of inspiraling and merging double neutron stars using the Spectral Einstein Code
NASA Astrophysics Data System (ADS)
Haas, Roland; Ott, Christian D.; Szilagyi, Bela; Kaplan, Jeffrey D.; Lippuner, Jonas; Scheel, Mark A.; Barkett, Kevin; Muhlberger, Curran D.; Dietrich, Tim; Duez, Matthew D.; Foucart, Francois; Pfeiffer, Harald P.; Kidder, Lawrence E.; Teukolsky, Saul A.
2016-06-01
We present results on the inspiral, merger, and postmerger evolution of a neutron star-neutron star (NSNS) system. Our results are obtained using the hybrid pseudospectral-finite volume Spectral Einstein Code (SpEC). To test our numerical methods, we evolve an equal-mass system for ≈22 orbits before merger. This waveform is the longest waveform obtained from fully general-relativistic simulations for NSNSs to date. Such long (and accurate) numerical waveforms are required to further improve semianalytical models used in gravitational wave data analysis, for example, the effective one body models. We discuss in detail the improvements to SpEC's ability to simulate NSNS mergers, in particular mesh refined grids to better resolve the merger and postmerger phases. We provide a set of consistency checks and compare our results to NSNS merger simulations with the independent bam code. We find agreement between them, which increases confidence in results obtained with either code. This work paves the way for future studies using long waveforms and more complex microphysical descriptions of neutron star matter in SpEC.
Chen, Lin; Li, Xue; Wang, Ruige; Fang, Fengqin; Yang, Wanli; Kan, Wei
2016-07-01
The ribose binding protein (RBP), a sugar-binding periplasmic protein, is involved in transport and signaling processes in both prokaryotes and eukaryotes. Although several cellular and structural studies have been reported, a description of the thermostability of RBP at the molecular level remains elusive. Focusing on the hyperthermophilic Thermotoga maritima RBP (tmRBP) and the mesophilic Escherichia coli homolog (ecRBP), we applied molecular dynamics simulations at four different temperatures (300, 380, 450, and 500 K) to obtain a deeper insight into the structural features responsible for the reduced thermostability of ecRBP. The simulation results indicate that there are distinct structural differences in the unfolding pathway between the two homologs and that ecRBP unfolds faster than the hyperthermophilic homolog at certain temperatures, in accordance with the lower thermal stability found experimentally. Essential dynamics analysis uncovers that the essential subspaces of ecRBP and tmRBP are non-overlapping and that the two proteins show different directions of motion within the simulation trajectories. Such an understanding is required for designing efficient proteins with characteristics suited to a particular application.
Kuniansky, E.L.
1990-01-01
A computer program based on the Galerkin finite-element method was developed to simulate two-dimensional steady-state ground-water flow in either isotropic or anisotropic confined aquifers. The program may also be used for unconfined aquifers of constant saturated thickness. Constant head, constant flux, and head-dependent flux boundary conditions can be specified in order to approximate a variety of natural conditions, such as a river or lake boundary, or a pumping well. The computer program was developed for the preliminary simulation of ground-water flow in the Edwards-Trinity Regional aquifer system as part of the Regional Aquifer-Systems Analysis Program. Results of the program compare well to analytical solutions and simulations from published finite-difference models. A concise discussion of the Galerkin method is presented along with a description of the program. Provided in the Supplemental Data section are a listing of the computer program, definitions of selected program variables, and several examples of data input and output used in verifying the accuracy of the program.
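The Galerkin finite-element approach summarized above can be illustrated with a toy one-dimensional steady-flow solve; this sketch (linear elements, constant-head ends) is far simpler than the USGS program, which is two-dimensional with richer boundary types.

```python
import numpy as np

def galerkin_1d(n_elems, k, h_left, h_right):
    """Toy 1-D Galerkin finite-element solve of steady flow
    d/dx(K dh/dx) = 0 on [0, 1] with linear elements and constant-head
    (Dirichlet) boundaries. Illustrative only, not the USGS code."""
    n = n_elems + 1
    A = np.zeros((n, n))
    b = np.zeros(n)
    le = 1.0 / n_elems
    # Element stiffness matrix for a linear element of length le
    ke = (k / le) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    for e in range(n_elems):
        A[e:e + 2, e:e + 2] += ke   # assemble into the global matrix
    # Impose constant-head boundary conditions
    A[0, :] = 0.0;  A[0, 0] = 1.0;   b[0] = h_left
    A[-1, :] = 0.0; A[-1, -1] = 1.0; b[-1] = h_right
    return np.linalg.solve(A, b)

heads = galerkin_1d(4, k=1.0, h_left=10.0, h_right=5.0)
print(heads)  # linear head drop from 10 to 5
```

With homogeneous conductivity the exact solution is linear, so the computed nodal heads fall on a straight line between the boundary values, a minimal analogue of the analytical-solution checks described in the abstract.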
NASA Astrophysics Data System (ADS)
Islam, Md Mahbubul; Strachan, Alejandro
A detailed atomistic-level understanding of the ultrafast chemistry of detonation processes in high energy materials is crucial to understanding their performance and safety. Recent advances in laser shocks and ultra-fast spectroscopy are yielding the first direct experimental evidence of chemistry at extreme conditions. At the same time, reactive molecular dynamics (MD) on current high-performance computing platforms enables an atomic description of shock-induced chemistry with length and timescales approaching those of experiments. We use MD simulations with the reactive force field ReaxFF to investigate the shock-induced chemical decomposition mechanisms of polyvinyl nitrate (PVN) and nitromethane (NM). The effect of shock pressure on the chemical reaction mechanisms and kinetics of both materials is investigated. For direct comparison of our simulation results with experimentally derived IR absorption data, we performed spectral analysis using atomistic velocities at various shock conditions. The combination of reactive MD simulations and ultrafast spectroscopy enables both the validation of ReaxFF at extreme conditions and the interpretation of the experimental data relating changes in spectral features to atomic processes. Office of Naval Research MURI program.
Nuclear Thermal Rocket Element Environmental Simulator (NTREES)
NASA Technical Reports Server (NTRS)
Schoenfeld, Michael
2009-01-01
A detailed description of the Nuclear Thermal Rocket Element Environmental Simulator (NTREES) is presented. The contents include: 1) Design Requirements; 2) NTREES Layout; 3) Data Acquisition and Control System Schematics; 4) NTREES System Schematic; and 5) NTREES Setup.
Studies of particle wake potentials in plasmas
NASA Astrophysics Data System (ADS)
Ellis, Ian N.; Graziani, Frank R.; Glosli, James N.; Strozzi, David J.; Surh, Michael P.; Richards, David F.; Decyk, Viktor K.; Mori, Warren B.
2011-09-01
A detailed understanding of electron stopping and scattering in plasmas with variable values for the number of particles within a Debye sphere is still not at hand. Presently, there is some disagreement in the literature concerning the proper description of these processes. Theoretical models assume electrostatic (Coulomb force) interactions between particles and neglect magnetic effects. Developing and validating proper descriptions requires studying the processes using first-principle plasma simulations. We are using the particle-particle particle-mesh (PPPM) code ddcMD and the particle-in-cell (PIC) code BEPS to perform these simulations. As a starting point in our study, we examine the wake of a particle passing through a plasma in 3D electrostatic simulations performed with ddcMD and BEPS. In this paper, we compare the wakes observed in these simulations with each other and predictions from collisionless kinetic theory. The relevance of the work to Fast Ignition is discussed.
Realistic finite temperature simulations of magnetic systems using quantum statistics
NASA Astrophysics Data System (ADS)
Bergqvist, Lars; Bergman, Anders
2018-01-01
We have performed realistic atomistic simulations at finite temperatures using Monte Carlo and atomistic spin dynamics simulations incorporating quantum (Bose-Einstein) statistics. The description is much improved at low temperatures compared to the classical (Boltzmann) statistics normally used in these kinds of simulations, while at higher temperatures the classical statistics are recovered. This corrected low-temperature description is reflected in both the magnetization and the magnetic specific heat, the latter allowing for improved modeling of the magnetic contribution to free energies. A central property in the method is the magnon density of states at finite temperatures, and we have compared several different implementations for obtaining it. The method has no restrictions regarding the chemical and magnetic order of the considered materials. This is demonstrated by applying the method to elemental ferromagnetic systems, including Fe and Ni, as well as Fe-Co random alloys and the ferrimagnetic system GdFe3.
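The quantum-statistical correction described above plausibly hinges on replacing the classical equipartition occupation of each magnon mode with the Bose-Einstein distribution (generic notation assumed here, not the paper's):

```latex
n_{\mathrm{BE}}(\omega, T) \;=\; \frac{1}{e^{\hbar\omega / k_{B} T} - 1}
\;\longrightarrow\; \frac{k_{B} T}{\hbar\omega}
\quad \text{for } k_{B} T \gg \hbar\omega
```

At low temperatures, high-frequency magnons freeze out ($n_{\mathrm{BE}} \to 0$), suppressing the magnetization decay and specific heat relative to the classical result; in the high-temperature limit the occupation reduces to the classical $k_{B}T/\hbar\omega$, consistent with the abstract's statement that classical statistics are recovered. This is why the magnon density of states is the central quantity: it weights how many modes at each $\omega$ receive this occupation.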
Education as Simulation Game: A Critical Hermeneutic.
ERIC Educational Resources Information Center
Palermo, James
1979-01-01
This paper examines a specific educational game called "Popcorn Factory." First, it gives a detailed description of the game, then shifts the description into a critical hermeneutical framework, analyzing the deep structures at work in the "Popcorn Factory" according to the theories of Freud and Marcuse. (Author/SJL)
Natural Language Description of Emotion
ERIC Educational Resources Information Center
Kazemzadeh, Abe
2013-01-01
This dissertation studies how people describe emotions with language and how computers can simulate this descriptive behavior. Although many non-human animals can express their current emotions as social signals, only humans can communicate about emotions symbolically. This symbolic communication of emotion allows us to talk about emotions that we…
DOT National Transportation Integrated Search
1976-12-01
The manual presents the complete ILSLOC computer program package. In addition to including a thorough description of the program itself and a commented listing, the manual contains a brief description of the ILS system and antenna patterns. To illust...
Simulated lumped-parameter system reduced-order adaptive control studies
NASA Technical Reports Server (NTRS)
Johnson, C. R., Jr.; Lawrence, D. A.; Taylor, T.; Malakooti, M. V.
1981-01-01
Two methods of interpreting the misbehavior of reduced order adaptive controllers are discussed. The first method is based on system input-output description and the second is based on state variable description. The implementation of the single input, single output, autoregressive, moving average system is considered.
10 CFR 434.517 - HVAC systems and equipment.
Code of Federal Regulations, 2010 CFR
2010-01-01
... simulation, except that excess capacity provided to meet process loads need not be modeled unless the process... Reference Buildings. The zones in the simulation shall correspond to the zones provided by the controls in... simulation. Table 517.4.1—HVAC System Description for Prototype and Reference Buildings 1,2 HVAC component...
Combining Simulation and Optimization Models for Hardwood Lumber Production
G.A. Mendoza; R.J. Meimban; W.G. Luppold; Philip A. Araman
1991-01-01
Published literature contains a number of optimization and simulation models dealing with the primary processing of hardwood and softwood logs. Simulation models have been developed primarily as descriptive models for characterizing the general operations and performance of a sawmill. Optimization models, on the other hand, were developed mainly as analytical tools for...
Atomic-level description of ubiquitin folding
Piana, Stefano; Lindorff-Larsen, Kresten; Shaw, David E.
2013-01-01
Equilibrium molecular dynamics simulations, in which proteins spontaneously and repeatedly fold and unfold, have recently been used to help elucidate the mechanistic principles that underlie the folding of fast-folding proteins. The extent to which the conclusions drawn from the analysis of such proteins, which fold on the microsecond timescale, apply to the millisecond or slower folding of naturally occurring proteins is, however, unclear. As a first attempt to address this outstanding issue, we examine here the folding of ubiquitin, a 76-residue-long protein found in all eukaryotes that is known experimentally to fold on a millisecond timescale. Ubiquitin folding has been the subject of many experimental studies, but its slow folding rate has made it difficult to observe and characterize the folding process through all-atom molecular dynamics simulations. Here we determine the mechanism, thermodynamics, and kinetics of ubiquitin folding through equilibrium atomistic simulations. The picture emerging from the simulations is in agreement with a view of ubiquitin folding suggested from previous experiments. Our findings related to the folding of ubiquitin are also consistent, for the most part, with the folding principles derived from the simulation of fast-folding proteins, suggesting that these principles may be applicable to a wider range of proteins. PMID:23503848
NASA Technical Reports Server (NTRS)
Follen, G.; Naiman, C.; auBuchon, M.
2000-01-01
Within NASA's High Performance Computing and Communication (HPCC) program, NASA Glenn Research Center is developing an environment for the analysis/design of propulsion systems for aircraft and space vehicles called the Numerical Propulsion System Simulation (NPSS). The NPSS focuses on the integration of multiple disciplines such as aerodynamics, structures, and heat transfer, along with the concept of numerical zooming between 0-dimensional and 1-, 2-, and 3-dimensional component engine codes. The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Current "state-of-the-art" engine simulations are 0-dimensional in that there is no axial, radial or circumferential resolution within a given component (e.g. a compressor or turbine has no internal station designations). In these 0-dimensional cycle simulations the individual component performance characteristics typically come from a table look-up (map) with adjustments for off-design effects such as variable geometry, Reynolds effects, and clearances. Zooming one or more of the engine components to a higher order, physics-based analysis means a higher order code is executed and the results from this analysis are used to adjust the 0-dimensional component performance characteristics within the system simulation. By drawing on the results from more predictive, physics-based higher order analysis codes, "cycle" simulations are refined to closely model and predict the complex physical processes inherent to engines. As part of the overall development of the NPSS, NASA and industry began the process of defining and implementing an object class structure that enables Numerical Zooming between the NPSS Version I (0-dimensional) and higher order 1-, 2- and 3-dimensional analysis codes.
The NPSS Version I preserves the historical cycle engineering practices but also extends these classical practices into the area of numerical zooming for use within a company's design system. What follows is a description of successfully zooming 1-dimensional (row-by-row) high pressure compressor results back to a 0-dimensional NPSS engine simulation, and a discussion of the results illustrated using an advanced data visualization tool. This type of high fidelity system-level analysis, made possible by the zooming capability of the NPSS, will greatly improve the fidelity of the engine system simulation and enable the engine system to be "pre-validated" prior to commitment to engine hardware.
Barczi, Jean-François; Rey, Hervé; Griffon, Sébastien; Jourdan, Christophe
2018-04-18
Many studies exist in the literature dealing with mathematical representations of root systems, categorized, for example, as pure structure descriptions, partial differential equations or functional-structural plant models. However, in these studies, root architecture modelling has seldom been carried out at the organ level with the inclusion of environmental influences that can be integrated into a whole plant characterization. We have conducted a multidisciplinary study on root systems including field observations, architectural analysis, and formal and mathematical modelling. This integrative and coherent approach leads to a generic model (DigR) and its software simulator. Architectural analysis applied to root systems aids root type classification and the design of an architectural unit for each species. Roots belonging to a particular type share dynamic and morphological characteristics which consist of topological and geometric features. The DigR simulator is integrated into the Xplo environment, with a user interface to input parameter values and make output ready for dynamic 3-D visualization, statistical analysis and saving to standard formats. DigR runs in a quasi-parallel computing algorithm and may be used either as a standalone tool or integrated into other simulation platforms. The software is open-source and free to download at http://amapstudio.cirad.fr/soft/xplo/download. DigR is based on three key points: (1) a root-system architectural analysis, (2) root type classification and modelling and (3) a restricted set of 23 root type parameters with flexible values indexed in terms of root position. The genericity and botanical accuracy of the model is demonstrated for growth, branching, mortality and reiteration processes, and for different root architectures. Plugin examples demonstrate the model's versatility at simulating plastic responses to environmental constraints.
Outputs of the model include diverse root system structures such as tap-root, fasciculate, tuberous, nodulated and clustered root systems. DigR is based on plant architecture analysis, which leads to a specific root type classification and organization that are directly linked to field measurements. The open-source simulator of the model has been included within a user-friendly environment. The accuracy and versatility of DigR are demonstrated through growth simulations of complex root systems for both annual and perennial plants.
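A typed, recursive root-architecture model of the kind DigR formalizes can be sketched very compactly. The sketch below is a hypothetical toy, not DigR's actual 23-parameter scheme: the type names, the two parameters per type, and the branching rule are all illustrative assumptions.

```python
# Hypothetical, minimal sketch of a typed root-architecture model in the
# spirit of DigR: each root type carries its own growth and branching
# parameters, and axes of one type bear laterals of a child type.
ROOT_TYPES = {
    "taproot": {"growth_rate": 1.0, "branch_spacing": 2.0, "child": "lateral"},
    "lateral": {"growth_rate": 0.4, "branch_spacing": 4.0, "child": None},
}

def grow(root_type, length_limit):
    """Return the total number of root axes produced by one axis (toy model)."""
    params = ROOT_TYPES[root_type]
    length = params["growth_rate"] * length_limit
    n_branches = int(length // params["branch_spacing"])
    total = 1  # count the axis itself
    if params["child"] is not None:
        for _ in range(n_branches):
            total += grow(params["child"], length_limit)
    return total

print(grow("taproot", 10.0))  # 6: one taproot bearing five laterals
```

Indexing parameter values by root type, as here, is what lets one architectural unit generate tap-rooted versus fasciculate systems by changing only the parameter table.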
High-Performance First-Principles Molecular Dynamics for Predictive Theory and Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gygi, Francois; Galli, Giulia; Schwegler, Eric
This project focused on developing high-performance software tools for First-Principles Molecular Dynamics (FPMD) simulations, and applying them in investigations of materials relevant to energy conversion processes. FPMD is an atomistic simulation method that combines a quantum-mechanical description of electronic structure with the statistical description provided by molecular dynamics (MD) simulations. This reliance on fundamental principles allows FPMD simulations to provide a consistent description of structural, dynamical and electronic properties of a material. This is particularly useful in systems for which reliable empirical models are lacking. FPMD simulations are increasingly used as a predictive tool for applications such as batteries, solar energy conversion, light-emitting devices, electro-chemical energy conversion devices and other materials. During the course of the project, several new features were developed and added to the open-source Qbox FPMD code. The code was further optimized for scalable operation on large-scale, Leadership-Class DOE computers. When combined with Many-Body Perturbation Theory (MBPT) calculations, this infrastructure was used to investigate structural and electronic properties of liquid water, ice, aqueous solutions, nanoparticles and solid-liquid interfaces. Computing both ionic trajectories and electronic structure in a consistent manner enabled the simulation of several spectroscopic properties, such as Raman spectra, infrared spectra, and sum-frequency generation spectra. The accuracy of the approximations used allowed for direct comparisons of results with experimental data such as optical spectra, and X-ray and neutron diffraction spectra.
The software infrastructure developed in this project, as applied to various investigations of solids, liquids and interfaces, demonstrates that FPMD simulations can provide a detailed, atomic-scale picture of structural, vibrational and electronic properties of complex systems relevant to energy conversion devices.
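The MD half of an FPMD loop is ordinary velocity Verlet; only the force call is special, coming from a self-consistent electronic-structure solve at each step. The sketch below is illustrative, not Qbox's actual interface: a harmonic potential stands in for the (expensive) DFT force evaluation.

```python
import numpy as np

def dft_forces(x, k=1.0):
    # Placeholder for a self-consistent DFT force evaluation; here a
    # unit harmonic potential so the trajectory has a known solution.
    return -k * x

def velocity_verlet(x, v, dt, nsteps, mass=1.0):
    """Advance positions/velocities with velocity Verlet, re-evaluating
    forces once per step (the FPMD bottleneck in a real code)."""
    f = dft_forces(x)
    for _ in range(nsteps):
        v_half = v + 0.5 * dt * f / mass
        x = x + dt * v_half
        f = dft_forces(x)
        v = v_half + 0.5 * dt * f / mass
    return x, v

x, v = velocity_verlet(np.array([1.0]), np.array([0.0]), 0.01, 1000)
print(x, v)  # ≈ cos(10) and -sin(10) for the unit harmonic oscillator
```

The spectroscopic observables mentioned above (IR, Raman) are then obtained from time correlations of quantities computed along exactly such a trajectory.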
Vlasov analysis of microbunching instability for magnetized beams
Tsai, C. -Y.; Derbenev, Ya. S.; Douglas, D.; ...
2017-05-19
For a high-brightness electron beam with low energy and high bunch charge traversing a recirculation beamline, coherent synchrotron radiation and the space charge effect may result in the microbunching instability (MBI). Both tracking simulation and Vlasov analysis for an early design of the Circulator Cooler Ring for the Jefferson Lab Electron Ion Collider reveal significant MBI. It is envisioned that these instabilities could be substantially suppressed by using a magnetized beam. In this work, we extend the existing Vlasov analysis, originally developed for a non-magnetized beam, to the description of the transport of a magnetized beam including the relevant collective effects. The new formulation is then employed to confirm the predicted microbunching suppression for magnetized beam transport in a recirculating machine design.
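In Vlasov treatments of MBI, the linearized equation for the bunching factor along the beamline takes the form of a Volterra integral equation, b(s) = b0(s) + ∫₀ˢ K(s, s′) b(s′) ds′, which is solved numerically by marching in s. The sketch below shows only that numerical skeleton; the zero kernel is a placeholder for one built from lattice optics and CSR/space-charge impedances (plus, for magnetized beams, the extended terms this work derives).

```python
import numpy as np

def solve_volterra(b0, K, s):
    """March the discretized Volterra equation b = b0 + ∫ K b forward in s."""
    ds = s[1] - s[0]
    b = np.empty_like(b0)
    b[0] = b0[0]
    for i in range(1, len(s)):
        # causal rectangle rule over the already-computed b(s') for s' < s_i
        b[i] = b0[i] + ds * np.sum(K[i, :i] * b[:i])
    return b

s = np.linspace(0.0, 1.0, 201)
K = np.zeros((len(s), len(s)))   # zero kernel: no collective amplification
b = solve_volterra(np.ones_like(s), K, s)
print(b[-1])  # with K = 0 the bunching factor keeps its initial value, 1.0
```

With a physical kernel, the ratio |b(s)|/|b0(s)| is the microbunching gain whose suppression the magnetized-beam formulation is meant to confirm.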
NASA Astrophysics Data System (ADS)
Legenne, J.; Broca, R.; Alby, F.
1992-08-01
A review is presented of the French Guidance, Navigation and Control Simulator (GNC) developed to analyze and validate the global phasing strategy of the Hermes spaceplane. The phasing strategy, the events management, and the requirements for simulation are described, and a functional description of the simulator is given. The simulator will be able to assess the performance of the strategy in terms of fuel consumption and duration (mean values, standard deviations).
On the Helix Propensity in Generalized Born Solvent Descriptions of Modeling the Dark Proteome
2017-01-10
…benchmarks of conformational sampling methods and their all-atom force fields plus solvent descriptions to accurately model structural transitions on a… atom simulations of proteins is the replacement of explicit water interactions with a continuum description of treating implicitly the bulk physical… structure was reported by Amarasinghe and coworkers (Leung et al., 2015) of the Ebola nucleoprotein NP in complex with a 28-residue peptide extracted
Pham, Tuan Anh; Ogitsu, Tadashi; Lau, Edmond Y; Schwegler, Eric
2016-10-21
Establishing an accurate and predictive computational framework for the description of complex aqueous solutions is an ongoing challenge for density functional theory based first-principles molecular dynamics (FPMD) simulations. In this context, important advances have been made in recent years, including the development of sophisticated exchange-correlation functionals. On the other hand, simulations based on simple generalized gradient approximation (GGA) functionals remain an active field, particularly in the study of complex aqueous solutions, due to a good balance between accuracy, computational expense, and applicability to a wide range of systems. Such simulations are often performed at elevated temperatures to artificially "correct" for GGA inaccuracies in the description of liquid water; however, a detailed understanding of how the choice of temperature affects the structure and dynamics of other components, such as solvated ions, is largely unknown. To address this question, we carried out a series of FPMD simulations at temperatures ranging from 300 to 460 K for liquid water and three representative aqueous solutions containing solvated Na⁺, K⁺, and Cl⁻ ions. We show that simulations at 390-400 K with the Perdew-Burke-Ernzerhof (PBE) exchange-correlation functional yield water structure and dynamics in good agreement with experiments at ambient conditions. Simultaneously, this computational setup provides ion solvation structures and ion effects on water dynamics consistent with experiments. Our results suggest that an elevated temperature around 390-400 K with the PBE functional can be used for the description of structural and dynamical properties of liquid water and complex solutions with solvated ions at ambient conditions.
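The standard structural check behind claims like "PBE at 390-400 K reproduces ambient water" is the radial distribution function g(r) computed from the trajectory. A minimal g(r) sketch for a periodic box follows; the positions are random placeholders standing in for, say, oxygen coordinates from an FPMD run.

```python
import numpy as np

def rdf(positions, box, nbins=50, rmax=None):
    """Radial distribution function for particles in a cubic periodic box."""
    n = len(positions)
    rmax = rmax if rmax is not None else box / 2
    dists = []
    for i in range(n):
        d = positions[i + 1:] - positions[i]
        d -= box * np.round(d / box)          # minimum-image convention
        dists.append(np.linalg.norm(d, axis=1))
    dists = np.concatenate(dists)
    hist, edges = np.histogram(dists, bins=nbins, range=(0.0, rmax))
    r = 0.5 * (edges[1:] + edges[:-1])
    shell_vol = 4.0 * np.pi * r**2 * (edges[1] - edges[0])
    density = n / box**3
    g = hist / (shell_vol * density * n / 2)  # normalize per pair
    return r, g

rng = np.random.default_rng(0)
r, g = rdf(rng.uniform(0.0, 10.0, size=(200, 3)), box=10.0)
print(g.mean())  # ≈ 1 for an uncorrelated (ideal-gas) configuration
```

For real water the O-O g(r) shows the first peak near 2.8 Å, and temperature shifts of that peak are what the elevated-temperature "correction" is tuned against.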
Quantitative Analysis of Hepatitis C NS5A Viral Protein Dynamics on the ER Surface.
Knodel, Markus M; Nägel, Arne; Reiter, Sebastian; Vogel, Andreas; Targett-Adams, Paul; McLauchlan, John; Herrmann, Eva; Wittum, Gabriel
2018-01-08
Exploring biophysical properties of virus-encoded components and their requirement for virus replication is an exciting new area of interdisciplinary virological research. To date, spatial resolution has only rarely been analyzed in computational/biophysical descriptions of virus replication dynamics. However, it is widely acknowledged that intracellular spatial dependence is a crucial component of virus life cycles. The hepatitis C virus-encoded NS5A protein is an endoplasmic reticulum (ER)-anchored viral protein and an essential component of the virus replication machinery. Therefore, we simulate NS5A dynamics on realistically reconstructed, curved ER surfaces by means of surface partial differential equations (sPDE) on unstructured grids. We match the in silico NS5A diffusion constant such that the NS5A sPDE simulation data reproduce experimental NS5A fluorescence recovery after photobleaching (FRAP) time series data. This parameter estimation yields the NS5A diffusion constant. Such parameters are needed for spatial models of HCV dynamics, which we are developing in parallel but which remain qualitative at this stage. Thus, our present study likely provides the first quantitative biophysical description of the movement of a viral component. Our spatio-temporally resolved ansatz paves new ways for understanding intricate spatially defined processes central to specific aspects of virus life cycles.
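The parameter-estimation step can be illustrated without the full sPDE machinery: choose the diffusion constant D whose model recovery curve best matches the FRAP data. The sketch below is a hedged simplification, not the paper's surface-PDE solve; the single-exponential recovery form F(t) = 1 − exp(−4Dt/w²) and the bleach-spot width w are illustrative assumptions.

```python
import numpy as np

def model(t, D, w=1.0):
    # Simplified FRAP recovery curve; a real fit would run the sPDE forward.
    return 1.0 - np.exp(-4.0 * D * t / w**2)

def fit_D(t, data, candidates):
    """Grid-search least squares: return the candidate D with smallest SSE."""
    errors = [np.sum((model(t, D) - data) ** 2) for D in candidates]
    return candidates[int(np.argmin(errors))]

t = np.linspace(0.0, 5.0, 100)
synthetic = model(t, 0.3)                      # stand-in for measured FRAP data
D_hat = fit_D(t, synthetic, np.linspace(0.05, 1.0, 96))
print(round(D_hat, 2))  # recovers the D used to generate the curve, 0.3
```

In the paper the same logic runs with the forward model replaced by the sPDE simulation on the reconstructed ER geometry.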
Rosas-Trigueros, Jorge Luis; Correa-Basurto, José; Guadalupe Benítez-Cardoza, Claudia; Zamorano-Carrillo, Absalom
2011-01-01
Bax is a member of the Bcl-2 protein family that participates in mitochondrion-mediated apoptosis. In the early stages of the apoptotic pathway, this protein migrates from the cytosol to the outer mitochondrial membrane, where it is inserted and usually oligomerizes, making cytochrome c-compatible pores. Although several cellular and structural studies have been reported, a description of the stability of Bax at the molecular level remains elusive. This article reports molecular dynamics simulations of monomeric Bax at 300, 400, and 500 K, focusing on the most relevant structural changes and relating them to biological experimental results. Bax gradually loses its α-helices when it is submitted to high temperatures, yet it maintains its globular conformation. The resistance of Bax to adopt an extended conformation could be due to several interactions that were found to be responsible for maintaining the structural stability of this protein. Among these interactions, we found salt bridges, hydrophobic interactions, and hydrogen bonds. Remarkably, salt bridges were the most relevant to prevent the elongation of the structure. In addition, the analysis of our results suggests which conformational movements are implicated in the activation/oligomerization of Bax. This atomistic description might have important implications for understanding the functionality and stability of Bax in vitro as well as within the cellular environment. PMID:21936009
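A salt-bridge analysis of the kind used above to explain Bax's resistance to elongation reduces to a distance scan over oppositely charged residue pairs. The sketch below is illustrative: the cutoff, residue names, and coordinates are made-up placeholders, and real analyses measure distances between specific charged-group atoms across trajectory frames.

```python
import numpy as np

def salt_bridges(residues, cutoff=4.0):
    """residues: list of (name, charge, xyz). Return oppositely charged
    pairs whose charged-group centroids lie within the cutoff (in Å)."""
    found = []
    for i, (n1, q1, x1) in enumerate(residues):
        for n2, q2, x2 in residues[i + 1:]:
            dist = np.linalg.norm(np.asarray(x1) - np.asarray(x2))
            if q1 * q2 < 0 and dist <= cutoff:
                found.append((n1, n2))
    return found

# Hypothetical coordinates for three charged residues:
residues = [
    ("ASP71", -1, (0.0, 0.0, 0.0)),
    ("ARG94", +1, (3.2, 0.0, 0.0)),   # within cutoff of ASP71
    ("GLU17", -1, (9.0, 9.0, 9.0)),   # isolated
]
print(salt_bridges(residues))  # [('ASP71', 'ARG94')]
```

Tracking how often each detected pair persists at 300, 400, and 500 K is how one ranks salt bridges as the interactions most relevant to thermal stability.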
NASA Astrophysics Data System (ADS)
Eichinger, M.; Tavan, P.; Hutter, J.; Parrinello, M.
1999-06-01
We present a hybrid method for molecular dynamics simulations of solutes in complex solvents as represented, for example, by substrates within enzymes. The method combines a quantum mechanical (QM) description of the solute with a molecular mechanics (MM) approach for the solvent. The QM fragment of a simulation system is treated by ab initio density functional theory (DFT) based on plane-wave expansions. Long-range Coulomb interactions within the MM fragment and between the QM and the MM fragment are treated by a computationally efficient fast multipole method. For the description of covalent bonds between the two fragments, we introduce the scaled position link atom method (SPLAM), which removes the shortcomings of related procedures. The various aspects of the hybrid method are scrutinized through test calculations on liquid water, the water dimer, ethane and a small molecule related to the retinal Schiff base. In particular, the extent to which vibrational spectra obtained by DFT for the solute can be spoiled by the lower quality force field of the solvent is checked, including cases in which the two fragments are covalently joined. The results demonstrate that our QM/MM hybrid method is especially well suited for the vibrational analysis of molecules in condensed phase.
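The geometric core of any link-atom scheme, including SPLAM, is capping the QM fragment with a hydrogen placed along the QM→MM bond vector at a scaled distance. The sketch below shows generic link-atom placement only; the scale factor is a typical C-H/C-C bond-length ratio (1.09/1.54), an illustrative assumption rather than SPLAM's actual scaling rule.

```python
import numpy as np

def place_link_atom(qm_atom, mm_atom, scale=0.709):
    """Position a capping hydrogen on the QM->MM bond at a scaled distance.
    scale ≈ r(C-H)/r(C-C) = 1.09/1.54; SPLAM's actual rule differs."""
    qm, mm = np.asarray(qm_atom), np.asarray(mm_atom)
    return qm + scale * (mm - qm)

# QM carbon at origin, MM carbon 1.54 Å away along x:
h = place_link_atom((0.0, 0.0, 0.0), (1.54, 0.0, 0.0))
print(np.round(h, 3))  # cap hydrogen ≈ 1.092 Å from the QM carbon
```

Tying the cap's position rigidly to the two boundary atoms, as here, is what keeps the capped QM fragment consistent as the MM geometry moves during dynamics.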
Quantitative Analysis of Hepatitis C NS5A Viral Protein Dynamics on the ER Surface
Nägel, Arne; Reiter, Sebastian; Vogel, Andreas; McLauchlan, John; Herrmann, Eva; Wittum, Gabriel
2018-01-01
Exploring biophysical properties of virus-encoded components and their requirement for virus replication is an exciting new area of interdisciplinary virological research. To date, spatial resolution has only rarely been analyzed in computational/biophysical descriptions of virus replication dynamics. However, it is widely acknowledged that intracellular spatial dependence is a crucial component of virus life cycles. The hepatitis C virus-encoded NS5A protein is an endoplasmic reticulum (ER)-anchored viral protein and an essential component of the virus replication machinery. Therefore, we simulate NS5A dynamics on realistically reconstructed, curved ER surfaces by means of surface partial differential equations (sPDE) on unstructured grids. We match the in silico NS5A diffusion constant such that the NS5A sPDE simulation data reproduce experimental NS5A fluorescence recovery after photobleaching (FRAP) time series data. This parameter estimation yields the NS5A diffusion constant. Such parameters are needed for spatial models of HCV dynamics, which we are developing in parallel but which remain qualitative at this stage. Thus, our present study likely provides the first quantitative biophysical description of the movement of a viral component. Our spatio-temporally resolved ansatz paves new ways for understanding intricate spatially defined processes central to specific aspects of virus life cycles. PMID:29316722
An unconditionally stable method for numerically solving solar sail spacecraft equations of motion
NASA Astrophysics Data System (ADS)
Karwas, Alex
Solar sails use the endless supply of the Sun's radiation to propel spacecraft through space. The sails use the momentum transfer from the impinging solar radiation to provide thrust to the spacecraft while expending zero fuel. Recently, the first solar sail spacecraft, or sailcraft, named IKAROS completed a successful mission to Venus and proved the concept of solar sail propulsion. Sailcraft experimental data are difficult to gather due to the large expense of space travel; therefore, a reliable and accurate computational method is needed to make the process more efficient. Presented in this document is a new approach to simulating solar sail spacecraft trajectories. The new method provides unconditionally stable numerical solutions for trajectory propagation and includes an improved physical description over other methods. The unconditional stability of the new method means that a unique numerical solution is always determined. The improved physical description of the trajectory provides a numerical solution and time derivatives that are continuous throughout the entire trajectory. The error of the continuous numerical solution is also known for the entire trajectory. Optimal control for maximizing thrust is also provided within the framework of the new method. Verification of the new approach is presented through a mathematical description and through numerical simulations. The mathematical description provides details of the sailcraft equations of motion, the numerical method used to solve the equations, and the formulation for implementing the equations of motion into the numerical solver. Previous work in the field is summarized to show that the new approach can act as a replacement for previous trajectory propagation methods. A code was developed to perform the simulations and it is also described in this document. Results of the simulations are compared to the flight data from the IKAROS mission.
Comparison of the two sets of data shows that the new approach is capable of accurately simulating sailcraft motion. Sailcraft and spacecraft simulations are compared to flight data and to other numerical solution techniques. The new formulation shows an increase in accuracy over a widely used trajectory propagation technique. Simulations for two-dimensional, three-dimensional, and variable-attitude trajectories are presented to show the multiple capabilities of the new technique. An element of optimal control is also part of the new technique: an additional equation is added to the sailcraft equations of motion that maximizes thrust in a specific direction. A technical description and results of an example optimization problem are presented. The spacecraft attitude dynamics equations take the simulation a step further by providing control torques using the angular rate and acceleration outputs of the numerical formulation.
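The dissertation's method is not reproduced here, but the meaning of "unconditionally stable" can be demonstrated on the standard stiff test equation y′ = −λy: an explicit scheme diverges once λΔt exceeds its stability limit, while an implicit update stays bounded for any step size. The comparison below uses backward Euler purely as a familiar implicit example.

```python
def explicit_euler(lam, dt, n):
    # y_{k+1} = (1 - λ dt) y_k : stable only for λ dt < 2
    y = 1.0
    for _ in range(n):
        y = y - dt * lam * y
    return y

def backward_euler(lam, dt, n):
    # y_{k+1} = y_k / (1 + λ dt) : bounded for every positive step size
    y = 1.0
    for _ in range(n):
        y = y / (1.0 + dt * lam)
    return y

lam, dt = 50.0, 0.1   # λ dt = 5: far outside the explicit stability region
print(abs(explicit_euler(lam, dt, 100)))  # blows up
print(backward_euler(lam, dt, 100))       # decays toward 0, as y(t) should
```

For trajectory propagation, the practical payoff of such stability is that the step size can be chosen for accuracy alone rather than to keep the integrator from diverging.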
'The Diamond': a structure for simulation debrief.
Jaye, Peter; Thomas, Libby; Reedy, Gabriel
2015-06-01
Despite debriefing being found to be the most important element in providing effective learning in simulation-based medical education reviews, there are only a few examples in the literature to help guide a debriefer. The diamond debriefing method is based on the technique of description, analysis and application, along with aspects of the advocacy-inquiry approach and of debriefing with good judgement. It is specifically designed to allow an exploration of the non-technical aspects of a simulated scenario. The debrief diamond, a structured visual reminder of the debrief process, was developed through teaching simulation debriefing to hundreds of faculty members over several years. The diamond shape visually represents the idealised process of a debrief: opening out a facilitated discussion about the scenario, before bringing the learning back into sharp focus with specific learning points. The Diamond is a two-sided prompt sheet: the first contains the scaffolding, with a series of specifically constructed questions for each phase of the debrief; the second lays out the theory behind the questions and the process. The Diamond encourages a standardised approach to high-quality debriefing on non-technical skills. Feedback from learners and from debriefing faculty members has indicated that the Diamond is useful and valuable as a debriefing tool, benefiting both participants and faculty members. It can be used by junior and senior faculty members debriefing in pairs, allowing the junior faculty member to conduct the description phase, while the more experienced faculty member leads the later and more challenging phases. The Diamond gives an easy but pedagogically sound structure to follow and specific prompts to use in the moment. © 2015 The Authors.
The Clinical Teacher published by Association for the Study of Medical Education and John Wiley & Sons Ltd.
Freytag, Julia; Stroben, Fabian; Hautz, Wolf E; Eisenmann, Dorothea; Kämmer, Juliane E
2017-01-01
Introduction: Medical errors have an incidence of 9% and may lead to worse patient outcomes. Teamwork training has the capacity to significantly reduce medical errors and therefore improve patient outcomes. One common framework for teamwork training is crisis resource management, adapted from aviation and usually trained in simulation settings. Debriefing after simulation is thought to be crucial to learning teamwork-related concepts and behaviours, but it remains unclear how best to debrief these aspects. Furthermore, teamwork-training sessions and studies examining educational effects on undergraduates are rare. The study aims to evaluate the effects of two teamwork-focused debriefings on team performance after an extensive medical student teamwork training. Methods and analyses: A prospective experimental study has been designed to compare a well-established three-phase debriefing method (gather–analyse–summarise; the GAS method) to a newly developed and more structured debriefing approach that extends the GAS method with TeamTAG (teamwork techniques analysis grid). TeamTAG is a cognitive aid listing preselected teamwork principles and descriptions of behavioural anchors that serve as observable patterns of teamwork, and is supposed to help structure teamwork-focused debriefing. Both debriefing methods will be tested during an emergency room teamwork-training simulation comprising six emergency medicine cases faced by 35 final-year medical students in teams of five. Teams will be randomised into the two debriefing conditions. Team performance during simulation and the number of principles discussed during debriefing will be evaluated. Learning opportunities, helpfulness and feasibility will be rated by participants and instructors. Analyses will include descriptive, inferential and explorative statistics.
Ethics and dissemination: The study protocol was approved by the institutional office for data protection and the ethics committee of Charité Medical School Berlin and registered under EA2/172/16. All students will participate voluntarily and will sign an informed consent form after receiving written and oral information about the study. Results will be published. PMID:28667224
Advective transport observations with MODPATH-OBS--documentation of the MODPATH observation process
Hanson, R.T.; Kauffman, L.K.; Hill, M.C.; Dickinson, J.E.; Mehl, S.W.
2013-01-01
The MODPATH-OBS computer program described in this report is designed to calculate simulated equivalents for observations related to advective groundwater transport that can be represented in a quantitative way by using simulated particle-tracking data. The simulated equivalents supported by MODPATH-OBS are (1) distance from a source location at a defined time, or proximity to an observed location; (2) time of travel from an initial location to defined locations, areas, or volumes of the simulated system; (3) concentrations used to simulate groundwater age; and (4) percentages of water derived from contributing source areas. Although particle tracking only simulates the advective component of conservative transport, effects of non-conservative processes such as retardation can be approximated through manipulation of the effective-porosity value used to calculate velocity, based on the properties of selected conservative tracers. The program can also account for simple decay or production, but it cannot account for diffusion. Dispersion can be represented through direct simulation of subsurface heterogeneity and the use of many particles. MODPATH-OBS acts as a postprocessor to MODPATH, so the sequence of model runs generally required is MODFLOW, MODPATH, and MODPATH-OBS. The versions of MODFLOW and MODPATH that support the version of MODPATH-OBS presented in this report are MODFLOW-2005 or MODFLOW-LGR, and MODPATH-LGR. MODFLOW-LGR is derived from MODFLOW-2005, MODPATH 5, and MODPATH 6 and supports local grid refinement. MODPATH-LGR is derived from MODPATH 5. It supports the forward and backward tracking of particles through locally refined grids and provides the output needed for MODPATH-OBS. For a single grid and no observations, MODPATH-LGR results are equivalent to MODPATH 5. MODPATH-LGR and MODPATH-OBS simulations can use nearly all of the capabilities of MODFLOW-2005 and MODFLOW-LGR; for example, simulations may be steady-state, transient, or a combination.
Though the program name MODPATH-OBS specifically refers to observations, the program can also be used to calculate model predictions of observations. MODPATH-OBS is primarily intended for use with separate programs that conduct sensitivity analysis, data needs assessment, parameter estimation, and uncertainty analysis, such as UCODE_2005 and PEST. In many circumstances, refined grids in selected parts of a model are important to simulated hydraulics, detailed inflows and outflows, or other system characteristics. MODFLOW-LGR and MODPATH-LGR support accurate local grid refinement in which both mass (flows) and energy (head) are conserved across the local grid boundary. MODPATH-OBS is designed to take advantage of these capabilities. For example, particles tracked between a pumping well and a nearby stream, which are simulated poorly if the river and well are located in a single large grid cell, can be simulated with improved accuracy using a locally refined grid in MODFLOW-LGR, MODPATH-LGR, and MODPATH-OBS. The locally-refined-grid approach can provide more accurate simulated equivalents to observed transport between the well and the river. The documentation presented here includes a brief discussion of previous work, a description of the methods, and detailed descriptions of the required input files and how the output files are typically used.
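The simplest of the simulated equivalents above, time of travel, can be sketched in a few lines: advect a particle at the seepage velocity (Darcy velocity divided by effective porosity) and report when it reaches a control location. This is a toy stand-in for MODPATH's cell-by-cell tracking, with illustrative parameter values.

```python
def travel_time(x0, x_target, darcy_velocity, porosity, dt=0.1):
    """Advective travel time to a downgradient control plane (1-D toy).
    Dividing by effective porosity converts Darcy flux to particle speed,
    which is also how retardation is approximated in MODPATH-OBS."""
    x, t = float(x0), 0.0
    v = darcy_velocity / porosity      # seepage (advective) velocity
    while x < x_target:
        x += v * dt
        t += dt
    return t

t = travel_time(x0=0.0, x_target=100.0, darcy_velocity=0.5, porosity=0.25)
print(t)  # ≈ 50 time units for a seepage velocity of 2 length-units per step
```

A calibration code such as UCODE_2005 or PEST would compare this simulated travel time against a measured one (e.g. from a tracer test) and adjust model parameters accordingly.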
DOT National Transportation Integrated Search
1973-08-01
The manual presents the complete ILSLOC computer program package. In addition to including a thorough description of the program itself and a commented listing, the manual contains a brief description of the ILS system and antenna patterns. To illust...
Column descriptions:
ID : Unique integer for each control time simulation
LABEL : Description unique to each ID (see paper)
Z : Redshift
TIMEAREA : Observer-frame control time x area at 'Z' (year-arcmin^2)
Z2 : Second redshift
TIMEVOL : Total rest-frame control time x volume between 'Z' and 'Z2' (year
Hardware acceleration and verification of systems designed with hardware description languages (HDL)
NASA Astrophysics Data System (ADS)
Wisniewski, Remigiusz; Wegrzyn, Marek
2005-02-01
Hardware description languages (HDLs) allow creating ever larger designs; the size of prototyped systems now often exceeds a million gates. As a result, the verification process for such designs takes several hours or even days. This problem can be addressed by hardware acceleration of the simulation.
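The workload being accelerated is the event-driven simulation kernel at the heart of every HDL simulator: a time-ordered event queue, signal updates, and sensitivity-triggered process evaluation. The sketch below is a didactic toy, not any real simulator's architecture; the `Simulator` class and the one-time-unit gate delay are illustrative assumptions.

```python
import heapq

class Simulator:
    """Minimal event-driven simulation kernel (delta cycles omitted)."""
    def __init__(self):
        self.queue, self.signals, self.now = [], {}, 0
        self.sensitivity = {}          # signal -> processes to re-evaluate

    def schedule(self, time, signal, value):
        heapq.heappush(self.queue, (time, signal, value))

    def run(self):
        while self.queue:
            self.now, sig, val = heapq.heappop(self.queue)
            if self.signals.get(sig) != val:   # only real transitions fire
                self.signals[sig] = val
                for proc in self.sensitivity.get(sig, []):
                    proc(self)

# A NOT gate: whenever 'a' changes, drive 'y' one time unit later.
def inverter(sim):
    sim.schedule(sim.now + 1, "y", 1 - sim.signals["a"])

sim = Simulator()
sim.sensitivity["a"] = [inverter]
sim.schedule(0, "a", 0)
sim.schedule(5, "a", 1)
sim.run()
print(sim.signals)  # {'a': 1, 'y': 0}
```

Because every signal toggle re-enters the queue, simulation time grows with design activity, which is exactly why million-gate designs motivate moving this loop into hardware.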
Reactive collisions for NO(²Π) + N(⁴S) at temperatures relevant to the hypersonic flight regime.
Denis-Alpizar, Otoniel; Bemish, Raymond J; Meuwly, Markus
2017-01-18
The NO(X²Π) + N(⁴S) reaction, which occurs entirely in the triplet manifold of N₂O, is investigated using quasiclassical trajectories and quantum simulations. Fully-dimensional potential energy surfaces for the ³A′ and ³A″ states are computed at the MRCI+Q level of theory and are represented using a reproducing kernel Hilbert space. The N-exchange and N₂-formation channels are followed using the multi-state adiabatic reactive molecular dynamics method. Up to 5000 K these reactions occur predominantly on the N₂O ³A″ surface. However, at higher temperatures the contributions of the ³A′ and ³A″ states are comparable and the final state distributions are far from thermal equilibrium. From the trajectory simulations a new set of thermal rate coefficients up to 20 000 K is determined. Comparison of the quasiclassical trajectory and quantum simulations shows that a classical description is a good approximation, as determined from the final state analysis.
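Thermal rate coefficients from quasiclassical trajectories are commonly assembled with the standard QCT formula k(T) = g(T) · ⟨v_rel⟩ · π b_max² · (N_reactive/N_total). The sketch below implements that formula; the trajectory counts, b_max, and the electronic degeneracy factor are hypothetical placeholders, not values from this paper.

```python
import numpy as np

KB = 1.380649e-23      # Boltzmann constant, J/K
AMU = 1.66053907e-27   # atomic mass unit, kg

def qct_rate(T, mu_amu, b_max_m, n_reactive, n_total, g_el=1.0):
    """Thermal rate coefficient from quasiclassical trajectory statistics:
    mean relative speed times the Monte Carlo estimate of the reactive
    cross section pi*b_max^2*(N_r/N_tot), scaled by the electronic factor."""
    mu = mu_amu * AMU
    v_mean = np.sqrt(8.0 * KB * T / (np.pi * mu))
    cross_section = np.pi * b_max_m**2 * n_reactive / n_total
    return g_el * v_mean * cross_section   # m^3/s per molecule pair

# Reduced mass of NO + N is 30*14/44 ≈ 9.5 amu; the counts are hypothetical.
k = qct_rate(T=10000.0, mu_amu=9.5, b_max_m=4e-10,
             n_reactive=120, n_total=10000)
print(f"{k:.2e} m^3/s")
```

Repeating this at a grid of temperatures up to 20 000 K is how a rate-coefficient set for hypersonic flow models is built from batches of trajectories.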
NASA Astrophysics Data System (ADS)
Nakayama, Akira; Arai, Gaku; Yamazaki, Shohei; Taketsugu, Tetsuya
2013-12-01
On-the-fly excited-state quantum mechanics/molecular mechanics molecular dynamics (QM/MM-MD) simulations of thymine in aqueous solution are performed to investigate the role of solvent water molecules in the nonradiative deactivation process. The complete active space second-order perturbation theory (CASPT2) method is employed for the thymine molecule as the QM part in order to provide a reliable description of the excited-state potential energies. It is found that, in addition to the previously reported deactivation pathway involving twisting of the C=C double bond in the pyrimidine ring, another efficient deactivation pathway, which leads to conical intersections and is accompanied by out-of-plane displacement of the carbonyl group, is observed in aqueous solution. Decay through this pathway is not observed in the gas-phase simulations, and our analysis indicates that hydrogen bonds with solvent water molecules play a key role in stabilizing the potential energies of thymine along this additional decay pathway.
A method to identify and analyze biological programs through automated reasoning
Yordanov, Boyan; Dunn, Sara-Jane; Kugler, Hillel; Smith, Austin; Martello, Graziano; Emmott, Stephen
2016-01-01
Predictive biology is elusive because rigorous, data-constrained, mechanistic models of complex biological systems are difficult to derive and validate. Current approaches tend to construct and examine either static interaction network models, which are descriptively rich but often lack explanatory and predictive power, or dynamic models that can be simulated to reproduce known behavior. However, such approaches introduce implicit assumptions, as typically only one mechanism is considered, and exhaustively investigating all scenarios by simulation is impractical. To address these limitations, we present a methodology based on automated formal reasoning, which permits the synthesis and analysis of the complete set of logical models consistent with experimental observations. We test hypotheses against all candidate models, and remove the need for simulation by characterizing and simultaneously analyzing all mechanistic explanations of observed behavior. Our methodology transforms knowledge of complex biological processes from sets of possible interactions and experimental observations to precise, predictive biological programs governing cell function. PMID:27668090
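The idea of "the complete set of logical models consistent with observations" can be made concrete on a tiny scale: enumerate every Boolean update rule for one gene over two regulators and keep those matching all observed input/output pairs. This brute-force sketch only illustrates the concept; the paper's methodology uses automated formal reasoning (constraint solving) precisely because enumeration does not scale, and the observations below are hypothetical.

```python
from itertools import product

# Hypothetical observations: (regulator states) -> gene output
observations = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1)]

def consistent_rules(observations):
    """Return every 2-input Boolean rule consistent with all observations."""
    inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
    rules = []
    for truth_table in product([0, 1], repeat=4):   # all 16 Boolean rules
        rule = dict(zip(inputs, truth_table))
        if all(rule[x] == y for x, y in observations):
            rules.append(rule)
    return rules

models = consistent_rules(observations)
print(len(models))  # 2: the unobserved input (1, 1) is unconstrained
```

Testing a hypothesis against *all* surviving rules, rather than one hand-picked mechanism, is the shift the methodology advocates.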
1030/1090 MHz Interference Simulator Technical Description and Initial Results
DOT National Transportation Integrated Search
2001-04-27
The 1030/1090 MHz Interference Simulator has been under development since March 1999, and currently replicates the interference production and operation of the existing surveillance systems and several proposed new Mode S applications. Efforts are on...
QAARM: quasi-anharmonic autoregressive model reveals molecular recognition pathways in ubiquitin
Savol, Andrej J.; Burger, Virginia M.; Agarwal, Pratul K.; Ramanathan, Arvind; Chennubhotla, Chakra S.
2011-01-01
Motivation: Molecular dynamics (MD) simulations have dramatically improved the atomistic understanding of protein motions, energetics and function. These growing datasets have necessitated a corresponding emphasis on trajectory analysis methods for characterizing simulation data, particularly since functional protein motions and transitions are often rare and/or intricate events. Observing that such events give rise to long-tailed spatial distributions, we recently developed a higher-order statistics based dimensionality reduction method, called quasi-anharmonic analysis (QAA), for identifying biophysically relevant reaction coordinates and substates within MD simulations. Further characterization of conformation space should consider the temporal dynamics specific to each identified substate. Results: Our model uses hierarchical clustering to learn energetically coherent substates and dynamic modes of motion from a 0.5 μs ubiquitin simulation. Autoregressive (AR) modeling within and between states enables a compact and generative description of the conformational landscape as it relates to functional transitions between binding poses. Because QAA lacks a predictive component, it is extended here within a general AR model that captures the trajectory's temporal dependencies and the specific, local dynamics accessible to the protein within identified energy wells. These metastable states and their transition rates are extracted within a QAA-derived subspace using hierarchical Markov clustering to provide parameter sets for the second-order AR model. We show the learned model can be extrapolated to synthesize trajectories of arbitrary length. Contact: ramanathana@ornl.gov; chakracs@pitt.edu PMID:21685101
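The second-order AR step reduces, per coordinate, to fitting x_t ≈ a₁x_{t-1} + a₂x_{t-2} by least squares and then iterating the learned recurrence to synthesize arbitrarily long trajectories. The sketch below shows that one-dimensional core; the generated series stands in for a QAA-projected MD coordinate, and the coefficients are illustrative.

```python
import numpy as np

def fit_ar2(x):
    """Least-squares AR(2) coefficients for a 1-D series."""
    X = np.column_stack([x[1:-1], x[:-2]])        # lag-1 and lag-2 predictors
    coeffs, *_ = np.linalg.lstsq(X, x[2:], rcond=None)
    return coeffs

def synthesize(coeffs, x0, x1, n, noise=0.0, rng=None):
    """Generate n samples from the AR(2) recurrence, optionally with noise."""
    rng = rng or np.random.default_rng(0)
    out = [x0, x1]
    for _ in range(n - 2):
        out.append(coeffs[0] * out[-1] + coeffs[1] * out[-2]
                   + noise * rng.standard_normal())
    return np.array(out)

rng = np.random.default_rng(1)
series = synthesize([0.8, -0.2], 0.1, 0.2, 2000, noise=0.05, rng=rng)
a1, a2 = fit_ar2(series)
print(round(a1, 1), round(a2, 1))  # ≈ 0.8, -0.2
```

In QAARM this fit is done per substate, with hierarchical Markov clustering supplying the substate labels and transition structure between the local AR models.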
Image-Based Reconstruction and Analysis of Dynamic Scenes in a Landslide Simulation Facility
NASA Astrophysics Data System (ADS)
Scaioni, M.; Crippa, J.; Longoni, L.; Papini, M.; Zanzi, L.
2017-12-01
The application of image processing and photogrammetric techniques to dynamic reconstruction of landslide simulations in a scaled-down facility is described. Simulations are also used here for active-learning purposes: they help students understand how physical processes unfold and what kinds of observations a sensor network can provide. In particular, the use of digital images to obtain multi-temporal information is presented. On the one hand, using a multi-view sensor setup based on four synchronized GoPro 4 Black® cameras, a 4D (3D spatial position and time) reconstruction of the dynamic scene is obtained through the composition of several 3D models derived from dense image matching. The final textured 4D model allows one to revisit a completed experiment at any time in a dynamic and interactive mode. On the other hand, a digital image correlation (DIC) technique has been used to track surface point displacements in the image sequence obtained from the camera in front of the simulation facility. While the 4D model may provide a qualitative description and documentation of the running experiment, the DIC analysis outputs quantitative information, such as local point displacements and velocities, to be related to physical processes and to other observations. All the hardware and software equipment adopted for the photogrammetric reconstruction has been based on low-cost and open-source solutions.
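At its core, DIC finds the displacement of a small template patch between two frames by maximizing normalized cross-correlation over a search window. The bare-bones sketch below tracks integer-pixel displacements on synthetic random textures; real DIC pipelines add subpixel refinement and subset shape functions.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a, b = a - a.mean(), b - b.mean()
    return (a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def track(img0, img1, top, left, size=15, search=5):
    """Integer (dy, dx) displacement of a patch from img0 found in img1."""
    patch = img0[top:top + size, left:left + size]
    best, best_dv = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = img1[top + dy:top + dy + size, left + dx:left + dx + size]
            score = ncc(patch, cand)
            if score > best:
                best, best_dv = score, (dy, dx)
    return best_dv

rng = np.random.default_rng(0)
img0 = rng.random((60, 60))
img1 = np.roll(img0, shift=(2, 3), axis=(0, 1))   # frame shifted by (2, 3)
print(track(img0, img1, top=20, left=20))  # (2, 3)
```

Applying this per patch and per frame pair, then dividing displacements by the frame interval, yields the local velocity fields that are related to the physical processes in the flume.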
NASA Astrophysics Data System (ADS)
DePaolo, D. J.; Steefel, C. I.; Bourg, I. C.
2013-12-01
This talk will review recent research relating to pore scale reactive transport effects done in the context of the Department of Energy-sponsored Energy Frontier Research Center led by Lawrence Berkeley National Laboratory with several other laboratory and university partners. This Center, called the Center for Nanoscale Controls on Geologic CO2 (NCGC), has focused effort on the behavior of supercritical CO2 being injected into and/or residing as capillary-trapped bubbles in sandstone and shale, with particular emphasis on the description of nanoscale to pore scale processes that could provide the basis for advanced simulations. In general, simulation of reservoir-scale behavior of CO2 sequestration assumes a number of mostly qualitative relationships that are defensible as nominal first-order descriptions of single-fluid systems, but neglect the many complications that are associated with a two-phase or three-phase reactive system. The contrasts in properties, and the mixing behavior of scCO2 and brine, provide unusual conditions for water-rock interaction, and the NCGC has investigated the underlying issues by a combination of approaches including theoretical and experimental studies of mineral nucleation and growth, experimental studies of brine films, mineral wetting properties, dissolution-precipitation rates and infiltration patterns, molecular dynamics simulations and neutron scattering experiments of fluid properties for fluids confined in nanopores, and various approaches to numerical simulation of reactive transport processes. The work to date has placed new constraints on the thickness of brine films, and also on the wetting properties of CO2 versus brine, a property that varies between minerals and with salinity, and may also change with time as a result of the reactivity of CO2-saturated brine. 
Mineral dissolution is dependent on reactive surface area, which can be shown to vary by a large factor for various minerals, especially when correlated with interconnected pore space. High-resolution numerical simulations of reactive transport can ultimately lead to quantitative descriptions of pore scale chemistry and flow, and examples of recent developments will be presented. However, only a limited description of the processes can realistically be treated in such simulations, and only for chemically simple systems. Whether and when more complete simulations will be achievable is yet to be determined.
The Use of Finite Element Analysis to Enhance Research and Clinical Practice in Orthopedics.
Pfeiffer, Ferris M
2016-02-01
Finite element analysis (FEA) is a very powerful tool for the evaluation of biomechanics in orthopedics. Finite element (FE) simulations can effectively and efficiently evaluate thousands of variables (such as implant variation, surgical techniques, and various pathologies) to optimize design, screening, prediction, and treatment in orthopedics. Additionally, FEA can be used to retrospectively evaluate and troubleshoot complications or failures to prevent similar future occurrences. Finally, FE simulations are used to evaluate implants, procedures, and techniques in a time- and cost-effective manner. In this work, an overview of the development of FE models is provided and an example application is presented to simulate knee biomechanics for a specimen with medial meniscus insufficiency. FE models require the development of the geometry of interest, determination of the material properties of the tissues simulated, and an accurate application of a numerical solver to produce an accurate solution and representation of the field variables. The objectives of this work are to introduce the reader to the application of FEA in orthopedic analysis of the knee joint. A brief description of the model development process as well as a specific application to the investigation of knee joint stability in geometries with normal or compromised medial meniscal attachment is included. Significant increases in stretch of the anterior cruciate ligament were predicted in specimens with medial meniscus insufficiency (such behavior was confirmed in corresponding biomechanical testing). It can be concluded from this work that FE analysis of the knee can provide significant new information with which more effective clinical decisions can be made. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.
Simulation supported POD for RT test case-concept and modeling
NASA Astrophysics Data System (ADS)
Gollwitzer, C.; Bellon, C.; Deresch, A.; Ewert, U.; Jaenisch, G.-R.; Zscherpel, U.; Mistral, Q.
2012-05-01
Within the framework of the European project PICASSO, the radiographic simulator aRTist (analytical Radiographic Testing inspection simulation tool) developed by BAM has been extended for reliability assessment of film and digital radiography. NDT of safety-relevant components in the aerospace industry requires proof of the probability of detection (POD) of the inspection. Modeling tools can reduce the expense of such extended, time-consuming NDT trials, if the result of the simulation fits the experiment. Our analytic simulation tool consists of three modules for the description of the radiation source, the interaction of radiation with test pieces and flaws, and the detection process, with special focus on film and digital industrial radiography. It features high processing speed with near-interactive frame rates and a high level of realism. A concept has been developed as well as a software extension for reliability investigations, completed by a user interface for planning automatic simulations with varying parameters and defects. Furthermore, an automatic image analysis procedure is included to evaluate the defect visibility. The radiographic modeling from 3D CAD of aero engine components and quality test samples is compared as a precondition for real trials. This enables the evaluation and optimization of film replacement for application of modern digital equipment for economical NDT and defined POD.
Three-Dimensional Imaging in Rhinoplasty: A Comparison of the Simulated versus Actual Result.
Persing, Sarah; Timberlake, Andrew; Madari, Sarika; Steinbacher, Derek
2018-05-22
Computer imaging has become increasingly popular for rhinoplasty. Three-dimensional (3D) analysis permits a more comprehensive view from multiple vantage points. However, the predictability and concordance between the simulated and actual result have not been morphometrically studied. The purpose of this study was to aesthetically and quantitatively compare the simulated to actual rhinoplasty result. A retrospective review of 3D images (VECTRA, Canfield) for rhinoplasty patients was performed. Images (preop, simulated, and actual) were randomized. A blinded panel of physicians rated the images (1 = poor, 5 = excellent). The image series considered "best" was also recorded. A quantitative assessment of nasolabial angle and tip projection was compared. Paired and two-sample t tests were performed for statistical analysis (P < 0.05 as significant). Forty patients were included. 67.5% of preoperative images were rated as poor (mean = 1.7). The simulation received a mean score of 2.9 (good in 60% of cases). 82.5% of actual cases were rated good to excellent (mean 3.4) (P < 0.001). Overall, the panel significantly preferred the actual postoperative result in 77.5% of cases compared to the simulation in 22.5% of cases (P < 0.001). The actual nasal tip was more projected compared to the simulations for both males and females. There was no significant difference in nasal tip rotation between simulated and postoperative groups. 3D simulation is a powerful communication and planning tool in rhinoplasty. In this study, the actual result was deemed more aesthetic than the simulated image. Surgeon experience is important to translate the plan and achieve favorable postoperative results. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266 .
Multifractal spectrum and lacunarity as measures of complexity of osseointegration.
de Souza Santos, Daniel; Dos Santos, Leonardo Cavalcanti Bezerra; de Albuquerque Tavares Carvalho, Alessandra; Leão, Jair Carneiro; Delrieux, Claudio; Stosic, Tatijana; Stosic, Borko
2016-07-01
The goal of this study is to contribute to a better quantitative description of the early stages of osseointegration, by application of fractal, multifractal, and lacunarity analysis. Fractal, multifractal, and lacunarity analysis are performed on scanning electron microscopy (SEM) images of titanium implants that were first subjected to different treatment combinations of i) sand blasting, ii) acid etching, and iii) exposure to calcium phosphate, and were then submersed in a simulated body fluid (SBF) for 30 days. All three numerical techniques are applied to the implant SEM images before and after SBF immersion, in order to provide a comprehensive set of common quantitative descriptors. It is found that implants subjected to different physicochemical treatments before submersion in SBF exhibit a rather similar level of complexity, while the great variety of crystal forms after SBF submersion reveals rather different quantitative measures (reflecting complexity), for different treatments. In particular, it is found that acid treatment, in most combinations with the other considered treatments, leads to a higher fractal dimension (more uniform distribution of crystals), lower lacunarity (less variation in gap sizes), and narrowing of the multifractal spectrum (smaller fluctuations on different scales). The current quantitative description has shown the capacity to capture the main features of complex images of implant surfaces, for several different treatments. Such quantitative description should provide a fundamental tool for future large scale systematic studies, considering the large variety of possible implant treatments and their combinations. Quantitative description of early stages of osseointegration on titanium implants with different treatments should help develop a better understanding of this phenomenon, in general, and provide basis for further systematic experimental studies. 
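Two of the descriptors used above, box-counting fractal dimension and gliding-box lacunarity, can be sketched on a binary image. The image below is synthetic, not an implant SEM micrograph, and the box sizes are illustrative choices:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def box_count_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a binary image by box counting."""
    counts = []
    for s in sizes:
        h, w = img.shape[0] // s * s, img.shape[1] // s * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append((blocks.sum(axis=(1, 3)) > 0).sum())  # occupied boxes
    # Slope of log N(s) versus log(1/s) estimates the dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

def lacunarity(img, s):
    """Gliding-box lacunarity at box size s: var/mean^2 + 1 of box masses."""
    masses = sliding_window_view(img, (s, s)).sum(axis=(2, 3)).ravel()
    m = masses.mean()
    return masses.var() / m ** 2 + 1.0 if m > 0 else np.inf

rng = np.random.default_rng(2)
dense = (rng.random((128, 128)) < 0.9).astype(int)  # nearly filled plane
print(box_count_dimension(dense))  # close to 2 for a space-filling set
print(lacunarity(dense, 8))        # near 1: homogeneous, few gaps
```

A sparser or more clustered pattern would show a lower dimension and higher lacunarity, which is the contrast the study exploits between treatments.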
Clinical practice should benefit from such studies in the long term, by more ready access to implants of higher quality.
Randles, C A; Da Silva, A M; Buchard, V; Colarco, P R; Darmenov, A; Govindaraju, R; Smirnov, A; Holben, B; Ferrare, R; Hair, J; Shinozuka, Y; Flynn, C J
2017-09-01
The Modern-Era Retrospective Analysis for Research and Applications, Version 2 (MERRA-2) updates NASA's previous satellite era (1980 - onward) reanalysis system to include additional observations and improvements to the Goddard Earth Observing System, Version 5 (GEOS-5) Earth system model. As a major step towards a full Integrated Earth Systems Analysis (IESA), in addition to meteorological observations, MERRA-2 now includes assimilation of aerosol optical depth (AOD) from various ground- and space-based remote sensing platforms. Here, in the first of a pair of studies, we document the MERRA-2 aerosol assimilation, including a description of the prognostic model (GEOS-5 coupled to the GOCART aerosol module), aerosol emissions, and the quality control of ingested observations. We provide initial validation and evaluation of the analyzed AOD fields using independent observations from ground, aircraft, and shipborne instruments. We demonstrate the positive impact of the AOD assimilation on simulated aerosols by comparing MERRA-2 aerosol fields to an identical control simulation that does not include AOD assimilation. Having shown the AOD evaluation, we take a first look at aerosol-climate interactions by examining the shortwave, clear-sky aerosol direct radiative effect. In our companion paper, we evaluate and validate available MERRA-2 aerosol properties not directly impacted by the AOD assimilation (e.g. aerosol vertical distribution and absorption). Importantly, while highlighting the skill of the MERRA-2 aerosol assimilation products, both studies point out caveats that must be considered when using this new reanalysis product for future studies of aerosols and their interactions with weather and climate.
Zgarbová, Marie; Luque, F. Javier; Šponer, Jiří; Cheatham, Thomas E.; Otyepka, Michal; Jurečka, Petr
2013-01-01
We present a refinement of the backbone torsion parameters ε and ζ of the Cornell et al. AMBER force field for DNA simulations. The new parameters, denoted as εζOL1, were derived from quantum-mechanical calculations with inclusion of conformation-dependent solvation effects according to the recently reported methodology (J. Chem. Theory Comput. 2012, 7(9), 2886-2902). The performance of the refined parameters was analyzed by means of extended molecular dynamics (MD) simulations for several representative systems. The results showed that the εζOL1 refinement improves the backbone description of B-DNA double helices and G-DNA stem. In B-DNA simulations, we observed an average increase of the helical twist and narrowing of the major groove, thus achieving better agreement with X-ray and solution NMR data. The balance between populations of BI and BII backbone substates was shifted towards the BII state, in better agreement with ensemble-refined solution experimental results. Furthermore, the refined parameters decreased the backbone RMS deviations in B-DNA MD simulations. In the antiparallel guanine quadruplex (G-DNA) the εζOL1 modification improved the description of non-canonical α/γ backbone substates, which were shown to be coupled to the ε/ζ torsion potential. Thus, the refinement is suggested as a possible alternative to the current ε/ζ torsion potential, which may enable more accurate modeling of nucleic acids. However, long-term testing is recommended before its routine application in DNA simulations. PMID:24058302
Documentation of the Benson Diesel Engine Simulation Program
NASA Technical Reports Server (NTRS)
Vangerpen, Jon
1988-01-01
This report documents the Benson Diesel Engine Simulation Program and explains how it can be used to predict the performance of diesel engines. The program was obtained from the Garrett Turbine Engine Company but has been extensively modified since. The program is a thermodynamic simulation of the diesel engine cycle which uses a single zone combustion model. It can be used to predict the effect of changes in engine design and operating parameters such as valve timing, speed and boost pressure. The most significant change made to this program is the addition of a more detailed heat transfer model to predict metal part temperatures. This report contains a description of the sub-models used in the Benson program, a description of the input parameters, and sample program runs.
Baseline process description for simulating plutonium oxide production for precalc project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pike, J. A.
Savannah River National Laboratory (SRNL) started a multi-year project, the PreCalc Project, to develop a computational simulation of a plutonium oxide (PuO2) production facility with the objective to study the fundamental relationships between morphological and physicochemical properties. This report provides a detailed baseline process description to be used by SRNL personnel and collaborators to facilitate the initial design and construction of the simulation. The PreCalc Project team selected the HB-Line Plutonium Finishing Facility as the basis for a nominal baseline process since the facility is operational and significant model validation data can be obtained. The process boundary as well as process and facility design details necessary for multi-scale, multi-physics models are provided.
Pre- and postprocessing for reservoir simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rogers, W.L.; Ingalls, L.J.; Prasad, S.J.
1991-05-01
This paper describes the functionality and underlying programming paradigms of Shell's simulator-related reservoir-engineering graphics system. This system includes the simulation postprocessing programs Reservoir Display System (RDS) and Fast Reservoir Engineering Displays (FRED), a hypertext-like on-line documentation system (DOC), and a simulator input preprocessor (SIMPLSIM). RDS creates displays of reservoir simulation results. These displays represent the areal or cross-section distribution of computed reservoir parameters, such as pressure, phase saturation, or temperature. Generation of these images at real-time animation rates is discussed. FRED facilitates the creation of plot files from reservoir simulation output. The use of dynamic memory allocation, asynchronous I/O, a table-driven screen manager, and mixed-language (FORTRAN and C) programming are detailed. DOC is used to create and access on-line documentation for the pre- and postprocessing programs and the reservoir simulators. DOC can be run by itself or can be accessed from within any other graphics or nongraphics application program. DOC includes a text editor, which is the basis for a reservoir simulation tutorial and greatly simplifies the preparation of simulator input. The use of sharable images, graphics, and the documentation file network are described. Finally, SIMPLSIM is a suite of programs that uses interactive graphics in the preparation of reservoir description data for input into reservoir simulators. The SIMPLSIM user-interface manager (UIM) and its graphic interface for reservoir description are discussed.
ERIC Educational Resources Information Center
Zuckerman, David W.; Horn, Robert E.
Simulation games are classed in this guide by subject area: business, domestic politics, economics, ecology, education, geography, history, international relations, psychology, skill development, sociology, social studies, and urban affairs. A summary description (of roles, objectives, decisions, and purposes), cost, producer, playing data (age…
Chapter 2: Fire and Fuels Extension: Model description
Sarah J. Beukema; Elizabeth D. Reinhardt; Julee A. Greenough; Donald C. E. Robinson; Werner A. Kurz
2003-01-01
The Fire and Fuels Extension to the Forest Vegetation Simulator is a model that simulates fuel dynamics and potential fire behavior over time, in the context of stand development and management. Existing models are used to represent forest stand development (the Forest Vegetation Simulator, Wykoff and others 1982), fire behavior (Rothermel 1972, Van Wagner 1977, and...
HYDRA: High Speed Simulation Architecture for Precision Spacecraft Formation Flying
NASA Technical Reports Server (NTRS)
Martin, Bryan J.; Sohl, Garett A.
2003-01-01
This viewgraph presentation describes HYDRA, which is an architecture to facilitate high-fidelity and real-time simulation of formation flying missions. The contents include: 1) Motivation; 2) Objective; 3) HYDRA-Description and Overview; 4) HYDRA-Hierarchy; 5) Communication in HYDRA; 6) Simulation Specific Concerns in HYDRA; 7) Example application (Formation Acquisition); and 8) Sample Problem Results.
Protection for the U.S. Automobile Industry: A Joint Class Simulation in Trade Policy.
ERIC Educational Resources Information Center
Hess, Peter N.; Ortmayer, Louis M.
A description of a joint class simulation in trade policy undertaken by an international economics class and a political science class at Davidson College (Pennsylvania) is presented in three sections. Section I describes the structure of the simulation. Students were divided into groups of United States auto manufacturers, the United Auto…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neilson, James R.; McQueen, Tyrel M.
2015-09-20
With the increased availability of high-intensity time-of-flight neutron and synchrotron X-ray scattering sources that can access wide ranges of momentum transfer, the pair distribution function method has become a standard analysis technique for studying disorder of local coordination spheres and at intermediate atomic separations. In some cases, rational modeling of the total scattering data (Bragg and diffuse) becomes intractable with least-squares approaches, necessitating reverse Monte Carlo simulations using large atomistic ensembles. However, the extraction of meaningful information from the resulting atomistic ensembles is challenging, especially at intermediate length scales. Representational analysis is used here to describe the displacements of atoms in reverse Monte Carlo ensembles from an ideal crystallographic structure in an approach analogous to tight-binding methods. Rewriting the displacements in terms of a local basis that is descriptive of the ideal crystallographic symmetry provides a robust approach to characterizing medium-range order (and disorder) and symmetry breaking in complex and disordered crystalline materials. Lastly, this method enables the extraction of statistically relevant displacement modes (orientation, amplitude and distribution) of the crystalline disorder and provides directly meaningful information in a locally symmetry-adapted basis set that is most descriptive of the crystal chemistry and physics.
Numerical Simulation of High-Speed Turbulent Reacting Flows
NASA Technical Reports Server (NTRS)
Givi, P.; Taulbee, D. B.; Madnia, C. K.; Jaberi, F. A.; Colucci, P. J.; Gicquel, L. Y. M.; Adumitroaie, V.; James, S.
1999-01-01
The objectives of this research are: (1) to develop and implement a new methodology for large eddy simulation (LES) of high-speed reacting turbulent flows; and (2) to develop algebraic turbulence closures for statistical description of chemically reacting turbulent flows.
Advanced EUV mask and imaging modeling
NASA Astrophysics Data System (ADS)
Evanschitzky, Peter; Erdmann, Andreas
2017-10-01
The exploration and optimization of image formation in partially coherent EUV projection systems with complex source shapes requires flexible, accurate, and efficient simulation models. This paper reviews advanced mask diffraction and imaging models for the highly accurate and fast simulation of EUV lithography systems, addressing important aspects of the current technical developments. The simulation of light diffraction from the mask employs an extended rigorous coupled wave analysis (RCWA) approach, which is optimized for EUV applications. In order to be able to deal with current EUV simulation requirements, several additional models are included in the extended RCWA approach: a field decomposition and a field stitching technique enable the simulation of larger complex structured mask areas. An EUV multilayer defect model including a database approach makes the fast and fully rigorous defect simulation and defect repair simulation possible. A hybrid mask simulation approach combining real and ideal mask parts allows the detailed investigation of the origin of different mask 3-D effects. The image computation is done with a fully vectorial Abbe-based approach. Arbitrary illumination and polarization schemes and adapted rigorous mask simulations guarantee a high accuracy. A fully vectorial sampling-free description of the pupil with Zernikes and Jones pupils and an optimized representation of the diffraction spectrum enable the computation of high-resolution images with high accuracy and short simulation times. A new pellicle model supports the simulation of arbitrary membrane stacks, pellicle distortions, and particles/defects on top of the pellicle. Finally, an extension for highly accurate anamorphic imaging simulations is included. The application of the models is demonstrated by typical use cases.
Simulated infrared spectra of triflic acid during proton dissociation.
Laflamme, Patrick; Beaudoin, Alexandre; Chapaton, Thomas; Spino, Claude; Soldera, Armand
2012-05-05
Vibrational analysis of triflic acid (TfOH) at different water uptakes was conducted. This molecule mimics the sulfonate end of the Nafion side-chain. As the proton leaves the sulfonic acid group, structural changes within the Nafion side-chain take place. They are revealed by signal shifts in the infrared spectrum. Molecular modeling is used to follow structural modifications that occur during proton dissociation. To confirm the accuracy of the proposed structures, infrared spectra were computed via quantum chemical modeling based on density functional theory. The requirement to use additional diffuse functions in the basis set is discussed. Comparison between simulated infrared spectra of one and two acid molecules with different water contents and experimental data was performed. An accurate description of infrared spectra for systems containing two TfOH molecules was obtained. Copyright © 2012 Wiley Periodicals, Inc.
Scaling and efficiency determine the irreversible evolution of a market
Baldovin, F.; Stella, A. L.
2007-01-01
In setting up a stochastic description of the time evolution of a financial index, the challenge consists in devising a model compatible with all stylized facts emerging from the analysis of financial time series and providing a reliable basis for simulating such series. Based on constraints imposed by market efficiency and on an inhomogeneous-time generalization of standard simple scaling, we propose an analytical model which accounts simultaneously for empirical results like the linear decorrelation of successive returns, the power law dependence on time of the volatility autocorrelation function, and the multiscaling associated with this dependence. In addition, our approach gives a justification and a quantitative assessment of the irreversible character of the index dynamics. This irreversibility enters as a key ingredient in a novel simulation strategy of index evolution which demonstrates the predictive potential of the model.
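One ingredient of such models — successive returns that are linearly uncorrelated while their aggregate variance follows an anomalous simple-scaling law — can be sketched as follows. The exponent value and the variance decomposition var(r_t) = t^(2D) - (t-1)^(2D) are illustrative of the general approach, not the authors' full model:

```python
import numpy as np

rng = np.random.default_rng(3)
D = 0.6           # scaling exponent; D = 0.5 recovers ordinary diffusion
n_steps = 1000
n_paths = 5000

# Increment variances chosen so the aggregate variance scales as t^(2D):
# var(sum of first t returns) telescopes to t^(2D).  Increments are
# uncorrelated (consistent with linear decorrelation of returns) but
# not identically distributed: time inhomogeneity.
t = np.arange(1, n_steps + 1)
inc_var = t ** (2 * D) - (t - 1) ** (2 * D)
returns = rng.standard_normal((n_paths, n_steps)) * np.sqrt(inc_var)
index = returns.cumsum(axis=1)

# Empirical check of the scaling law at a few horizons.
for h in (10, 100, 1000):
    print(h, index[:, h - 1].var() / h ** (2 * D))  # ratio ≈ 1 at each horizon
```

The full model additionally ties the increments together to produce volatility clustering and multiscaling; this sketch only reproduces the variance scaling.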
New developments in the McStas neutron instrument simulation package
NASA Astrophysics Data System (ADS)
Willendrup, P. K.; Knudsen, E. B.; Klinkby, E.; Nielsen, T.; Farhi, E.; Filges, U.; Lefmann, K.
2014-07-01
The McStas neutron ray-tracing software package is a versatile tool for building accurate simulators of neutron scattering instruments at reactors, short- and long-pulsed spallation sources such as the European Spallation Source. McStas is extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. McStas was founded as a scientific, open-source collaborative code in 1997. This contribution presents the project at its current state and gives an overview of the main new developments in McStas 2.0 (December 2012) and McStas 2.1 (expected fall 2013), including many new components, component parameter uniformisation, partial loss of backward compatibility, updated source brilliance descriptions, developments toward new tools and user interfaces, web interfaces and a new method for estimating beam losses and background from neutron optics.
Application of variable-gain output feedback for high-alpha control
NASA Technical Reports Server (NTRS)
Ostroff, Aaron J.
1990-01-01
A variable-gain, optimal, discrete, output feedback design approach that is applied to a nonlinear flight regime is described. The flight regime covers a wide angle-of-attack range that includes stall and post-stall. The paper includes brief descriptions of the variable-gain formulation, the discrete-control structure and flight equations used to apply the design approach, and the high performance airplane model used in the application. Both linear and nonlinear analyses are shown for a longitudinal four-model design case with angles of attack of 5, 15, 35, and 60 deg. Linear and nonlinear simulations are compared for a single-point longitudinal design at 60 deg angle of attack. Nonlinear simulations for the four-model, multi-mode, variable-gain design include a longitudinal pitch-up and pitch-down maneuver and high angle-of-attack regulation during a lateral maneuver.
NASA Technical Reports Server (NTRS)
Stock, Thomas A.
1995-01-01
Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intraply level, and the related effects of these on composite properties.
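The Monte Carlo procedure described above — propagating random constituent properties through a micromechanics relation — can be sketched with the longitudinal rule of mixtures. All distributions below are hypothetical placeholders, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000  # Monte Carlo trials

# Illustrative constituent statistics for a graphite/epoxy ply:
# fiber modulus Ef and matrix modulus Em in GPa, fiber volume ratio Vf.
# Each is sampled independently per trial.
Ef = rng.normal(230.0, 10.0, n)
Em = rng.normal(3.5, 0.2, n)
Vf = np.clip(rng.normal(0.60, 0.03, n), 0.0, 1.0)

# Rule of mixtures for the longitudinal ply modulus: E1 = Vf*Ef + (1-Vf)*Em.
E1 = Vf * Ef + (1.0 - Vf) * Em

# The scatter of E1 is the property variation induced at the micro level.
print(E1.mean(), E1.std())

# Crude sensitivity screen: correlation of each input with the response,
# analogous in spirit to the regression step mentioned in the abstract.
for name, v in (("Ef", Ef), ("Em", Em), ("Vf", Vf)):
    print(name, np.corrcoef(v, E1)[0, 1])
```

The same pattern extends to transverse moduli, strengths, and fiber misalignment by swapping in the corresponding micromechanics relations.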
Hardware description languages
NASA Technical Reports Server (NTRS)
Tucker, Jerry H.
1994-01-01
Hardware description languages are special purpose programming languages. They are primarily used to specify the behavior of digital systems and are rapidly replacing traditional digital system design techniques. This is because they allow the designer to concentrate on how the system should operate rather than on implementation details. Hardware description languages allow a digital system to be described with a wide range of abstraction, and they support top down design techniques. A key feature of any hardware description language environment is its ability to simulate the modeled system. The two most important hardware description languages are Verilog and VHDL. Verilog has been the dominant language for the design of application specific integrated circuits (ASICs). However, VHDL is rapidly gaining in popularity.
Sechopoulos, Ioannis; Ali, Elsayed S M; Badal, Andreu; Badano, Aldo; Boone, John M; Kyprianou, Iacovos S; Mainegra-Hing, Ernesto; McMillan, Kyle L; McNitt-Gray, Michael F; Rogers, D W O; Samei, Ehsan; Turner, Adam C
2015-10-01
The use of Monte Carlo simulations in diagnostic medical imaging research is widespread due to its flexibility and ability to estimate quantities that are challenging to measure empirically. However, any new Monte Carlo simulation code needs to be validated before it can be used reliably. The type and degree of validation required depends on the goals of the research project, but, typically, such validation involves either comparison of simulation results to physical measurements or to previously published results obtained with established Monte Carlo codes. The former is complicated due to nuances of experimental conditions and uncertainty, while the latter is challenging due to typical graphical presentation and lack of simulation details in previous publications. In addition, entering the field of Monte Carlo simulations in general involves a steep learning curve. It is not a simple task to learn how to program and interpret a Monte Carlo simulation, even when using one of the publicly available code packages. This Task Group report provides a common reference for benchmarking Monte Carlo simulations across a range of Monte Carlo codes and simulation scenarios. In the report, all simulation conditions are provided for six different Monte Carlo simulation cases that involve common x-ray based imaging research areas. The results obtained for the six cases using four publicly available Monte Carlo software packages are included in tabular form. In addition to a full description of all simulation conditions and results, a discussion and comparison of results among the Monte Carlo packages and the lessons learned during the compilation of these results are included. This abridged version of the report includes only an introductory description of the six cases and a brief example of the results of one of the cases. 
This work provides an investigator the necessary information to benchmark his/her Monte Carlo simulation software against the reference cases included here before performing his/her own novel research. In addition, an investigator entering the field of Monte Carlo simulations can use these descriptions and results as a self-teaching tool to ensure that he/she is able to perform a specific simulation correctly. Finally, educators can assign these cases as learning projects as part of course objectives or training programs.
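The kind of benchmark comparison the report advocates can be illustrated with a deliberately minimal Monte Carlo case: narrow-beam photon transmission through a uniform slab, for which the Beer-Lambert law gives an analytic reference value. This sketch is not one of the report's six cases; the attenuation coefficient and slab thickness below are illustrative values only.

```python
import math
import random

def mc_transmission(mu, thickness, n=200_000, seed=1):
    """Monte Carlo estimate of narrow-beam photon transmission through a slab.

    Each photon's free path is sampled from an exponential distribution with
    attenuation coefficient mu (1/cm); the photon is transmitted if its path
    exceeds the slab thickness (cm). Scatter is ignored (narrow-beam geometry).
    """
    rng = random.Random(seed)
    transmitted = sum(
        1 for _ in range(n)
        if -math.log(1.0 - rng.random()) / mu > thickness
    )
    return transmitted / n

# Benchmark against the analytic Beer-Lambert value exp(-mu * t).
mu, t = 0.2, 5.0                      # illustrative, not from the report
estimate = mc_transmission(mu, t)
exact = math.exp(-mu * t)
```

Agreement within the statistical uncertainty (here roughly ±0.003 for 2×10^5 histories) is the kind of check that the report's tabulated reference cases enable at a far more realistic level of detail.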
Rabilloud, Franck
2014-10-14
Absorption spectra of Ag20 and Ag55(q) (q = +1, -3) nanoclusters are investigated in the framework of time-dependent density functional theory in order to analyse the role of the d electrons in the plasmon-like band of silver clusters. The description of the plasmon-like band from calculations using density functionals containing an amount of Hartree-Fock exchange at long range, namely, hybrid and range-separated hybrid (RSH) density functionals, is in good agreement with the classical interpretation of the plasmon-like structure as a collective excitation of valence s-electrons. In contrast, using local or semi-local exchange functionals (generalized gradient approximations (GGAs) or meta-GGAs) leads to a strong overestimation of the role of d electrons in the plasmon-like band. The semi-local asymptotically corrected model potentials also describe the plasmon as mainly associated with d electrons, though the calculated spectra are in fairly good agreement with those calculated using the RSH scheme. Our analysis shows that a portion of non-local exchange modifies the description of the plasmon-like band.
Concurrent heterogeneous neural model simulation on real-time neuromimetic hardware.
Rast, Alexander; Galluppi, Francesco; Davies, Sergio; Plana, Luis; Patterson, Cameron; Sharp, Thomas; Lester, David; Furber, Steve
2011-11-01
Dedicated hardware is becoming increasingly essential to simulate emerging very-large-scale neural models. Equally, however, it needs to be able to support multiple models of the neural dynamics, possibly operating simultaneously within the same system. This may be necessary either to simulate large models with heterogeneous neural types, or to simplify simulation and analysis of detailed, complex models in a large simulation by isolating the new model to a small subpopulation of a larger overall network. The SpiNNaker neuromimetic chip is a dedicated neural processor able to support such heterogeneous simulations. Implementing these models on-chip uses an integrated library-based tool chain incorporating the emerging PyNN interface that allows a modeller to input a high-level description and use an automated process to generate an on-chip simulation. Simulations using both LIF and Izhikevich models demonstrate the ability of the SpiNNaker system to generate and simulate heterogeneous networks on-chip, while illustrating, through the network-scale effects of wavefront synchronisation and burst gating, methods that can provide effective behavioural abstractions for large-scale hardware modelling. SpiNNaker's asynchronous virtual architecture permits greater scope for model exploration, with scalable levels of functional and temporal abstraction, than conventional (or neuromorphic) computing platforms. The complete system illustrates a potential path to understanding the neural model of computation, by building (and breaking) neural models at various scales, connecting the blocks, then comparing them against the biology: computational cognitive neuroscience. Copyright © 2011 Elsevier Ltd. All rights reserved.
Evaluation of snowmelt simulation in the Weather Research and Forecasting model
NASA Astrophysics Data System (ADS)
Jin, Jiming; Wen, Lijuan
2012-05-01
The objective of this study is to better understand and improve snowmelt simulations in the advanced Weather Research and Forecasting (WRF) model by coupling it with the Community Land Model (CLM) Version 3.5. Both WRF and CLM are developed by the National Center for Atmospheric Research. The automated Snow Telemetry (SNOTEL) station data over the Columbia River Basin in the northwestern United States are used to evaluate snowmelt simulations generated with the coupled WRF-CLM model. These SNOTEL data include snow water equivalent (SWE), precipitation, and temperature. The simulations cover the period of March through June 2002 and focus mostly on the snowmelt season. Initial results show that when compared to observations, WRF-CLM significantly improves the simulations of SWE, which is underestimated when the release version of WRF is coupled with the Noah and Rapid Update Cycle (RUC) land surface schemes, in which snow physics is oversimplified. Further analysis shows that more realistic snow surface energy allocation in CLM is an important process that results in improved snowmelt simulations when compared to that in Noah and RUC. Additional simulations with WRF-CLM at different horizontal spatial resolutions indicate that accurate description of topography is also vital to SWE simulations. WRF-CLM at 10 km resolution produces the most realistic SWE simulations when compared to those produced with coarser spatial resolutions in which SWE is remarkably underestimated. The coupled WRF-CLM provides an important tool for research and forecasts in weather, climate, and water resources at regional scales.
Kinetic modeling of x-ray laser-driven solid Al plasmas via particle-in-cell simulation
NASA Astrophysics Data System (ADS)
Royle, R.; Sentoku, Y.; Mancini, R. C.; Paraschiv, I.; Johzaki, T.
2017-06-01
Solid-density plasmas driven by intense x-ray free-electron laser (XFEL) radiation are seeded by sources of nonthermal photoelectrons and Auger electrons that ionize and heat the target via collisions. Simulation codes that are commonly used to model such plasmas, such as collisional-radiative (CR) codes, typically assume a Maxwellian distribution and thus instantaneous thermalization of the source electrons. In this study, we present a detailed description and initial applications of a collisional particle-in-cell code, picls, that has been extended with a self-consistent radiation transport model and Monte Carlo models for photoionization and KLL Auger ionization, enabling the fully kinetic simulation of XFEL-driven plasmas. The code is used to simulate two experiments previously performed at the Linac Coherent Light Source investigating XFEL-driven solid-density Al plasmas. It is shown that picls-simulated pulse transmissions using the Ecker-Kröll continuum-lowering model agree much better with measurements than do simulations using the Stewart-Pyatt model. Good quantitative agreement is also found between the time-dependent picls results and those of analogous simulations by the CR code scfly, which was used in the analysis of the experiments to accurately reproduce the observed Kα emissions and pulse transmissions. Finally, it is shown that the effects of the nonthermal electrons are negligible for the conditions of the particular experiments under investigation.
A hybrid algorithm for coupling partial differential equation and compartment-based dynamics.
Harrison, Jonathan U; Yates, Christian A
2016-09-01
Stochastic simulation methods can be applied successfully to model exact spatio-temporally resolved reaction-diffusion systems. However, in many cases, these methods can quickly become extremely computationally intensive with increasing particle numbers. An alternative description of many of these systems can be derived in the diffusive limit as a deterministic, continuum system of partial differential equations (PDEs). Although the numerical solution of such PDEs is, in general, much more efficient than the full stochastic simulation, the deterministic continuum description is generally not valid when copy numbers are low and stochastic effects dominate. Therefore, to take advantage of the benefits of both of these types of models, each of which may be appropriate in different parts of a spatial domain, we have developed an algorithm that can be used to couple these two types of model together. This hybrid coupling algorithm uses an overlap region between the two modelling regimes. By coupling fluxes at one end of the interface and using a concentration-matching condition at the other end, we ensure that mass is appropriately transferred between PDE- and compartment-based regimes. Our methodology gives notable reductions in simulation time in comparison with using a fully stochastic model, while maintaining the important stochastic features of the system and providing detail in appropriate areas of the domain. We test our hybrid methodology robustly by applying it to several biologically motivated problems including diffusion and morphogen gradient formation. Our analysis shows that the resulting error is small, unbiased and does not grow over time. © 2016 The Authors.
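The coupling idea can be sketched in one dimension. The toy below is an assumption-laden simplification of the authors' overlap-region scheme, not their algorithm: a deterministic finite-difference region exchanges mass with binomial-jump compartments at a single interface, with every transfer subtracted on one side and added on the other so that total mass is conserved exactly. The grid spacing, rates, and the Poisson rounding of the deterministic interface flux are all illustrative choices.

```python
import numpy as np

def hybrid_step(m, c, D, h, dt, rng):
    """Advance a toy 1-D hybrid diffusion model by one time step.

    `m` holds (float) mass per PDE grid cell; `c` holds integer particle
    counts per compartment. Both regions use spacing h, so the jump rate
    between neighbouring boxes is d = D / h**2 (stability needs 2*d*dt <= 1).
    Every transfer is subtracted on one side and added on the other, so the
    total mass m.sum() + c.sum() is conserved exactly.
    """
    d = D / h**2

    # Deterministic PDE region (reflecting far-left boundary).
    lap = np.zeros_like(m)
    lap[1:-1] = m[2:] - 2.0 * m[1:-1] + m[:-2]
    lap[0] = m[1] - m[0]
    lap[-1] = m[-2] - m[-1]        # interface exchange is handled below
    m = m + d * dt * lap

    # Stochastic compartment region: draw the number of jumpers per
    # compartment first (never more than are present), then split them
    # evenly between the left and right directions.
    leave = rng.binomial(c, 2.0 * d * dt)
    to_right = rng.binomial(leave, 0.5)
    to_left = leave - to_right
    c = c - leave
    c[1:] += to_right[:-1]
    c[-1] += to_right[-1]          # reflecting far-right boundary
    c[:-1] += to_left[1:]

    # Interface coupling: compartment -> PDE jumpers enter the last PDE
    # cell; the deterministic flux out of that cell is rounded to whole
    # particles with a Poisson draw (capped by the mass available).
    m[-1] += to_left[0]
    out = min(int(rng.poisson(d * dt * m[-1])), int(m[-1]))
    m[-1] -= out
    c[0] += out
    return m, c

# Illustrative run: equal initial mass in both regions.
rng = np.random.default_rng(0)
m = np.full(8, 10.0)               # PDE region, mass per cell
c = np.full(8, 10)                 # compartment region, particle counts
for _ in range(200):
    m, c = hybrid_step(m, c, D=1.0, h=1.0, dt=0.25, rng=rng)
```

Running many steps and checking that the summed mass is unchanged, and that all counts stay non-negative, is the basic sanity test for any such coupling.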
RUIZ-RAMOS, MARGARITA; MÍNGUEZ, M. INÉS
2006-01-01
• Background Plant structural (i.e. architectural) models explicitly describe plant morphology by providing detailed descriptions of the display of leaf and stem surfaces within heterogeneous canopies and thus provide the opportunity for modelling the functioning of plant organs in their microenvironments. The outcome is a class of structural–functional crop models that combines advantages of current structural and process approaches to crop modelling. ALAMEDA is such a model. • Methods The formalism of Lindenmayer systems (L-systems) was chosen for the development of a structural model of the faba bean canopy, providing both numerical and dynamic graphical outputs. It was parameterized according to the results obtained through detailed morphological and phenological descriptions that capture the detailed geometry and topology of the crop. The analysis distinguishes between relationships of general application for all sowing dates and stem ranks and others valid only for all stems of a single crop cycle. • Results and Conclusions The results reveal that in faba bean, structural parameterization valid for the entire plant may be drawn from a single stem. ALAMEDA was formed by linking the structural model to the growth model ‘Simulation d'Allongement des Feuilles’ (SAF) with the ability to simulate approx. 3500 crop organs and components of a group of nine plants. Model performance was verified for organ length, plant height and leaf area. The L-system formalism was able to capture the complex architecture of canopy leaf area of this indeterminate crop and, with the growth relationships, generate a 3D dynamic crop simulation. Future development and improvement of the model are discussed. PMID:16390842
Piloted aircraft simulation concepts and overview
NASA Technical Reports Server (NTRS)
Sinacori, J. B.
1978-01-01
An overview of piloted aircraft simulation is presented that reflects the viewpoint of an aeronautical technologist. The intent is to acquaint potential users with some of the basic concepts and issues that characterize piloted simulation. Applications to the development of aircraft are highlighted, but some aspects of training simulators are also covered. A historical review is given together with a description of some current simulators. Simulator usages, advantages, and limitations are discussed, and human perception qualities important to simulation are described. An assessment of current simulation is presented that addresses validity, fidelity, and deficiencies. Future prospects are discussed and technology projections are made.
NASA Technical Reports Server (NTRS)
Baron, S.; Muralidharan, R.; Kleinman, D. L.
1978-01-01
The optimal control model of the human operator is used to develop closed loop models for analyzing the effects of (digital) simulator characteristics on predicted performance and/or workload. Two approaches are considered: the first utilizes a continuous approximation to the discrete simulation in conjunction with the standard optimal control model; the second involves a more exact discrete description of the simulator in a closed loop multirate simulation in which the optimal control model simulates the pilot. Both models predict that simulator characteristics can have significant effects on performance and workload.
The Greyhound Strike: Using a Labor Dispute to Teach Descriptive Statistics.
ERIC Educational Resources Information Center
Shatz, Mark A.
1985-01-01
A simulation exercise of a labor-management dispute is used to teach psychology students some of the basics of descriptive statistics. Using comparable data sets generated by the instructor, students work in small groups to develop a statistical presentation that supports their particular position in the dispute. (Author/RM)
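As a hypothetical illustration of how such an exercise works (the numbers below are invented, not from the article), a skewed distribution lets each side choose the summary statistic that favors its position:

```python
import statistics

# Invented weekly wages ($) for the simulated dispute; one senior
# driver's high wage skews the distribution to the right.
wages = [400, 410, 415, 420, 425, 430, 980]

mean_wage = statistics.mean(wages)      # management's preferred figure
median_wage = statistics.median(wages)  # the union's preferred figure
print(round(mean_wage, 2), median_wage)  # 497.14 420
```

Both figures are correct descriptive statistics; the pedagogical point is that the choice of summary shapes the argument.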
MCFire model technical description
David R. Conklin; James M. Lenihan; Dominique Bachelet; Ronald P. Neilson; John B. Kim
2016-01-01
MCFire is a computer program that simulates the occurrence and effects of wildfire on natural vegetation, as a submodel within the MC1 dynamic global vegetation model. This report is a technical description of the algorithms and parameter values used in MCFire, intended to encapsulate its design and features at a higher level that is more conceptual than the level...
An Object Description Language for Distributed Discrete Event Simulations
2001-05-24
...some tremendous improvements in simulation speed and fidelity. This dissertation describes a new programming language that is useful in creating... [Remaining text is table-of-contents residue: Chapter 8, GLUT-based user interface; output concerns; GLUT-based demonstrations.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsai, C. -Y.; Derbenev, Ya. S.; Douglas, D.
For a high-brightness electron beam with low energy and high bunch charge traversing a recirculation beamline, coherent synchrotron radiation and space charge effects may result in the microbunching instability (MBI). Both tracking simulation and Vlasov analysis for an early design of the Circulator Cooler Ring for the Jefferson Lab Electron Ion Collider reveal significant MBI. It is envisioned that these could be substantially suppressed by using a magnetized beam. In this work, we extend the existing Vlasov analysis, originally developed for a non-magnetized beam, to the description of transport of a magnetized beam including relevant collective effects. As a result, the new formulation will be further employed to confirm the prediction of microbunching suppression for a magnetized beam transport in a recirculating machine design.
NASA Astrophysics Data System (ADS)
Trochimczuk, R.
2017-02-01
This paper presents an analysis of a parallelogram mechanism commonly used to provide a kinematic remote center of motion in surgical telemanipulators. Selected types of parallel manipulator designs, encountered in commercial and laboratory-made designs described in the medical robotics literature, will serve as the research material. Among other things, computer simulations employing the finite element method in the ANSYS 13.0 CAD/CAE software environment will be used. The kinematics of the manipulator solution with the parallelogram mechanism will be determined in order to provide a more complete description. These results will form the basis for the decision regarding the possibility of applying a parallelogram mechanism in an original prototype of a telemanipulator arm.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goltz, G.; Weiner, H.
A computer program has been developed for designing and analyzing the performance of solar array/battery power systems for the U.S. Coast Guard Navigational Aids. This program is called the Design Synthesis/Performance Analysis (DSPA) Computer Program. The basic function of the Design Synthesis portion of the DSPA program is to evaluate functional and economic criteria to provide specifications for viable solar array/battery power systems. The basic function of the Performance Analysis portion of the DSPA program is to simulate the operation of solar array/battery power systems under specific loads and environmental conditions. This document provides a detailed description of the DSPA Computer Program system and its subprograms. This manual will assist the programmer in revising or updating the several subprograms.
Shuttle mission simulator baseline definition report, volume 1
NASA Technical Reports Server (NTRS)
Burke, J. F.; Small, D. E.
1973-01-01
A baseline definition of the space shuttle mission simulator is presented. The subjects discussed are: (1) physical arrangement of the complete simulator system in the appropriate facility, with a definition of the required facility modifications, (2) functional descriptions of all hardware units, including the operational features, data demands, and facility interfaces, (3) hardware features necessary to integrate the items into a baseline simulator system to include the rationale for selecting the chosen implementation, and (4) operating, maintenance, and configuration updating characteristics of the simulator hardware.
NASA Astrophysics Data System (ADS)
Babin, Volodymyr; Baucom, Jason; Darden, Thomas; Sagui, Celeste
2006-03-01
We have investigated to what extent molecular dynamics (MD) simulations can reproduce DNA sequence-specific features, given different electrostatic descriptions and different cell environments. For this purpose, we have carried out multiple unrestrained MD simulations of the duplex d(CCAACGTTGG)2. With respect to the electrostatic descriptions, two different force fields were studied: a traditional description based on atomic point charges and a polarizable force field. With respect to the cell environment, the difference between crystal and solution environments is emphasized, as well as the structural importance of divalent ions. By imposing the correct experimental unit cell environment, an initial configuration with two ideal B-DNA duplexes in the unit cell is shown to converge to the crystallographic structure. To the best of our knowledge, this provides the first example of a multiple-nanosecond MD trajectory that shows an ideal structure converging to an experimental one, with a significant decay of the RMSD.
Standalone BISON Fuel Performance Results for Watts Bar Unit 1, Cycles 1-3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clarno, Kevin T.; Pawlowski, Roger; Stimpson, Shane
2016-03-07
The Consortium for Advanced Simulation of Light Water Reactors (CASL) is moving forward with more complex multiphysics simulations and increased focus on incorporating fuel performance analysis methods. The coupled neutronics/thermal-hydraulics capabilities within the Virtual Environment for Reactor Applications Core Simulator (VERA-CS) have become relatively stable, and major advances have been made in analysis efforts, including the simulation of twelve cycles of Watts Bar Nuclear Unit 1 (WBN1) operation. While this is a major achievement, the VERA-CS approaches for treating fuel pin heat transfer have well-known limitations that could be eliminated through better integration with the BISON fuel performance code. Several approaches are being implemented to consider fuel performance, including a more direct multiway coupling with Tiamat, as well as a more loosely coupled one-way approach with standalone BISON cases. Fuel performance typically undergoes an independent analysis using a standalone fuel performance code with manually specified input defined from an independent core simulator solution or set of assumptions. This report summarizes the improvements made since the initial milestone to execute BISON from VERA-CS output. Many of these improvements were prompted through tighter collaboration with the BISON development team at Idaho National Laboratory (INL). A brief description of WBN1 and some of the VERA-CS data used to simulate it are presented. Data from a small mesh sensitivity study are shown, which helps justify the mesh parameters used in this work. The multi-cycle results are presented, followed by the results for the first three cycles of WBN1 operation, particularly the parameters of interest to pellet-clad interaction (PCI) screening (fuel-clad gap closure, maximum centerline fuel temperature, maximum/minimum clad hoop stress, and cumulative damage index). Once the mechanics of this capability are functioning, future work will target cycles with known or suspected PCI failures to determine how well they can be estimated.
Effects of Kinetic Processes in Shaping Io's Global Plasma Environment: A 3D Hybrid Model
NASA Technical Reports Server (NTRS)
Lipatov, Alexander S.; Combi, Michael R.
2004-01-01
The global dynamics of the ionized and neutral components in the environment of Io plays an important role in the interaction of Jupiter's corotating magnetospheric plasma with Io. The stationary simulation of this problem was done in the MHD and the electrodynamics approaches. One of the most significant results from the simplified two-fluid model simulations was the production of the double-peak structure in the magnetic field signature of the Io flyby that could not be explained by standard MHD models. In this paper, we develop a method of kinetic ion simulation. This method employs the fluid description for electrons and neutrals, whereas for ions multilevel, drift-kinetic and particle, approaches are used. We also take into account charge-exchange and photoionization processes. Our model provides a much more accurate description of ion dynamics and allows us to take into account the realistic anisotropic ion distribution that cannot be captured in fluid simulations. The first results of such a simulation of the dynamics of ions in Io's environment are discussed in this paper.
Tail reconnection in the global magnetospheric context: Vlasiator first results
NASA Astrophysics Data System (ADS)
Palmroth, Minna; Hoilijoki, Sanni; Juusola, Liisa; Pulkkinen, Tuija I.; Hietala, Heli; Pfau-Kempf, Yann; Ganse, Urs; von Alfthan, Sebastian; Vainio, Rami; Hesse, Michael
2017-11-01
The key dynamics of the magnetotail have been researched for decades and have been associated with either three-dimensional (3-D) plasma instabilities and/or magnetic reconnection. We apply a global hybrid-Vlasov code, Vlasiator, to simulate reconnection self-consistently in the ion kinetic scales in the noon-midnight meridional plane, including both dayside and nightside reconnection regions within the same simulation box. Our simulation represents a numerical experiment, which turns off the 3-D instabilities but models ion-scale reconnection physically accurately in 2-D. We demonstrate that many known tail dynamics are present in the simulation without a full description of 3-D instabilities or without the detailed description of the electrons. While multiple reconnection sites can coexist in the plasma sheet, one reconnection point can start a global reconfiguration process, in which magnetic field lines become detached and a plasmoid is released. As the simulation run features temporally steady solar wind input, this global reconfiguration is not associated with sudden changes in the solar wind. Further, we show that lobe density variations originating from dayside reconnection may play an important role in stabilising tail reconnection.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pham, Tuan Anh; Ogitsu, Tadashi; Lau, Edmond Y.
Establishing an accurate and predictive computational framework for the description of complex aqueous solutions is an ongoing challenge for density functional theory based first-principles molecular dynamics (FPMD) simulations. In this context, important advances have been made in recent years, including the development of sophisticated exchange-correlation functionals. On the other hand, simulations based on simple generalized gradient approximation (GGA) functionals remain an active field, particularly in the study of complex aqueous solutions, due to a good balance between the accuracy, computational expense, and the applicability to a wide range of systems. Such simulations are often performed at elevated temperatures to artificially “correct” for GGA inaccuracies in the description of liquid water; however, a detailed understanding of how the choice of temperature affects the structure and dynamics of other components, such as solvated ions, is largely unknown. In order to address this question, we carried out a series of FPMD simulations at temperatures ranging from 300 to 460 K for liquid water and three representative aqueous solutions containing solvated Na+, K+, and Cl- ions. We show that simulations at 390–400 K with the Perdew-Burke-Ernzerhof (PBE) exchange-correlation functional yield water structure and dynamics in good agreement with experiments at ambient conditions. Simultaneously, this computational setup provides ion solvation structures and ion effects on water dynamics consistent with experiments. These results suggest that an elevated temperature around 390–400 K with the PBE functional can be used for the description of structural and dynamical properties of liquid water and complex solutions with solvated ions at ambient conditions.
ERIC Educational Resources Information Center
Georgia State Univ., Atlanta.
This employee's manual is part of a position simulation for use in an office applications laboratory at the postsecondary level. The purpose of the simulation is to give the student an opportunity to learn the tasks and duties performed by a legal secretary. Contents include information about the company, a job description for a legal secretary in…
MAGIC Computer Simulation. Volume 2: Analyst Manual, Part 1
1971-05-01
A review of the subject MAGIC Computer Simulation User and Analyst Manuals has been conducted based upon a request received from the US Army... The MAGIC computer simulation generates target description data consisting of item-by-item listings of the target's components and air...
Technical evaluation report on the Flight Mechanics Panel Symposium on Flight Simulation
NASA Technical Reports Server (NTRS)
Cook, Anthony M.
1986-01-01
In recent years, important advances were made in technology both for ground-based and in-flight simulators. There was equally a broadening of the use of flight simulators for research, development, and training purposes. An up-to-date description of the state-of-the-art of technology and engineering was provided for both ground-based and in-flight simulators and their respective roles were placed in context within the aerospace scene.
NASA Technical Reports Server (NTRS)
Yanosy, James L.
1988-01-01
A Model Description Document for the Emulation Simulation Computer Model has already been published. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware, SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from testing. Slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is to create two system simulations using these models. The first system presented consists of one air and one water processing system. The second consists of a potential air revitalization system.
PCEMCAN - Probabilistic Ceramic Matrix Composites Analyzer: User's Guide, Version 1.0
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Mital, Subodh K.; Murthy, Pappu L. N.
1998-01-01
PCEMCAN (Probabilistic CEramic Matrix Composites ANalyzer) is an integrated computer code developed at NASA Lewis Research Center that simulates uncertainties associated with the constituent properties, manufacturing process, and geometric parameters of fiber reinforced ceramic matrix composites and quantifies their random thermomechanical behavior. The PCEMCAN code can perform deterministic as well as probabilistic analyses to predict thermomechanical properties. This User's Guide details the step-by-step procedure to create an input file and to update/modify the material properties database required to run the PCEMCAN computer code. An overview of the geometric conventions, micromechanical unit cell, nonlinear constitutive relationship, and probabilistic simulation methodology is also provided in the manual. Fast probability integration as well as Monte Carlo simulation methods are available for the uncertainty simulation. The various options available in the code to simulate probabilistic material properties and to quantify the sensitivity of the primitive random variables are described. Deterministic and probabilistic results are illustrated using demonstration problems. For a detailed theoretical description of the deterministic and probabilistic analyses, the user is referred to the companion documents "Computational Simulation of Continuous Fiber-Reinforced Ceramic Matrix Composite Behavior," NASA TP-3602, 1996, and "Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites," NASA TM 4766, June 1997.
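The Monte Carlo option described above can be illustrated with a minimal sketch: propagating assumed Gaussian scatter in constituent properties through a rule-of-mixtures estimate of the longitudinal modulus. The property values and scatter levels below are invented for illustration and are not taken from the PCEMCAN database.

```python
import random

def rule_of_mixtures(e_fiber, e_matrix, vf):
    """Longitudinal modulus (GPa) of a unidirectional composite."""
    return vf * e_fiber + (1.0 - vf) * e_matrix

def monte_carlo_modulus(n, seed=0):
    """Propagate assumed Gaussian constituent scatter to the composite modulus.

    Returns (mean, standard deviation) of the sampled modulus.
    """
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        e_f = rng.gauss(380.0, 19.0)   # fiber modulus, GPa (assumed 5% scatter)
        e_m = rng.gauss(300.0, 30.0)   # matrix modulus, GPa (assumed 10% scatter)
        vf = rng.gauss(0.40, 0.02)     # fiber volume fraction (assumed scatter)
        samples.append(rule_of_mixtures(e_f, e_m, vf))
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    return mean, var ** 0.5
```

A fast-probability-integration method would replace the sampling loop with an analytical approximation of the same output distribution; the sampling version above is simply the most transparent baseline.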
Bernal, M A; Bordage, M C; Brown, J M C; Davídková, M; Delage, E; El Bitar, Z; Enger, S A; Francis, Z; Guatelli, S; Ivanchenko, V N; Karamitros, M; Kyriakou, I; Maigne, L; Meylan, S; Murakami, K; Okada, S; Payno, H; Perrot, Y; Petrovic, I; Pham, Q T; Ristic-Fira, A; Sasaki, T; Štěpán, V; Tran, H N; Villagrasa, C; Incerti, S
2015-12-01
Understanding the fundamental mechanisms involved in the induction of biological damage by ionizing radiation remains a major challenge of today's radiobiology research. The Monte Carlo simulation of physical, physicochemical and chemical processes involved may provide a powerful tool for the simulation of early damage induction. The Geant4-DNA extension of the general purpose Monte Carlo Geant4 simulation toolkit aims to provide the scientific community with an open source access platform for the mechanistic simulation of such early damage. This paper presents the most recent review of the Geant4-DNA extension, as available to Geant4 users since June 2015 (release 10.2 Beta). In particular, the review includes the description of new physical models for the description of electron elastic and inelastic interactions in liquid water, as well as new examples dedicated to the simulation of physicochemical and chemical stages of water radiolysis. Several implementations of geometrical models of biological targets are presented as well, and the list of Geant4-DNA examples is described. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
LLE review. Quarterly report, January 1994--March 1994, Volume 58
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, A.
1994-07-01
This volume of the LLE Review, covering the period Jan - Mar 1994, contains articles on backlighting diagnostics; the effect of electron collisions on ion-acoustic waves and heat flow; using PIC code simulations for analysis of ultrashort laser pulses interacting with solid targets; creating a new instrument for characterizing thick cryogenic layers; and a description of a large-aperture ring amplifier for laser-fusion drivers. Three of these articles - backlighting diagnostics; characterizing thick cryogenic layers; and large-aperture ring amplifier - are directly related to the OMEGA Upgrade, now under construction. Separate abstracts have been prepared for articles from this report.
A power transformer as a source of noise.
Zawieska, Wiktor Marek
2007-01-01
This article presents selected results of analyses and simulations carried out as part of research performed at the Central Institute of Labor Protection - the National Research Institute (CIOP-PIB) in connection with the development of a system for active reduction of noise emitted by high power electricity transformers. This analysis covers the transformer as a source of noise as well as a mathematical description of the phenomenon of radiation of vibroacoustic energy through a transformer enclosure modeled as a vibrating rectangular plate. Also described is an acoustic model of the transformer in the form of an array of loudspeakers.
The Effect of Visual Information on the Manual Approach and Landing
NASA Technical Reports Server (NTRS)
Wewerinke, P. H.
1982-01-01
The effect of visual information, in combination with basic display information, on approach performance was investigated. A pre-experimental model analysis was performed in terms of the optimal control model. The resulting aircraft approach performance predictions were compared with the results of a moving base simulator program. The results illustrate that the model provides a meaningful description of the visual (scene) perception process involved in the complex (multi-variable, time-varying) manual approach task, with a useful predictive capability. The theoretical framework was shown to allow a straightforward investigation of the complex interaction of a variety of task variables.
SimPhospho: a software tool enabling confident phosphosite assignment.
Suni, Veronika; Suomi, Tomi; Tsubosaka, Tomoya; Imanishi, Susumu Y; Elo, Laura L; Corthals, Garry L
2018-03-27
Mass spectrometry combined with enrichment strategies for phosphorylated peptides has been successfully employed for two decades to identify sites of phosphorylation. However, unambiguous phosphosite assignment is considered challenging. Given that site-specific phosphorylation events function as different molecular switches, validation of phosphorylation sites is of utmost importance. In our earlier study we developed a method based on simulated phosphopeptide spectral libraries, which enables highly sensitive and accurate phosphosite assignments. To promote more widespread use of this method, we here introduce a software implementation with improved usability and performance. We present SimPhospho, a fast and user-friendly tool for accurate simulation of phosphopeptide tandem mass spectra. Simulated phosphopeptide spectral libraries are used to validate and supplement database search results, with a goal to improve reliable phosphoproteome identification and reporting. The presented program can be easily used together with the Trans-Proteomic Pipeline and integrated in a phosphoproteomics data analysis workflow. SimPhospho is available for Windows, Linux and Mac operating systems at https://sourceforge.net/projects/simphospho/. It is open source and implemented in C++. A user's manual with detailed description of data analysis using SimPhospho as well as test data can be found as supplementary material of this article. Supplementary data are available at https://www.btk.fi/research/computational-biomedicine/software/.
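The core idea of simulating phosphopeptide tandem mass spectra can be sketched at its simplest: computing singly charged b- and y-ion m/z values for a peptide, with the +79.966 Da phospho mass added at a chosen residue. This is a toy illustration only; SimPhospho's actual spectrum model (fragment intensities, neutral losses, charge states) is far richer.

```python
# Monoisotopic residue masses (Da) for a few amino acids; illustrative subset.
RESIDUE = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "T": 101.04768,
           "P": 97.05276, "V": 99.06841, "L": 113.08406, "K": 128.09496,
           "E": 129.04259, "Y": 163.06333}
WATER, PROTON, PHOSPHO = 18.01056, 1.00728, 79.96633

def fragment_mz(peptide, phospho_site=None):
    """Singly charged b- and y-ion m/z values for a peptide.

    phospho_site: optional 0-based index of one phosphorylated residue.
    Returns (b_ions, y_ions) as lists, N- to C-terminal order for b,
    C- to N-terminal order for y.
    """
    masses = [RESIDUE[aa] for aa in peptide]
    if phospho_site is not None:
        masses[phospho_site] += PHOSPHO
    b, running = [], 0.0
    for m in masses[:-1]:
        running += m
        b.append(round(running + PROTON, 4))
    y, running = [], WATER
    for m in reversed(masses[1:]):
        running += m
        y.append(round(running + PROTON, 4))
    return b, y
```

Site assignment then amounts to checking which placement of the phospho group best explains the observed fragment masses.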
Design and experiment of vehicular charger AC/DC system based on predictive control algorithm
NASA Astrophysics Data System (ADS)
He, Guangbi; Quan, Shuhai; Lu, Yuzhang
2018-06-01
For the uncontrolled rectifier stage of a vehicle charging system, this paper proposes a predictive control algorithm for the DC/DC converter. A prediction model is established by the state-space averaging method, the optimal control action is obtained from that model by mathematical calculation, and the prediction algorithm is analyzed through Simulink simulation. The structure of the vehicle charger is designed to meet the requirements of rated output power and adjustable output voltage: the first stage is a three-phase uncontrolled rectifier whose DC voltage Ud is smoothed by a filter capacitor, followed by a two-phase interleaved buck-boost circuit that provides the required wide output voltage range. The working principle of this circuit is analyzed, and its parameters and components are designed and selected. Analysis of the current ripple shows that the two-phase interleaved parallel connection reduces both the output current ripple and the losses. A software simulation of the complete charging circuit meets the design requirements of the system. Finally, the software and hardware circuits are combined to implement charging as required; an experimental platform demonstrates the feasibility and effectiveness of the proposed predictive control algorithm for the vehicle charging system, consistent with the simulation results.
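The predictive control idea can be sketched with a deliberately simplified model: a first-order averaged converter, a one-step output-voltage prediction, and a search over candidate duty cycles for the one minimizing the predicted tracking error. The model structure, time constants, and candidate grid below are assumptions for illustration, not the paper's converter.

```python
def predict(v, d, vin=48.0, dt=1e-4, tau=2e-3):
    """One-step prediction of the averaged output voltage for duty cycle d.

    First-order averaged model: the output relaxes toward d * vin with
    time constant tau, sampled every dt seconds.
    """
    return v + dt * (d * vin - v) / tau

def mpc_step(v, v_ref, candidates=None):
    """Pick the duty cycle whose one-step prediction minimizes tracking error."""
    if candidates is None:
        candidates = [i / 100.0 for i in range(0, 101)]  # 0.00 .. 1.00
    return min(candidates, key=lambda d: (predict(v, d) - v_ref) ** 2)

def run(v0, v_ref, steps=400):
    """Closed-loop simulation: controller and plant share the same model."""
    v = v0
    for _ in range(steps):
        d = mpc_step(v, v_ref)
        v = predict(v, d)
    return v
```

A real design would predict over several steps, include the inductor current state, and penalize duty-cycle changes; the one-step, voltage-only version keeps the receding-horizon idea visible.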
Structural, Physical, and Compositional Analysis of Lunar Simulants and Regolith
NASA Technical Reports Server (NTRS)
Greenberg, Paul; Street, Kenneth W.; Gaier, James
2008-01-01
Relative to the prior manned Apollo and unmanned robotic missions, planned Lunar initiatives are comparatively complex and longer in duration. Individual crew rotations are envisioned to span several months, and various surface systems must function in the Lunar environment for periods of years. As a consequence, an increased understanding of the surface environment is required to engineer and test the associated materials, components, and systems necessary to sustain human habitation and surface operations. The effort described here concerns the analysis of existing simulant materials, with application to Lunar return samples. The interplay between these analyses fulfills the objective of ascertaining the critical properties of regolith itself, and the parallel objective of developing suitable simulant materials for a variety of engineering applications. Presented here are measurements of the basic physical attributes, i.e., particle size distributions and general shape factors. Also discussed are structural and chemical properties, as determined through a variety of techniques, such as optical microscopy, SEM and TEM microscopy, Mössbauer spectroscopy, X-ray diffraction, Raman microspectroscopy, inductively coupled argon plasma emission spectroscopy, and energy dispersive X-ray fluorescence mapping. A comparative description of currently available simulant materials is discussed, with implications for more detailed analyses, as well as the requirements for continued refinement of methods for simulant production.
Integration of process diagnostics and three dimensional simulations in thermal spraying
NASA Astrophysics Data System (ADS)
Zhang, Wei
Thermal spraying is a group of processes in which metallic or ceramic materials are deposited in a molten or semi-molten state on a prepared substrate. In the atmospheric plasma spray process, a thermal plasma jet is used to heat up and accelerate the injected particles. The process is inherently complex due to the deviation from equilibrium conditions, its three-dimensional nature, the multitude of interrelated variables involved, and stochastic variability at different stages. This dissertation is aimed at understanding the in-flight particle state and plasma plume characteristics in the atmospheric plasma spray process through the integration of process diagnostics and three-dimensional simulation. Effects of injection angle and carrier gas flow rate on in-flight particle characteristics are studied experimentally and interpreted through numerical simulation. Plasma jet perturbation by particle injection angle, carrier gas, and particle loading is also identified. Maximum particle average temperature and velocity at any given spray distance are systematically quantified. The optimum plasma plume position for particle injection, observed in experiments, was verified numerically along with a description of the physical mechanisms. The correlation of spray distance with in-flight particle behavior for various kinds of materials is revealed. A new strategy for visualization and representation of particle diagnostic results for thermal spray processes is presented. Specifically, 1st-order process maps (process-particle interactions) have been addressed by converting the temperature-velocity data of particles obtained via diagnostics into non-dimensional group parameters [Melting Index-Reynolds number].
This approach provides an improved description of the thermal and kinetic energy of particles and allows for cross-comparison of diagnostic data within a given process for different materials, comparison of a single material across different thermal spray processes, and detailed assessment of the melting behavior through recourse to analysis of the distributions. An additional group parameter, Oxidation Index, has been applied to relatively track the oxidation extent of metallic particles under different operating conditions. The new mapping strategies have also been proposed in circumstances where only ensemble particle diagnostics are available. Through the integration of process diagnostics and numerical simulation, key issues concerning in-flight particle status as well as the controlling physical mechanisms have been analyzed. A scientific and intellectual strategy for universal description of particle characteristics has been successfully developed.
Pydna: a simulation and documentation tool for DNA assembly strategies using python.
Pereira, Filipa; Azevedo, Flávio; Carvalho, Ângela; Ribeiro, Gabriela F; Budde, Mark W; Johansson, Björn
2015-05-02
Recent advances in synthetic biology have provided tools to efficiently construct complex DNA molecules which are an important part of many molecular biology and biotechnology projects. The planning of such constructs has traditionally been done manually using a DNA sequence editor which becomes error-prone as scale and complexity of the construction increase. A human-readable formal description of cloning and assembly strategies, which also allows for automatic computer simulation and verification, would therefore be a valuable tool. We have developed pydna, an extensible, free and open source Python library for simulating basic molecular biology DNA unit operations such as restriction digestion, ligation, PCR, primer design, Gibson assembly and homologous recombination. A cloning strategy expressed as a pydna script provides a description that is complete, unambiguous and stable. Execution of the script automatically yields the sequence of the final molecule(s) and that of any intermediate constructs. Pydna has been designed to be understandable for biologists with limited programming skills by providing interfaces that are semantically similar to the description of molecular biology unit operations found in literature. Pydna simplifies both the planning and sharing of cloning strategies and is especially useful for complex or combinatorial DNA molecule construction. An important difference compared to existing tools with similar goals is the use of Python instead of a specifically constructed language, providing a simulation environment that is more flexible and extensible by the user.
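The kind of unit operation pydna simulates can be sketched in plain Python. The function below mimics restriction digestion of a linear sequence; pydna's actual interface and sequence objects differ, so this is only an illustration of the idea of executable cloning descriptions.

```python
def digest(sequence, site, cut_offset):
    """Cut a linear DNA sequence at every occurrence of a recognition site.

    cut_offset is the cut position within the site (e.g. EcoRI G^AATTC -> 1).
    Returns the fragments in 5'->3' order. Sticky ends are not modeled.
    """
    fragments, start, i = [], 0, 0
    while True:
        hit = sequence.find(site, i)
        if hit == -1:
            break
        cut = hit + cut_offset          # absolute cut coordinate
        fragments.append(sequence[start:cut])
        start = cut
        i = hit + 1                     # continue scanning past this site
    fragments.append(sequence[start:])
    return fragments
```

A script built from such operations is itself the "complete, unambiguous and stable" description the abstract refers to: running it regenerates every intermediate and final sequence.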
Roudi, Yasser; Latham, Peter E
2007-09-01
A fundamental problem in neuroscience is understanding how working memory--the ability to store information at intermediate timescales, like tens of seconds--is implemented in realistic neuronal networks. The most likely candidate mechanism is the attractor network, and a great deal of effort has gone toward investigating it theoretically. Yet, despite almost a quarter century of intense work, attractor networks are not fully understood. In particular, there are still two unanswered questions. First, how is it that attractor networks exhibit irregular firing, as is observed experimentally during working memory tasks? And second, how many memories can be stored under biologically realistic conditions? Here we answer both questions by studying an attractor neural network in which inhibition and excitation balance each other. Using mean-field analysis, we derive a three-variable description of attractor networks. From this description it follows that irregular firing can exist only if the number of neurons involved in a memory is large. The same mean-field analysis also shows that the number of memories that can be stored in a network scales with the number of excitatory connections, a result that has been suggested for simple models but never shown for realistic ones. Both of these predictions are verified using simulations with large networks of spiking neurons.
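The storage-and-recall behavior of attractor networks can be illustrated with a classic binary (Hopfield-style) toy model, far simpler than the balanced spiking network analyzed here: Hebbian learning stores patterns as fixed points, and a noisy cue relaxes back to the stored memory. All sizes and noise levels below are arbitrary illustrations.

```python
import random

def train(patterns):
    """Hebbian weight matrix for binary (+/-1) patterns, zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def recall(w, state, steps=5):
    """Synchronous threshold updates; at low load this settles on a memory."""
    n = len(state)
    for _ in range(steps):
        state = [1 if sum(w[i][j] * state[j] for j in range(n)) >= 0 else -1
                 for i in range(n)]
    return state

random.seed(1)
n = 60
patterns = [[random.choice((-1, 1)) for _ in range(n)] for _ in range(3)]
w = train(patterns)
noisy = patterns[0][:]
for k in range(6):                      # corrupt 10% of the cue bits
    noisy[k] = -noisy[k]
recovered = recall(w, noisy)
```

The paper's capacity result concerns a much more realistic regime (sparse memories, balanced excitation/inhibition, irregular spiking), but the attractor mechanism being analyzed is the same one this toy exhibits.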
Economic Theory and Management Games II.
ERIC Educational Resources Information Center
Zernik, Wolfgang
1988-01-01
Description of management games continues a previous article's discussion of how mathematical modeling and microeconomic concepts can be used by players. Highlights include an initial condition simulating a profit-maximizing monopoly; simulating the transition from monopoly to oligopoly; and how mathematical properties of the model affect final…
A computer program (HEVSIM) for heavy duty vehicle fuel economy and performance simulation
DOT National Transportation Integrated Search
1981-09-01
This report presents a description of a vehicle simulation program, which can determine the fuel economy and performance of a specified motor vehicle over a defined route as it executes a given driving schedule. Vehicle input accommodated by HEVSIM i...
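The kind of computation such a vehicle simulator performs can be sketched by integrating tractive power over a driving schedule. All vehicle parameters below (mass, rolling resistance, drag area, brake-specific fuel consumption) are illustrative assumptions, not HEVSIM inputs.

```python
G = 9.81  # gravitational acceleration, m/s^2

def fuel_over_cycle(speeds, mass=15000.0, dt=1.0,
                    crr=0.008, cda=6.0, rho=1.2, bsfc=5.6e-8):
    """Fuel (kg) to execute a speed schedule (m/s, sampled every dt seconds).

    Forces modeled: rolling resistance, aerodynamic drag, inertia. Only
    positive tractive power burns fuel (coasting/braking assumed fuel-free).
    bsfc is brake-specific fuel consumption in kg per joule (~200 g/kWh).
    """
    fuel = 0.0
    for k in range(1, len(speeds)):
        v = 0.5 * (speeds[k] + speeds[k - 1])       # mean interval speed
        a = (speeds[k] - speeds[k - 1]) / dt        # mean acceleration
        force = mass * G * crr + 0.5 * rho * cda * v * v + mass * a
        power = force * v
        if power > 0:
            fuel += bsfc * power * dt
    return fuel
```

A full simulator adds a gearbox, an engine map instead of a constant bsfc, and route grade, but the energy bookkeeping is the same.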
Molecular Dynamic Studies of Particle Wake Potentials in Plasmas
NASA Astrophysics Data System (ADS)
Ellis, Ian; Graziani, Frank; Glosli, James; Strozzi, David; Surh, Michael; Richards, David; Decyk, Viktor; Mori, Warren
2010-11-01
Fast Ignition studies require a detailed understanding of electron scattering, stopping, and energy deposition in plasmas with variable values for the number of particles within a Debye sphere. Presently there is disagreement in the literature concerning the proper description of these processes. Developing and validating proper descriptions requires studying the processes using first-principle electrostatic simulations and possibly including magnetic fields. We are using the particle-particle particle-mesh (P^3M) code ddcMD to perform these simulations. As a starting point in our study, we examined the wake of a particle passing through a plasma. In this poster, we compare the wake observed in 3D ddcMD simulations with that predicted by Vlasov theory and those observed in the electrostatic PIC code BEPS where the cell size was reduced to 0.03 λD.
Giner-Casares, J J; Camacho, L; Martín-Romero, M T; Cascales, J J López
2008-03-04
In this work, a DMPA Langmuir monolayer at the air/water interface was studied by molecular dynamics simulations. Thus, an atomistic picture of a Langmuir monolayer was drawn from its expanded gas phase to its final solid condensed one. In this sense, some properties of monolayers that were traditionally poorly or even not reproduced in computer simulations, such as lipid domain formation or pressure-area per lipid isotherm, were properly reproduced in this work. Thus, the physical laws that control the lipid domain formation in the gas phase and the structure of lipid monolayers from the gas to solid condensed phase were studied. Thanks to the atomistic information provided by the molecular dynamics simulations, we were able to add valuable information to the experimental description of these processes and to access experimental data related to the lipid monolayers in their expanded phase, which is difficult or inaccessible to study by experimental techniques. In this sense, properties such as lipids head hydration and lipid structure were studied.
The Simulation of Real-Time Scalable Coherent Interface
NASA Technical Reports Server (NTRS)
Li, Qiang; Grant, Terry; Grover, Radhika S.
1997-01-01
Scalable Coherent Interface (SCI, IEEE/ANSI Std 1596-1992) (SCI1, SCI2) is a high performance interconnect for shared memory multiprocessor systems. In this project we investigate an SCI Real-Time Protocol (RTSCI1) using Directed Flow Control Symbols. We studied the issues of efficient generation of control symbols and created a simulation model of the protocol on a ring-based SCI system. This report presents the results of the study. The project has been implemented using SES/Workbench. The details that follow encompass aspects of both SCI and Flow Control Protocols, as well as the effect of realistic client/server processing delay. The report is organized as follows. Section 2 provides a description of the simulation model. Section 3 describes the protocol implementation details. The next three sections of the report elaborate on the workload, results, and conclusions. Appended to the report are a description of the tool used in our simulation, SES/Workbench, and internal details of our implementation of the protocol.
Studies of Particle Wake Potentials in Plasmas
NASA Astrophysics Data System (ADS)
Ellis, Ian; Graziani, Frank; Glosli, James; Strozzi, David; Surh, Michael; Richards, David; Decyk, Viktor; Mori, Warren
2011-10-01
Fast Ignition studies require a detailed understanding of electron scattering, stopping, and energy deposition in plasmas with variable values for the number of particles within a Debye sphere. Presently there is disagreement in the literature concerning the proper description of these processes. Developing and validating proper descriptions requires studying the processes using first-principle electrostatic simulations and possibly including magnetic fields. We are using the particle-particle particle-mesh (PPPM) code ddcMD and the particle-in-cell (PIC) code BEPS to perform these simulations. As a starting point in our study, we examine the wake of a particle passing through a plasma in 3D electrostatic simulations performed with ddcMD and with BEPS using various cell sizes. In this poster, we compare the wakes we observe in these simulations with each other and predictions from Vlasov theory. Prepared by LLNL under Contract DE-AC52-07NA27344 and by UCLA under Grant DE-FG52-09NA29552.
Materials science. Modeling strain hardening the hard way.
Gumbsch, Peter
2003-09-26
The plastic deformation of metals results in strain hardening, that is, an increase in stress with increasing strain. Materials engineers can provide a simple approximate description of such deformation and hardening behavior. In his Perspective, Gumbsch discusses work by Madec et al., who have undertaken the formidable task of computing the physical basis for the development of strain hardening by individually following the fate of all the dislocations involved. Their simulations show that the collinear dislocation interaction makes a substantial contribution to strain hardening. It is likely that such simulations will play an important role in guiding the development of future engineering descriptions of deformation and hardening.
Heat recovery and seed recovery development project: preliminary design report (PDR)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arkett, A. H.; Alexander, K. C.; Bolek, A. D.
1981-06-01
The preliminary design and performance characteristics of the 20 MWt heat recovery and seed recovery (HRSR) system are described; the system is to be fabricated, installed, and evaluated to provide a technological basis for the design of commercial-size HRSR systems for coal-fired open-cycle MHD power plants. The system description and heat and material balances, equipment description and functional requirements, controls, interfacing systems, and operation and maintenance are detailed. Appendices include: (1) recommended environmental requirements for compliance with federal and state of Tennessee regulations, (2) channel and diffuser simulator, (3) equipment arrangement drawings, and (4) channel and diffuser simulator barrel drawings. (WHK)
An Efficient Functional Test Generation Method For Processors Using Genetic Algorithms
NASA Astrophysics Data System (ADS)
Hudec, Ján; Gramatová, Elena
2015-07-01
The paper presents a new functional test generation method for processor testing based on genetic algorithms and evolutionary strategies. The tests are generated over an instruction set architecture and a processor description. Such functional tests belong to software-oriented testing. The quality of the tests is evaluated by code coverage of the processor description using simulation. The presented test generation method uses VHDL models of processors and the professional simulator ModelSim. Rules, parameters, and fitness functions were defined for the various genetic algorithms used in automatic test generation. Functionality and effectiveness were evaluated using the RISC-type processor DP32.
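The generation loop of such a method can be sketched with a toy genetic algorithm. Here a simple stand-in function replaces the simulator-measured coverage of the processor description, and the instruction mnemonics are hypothetical; only the evolutionary structure (selection, crossover, mutation, fitness) mirrors the method described.

```python
import random

INSTRUCTIONS = ["ADD", "SUB", "AND", "OR", "LOAD", "STORE", "BRANCH", "NOP"]

def coverage(program):
    """Stand-in fitness: distinct instructions plus distinct adjacent pairs.

    In the real method this value would come from simulating the VHDL
    processor description and measuring code coverage.
    """
    pairs = {(program[i], program[i + 1]) for i in range(len(program) - 1)}
    return len(set(program)) + len(pairs)

def evolve(length=12, pop_size=30, generations=60, seed=3):
    """Evolve instruction sequences toward maximal coverage."""
    rng = random.Random(seed)
    pop = [[rng.choice(INSTRUCTIONS) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=coverage, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, length)        # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:                # occasional point mutation
                child[rng.randrange(length)] = rng.choice(INSTRUCTIONS)
            children.append(child)
        pop = survivors + children
    return max(pop, key=coverage)

best = evolve()
```

Because the fittest individuals always survive, the best coverage found never decreases across generations, which is the property that makes such loops practical for test generation.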
Methodology for automating software systems
NASA Technical Reports Server (NTRS)
Moseley, Warren
1990-01-01
Applying ITS technology to shuttle diagnostics would not require the rigor of the Petri net representation; however, a homogeneous and consistent underlying knowledge representation is important for providing the animated, simulated portion of the interface and for meeting the demands placed on the system to support the training aspects. By keeping the diagnostic rule base, the hardware description, the software description, user profiles, desired behavioral knowledge, and the user interface in the same notation, it is possible to reason about all of the properties of Petri nets on any selected portion of the simulation. This reasoning provides a foundation for utilizing intelligent tutoring systems technology.
NASA Technical Reports Server (NTRS)
Newman, Dava J.
1995-01-01
Simulations of astronaut motions during extravehicular activity (EVA) tasks were performed using computational multibody dynamics methods. The application of computational dynamic simulation to EVA was prompted by the realization that physical microgravity simulators have inherent limitations: viscosity in neutral buoyancy tanks; friction in air bearing floors; short duration for parabolic aircraft; and inertia and friction in suspension mechanisms. These limitations can mask critical dynamic effects that later cause problems during actual EVA's performed in space. Methods of formulating dynamic equations of motion for multibody systems are discussed with emphasis on Kane's method, which forms the basis of the simulations presented herein. Formulation of the equations of motion for a two degree of freedom arm is presented as an explicit example. The four basic steps in creating the computational simulations were: system description, in which the geometry, mass properties, and interconnection of system bodies are input to the computer; equation formulation based on the system description; inverse kinematics, in which the angles, velocities, and accelerations of joints are calculated for prescribed motion of the endpoint (hand) of the arm; and inverse dynamics, in which joint torques are calculated for a prescribed motion. A graphical animation and data plotting program, EVADS (EVA Dynamics Simulation), was developed and used to analyze the results of the simulations that were performed on a Silicon Graphics Indigo2 computer. EVA tasks involving manipulation of the Spartan 204 free flying astronomy payload, as performed during Space Shuttle mission STS-63 (February 1995), served as the subject for two dynamic simulations. An EVA crewmember was modeled as a seven segment system with an eighth segment representing the massive payload attached to the hand. 
For both simulations, the initial configuration of the lower body (trunk, upper leg, and lower leg) was a neutral microgravity posture. In the first simulation, the payload was manipulated around a circular trajectory of 0.15 m radius in 10 seconds. It was found that the wrist joint theoretically exceeded its ulnar deviation limit by as much as 49.8 deg and was required to exert torques as high as 26 N-m to accomplish the task, well in excess of the wrist physiological limit of 12 N-m. The largest torque in the first simulation, 52 N-m, occurred in the ankle joint. To avoid these problems, the second simulation placed the arm in a more comfortable initial position, and the radius and speed of the circular trajectory were reduced by half. As a result, the joint angles and torques were reduced to values well within their physiological limits. In particular, the maximum wrist torque for the second simulation was only 3 N-m and the maximum ankle torque was only 6 N-m.
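The inverse-kinematics step described above (computing joint angles for a prescribed hand position) can be sketched for a planar two-link arm. The link lengths below are illustrative, not the EVA model's segment lengths, and a seven-segment astronaut model adds redundancy that this minimal case does not have.

```python
from math import acos, atan2, cos, sin

def two_link_ik(x, y, l1=0.33, l2=0.30):
    """Elbow-down joint angles (rad) placing the endpoint of a planar
    two-link arm at (x, y). Raises ValueError if the target is unreachable."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)   # law of cosines
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    t2 = acos(c2)                                     # elbow angle
    t1 = atan2(y, x) - atan2(l2 * sin(t2), l1 + l2 * cos(t2))  # shoulder
    return t1, t2

def forward(t1, t2, l1=0.33, l2=0.30):
    """Forward kinematics: endpoint position for given joint angles."""
    return (l1 * cos(t1) + l2 * cos(t1 + t2),
            l1 * sin(t1) + l2 * sin(t1 + t2))
```

Differentiating the joint-angle trajectory obtained this way along the prescribed endpoint path yields the joint velocities and accelerations that the inverse-dynamics step then converts to torques.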
NASA Technical Reports Server (NTRS)
Slafer, Loren I.
1989-01-01
Realtime simulation and hardware-in-the-loop testing is being used extensively in all phases of the design, development, and testing of the attitude control system (ACS) for the new Hughes HS601 satellite bus. Realtime, hardware-in-the-loop simulation, integrated with traditional analysis and pure simulation activities, is shown to provide a highly efficient and productive overall development program. Implementation of high fidelity simulations of the satellite dynamics and control system algorithms, capable of real-time execution (using Applied Dynamics International's System 100), provides a tool which is capable of being integrated with the critical flight microprocessor to create a mixed simulation test (MST). The MST creates a highly accurate, detailed simulated on-orbit test environment, capable of open and closed loop ACS testing, in which the ACS design can be validated. The MST is shown to provide a valuable extension of traditional test methods. A description of the MST configuration is presented, including the spacecraft dynamics simulation model, sensor and actuator emulators, and the test support system. Overall system performance parameters are presented. MST applications are discussed: supporting ACS design, developing on-orbit system performance predictions, flight software development and qualification testing (augmenting the traditional software-based testing), mission planning, and a cost-effective subsystem-level acceptance test. The MST is shown to provide an ideal tool with which the ACS designer can fly the spacecraft on the ground.
NASA Technical Reports Server (NTRS)
Ackermann, M.; Ajello, M.; Albert, A.; Allafort, A.; Atwood, W. B.; Axelsson, M.; Baldini, L.; Ballet, J.; Barbiellini, G.; Bastieri, D.;
2012-01-01
The Fermi Large Area Telescope (Fermi-LAT, hereafter LAT), the primary instrument on the Fermi Gamma-ray Space Telescope (Fermi) mission, is an imaging, wide field-of-view, high-energy gamma-ray telescope, covering the energy range from 20 MeV to more than 300 GeV. During the first years of the mission the LAT team has gained considerable insight into the in-flight performance of the instrument. Accordingly, we have updated the analysis used to reduce LAT data for public release as well as the Instrument Response Functions (IRFs), the description of the instrument performance provided for data analysis. In this paper we describe the effects that motivated these updates. Furthermore, we discuss how we originally derived IRFs from Monte Carlo simulations and later corrected those IRFs for discrepancies observed between flight and simulated data. We also give details of the validations performed using flight data and quantify the residual uncertainties in the IRFs. Finally, we describe techniques the LAT team has developed to propagate those uncertainties into estimates of the systematic errors on common measurements such as fluxes and spectra of astrophysical sources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olsen, R.J.; Westley, G.W.; Herzog, H.W. Jr.
This report documents the development of MULTIREGION, a computer model of regional and interregional socio-economic development. The MULTIREGION model interprets the economy of each BEA economic area as a labor market, measures all activity in terms of people as members of the population (labor supply) or as employees (labor demand), and simultaneously simulates or forecasts the demands and supplies of labor in all BEA economic areas at five-year intervals. In general the outputs of MULTIREGION are intended to resemble those of the Water Resource Council's OBERS projections and to be put to similar planning and analysis purposes. This report has been written at two levels to serve the needs of multiple audiences. The body of the report serves as a fairly nontechnical overview of the entire MULTIREGION project; a series of technical appendixes provide detailed descriptions of the background empirical studies of births, deaths, migration, labor force participation, natural resource employment, manufacturing employment location, and local service employment used to construct the model.
Uncertainty in Ecohydrological Modeling in an Arid Region Determined with Bayesian Methods
Yang, Junjun; He, Zhibin; Du, Jun; Chen, Longfei; Zhu, Xi
2016-01-01
In arid regions, water resources are a key forcing factor in ecosystem circulation, and soil moisture is the critical link that constrains plant and animal life on the soil surface and underground. Simulation of soil moisture in arid ecosystems is inherently difficult due to high variability. We assessed the applicability of the process-oriented CoupModel for forecasting of soil water relations in arid regions. We used vertical soil moisture profiling for model calibration. We determined that model-structural uncertainty constituted the largest error; the model did not capture the extremes of low soil moisture in the desert-oasis ecotone (DOE), particularly below 40 cm soil depth. Our results showed that total uncertainty in soil moisture prediction was improved when input and output data, parameter value array, and structure errors were characterized explicitly. Bayesian analysis was applied with prior information to reduce uncertainty. The need to provide independent descriptions of uncertainty analysis (UA) in the input and output data was demonstrated. Application of soil moisture simulation in arid regions will be useful for dune-stabilization and revegetation efforts in the DOE. PMID:26963523
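The Bayesian reduction of parameter uncertainty described above can be sketched with a toy drydown model: an exponential soil-moisture recession whose decay rate is inferred from noisy observations via a grid posterior. CoupModel itself is process-based and far more detailed; every model form, parameter value, and noise level below is an illustrative assumption.

```python
import math
import random

def drydown(t, k, theta0=0.30, theta_r=0.05):
    """Exponential drydown of volumetric soil moisture from theta0 toward theta_r."""
    return theta_r + (theta0 - theta_r) * math.exp(-k * t)

def grid_posterior(times, obs, ks, sigma=0.01):
    """Normalized posterior over decay rates: Gaussian likelihood, flat prior.

    Log-likelihoods are shifted by their maximum before exponentiating
    to avoid underflow.
    """
    logliks = [sum(-0.5 * ((o - drydown(t, k)) / sigma) ** 2
                   for t, o in zip(times, obs)) for k in ks]
    m = max(logliks)
    w = [math.exp(l - m) for l in logliks]
    z = sum(w)
    return [x / z for x in w]

random.seed(0)
true_k = 0.12
times = list(range(0, 30, 3))                       # observation days
obs = [drydown(t, true_k) + random.gauss(0.0, 0.01) for t in times]
ks = [i / 1000.0 for i in range(50, 250)]           # candidate decay rates
post = grid_posterior(times, obs, ks)
k_map = ks[post.index(max(post))]                   # posterior mode
```

Separating the error sources the paper discusses (input, output, parameter, structure) amounts to giving each its own term in the likelihood rather than lumping everything into a single sigma as done here.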