Implementing a high-fidelity simulation program in a community college setting.
Tuoriniemi, Pamela; Schott-Baer, Darlene
2008-01-01
Despite their relatively high cost, there is heightened interest among faculty in undergraduate nursing programs in implementing high-fidelity simulation (HFS) programs. High-fidelity simulators are appealing because they allow students to experience high-risk, low-volume patient problems in a realistic setting. The decision to purchase a simulator is only the first step in implementing and maintaining an HFS lab; knowledge, technical skill, commitment, and considerable time are needed to develop a successful program. The process, as experienced by one community college nursing program, is described.
NASA Astrophysics Data System (ADS)
Michalik, Peter; Mital, Dusan; Zajac, Jozef; Brezikova, Katarina; Duplak, Jan; Hatala, Michal; Radchenko, Svetlana
2016-10-01
The article addresses the use of intelligent relays and PLC systems in practice, their architecture, and the principles of their programming and simulation for the educational process at all types of schools, from secondary schools to universities. The aim of the article is to propose simple application examples that demonstrate a programming methodology on real, simple practical examples and show the use of selected instructions. The practical part describes the process of creating schematics and describing function blocks, along with methodologies for creating programs and simulating output reactions to changing inputs for intelligent relays.
Program For Simulation Of Trajectories And Events
NASA Technical Reports Server (NTRS)
Gottlieb, Robert G.
1992-01-01
The Universal Simulation Executive (USE) program accelerates and eases the generation of application programs for the numerical simulation of continuous trajectories interrupted by, or containing, discrete events. It was developed for the simulation of multiple spacecraft trajectories with such events as one spacecraft crossing the equator, two spacecraft meeting or parting, or a rocket engine firing. USE has also been used to simulate the operation of a chemical batch-processing factory. Written in Ada.
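The event-handling idea in USE, integrating a continuous trajectory and stopping when a discrete condition fires, can be sketched compactly. USE itself is written in Ada and its interfaces are not given in this abstract; the following Python sketch uses SciPy's event detection during integration to flag ascending equator crossings of an illustrative circular orbit (all orbital values are invented for the example).

```python
# Minimal sketch of continuous integration interrupted by a discrete event,
# in the spirit of the USE executive (illustrative only; USE itself is Ada).
import numpy as np
from scipy.integrate import solve_ivp

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def two_body(t, y):
    """Point-mass two-body dynamics; y = [rx, ry, rz, vx, vy, vz]."""
    r = y[:3]
    return np.concatenate([y[3:], -MU * r / np.linalg.norm(r) ** 3])

def equator_crossing(t, y):
    """Event function: fires when the z-coordinate passes through zero."""
    return y[2]
equator_crossing.direction = 1  # ascending crossings only

# Circular orbit at 6778 km radius, 51.6 deg inclination, started at the
# northernmost point of the orbit (values invented for the example).
r0 = 6778.0
v0 = np.sqrt(MU / r0)
inc = np.radians(51.6)
y0 = [0.0, r0 * np.cos(inc), r0 * np.sin(inc), -v0, 0.0, 0.0]

sol = solve_ivp(two_body, (0.0, 6 * 3600.0), y0,
                events=equator_crossing, rtol=1e-9)
print("ascending equator crossings at t =", sol.t_events[0], "s")
```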
A fortran program for Monte Carlo simulation of oil-field discovery sequences
Bohling, Geoffrey C.; Davis, J.C.
1993-01-01
We have developed a program for performing Monte Carlo simulation of oil-field discovery histories. A synthetic parent population of fields is generated as a finite sample from a distribution of specified form. The discovery sequence then is simulated by sampling without replacement from this parent population in accordance with a probabilistic discovery process model. The program computes a chi-squared deviation between synthetic and actual discovery sequences as a function of the parameters of the discovery process model, the number of fields in the parent population, and the distributional parameters of the parent population. The program employs the three-parameter log gamma model for the distribution of field sizes and employs a two-parameter discovery process model, allowing the simulation of a wide range of scenarios.
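As a rough illustration of the discovery-process model described above, size-biased sampling without replacement from a synthetic parent population scored by a chi-squared deviation, here is a hedged Python sketch. The paper's program is Fortran and uses a three-parameter log-gamma size distribution; a lognormal and the exponent beta are substituted here as assumptions.

```python
# Sketch of a discovery-process Monte Carlo. The paper's program is Fortran
# and uses a three-parameter log-gamma field-size distribution; a lognormal
# is substituted here, and all numbers below are invented.
import numpy as np

rng = np.random.default_rng(42)

def simulate_discovery(n_fields, mu, sigma, beta, n_discoveries):
    """Size-biased sampling without replacement: P(field) ~ size**beta."""
    remaining = rng.lognormal(mu, sigma, n_fields)  # synthetic parent population
    sequence = []
    for _ in range(n_discoveries):
        p = remaining ** beta
        k = rng.choice(remaining.size, p=p / p.sum())
        sequence.append(remaining[k])
        remaining = np.delete(remaining, k)
    return np.array(sequence)

def chi_squared(simulated, actual):
    """Deviation between synthetic and actual discovery sequences."""
    return np.sum((simulated - actual) ** 2 / simulated)

actual = np.array([90.0, 40.0, 55.0, 12.0, 18.0, 6.0])  # hypothetical history
sim = simulate_discovery(n_fields=200, mu=1.5, sigma=1.2,
                         beta=1.0, n_discoveries=actual.size)
print("chi-squared deviation:", chi_squared(sim, actual))
```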
Lee, Young Han
2012-01-01
The objectives are (1) to introduce an easy open-source macro program as connection software and (2) to illustrate its practical use in a radiologic reading environment by simulating the radiologic reading process. The simulation is a set of radiologic reading processes for performing practical tasks in the radiologic reading room. The principal processes are: (1) viewing radiologic images on the Picture Archiving and Communicating System (PACS), (2) connecting to the HIS/EMR (Hospital Information System/Electronic Medical Record) system, (3) building an automatic radiologic reporting system, and (4) recording and recalling information on interesting cases. This simulation environment was designed using an open-source macro program as connection software. The simulation performed well on the Windows-based PACS workstation. Radiologists practiced the steps of the simulation comfortably by utilizing the macro-powered radiologic environment. This macro program could automate several cumbersome manual steps in the radiologic reading process. The program successfully acts as connection software for the PACS software, EMR/HIS, spreadsheets, and various other input devices in the radiologic reading environment. A user-friendly, efficient radiologic reading environment can be established by utilizing an open-source macro program as connection software.
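The abstract does not name the macro package, so the following Python sketch is only a hypothetical stand-in for such "connection software," using the open-source pyautogui and pyperclip libraries to shuttle text between a PACS viewer and a reporting window. All coordinates, hotkeys, and field layouts are invented placeholders.

```python
# Hypothetical macro-style "connection software" sketch. pyautogui and
# pyperclip are real open-source libraries, but every coordinate, hotkey,
# and screen layout below is an invented placeholder.
import pyautogui
import pyperclip

def copy_accession_number_from_pacs():
    """Copy the accession-number field from the PACS viewer window."""
    pyautogui.click(250, 120)           # placeholder field position
    pyautogui.hotkey('ctrl', 'a')       # select the field contents
    pyautogui.hotkey('ctrl', 'c')       # copy to clipboard
    return pyperclip.paste().strip()

def paste_report_template(text):
    """Type a canned report template into the reporting window."""
    pyautogui.click(900, 400)           # placeholder editor position
    pyautogui.typewrite(text, interval=0.01)

accession = copy_accession_number_from_pacs()
paste_report_template(f"Accession {accession}: no acute abnormality. ")
```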
User's guide to resin infusion simulation program in the FORTRAN language
NASA Technical Reports Server (NTRS)
Weideman, Mark H.; Hammond, Vince H.; Loos, Alfred C.
1992-01-01
RTMCL is a user-friendly computer code which simulates the manufacture of fabric composites by the resin infusion process. The computer code is based on the process simulation model described in reference 1. Included in the user's guide is a detailed step-by-step description of how to run the program and how to enter and modify the input data set. Sample input and output files are included, along with an explanation of the results. Finally, a complete listing of the program is provided.
How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study.
Holter, Marianne T S; Johansen, Ayna; Brendryen, Håvar
2016-06-28
eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist's support of a working alliance, internalization of motivation, and managing lapses. We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several "counseling sessions" about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. The program supports the user's working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective.
Spacecraft orbit/earth scan derivations, associated APL program, and application to IMP-6
NASA Technical Reports Server (NTRS)
Smith, G. A.
1971-01-01
The derivation of a time-shared, remote-site, demand-processed computer program is discussed. The computer program analyzes the effects of selected orbit, attitude, and spacecraft parameters on earth sensor detections of the earth. For prelaunch analysis, the program may be used to simulate the effects of the nominal parameters used in preparing attitude data processing programs. After launch, comparison of results from a simulation with results from satellite data will produce deviations helpful in isolating problems.
Simulation of mass storage systems operating in a large data processing facility
NASA Technical Reports Server (NTRS)
Holmes, R.
1972-01-01
A mass storage simulation program was written to aid system designers in the design of a data processing facility. It acts as a tool for measuring the overall effect on the facility of on-line mass storage systems, and it provides the means of measuring and comparing the performance of competing mass storage systems. The performance of the simulation program is demonstrated.
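A mass-storage comparison of this kind reduces, at its simplest, to a queueing simulation: generate request arrivals, serve them at each device's speed, and compare latencies. The sketch below is a minimal single-server version in Python with invented device parameters; the original program's workload and device models were far richer.

```python
# Minimal single-server queueing sketch for comparing storage devices by
# mean request latency (device parameters are invented).
import random

def mean_latency(service_time_s, arrival_rate_hz, n_requests, seed=1):
    random.seed(seed)
    t = free_at = total = 0.0
    for _ in range(n_requests):
        t += random.expovariate(arrival_rate_hz)  # next request arrives
        start = max(t, free_at)                   # queue if device is busy
        free_at = start + service_time_s          # device busy until done
        total += free_at - t                      # waiting + service time
    return total / n_requests

for name, svc in [("device A", 0.030), ("device B", 0.018)]:
    print(name, "mean latency (s):",
          round(mean_latency(svc, arrival_rate_hz=20, n_requests=100_000), 4))
```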
NASA Technical Reports Server (NTRS)
1981-01-01
The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the Advanced System for Process Engineering (ASPEN) computer program were selected from available steady state and dynamic models. The MPPM was selected to serve as the basis for development of the system-level design model structure because it provided the capability for process-block material and energy balance and high-level system sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While the ASPEN physical property calculation routines are capable of generating the physical properties required for process simulation, not all required physical property data are available; missing data must be user-entered.
Taplay, Karyn; Jack, Susan M; Baxter, Pamela; Eva, Kevin; Martin, Lynn
2015-01-01
The aim of this study is to explain the process of adopting and incorporating simulation as a teaching strategy in undergraduate nursing programs, define uptake, and discuss potential outcomes. In many countries, simulation is increasingly adopted as a common teaching strategy; however, there is a dearth of knowledge related to the process of adoption and incorporation. We used an interpretive, constructivist approach to grounded theory to guide this research study. We conducted the study in Ontario, Canada, during 2011-2012. Multiple data sources informed the development of this theory, including in-depth interviews (n = 43) and a review of key organizational documents, such as mission and vision statements (n = 67), from multiple nursing programs (n = 13). The adoption and uptake of mid- to high-fidelity simulation equipment is a multistep iterative process involving various organizational levels within the institution and entails a seven-phase process: (a) securing resources, (b) nursing leaders working in tandem, (c) getting it out of the box, (d) learning about simulation and its potential for teaching, (e) finding a fit, (f) trialing the equipment, and (g) integrating into the curriculum. These findings could assist nursing programs in Canada and internationally that wish to adopt or further incorporate simulation into their curricula, and they highlight potential organizational and program-level outcomes.
USERS MANUAL FOR HYDROLOGICAL SIMULATION PROGRAM - FORTRAN (HSPF)
The Hydrological Simulation Program--Fortran (HSPF) is a set of computer codes that can simulate the hydrologic, and associated water quality, processes on pervious and impervious land surfaces and in streams and well-mixed impoundments. The manual discusses the modular structure...
Meaningful Use of Simulation as an Educational Method in Nursing Programs
ERIC Educational Resources Information Center
Thompson, Teri L.
2011-01-01
The purpose of this descriptive study was to examine the use of simulation technology within nursing programs leading to licensure as registered nurses. In preparation for this study, the Use of Simulation Technology Inventory (USTI) was developed, based on the structure, processes, outcomes model and the current literature on simulation. The…
A general software reliability process simulation technique
NASA Technical Reports Server (NTRS)
Tausworthe, Robert C.
1991-01-01
The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.
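One way to picture such simulated status timelines: anomalies are injected while work products are produced, then discovered during test at a rate proportional to the number still latent. The Python sketch below generates one such timeline under invented rate parameters; it illustrates the idea, not the report's actual process model.

```python
# One simulated reliability timeline: anomalies injected during development,
# discovered during test in proportion to those still latent. All rates are
# invented; the report's process model is richer than this.
import numpy as np

rng = np.random.default_rng(0)
weeks, dev_weeks = 52, 26
inject_rate = 8.0   # mean anomalies injected per development week
find_prob = 0.08    # weekly per-anomaly detection probability in test

latent, cumulative_found = 0, []
for week in range(weeks):
    if week < dev_weeks:
        latent += rng.poisson(inject_rate)     # injection while coding
    found = rng.binomial(latent, find_prob)    # detection during test
    latent -= found
    cumulative_found.append((cumulative_found[-1] if cumulative_found else 0) + found)

print("cumulative anomalies found at weeks 0/13/26/39:", cumulative_found[::13])
```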
Current status of endoscopic simulation in gastroenterology fellowship training programs.
Jirapinyo, Pichamol; Thompson, Christopher C
2015-07-01
Recent guidelines have encouraged gastroenterology and surgical training programs to integrate simulation into their core endoscopic curricula. However, the role that simulation currently has within training programs is unknown. This study aims to assess the current status of simulation among gastroenterology fellowship programs. This questionnaire study consisted of 38 fields divided into two sections. The first section queried program directors' experience with simulation and assessed the current status of simulation at their institution. The second portion surveyed their opinion on the potential role of simulation in the training curriculum. The study was conducted at the 2013 American Gastroenterological Association Training Directors' Workshop in Phoenix, Arizona. The participants were program directors from Accreditation Council for Graduate Medical Education accredited gastroenterology training programs who attended the workshop. The questionnaire was returned by 69 of 97 program directors (response rate of 71%). Forty-two percent of programs had an endoscopic simulator. Computerized simulators (61.5%) were the most common, followed by mechanical (30.8%) and animal tissue (7.7%) simulators, respectively. Eleven programs (15%) required fellows to use simulation prior to clinical cases. Only one program had a minimum number of hours fellows must participate in simulation training. Current simulators are deemed easy to use (76%) and good educational tools (65%). The main problems are cost (72%) and accessibility (69%). The majority of program directors believe that there is a need for endoscopic simulator training, with only 8% disagreeing. Additionally, a majority believe there is a role for simulation prior to initiation of clinical cases, with 15% disagreeing. Gastroenterology fellowship program directors widely recognize the importance of simulation. Nevertheless, simulation is used by only 42% of programs, and only 15% of programs require that trainees use simulation prior to clinical cases. No programs currently use simulation as part of the evaluation process.
Effects of Thinking Style on Design Strategies: Using Bridge Construction Simulation Programs
ERIC Educational Resources Information Center
Sun, Chuen-Tsai; Wang, Dai-Yi; Chang, Yu-Yeh
2013-01-01
Computer simulation users can freely control operational factors and simulation results, repeat processes, make changes, and learn from simulation environment feedback. The focus of this paper is on simulation-based design tools and their effects on student learning processes in a group of 101 Taiwanese senior high school students. Participants…
HYDROLOGICAL SIMULATION PROGRAM-FORTRAN (HSPF): USERS MANUAL FOR RELEASE 8.0
The Hydrological Simulation Program--FORTRAN (HSPF) is a set of computer codes that can simulate the hydrologic, and associated water quality, processes on pervious and impervious land surfaces and in streams and well mixed impoundments. The manual discusses the modular structure...
DYNSYL: a general-purpose dynamic simulator for chemical processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patterson, G.K.; Rozsa, R.B.
1978-09-05
Lawrence Livermore Laboratory is conducting a safeguards program for the Nuclear Regulatory Commission. The goal of the Material Control Project of this program is to evaluate material control and accounting (MCA) methods in plants that handle special nuclear material (SNM). To this end we designed and implemented the dynamic chemical plant simulation program DYNSYL. This program can be used to generate process data or to provide estimates of process performance; it simulates both steady-state and dynamic behavior. The MCA methods that may have to be evaluated range from sophisticated on-line material trackers such as Kalman filter estimators, to relatively simple material balance procedures. This report describes the overall structure of DYNSYL and includes some example problems. The code is still in the experimental stage and revision is continuing.
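The material-control side of this problem can be illustrated with a toy dynamic material balance: simulate a vessel's true inventory, add measurement noise, and watch the balance residuals an MCA method would monitor. The Python sketch below uses invented flows and noise levels and stands in for no particular DYNSYL model.

```python
# Toy dynamic material balance for one process vessel: the kind of signal a
# material control and accounting (MCA) method monitors. Flows, noise, and
# the vessel itself are invented; this is not a DYNSYL model.
import numpy as np

rng = np.random.default_rng(7)
steps, dt = 200, 1.0
feed, draw = 1.00, 0.98      # SNM mass flow in/out per step (kg)
meas_sigma = 0.05            # inventory measurement noise (kg)

inventory = 10.0
residuals = []
for n in range(1, steps + 1):
    inventory += (feed - draw) * dt                  # true dynamics
    measured = inventory + rng.normal(0.0, meas_sigma)
    expected = 10.0 + (feed - draw) * dt * n         # book inventory
    residuals.append(measured - expected)

# A sustained nonzero mean residual would flag an unexplained loss.
print("mean balance residual (kg):", np.mean(residuals))
```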
Real-Time Monitoring of Scada Based Control System for Filling Process
NASA Astrophysics Data System (ADS)
Soe, Aung Kyaw; Myint, Aung Naing; Latt, Maung Maung; Theingi
2008-10-01
This paper presents a design for real-time monitoring of a filling system using Supervisory Control and Data Acquisition (SCADA). The monitoring of the production process is described in real time using Visual Basic .NET programming under Visual Studio 2005, without dedicated SCADA software. The software integrators are programmed to obtain the information required for the configuration screens. Simulation of the components is displayed on the computer screen, using a parallel port between the computer and the filling devices. Programs for real-time simulation of the filling process in the pure drinking water industry are provided.
REFLEAK: NIST Leak/Recharge Simulation Program for Refrigerant Mixtures
National Institute of Standards and Technology Data Gateway
SRD 73 NIST REFLEAK: NIST Leak/Recharge Simulation Program for Refrigerant Mixtures (PC database for purchase) REFLEAK estimates composition changes of zeotropic mixtures in leak and recharge processes.
Investigation of roughing machining simulation by using visual basic programming in NX CAM system
NASA Astrophysics Data System (ADS)
Hafiz Mohamad, Mohamad; Nafis Osman Zahid, Muhammed
2018-03-01
This paper outlines a simulation study investigating the characteristics of roughing simulation in 4th-axis milling processes, utilizing Visual Basic programming in the NX CAM system. The selection and optimization of cutting orientation in rough milling is critical in 4th-axis machining. The main purpose of the roughing operation is to bring the machined part approximately to its finished form by removing the bulk of material from the workpiece. In this paper, simulations are executed with a set of different cutting orientations to estimate the volume removed from the part; the orientation with the highest volume removal is taken as the optimum and chosen for the roughing operation. To run the simulations, customized software was developed to assist the routines: operation build-up instructions in the NX CAM interface are translated into program code via tools available in Visual Studio, and the code is customized and equipped with decision-making tools to run and control the simulations, permitting integration with independent program files to execute specific operations. This paper discusses the simulation program and identifies optimum cutting orientations for roughing processes. The output of this study will broaden the simulation routines performed in NX CAM systems.
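The orientation-selection loop described above, simulating several rotary-axis angles, estimating the volume each would remove, and keeping the best, can be sketched on a voxel model. The Python example below is an illustrative stand-in for the NX CAM simulations driven by the paper's Visual Basic routines; the part geometry, angles, and accessibility rule are all assumptions.

```python
# Voxel sketch of orientation selection for roughing: rotate the part about
# the rotary axis, count stock voxels reachable by a tool plunging from +z,
# keep the orientation that removes the most. Geometry and the accessibility
# rule are invented stand-ins for the NX CAM simulation.
import numpy as np
from scipy.ndimage import rotate

def removed_volume(part, angle_deg):
    """Stock voxels with no part material at or above them after rotation."""
    turned = rotate(part.astype(float), angle_deg,
                    axes=(0, 2), order=0, reshape=False) > 0.5
    covered = np.maximum.accumulate(turned[::-1], axis=0)[::-1]
    return int((~covered).sum())

# Hypothetical part: an off-centre cylinder inside a 40^3 stock block
# (axis 0 is z, the tool approach direction).
z, y, x = np.mgrid[0:40, 0:40, 0:40]
part = ((y - 20) ** 2 + (x - 26) ** 2 < 8 ** 2) & (z < 30)

best = max(range(0, 360, 30), key=lambda a: removed_volume(part, a))
print("best rotary-axis orientation:", best, "degrees")
```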
P-8A Poseidon strategy for modeling & simulation verification validation & accreditation (VV&A)
NASA Astrophysics Data System (ADS)
Kropp, Derek L.
2009-05-01
One of the first challenges in addressing the need for Modeling & Simulation (M&S) Verification, Validation, & Accreditation (VV&A) is to develop an approach for applying structured and formalized VV&A processes. The P-8A Poseidon Multi-Mission Maritime Aircraft (MMA) Program Modeling and Simulation Accreditation Strategy documents the P-8A program's approach to VV&A. The P-8A strategy tailors a risk-based approach and leverages existing bodies of knowledge, such as the Defense Modeling and Simulation Office Recommended Practice Guide (DMSO RPG), to make the process practical and efficient. As the program progresses, the M&S team must continue to look for ways to streamline the process, add supplemental steps to enhance the process, and identify and overcome procedural, organizational, and cultural challenges. This paper includes some of the basics of the overall strategy, examples of specific approaches that have worked well, and examples of challenges that the M&S team has faced.
Integration of communications and tracking data processing simulation for space station
NASA Technical Reports Server (NTRS)
Lacovara, Robert C.
1987-01-01
A simplified model of the communications network for the Communications and Tracking Data Processing System (CTDP) was developed. It was simulated by use of programs running on several on-site computers. These programs communicate with one another by means of both local area networks and direct serial connections. The domain of the model and its simulation is from Orbital Replaceable Unit (ORU) interface to Data Management Systems (DMS). The simulation was designed to allow status queries from remote entities across the DMS networks to be propagated through the model to several simulated ORU's. The ORU response is then propagated back to the remote entity which originated the request. Response times at the various levels were investigated in a multi-tasking, multi-user operating system environment. Results indicate that the effective bandwidth of the system may be too low to support expected data volume requirements under conventional operating systems. Instead, some form of embedded process control program may be required on the node computers.
Combining high performance simulation, data acquisition, and graphics display computers
NASA Technical Reports Server (NTRS)
Hickman, Robert J.
1989-01-01
Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems non-destructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or to test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The digital computer is hosted by a Digital Equipment Corp. MicroVAX computer, with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high-speed simulation computing, and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high-performance FORTRAN program processor to support the complex by generating, from programs coded in FORTRAN, the numerous large files that are required for the real-time processing. Four programming languages are involved in the process: FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system. The generation of the data files on the SCS40 is also performed with FORTRAN programs. ADSIM and ADRIO are used to program the processing elements of the AD100 and its IOCP processor. STAPLE is used to program the Aptec DIP and DIA processors.
ISPE: A knowledge-based system for fluidization studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reddy, S.
1991-01-01
Chemical engineers use mathematical simulators to design, model, optimize and refine various engineering plants/processes. This procedure requires the following steps: (1) preparation of an input data file according to the format required by the target simulator; (2) executing the simulation; and (3) analyzing the results of the simulation to determine if all "specified goals" are satisfied. If the goals are not met, the input data file must be modified and the simulation repeated. This multistep process is continued until satisfactory results are obtained. This research was undertaken to develop a knowledge based system, IPSE (Intelligent Process Simulation Environment), that can enhance the productivity of chemical engineers/modelers by serving as an intelligent assistant to perform a variety of tasks related to process simulation. ASPEN, a simulator widely used by the US Department of Energy (DOE) at Morgantown Energy Technology Center (METC), was selected as the target process simulator in the project. IPSE, written in the C language, was developed using a number of knowledge-based programming paradigms: object-oriented knowledge representation that uses inheritance and methods, rule-based inferencing (including processing and propagation of probabilistic information), and data-driven programming using demons. It was implemented using the knowledge-based environment LASER. The relationship of IPSE with the user, ASPEN, LASER and the C language is shown in Figure 1.
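The prepare/execute/analyze loop that IPSE automates can be shown as a skeleton. Every function below is a hypothetical stand-in (a toy response surface replaces the ASPEN run, and a fixed rule replaces IPSE's knowledge-based adjustment), written in Python rather than IPSE's C; only the loop structure mirrors the three steps listed in the abstract.

```python
# Skeleton of the prepare/execute/analyze loop that IPSE automates. Every
# function is a hypothetical stand-in: a toy response surface replaces the
# ASPEN run, and a fixed rule replaces IPSE's knowledge-based adjustment.
def write_input_file(params, path="run.inp"):
    with open(path, "w") as f:
        for key, value in params.items():
            f.write(f"{key} = {value}\n")         # simulator-specific format

def run_simulation(params):
    # Stand-in for executing the target simulator.
    return {"conversion": 1.0 - abs(params["temperature"] - 650.0) / 1000.0}

def goals_met(results):
    return results["conversion"] >= 0.95          # the "specified goals"

params = {"temperature": 500.0}
for run in range(1, 21):
    write_input_file(params)
    results = run_simulation(params)
    if goals_met(results):
        break
    params["temperature"] += 25.0                 # rule-based modification
print(f"goals met after {run} runs with {params}")
```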
The transition of a real-time single-rotor helicopter simulation program to a supercomputer
NASA Technical Reports Server (NTRS)
Martinez, Debbie
1995-01-01
This report presents the conversion effort and results of a real-time flight simulation application transition to a CONVEX supercomputer. Enclosed is a detailed description of the conversion process and a brief description of the Langley Research Center's (LaRC) flight simulation application program structure. Currently, this simulation program may be configured to represent the Sikorsky S-61 helicopter (a five-blade, single-rotor, commercial passenger-type helicopter) or an Army Cobra helicopter (either the AH-1G or AH-1S model). This report refers to the Sikorsky S-61 simulation program since it is the most frequently used configuration.
Feasibility Study On Missile Launch Detection And Trajectory Tracking
2016-09-01
... Unmanned Aerial Vehicles (UAVs) in military operations, their role in a missile defense operation is not well defined. The simulation program discussed in this thesis ... targeting information to an attacking UAV to reliably intercept the missile. FURTHER STUDIES: The simulation program can be enhanced to improve the ... intercept the threat. This thesis explores the challenges in creating a simulation program to process video footage from an unstable platform and the ...
Application of a neural network to simulate analysis in an optimization process
NASA Technical Reports Server (NTRS)
Rogers, James L.; Lamarsh, William J., II
1992-01-01
A new experimental software package called NETS/PROSSS aimed at reducing the computing time required to solve a complex design problem is described. The software combines a neural network for simulating the analysis program with an optimization program. The neural network is applied to approximate results of a finite element analysis program to quickly obtain a near-optimal solution. Results of the NETS/PROSSS optimization process can also be used as an initial design in a normal optimization process and make it possible to converge to an optimum solution with significantly fewer iterations.
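The surrogate idea in NETS/PROSSS, training a network on samples of the expensive analysis and letting the optimizer query the cheap network, can be sketched as follows. This Python sketch substitutes a toy function for the finite element analysis and scikit-learn's MLPRegressor for the NETS network; it is an illustration of the technique, not the package's interface.

```python
# Surrogate-assisted optimization in the NETS/PROSSS spirit: a small network
# learns an expensive analysis, and the optimizer queries the cheap network.
# The "analysis" is a toy stand-in for a finite element code, and scikit-learn
# stands in for the NETS neural network software.
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import minimize

def expensive_analysis(x):
    """Toy structural response (e.g., compliance) with a known minimum."""
    return (x[0] - 1.2) ** 2 + 3.0 * (x[1] + 0.4) ** 2 + 0.5

rng = np.random.default_rng(3)
X = rng.uniform(-2.0, 2.0, size=(300, 2))            # sampled designs
y = np.array([expensive_analysis(x) for x in X])     # expensive evaluations

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                         random_state=0).fit(X, y)

# Optimize against the surrogate; the result seeds a final full analysis.
res = minimize(lambda x: surrogate.predict(x.reshape(1, -1))[0],
               x0=np.zeros(2), method="Nelder-Mead")
print("near-optimal design:", res.x, "true value:", expensive_analysis(res.x))
```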
NASA Technical Reports Server (NTRS)
Shipman, D. L.
1972-01-01
The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: type of messages, destinations, delivery durations, type of processing, processing durations, communication channels, outgoing messages, and priorities. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in the General Purpose System Simulation 2 computer programming language for use on the Univac 1108, Executive 8 computer. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions, which are adjusted to simulate the information flow being studied.
ERIC Educational Resources Information Center
Clarke, Matthew A.; Giraldo, Carlos
2009-01-01
Chemical process simulation is one of the most fundamental skills that is expected from chemical engineers, yet relatively few graduates have the opportunity to learn, in depth, how a process simulator works, from programming the unit operations to the sequencing. The University of Calgary offers a "hands-on" postgraduate course in…
PLYMAP : a computer simulation model of the rotary peeled softwood plywood manufacturing process
Henry Spelter
1990-01-01
This report documents a simulation model of the plywood manufacturing process. Its purpose is to enable a user to make quick estimates of the economic impact of a particular process change within a mill. The program was designed to simulate the processing of plywood within a relatively simplified mill design. Within that limitation, however, it allows a wide range of...
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, Michel; Archer, Bill; Hendrickson, Bruce
The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.
Enhancements to the Image Analysis Tool for Core Punch Experiments and Simulations (vs. 2014)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogden, John Edward; Unal, Cetin
A previous paper (Hogden & Unal, 2012, Image Analysis Tool for Core Punch Experiments and Simulations) described an image processing computer program developed at Los Alamos National Laboratory. This program has proven useful, so development has continued. In this paper we describe enhancements to the program as of 2014.
Process Simulation of Gas Metal Arc Welding Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murray, Paul E.
2005-09-06
ARCWELDER is a Windows-based application that simulates gas metal arc welding (GMAW) of steel and aluminum. The software simulates the welding process in an accurate and efficient manner, provides menu items for process parameter selection, and includes a graphical user interface with the option to animate the process. The user enters the base and electrode material, open circuit voltage, wire diameter, wire feed speed, welding speed, and standoff distance. The program computes the size and shape of a square-groove or V-groove weld in the flat position. The program also computes the current, arc voltage, arc length, electrode extension, transfer of droplets, heat input, filler metal deposition, base metal dilution, and centerline cooling rate, in English or SI units. The simulation may be used to select welding parameters that lead to desired operation conditions.
Incorporating scenario-based simulation into a hospital nursing education program.
Nagle, Beth M; McHale, Jeanne M; Alexander, Gail A; French, Brian M
2009-01-01
Nurse educators are challenged to provide meaningful and effective learning opportunities for both new and experienced nurses. Simulation as a teaching and learning methodology is being embraced by nursing in academic and practice settings to provide innovative educational experiences to assess and develop clinical competency, promote teamwork, and improve care processes. This article provides an overview of the historical basis for using simulation in education, simulation methodologies, and perceived advantages and disadvantages. It also provides a description of the integration of scenario-based programs using a full-scale patient simulator into nursing education programming at a large academic medical center.
Quality improvement utilizing in-situ simulation for a dual-hospital pediatric code response team.
Yager, Phoebe; Collins, Corey; Blais, Carlene; O'Connor, Kathy; Donovan, Patricia; Martinez, Maureen; Cummings, Brian; Hartnick, Christopher; Noviski, Natan
2016-09-01
Given the rarity of in-hospital pediatric emergency events, identification of gaps and inefficiencies in the code response can be difficult. In-situ, simulation-based medical education programs can identify unrecognized systems-based challenges. We hypothesized that developing an in-situ, simulation-based pediatric emergency response program would identify latent inefficiencies in a complex, dual-hospital pediatric code response system and allow rapid intervention testing to improve performance before implementation at an institutional level. Pediatric leadership from two hospitals with a shared pediatric code response team employed the Institute for Healthcare Improvement's (IHI) Breakthrough Model for Collaborative Improvement to design a program consisting of Plan-Do-Study-Act cycles occurring in a simulated environment. The objectives of the program were to 1) identify inefficiencies in our pediatric code response; 2) correlate to current workflow; 3) employ an iterative process to test quality improvement interventions in a safe environment; and 4) measure performance before actual implementation at the institutional level. Twelve dual-hospital, in-situ, simulated, pediatric emergencies occurred over one year. The initial simulated event allowed identification of inefficiencies including delayed provider response, delayed initiation of cardiopulmonary resuscitation (CPR), and delayed vascular access. These gaps were linked to process issues including unreliable code pager activation, slow elevator response, and lack of responder familiarity with layout and contents of code cart. From first to last simulation with multiple simulated process improvements, code response time for secondary providers coming from the second hospital decreased from 29 to 7 min, time to CPR initiation decreased from 90 to 15 s, and vascular access obtainment decreased from 15 to 3 min. Some of these simulated process improvements were adopted into the institutional response while others continue to be trended over time for evidence that observed changes represent a true new state of control. Utilizing the IHI's Breakthrough Model, we developed a simulation-based program to 1) successfully identify gaps and inefficiencies in a complex, dual-hospital, pediatric code response system and 2) provide an environment in which to safely test quality improvement interventions before institutional dissemination.
ERIC Educational Resources Information Center
Korfiatis, K.; Papatheodorou, E.; Paraskevopoulous, S.; Stamou, G. P.
1999-01-01
Describes a study of the effectiveness of computer-simulation programs in enhancing biology students' familiarity with ecological modeling and concepts. Finds that computer simulations improved student comprehension of ecological processes expressed in mathematical form, but did not allow a full understanding of ecological concepts. Contains 28…
Method and Process for the Creation of Modeling and Simulation Tools for Human Crowd Behavior
2014-07-23
Simulation of a master-slave event set processor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Comfort, J.C.
1984-03-01
Event set manipulation may consume a considerable amount of the computation time spent in performing a discrete-event simulation. One way of minimizing this time is to allow event set processing to proceed in parallel with the remainder of the simulation computation. The paper describes a multiprocessor simulation computer, in which all non-event set processing is performed by the principal processor (called the host). Event set processing is coordinated by a front end processor (the master) and actually performed by several other functionally identical processors (the slaves). A trace-driven simulation program modeling this system was constructed, and was run with trace output taken from two different simulation programs. Output from this simulation suggests that a significant reduction in run time may be realized by this approach. Sensitivity analysis was performed on the significant parameters to the system (number of slave processors, relative processor speeds, and interprocessor communication times). A comparison between actual and simulation run times for a one-processor system was used to assist in the validation of the simulation. 7 references.
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Day, John H. (Technical Monitor)
2000-01-01
Post-processing of data related to a GPS receiver test in a GPS simulator and test facility is an important step towards qualifying a receiver for space flight. Although the GPS simulator provides all the parameters needed to analyze a simulation, as well as excellent analysis tools on the simulator workstation, post-processing is not a GPS simulator or receiver function alone, and it must be planned as a separate pre-flight test program requirement. A GPS simulator is a critical resource, and it is desirable to move the pertinent test data off the simulator as soon as a test is completed. The receiver and simulator databases are used to extract the test data files for post-processing. These files are then usually moved from the simulator and receiver systems to a personal computer (PC) platform, where post-processing is typically done using PC-based commercial software languages and tools. Because of the generality of commercial software systems, their functions are notoriously slow and more often than not are the bottleneck, even for short-duration simulator-based tests. There is a need to do post-processing faster and within an hour after test completion, including all required operations on the simulator and receiver to prepare and move off the post-processing files. This is especially significant in order to use feedback from the previous test for the next simulation setup, or to run near back-to-back simulation scenarios. Solving the post-processing timing problem is critical for a pre-flight test program's success. Towards this goal an approach was developed that allows post-processing to be sped up by an order of magnitude. It is based on improving the algorithm of the post-processing bottleneck function, using a priori information specific to a GPS simulation application, and using only the necessary volume of truth data. The presented post-processing scheme was used in support of a few successful space flight missions carrying GPS receivers.
Program Costing with the CAMPUS Simulation Model. Project PRIME Report, Number 5.
ERIC Educational Resources Information Center
Cordes, David C.
The first section of this report on program costing with the CAMPUS simulation discusses the structuring process of Program Planning and Budgeting (PPB) systems, and emphasizes the ideas, rules, and principles for structuring resource data that have evolved during the 10 years of PPB existence. It also discusses the WICHE-PMS program…
A high-order language for a system of closely coupled processing elements
NASA Technical Reports Server (NTRS)
Feyock, S.; Collins, W. R.
1986-01-01
The research reported in this paper was occasioned by the requirements of the Real-Time Digital Simulator (RTDS) project under way at NASA Lewis Research Center. The RTDS simulation scheme employs a network of CPUs running lock-step cycles in the parallel computation of jet airplane simulations. The project's need for a high order language (HOL) that would allow non-experts to write simulation applications and that could be implemented on a possibly varying network can best be fulfilled by using the programming language Ada. We describe how the simulation problems can be modeled in Ada, how to map a single, multi-processing Ada program into code for individual processors, regardless of network reconfiguration, and why some Ada language features are particularly well-suited to network simulations.
A study of trends and techniques for space base electronics
NASA Technical Reports Server (NTRS)
Trotter, J. D.; Wade, T. E.; Gassaway, J. D.
1979-01-01
The use of dry processing and alternate dielectrics for processing wafers is reported. A two-dimensional modeling program was written for the simulation of short-channel MOSFETs with nonuniform substrate doping. A key simplifying assumption is that the majority carriers can be represented by a sheet charge at the silicon dioxide-silicon interface. In solving the current continuity equation, the program does not converge; however, solving the two-dimensional Poisson equation for the potential distribution was achieved. The status of other 2D MOSFET simulation programs is summarized.
Method for distributed agent-based non-expert simulation of manufacturing process behavior
Ivezic, Nenad; Potok, Thomas E.
2004-11-30
A method for distributed agent based non-expert simulation of manufacturing process behavior on a single-processor computer comprises the steps of: object modeling a manufacturing technique having a plurality of processes; associating a distributed agent with each said process; and programming each said agent to respond to discrete events corresponding to the manufacturing technique, wherein each discrete event triggers a programmed response. The method can further comprise the step of transmitting the discrete events to each agent in a message loop. In addition, the programming step comprises the step of conditioning each agent to respond to a discrete event selected from the group consisting of a clock tick message, a resources received message, and a request for output production message.
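The claim's event vocabulary maps naturally onto a small message loop, sketched below in Python. The event names follow the patent text; the agents, cycle times, and production rule are invented for illustration.

```python
# The patent's event vocabulary as a small message loop: one agent per
# process, each reacting to the three named discrete events. Agents, cycle
# times, and the production rule are invented for illustration.
class ProcessAgent:
    def __init__(self, name, cycle_time):
        self.name, self.cycle_time = name, cycle_time
        self.stock = self.clock = 0

    def on_event(self, event, payload=0):
        if event == "clock_tick":
            self.clock += 1                     # advance local time
        elif event == "resources_received":
            self.stock += payload               # accept upstream material
        elif event == "request_output_production":
            made = min(self.stock, self.clock // self.cycle_time)
            self.stock -= made
            return made                         # ship finished units
        return 0

stamping, assembly = ProcessAgent("stamping", 2), ProcessAgent("assembly", 3)
for tick in range(12):                          # the message loop
    for agent in (stamping, assembly):
        agent.on_event("clock_tick")
    stamping.on_event("resources_received", payload=1)
    assembly.on_event("resources_received",
                      payload=stamping.on_event("request_output_production"))
print("assembly output:", assembly.on_event("request_output_production"))
```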
Learning-Testing Process in Classroom: An Empirical Simulation Model
ERIC Educational Resources Information Center
Buda, Rodolphe
2009-01-01
This paper presents an empirical micro-simulation model of the teaching and the testing process in the classroom (Programs and sample data are available--the actual names of pupils have been hidden). It is a non-econometric micro-simulation model describing informational behaviors of the pupils, based on the observation of the pupils'…
Furniture rough mill costs evaluated by computer simulation
R. Bruce Anderson
1983-01-01
A crosscut-first furniture rough mill was simulated to evaluate processing and raw material costs on an individual part basis. Distributions representing the real-world characteristics of lumber, equipment feed speeds, and processing requirements are programed into the simulation. Costs of parts from a specific cutting bill are given, and effects of lumber input costs...
Line-by-line spectroscopic simulations on graphics processing units
NASA Astrophysics Data System (ADS)
Collange, Sylvain; Daumas, Marc; Defour, David
2008-01-01
We report here on software that performs line-by-line spectroscopic simulations on gases. Elaborate models (such as narrow band and correlated-K) are accurate and efficient for bands where various components are not simultaneously and significantly active. Line-by-line is probably the most accurate model in the infrared for blends of gases that contain high proportions of H2O and CO2, as was the case for our prototype simulation. Our implementation on graphics processing units sustains a speedup close to 330 on computation-intensive tasks and 12 on memory-intensive tasks compared to implementations on one core of high-end processors. This speedup is due to data parallelism, efficient memory access for specific patterns and some dedicated hardware operators only available in graphics processing units. It is obtained leaving most of processor resources available and it would scale linearly with the number of graphics processing units in parallel machines. Line-by-line simulation coupled with simulation of fluid dynamics was long believed to be economically intractable but our work shows that it could be done with some affordable additional resources compared to what is necessary to perform simulations on fluid dynamics alone.
Program summary
Program title: GPU4RE
Catalogue identifier: ADZY_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZY_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 62 776
No. of bytes in distributed program, including test data, etc.: 1 513 247
Distribution format: tar.gz
Programming language: C++
Computer: x86 PC
Operating system: Linux, Microsoft Windows. Compilation requires either gcc/g++ under Linux or Visual C++ 2003/2005 and Cygwin under Windows. It has been tested using gcc 4.1.2 under Ubuntu Linux 7.04 and using Visual C++ 2005 with Cygwin 1.5.24 under Windows XP.
RAM: 1 gigabyte
Classification: 21.2
External routines: OpenGL (http://www.opengl.org)
Nature of problem: Simulating radiative transfer on high-temperature high-pressure gases.
Solution method: Line-by-line Monte-Carlo ray-tracing.
Unusual features: Parallel computations are moved to the GPU.
Additional comments: nVidia GeForce 7000 or ATI Radeon X1000 series graphics processing unit is required.
Running time: A few minutes.
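The data-parallel kernel of a line-by-line code is the summation of one broadened profile per spectral line over the whole wavenumber grid; this is the work GPU4RE moves to the GPU (along with Monte Carlo ray tracing). A vectorized NumPy sketch of the summation, with a made-up line list and simple Lorentz profiles, looks like this:

```python
# The data-parallel kernel of a line-by-line code: every spectral line adds
# a broadened profile at every grid point. GPU4RE runs this (plus Monte
# Carlo ray tracing) on the GPU; here is a vectorized NumPy sketch with a
# made-up line list and simple Lorentz profiles.
import numpy as np

rng = np.random.default_rng(5)
nu = np.linspace(2000.0, 2500.0, 10_000)      # wavenumber grid (cm^-1)

n_lines = 500                                 # synthetic line list
nu0 = rng.uniform(2000.0, 2500.0, n_lines)    # line centres
S = rng.lognormal(-2.0, 1.0, n_lines)         # line intensities
gamma = rng.uniform(0.05, 0.10, n_lines)      # Lorentz half-widths

# k(nu) = sum_i S_i * (gamma_i / pi) / ((nu - nu0_i)^2 + gamma_i^2)
diff = nu[None, :] - nu0[:, None]             # (lines, grid) broadcast
k = np.sum(S[:, None] * (gamma[:, None] / np.pi)
           / (diff ** 2 + gamma[:, None] ** 2), axis=0)
print("peak absorption coefficient:", k.max())
```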
ERIC Educational Resources Information Center
Neely, Pat; Tucker, Jan
2013-01-01
Purpose: Simulations are designed as activities which imitate real world scenarios and are often used to teach and enhance skill building. The purpose of this case study is to examine the decision making process and outcomes of a faculty committee tasked with examining simulations in the marketplace to determine if the simulations could be used as…
Programming and machining of complex parts based on CATIA solid modeling
NASA Astrophysics Data System (ADS)
Zhu, Xiurong
2017-09-01
The machining of complex parts is designed using CATIA solid modeling, programming, and simulation, illustrating the importance of programming and process planning in the field of CNC machining. In the part design process, the working principle is first analyzed in depth; the dimensions are then designed so that each dimension chain is consistent with the others, and backstepping and several other methods are used to calculate the final part dimensions. The part material was selected after careful study and repeated testing; 6061 aluminum alloy was the final choice. The machining process must take the actual conditions of the processing site and various other factors into comprehensive consideration. The simulation should be based on the actual machining process rather than on shape alone, so that it can serve as a reference for machining.
A computational model for simulating text comprehension.
Lemaire, Benoît; Denhière, Guy; Bellissens, Cédrick; Jhean-Larose, Sandra
2006-11-01
In the present article, we outline the architecture of a computer program for simulating the process by which humans comprehend texts. The program is based on psycholinguistic theories about human memory and text comprehension processes, such as the construction-integration model (Kintsch, 1998), the latent semantic analysis theory of knowledge representation (Landauer & Dumais, 1997), and the predication algorithms (Kintsch, 2001; Lemaire & Bianco, 2003), and it is intended to help psycholinguists investigate the way humans comprehend texts.
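The LSA ingredient of such a model can be illustrated in a few lines: a singular value decomposition of a term-document matrix yields a reduced semantic space in which meanings are compared by cosine similarity. The Python sketch below uses a tiny invented corpus and is only a cartoon of the knowledge-representation layer, not the paper's program.

```python
# Cartoon of the LSA layer of such a model: SVD of a term-document matrix
# gives a reduced semantic space; meanings are compared by cosine. The
# five-term, four-document "corpus" is invented.
import numpy as np

terms = ["doctor", "nurse", "hospital", "guitar", "music"]
A = np.array([[2, 1, 0, 0],     # term-by-document counts
              [1, 2, 0, 0],
              [1, 1, 0, 1],
              [0, 0, 2, 1],
              [0, 0, 1, 2]], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
term_vecs = U[:, :2] * s[:2]                   # rank-2 semantic space

def meaning(words):
    """A sentence meaning as the sum of its word vectors."""
    return sum(term_vecs[terms.index(w)] for w in words)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine(meaning(["doctor", "hospital"]), meaning(["nurse"])))   # high
print(cosine(meaning(["doctor"]), meaning(["guitar", "music"])))     # low
```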
Enhanced TCAS 2/CDTI traffic Sensor digital simulation model and program description
NASA Technical Reports Server (NTRS)
Goka, T.
1984-01-01
Digital simulation models of enhanced TCAS 2/CDTI traffic sensors are developed, based on actual or projected operational and performance characteristics. Two enhanced Traffic (or Threat) Alert and Collision Avoidance Systems are considered. A digital simulation program is developed in FORTRAN. The program contains an executive with a semi-real-time batch processing capability. The simulation program can be interfaced with other modules with minimal requirements. Both the traffic sensor and CAS logic modules are validated by means of extensive simulation runs. Selected validation cases are discussed in detail, and capabilities and limitations of the actual and simulated systems are noted. The TCAS systems are not specifically intended for Cockpit Display of Traffic Information (CDTI) applications, but they are sufficiently general to allow implementation of CDTI functions within the real systems' constraints.
Concurrent simulation of a parallel jaw end effector
NASA Technical Reports Server (NTRS)
Bynum, Bill
1985-01-01
A system of programs developed to aid in the design and development of the command/response protocol between a parallel jaw end effector and the strategic planner program controlling it is presented. The system executes concurrently with the LISP controlling program to generate a graphical image of the end effector that moves in approximately real time in response to commands sent from the controlling program. Concurrent execution of the simulation program is useful for revealing flaws in the communication command structure arising from the asynchronous nature of the message traffic between the end effector and the strategic planner. Software simulation helps to minimize the number of hardware changes to the microprocessor driving the end effector necessitated by changes in the communication protocol. The simulation of other actuator devices can be easily incorporated into the system of programs by using the underlying support that was developed for the concurrent execution of the simulation process and the communication between it and the controlling program.
Accelerating sino-atrium computer simulations with graphic processing units.
Zhang, Hong; Xiao, Zheng; Lin, Shien-fong
2015-01-01
Sino-atrial node cells (SANCs) play a significant role in rhythmic firing. To investigate their role in arrhythmia and interactions with the atrium, computer simulations based on cellular dynamic mathematical models are generally used. However, the large-scale computation usually makes research difficult, given the limited computational power of Central Processing Units (CPUs). In this paper, an accelerating approach with Graphic Processing Units (GPUs) is proposed in a simulation consisting of the SAN tissue and the adjoining atrium. By using the operator splitting method, the computational task was made parallel. Three parallelization strategies were then put forward. The strategy with the shortest running time was further optimized by considering block size, data transfer and partition. The results showed that for a simulation with 500 SANCs and 30 atrial cells, the execution time taken by the non-optimized program decreased by 62% with respect to a serial program running on a CPU. The execution time decreased by 80% after the program was optimized. The larger the tissue was, the more significant the acceleration became. The results demonstrated the effectiveness of the proposed GPU-accelerating methods and their promising applications in more complicated biological simulations.
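Operator splitting, the method named above, alternates a pointwise reaction step (one independent ODE update per cell, which is what maps naturally onto GPU threads) with a diffusion step coupling neighbors. The Python sketch below shows the splitting on a 1-D fiber, with FitzHugh-Nagumo kinetics standing in for the actual SAN and atrial cell models; all parameters are illustrative.

```python
# Operator splitting on a 1-D excitable fibre: a pointwise reaction step
# (one independent update per cell -- the part that maps onto GPU threads)
# alternates with a diffusion step coupling neighbours. FitzHugh-Nagumo
# kinetics stand in for the SAN/atrial cell models; parameters are invented.
import numpy as np

n_cells, dt, dx, D = 100, 0.05, 1.0, 1.0
v = np.full(n_cells, -1.2)        # membrane-like variable
w = np.full(n_cells, -0.6)        # recovery variable
v[:20] = 1.0                      # stimulate one end

reached = False
for step in range(6000):
    # Reaction step: independent per cell.
    v += dt * (v - v ** 3 / 3.0 - w)
    w += dt * 0.08 * (v + 0.7 - 0.8 * w)
    # Diffusion step: explicit Laplacian with no-flux boundaries.
    lap = np.empty_like(v)
    lap[1:-1] = v[2:] - 2.0 * v[1:-1] + v[:-2]
    lap[0], lap[-1] = v[1] - v[0], v[-2] - v[-1]
    v += dt * D / dx ** 2 * lap
    reached = reached or v[-1] > 0.0

print("excitation propagated to the far end:", reached)
```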
Object oriented design (OOD) in real-time hardware-in-the-loop (HWIL) simulations
NASA Astrophysics Data System (ADS)
Morris, Joe; Richard, Henri; Lowman, Alan; Youngren, Rob
2006-05-01
Using Object Oriented Design (OOD) concepts in AMRDEC's Hardware-in-the-Loop (HWIL) real-time simulations allows the user to interchange parts of the simulation to meet test requirements. A large-scale three-spectral-band simulator, connected via a high-speed reflective memory ring for time-critical data transfers to PC controllers connected by non-real-time Ethernet protocols, is used to separate software objects from logical entities close to their respective controlled hardware. Each standalone object does its own dynamic initialization, real-time processing, and end-of-run processing; therefore it can be easily maintained and updated. A Resource Allocation Program (RAP) is also utilized, along with a device table, to allocate, organize, and document the communication protocol between the software and hardware components. A GUI display program lists all allocations and deallocations of HWIL memory and hardware resources. This interactive program is also used to clean up defunct allocations of dead processes. Three examples are presented using the OOD and RAP concepts. The first is the control of an ACUTRONICS-built three-axis flight table using the same control for calibration and real-time functions. The second is the transportability of a six-degree-of-freedom (6-DOF) simulation from an Onyx host to a Linux PC. The third is the replacement of the 6-DOF simulation with a replay program to drive the facility with archived run data for demonstration or analysis purposes.
Process simulation and dynamic control for marine oily wastewater treatment using UV irradiation.
Jing, Liang; Chen, Bing; Zhang, Baiyu; Li, Pu
2015-09-15
UV irradiation and advanced oxidation processes have recently been regarded as promising solutions for removing polycyclic aromatic hydrocarbons (PAHs) from marine oily wastewater. However, such treatment methods are generally not sufficiently understood in terms of reaction mechanisms, process simulation and process control. These deficiencies can drastically hinder their application in the shipping and offshore petroleum industries, which produce bilge/ballast water and produced water as the main streams of marine oily wastewater. In this study, a factorial design of experiment was carried out to investigate the degradation mechanism of a typical PAH, namely naphthalene, under UV irradiation in seawater. Based on the experimental results, a three-layer feed-forward artificial neural network simulation model was developed to simulate the treatment process and to forecast the removal performance. A simulation-based dynamic mixed integer nonlinear programming (SDMINP) approach was then proposed to intelligently control the treatment process by integrating the developed simulation model, a genetic algorithm and multi-stage programming. The applicability and effectiveness of the developed approach were further tested through a case study. The experimental results showed that the influences of fluence rate and temperature on the removal of naphthalene were greater than those of salinity and initial concentration. The developed simulation model could well predict the UV-induced removal process under varying conditions. The case study suggested that the SDMINP approach, with the aid of the multi-stage control strategy, was able to significantly reduce treatment cost when compared to traditional single-stage process optimization. The developed approach and its concept/framework have high potential applicability in other environmental fields where a treatment process is involved and experimentation and modeling are used for process simulation and control. Copyright © 2015 Elsevier Ltd. All rights reserved.
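For illustration, the forward pass of such a three-layer feed-forward network can be sketched as below (C++; the layer sizes, placeholder weights, and normalized inputs are assumptions, since the trained network itself is not given in the abstract). The four inputs stand for fluence rate, temperature, salinity, and initial concentration, and the output for the predicted removal fraction:

```cpp
#include <cmath>
#include <cstdio>

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

int main() {
    double in[4] = {0.8, 0.5, 0.3, 0.6};           // normalized inputs (toy values)
    double w1[3][4] = {{ 0.2, -0.1,  0.4, 0.3},
                       {-0.3,  0.5,  0.1, 0.2},
                       { 0.6,  0.2, -0.4, 0.1}};   // input -> hidden (placeholders)
    double b1[3] = {0.1, -0.2, 0.05};
    double w2[3] = {0.7, -0.5, 0.9}, b2 = 0.2;     // hidden -> output (placeholders)

    double y = b2;
    for (int j = 0; j < 3; ++j) {                  // hidden layer
        double a = b1[j];
        for (int i = 0; i < 4; ++i) a += w1[j][i] * in[i];
        y += w2[j] * sigmoid(a);                   // accumulate output
    }
    std::printf("predicted removal fraction: %f\n", sigmoid(y));
}
```

In the study, the weights would be fitted to the factorial-design experimental data; the sketch only shows how the fitted network maps conditions to a removal forecast.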
ERIC Educational Resources Information Center
Weiss, Charles J.
2017-01-01
The Scientific Computing for Chemists course taught at Wabash College teaches chemistry students to use the Python programming language, Jupyter notebooks, and a number of common Python scientific libraries to process, analyze, and visualize data. Assuming no prior programming experience, the course introduces students to basic programming and…
Automating approximate Bayesian computation by local linear regression.
Thornton, Kevin R
2009-07-07
In several biological contexts, parameter inference often relies on computationally intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC, based on using a linear regression to approximate the posterior distribution of the parameters conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: 1. The code is standalone and fully documented. 2. The program will automatically process multiple data sets and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data, or the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and therefore is a general tool for processing the results of any simulation. 6. The code is open-source and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and of testing the procedure on simulated data, are shown. In practice, ABCreg simplifies implementing ABC based on local linear regression.
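A minimal sketch of the local linear-regression adjustment at the heart of this approach is shown below (C++; a one-parameter, one-summary-statistic toy model, whereas ABCreg handles multivariate regression and the transformation options listed above). Each accepted draw theta_i is replaced by theta_i - b(s_i - s_obs), shrinking the accepted sample toward parameter values consistent with the observed summary:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    std::mt19937 rng(42);
    std::normal_distribution<double> noise(0.0, 1.0);
    std::uniform_real_distribution<double> prior(0.0, 10.0);
    const double sObs = 5.0;                 // observed summary statistic
    const int nSims = 100000;

    struct Draw { double theta, s; };
    std::vector<Draw> sims(nSims);
    for (auto& d : sims) {                   // toy simulator: s = theta + noise
        d.theta = prior(rng);
        d.s = d.theta + noise(rng);
    }
    // Keep the 1% of simulations whose summary is closest to the observed one.
    std::sort(sims.begin(), sims.end(), [&](const Draw& a, const Draw& b) {
        return std::fabs(a.s - sObs) < std::fabs(b.s - sObs);
    });
    sims.resize(nSims / 100);

    // Local linear regression of theta on s over the accepted draws.
    double ms = 0, mt = 0;
    for (const auto& d : sims) { ms += d.s; mt += d.theta; }
    ms /= sims.size(); mt /= sims.size();
    double cov = 0, var = 0;
    for (const auto& d : sims) {
        cov += (d.s - ms) * (d.theta - mt);
        var += (d.s - ms) * (d.s - ms);
    }
    const double b = cov / var;

    double post = 0;
    for (const auto& d : sims)
        post += d.theta - b * (d.s - sObs);  // regression-adjusted draws
    std::printf("posterior mean ~ %f\n", post / sims.size());
}
```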
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.
Simulation Environment Synchronizing Real Equipment for Manufacturing Cell
NASA Astrophysics Data System (ADS)
Inukai, Toshihiro; Hibino, Hironori; Fukuda, Yoshiro
Recently, manufacturing industries have faced various problems, such as shorter product life cycles and more diversified customer needs. In this situation, it is very important to reduce the lead-time of manufacturing system construction. At the manufacturing system implementation stage, it is important to rapidly create and evaluate facility control programs for a manufacturing cell, such as ladder programs for programmable logic controllers (PLCs). However, methods to evaluate facility control programs by mixing and synchronizing real equipment with virtual factory models on computers, before the manufacturing systems are implemented, have not been developed. This difficulty is caused by the complexity of manufacturing systems composed of a great variety of equipment, and it has hindered precise and rapid support of the manufacturing engineering process. In this paper, a manufacturing engineering environment (MEE) to support manufacturing engineering processes using simulation technologies is proposed. MEE consists of a manufacturing cell simulation environment (MCSE) and a distributed simulation environment (DSE). MCSE, which consists of a manufacturing cell simulator and a soft-wiring system, is described in detail. MCSE enables facility control programs to be created and evaluated using virtual factory models on computers before manufacturing systems are implemented.
Research and Analysis of Image Processing Technologies Based on DotNet Framework
NASA Astrophysics Data System (ADS)
Ya-Lin, Song; Chen-Xi, Bai
Microsoft .NET is one of the most popular program development tools. This paper presents a detailed analysis of the advantages and disadvantages of several .NET image processing techniques, applying the same algorithm in each programming experiment. The results show that the two most efficient methods are unsafe pointers and Direct3D, with Direct3D also suited to 3D simulation development; the other techniques, while useful in some fields, are too inefficient for real-time processing. The experimental results reported here should help projects involving .NET-based image processing and simulation, and they have strong practical applicability.
NASA Astrophysics Data System (ADS)
Petrila, S.; Brabie, G.; Chirita, B.
2016-08-01
The analysis performed on manufacturing flows within industrial enterprises producing hydrostatic components was based on a number of factors that influence the smooth running of production, such as: the distance between pieces; the waiting time from one operation to another; the time needed to perform setups on CNC machines; and tool changing in the case of a large number of operators and high manufacturing complexity [2]. To optimize the manufacturing flow, the software Tecnomatix was used. This software is a complete portfolio of digital manufacturing solutions produced by Siemens. It provides innovation by linking all stages of production, from process design, through process simulation and validation, to the manufacturing process itself. Among its many capabilities for creating a wide range of simulations, the program offers various demonstrations of the behavior of manufacturing cycles. It allows the simulation and optimization of production systems and processes in several areas, such as automotive supply, industrial equipment production, electronics manufacturing, and the design and production of aerospace and defense parts.
Cellular automata-based modelling and simulation of biofilm structure on multi-core computers.
Skoneczny, Szymon
2015-01-01
The article presents a mathematical model of biofilm growth for aerobic biodegradation of a toxic carbonaceous substrate. Modelling of biofilm growth has fundamental significance in numerous processes of biotechnology and in the mathematical modelling of bioreactors. The process, following double-substrate kinetics with substrate inhibition and proceeding in a biofilm, has not previously been modelled by means of cellular automata. Each process in the proposed model, i.e. diffusion of substrates, uptake of substrates, growth and decay of microorganisms, and biofilm detachment, is simulated in a discrete manner. It was shown that, for a flat biofilm of constant thickness, the results of the presented model agree with those of a continuous model. The primary outcome of the study was to propose a mathematical model of biofilm growth; however, considerable focus was also placed on the development of efficient algorithms for its solution. Two parallel algorithms were created, differing in the way computations are distributed. Computer programs were created using the OpenMP Application Programming Interface for the C++ programming language. Simulations of biofilm growth were performed on three high-performance computers, and the speed-up coefficients of the computer programs were compared. Both algorithms enabled a significant reduction of computation time. This is important, inter alia, in the modelling and simulation of bioreactor dynamics.
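The parallelization described above amounts to distributing independent cell updates across cores. A minimal C++/OpenMP sketch of one synchronous sweep of the substrate field follows; the grid size and coefficients are toy values, not taken from the article:

```cpp
#include <omp.h>     // compile with -fopenmp
#include <vector>

int main() {
    const int nx = 512, ny = 512;
    std::vector<double> S(nx * ny, 1.0);      // substrate concentration field
    std::vector<double> Snew = S;
    const double diff = 0.2, uptake = 0.05;   // toy coefficients

    // Each interior cell is updated independently from the old field,
    // so the outer loop can be split across threads.
    #pragma omp parallel for
    for (int i = 1; i < nx - 1; ++i)
        for (int j = 1; j < ny - 1; ++j) {
            int k = i * ny + j;
            double lap = S[k - ny] + S[k + ny] + S[k - 1] + S[k + 1] - 4.0 * S[k];
            Snew[k] = S[k] + diff * lap - uptake * S[k];  // diffusion + uptake
        }
    S.swap(Snew);
}
```

Compiled with `g++ -fopenmp`, the outer loop is split across threads; the article's two parallel algorithms differ precisely in how such work is distributed.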
ASC FY17 Implementation Plan, Rev. 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamilton, P. G.
The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions.
ERIC Educational Resources Information Center
Saraswat, Satya Prakash; Anderson, Dennis M.; Chircu, Alina M.
2014-01-01
This paper describes the development and evaluation of a graduate level Business Process Management (BPM) course with process modeling and simulation as its integral component, being offered at an accredited business university in the Northeastern U.S. Our approach is similar to that found in other Information Systems (IS) education papers, and…
Time Warp Operating System, Version 2.5.1
NASA Technical Reports Server (NTRS)
Bellenot, Steven F.; Gieselman, John S.; Hawley, Lawrence R.; Peterson, Judy; Presley, Matthew T.; Reiher, Peter L.; Springer, Paul L.; Tupman, John R.; Wedel, John J., Jr.; Wieland, Frederick P.;
1993-01-01
Time Warp Operating System, TWOS, is a special-purpose computer program designed to support parallel simulation of discrete events. Complete implementation of Time Warp software mechanism, a distributed protocol for virtual-time synchronization based on rollback of processes and annihilation of messages. Supports simulations and other computations in which both virtual time and dynamic load balancing are used. Program utilizes underlying resources of operating system. Written in C programming language.
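A minimal sketch of the rollback mechanism at the core of Time Warp follows (C++, a single logical process with an integer state and saved snapshots; TWOS additionally implements anti-messages for message annihilation, global-virtual-time computation, and dynamic load balancing, none of which are modeled here):

```cpp
#include <cstdio>
#include <map>

struct LP {
    double lvt = 0.0;                 // local virtual time
    int state = 0;
    std::map<double, int> inputQ;     // timestamp -> event payload
    std::map<double, int> snapshots;  // timestamp -> state AFTER that event

    void apply(double t, int delta) { // toy event handler
        state += delta;
        lvt = t;
        snapshots[t] = state;         // save state for possible rollback
    }

    void receive(double t, int delta) {
        inputQ[t] = delta;
        if (t < lvt) {                // straggler: roll back, then replay
            auto snap = snapshots.lower_bound(t);
            if (snap == snapshots.begin()) { state = 0; lvt = 0.0; }
            else { auto p = snap; --p; state = p->second; lvt = p->first; }
            snapshots.erase(snap, snapshots.end());
            std::printf("rollback to t=%.1f\n", lvt);
            for (auto it = inputQ.lower_bound(t); it != inputQ.end(); ++it)
                apply(it->first, it->second);   // re-execute in timestamp order
        } else {
            apply(t, delta);
        }
    }
};

int main() {
    LP lp;
    lp.receive(1.0, 5);
    lp.receive(3.0, 2);
    lp.receive(2.0, 1);               // arrives in the past: forces rollback
    std::printf("state=%d lvt=%.1f\n", lp.state, lp.lvt);  // state=8, lvt=3.0
}
```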
A Simplified Finite Element Simulation for Straightening Process of Thin-Walled Tube
NASA Astrophysics Data System (ADS)
Zhang, Ziqian; Yang, Huilin
2017-12-01
The finite element simulation is an effective way to study the thin-walled tube in the two-cross-roll straightening process. To determine the accurate radius of curvature of the roll profile more efficiently, a simplified finite element model, based on the technical parameters of an actual two-cross-roll straightening machine, was developed to simulate the complex straightening process. A dynamic simulation was then carried out using the ANSYS LS-DYNA program. The results implied that the simplified finite element model is adequate for simulating the two-cross-roll straightening process, and that the radius of curvature of the roll profile can be obtained for a tube straightness of 2 mm/m.
Passive coherent location system simulation and evaluation
NASA Astrophysics Data System (ADS)
Slezák, Libor; Kvasnička, Michael; Pelant, Martin; Vávra, Jiří; Plšek, Radek
2006-02-01
Passive Coherent Location (PCL) is becoming an important and promising approach to the passive location of non-cooperative and stealth targets. It works with sources of irradiation of opportunity. PCL is intended to be a part of the mobile Air Command and Control System (ACCS) as a Deployable ACCS Component (DAC). The company ERA has been working on a PCL system parameter verification program, based on the development of a complete PCL simulator, since 2003; the Czech DoD participates financially in this program. The moving-target scenario, RCS calculation by the method of moments, ground clutter scattering, and the signal processing method (the bottleneck of PCL) are currently available in the simulator tool. The digital signal processing (DSP) algorithms are applied both to simulated data and to real data measured by the NATO C3 Agency in their experiment at The Hague. The Institute of Information Theory and Automation of the Academy of Sciences of the Czech Republic takes part in the implementation of the DSP algorithms in FPGAs. The paper describes the simulator and signal processing structure, and presents results on both simulated and measured data.
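Detection in such systems typically rests on evaluating a cross-ambiguity function between the reference and surveillance channels. The brute-force C++ sketch below illustrates that computation; the noise-like signal model, sample rate, and bin spacings are assumptions for illustration, since the abstract does not disclose ERA's actual DSP chain:

```cpp
#include <cmath>
#include <complex>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    const double PI = 3.141592653589793;
    const int N = 1024;               // samples per coherent interval
    const double fs = 1.0e5;          // sample rate [Hz] (toy value)
    std::mt19937 rng(7);
    std::uniform_real_distribution<double> phase(0.0, 2 * PI);

    std::vector<std::complex<double>> ref(N), surv(N);
    for (int n = 0; n < N; ++n)       // noise-like broadcast waveform
        ref[n] = std::polar(1.0, phase(rng));
    for (int n = 0; n < N; ++n) {     // echo: 40-sample delay, 100 Hz Doppler
        std::complex<double> echo =
            (n >= 40) ? 0.5 * ref[n - 40] : std::complex<double>(0.0);
        surv[n] = echo * std::polar(1.0, 2 * PI * 100.0 * n / fs);
    }

    double best = 0.0; int bestLag = 0, bestDop = 0;
    for (int lag = 0; lag < 128; ++lag)                 // delay bins
        for (int dop = -200; dop <= 200; dop += 10) {   // Doppler bins [Hz]
            std::complex<double> acc = 0.0;
            for (int n = lag; n < N; ++n)
                acc += surv[n] * std::conj(ref[n - lag])
                       * std::polar(1.0, -2 * PI * dop * n / fs);
            if (std::abs(acc) > best) {
                best = std::abs(acc); bestLag = lag; bestDop = dop;
            }
        }
    std::printf("peak: delay=%d samples, Doppler=%d Hz\n", bestLag, bestDop);
}
```

The triple loop shows why this stage is a processing bottleneck; practical implementations replace it with FFT-based batch correlation, which also motivates the FPGA implementation work mentioned above.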
Abdominal surgery process modeling framework for simulation using spreadsheets.
Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja
2015-08-01
We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, thus programming experience is not essential for replicating or upgrading the model. Unlike existing methods, the proposed solution employs a modular approach to modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, thus allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Janssens, Sarah; Beckmann, Michael; Bonney, Donna
2015-08-01
Simulation training in laparoscopic surgery has been shown to improve surgical performance. The aim was to describe the implementation of a laparoscopic simulation training and credentialing program for gynaecology registrars. A pilot program, consisting of protected, supervised laparoscopic simulation time, a tailored curriculum, and a credentialing process, was developed and implemented. Quantitative measures assessing simulated surgical performance were taken over the simulation training period. Laparoscopic procedures requiring credentialing were assessed for both the frequency of a registrar being the primary operator and the duration of surgery, and compared to a presimulation cohort. Qualitative measures of the quality of surgical training were assessed pre- and postsimulation. Improvements were seen in simulated surgical performance in efficiency domains. Operative time for procedures requiring credentialing was reduced by 12%. Primary operator status in the operating theatre for registrars was unchanged. Registrar assessment of training quality improved. The introduction of a laparoscopic simulation training and credentialing program resulted in improvements in simulated performance, reduced operative time, and improved registrar assessment of the quality of training. © 2015 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.
Lattice QCD simulations using the OpenACC platform
NASA Astrophysics Data System (ADS)
Majumdar, Pushan
2016-10-01
In this article we explore the OpenACC platform for programming Graphics Processing Units (GPUs). The OpenACC platform offers a directive-based programming model for GPUs which avoids the detailed data flow control and memory management necessary in a CUDA programming environment. In the OpenACC model, programs can be written in high-level languages with OpenMP-like directives. We present some examples of QCD simulation codes using OpenACC and discuss their performance on the Fermi and Kepler GPUs.
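As a flavor of the directive-based model, the sketch below (a generic C++ example, not code from the article) parallelizes a simple vector update with a single OpenACC directive, leaving kernel generation and data movement to the compiler:

```cpp
#include <cstdio>

int main() {
    const int N = 1 << 20;
    static float x[1 << 20], y[1 << 20];
    for (int i = 0; i < N; ++i) { x[i] = 1.0f; y[i] = 2.0f; }
    const float a = 0.5f;

    // The directive asks the compiler to offload the loop; copyin/copy
    // manage host<->device transfers without explicit CUDA memory calls.
    #pragma acc parallel loop copyin(x[0:N]) copy(y[0:N])
    for (int i = 0; i < N; ++i)
        y[i] += a * x[i];

    std::printf("y[0] = %f\n", y[0]);
}
```

Built with an OpenACC-capable compiler (for example, `pgc++ -acc`), the annotated loop is offloaded to the GPU; without such a compiler the pragma is simply ignored and the loop runs serially, which is part of the model's appeal for portable simulation codes.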
A computer program for simulating geohydrologic systems in three dimensions
Posson, D.R.; Hearne, G.A.; Tracy, J.V.; Frenzel, P.F.
1980-01-01
This document is directed toward individuals who wish to use a computer program to simulate ground-water flow in three dimensions. The strongly implicit procedure (SIP) numerical method is used to solve the set of simultaneous equations. New data processing techniques and program input and output options are emphasized. The aquifer system to be modeled may be heterogeneous and anisotropic, and may include both artesian and water-table conditions. Systems which consist of well-defined alternating layers of highly permeable and poorly permeable material may be represented by a sequence of equations for two-dimensional flow in each of the highly permeable units. Boundaries where head or flux is user-specified may be irregularly shaped. The program also allows the user to represent streams as limited-source boundaries when the streamflow is small in relation to the hydraulic stress on the system. The data-processing techniques relating to 'cube' input and output, to swapping of layers, to restarting of a simulation, to free-format NAMELIST input, to the details of each subroutine's logic, and to the overlay program structure are discussed. The program is capable of processing large models that might overflow computer memories with conventional programs. Detailed instructions for selecting program options, for initializing the data arrays, for defining 'cube' output lists and maps, and for plotting hydrographs of calculated and observed heads and/or drawdowns are provided. Output may be restricted to those nodes of particular interest, thereby reducing the volume of printout, which may be critical when working at remote terminals. 'Cube' input commands allow the modeler to set aquifer parameters and initialize the model with very few input records. Appendixes provide instructions to compile the program, definitions and cross-references for program variables, a summary of the FLECS structured FORTRAN programming language, listings of the FLECS and FORTRAN source code, and samples of input and output for example simulations. (USGS)
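For reference, the governing relation that such three-dimensional models discretize is the standard anisotropic groundwater-flow equation (notation assumed here; the report's own formulation may differ in detail):

\[
\frac{\partial}{\partial x}\!\left(K_{xx}\frac{\partial h}{\partial x}\right)
+\frac{\partial}{\partial y}\!\left(K_{yy}\frac{\partial h}{\partial y}\right)
+\frac{\partial}{\partial z}\!\left(K_{zz}\frac{\partial h}{\partial z}\right)
- W = S_s\,\frac{\partial h}{\partial t}
\]

where \(h\) is hydraulic head, \(K_{xx}\), \(K_{yy}\), \(K_{zz}\) are the principal hydraulic conductivities, \(W\) is a volumetric source/sink term, and \(S_s\) is specific storage. SIP solves the large sparse system that results from finite-difference discretization of this equation.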
A better sequence-read simulator program for metagenomics.
Johnson, Stephen; Trost, Brett; Long, Jeffrey R; Pittet, Vanessa; Kusalik, Anthony
2014-01-01
There are many programs available for generating simulated whole-genome shotgun sequence reads. The data generated by many of these programs follow predefined models, which limits their use to the authors' original intentions. For example, many models assume that read lengths follow a uniform or normal distribution. Other programs generate models from actual sequencing data, but are limited to reads from single-genome studies. To our knowledge, there are no programs that allow a user to generate simulated data following non-parametric read-length distributions and quality profiles based on empirically-derived information from metagenomics sequencing data. We present BEAR (Better Emulation for Artificial Reads), a program that uses a machine-learning approach to generate reads with lengths and quality values that closely match empirically-derived distributions. BEAR can emulate reads from various sequencing platforms, including Illumina, 454, and Ion Torrent. BEAR requires minimal user input, as it automatically determines appropriate parameter settings from user-supplied data. BEAR also uses a unique method for deriving run-specific error rates, and extracts useful statistics from the metagenomic data itself, such as quality-error models. Many existing simulators are specific to a particular sequencing technology; however, BEAR is not restricted in this way. Because of its flexibility, BEAR is particularly useful for emulating the behaviour of technologies like Ion Torrent, for which no dedicated sequencing simulators are currently available. BEAR is also the first metagenomic sequencing simulator program that automates the process of generating abundances, which can be an arduous task. BEAR is useful for evaluating data processing tools in genomics. It has many advantages over existing comparable software, such as generating more realistic reads and being independent of sequencing technology, and has features particularly useful for metagenomics work.
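The non-parametric idea is simply to resample from what the data show rather than from a fitted parametric family. A minimal C++ sketch of empirical read-length sampling follows; the histogram counts are hypothetical, and BEAR itself additionally models per-position quality and error profiles learned from the data:

```cpp
#include <cstdio>
#include <random>
#include <vector>

int main() {
    // Hypothetical histogram: observed counts of read lengths 100..104
    // taken from a real metagenomic sequencing run.
    std::vector<int>    lengths = {100, 101, 102, 103, 104};
    std::vector<double> counts  = {12, 340, 2210, 880, 45};

    std::mt19937 rng(1);
    // Sample indices in proportion to the observed counts: a
    // distribution-free emulation of the empirical length distribution.
    std::discrete_distribution<int> pick(counts.begin(), counts.end());
    for (int i = 0; i < 5; ++i)
        std::printf("simulated read length: %d\n", lengths[pick(rng)]);
}
```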
AMPS/PC - AUTOMATIC MANUFACTURING PROGRAMMING SYSTEM
NASA Technical Reports Server (NTRS)
Schroer, B. J.
1994-01-01
The AMPS/PC system is a simulation tool designed to aid the user in defining the specifications of a manufacturing environment and then automatically writing code for the target simulation language, GPSS/PC. The domain of problems that AMPS/PC can simulate is manufacturing assembly lines with subassembly lines and manufacturing cells. The user defines the problem domain by responding to questions from the interface program. Based on the responses, the interface program creates an internal problem specification file. This file includes the manufacturing process network flow and the attributes for all stations, cells, and stock points. AMPS then uses the problem specification file as input for the automatic code generator program to produce a simulation program in the target language GPSS. The output of the generator program is the source code of the corresponding GPSS/PC simulation program. The system runs entirely on an IBM PC running PC DOS Version 2.0 or higher and is written in Turbo Pascal Version 4, requiring 640K of memory and one 360K disk drive. To execute the GPSS program, the PC must have the GPSS/PC System Version 2.0 from Minuteman Software resident. The AMPS/PC program was developed in 1988.
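The generator step can be pictured as a small translator that walks the problem specification and emits GPSS blocks. The C++ sketch below is illustrative only; the station-list format and numeric values are hypothetical, and AMPS/PC itself is written in Turbo Pascal and also handles cells, stock points, and network flow:

```cpp
#include <cstdio>
#include <string>
#include <vector>

struct Station { std::string name; double meanTime, halfRange; };

int main() {
    // Hypothetical problem specification: two stations in series.
    std::vector<Station> line = {{"DRILL", 5.0, 1.0}, {"WELD", 8.0, 2.0}};

    std::printf("        GENERATE 10,2      ; parts arrive\n");
    for (const auto& s : line) {               // queue/seize/advance/release
        std::printf("        QUEUE    %sQ\n", s.name.c_str());
        std::printf("        SEIZE    %s\n",  s.name.c_str());
        std::printf("        DEPART   %sQ\n", s.name.c_str());
        std::printf("        ADVANCE  %.1f,%.1f\n", s.meanTime, s.halfRange);
        std::printf("        RELEASE  %s\n",  s.name.c_str());
    }
    std::printf("        TERMINATE 1\n");
}
```

Text of this form is then run under the GPSS/PC system.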
Pre- and postprocessing for reservoir simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rogers, W.L.; Ingalls, L.J.; Prasad, S.J.
1991-05-01
This paper describes the functionality and underlying programming paradigms of Shell's simulator-related reservoir-engineering graphics system. This system includes the simulation postprocessing programs Reservoir Display System (RDS) and Fast Reservoir Engineering Displays (FRED), a hypertext-like on-line documentation system (DOC), and a simulator input preprocessor (SIMPLSIM). RDS creates displays of reservoir simulation results. These displays represent the areal or cross-section distribution of computed reservoir parameters, such as pressure, phase saturation, or temperature. Generation of these images at real-time animation rates is discussed. FRED facilitates the creation of plot files from reservoir simulation output. The use of dynamic memory allocation, asynchronous I/O, a table-driven screen manager, and mixed-language (FORTRAN and C) programming are detailed. DOC is used to create and access on-line documentation for the pre- and postprocessing programs and the reservoir simulators. DOC can be run by itself or can be accessed from within any other graphics or nongraphics application program. DOC includes a text editor, which is the basis for a reservoir simulation tutorial and greatly simplifies the preparation of simulator input. The use of sharable images, graphics, and the documentation file network are described. Finally, SIMPLSIM is a suite of programs that uses interactive graphics in the preparation of reservoir description data for input into reservoir simulators. The SIMPLSIM user-interface manager (UIM) and its graphic interface for reservoir description are discussed.
A process-based algorithm for simulating terraces in SWAT
USDA-ARS's Scientific Manuscript database
Terraces in crop fields are one of the most important soil and water conservation measures that affect runoff and erosion processes in a watershed. In large hydrological programs such as the Soil and Water Assessment Tool (SWAT), terrace effects are simulated by adjusting the slope length and the US...
DOT National Transportation Integrated Search
1982-06-01
This volume provides a general description of the Airport Landside Simulation Model. A summary of simulated passenger and vehicular processing through the landside is presented. Program operating characteristics and assumptions are documented and a c...
Note on the artefacts in SRIM simulation of sputtering
NASA Astrophysics Data System (ADS)
Shulga, V. I.
2018-05-01
The computer simulation program SRIM, unlike other well-known programs (MARLOWE, TRIM.SP, etc.), predicts non-zero values of the sputter yield at glancing ion bombardment of smooth amorphous targets and, for heavy ions, greatly underestimates the sputter yield at normal incidence. To understand the reasons for this, the sputtering of amorphous silicon bombarded with different ions was modeled here using the author's program OKSANA. Most simulations refer to 1 keV Xe ions, and the angles of incidence cover the range from 0 (normal incidence) to almost 90°. It has been shown that SRIM improperly simulates the initial stage of the sputtering process. Some other artefacts in SRIM calculations of sputtering are also revealed and discussed.
ERIC Educational Resources Information Center
Ndirangu, Mwangi; Kiboss, Joel K.; Wekesa, Eric W.
2005-01-01
The application of computer technology in education is a relatively new approach that is trying to justify inclusion in the Kenyan school curriculum. Being abstract, with a dynamic nature that does not manifest itself visibly, the process of cell division has posed difficulties for teachers. Consequently, a computer simulation program, using…
Using artificial intelligence to control fluid flow computations
NASA Technical Reports Server (NTRS)
Gelsey, Andrew
1992-01-01
Computational simulation is an essential tool for the prediction of fluid flow. Many powerful simulation programs exist today. However, using these programs to reliably analyze fluid flow and other physical situations requires considerable human effort and expertise to set up a simulation, determine whether the output makes sense, and repeatedly run the simulation with different inputs until a satisfactory result is achieved. Automating this process is not only of considerable practical importance but will also significantly advance basic artificial intelligence (AI) research in reasoning about the physical world.
Arab, Abeer; Alatassi, Abdulaleem; Alattas, Elias; Alzoraigi, Usamah; AlZaher, Zaki; Ahmad, Abdulaziz; Albabtain, Hesham; Boker, Abdulaziz
2017-01-01
The educational programs of the Saudi Commission for Health Specialties are developing rapidly on the technical front, particularly in scientific areas related to what is commonly known as evidence-based medicine. This review highlights the critical need for, and the importance of, integrating simulation into anesthesia training and assessment. Furthermore, it describes the current utilization of simulation in the anesthesia and critical care assessment process. PMID:28442961
NASA Technical Reports Server (NTRS)
Burgin, G. H.; Fogel, L. J.; Phelps, J. P.
1975-01-01
A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch processing version, the flight paths of two aircraft engaged in interactive aerial combat and controlled by the same logic are computed. The real-time version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.
Graphical simulation for aerospace manufacturing
NASA Technical Reports Server (NTRS)
Babai, Majid; Bien, Christopher
1994-01-01
Simulation software has become a key technological enabler for integrating flexible manufacturing systems and streamlining the overall aerospace manufacturing process. In particular, robot simulation and offline programming software is being credited for reducing down time and labor cost, while boosting quality and significantly increasing productivity.
A direct-execution parallel architecture for the Advanced Continuous Simulation Language (ACSL)
NASA Technical Reports Server (NTRS)
Carroll, Chester C.; Owen, Jeffrey E.
1988-01-01
A direct-execution parallel architecture for the Advanced Continuous Simulation Language (ACSL) is presented which overcomes the traditional disadvantages of simulations executed on a digital computer. The incorporation of parallel processing allows the mapping of simulations into a digital computer to be done in the same inherently parallel manner as they are currently mapped onto an analog computer. The direct-execution format maximizes the efficiency of the executed code since the need for a high-level language compiler is eliminated. Resolution is greatly increased over that which is available with an analog computer, without the sacrifice in execution speed normally expected with digital computer simulations. Although this report covers all aspects of the new architecture, key emphasis is placed on the processing element configuration and the microprogramming of the ACSL constructs. The execution times for all ACSL constructs are computed using a model of a processing element based on the AMD 29000 CPU and the AMD 29027 FPU. The increase in execution speed provided by parallel processing is exemplified by comparing the derived execution times of two ACSL programs with the execution times for the same programs executed on a similar sequential architecture.
Simulated interprofessional education: an analysis of teaching and learning processes.
van Soeren, Mary; Devlin-Cop, Sandra; Macmillan, Kathleen; Baker, Lindsay; Egan-Lee, Eileen; Reeves, Scott
2011-11-01
Simulated learning activities are increasingly being used in health professions and interprofessional education (IPE). Specifically, IPE programs are frequently adopting role-play simulations as a key learning approach. Despite this widespread adoption, there is little empirical evidence exploring the teaching and learning processes embedded within this type of simulation. This exploratory study provides insight into the nature of these processes through the use of qualitative methods. A total of 152 clinicians, 101 students and 9 facilitators representing a range of health professions participated in video-recorded role-plays and debrief sessions. Videotapes were analyzed to explore emerging issues and themes in the teaching and learning processes embedded in this type of interprofessional simulated learning experience. In addition, three focus groups were conducted with a subset of participants to explore their perceptions of the educational experience. Five key themes emerged from the data analysis: enthusiasm and motivation, professional role assignment, scenario realism, facilitator style and background, and team facilitation. Our findings suggest that program developers need to be mindful of these five themes when using role-plays in an interprofessional context, and they point to the importance of deliberate and skilled facilitation in meeting desired learning outcomes.
National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion
NASA Technical Reports Server (NTRS)
Follen, G.; Naiman, C.; Evans, A.
1999-01-01
Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. The National Cycle Program (NCP) was written following the Object Oriented Paradigm (C++, CORBA). The software development process used was also based on the Object Oriented paradigm. Software reviews, configuration management, test plans, requirements, and design were all a part of the process used in developing NCP. Due to the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.
Automated Classification of Phonological Errors in Aphasic Language
Ahuja, Sanjeev B.; Reggia, James A.; Berndt, Rita S.
1984-01-01
Using heuristically guided state-space search, a prototype program has been developed to simulate and classify phonemic errors occurring in the speech of neurologically impaired patients. Simulations are based on an interchangeable rule/operator set of elementary errors which represents a theory of phonemic processing faults. This work introduces and evaluates a novel approach to error simulation and classification; it provides a prototype simulation tool for neurolinguistic research; and it forms the initial phase of a larger research effort involving computer modelling of neurolinguistic processes.
David A. Marquis; Richard L. Ernst
1992-01-01
Describes the purpose and function of the SILVAH computer program in general terms; provides detailed instructions on use of the program; and provides information on program organization, data formats, and the basis of processing algorithms.
NASA Technical Reports Server (NTRS)
Madden, Michael G.; Wyrick, Roberta; O'Neill, Dale E.
2005-01-01
Space Shuttle processing is a complicated and highly variable project. The planning and scheduling problem, categorized as a Resource-Constrained Stochastic Project Scheduling Problem (RC-SPSP), has a great deal of variability in the Orbiter Processing Facility (OPF) process flow from one flight to the next. Simulation modeling is a useful tool for estimating the makespan of the overall process. However, simulation requires a model to be developed, which is itself a labor- and time-consuming effort. With such a dynamic process, the model can easily fall out of synchronization with the actual process, limiting the applicability of the simulation answers to the actual estimation problem. Integration of TEAMS model-enabling software with our existing schedule program software is the basis of our solution. This paper explains the approach used to develop an auto-generated simulation model from planning and scheduling efforts and available data.
Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, Michel; Archer, Bill; Matzen, M. Keith
2014-09-16
The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.
WEST-3 wind turbine simulator development
NASA Technical Reports Server (NTRS)
Hoffman, J. A.; Sridhar, S.
1985-01-01
The software developed for WEST-3, a new, all-digital, and fully programmable wind turbine simulator, is described. The process of wind turbine simulation on WEST-3 is described in detail. The major steps are the processing of the mathematical models, the preparation of the constant data, and the use of system-software-generated executable code for running on WEST-3. The mechanics of reformulation, normalization, and scaling of the mathematical models are discussed in detail, in particular the significance of reformulation, which leads to accurate simulations. Descriptions are given for the preprocessor computer programs that are used to prepare the constant data needed in the simulation. These programs, in addition to scaling and normalizing all the constants, relieve the user from having to generate a large number of constants used in the simulation. Also given are brief descriptions of the components of the WEST-3 system software: Translator, Assembler, Linker, and Loader. Also included are details of the aeroelastic rotor analysis, which is the center of a wind turbine simulation model; an analysis of the gimbal subsystem; and listings of the variables, constants, and equations used in the simulation.
Global Scale Atmospheric Processes Research Program Review
NASA Technical Reports Server (NTRS)
Worley, B. A. (Editor); Peslen, C. A. (Editor)
1984-01-01
Global modeling; satellite data assimilation and initialization; simulation of future observing systems; model and observed energetics; dynamics of planetary waves; First Global Atmospheric Research Program Global Experiment (FGGE) diagnosis studies; and National Research Council Research Associateship Program are discussed.
TOWARDS A MULTI-SCALE AGENT-BASED PROGRAMMING LANGUAGE METHODOLOGY
Somogyi, Endre; Hagar, Amit; Glazier, James A.
2017-01-01
Living tissues are dynamic, heterogeneous compositions of objects, including molecules, cells and extra-cellular materials, which interact via chemical, mechanical and electrical processes and reorganize via transformation, birth, death and migration processes. Current programming languages have difficulty describing the dynamics of tissues because: 1: Dynamic sets of objects participate simultaneously in multiple processes, 2: Processes may be either continuous or discrete, and their activity may be conditional, 3: Objects and processes form complex, heterogeneous relationships and structures, 4: Objects and processes may be hierarchically composed, 5: Processes may create, destroy and transform objects and processes. Some modeling languages support these concepts, but most cannot translate models into executable simulations. We present a new hybrid executable modeling language paradigm, the Continuous Concurrent Object Process Methodology (CCOPM), which naturally expresses tissue models, enabling users to visually create agent-based models of tissues, and also allows computer simulation of these models. PMID:29282379
Design and application of star map simulation system for star sensors
NASA Astrophysics Data System (ADS)
Wu, Feng; Shen, Weimin; Zhu, Xifang; Chen, Yuheng; Xu, Qinquan
2013-12-01
Modern star sensors measure attitude automatically and with high accuracy, helping to assure the performance of spacecraft. They obtain accurate attitudes by applying algorithms that process star maps acquired by the star camera mounted on them. Star maps therefore play an important role in designing star cameras and developing processing algorithms. Furthermore, star maps provide significant support for examining the performance of star sensors thoroughly before launch. However, it is not always convenient to supply abundant star maps by taking pictures of the sky. Computer-based star map simulation has therefore attracted much interest by virtue of its low cost and convenience. A method to simulate star maps by programming and extending the functions of the optical design program ZEMAX is proposed, and the star map simulation system is established. First, based on an analysis of the working procedures star sensors use to measure attitudes and of the basic method of designing optical systems in ZEMAX, the principle of simulating star sensor imaging is set out in detail. The theory of adding false stars and noise and of outputting maps is discussed, and the corresponding approaches are proposed. Then, by external programming, the star map simulation program is designed and produced, and its user interface and operation are introduced. Applications of the star map simulation method in evaluating the optical system, the star image extraction algorithm, and the star identification algorithm, and in calibrating system errors, are presented. It is shown that the proposed simulation method provides substantial support for the study of star sensors and efficiently improves their performance.
Establishing High-Quality Prostate Brachytherapy Using a Phantom Simulator Training Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thaker, Nikhil G.; Kudchadker, Rajat J.; Swanson, David A.
2014-11-01
Purpose: To design and implement a unique training program that uses a phantom-based simulator to teach the process of prostate brachytherapy (PB) quality assurance and improve the quality of education. Methods and Materials: Trainees in our simulator program were practicing radiation oncologists, radiation oncology residents, and fellows of the American Brachytherapy Society. The program emphasized 6 core areas of quality assurance: patient selection, simulation, treatment planning, implant technique, treatment evaluation, and outcome assessment. Using the iodine-125 (125I) preoperative treatment planning technique, trainees implanted their ultrasound phantoms with dummy seeds (ie, seeds with no activity). Pre- and postimplant dosimetric parameters were compared and correlated using regression analysis. Results: Thirty-one trainees successfully completed the simulator program during the period under study. The mean phantom prostate size, number of seeds used, and total activity were generally consistent between trainees. All trainees met the V100 >95% objective both before and after implantation. Regardless of the initial volume of the prostate phantom, trainees' ability to cover the target volume with at least 100% of the dose (V100) was not compromised (R=0.99 pre- and postimplant). However, the V150 had lower concordance (R=0.37) and may better reflect heterogeneity control of the implant process. Conclusions: Analysis of implants from this phantom-based simulator shows a high degree of consistency between trainees and uniformly high-quality implants with respect to parameters used in clinical practice. This training program provides a valuable educational opportunity that improves the quality of PB training and likely accelerates the learning curve inherent in PB. Prostate phantom implantation can be a valuable first step in the acquisition of the skills required to safely perform PB.
Proposal of Modification Strategy of NC Program in the Virtual Manufacturing Environment
NASA Astrophysics Data System (ADS)
Narita, Hirohisa; Chen, Lian-Yi; Fujimoto, Hideo; Shirase, Keiichi; Arai, Eiji
Virtual manufacturing will be a key technology in process planning because there are currently no evaluation tools for cutting conditions. Therefore, a virtual machining simulator (VMSim), which can predict end milling processes, has been developed. A modification strategy for NC programs using VMSim is proposed in this paper.
USER MANUAL FOR EXPRESS, THE EXAMS-PRZM EXPOSURE SIMULATION SHELL
The Environmental Fate and Effects Division (EFED) of EPA's Office of Pesticide Programs (OPP) uses a suite of ORD simulation models for the exposure analysis portion of regulatory risk assessments. These models (PRZM, EXAMS, AgDisp) are complex, process-based simulation codes tha...
Parallel Signal Processing and System Simulation using aCe
NASA Technical Reports Server (NTRS)
Dorband, John E.; Aburdene, Maurice F.
2003-01-01
Recently, networked and cluster computation have become very popular for both signal processing and system simulation. The new aCe language is ideally suited for parallel signal processing applications and system simulation since it allows the programmer to explicitly express the computations that can be performed concurrently. In addition, this new C-based parallel language (aCe C) for architecture-adaptive programming allows programmers to implement algorithms and system simulation applications on parallel architectures by providing them with the assurance that future parallel architectures will be able to run their applications with a minimum of modification. In this paper, we focus on some fundamental features of aCe C and present a signal processing application (FFT).
Bosse, Stefan
2015-01-01
Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks in simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strong heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques. PMID:25690550
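A toy sketch of the activity-transition-graph idea follows (C++, two activities and one guarded transition; the platform described above additionally represents such behavior as self-modifying, migratable program code executing on a zero-operand stack machine, which is not modeled here):

```cpp
#include <cstdio>

enum class Activity { Sense, Report };

struct Agent {
    Activity a = Activity::Sense;     // current activity (control state)
    int reading = 0, threshold = 5;   // agent data state

    void step() {
        switch (a) {
        case Activity::Sense:
            ++reading;                                       // toy sensor sample
            if (reading > threshold) a = Activity::Report;   // guarded transition
            break;
        case Activity::Report:
            std::printf("report: %d\n", reading);            // emit and reset
            reading = 0;
            a = Activity::Sense;
            break;
        }
    }
};

int main() {
    Agent ag;
    for (int i = 0; i < 14; ++i) ag.step();
}
```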
Formalizing Knowledge in Multi-Scale Agent-Based Simulations
Somogyi, Endre; Sluka, James P.; Glazier, James A.
2017-01-01
Multi-scale, agent-based simulations of cellular and tissue biology are increasingly common. These simulations combine and integrate a range of components from different domains. Simulations continuously create, destroy and reorganize constituent elements causing their interactions to dynamically change. For example, the multi-cellular tissue development process coordinates molecular, cellular and tissue scale objects with biochemical, biomechanical, spatial and behavioral processes to form a dynamic network. Different domain specific languages can describe these components in isolation, but cannot describe their interactions. No current programming language is designed to represent in human readable and reusable form the domain specific knowledge contained in these components and interactions. We present a new hybrid programming language paradigm that naturally expresses the complex multi-scale objects and dynamic interactions in a unified way and allows domain knowledge to be captured, searched, formalized, extracted and reused. PMID:29338063
NVSIM: UNIX-based thermal imaging system simulator
NASA Astrophysics Data System (ADS)
Horger, John D.
1993-08-01
For several years the Night Vision and Electronic Sensors Directorate (NVESD) has been using an internally developed forward-looking infrared (FLIR) simulation program. In response to interest in the simulation part of these projects by other organizations, NVESD has been working on a new version of the simulation, NVSIM, that will be made generally available to the FLIR-using community. NVSIM uses basic FLIR specification data, high-resolution thermal input imagery and spatial-domain image processing techniques to produce simulated image outputs from a broad variety of FLIRs. It is being built around modular programming techniques to allow simpler addition of more sensor effects. The modularity also allows selective inclusion and exclusion of individual sensor effects at run time. The simulation has been written in the industry-standard ANSI C programming language under the widely used UNIX operating system to make it easily portable to a wide variety of computer platforms.
Building Interactive Simulations in Web Pages without Programming.
Mailen Kootsey, J; McAuley, Grant; Bernal, Julie
2005-01-01
A software system is described for building interactive simulations and other numerical calculations in Web pages. The system is based on a new Java-based software architecture named NumberLinX (NLX) that isolates each function required to build the simulation, so that a library of reusable objects can be assembled. The NLX objects are integrated into a commercial Web design program for coding-free page construction. The model description is entered through a wizard-like utility program that also functions as a model editor. The complete system permits very rapid construction of interactive simulations without coding. A wide range of applications is possible with the system beyond interactive calculations, including remote data collection and processing and collaboration over a network.
NASA Astrophysics Data System (ADS)
Huppert, J.; Michal Lomask, S.; Lazarowitz, R.
2002-08-01
Computer-assisted learning, including simulated experiments, has great potential to address the problem-solving process, which is a complex activity; understanding the use of simulations as an instructional device requires a highly structured approach. This study is based on a computer simulation program, 'The Growth Curve of Microorganisms', which required tenth-grade biology students to use problem-solving skills while simultaneously manipulating three independent variables in one simulated experiment. The aims were to investigate the computer simulation's impact on students' academic achievement and on their mastery of science process skills in relation to their cognitive stages. The results indicate that the concrete and transition operational students in the experimental group achieved significantly higher academic achievement than their counterparts in the control group. The higher the cognitive operational stage, the higher students' achievement was, except in the control group, where students in the concrete and transition operational stages did not differ. Girls and boys in the experimental group achieved equally. Students' academic achievement may indicate the potential impact a computer simulation program can have, enabling students with low reasoning abilities to cope successfully with learning concepts and principles in science which require high cognitive skills.
Parallelization of Program to Optimize Simulated Trajectories (POST3D)
NASA Technical Reports Server (NTRS)
Hammond, Dana P.; Korte, John J. (Technical Monitor)
2001-01-01
This paper describes the parallelization of the Program to Optimize Simulated Trajectories (POST3D). POST3D uses a gradient-based optimization algorithm that reaches an optimum design point by moving from one design point to the next. The gradient calculations required to complete the optimization process dominate the computational time and have been parallelized using a Single Program Multiple Data (SPMD) approach on a distributed-memory NUMA (non-uniform memory access) architecture. The Origin2000 was used for the tests presented.
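The parallelization strategy described here exploits the fact that each finite-difference partial derivative is an independent evaluation of the objective, so the perturbed design points can be farmed out to workers. A minimal sketch of that pattern in Python, with a stand-in objective rather than POST3D's trajectory cost:

```python
from multiprocessing import Pool
import numpy as np

def objective(x):
    # Stand-in for an expensive trajectory simulation (assumed, for illustration).
    return np.sum((x - 1.0) ** 2) + np.prod(np.cos(x))

def partial(args):
    x, i, h = args
    xp, xm = x.copy(), x.copy()
    xp[i] += h
    xm[i] -= h
    return (objective(xp) - objective(xm)) / (2 * h)  # central difference

def gradient(x, h=1e-6, workers=4):
    # Each component is independent, so evaluate them in parallel (SPMD-style).
    with Pool(workers) as pool:
        return np.array(pool.map(partial, [(x, i, h) for i in range(len(x))]))

if __name__ == "__main__":
    x0 = np.zeros(8)
    print(gradient(x0))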
2014-04-30
cost to acquire systems, as design maturity could be verified incrementally as the system was developed vice waiting for specific large "big bang" ... Framework (MBAF) be applied to simulate or optimize process variations on programs? LSI Roles and Responsibilities: a review of the roles and ... the model/process optimization process. It is the current intent that NAVAIR will use the model to run simulations on process changes in an attempt to ...
The National Shipbuilding Research Program, Computer Aided Process Planning for Shipyards
1986-08-01
Table-of-contents fragments (recoverable section titles): "Financial Justification of State-of-the-Art Investment: A Study Using CAPP"; "... and engineer to order."; "Factory Simulation: Approach to Integration of Computer-Based Factory Simulation with Conventional Factory Planning Techniques"
Freebairn, L; Atkinson, J; Kelly, P; McDonnell, G; Rychetnik, L
2016-09-21
Evidence-informed decision-making is essential to ensure that health programs and services are effective and offer value for money; however, barriers to the use of evidence persist. Emerging systems science approaches and advances in technology are providing new methods and tools to facilitate evidence-based decision-making. Simulation modelling offers a unique tool for synthesising and leveraging existing evidence, data and expert local knowledge to examine, in a robust, low risk and low cost way, the likely impact of alternative policy and service provision scenarios. This case study will evaluate participatory simulation modelling to inform the prevention and management of gestational diabetes mellitus (GDM). The risks associated with GDM are well recognised; however, debate remains regarding diagnostic thresholds and whether screening and treatment to reduce maternal glucose levels reduce the associated risks. A diagnosis of GDM may provide a leverage point for multidisciplinary lifestyle modification interventions. This research will apply and evaluate a simulation modelling approach to understand the complex interrelation of factors that drive GDM rates, test options for screening and interventions, and optimise the use of evidence to inform policy and program decision-making. The study design will use mixed methods to achieve the objectives. Policy, clinical practice and research experts will work collaboratively to develop, test and validate a simulation model of GDM in the Australian Capital Territory (ACT). The model will be applied to support evidence-informed policy dialogues with diverse stakeholders for the management of GDM in the ACT. Qualitative methods will be used to evaluate simulation modelling as an evidence synthesis tool to support evidence-based decision-making. Interviews and analysis of workshop recordings will focus on the participants' engagement in the modelling process; perceived value of the participatory process, perceived commitment, influence and confidence of stakeholders in implementing policy and program decisions identified in the modelling process; and the impact of the process in terms of policy and program change. The study will generate empirical evidence on the feasibility and potential value of simulation modelling to support knowledge mobilisation and consensus building in health settings.
Analytical solutions for coagulation and condensation kinetics of composite particles
NASA Astrophysics Data System (ADS)
Piskunov, Vladimir N.
2013-04-01
The processes of composite particle formation, where particles consist of a mixture of different materials, are essential for many practical problems: for analysis of the consequences of accidental releases into the atmosphere; for simulation of precipitation formation in clouds; for description of multi-phase processes in chemical reactors and industrial facilities. Computer codes developed for numerical simulation of these processes require optimization of computational methods and verification of numerical programs. Kinetic equations of composite particle formation are given in this work in a concise form (impurity integrated). Coagulation, condensation and external sources associated with nucleation are taken into account. Analytical solutions were obtained in a number of model cases. The general laws for the fraction redistribution of impurities were defined. The results can be applied to develop numerical algorithms that considerably reduce the simulation effort, as well as to verify numerical programs for calculation of the formation kinetics of composite particles in problems of practical importance.
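The paper's own solutions are not reproduced in the abstract, but the kind of code verification it enables is easy to sketch: integrate the discrete Smoluchowski coagulation equations numerically and compare against the classical constant-kernel solution N_k(t) = N0 (t/tau)^(k-1) / (1 + t/tau)^(k+1), with tau = 2/(K N0), valid for monodisperse initial conditions. A minimal Python check (truncation size and step size are arbitrary choices):

```python
import numpy as np

K, N0, KMAX, DT, STEPS = 1.0, 1.0, 40, 1e-3, 1000

n = np.zeros(KMAX + 1)   # n[k] = number density of k-mers (index 0 unused)
n[1] = N0                # monodisperse initial condition

for _ in range(STEPS):
    dn = np.zeros_like(n)
    for i in range(1, KMAX):
        for j in range(1, KMAX - i + 1):
            rate = 0.5 * K * n[i] * n[j]
            dn[i + j] += rate          # gain of (i+j)-mers
            dn[i] -= rate              # loss of i- and j-mers
            dn[j] -= rate
    # loss to sizes beyond the truncation KMAX is ignored (finite-size error)
    n += DT * dn                       # forward Euler step

t = STEPS * DT
T = t / (2.0 / (K * N0))               # dimensionless time t / tau
for k in (1, 2, 3):
    exact = N0 * T ** (k - 1) / (1 + T) ** (k + 1)
    print(k, n[k], exact)              # numerical vs. analytical
```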
Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meisner, Robert; McCoy, Michel; Archer, Bill
2013-09-11
The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Moreover, ASC's business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools.
Manipulation and handling processes off-line programming and optimization with use of K-Roset
NASA Astrophysics Data System (ADS)
Gołda, G.; Kampa, A.
2017-08-01
Contemporary trends in the development of efficient, flexible manufacturing systems require the practical implementation of modern "lean production" concepts, maximizing customer value by minimizing waste in manufacturing and logistics processes. Every FMS is built on the basis of automated and robotized production cells. Besides flexible CNC machine tools and other equipment, industrial robots are primary elements of the system. In this study, the authors look for waste of time and cost in real robot tasks during manipulation processes. For optimizing handling and manipulation processes performed by robots, the application of modern off-line programming methods and computer simulation is the best solution and the only way to minimize unnecessary movements and other instructions. The modelling process of a robotized production cell and the off-line programming of Kawasaki robots in AS-Language are described. The simulation of the robotized workstation is realized with the virtual-reality software K-Roset. The authors show the process of improving and optimizing industrial robot programs in terms of minimizing the number of useless manipulator movements and unnecessary instructions, in order to shorten production-cycle times and reduce the costs of handling, manipulation and the technological process.
In situ simulated cardiac arrest exercises to detect system vulnerabilities.
Barbeito, Atilio; Bonifacio, Alberto; Holtschneider, Mary; Segall, Noa; Schroeder, Rebecca; Mark, Jonathan
2015-06-01
Sudden cardiac arrest is the leading cause of death in the United States. Despite new therapies, progress in this area has been slow, and outcomes remain poor even in the hospital setting, where providers, drugs, and devices are readily available. This is partly attributed to the quality of resuscitation, which is an important determinant of survival for patients who experience cardiac arrest. Systems problems, such as deficiencies in the physical space or equipment design, hospital-level policies, work culture, and poor leadership and teamwork, are now known to contribute significantly to the quality of resuscitation provided. We describe an in situ simulation-based quality improvement program that was designed to continuously monitor the cardiac arrest response process for hazards and defects and to detect opportunities for system optimization. A total of 72 simulated unannounced cardiac arrest exercises were conducted between October 2010 and September 2013 at various locations throughout our medical center and at different times of the day. We detected several environmental, human-machine interface, culture, and policy hazards and defects. We used the Systems Engineering Initiative for Patient Safety (SEIPS) model to understand the structure, processes, and outcomes related to the hospital's emergency response system. Multidisciplinary solutions were crafted for each of the hazards detected, and the simulation program was used to iteratively test the redesigned processes before implementation in real clinical settings. We describe an ongoing program that uses in situ simulation to identify and mitigate latent hazards and defects in the hospital emergency response system. The SEIPS model provides a framework for describing and analyzing the structure, processes, and outcomes related to these events.
Workflow of the Grover algorithm simulation incorporating CUDA and GPGPU
NASA Astrophysics Data System (ADS)
Lu, Xiangwen; Yuan, Jiabin; Zhang, Weiwei
2013-09-01
The Grover quantum search algorithm, one of only a few representative quantum algorithms, can speed up many classical algorithms that use search heuristics. No true quantum computer has yet been developed, so for the present, simulation is one effective means of verifying the search algorithm. In this work, we focus on the simulation workflow using a compute unified device architecture (CUDA). Two simulation workflow schemes are proposed. These schemes combine the characteristics of the Grover algorithm and the parallelism of general-purpose computing on graphics processing units (GPGPU). We also analyzed the optimization of memory space and memory access from this perspective. We implemented four programs on CUDA to evaluate the performance of the schemes and optimizations. Through experimentation, we analyzed the organization of threads suited to Grover algorithm simulations, compared the storage costs of the four programs, and validated the effectiveness of the optimizations. Experimental results also showed that the best-performing program on CUDA outperformed the serial libquantum program on a CPU with a speedup of up to 23 times (12 times on average), depending on the scale of the simulation.
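The serial baseline in studies like this is a state-vector simulation, and the GPU schemes parallelize exactly these vector updates. A minimal NumPy sketch of the Grover iteration itself (oracle sign-flip plus inversion about the mean), not the authors' CUDA code:

```python
import numpy as np

n_qubits, marked = 10, 123
N = 2 ** n_qubits

psi = np.full(N, 1 / np.sqrt(N))           # uniform superposition over N states
iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))

for _ in range(iterations):
    psi[marked] *= -1                      # oracle: phase-flip the marked state
    psi = 2 * psi.mean() - psi             # diffusion: inversion about the mean

print("success probability:", psi[marked] ** 2)   # close to 1
```

Each iteration is O(N) elementwise work on the amplitude vector, which is why offloading it to thousands of GPU threads pays off as N grows.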
A Computer Simulation of the Trophic Dynamics of an Aquatic System.
ERIC Educational Resources Information Center
Bowker, D. W.; Randerson, P. F.
1989-01-01
Described is a computer program, AQUASIM, which simulates interaction between environmental factors, phytoplankton, zooplankton, and fish in an aquatic ecosystem. The conceptual flow, equations, variables, rate processes, and parameter manipulations are discussed. (CW)
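AQUASIM's actual equations are not given in the abstract; a generic sketch of the kind of trophic-dynamics model it describes (phytoplankton grazed by zooplankton, zooplankton eaten by fish, with all rate constants invented for illustration) looks like this in Python:

```python
# Hypothetical rate parameters, not AQUASIM's.
r, gz, ez, gf, ef, mz, mf = 1.0, 0.6, 0.4, 0.3, 0.3, 0.1, 0.05
Kcap = 10.0                                # phytoplankton carrying capacity

def derivs(P, Z, F):
    dP = r * P * (1 - P / Kcap) - gz * P * Z        # growth minus grazing
    dZ = ez * gz * P * Z - gf * Z * F - mz * Z      # assimilation, predation, mortality
    dF = ef * gf * Z * F - mf * F
    return dP, dZ, dF

P, Z, F, dt = 5.0, 1.0, 0.5, 0.01
for step in range(20000):                  # simple Euler integration
    dP, dZ, dF = derivs(P, Z, F)
    P, Z, F = P + dt * dP, Z + dt * dZ, F + dt * dF
print(P, Z, F)
```

Parameter manipulations of the kind the abstract mentions amount to varying these constants and observing the resulting population trajectories.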
Survey: Computer Usage in Design Courses.
ERIC Educational Resources Information Center
Henley, Ernest J.
1983-01-01
Presents results of a survey of chemical engineering departments regarding computer usage in senior design courses. Results are categorized according to: computer usage (use of process simulators, student-written programs, faculty-written or "canned" programs; costs (hard and soft money); and available software. Programs offered are…
A Modularized Counselor-Education Program.
ERIC Educational Resources Information Center
Miller, Thomas V.; Dimattia, Dominic J.
1978-01-01
Counselor-education programs may be enriched through the use of modularized learning experiences. This article notes several recent articles on competency-based counselor education, the concepts of simulation and modularization, and describes the process of developing a modularized master's program at the University of Bridgeport in Connecticut.…
Static analysis techniques for semiautomatic synthesis of message passing software skeletons
Sottile, Matthew; Dagit, Jason; Zhang, Deli; ...
2015-06-29
The design of high-performance computing architectures demands performance analysis of large-scale parallel applications to derive various parameters concerning hardware design and software development. The process of performance analysis and benchmarking an application can be done in several ways with varying degrees of fidelity. One of the most cost-effective ways is to do a coarse-grained study of large-scale parallel applications through the use of program skeletons. The concept of a "program skeleton" that we discuss in this article is an abstracted program that is derived from a larger program where source code that is determined to be irrelevant is removed for the purposes of the skeleton. In this work, we develop a semiautomatic approach for extracting program skeletons based on compiler program analysis. Finally, we demonstrate correctness of our skeleton extraction process by comparing details from communication traces, as well as show the performance speedup of using skeletons by running simulations in the SST/macro simulator.
SAMIS- STANDARD ASSEMBLY-LINE MANUFACTURING INDUSTRY SIMULATION
NASA Technical Reports Server (NTRS)
Chamberlain, R. G.
1994-01-01
The Standard Assembly-Line Manufacturing Industry Simulation (SAMIS) program was originally developed to model a hypothetical U. S. industry which manufactures silicon solar modules for use in electricity generation. The SAMIS program has now been generalized to the extent that it should be useful for simulating many different production-line manufacturing industries and companies. The most important capability of SAMIS is its ability to "simulate" an industry based on a model developed by the user with the aid of the SAMIS program. The results of the simulation are a set of financial reports which detail the requirements, including quantities and cost, of the companies and processes which comprise the industry. SAMIS provides a fair, consistent, and reliable means of comparing manufacturing processes being developed by numerous independent efforts. It can also be used to assess the industry-wide impact of changes in financial parameters, such as cost of resources and services, inflation rates, interest rates, tax policies, and required return on equity. Because of the large amount of data needed to describe an industry, a major portion of SAMIS is dedicated to data entry and maintenance. This activity in SAMIS is referred to as model management. Model management requires a significant amount of interaction through a system of "prompts" which make it possible for persons not familiar with computers, or the SAMIS program, to provide all of the data necessary to perform a simulation. SAMIS is written in TURBO PASCAL (version 2.0 required for compilation) and requires 10 meg of hard disk space, an 8087 coprocessor, and an IBM color graphics monitor. Executables and source code are provided. SAMIS was originally developed in 1978; the IBM PC version was developed in 1985. Release 6.1 was made available in 1986, and includes the PC-IPEG program.
An Overview of NASA's Program of Future M&S VV&A Outreach and Training Activities
NASA Technical Reports Server (NTRS)
Caine, Lisa; Hale, Joseph P.
2006-01-01
NASA's Exploration Systems Mission Directorate (ESMD) is implementing a management approach for modeling and simulation (M&S) that will provide decision-makers information on a model's fidelity, credibility, and quality. The Integrated Modeling & Simulation Verification, Validation and Accreditation (IM&S VV&A) process will allow the decision-maker to understand the risks involved in using a model's results for mission-critical decisions. The VV&A Technical Working Group (VV&A TWG) has been identified to communicate this process throughout the agency. As the VV&A experts, the VV&A TWG will be the central resource for support of VV&A policy, procedures, training and templates for documentation. This presentation will discuss the VV&A Technical Working Group's outreach approach aimed at educating M&S program managers, developers, users and proponents on the VV&A process, beginning at MSFC with the CLV program.
Chung, Hyun Soo; Issenberg, S Barry; Phrampus, Paul; Miller, Geoff; Je, Sang Mo; Lim, Tae Ho; Kim, Young Min
2012-12-01
Countries that are less experienced with simulation-based healthcare education (SBHE) often import Western programs to initiate their efforts to deliver effective simulation training. Acknowledging cultural differences, we sought to determine whether a faculty development program on SBHE from the United States could be transported successfully to train faculty members in Korea. An international, collaborative, multi-professional program was adapted from a pre-existing Western model. The process focused on prioritization of curricular elements based on local needs, translation of course materials, and delivery of the program in small-group facilitation exercises. Three types of evaluation data were collected: participants' simulation experience; participants' ratings of the course; and participants' self-assessment of the impact of the course on their knowledge, skills, and attitudes (KSA) toward simulation teaching. Thirty faculty teachers participated in the course. Eighty percent of the participants answered that they spent less than 25% of their time as simulation instructors. Time spent on planning, scenario development, delivering training, research, and administrative work ranged from 10% to 30%. Twenty-eight of 30 participants agreed or strongly agreed that the course was excellent and relevant to their needs. The participants' assessment of the impact of the course on their KSA toward simulation teaching improved significantly. Although there were many challenges to overcome, a systematic approach to adapting a Western simulation faculty development course model was successfully implemented in Korea, and the program improved participants' self-confidence and learning.
Multidisciplinary propulsion simulation using NPSS
NASA Technical Reports Server (NTRS)
Claus, Russell W.; Evans, Austin L.; Follen, Gregory J.
1992-01-01
The current status of the Numerical Propulsion System Simulation (NPSS) program, a cooperative effort of NASA, industry, and universities to reduce the cost and time of advanced technology propulsion system development, is reviewed. The technologies required for this program include (1) interdisciplinary analysis to couple the relevant disciplines, such as aerodynamics, structures, heat transfer, combustion, acoustics, controls, and materials; (2) integrated systems analysis; (3) a high-performance computing platform, including massively parallel processing; and (4) a simulation environment providing a user-friendly interface. Several research efforts to develop these technologies are discussed.
NASA Technical Reports Server (NTRS)
Fouts, Douglas J.; Butner, Steven E.
1991-01-01
The design of the processing element of GASP, a GaAs supercomputer with a 500-MHz instruction issue rate and 1-GHz subsystem clocks, is presented. The novel, functionally modular, block data flow architecture of GASP is described. The architecture and design of a GASP processing element is then presented. The processing element (PE) is implemented in a hybrid semiconductor module with 152 custom GaAs ICs of eight different types. The effects of the implementation technology on both the system-level architecture and the PE design are discussed. SPICE simulations indicate that parts of the PE are capable of being clocked at 1 GHz, while the rest of the PE uses a 500-MHz clock. The architecture utilizes data flow techniques at a program block level, which allows efficient execution of parallel programs while maintaining reasonably good performance on sequential programs. A simulation study of the architecture indicates that an instruction execution rate of over 30,000 MIPS can be attained with 65 PEs.
Dynamic Simulation of a Helium Liquefier
NASA Astrophysics Data System (ADS)
Maekawa, R.; Ooba, K.; Nobutoki, M.; Mito, T.
2004-06-01
Dynamic behavior of a helium liquefier has been studied in detail with a Cryogenic Process REal-time SimulaTor (C-PREST) at the National Institute for Fusion Science (NIFS). The C-PREST is being developed to integrate large-scale helium cryogenic plant design, operation and maintenance for optimum process establishment. As a first step, a simulation of cooldown to 4.5 K with the helium liquefier model is conducted, which provides a plant-process validation platform. The helium liquefier consists of seven heat exchangers, a liquid-nitrogen (LN2) precooler, two expansion turbines and a liquid-helium (LHe) reservoir. Process simulations are carried out with sequence programs, which were implemented in C-PREST based on an existing liquefier operation. The interactions of a JT valve, a JT-bypass valve and a reservoir-return valve have been dynamically simulated. The paper discusses various aspects of refrigeration process simulation, including difficulties such as the balance between the complexity of the adopted models and CPU time.
Status of the Electroforming Shield Design (ESD) project
NASA Technical Reports Server (NTRS)
Fletcher, R. E.
1977-01-01
The utilization of a digital computer to augment electrodeposition/electroforming processes in which nonconducting shielding controls local cathodic current distribution is reported. The primary underlying philosophy of the physics of electrodeposition is presented, along with the technical approach taken to analytically simulate electrolytic-tank variables. A FORTRAN computer program has been developed and implemented. The program utilizes finite-element techniques and electrostatic theory to simulate electropotential fields and ionic transport.
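The field computation behind shield design reduces to solving Laplace's equation for the electropotential between anode and cathode, with shields entering as boundary conditions; the local cathodic current density then follows from the potential gradient at the cathode. A toy finite-difference (Jacobi) relaxation on a 2-D tank cross-section, standing in for the report's FORTRAN finite-element program (geometry and the crude shield treatment are assumptions):

```python
import numpy as np

nx, ny = 60, 40
phi = np.tile(np.linspace(1.0, 0.0, nx), (ny, 1))  # initial linear ramp
shield = np.zeros((ny, nx), dtype=bool)
shield[10:30, 35] = True      # a nonconducting shield strip in the bath

for _ in range(5000):         # Jacobi relaxation of Laplace's equation
    new = phi.copy()
    new[1:-1, 1:-1] = 0.25 * (phi[2:, 1:-1] + phi[:-2, 1:-1] +
                              phi[1:-1, 2:] + phi[1:-1, :-2])
    new[:, 0], new[:, -1] = 1.0, 0.0   # anode at 1, cathode at 0
    # Crude stand-in for an insulator: freeze potential inside the shield.
    # A faithful insulator needs zero-normal-gradient (Neumann) conditions.
    new[shield] = phi[shield]
    phi = new

# Local cathodic current density ~ normal potential gradient at the cathode.
current = phi[:, -2] - phi[:, -1]
print(np.round(current, 3))   # dip expected behind the shield
```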
A simulation model of IT risk on program trading
NASA Astrophysics Data System (ADS)
Xia, Bingying; Jiang, Wenbao; Luo, Guangxuan
2015-12-01
The biggest difficulty in measuring the IT risk of program trading lies in the lack of loss data. In view of this, the current approach of scholars is to collect reports of IT incidents of all kinds, both domestic and foreign, from courts, networks and other public media, and to base quantitative analysis of IT risk losses on the resulting database. However, an IT risk loss database established this way can only fuzzily reflect the real situation and cannot explain it at a fundamental level. In this paper, building on the concept and steps of Monte Carlo (MC) simulation, we apply the MC method within the "Program trading simulation system" developed by our team to simulate real program trading and obtain IT risk loss data through its IT-failure experiments; at the end of the article, the validity of the experimental data is verified. This approach better overcomes the deficiency of the traditional research method, solves the problem of scarce IT risk data in quantitative research, and provides researchers with a simulation-based template of ideas and process for such studies.
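The MC approach can be shown in a few lines: simulate many trading periods, draw a random number of IT failures per period and a random loss per failure, and read risk measures off the empirical loss distribution. All distributions and parameters below are assumptions for illustration, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n_periods = 100_000

# Assumed: failures per period ~ Poisson(0.3); loss per failure ~ lognormal.
failures = rng.poisson(0.3, n_periods)
losses = np.array([rng.lognormal(mean=10, sigma=1.2, size=k).sum()
                   for k in failures])

print("expected loss:", losses.mean())
print("99% VaR:", np.quantile(losses, 0.99))   # tail risk measure
```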
Digital autopilots: Design considerations and simulator evaluations
NASA Technical Reports Server (NTRS)
Osder, S.; Neuman, F.; Foster, J.
1971-01-01
The development of a digital autopilot program for a transport aircraft and the evaluation of that system's performance on a transport aircraft simulator are discussed. The digital autopilot includes three-axis attitude stabilization, automatic throttle control and flight path guidance functions, with emphasis on the mode progression from descent into the terminal area through automatic landing. The study effort involved a sequence of tasks starting with the definition of detailed system block diagrams of control laws, followed by a flow charting and programming phase, and concluding with performance verification using the transport aircraft simulation. The autopilot control laws were programmed in FORTRAN 4 in order to isolate the design process from requirements peculiar to an individual computer.
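A digital control law of the kind block-diagrammed in such studies is just a discrete loop evaluated at a fixed rate. A minimal pitch-attitude-hold sketch in Python, with PID gains and a toy airframe model that are invented for illustration (not the paper's design):

```python
Kp, Ki, Kd = 4.0, 0.8, 1.5       # assumed PID gains
dt, theta_cmd = 0.02, 2.0        # 50 Hz loop, commanded pitch (deg)

theta, q, integ = 0.0, 0.0, 0.0  # pitch angle, pitch rate, integrator state
for step in range(1500):
    err = theta_cmd - theta
    integ += err * dt
    elevator = Kp * err + Ki * integ - Kd * q   # the control law itself
    # Crude second-order short-period stand-in for the airframe:
    q_dot = 3.0 * elevator - 2.0 * q - 1.0 * theta
    q += q_dot * dt
    theta += q * dt

print("final pitch:", round(theta, 2), "deg")   # settles near theta_cmd
```

In a real autopilot the same structure repeats per axis, with mode logic switching which command (attitude, speed, flight path) feeds the loop.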
Developing Flexible Discrete Event Simulation Models in an Uncertain Policy Environment
NASA Technical Reports Server (NTRS)
Miranda, David J.; Fayez, Sam; Steele, Martin J.
2011-01-01
On February 1st, 2010, U.S. President Barack Obama submitted to Congress his proposed budget request for Fiscal Year 2011. This budget included significant changes to the National Aeronautics and Space Administration (NASA), including the proposed cancellation of the Constellation Program. This change proved to be controversial, and Congressional approval of the program's official cancellation would take many months to complete. During this same period an end-to-end discrete event simulation (DES) model of Constellation operations was being built through the joint efforts of Productivity Apex Inc. (PAI) and Science Applications International Corporation (SAIC) teams under the guidance of NASA. The uncertainty in regards to the Constellation program presented a major challenge to the DES team: continue the development of this program-of-record simulation while at the same time remaining prepared for possible changes to the program. This required the team to rethink how it would develop its model and make it flexible enough to support possible future vehicles while at the same time being specific enough to support the program-of-record. This challenge was compounded by the fact that the model was being developed through the traditional DES process orientation, which lacked the flexibility of object-oriented approaches. The team met this challenge through significant pre-planning that led to the "modularization" of the model's structure by identifying what was generic, finding natural logic break points, and standardizing an inter-logic numbering system. The outcome of this work was a model that not only was ready to be easily modified to support future rocket programs, but was also extremely structured and organized in a way that facilitated rapid verification. This paper discusses in detail the process the team followed to build this model and the many advantages this method provides builders of traditional process-oriented discrete event simulations.
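The Constellation model itself is proprietary, but the event-calendar mechanics underneath any DES can be sketched generically: a priority queue of timestamped events, each handler possibly scheduling future events. The "vehicle processing" chain below is purely illustrative:

```python
import heapq

class Simulator:
    def __init__(self):
        self.clock, self.queue, self.seq = 0.0, [], 0

    def schedule(self, delay, handler, *args):
        self.seq += 1                      # tie-breaker for simultaneous events
        heapq.heappush(self.queue, (self.clock + delay, self.seq, handler, args))

    def run(self, until):
        while self.queue and self.queue[0][0] <= until:
            self.clock, _, handler, args = heapq.heappop(self.queue)
            handler(self, *args)

# Illustrative processing chain: assembly -> test -> launch pad.
def assemble(sim, vid):
    print(f"t={sim.clock:6.1f}  vehicle {vid} assembled")
    sim.schedule(30.0, test, vid)

def test(sim, vid):
    print(f"t={sim.clock:6.1f}  vehicle {vid} tested")
    sim.schedule(10.0, to_pad, vid)

def to_pad(sim, vid):
    print(f"t={sim.clock:6.1f}  vehicle {vid} at pad")

sim = Simulator()
for v in range(3):
    sim.schedule(20.0 * v, assemble, v)
sim.run(until=200.0)
```

Modularization in the sense of the paper amounts to keeping handlers like these generic and swappable, so a new vehicle means new handler modules rather than a rebuilt event engine.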
NASA Technical Reports Server (NTRS)
Chawner, David M.; Gomez, Ray J.
2010-01-01
In the Applied Aerosciences and CFD branch at Johnson Space Center, computational simulations are run that face many challenges, two of which are the ability to customize software for specialized needs and the need to run simulations as fast as possible. There are many different tools that are used for running these simulations and each one has its own pros and cons. Once these simulations are run, there needs to be software capable of visualizing the results in an appealing manner. Some of this software is open source, meaning that anyone can edit the source code to make modifications and distribute it to all other users in a future release. This is very useful, especially in this branch where many different tools are being used. File readers can be written to load any file format into a program, to ease the bridging from one tool to another. Programming such a reader requires knowledge of the file format that is being read as well as the equations necessary to obtain the derived values after loading. When running these CFD simulations, extremely large files are being loaded and values are being calculated. These simulations usually take a few hours to complete, even on the fastest machines. Graphics processing units (GPUs) are usually used to render graphics on computers; however, in recent years, GPUs have been used for more generic applications because of the speed of these processors. Applications run on GPUs have been known to run up to forty times faster than they would on normal central processing units (CPUs). If these CFD programs are extended to run on GPUs, the amount of time they require to complete would be much less. This would allow more simulations to be run in the same amount of time and possibly permit more complex computations.
Acceleration of Radiance for Lighting Simulation by Using Parallel Computing with OpenCL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zuo, Wangda; McNeil, Andrew; Wetter, Michael
2011-09-06
We report on the acceleration of annual daylighting simulations for fenestration systems in the Radiance ray-tracing program. The algorithm was optimized to reduce both the redundant data input/output operations and the floating-point operations. To further accelerate the simulation speed, the calculation for matrix multiplications was implemented using parallel computing on a graphics processing unit. We used OpenCL, which is a cross-platform parallel programming language. Numerical experiments show that the combination of the above measures can speed up the annual daylighting simulations 101.7 times or 28.6 times when the sky vector has 146 or 2306 elements, respectively.
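The accelerated kernel is, at heart, a dense matrix product: annual illuminance results come from multiplying a (points x sky-patches) daylight-coefficient matrix by one sky vector per timestep. A NumPy sketch of the serial computation being offloaded (shapes are assumptions; the OpenCL kernel itself is not reproduced):

```python
import numpy as np

n_points, n_patches, n_hours = 10_000, 146, 8_760   # 146-element sky vector case

rng = np.random.default_rng(1)
dc = rng.random((n_points, n_patches))    # daylight-coefficient matrix
sky = rng.random((n_patches, n_hours))    # one sky vector per annual timestep

# The whole year collapses into a single matrix-matrix product -- the
# operation the paper moves to the GPU with OpenCL.
illuminance = dc @ sky                    # shape (n_points, n_hours)
print(illuminance.shape)
```

Batching all timesteps into one product, rather than looping hour by hour, is also what removes the redundant input/output the abstract mentions.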
Modeling Best Management Practices (BMPs) with HSPF
The Hydrological Simulation Program-Fortran (HSPF) is a semi-distributed watershed model, which simulates hydrology and water quality processes at user-specified spatial and temporal scales. Although HSPF is a comprehensive and highly flexible model, a number of investigators not...
Bernal, Javier; Torres-Jimenez, Jose
2015-01-01
SAGRAD (Simulated Annealing GRADient), a Fortran 77 program for computing neural networks for classification using batch learning, is discussed. Neural network training in SAGRAD is based on a combination of simulated annealing and Møller's scaled conjugate gradient algorithm, the latter a variation of the traditional conjugate gradient method, better suited for the nonquadratic nature of neural networks. Different aspects of the implementation of the training process in SAGRAD are discussed, such as the efficient computation of gradients and multiplication of vectors by Hessian matrices that are required by Møller's algorithm; the (re)initialization of weights with simulated annealing required to (re)start Møller's algorithm the first time and each time thereafter that it shows insufficient progress in reaching a possibly local minimum; and the use of simulated annealing when Møller's algorithm, after possibly making considerable progress, becomes stuck at a local minimum or flat area of weight space. Outlines of the scaled conjugate gradient algorithm, the simulated annealing procedure and the training process used in SAGRAD are presented together with results from running SAGRAD on two examples of training data.
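SAGRAD's Fortran 77 source is not shown here, but the interplay it describes, simulated annealing to (re)initialize weights and a gradient method to refine them, can be sketched in Python with a stand-in objective and plain gradient descent in place of Møller's scaled conjugate gradient:

```python
import numpy as np

rng = np.random.default_rng(2)

def loss(w):
    # Stand-in for a network's training error (deliberately multimodal).
    return np.sum(w ** 2) + np.sum(np.sin(3 * w))

def grad(w, h=1e-6):
    g = np.zeros_like(w)
    for i in range(len(w)):
        e = np.zeros_like(w); e[i] = h
        g[i] = (loss(w + e) - loss(w - e)) / (2 * h)
    return g

def anneal(w, temp=2.0, cool=0.95, steps=200):
    f = loss(w)
    for _ in range(steps):                 # SA proposes random weight moves
        cand = w + rng.normal(0, temp, size=w.shape)
        fc = loss(cand)
        if fc < f or rng.random() < np.exp((f - fc) / temp):  # Metropolis rule
            w, f = cand, fc
        temp *= cool
    return w

w = anneal(rng.normal(0, 1, 6))            # SA (re)initialization phase
for _ in range(500):                       # gradient refinement phase
    w -= 0.05 * grad(w)
print(loss(w))
```

In SAGRAD the two phases alternate: whenever the gradient phase stalls at a local minimum or flat region, annealing restarts it from a fresh point.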
Methods for design and evaluation of parallel computing systems (The PISCES project)
NASA Technical Reports Server (NTRS)
Pratt, Terrence W.; Wise, Robert; Haught, Mary JO
1989-01-01
The PISCES project started in 1984 under the sponsorship of the NASA Computational Structural Mechanics (CSM) program. A PISCES 1 programming environment and parallel FORTRAN were implemented in 1984 for the DEC VAX (using UNIX processes to simulate parallel processes). This system was used for experimentation with parallel programs for scientific applications and AI (dynamic scene analysis) applications. PISCES 1 was ported to a network of Apollo workstations by N. Fitzgerald.
Haji, Faizal A; Da Silva, Celina; Daigle, Delton T; Dubrowski, Adam
2014-08-01
Presently, health care simulation research is largely conducted on a study-by-study basis. Although such "project-based" research generates a plethora of evidence, it can be chaotic and contradictory. A move toward sustained, thematic, theory-based programs of research is necessary to advance knowledge in the field. Recognizing that simulation is a complex intervention, we present a framework for developing research programs in simulation-based education adapted from the Medical Research Council (MRC) guidance. This framework calls for an iterative approach to developing, refining, evaluating, and implementing simulation interventions. The adapted framework guidance emphasizes: (1) identification of theory and existing evidence; (2) modeling and piloting interventions to clarify active ingredients and identify mechanisms linking the context, intervention, and outcomes; and (3) evaluation of intervention processes and outcomes in both the laboratory and real-world setting. The proposed framework will aid simulation researchers in developing more robust interventions that optimize simulation-based education and advance our understanding of simulation pedagogy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, E.A.; Smed, P.F.; Bryndum, M.B.
The paper describes the numerical program, PIPESIN, that simulates the behavior of a pipeline placed on an erodible seabed. PIPEline Seabed INteraction, from installation until a stable pipeline-seabed configuration has occurred, is simulated in the time domain including all important physical processes. The program is the result of the joint research project "Free Span Development and Self-lowering of Offshore Pipelines", sponsored by the EU and a group of companies and carried out by the Danish Hydraulic Institute and Delft Hydraulics. The basic modules of PIPESIN are described. The description of the scouring processes has been based on and verified through physical model tests carried out as part of the research project. The program simulates a section of the pipeline (typically 500 m) in the time domain, the main input being time series of the waves and current. The main results include predictions of the onset of free spans, their length distribution, their variation in time, and the lowering of the pipeline as a function of time.
A Case Study: Using Delmia at Kennedy Space Center to Support NASA's Constellation Program
NASA Technical Reports Server (NTRS)
Kickbusch, Tracey; Humeniuk, Bob
2010-01-01
The presentation examines the use of Delmia (Digital Enterprise Lean Manufacturing Interactive Application) for digital simulation in NASA's Constellation Program. Topics include an overview of the Kennedy Space Center (KSC) Design Visualization Group tasks, NASA's Constellation Program, the Ares 1 ground processing preliminary design review, and challenges and how Delmia is used at KSC. Challenges include dealing with large data sets, creating and maintaining KSC's infrastructure, gathering customer requirements and meeting objectives, creating life-like simulations, and providing quick turn-around on varied products.
Lefkoff, L.J.; Gorelick, S.M.
1986-01-01
Detailed two-dimensional flow simulation of a complex ground-water system is combined with quadratic and linear programming to evaluate design alternatives for rapid aquifer restoration. Results show how treatment and pumping costs depend dynamically on the type of treatment process, the capacity of pumping and injection wells, and the number of wells. The design for an inexpensive treatment process minimizes pumping costs, while an expensive process results in the minimization of treatment costs. Substantial reductions in pumping costs occur with increases in injection capacity or in the number of wells. Treatment costs are reduced by expansions in pumping capacity or injection capacity. The analysis identifies maximum pumping and injection capacities.-from Authors
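The linear-programming side of such a design problem can be sketched with SciPy: minimize combined pumping and treatment cost over well rates subject to per-well capacity limits and a total cleanup-rate requirement. Every coefficient below is invented; the real study couples the program to a two-dimensional flow simulation rather than to fixed constraints:

```python
from scipy.optimize import linprog

# Decision variables: pumping rates at 3 extraction wells (m^3/day).
pump_cost = [2.0, 2.5, 1.8]      # $/m^3 pumped (assumed)
treat_cost = 3.0                 # $/m^3 treated (assumed)
c = [p + treat_cost for p in pump_cost]

# Per-well capacity limits and a minimum total extraction rate to meet
# the restoration deadline (both invented for illustration).
bounds = [(0, 500), (0, 400), (0, 600)]
A_ub = [[-1, -1, -1]]            # -(q1+q2+q3) <= -1000  <=>  total >= 1000
b_ub = [-1000]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print(res.x, res.fun)            # optimal rates and minimum daily cost
```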
MSC products for the simulation of tire behavior
NASA Technical Reports Server (NTRS)
Muskivitch, John C.
1995-01-01
The modeling of tires and the simulation of tire behavior are complex problems. The MacNeal-Schwendler Corporation (MSC) has a number of finite element analysis products that can be used to address the complexities of tire modeling and simulation. While there are many similarities between the products, each product has a number of capabilities that uniquely enable it to be used for a specific aspect of tire behavior. This paper discusses the following programs: (1) MSC/NASTRAN - general purpose finite element program for linear and nonlinear static and dynamic analysis; (2) MSC/ABAQUS - nonlinear statics and dynamics finite element program; (3) MSC/PATRAN AFEA (Advanced Finite Element Analysis) - general purpose finite element program with a subset of linear and nonlinear static and dynamic analysis capabilities with an integrated version of MSC/PATRAN for pre- and post-processing; and (4) MSC/DYTRAN - nonlinear explicit transient dynamics finite element program.
FERN - a Java framework for stochastic simulation and evaluation of reaction networks.
Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf
2008-08-29
Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either a) do not provide the most efficient simulation algorithms and are difficult to extend, b) cannot be easily integrated into other applications or c) do not allow to monitor and intervene during the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches more flexible tools are necessary. In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation and visualization of the simulation results each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real-time from within the Cytoscape or CellDesigner environment. FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms both for exact and approximate stochastic simulation and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward way both as a stand-alone program and within new systems biology applications. Finally, complex scenarios requiring intervention during the simulation progress can be modelled easily with FERN.
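FERN itself is Java, but the exact stochastic simulation algorithm at its core (Gillespie's direct method) is compact enough to sketch in Python for a two-reaction birth-death system with assumed rate constants:

```python
import numpy as np

rng = np.random.default_rng(3)

# Birth-death system: 0 -> X at rate k1;  X -> 0 at rate k2 * X.
k1, k2 = 10.0, 0.1
x, t, t_end = 0, 0.0, 100.0

while t < t_end:
    a = np.array([k1, k2 * x])        # reaction propensities
    a0 = a.sum()
    t += rng.exponential(1 / a0)      # exponential waiting time to next event
    if rng.random() * a0 < a[0]:      # pick which reaction fires
        x += 1                        # birth
    else:
        x -= 1                        # death

print("final count:", x, "(stationary mean is k1/k2 =", k1 / k2, ")")
```

FERN's observer system corresponds to hooks invoked inside this loop, which is what lets plugins like the Cytoscape one watch and steer a running simulation.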
Performance of the NASA Airborne Radar with the Windshear Database for Forward-Looking Systems
NASA Technical Reports Server (NTRS)
Switzer, George F.; Britt, Charles L.
1996-01-01
This document describes the simulation approach used to test the performance of the NASA airborne windshear radar. An explanation of the actual radar hardware and processing algorithms provides an understanding of the parameters used in the simulation program. This report also contains a brief overview of the NASA airborne windshear radar experimental flight test results. A description of the radar simulation program shows the capabilities of the program and the techniques used for certification evaluation. Simulation of the NASA radar comprises three steps. First, the ground clutter data must be chosen. Ground clutter is the return from objects in or near an airport facility. The choice of the ground clutter also dictates the aircraft flight path, since ground clutter is gathered in flight. The second step is the choice of the radar parameters and the running of the simulation program, which properly combines the ground clutter data with simulated windshear weather data. The simulated windshear weather data comprise a number of Terminal Area Simulation System (TASS) model results. The final step is the comparison of the radar simulation results to the known windshear database. The final evaluation of the radar simulation is based on the ability to detect hazardous windshear with the aircraft at a safe distance while at the same time not displaying false alerts.
NASA Technical Reports Server (NTRS)
Honikman, T.; Mcmahon, E.; Miller, E.; Pietrzak, L.; Yorsz, W.
1973-01-01
Digital image processing, image recorders, high-density digital data recorders, and data system element processing for use in an Earth Resources Survey image data processing system are studied. Loading to various ERS systems is also estimated by simulation.
Using a simulation assistant in modeling manufacturing systems
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.
1988-01-01
Numerous simulation languages exist for modeling discrete event processes, and many have now been ported to microcomputers. Graphics and animation capabilities have been added to many of these languages to help users build models and evaluate the simulation results. With all these languages and added features, the user is still burdened with learning the simulation language. Furthermore, the time to construct and then validate the simulation model is always greater than originally anticipated. One approach to minimizing the time requirement is to use pre-defined macros that describe various common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the simulation assistant, the elements of the assistant, and the five manufacturing simulation generators. A typical manufacturing system is modeled using the simulation assistant, and the advantages and disadvantages are discussed.
Numerical simulation of hydrogen fluorine overtone chemical lasers
NASA Astrophysics Data System (ADS)
Chen, Jinbao; Jiang, Zhongfu; Hua, Weihong; Liu, Zejin; Shu, Baihong
1998-08-01
A two-dimensional program was applied to simulate the chemical dynamics, gas dynamics and lasing processes of a combustion-driven CW HF overtone chemical laser. Some important parameters in the cavity were obtained. The calculated results include the HF molecule concentration on each vibrational energy level while lasing, the averaged pressure and temperature, the zero-power gain coefficient of each spectral line, the laser spectrum, the averaged laser intensity, the output power, the chemical efficiency and the length of the lasing zone.
A Simulation of the Base Civil Engineering Work Request/Work Order System.
1981-09-01
with better information with which to make a decision. For example, if the Chief of R&R wanted to know the effect on work order processing time of ... work order processing times for the system. The Q-GERT Analysis Program developed by Pritsker (11) was used to simulate the generation of work ... several factors affecting the mean work order processing time.
NASA Technical Reports Server (NTRS)
Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga
2005-01-01
Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
Development and training of a learning expert system in an autonomous mobile robot via simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spelt, P.F.; Lyness, E.; DeSaussure, G.
1989-11-01
The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. Recently at CESAR a learning expert system was created to operate on board an autonomous robot working at a process control panel. The authors discuss the two-computer simulation system used to create, evaluate and train this learning system. The simulation system has a graphics display of the current status of the process being simulated, and the same program which does the simulating also drives the actual control panel. Simulation results were validated on the actual robot. The speed and safety benefits of using a computerized simulator to train a learning computer, and future uses of the simulation system, are discussed.
DnaSAM: Software to perform neutrality testing for large datasets with complex null models.
Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B
2010-05-01
Patterns of DNA sequence polymorphisms can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference. Advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing only a gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows both high-throughput processing of multiple sequence alignments and the flexibility to simulate data under complex demographic scenarios has been lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics with associated probability values under a user-specified null model, stored in an easy-to-manipulate text file. © 2009 Blackwell Publishing Ltd.
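The testing step that DnaSAM automates reduces to a Monte Carlo p-value: compare the observed statistic against its distribution under coalescent simulations of the null model. A sketch of that final comparison in Python (in DnaSAM the simulated values come from ms runs under the user's demographic null; here they are a placeholder array):

```python
import numpy as np

rng = np.random.default_rng(4)
simulated_D = rng.normal(0.0, 1.0, 10_000)   # stand-in for Tajima's D under the null

observed_D = -1.9                            # hypothetical value from an alignment

# Two-tailed Monte Carlo p-value with the standard +1 correction.
more_extreme = np.sum(np.abs(simulated_D) >= abs(observed_D))
p = (more_extreme + 1) / (len(simulated_D) + 1)
print("p =", p)
```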
Regan, R.S.; Schaffranek, R.W.; Baltzer, R.A.
1996-01-01
A system of functional utilities and computer routines, collectively identified as the Time-Dependent Data System (TDDS), has been developed and documented by the U.S. Geological Survey. The TDDS is designed for processing time sequences of discrete, fixed-interval, time-varying geophysical data--in particular, hydrologic data. Such data include various dependent variables and related parameters typically needed as input for execution of one-, two-, and three-dimensional hydrodynamic/transport and associated water-quality simulation models. Such data can also include time sequences of results generated by numerical simulation models. Specifically, TDDS provides the functional capabilities to process, store, retrieve, and compile data in a Time-Dependent Data Base (TDDB) in response to interactive user commands or pre-programmed directives. Thus, the TDDS, in conjunction with a companion TDDB, provides a ready means for processing, preparation, and assembly of time sequences of data for input to models; collection, categorization, and storage of simulation results from models; and intercomparison of field data and simulation results. The TDDS can be used to edit and verify prototype, time-dependent data to affirm that selected sequences of data are accurate, contiguous, and appropriate for numerical simulation modeling. It can be used to prepare time-varying data in a variety of formats, such as tabular lists, sequential files, arrays, and graphical displays, as well as line-printer plots of single or multiparameter data sets. The TDDB is organized and maintained as a direct-access data base by the TDDS, thus providing simple, yet efficient, data management and access. A single, easily used program interface that provides all access to and from a particular TDDB is available for use directly within models, other user-provided programs, and other data systems. This interface, together with each major functional utility of the TDDS, is described and documented in this report.
Emergency Management Operations Process Mapping: Public Safety Technical Program Study
2011-02-01
Enterprise Architectures in industry, and have been successfully applied to assist companies to optimise interdependencies and relationships between ... model for more in-depth analysis of EM processes, and for use in tandem with other studies that apply modeling and simulation to assess EM operational effectiveness before and after changing elements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael Pernice
2010-09-01
INL has agreed to provide participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with access to its high performance computing (HPC) resources under sponsorship of the Enabling Computational Technologies (ECT) program element. This report documents the process used to select applications and the software stack in place at INL.
Radar target classification studies: Software development and documentation
NASA Astrophysics Data System (ADS)
Kamis, A.; Garber, F.; Walton, E.
1985-09-01
Three computer programs were developed to process and analyze calibrated radar returns. The first program, called DATABASE, was developed to create and manage a random-access data base. The second program, called FTRAN DB, was developed to process horizontal- and vertical-polarization radar returns into different formats (i.e., time domain, circular polarizations and polarization parameters). The third program, called RSSE, was developed to simulate a variety of radar systems and to evaluate their ability to identify radar returns. Complete computer listings are included in the appendix volumes.
BIOPLUME III: NATURAL ATTENUATION DECISION SUPPORT SYSTEM USER'S MANUAL - VERSION 1.0
The BIOPLUME III program is a two-dimensional, finite difference model for simulating the natural attenuation of organic contaminants in ground water due to the processes of advection, dispersion, sorption, and biodegradation. The model simulates the biodegradation of organic...
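A minimal sketch of the transport processes named in the abstract, assuming a 1-D explicit finite-difference scheme with upwind advection, central dispersion and first-order decay (BIOPLUME III itself is 2-D; all parameter values below are placeholders, not from the manual):

    import numpy as np

    def transport_step(c, v, D, lam, dx, dt):
        """One explicit step: upwind advection + central dispersion + decay."""
        cn = c.copy()
        adv = -v * (c[1:-1] - c[:-2]) / dx               # upwind (v > 0)
        disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
        cn[1:-1] = c[1:-1] + dt * (adv + disp - lam * c[1:-1])
        return cn

    c = np.zeros(100); c[10] = 1.0                       # initial slug of solute
    for _ in range(200):                                 # dt chosen for stability
        c = transport_step(c, v=0.5, D=0.1, lam=0.01, dx=1.0, dt=0.5)
    print(round(c.max(), 4))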
Visualizing ultrasound through computational modeling
NASA Technical Reports Server (NTRS)
Guo, Theresa W.
2004-01-01
The Doppler Ultrasound Hematocrit Project (DHP) hopes to find non-invasive methods of determining a person's blood characteristics. Because of the limits of microgravity and the space travel environment, it is important to find non-invasive methods of evaluating the health of persons in space. Presently, there is no well-developed method of determining blood composition non-invasively. This project hopes to use ultrasound and Doppler signals to evaluate the characteristic of hematocrit, the percentage by volume of red blood cells within whole blood. These non-invasive techniques may also be developed for use on earth for trauma patients, where invasive measures might be detrimental. Computational modeling is a useful tool for collecting preliminary information and predictions for the laboratory research. We hope to find and develop a computer program that will be able to simulate the ultrasound signals the project will work with. Simulated models of test conditions will more easily show what might be expected from laboratory results and thus help the research group make informed decisions before and during experimentation. There are several existing Matlab-based computer programs available, designed to interpret and simulate ultrasound signals. These programs will be evaluated to find which is best suited to the project needs. The criteria of evaluation that will be used are: 1) the program must be able to specify transducer properties and specify transmitting and receiving signals, 2) the program must be able to simulate ultrasound signals through different attenuating mediums, 3) the program must be able to process moving targets in order to simulate the Doppler effects that are associated with blood flow, 4) the program should be user friendly and adaptable to various models. After a computer program is chosen, two simulation models will be constructed. These models will simulate and interpret an RF data signal and a Doppler signal.
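For orientation, the classic pulsed-Doppler relation that such simulations rest on is f_d = 2 f0 v cos(theta) / c; a small Python check (the numbers are only illustrative):

    import math

    def doppler_shift(f0_hz, v_m_s, theta_deg, c_m_s=1540.0):   # c: soft tissue
        return 2.0 * f0_hz * v_m_s * math.cos(math.radians(theta_deg)) / c_m_s

    print(doppler_shift(5e6, 0.3, 60.0))   # ~974 Hz for 0.3 m/s flow at 60 deg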
Eye growth and myopia development: Unifying theory and Matlab model.
Hung, George K; Mahadas, Kausalendra; Mohammad, Faisal
2016-03-01
The aim of this article is to present an updated unifying theory of the mechanisms underlying eye growth and myopia development. A series of model simulation programs were developed to illustrate the mechanism of eye growth regulation and myopia development. Two fundamental processes are presumed to govern the relationship between physiological optics and eye growth: genetically pre-programmed signaling and blur feedback. Cornea/lens is considered to have only a genetically pre-programmed component, whereas eye growth is considered to have both a genetically pre-programmed and a blur feedback component. Moreover, based on the Incremental Retinal-Defocus Theory (IRDT), the rate of change of blur size provides the direction for blur-driven regulation. The various factors affecting eye growth are shown in 5 simulations: (1 - unregulated eye growth): blur feedback is rendered ineffective, as in the case of form deprivation, so there is only genetically pre-programmed eye growth, generally resulting in myopia; (2 - regulated eye growth): blur feedback regulation demonstrates the emmetropization process, with abnormally excessive or reduced eye growth leading to myopia and hyperopia, respectively; (3 - repeated near-far viewing): simulation of large-to-small change in blur size as seen in the accommodative stimulus/response function, and via IRDT as well as nearwork-induced transient myopia (NITM), leading to the development of myopia; (4 - neurochemical bulk flow and diffusion): release of dopamine from the inner plexiform layer of the retina, and the subsequent diffusion and relay of neurochemical cascade show that a decrease in dopamine results in a reduction of proteoglycan synthesis rate, which leads to myopia; (5 - Simulink model): model of genetically pre-programmed signaling and blur feedback components that allows for different input functions to simulate experimental manipulations that result in hyperopia, emmetropia, and myopia. These model simulation programs (available upon request) can provide a useful tutorial for the general scientist and serve as a quantitative tool for researchers in eye growth and myopia. Copyright © 2016 Elsevier Ltd. All rights reserved.
CalcHEP 3.4 for collider physics within and beyond the Standard Model
NASA Astrophysics Data System (ADS)
Belyaev, Alexander; Christensen, Neil D.; Pukhov, Alexander
2013-07-01
We present version 3.4 of the CalcHEP software package, which is designed for effective evaluation and simulation of high energy physics collider processes at parton level. The main features of CalcHEP are the computation of Feynman diagrams, integration over multi-particle phase space and event simulation at parton level. The principal attractive key points along these lines are that it has: (a) an easy startup and usage even for those who are not familiar with CalcHEP and programming; (b) a friendly and convenient graphical user interface (GUI); (c) the option for the user to easily modify a model or introduce a new model by either using the graphical interface or by using an external package, with the possibility of cross checking the results in different gauges; (d) a batch interface which allows one to perform very complicated and tedious calculations connecting production and decay modes for processes with many particles in the final state. With this feature set, CalcHEP can efficiently perform calculations with a high level of automation from a theory in the form of a Lagrangian down to phenomenology in the form of cross sections, parton level event simulation and various kinematical distributions. In this paper we report on the new features of CalcHEP 3.4 which improve the power of our package as an effective tool for the study of modern collider phenomenology. Program summary: Program title: CalcHEP. Catalogue identifier: AEOV_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOV_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 78535. No. of bytes in distributed program, including test data, etc.: 818061. Distribution format: tar.gz. Programming language: C. Computer: PC, MAC, Unix workstations. Operating system: Unix. RAM: Depends on process under study. Classification: 4.4, 5. External routines: X11. Nature of problem: Implement new models of particle interactions. Generate Feynman diagrams for a physical process in any implemented theoretical model. Integrate phase space for Feynman diagrams to obtain cross sections or particle widths taking into account kinematical cuts. Simulate collisions at modern colliders and generate respective unweighted events. Mix events for different subprocesses and connect them with the decays of unstable particles. Solution method: Symbolic calculations; squared Feynman diagram approach; Vegas Monte Carlo algorithm. Restrictions: Up to 2→4 production (1→5 decay) processes are realistic on typical computers. Higher multiplicities are sometimes possible for specific 2→5 and 2→6 processes. Unusual features: Graphical user interface, symbolic algebra calculation of squared matrix element, parallelization on a PBS cluster. Running time: Depends strongly on the process. For a typical 2→2 process it takes seconds. For 2→3 processes the typical running time is of the order of minutes. For higher multiplicities it could take much longer.
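One step named in the summary, the generation of unweighted events from weighted phase-space points, is conventionally done by hit-or-miss acceptance with probability w/w_max; a toy Python sketch of that standard idea (not CalcHEP code; all names are illustrative):

    import random

    def unweight(weighted_events, w_max, rng=random.random):
        """weighted_events: iterable of (event, weight) pairs."""
        return [ev for ev, w in weighted_events if rng() < w / w_max]

    sample = [(i, random.uniform(0.0, 2.0)) for i in range(10000)]
    events = unweight(sample, w_max=2.0)
    print(len(events))   # expected ~ N * <w> / w_max, i.e. about 5000 here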
Combining Simulation Tools for End-to-End Trajectory Optimization
NASA Technical Reports Server (NTRS)
Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacobs; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min
2015-01-01
Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Due to the need for complex models, simulations are often highly tailored to the needs of the particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable to assess individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. In order to obtain end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with the SLS's ascent tool (POST2) and a new tool to optimize the full problem by operating both simulations simultaneously was born.
NASA Astrophysics Data System (ADS)
Boisson, F.; Wimberley, C. J.; Lehnert, W.; Zahra, D.; Pham, T.; Perkins, G.; Hamze, H.; Gregoire, M.-C.; Reilhac, A.
2013-10-01
Monte Carlo-based simulation of positron emission tomography (PET) data plays a key role in the design and optimization of data correction and processing methods. Our first aim was to adapt and configure the PET-SORTEO Monte Carlo simulation program for the geometry of the widely distributed Inveon PET preclinical scanner manufactured by Siemens Preclinical Solutions. The validation was carried out against actual measurements performed on the Inveon PET scanner at the Australian Nuclear Science and Technology Organisation and at the Brain & Mind Research Institute, strictly following the NEMA NU 4-2008 standard. The comparison of simulated and experimental performance measurements included spatial resolution, sensitivity, scatter fraction and count rates, image quality and Derenzo phantom studies. Results showed that PET-SORTEO reliably reproduces the performance of this Inveon preclinical system. In addition, imaging studies showed that the PET-SORTEO simulation program provides raw data for the Inveon scanner that can be fully corrected and reconstructed using the same programs as for the actual data. All correction techniques (attenuation, scatter, randoms, dead-time, and normalization) can be applied to the simulated data, leading to fully quantitative reconstructed images. In the second part of the study, we demonstrated its ability to generate fast and realistic biological studies. PET-SORTEO is a workable and reliable tool that can be used, in a classical way, to validate and/or optimize a single PET data processing step such as a reconstruction method. However, we demonstrated that by combining a realistic simulated biological study ([11C]Raclopride here) involving different condition groups, simulation also allows one to assess and optimize the data correction, reconstruction and data processing flow as a whole, specifically for each biological study, which is our ultimate intent.
NASA Technical Reports Server (NTRS)
Kumar, P.; Lin, F. Y.; Vaishampayan, V.; Farvardin, N.
1986-01-01
A complete documentation of the software developed in the Communication and Signal Processing Laboratory (CSPL) during the period of July 1985 to March 1986 is provided. Utility programs and subroutines that were developed for a user-friendly image and speech processing environment are described. Additional programs for data compression of image and speech type signals are included. Also, programs for zero-memory and block transform quantization in the presence of channel noise are described. Finally, several routines for simulating the performance of image compression algorithms are included.
Hydro turbine governor’s power control of hydroelectric unit with sloping ceiling tailrace tunnel
NASA Astrophysics Data System (ADS)
Fu, Liang; Wu, Changli; Tang, Weiping
2018-02-01
The primary frequency regulation and load regulation transient processes that occur when the hydro turbine governor of a hydropower unit with a sloping ceiling tailrace operates in power mode are analysed by field test and numerical simulation in this paper. A simulation method based on a “three-zone model” is proposed to simulate small-fluctuation transient processes of the sloping ceiling tailrace. The simulation model of the hydraulic turbine governor power mode is established by identification of the governor’s PLC program and by parameter measurement, and the simulation model is verified by the test. A slow-fast-slow “three-stage regulation” method which can improve the dynamic quality of the hydro turbine governor power mode is proposed. The power regulation strategy and parameters are optimized by numerical simulation, and the primary frequency regulation and load regulation transient performance of the hydro turbine governor in power mode is improved significantly.
TIERRAS: A package to simulate high energy cosmic ray showers underground, underwater and under-ice
NASA Astrophysics Data System (ADS)
Tueros, Matías; Sciutto, Sergio
2010-02-01
In this paper we present TIERRAS, a Monte Carlo simulation program based on the well-known AIRES air shower simulation system that enables the propagation of particle cascades underground, providing a tool to study particles arriving underground from a primary cosmic ray in the atmosphere, or to initiate cascades directly underground and propagate them, exiting into the atmosphere if necessary. We show several cross-checks of its results against CORSIKA, FLUKA, GEANT and ZHS simulations and we make some considerations regarding its possible uses and limitations. The first results of full underground shower simulations are presented as an example of the package capabilities. Program summary: Program title: TIERRAS for AIRES. Catalogue identifier: AEFO_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFO_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 36 489. No. of bytes in distributed program, including test data, etc.: 3 261 669. Distribution format: tar.gz. Programming language: Fortran 77 and C. Computer: PC, Alpha, IBM, HP, Silicon Graphics and Sun workstations. Operating system: Linux, DEC Unix, AIX, SunOS, Unix System V. RAM: 22 Mbytes. Classification: 1.1. External routines: TIERRAS requires AIRES 2.8.4 to be installed on the system. AIRES 2.8.4 can be downloaded from http://www.fisica.unlp.edu.ar/auger/aires/eg_AiresDownload.html. Nature of problem: Simulation of high and ultra high energy underground particle showers. Solution method: Modification of the AIRES 2.8.4 code to accommodate underground conditions. Restrictions: In AIRES some processes that are not statistically significant in the atmosphere are not simulated. In particular, it does not include muon photonuclear processes. This imposes a limitation on the application of this package to a depth of 1 km of standard rock (or 2.5 km of water equivalent). Neutrinos are not tracked in the simulation, but their energy is taken into account in decays. Running time: A TIERRAS for AIRES run of a 10 eV shower with statistical sampling (thinning) below 10 eV and 0.2 weight factor (see [1]) uses approximately 1 h of CPU time on an Intel Core 2 Quad Q6600 at 2.4 GHz. It uses only one core, so 4 simultaneous simulations can be run on this computer. AIRES includes a spooling system to run several simultaneous jobs of any type. References: S. Sciutto, AIRES 2.6 User Manual, http://www.fisica.unlp.edu.ar/auger/aires/.
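The statistical sampling ("thinning") mentioned in the running-time note can be sketched as follows, assuming the usual energy-proportional selection with compensating weights; this is a simplification of the AIRES scheme, which additionally bounds the weights:

    import random

    def thin(secondaries, e_thin):
        """secondaries: list of (energy, weight). Returns the surviving list."""
        total = sum(e for e, _ in secondaries)
        if total >= e_thin:
            return secondaries                 # above threshold: keep everything
        r, acc = random.random() * total, 0.0
        for e, w in secondaries:
            acc += e
            if r <= acc:
                return [(e, w * total / e)]    # weight *= 1/p_i with p_i = e/total

    print(thin([(3.0, 1.0), (1.0, 1.0)], e_thin=10.0))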
User's guide to the SEPHIS computer code for calculating the Thorex solvent extraction system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watson, S.B.; Rainey, R.H.
1979-05-01
The SEPHIS computer program was developed to simulate the countercurrent solvent extraction process. The code has now been adapted to model the Acid Thorex flow sheet. This report represents a practical user's guide to SEPHIS-Thorex, containing a program description, user information, a program listing, and sample input and output.
A computer program for the simulation of folds of different sizes under the influence of gravity
NASA Astrophysics Data System (ADS)
Vacas Peña, José M.; Martínez Catalán, José R.
2004-02-01
Folding&g is a computer program, based on the finite element method, developed to simulate the process of natural folding from small to large scales in two dimensions. Written in Pascal code and compiled with Borland Delphi 3.0, the program has a friendly interactive user interface and can be used for research as well as educational purposes. Four main menu options allow the user to import or to build and to save a model data file, select the type of graphic output, introduce and modify several physical parameters and enter the calculation routines. The program employs isoparametric, initially rectangular elements with eight nodes, which can sustain large deformations. The mathematical procedure is based on the elasticity equations, but has been modified to simulate a viscous rheology, either linear or of power-law type. The parameters to be introduced include either the linear viscosity or, when the viscosity is non-linear, the material constant, activation energy, temperature and power of the differential stress. All the parameters can be set by rows, which simulate layers. A toggle permits gravity to be introduced into the calculations. In this case, the density of the different rows must be specified, and the sizes of the finite elements and of the whole model become meaningful. Viscosity values can also be assigned to blocks of several rows and columns, which permits the modelling of heterogeneities such as rectangular areas of high strength, which can be used to simulate shearing components interfering with the buckling process. The program is applied to several cases of folding, including a single competent bed and multilayers, and its results are compared with analytical and experimental results. The influence of gravity is illustrated by the modelling of diapiric structures and of a large recumbent fold.
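One possible reading of the power-law rheology described above, as a minimal Python sketch: assuming the standard creep law e' = A sigma^n exp(-Q/RT), the stress follows as sigma = (e'/A)^(1/n) exp(Q/(nRT)) and an effective viscosity as eta_eff = sigma/(2 e'). The constants below are placeholders, not values from the paper:

    import math

    R = 8.314  # gas constant, J / (mol K)

    def effective_viscosity(strain_rate, A, n, Q, T):
        # Invert the creep law for stress, then form eta_eff = sigma / (2 e').
        sigma = (strain_rate / A) ** (1.0 / n) * math.exp(Q / (n * R * T))
        return sigma / (2.0 * strain_rate)

    print(effective_viscosity(1e-14, A=1e-25, n=3.0, Q=2.0e5, T=600.0))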
Tempo: A Toolkit for the Timed Input/Output Automata Formalism
2008-01-30
generation of distributed code from specifications. F.4.3 [Formal Languages]: Tempo; D.3 [Programming...] Many distributed systems involve a combination of...and require the simulator to check the assertions after every single step...The check(i) transition is enabled when process i's program counter is set to...output foo (n:Int)...x: Int := 10; transitions...The Tempo simulator addresses this issue by putting the modeler in charge of resolving the non
TWOS - TIME WARP OPERATING SYSTEM, VERSION 2.5.1
NASA Technical Reports Server (NTRS)
Bellenot, S. F.
1994-01-01
The Time Warp Operating System (TWOS) is a special-purpose operating system designed to support parallel discrete-event simulation. TWOS is a complete implementation of the Time Warp mechanism, a distributed protocol for virtual time synchronization based on process rollback and message annihilation. Version 2.5.1 supports simulations and other computations using both virtual time and dynamic load balancing; it does not support general time-sharing or multi-process jobs using conventional message synchronization and communication. The program utilizes the underlying operating system's resources. TWOS runs a single simulation at a time, executing it concurrently on as many processors of a distributed system as are allocated. The simulation needs only to be decomposed into objects (logical processes) that interact through time-stamped messages. TWOS provides transparent synchronization. The user does not have to add any more special logic to aid in synchronization, nor give any synchronization advice, nor even understand much about how the Time Warp mechanism works. The Time Warp Simulator (TWSIM) subdirectory contains a sequential simulation engine that is interface compatible with TWOS. This means that an application designer and programmer who wish to use TWOS can prototype code on TWSIM on a single processor and/or workstation before having to deal with the complexity of working on a distributed system. TWSIM also provides statistics about the application which may be helpful for determining the correctness of an application and for achieving good performance on TWOS. Version 2.5.1 has an updated interface that is not compatible with 2.0. The program's user manual assists the simulation programmer in the design, coding, and implementation of discrete-event simulations running on TWOS. The manual also includes a practical user's guide to the TWOS application benchmark, Colliding Pucks. TWOS supports simulations written in the C programming language. It is designed to run on the Sun3/Sun4 series computers and the BBN "Butterfly" GP-1000 computer. The standard distribution medium for this package is a .25 inch tape cartridge in TAR format. TWOS was developed in 1989 and updated in 1991. This program is a copyrighted work with all copyright vested in NASA. Sun3 and Sun4 are trademarks of Sun Microsystems, Inc.
Mathematics in medicine: tumor detection, radiation dosimetry, and simulation in psychotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bellman, R.; Kashef, B.; Smith, C.P.
1975-05-01
Work done in the application of mathematics to medicine over the last 20 years is briefly reviewed. Scan-rescan processes, radiation dosimetry, and medical interviewing are discussed. The first uses dynamic programming, the second invariant imbedding, and the third simulation. (ACR)
2014-11-14
responses from any analyte under consideration. Figure 1 illustrates this behavior. Figure 1: LIBS spectra from OVA (ricin simulant) on several different substrates: steel, aluminum, and polycarbonate.
Conducting a Hiring Fair Simulation for Teacher Education Candidates
ERIC Educational Resources Information Center
Mosier, Brian; Heidorn, Brent; Johnson, Christie
2015-01-01
The purpose of this article is to provide a basic review of a hiring simulation fair, and to describe strategies for successfully implementing a similar organized event in a college/university teacher education program in order to better prepare students for the interview process.
Simulated Batch Production of Penicillin
ERIC Educational Resources Information Center
Whitaker, A.; Walker, J. D.
1973-01-01
Describes a program in applied biology in which the simulation of the production of penicillin in a batch fermentor is used as a teaching technique to give students experience before handling a genuine industrial fermentation process. Details are given for the calculation of minimum production cost. (JR)
Li, Jia; Xu, Zhenming; Zhou, Yaohe
2008-05-30
Traditionally, the mixed metals from waste printed circuit boards (PCBs) were sent to a smelting plant to refine pure copper. Some valuable metals (aluminum, zinc and tin) present at low content in PCBs were lost during smelting. A new method using a roll-type electrostatic separator (RES) to recover low-content metals from waste PCBs is presented in this study. A theoretical model, established by computing the electric field and analyzing the forces on the particles, was used to write a program in the MATLAB language. The program was designed to simulate the process of separating mixed metal particles. Electrical, material and mechanical factors were analyzed to optimize the operating parameters of the separator. The experimental results of separating copper and aluminum particles by RES were in good agreement with the computer simulation results. The model can be used to simulate the separation of other metal (tin, zinc, etc.) particles in the process of recycling waste PCBs by RES.
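A minimal sketch of the kind of force balance such a model evaluates for a charged particle riding the rotating roll: detachment when centrifugal force exceeds the electric image attraction plus the radial gravity component. The image-force form and every number below are illustrative assumptions, not the authors' MATLAB code:

    import math

    EPS0 = 8.854e-12   # vacuum permittivity, F/m

    def detaches(m, q, r_roll, omega, theta_deg):
        """theta measured from the top of the roll; q = particle charge (C)."""
        f_centrifugal = m * omega**2 * r_roll
        a = 1e-3                                   # assumed particle radius ~1 mm
        f_image = q**2 / (16 * math.pi * EPS0 * a**2)   # grounded-plane image force
        f_gravity_radial = m * 9.81 * math.cos(math.radians(theta_deg))
        return f_centrifugal > f_image + f_gravity_radial

    print(detaches(m=1e-5, q=1e-10, r_roll=0.1, omega=30.0, theta_deg=20.0))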
Experiences with serial and parallel algorithms for channel routing using simulated annealing
NASA Technical Reports Server (NTRS)
Brouwer, Randall Jay
1988-01-01
Two algorithms for channel routing using simulated annealing are presented. Simulated annealing is an optimization methodology which allows the solution process to back up out of local minima that may be encountered through inappropriate selections. By properly controlling the annealing process, it is very likely that the optimal solution to an NP-complete problem such as channel routing may be found. The algorithm presented imposes very relaxed restrictions on the types of allowable transformations, including overlapping nets. By relaxing that restriction and controlling overlap situations with an appropriate cost function, the algorithm becomes very flexible and can be applied to many extensions of channel routing. The selection of the transformation utilizes a number of heuristics, still retaining the pseudorandom nature of simulated annealing. The algorithm was implemented as a serial program for a workstation, and as a parallel program designed for a hypercube computer. The details of the serial implementation are presented, including many of the heuristics used and some of the resulting solutions.
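For concreteness, the Metropolis acceptance and geometric cooling that simulated annealing relies on can be sketched as a generic skeleton (the cost and neighbor functions here are placeholders, not the authors' routing cost):

    import math, random

    def anneal(state, cost, neighbor, t0=10.0, alpha=0.95, steps_per_t=100, t_min=1e-3):
        t, best = t0, state
        while t > t_min:
            for _ in range(steps_per_t):
                cand = neighbor(state)
                delta = cost(cand) - cost(state)
                if delta < 0 or random.random() < math.exp(-delta / t):
                    state = cand                  # accept, possibly uphill
                    if cost(state) < cost(best):
                        best = state
            t *= alpha                            # geometric cooling schedule
        return best

    # toy usage: minimize a 1-D function
    print(anneal(5.0, cost=lambda x: (x - 2) ** 2,
                 neighbor=lambda x: x + random.uniform(-1, 1)))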
MHSS: a material handling system simulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pomernacki, L.; Hollstien, R.B.
1976-04-07
A Material Handling System Simulator (MHSS) program is described that provides specialized functional blocks for modeling and simulation of nuclear material handling systems. Models of nuclear fuel fabrication plants may be built using functional blocks that simulate material receiving, storage, transport, inventory, processing, and shipping operations as well as the control and reporting tasks of operators or on-line computers. Blocks are also provided that allow the user to observe and gather statistical information on the dynamic behavior of simulated plants over single or replicated runs. Although it is currently being developed for the nuclear materials handling application, MHSS can be adapted to other industries in which material accountability is important. In this paper, emphasis is on the simulation methodology of the MHSS program with application to the nuclear material safeguards problem. (auth)
Jung, Kwang-Wook; Yoon, Choon-G; Jang, Jae-Ho; Kong, Dong-Soo
2008-01-01
Effective watershed management often demands qualitative and quantitative predictions of the effect of future management activities as arguments for policy makers and administrators. The BASINS geographic information system was developed to compute total maximum daily loads, and it is helpful for establishing hydrological process and water quality modeling systems. In this paper the BASINS toolkit model HSPF is applied to the large, 20,271 km² watershed of the Han River Basin to assess the applicability of HSPF and of BMP scenarios. For proper evaluation of watershed and stream water quality, comprehensive estimation methods are necessary to assess large amounts of point-source and nonpoint-source (NPS) pollution based on the total watershed area. In this study, the Hydrological Simulation Program-FORTRAN (HSPF) was used to simulate watershed pollutant loads under dam operation and applied BMP scenarios for the control of NPS pollution. The 8-day monitoring data (about three years) were used in the calibration and verification processes. Model performance was in the range of "very good" and "good" based on percent difference. The water-quality simulation results were encouraging for this sizable watershed with dam operation practice and mixed land uses; HSPF proved adequate, and its application is recommended to simulate watershed processes and to evaluate BMPs. IWA Publishing 2008.
Processing experiments on non-Czochralski silicon sheet
NASA Technical Reports Server (NTRS)
Pryor, R. A.; Grenon, L. A.; Sakiotis, N. G.; Pastirik, E. M.; Sparks, T. O.; Legge, R. N.
1981-01-01
A program is described which supports and promotes the development of processing techniques which may be successfully and cost-effectively applied to low-cost sheets for solar cell fabrication. Results are reported in the areas of process technology, cell design, cell metallization, and production cost simulation.
Improvement of the Processes of Liquid-Phase Epitaxial Growth of Nanoheteroepitaxial Structures
NASA Astrophysics Data System (ADS)
Maronchuk, I. I.; Sanikovich, D. D.; Potapkov, P. V.; Vel'chenko, A. A.
2018-05-01
We have revealed the shortcomings of equipment and technological approaches in growing nanoheteroepitaxial structures with quantum dots by liquid-phase epitaxy. We have developed and fabricated a new vertical barrel-type cassette for growing quantum dots and epitaxial layers of various thicknesses in one technological process. A physico-mathematical simulation of the processes of liquid-phase epitaxial growth of quantum-dimensional structures has been carried out with the use of the program product SolidWorks (Flow Simulation). Analysis has revealed the presence of negative factors influencing the growth process of the above structures. The mathematical model has been optimized, and the equipment has been modernized without additional experiments and measurements. The flow dynamics of the process gas in the reactor at various flow rates has been investigated. A method for tuning the thermal equipment has been developed. The calculated and experimental temperature distributions in the process of growing structures with high reproducibility are in good agreement, which confirms the validity of the modernization made.
Bernal, Javier; Torres-Jimenez, Jose
2015-01-01
SAGRAD (Simulated Annealing GRADient), a Fortran 77 program for computing neural networks for classification using batch learning, is discussed. Neural network training in SAGRAD is based on a combination of simulated annealing and Møller’s scaled conjugate gradient algorithm, the latter a variation of the traditional conjugate gradient method, better suited for the nonquadratic nature of neural networks. Different aspects of the implementation of the training process in SAGRAD are discussed, such as the efficient computation of gradients and multiplication of vectors by Hessian matrices that are required by Møller’s algorithm; the (re)initialization of weights with simulated annealing required to (re)start Møller’s algorithm the first time and each time thereafter that it shows insufficient progress in reaching a possibly local minimum; and the use of simulated annealing when Møller’s algorithm, after possibly making considerable progress, becomes stuck at a local minimum or flat area of weight space. Outlines of the scaled conjugate gradient algorithm, the simulated annealing procedure and the training process used in SAGRAD are presented together with results from running SAGRAD on two examples of training data. PMID:26958442
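The Hessian-vector products mentioned above can be formed without ever building the Hessian; a minimal sketch of the standard finite-difference trick used in scaled conjugate gradient, where grad() is a stand-in for the network's gradient routine (illustrative Python, not SAGRAD's Fortran):

    import numpy as np

    def hessian_vector(grad, w, p, sigma0=1e-4):
        # H p ~ (grad(w + sigma p) - grad(w)) / sigma, two gradient calls only.
        sigma = sigma0 / np.linalg.norm(p)
        return (grad(w + sigma * p) - grad(w)) / sigma

    # toy usage with E(w) = ||w||^2, so H = 2I and H p should equal 2 p
    grad = lambda w: 2.0 * w
    w, p = np.array([1.0, -2.0]), np.array([0.5, 0.5])
    print(hessian_vector(grad, w, p))   # approximately [1.0, 1.0]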
An interactive drilling simulator for teaching and research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooper, G.A.; Cooper, A.G.; Bihn, G.
1995-12-31
An interactive program has been constructed that allows a student or engineer to simulate the drilling of an oil well, and to optimize the drilling process by comparing different drilling plans. The program operates in a very user-friendly way, with emphasis on menu- and button-driven commands. The simulator may be run either as a training program, with exercises that illustrate various features of the drilling process; as a game, in which a student is set a challenge to drill a well with minimum cost or time under constraints set by an instructor; or as a simulator of a real situation to investigate the merit of different drilling strategies. It has three main parts: a Lithology Editor, a Settings Editor and the simulation program itself. The Lithology Editor allows the student, instructor or engineer to build a real or imaginary sequence of rock layers, each characterized by its mineralogy, drilling and log responses. The Settings Editor allows the definition of all the operational parameters, ranging from the drilling and wear rates of particular bits in specified rocks to the costs of different procedures. The simulator itself contains an algorithm that determines the rate of penetration and the rate of wear of the bit as drilling continues. It also determines whether the well kicks or fractures, and assigns various other "accident" conditions. During operation, a depth vs. time curve is displayed, together with a "mud log" showing the rock layers penetrated. If desired, the well may be "logged," casings may be set, and pore and fracture pressure gradients may be displayed. During drilling, the total time and cost are shown, together with the cost per foot in total and for the current bit run.
AESS: Accelerated Exact Stochastic Simulation
NASA Astrophysics Data System (ADS)
Jenkins, David D.; Peterson, Gregory D.
2011-12-01
The Stochastic Simulation Algorithm (SSA) developed by Gillespie provides a powerful mechanism for exploring the behavior of chemical systems with small species populations or with important noise contributions. Gene circuit simulations for systems biology commonly employ the SSA method, as do ecological applications. This algorithm tends to be computationally expensive, so researchers seek an efficient implementation of SSA. In this program package, the Accelerated Exact Stochastic Simulation Algorithm (AESS) contains optimized implementations of Gillespie's SSA that improve the performance of individual simulation runs or ensembles of simulations used for sweeping parameters or to provide statistically significant results. Program summary: Program title: AESS. Catalogue identifier: AEJW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: University of Tennessee copyright agreement. No. of lines in distributed program, including test data, etc.: 10 861. No. of bytes in distributed program, including test data, etc.: 394 631. Distribution format: tar.gz. Programming language: C for processors, CUDA for NVIDIA GPUs. Computer: Developed and tested on various x86 computers and NVIDIA C1060 Tesla and GTX 480 Fermi GPUs. The system targets x86 workstations, optionally with multicore processors or NVIDIA GPUs as accelerators. Operating system: Tested under Ubuntu Linux OS and CentOS 5.5 Linux OS. Classification: 3, 16.12. Nature of problem: Simulation of chemical systems, particularly with low species populations, can be accurately performed using Gillespie's method of stochastic simulation. Numerous variations on the original stochastic simulation algorithm have been developed, including approaches that produce results with statistics that exactly match the chemical master equation (CME) as well as other approaches that approximate the CME. Solution method: The Accelerated Exact Stochastic Simulation (AESS) tool provides implementations of a wide variety of popular variations on the Gillespie method. Users can select the specific algorithm considered most appropriate. Comparisons between the methods and with other available implementations indicate that AESS provides the fastest known implementation of Gillespie's method for a variety of test models. Users may wish to execute ensembles of simulations to sweep parameters or to obtain better statistical results, so AESS supports acceleration of ensembles of simulations using parallel processing with MPI, SSE vector units on x86 processors, and/or NVIDIA GPUs with CUDA.
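As a baseline for the variants AESS implements, one step of Gillespie's direct method can be sketched in a few lines (illustrative Python, not AESS code): draw the waiting time from an exponential with rate a0 and pick the firing reaction in proportion to its propensity.

    import math, random

    def ssa_step(propensities):
        a0 = sum(propensities)
        if a0 == 0.0:
            return None, math.inf                      # no reaction can fire
        tau = -math.log(1.0 - random.random()) / a0    # exponential waiting time
        r, acc = random.random() * a0, 0.0
        for j, a in enumerate(propensities):
            acc += a
            if r <= acc:
                return j, tau

    print(ssa_step([0.5, 1.5, 2.0]))                   # (reaction index, waiting time)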
Simulation of beam-induced plasma in gas-filled rf cavities
Yu, Kwangmin; Samulyak, Roman; Yonehara, Katsuya; ...
2017-03-07
Processes occurring in a radio-frequency (rf) cavity, filled with high pressure gas and interacting with proton beams, have been studied via advanced numerical simulations. Simulations support the experimental program on the hydrogen gas-filled rf cavity in the Mucool Test Area (MTA) at Fermilab, and broader research on the design of muon cooling devices. SPACE, a 3D electromagnetic particle-in-cell (EM-PIC) code with atomic physics support, was used in simulation studies. Plasma dynamics in the rf cavity, including the process of neutral gas ionization by proton beams, plasma loading of the rf cavity, and atomic processes in plasma such as electron-ion and ion-ion recombination and electron attachment to dopant molecules, have been studied. Here, through comparison with experiments in the MTA, simulations quantified several uncertain values of plasma properties such as effective recombination rates and the attachment time of electrons to dopant molecules. Simulations have achieved very good agreement with experiments on plasma loading and related processes. Lastly, the experimentally validated code SPACE is capable of predictive simulations of muon cooling devices.
Using Simulation in a Psychiatric Mental Health Nurse Practitioner Doctoral Program.
Calohan, Jess; Pauli, Eric; Combs, Teresa; Creel, Andrea; Convoy, Sean; Owen, Regina
The use and effectiveness of simulation with standardized patients in undergraduate and graduate nursing education programs is well documented. Simulation has been primarily used to develop health assessment skills. Evidence supports that using simulation and standardized patients in psychiatric-mental health nurse practitioner (PMHNP) programs is useful in developing psychosocial assessment skills. These interactions provide individualized and instantaneous clinical feedback to the student from faculty, peers, and standardized patients. Incorporating simulation into advanced practice psychiatric-mental health nursing curricula allows students to develop the requisite skills and principles needed to safely and effectively provide care to patients. There are no documented standardized processes for using simulation throughout a doctor of nursing practice PMHNP curriculum. The purpose of this article is to describe a framework for using simulation with standardized patients in a PMHNP curriculum. Students report high levels of satisfaction with the simulation experience and believe that they are more prepared for clinical rotations. Faculty feedback indicates that simulated clinical scenarios are a method to ensure that each student demonstrates a minimum standard of competency ahead of clinical rotations with live patients. Initial preceptor feedback indicates that students are more prepared for clinical practice and function more independently than students who did not experience this standardized clinical simulation framework. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Manzara, Leonard Charles
1990-01-01
The dissertation is in two parts: 1. Percussion Sextet. The Percussion Sextet is a one-movement musical composition with a length of approximately fifteen minutes. It is for six instrumentalists, each on a number of percussion instruments. The overriding formal problem was to construct a coherent and compelling structure which fuses a diversity of musical materials and textures into a dramatic whole. Particularly important is the synthesis of opposing tendencies contained in stochastic and deterministic processes: global textures versus motivic detail, and randomness versus total control. Several compositional techniques are employed in the composition. These methods of composition will be aided, in part, by the use of artificial intelligence techniques programmed on a computer. Finally, the percussion ensemble is the ideal medium to realize the above processes, since it encompasses a wide range of both pitched and unpitched timbres, and since a great variety of textures and densities can be created with a certain economy of means. 2. The simulation of acoustical space by means of physical modeling. This is a written report describing the research and development of a computer program which simulates the characteristics of acoustical space in two dimensions. With the computer program the user can simulate most conventional acoustical spaces, as well as those physically impossible to realize in the real world. The program simulates acoustical space by means of geometric modeling. This involves defining wall equations, phantom source points and wall diffusions, and then processing input files containing digital signals through the program, producing output files ready for digital-to-analog conversion. The user of the program is able to define wall locations and wall reflectivity and roughness characteristics, all of which can be changed over time. Sound source locations are also definable within the acoustical space, and these locations can be changed independently at any rate of speed. The sounds themselves are generated from any external sound synthesis program or appropriate sampling system. Finally, listener location and orientation are also user definable and dynamic in nature. A Receive-ReBroadcast (RRB) model is used to play back the sound and is definable from two to eight channels of sound. (Abstract shortened with permission of author.)
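The phantom-source construction described in part 2 can be sketched for a rectangular 2-D room: each wall mirrors the source, and each image contributes an echo delayed by distance/c and scaled by wall reflectivity over spreading loss. This first-order Python sketch uses placeholder geometry and reflectivity; it is not the report's program:

    import math

    def first_order_echoes(src, listener, width, height, reflectivity, c=343.0):
        sx, sy = src
        images = [(-sx, sy), (2 * width - sx, sy),     # left / right walls
                  (sx, -sy), (sx, 2 * height - sy)]    # bottom / top walls
        echoes = []
        for ix, iy in images:
            d = math.dist((ix, iy), listener)
            echoes.append((d / c, reflectivity / d))   # (delay s, amplitude)
        return echoes

    for delay, amp in first_order_echoes((2.0, 3.0), (6.0, 3.0), 10.0, 8.0, 0.8):
        print(f"delay {delay*1000:6.2f} ms, amplitude {amp:.3f}")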
Virtual commissioning of automated micro-optical assembly
NASA Astrophysics Data System (ADS)
Schlette, Christian; Losch, Daniel; Haag, Sebastian; Zontar, Daniel; Roßmann, Jürgen; Brecher, Christian
2015-02-01
In this contribution, we present a novel approach to enable virtual commissioning for process developers in micro-optical assembly. Our approach aims at supporting micro-optics experts in effectively developing assisted or fully automated assembly solutions without detailed prior experience in programming, while at the same time enabling them to easily implement their own libraries of expert schemes and algorithms for handling optical components. Virtual commissioning is enabled by a 3D simulation and visualization system in which the functionalities and properties of automated systems are modeled, simulated and controlled based on multi-agent systems. For process development, our approach supports event-, state- and time-based visual programming techniques for the agents and allows for their kinematic motion simulation in combination with looped-in simulation results for the optical components. First results have been achieved by simply switching the agents to command the real hardware setup after successful process implementation and validation in the virtual environment. We evaluated and adapted our system to meet the requirements set by industrial partners: laser manufacturers as well as hardware suppliers of assembly platforms. The concept is applied to the automated assembly of optical components for optically pumped semiconductor lasers and the positioning of optical components for beam shaping.
Testing and Analytical Modeling for Purging Process of a Cryogenic Line
NASA Technical Reports Server (NTRS)
Hedayat, A.; Mazurkivich, P. V.; Nelson, M. A.; Majumdar, A. K.
2015-01-01
To gain confidence in developing analytical models of the purging process for the cryogenic main propulsion systems of upper stages, two test series were conducted. The test article, an inclined line 3.35 m long with a diameter of 20 cm, was filled with liquid or gaseous hydrogen and then purged with gaseous helium (GHe). A total of 10 tests were conducted. The influences of GHe flow rates and initial temperatures were evaluated. The Generalized Fluid System Simulation Program (GFSSP), an in-house general-purpose fluid system analyzer computer program, was utilized to model and simulate selected tests. The test procedures, modeling descriptions, and the results are presented in the following sections.
NASA Technical Reports Server (NTRS)
Lansing, F. L.; Strain, D. M.; Chai, V. W.; Higgins, S.
1979-01-01
The Energy Consumption Computer Program was developed to simulate building heating and cooling loads and compute thermal and electric energy consumption and cost. This article reports on the new additional algorithms and modifications made in an effort to widen the areas of application. The program structure was rewritten accordingly to refine and advance the building model and to further reduce the processing time and cost. The program is noted for its very low cost and ease of use compared to other available codes. The accuracy of computations is not sacrificed, however, since the results are expected to lie within + or - 10% of actual energy meter readings.
Interface design of VSOP'94 computer code for safety analysis
NASA Astrophysics Data System (ADS)
Natsir, Khairina; Yazid, Putranto Ilham; Andiwijayakusuma, D.; Wahanani, Nursinta Adi
2014-09-01
Today, most software applications, also in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system to simulate the life history of a nuclear reactor and is devoted to education and research. One advantage of the VSOP program is its ability to calculate neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integrals, estimation of reactor fuel costs, and integrated thermal hydraulics. VSOP can also be used for comparative studies and simulation of reactor safety. However, the existing VSOP is a conventional program, which was developed using Fortran 65 and has several problems in use; for example, it operates only on DEC Alpha mainframe platforms, provides text-based output, and is difficult to use, especially in data preparation and interpretation of results. We developed GUI-VSOP, an interface program to facilitate the preparation of data, run the VSOP code and read the results in a more user-friendly way, usable on a personal computer (PC). Modifications include the development of interfaces for preprocessing, processing and postprocessing. The GUI-based interface for preprocessing aims to provide a convenient way of preparing data. The processing interface is intended to provide convenience in configuring input files and libraries and in compiling the VSOP code. The postprocessing interface is designed to visualize the VSOP output in table and graphic forms. GUI-VSOP is expected to be useful for simplifying and speeding up the process and the analysis of safety aspects.
The Center-TRACON Automation System: Simulation and field testing
NASA Technical Reports Server (NTRS)
Denery, Dallas G.; Erzberger, Heinz
1995-01-01
A new concept for air traffic management in the terminal area, implemented as the Center-TRACON Automation System, has been under development at NASA Ames in a cooperative program with the FAA since 1991. The development has been strongly influenced by concurrent simulation and field site evaluations. The role of simulation and field activities in the development process will be discussed. Results of recent simulation and field tests will be presented.
NASA Technical Reports Server (NTRS)
Lacovara, R. C.
1990-01-01
The notions, benefits, and drawbacks of numeric simulation are introduced. Two formal simulation languages, SIMSCRIPT and MODSIM, are introduced. The capabilities of each are discussed briefly, and then the two are compared. The use of simulation in the process of design engineering for the Control and Monitoring System (CMS) for Space Station Freedom is discussed. The application of the formal simulation languages to the CMS design is presented, and recommendations are made as to their use.
Progress in modeling and simulation.
Kindler, E
1998-01-01
For the modeling of systems, computers are used more and more, while the other "media" (including the human intellect) carrying the models are abandoned. For the modeling of knowledge, i.e. of more or less general concepts (possibly used to model systems composed of instances of such concepts), object-oriented programming is nowadays widely used. For the modeling of processes that exist and develop in time, computer simulation is used, the results of which are often presented by means of animation (graphical pictures moving and changing in time). Unfortunately, object-oriented programming tools are commonly not designed to be of great use for simulation, while programming tools for simulation do not enable their users to apply the advantages of object-oriented programming. Nevertheless, there are exceptions that enable general concepts represented in a computer to be used for constructing simulation models and for modifying them easily. They are described in the present paper, together with precise definitions of modeling, simulation and object-oriented programming (including cases that do not satisfy the definitions but are liable to introduce misunderstanding), an outline of their applications and of their further development. Given that computing systems are being introduced as control components into a large spectrum of (technological, social and biological) systems, attention is directed to models of systems containing modeling components.
Creating and Testing Simulation Software
NASA Technical Reports Server (NTRS)
Heinich, Christina M.
2013-01-01
The goal of this project is to learn about the software development process, specifically the process to test and fix components of the software. The paper will cover the techniques of testing code, and the benefits of using one style of testing over another. It will also discuss the overall software design and development lifecycle, and how code testing plays an integral role in it. Coding is notorious for always needing to be debugged due to coding errors or faulty program design. Writing tests, either before or during program creation, that cover all aspects of the code provides a relatively easy way to locate and fix errors, which will in turn decrease the necessity to fix a program after it is released for common use. The backdrop for this paper is the Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI), a project whose goal is to simulate a launch using simulated models of the ground systems and the connections between them and the control room. The simulations will be used for training and to ensure that all possible outcomes and complications are prepared for before the actual launch day. The code being tested is the Programmable Logic Controller Interface (PLCIF) code, the component responsible for transferring the information from the models to the model Programmable Logic Controllers (PLCs), basic computers that are used for very simple tasks.
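A small illustration of the test-first style the paper discusses: the test pins down expected behavior before (or while) the code is written, so a regression is caught the moment it appears. The function and names here are hypothetical, not taken from the SCCS PLCIF code:

    def scale_signal(raw, lo, hi):
        """Map a raw 0-4095 ADC count onto the engineering range [lo, hi]."""
        return lo + (hi - lo) * raw / 4095.0

    def test_scale_signal():
        assert scale_signal(0, 0.0, 100.0) == 0.0
        assert scale_signal(4095, 0.0, 100.0) == 100.0
        assert abs(scale_signal(2048, 0.0, 100.0) - 50.0) < 0.1

    test_scale_signal()   # with pytest this would be collected automatically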
User's guide to STIPPAN: A panel method program for slotted tunnel interference prediction
NASA Technical Reports Server (NTRS)
Kemp, W. B., Jr.
1985-01-01
Guidelines are presented for use of the computer program STIPPAN to simulate the subsonic flow in a slotted wind tunnel test section with a known model disturbance. Input data requirements are defined in detail and other aspects of the program usage are discussed in more general terms. The program is written for use in a CDC CYBER 200 class vector processing system.
Streamlining DOD Acquisitions: Balancing Schedule with Complexity
2006-09-01
from them has a distinct industrial flavor: streamlined processes, benchmarking, and business models. The requirements generation community led by...model), and the Department of the Navy assumed program lead. [Stable Program Inputs (-)] By 1984, the program goals included delivery of 913 V-22...they subsequently specified a crew of two. [Stable Program Input (-)] The contractor team won in a "fly-off" solely via modeling and simulation
Airborne Systems Technology Application to the Windshear Threat
NASA Technical Reports Server (NTRS)
Arbuckle, P. Douglas; Lewis, Michael S.; Hinton, David A.
1996-01-01
The general approach and products of the NASA/FAA Airborne Windshear Program conducted by NASA Langley Research Center are summarized, with references provided for the major technical contributions. During this period, NASA conducted 2 years of flight testing to characterize forward-looking sensor performance. The NASA/FAA Airborne Windshear Program was divided into three main elements: Hazard Characterization, Sensor Technology, and Flight Management Systems. Simulation models developed under the Hazard Characterization element are correlated with flight test data. Flight test results comparing the performance and characteristics of the various Sensor Technologies (microwave radar, lidar, and infrared) are presented. Most of the activities in the Flight Management Systems element were conducted in simulation. Simulation results from a study evaluating windshear crew procedures and displays for forward-looking sensor-equipped airplanes are discussed. NASA Langley researchers participated heavily in the FAA process of generating certification guidelines for predictive windshear detection systems. NASA participants felt that more valuable technology products were generated by the program because of this interaction. NASA involvement in the process and the resulting impact on products and technology transfer are discussed in this paper.
The TeraShake Computational Platform for Large-Scale Earthquake Simulations
NASA Astrophysics Data System (ADS)
Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas
Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for larger and more complex systems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
LANES - LOCAL AREA NETWORK EXTENSIBLE SIMULATOR
NASA Technical Reports Server (NTRS)
Gibson, J.
1994-01-01
The Local Area Network Extensible Simulator (LANES) provides a method for simulating the performance of high speed local area network (LAN) technology. LANES was developed as a design and analysis tool for networking on board the Space Station. The load, network, link and physical layers of a layered network architecture are all modeled. LANES models two different lower-layer protocols, the Fiber Distributed Data Interface (FDDI) and the Star*Bus. The load and network layers are included in the model as a means of introducing upper-layer processing delays associated with message transmission; they do not model any particular protocols. FDDI is an American National Standard and an International Organization for Standardization (ISO) draft standard for a 100 megabit-per-second fiber-optic token ring. Specifications for the LANES model of FDDI are taken from the Draft Proposed American National Standard FDDI Token Ring Media Access Control (MAC), document number X3T9.5/83-16 Rev. 10, February 28, 1986. This is a mature document describing the FDDI media-access-control protocol. Star*Bus, also known as the Fiber Optic Demonstration System, is a protocol for a 100 megabit-per-second fiber-optic star-topology LAN. This protocol, along with a hardware prototype, was developed by Sperry Corporation under contract to NASA Goddard Space Flight Center as a candidate LAN protocol for the Space Station. LANES can be used to analyze the performance of a networking system based on either FDDI or Star*Bus under a variety of loading conditions. Delays due to upper-layer processing can easily be nullified, allowing analysis of FDDI or Star*Bus as stand-alone protocols. LANES is a parameter-driven simulation; it provides considerable flexibility in specifying both protocol and run-time parameters. Code has been optimized for fast execution, and detailed tracing facilities have been included. LANES was written in FORTRAN 77 for implementation on a DEC VAX under VMS 4.6. It consists of two programs, a simulation program and a user-interface program. The simulation program requires the SLAM II simulation library from Pritsker and Associates, W. Lafayette, IN; the user interface is implemented using the Ingres database manager from Relational Technology, Inc. Information about running the simulation program without the user-interface program is contained in the documentation. The memory requirement is 129,024 bytes. LANES was developed in 1988.
Process control charts in infection prevention: Make it simple to make it happen.
Wiemken, Timothy L; Furmanek, Stephen P; Carrico, Ruth M; Mattingly, William A; Persaud, Annuradha K; Guinn, Brian E; Kelley, Robert R; Ramirez, Julio A
2017-03-01
Quality improvement is central to Infection Prevention and Control (IPC) programs. Challenges may occur when applying quality improvement methodologies like process control charts, often because typical infection preventionists (IPs) have had limited exposure to these methods. Because of this, our team created an open-source database with a process control chart generator for IPC programs. The objectives of this report are to outline the development of the application and to demonstrate its use with simulated data. We used Research Electronic Data Capture (REDCap Consortium, Vanderbilt University, Nashville, TN), R (R Foundation for Statistical Computing, Vienna, Austria), and R Studio Shiny (R Foundation for Statistical Computing) to create an open-source data collection system with automated process control chart generation. We used simulated data to test and visualize both in-control and out-of-control processes for commonly used metrics in IPC programs. The R code for implementing the control charts and the Shiny application can be found on our Web site (https://github.com/ul-research-support/spcapp). Screen captures of the workflow and simulated data indicating both common cause and special cause variation are provided. Process control charts can be easily developed based on individual facility needs using freely available software. By providing our work free to all interested parties, we hope that others will be able to harness the power and ease of use of the application for improving the quality of care and patient safety in their facilities. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
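As a rough illustration of the arithmetic such a generator performs (the report's own implementation is in R/Shiny), the following sketch computes 3-sigma limits for a u-chart of infections per device-day, a chart type commonly used in IPC; all counts and denominators here are invented:

# Minimal u-chart limits for infection counts per device-days (illustrative data only).
import math

infections = [4, 2, 5, 3, 6, 2, 4]              # events per month (hypothetical)
device_days = [900, 850, 940, 910, 980, 870, 920]

u_bar = sum(infections) / sum(device_days)      # pooled center line
for events, n in zip(infections, device_days):
    u = events / n
    sigma = math.sqrt(u_bar / n)                # u-chart standard error per subgroup
    ucl, lcl = u_bar + 3 * sigma, max(0.0, u_bar - 3 * sigma)
    flag = "special cause" if (u > ucl or u < lcl) else "common cause"
    print(f"u={u:.4f}  UCL={ucl:.4f}  LCL={lcl:.4f}  {flag}")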
BASIC Simulation Programs; Volumes V and VI. Social Studies, Teacher Assistance.
ERIC Educational Resources Information Center
Digital Equipment Corp., Maynard, MA.
Five computer programs which teach concepts and processes related to social studies (in the main, economics) are presented. The subjects of the programs are the distinction between balance of trade and balance of payments; installment buying, loan payments, and savings accounts; flow of goods, services, and money between business and the consumer;…
Linking laser scanning to snowpack modeling: Data processing and visualization
NASA Astrophysics Data System (ADS)
Teufelsbauer, H.
2009-07-01
SnowSim is a newly developed physical snowpack model that can use three-dimensional terrestrial laser scanning data to generate model domains. This greatly simplifies the input and numerical simulation of snow covers in complex terrains. The program can model two-dimensional cross sections of general slopes, with complicated snow distributions. The model predicts temperature distributions and snow settlements in this cross section. Thus, the model can be used for a wide range of problems in snow science and engineering, including numerical investigations of avalanche formation. The governing partial differential equations are solved by means of the finite element method, using triangular elements. All essential data for defining the boundary conditions and evaluating the simulation results are gathered by automatic weather and snow measurement sites. This work focuses on the treatment of these measurements and the simulation results, and presents a pre- and post-processing graphical user interface (GUI) programmed in Matlab.
Facilitating Co-Design for Extreme-Scale Systems Through Lightweight Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engelmann, Christian; Lauer, Frank
This work focuses on tools for investigating algorithm performance at extreme scale with millions of concurrent threads and for evaluating the impact of future architecture choices to facilitate the co-design of high-performance computing (HPC) architectures and applications. The approach focuses on lightweight simulation of extreme-scale HPC systems with the needed amount of accuracy. The prototype presented in this paper is able to provide this capability using a parallel discrete event simulation (PDES), such that a Message Passing Interface (MPI) application can be executed at extreme scale, and its performance properties can be evaluated. The results of an initial prototype are encouraging as a simple 'hello world' MPI program could be scaled up to 1,048,576 virtual MPI processes on a four-node cluster, and the performance properties of two MPI programs could be evaluated at up to 16,384 virtual MPI processes on the same system.
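For readers unfamiliar with the technique, the heart of any discrete event simulation is a time-ordered event queue. The sketch below is a deliberately tiny sequential version of that core; it is not the prototype described above, and the per-rank startup latency is invented:

# Minimal discrete-event core of the kind a lightweight PDES builds on.
import heapq

events = []                          # (virtual_time, sequence, rank, action)
seq = 0

def schedule(t, rank, action):
    global seq
    heapq.heappush(events, (t, seq, rank, action))
    seq += 1

def hello(rank):
    print(f"virtual MPI rank {rank} says hello")

NP = 8                               # stand-in for millions of virtual processes
for r in range(NP):
    schedule(r * 0.001, r, hello)    # hypothetical per-rank startup latency

now = 0.0
while events:                        # pop events in virtual-time order
    now, _, rank, action = heapq.heappop(events)
    action(rank)
print(f"simulation ended at virtual time {now:.3f}s")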
Computer modeling and simulation in inertial confinement fusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCrory, R.L.; Verdon, C.P.
1989-03-01
The complex hydrodynamic and transport processes associated with the implosion of an inertial confinement fusion (ICF) pellet place considerable demands on numerical simulation programs. Processes associated with implosion can usually be described using relatively simple models, but their complex interplay requires that programs model most of the relevant physical phenomena accurately. Most hydrodynamic codes used in ICF incorporate a one-fluid, two-temperature model. Electrons and ions are assumed to flow as one fluid (no charge separation). Due to the relatively weak coupling between the ions and electrons, each species is treated separately in terms of its temperature. In this paper we describe some of the major components associated with an ICF hydrodynamics simulation code. To serve as an example we draw heavily on a two-dimensional Lagrangian hydrodynamic code (ORCHID) written at the University of Rochester's Laboratory for Laser Energetics. 46 refs., 19 figs., 1 tab.
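In schematic form (not ORCHID's specific formulation), a one-fluid, two-temperature model advances a single momentum equation while carrying separate electron and ion temperatures coupled through an exchange term:

\rho \frac{d\mathbf{u}}{dt} = -\nabla (p_e + p_i), \qquad
C_e \frac{dT_e}{dt} = \nabla \cdot (\kappa_e \nabla T_e) - G\,(T_e - T_i) + S_e, \qquad
C_i \frac{dT_i}{dt} = \nabla \cdot (\kappa_i \nabla T_i) + G\,(T_e - T_i),

where C_e and C_i are heat capacities, \kappa_e and \kappa_i thermal conductivities, G the electron-ion coupling coefficient, and S_e the external (e.g., laser) energy deposition.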
NASA Astrophysics Data System (ADS)
Kochanov, R. V.; Gordon, I. E.; Rothman, L. S.; Wcisło, P.; Hill, C.; Wilzewski, J. S.
2016-07-01
The HITRAN Application Programming Interface (HAPI) is presented. HAPI is a free Python library, which extends the capabilities of the HITRANonline interface (www.hitran.org) and can be used to filter and process the structured spectroscopic data. HAPI incorporates a set of tools for spectra simulation accounting for the temperature, pressure, optical path length, and instrument properties. HAPI is aimed to facilitate the spectroscopic data analysis and the spectra simulation based on the line-by-line data, such as from the HITRAN database [JQSRT (2013) 130, 4-50], allowing the usage of the non-Voigt line profile parameters, custom temperature and pressure dependences, and partition sums. The HAPI functions allow the user to control the spectra simulation and data filtering process via a set of the function parameters. HAPI can be obtained at its homepage www.hitran.org/hapi.
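A typical HAPI session, as described in the paper and the online manual, follows a fetch-then-compute pattern; the sketch below shows the shape of such a session (function and parameter names are taken from the HAPI documentation as best recalled and should be checked against the current release):

# Sketch of a typical HAPI workflow; verify names against the release at hitran.org/hapi.
from hapi import db_begin, fetch, absorptionCoefficient_Lorentz, transmittanceSpectrum

db_begin('hitran_data')                       # local folder for downloaded line lists
fetch('CO2', 2, 1, 2200, 2400)                # main CO2 isotopologue, 2200-2400 cm-1
nu, coef = absorptionCoefficient_Lorentz(
    SourceTables='CO2',
    Environment={'T': 296.0, 'p': 1.0})       # temperature (K) and pressure (atm)
nu, trans = transmittanceSpectrum(nu, coef,
    Environment={'l': 100.0})                 # optical path length (cm)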
An Integrated Development Environment for Adiabatic Quantum Programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humble, Travis S; McCaskey, Alex; Bennink, Ryan S
2014-01-01
Adiabatic quantum computing is a promising route to the computational power afforded by quantum information processing. The recent availability of adiabatic hardware raises the question of how well quantum programs perform. Benchmarking behavior is challenging since the multiple steps to synthesize an adiabatic quantum program are highly tunable. We present an adiabatic quantum programming environment called JADE that provides control over all the steps taken during program development. JADE captures the workflow needed to rigorously benchmark performance while also allowing a variety of problem types, programming techniques, and processor configurations. We have also integrated JADE with a quantum simulation engine that enables program profiling using numerical calculation. The computational engine supports plug-ins for simulation methodologies tailored to various metrics and computing resources. We present the design, integration, and deployment of JADE and discuss its use for benchmarking adiabatic quantum programs.
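The numerical task such a simulation engine performs can be illustrated in miniature (this toy is unrelated to JADE's actual engine): evolve a state under H(s) = (1 - s) H_B + s H_P with a linear schedule and report the final ground-state overlap. The 2-qubit Ising instance below is chosen arbitrarily:

# Toy numerical profile of an adiabatic schedule on a 2-qubit Ising problem.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def kron(*ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

HP = -1.0 * kron(Z, Z) - 0.5 * kron(Z, I2)    # problem Hamiltonian (arbitrary couplings)
HB = -(kron(X, I2) + kron(I2, X))             # standard transverse-field driver

T, steps = 50.0, 2000                         # total anneal time, time slices
dt = T / steps
psi = np.ones(4, dtype=complex) / 2.0         # ground state of HB: uniform superposition
for k in range(steps):
    s = (k + 0.5) / steps                     # linear schedule s(t) = t/T
    w, V = np.linalg.eigh((1 - s) * HB + s * HP)
    psi = V @ (np.exp(-1j * w * dt) * (V.conj().T @ psi))  # exact step, piecewise-constant H

w, V = np.linalg.eigh(HP)
print("ground-state overlap:", abs(V[:, 0].conj() @ psi) ** 2)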
Interactive Media and Simulation Tools for Technical Training
NASA Technical Reports Server (NTRS)
Gramoll, Kurt
1997-01-01
Over the last several years, integration of multiple media sources into a single information system has been rapidly developing. It has been found that when sound, graphics, text, animations, and simulations are skillfully integrated, the sum of the parts exceeds the individual parts for effective learning. In addition, simulations can be used to design and understand complex engineering processes. With the recent introduction of many high-level authoring, animation, modeling, and rendering programs for personal computers, significant multimedia programs can be developed by practicing engineers, scientists and even managers for both training and education. However, even with these new tools, a considerable amount of time is required to produce an interactive multimedia program. The development of both CD-ROM and Web-based programs are discussed in addition to the use of technically oriented animations. Also examined are various multimedia development tools and how they are used to develop effective engineering education courseware. Demonstrations of actual programs in engineering mechanics are shown.
Las Palmeras Molecular Dynamics: A flexible and modular molecular dynamics code
NASA Astrophysics Data System (ADS)
Davis, Sergio; Loyola, Claudia; González, Felipe; Peralta, Joaquín
2010-12-01
Las Palmeras Molecular Dynamics (LPMD) is a highly modular and extensible molecular dynamics (MD) code using interatomic potential functions. LPMD is able to perform equilibrium MD simulations of bulk crystalline solids, amorphous solids and liquids, as well as non-equilibrium MD (NEMD) simulations such as shock wave propagation, projectile impacts, cluster collisions, shearing, deformation under load, heat conduction, heterogeneous melting, among others, which involve unusual MD features like non-moving atoms and walls, unstoppable atoms with constant-velocity, and external forces like electric fields. LPMD is written in C++ as a compromise between efficiency and clarity of design, and its architecture is based on separate components or plug-ins, implemented as modules which are loaded on demand at runtime. The advantage of this architecture is the ability to completely link together the desired components involved in the simulation in different ways at runtime, using a user-friendly control file language which describes the simulation work-flow. As an added bonus, the plug-in API (Application Programming Interface) makes it possible to use the LPMD components to analyze data coming from other simulation packages, convert between input file formats, apply different transformations to saved MD atomic trajectories, and visualize dynamical processes either in real-time or as a post-processing step. Individual components, such as a new potential function, a new integrator, a new file format, new properties to calculate, new real-time visualizers, and even a new algorithm for handling neighbor lists can be easily coded, compiled and tested within LPMD by virtue of its object-oriented API, without the need to modify the rest of the code. LPMD includes already several pair potential functions such as Lennard-Jones, Morse, Buckingham, MCY and the harmonic potential, as well as embedded-atom model (EAM) functions such as the Sutton-Chen and Gupta potentials. Integrators to choose include Euler (if only for demonstration purposes), Verlet and Velocity Verlet, Leapfrog and Beeman, among others. Electrostatic forces are treated as another potential function, by default using the plug-in implementing the Ewald summation method. Program summaryProgram title: LPMD Catalogue identifier: AEHG_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEHG_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 509 490 No. of bytes in distributed program, including test data, etc.: 6 814 754 Distribution format: tar.gz Programming language: C++ Computer: 32-bit and 64-bit workstation Operating system: UNIX RAM: Minimum 1024 bytes Classification: 7.7 External routines: zlib, OpenGL Nature of problem: Study of Statistical Mechanics and Thermodynamics of condensed matter systems, as well as kinetics of non-equilibrium processes in the same systems. Solution method: Equilibrium and non-equilibrium molecular dynamics method, Monte Carlo methods. Restrictions: Rigid molecules are not supported. Polarizable atoms and chemical bonds (proteins) either. Unusual features: The program is able to change the temperature of the simulation cell, the pressure, cut regions of the cell, color the atoms by properties, even during the simulation. It is also possible to fix the positions and/or velocity of groups of atoms. 
Visualization of atoms and some physical properties during the simulation. Additional comments: The program does not only perform molecular dynamics and Monte Carlo simulations, it is also able to filter and manipulate atomic configurations, read and write different file formats, convert between them, evaluate different structural and dynamical properties. Running time: 50 seconds on a 1000-step simulation of 4000 argon atoms, running on a single 2.67 GHz Intel processor.
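As a reference point, the textbook combination named in the abstract (velocity Verlet integration with a Lennard-Jones pair potential) fits in a few lines. This standalone sketch is not LPMD code:

# Minimal velocity-Verlet step with a Lennard-Jones pair force (illustrative only).
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    f = np.zeros_like(pos)
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            r = pos[i] - pos[j]
            d2 = float(r @ r)
            s6 = (sigma * sigma / d2) ** 3            # (sigma/r)^6
            fij = 24 * eps * (2 * s6 * s6 - s6) / d2 * r
            f[i] += fij
            f[j] -= fij
    return f

def velocity_verlet(pos, vel, dt, mass=1.0, steps=100):
    f = lj_forces(pos)
    for _ in range(steps):
        vel += 0.5 * dt * f / mass      # half kick
        pos += dt * vel                 # drift
        f = lj_forces(pos)
        vel += 0.5 * dt * f / mass      # half kick with updated forces
    return pos, vel

pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])    # two atoms, reduced units
vel = np.zeros_like(pos)
pos, vel = velocity_verlet(pos, vel, dt=0.005)
print(pos)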
Beowulf Distributed Processing and the United States Geological Survey
Maddox, Brian G.
2002-01-01
Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing technology. It will describe the benefits of the technology. Real data about a distributed application will be presented as an example of the benefits that this technology can bring to USGS scientific programs. Finally, some of the issues with distributed processing that relate to USGS work will be discussed.
Comparison of Building Energy Modeling Programs: Building Loads
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Dandan; Hong, Tianzhen; Yan, Da
This technical report presented the methodologies, processes, and results of comparing three Building Energy Modeling Programs (BEMPs) for load calculations: EnergyPlus, DeST and DOE-2.1E. This joint effort, between Lawrence Berkeley National Laboratory, USA and Tsinghua University, China, was part of research projects under the US-China Clean Energy Research Center on Building Energy Efficiency (CERC-BEE). Energy Foundation, an industrial partner of CERC-BEE, was the co-sponsor of this study work. It is widely known that large discrepancies in simulation results can exist between different BEMPs. The result is a lack of confidence in building simulation amongst many users and stakeholders. In the fields of building energy code development and energy labeling programs where building simulation plays a key role, there are also confusing and misleading claims that some BEMPs are better than others. In order to address these problems, it is essential to identify and understand differences between widely-used BEMPs, and the impact of these differences on load simulation results, by detailed comparisons of these BEMPs from source code to results. The primary goal of this work was to research methods and processes that would allow a thorough scientific comparison of the BEMPs. The secondary goal was to provide a list of strengths and weaknesses for each BEMP, based on in-depth understandings of their modeling capabilities, mathematical algorithms, advantages and limitations. This is to guide the use of BEMPs in the design and retrofit of buildings, especially to support China's building energy standard development and energy labeling program. The research findings could also serve as a good reference to improve the modeling capabilities and applications of the three BEMPs. The methodologies, processes, and analyses employed in the comparison work could also be used to compare other programs. The load calculation method of each program was analyzed and compared to identify the differences in solution algorithms, modeling assumptions and simplifications. Identifying inputs of each program and their default values or algorithms for load simulation was a critical step. These tend to be overlooked by users, but can lead to large discrepancies in simulation results. As weather data was an important input, weather file formats and weather variables used by each program were summarized. Some common mistakes in the weather data conversion process were discussed. ASHRAE Standard 140-2007 tests were carried out to test the fundamental modeling capabilities of the load calculations of the three BEMPs, where inputs for each test case were strictly defined and specified. The tests indicated that the cooling and heating load results of the three BEMPs fell mostly within the range of spread of results from other programs. Based on ASHRAE 140-2007 test results, the finer differences between DeST and EnergyPlus were further analyzed by designing and conducting additional tests. Potential key influencing factors (such as internal gains, air infiltration, convection coefficients of windows and opaque surfaces) were added one at a time to a simple base case with an analytical solution, to compare their relative impacts on load calculation results. Finally, special tests were designed and conducted aiming to ascertain the potential limitations of each program to perform accurate load calculations. The heat balance module was tested for both single and double zone cases. Furthermore, cooling and heating load calculations were compared between the three programs by varying the heat transfer between adjacent zones, the occupancy of the building, and the air-conditioning schedule.
Cognitive Learning Bias of College Students in an Aviation Program
DOT National Transportation Integrated Search
1996-01-01
Students are attracted to university aviation programs for a number of reasons. How well they learn from instruction in a classroom, an airplane, a simulator or in other environments is impacted by their ability to react to stimuli and to process dif...
Projected 2050 Model Simulations for the Chesapeake Bay Program
The Chesapeake Bay Program as has been tasked with assessing how changes in climate systems are expected to alter key variables and processes within the Watershed in concurrence with land use changes. EPA’s Office of Research and Development will be conducting historic and...
NASA Technical Reports Server (NTRS)
Ogburn, Marilyn E.; Foster, John V.; Hoffler, Keith D.
2005-01-01
This paper reviews the use of piloted simulation at Langley Research Center as part of the NASA High-Angle-of-Attack Technology Program (HATP), which was created to provide concepts and methods for the design of advanced fighter aircraft. A major research activity within this program is the development of the design processes required to take advantage of the benefits of advanced control concepts for high-angle-of-attack agility. Fundamental methodologies associated with the effective use of piloted simulation for this research are described, particularly those relating to the test techniques, validation of the test results, and design guideline/criteria development.
Simulation Model Development for Icing Effects Flight Training
NASA Technical Reports Server (NTRS)
Barnhart, Billy P.; Dickes, Edward G.; Gingras, David R.; Ratvasky, Thomas P.
2003-01-01
A high-fidelity simulation model for icing effects flight training was developed from wind tunnel data for the DeHavilland DHC-6 Twin Otter aircraft. First, a flight model of the un-iced airplane was developed and then modifications were generated to model the icing conditions. The models were validated against data records from the NASA Twin Otter Icing Research flight test program with only minimal refinements being required. The goals of this program were to demonstrate the effectiveness of such a simulator for training pilots to recognize and recover from icing situations and to establish a process for modeling icing effects to be used for future training devices.
A Monte-Carlo maplet for the study of the optical properties of biological tissues
NASA Astrophysics Data System (ADS)
Yip, Man Ho; Carvalho, M. J.
2007-12-01
Monte-Carlo simulations are commonly used to study complex physical processes in various fields of physics. In this paper we present a Maple program intended for Monte-Carlo simulations of photon transport in biological tissues. The program has been designed so that the input data and output display can be handled by a maplet (an easy and user-friendly graphical interface), named the MonteCarloMaplet. A thorough explanation of the programming steps and how to use the maplet is given. Results obtained with the Maple program are compared with corresponding results available in the literature. Program summaryProgram title:MonteCarloMaplet Catalogue identifier:ADZU_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADZU_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.:3251 No. of bytes in distributed program, including test data, etc.:296 465 Distribution format: tar.gz Programming language:Maple 10 Computer: Acer Aspire 5610 (any running Maple 10) Operating system: Windows XP professional (any running Maple 10) Classification: 3.1, 5 Nature of problem: Simulate the transport of radiation in biological tissues. Solution method: The Maple program follows the steps of the C program of L. Wang et al. [L. Wang, S.L. Jacques, L. Zheng, Computer Methods and Programs in Biomedicine 47 (1995) 131-146]; The Maple library routine for random number generation is used [Maple 10 User Manual c Maplesoft, a division of Waterloo Maple Inc., 2005]. Restrictions: Running time increases rapidly with the number of photons used in the simulation. Unusual features: A maplet (graphical user interface) has been programmed for data input and output. Note that the Monte-Carlo simulation was programmed with Maple 10. If attempting to run the simulation with an earlier version of Maple, appropriate modifications (regarding typesetting fonts) are required and once effected the worksheet runs without problem. However some of the windows of the maplet may still appear distorted. Running time: Depends essentially on the number of photons used in the simulation. Elapsed times for particular runs are reported in the main text.
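The core loop of such a photon-transport simulation, following the general scheme of the Wang et al. code on which the maplet is based, is short. The sketch below simplifies aggressively (isotropic scattering instead of Henyey-Greenstein, infinite medium, no boundaries or roulette) and uses invented optical coefficients:

# Compact photon random walk with absorption weighting (illustrative only).
import math, random

mu_a, mu_s = 0.1, 10.0             # absorption/scattering coefficients (1/cm), invented
mu_t = mu_a + mu_s

def track_photon():
    # Direction is re-sampled uniformly each step, which is exact for isotropic
    # scattering; collimated launch and weight roulette are omitted for brevity.
    x = y = z = 0.0
    w = 1.0                        # photon packet weight
    while w > 1e-4:
        s = -math.log(1.0 - random.random()) / mu_t   # sampled free path length
        cos_t = 2.0 * random.random() - 1.0
        phi = 2.0 * math.pi * random.random()
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        x += s * sin_t * math.cos(phi)
        y += s * sin_t * math.sin(phi)
        z += s * cos_t
        w *= mu_s / mu_t           # deposit the absorbed fraction of the weight
    return x, y, z

print(track_photon())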
"Intelligent" Computer Assisted Instruction (CAI) Applications. Interim Report.
ERIC Educational Resources Information Center
Brown, John Seely; And Others
Interim work is documented describing efforts to modify computer techniques used to recognize and process English language requests to an instructional simulator. The conversion from a hand-coded to a table-driven technique is described in detail. Other modifications to a simulation-based computer assisted instruction program to allow a gaming…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moryakov, A. V., E-mail: sailor@yauza.ru; Pylyov, S. S.
This paper presents the formulation of the problem and the methodical approach for solving large systems of linear differential equations describing nonstationary processes with the use of CUDA technology; this approach is implemented in the ANGEL program. Results for a test problem on transport of radioactive products over loops of a nuclear power plant are given. The possibilities for the use of the ANGEL program for solving various problems that simulate arbitrary nonstationary processes are discussed.
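Stripped of scale and GPU specifics, the underlying mathematical task is the solution of x'(t) = A x(t); for a small transfer chain it can be written directly with a matrix exponential. The coefficients below are invented, not plant data:

# The core mathematical task in miniature: a 3-compartment transport/decay chain.
import numpy as np
from scipy.linalg import expm

A = np.array([[-0.30,  0.00, 0.0],     # loop section loses to the filter
              [ 0.30, -0.05, 0.0],     # filter gains, then decays/transfers
              [ 0.00,  0.05, 0.0]])    # sink accumulates
x0 = np.array([1.0, 0.0, 0.0])         # initial inventory in compartment 1

for t in (1.0, 10.0, 100.0):
    print(t, expm(A * t) @ x0)         # exact solution x(t) = exp(A t) x0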
NASA Astrophysics Data System (ADS)
Profumieri, A.; Bonell, C.; Catalfamo, P.; Cherniz, A.
2016-04-01
Virtual reality has been proposed for different applications, including the evaluation of new control strategies and training protocols for upper limb prostheses and for the study of new rehabilitation programs. In this study, a lower limb simulation environment commanded by surface electromyography signals is evaluated. The time delays generated by the acquisition and processing stages for the signals that would command the knee joint, were measured and different acquisition windows were analysed. The subjective perception of the quality of simulation was also evaluated when extra delays were added to the process. The results showed that the acquisition window is responsible for the longest delay. Also, the basic implemented processes allowed for the acquisition of three signal channels for commanding the simulation. Finally, the communication between different applications is arguably efficient, although it depends on the amount of data to be sent.
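The dominant delay the authors attribute to the acquisition window has a simple back-of-envelope form: an N-sample sliding window at sampling rate fs contributes a group delay of about (N - 1)/(2 fs). The sketch below tabulates this for a few window lengths; the values are illustrative, and measured delays also include processing and communication:

# Back-of-envelope group delay of a sliding analysis window (e.g., moving RMS).
fs = 1000.0                     # sampling rate, Hz (typical for surface EMG)
for window_ms in (50, 100, 200, 300):
    n = int(fs * window_ms / 1e3)
    delay_ms = (n - 1) / (2 * fs) * 1e3
    print(f"{window_ms} ms window -> ~{delay_ms:.0f} ms group delay")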
Optimal fabrication processes for unidirectional metal-matrix composites: A computational simulation
NASA Technical Reports Server (NTRS)
Saravanos, D. A.; Murthy, P. L. N.; Morel, M.
1990-01-01
A method is proposed for optimizing the fabrication process of unidirectional metal matrix composites. The temperature and pressure histories are optimized such that the residual microstresses of the composite at the end of the fabrication process are minimized and the material integrity throughout the process is ensured. The response of the composite during the fabrication is simulated based on a nonlinear micromechanics theory. The optimal fabrication problem is formulated and solved with non-linear programming. Application cases regarding the optimization of the fabrication cool-down phases of unidirectional ultra-high modulus graphite/copper and silicon carbide/titanium composites are presented.
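This formulation maps naturally onto a standard constrained nonlinear program: decision variables parameterize the temperature (and pressure) histories, the objective is the residual microstress measure, and material-integrity requirements become constraints. The sketch below shows only that structure, with an invented scalar surrogate standing in for the micromechanics model:

# Schematic of the NLP formulation only; the objective below is invented.
import numpy as np
from scipy.optimize import minimize

T_fab, T_room, n_knots = 900.0, 20.0, 6       # deg C, hypothetical

def residual_stress(T):                        # stand-in for the micromechanics response
    rates = np.diff(np.concatenate(([T_fab], T, [T_room])))
    return float(np.sum(rates ** 2))           # penalizes abrupt temperature drops

bounds = [(T_room, T_fab)] * n_knots           # box constraints for material integrity
T0 = np.linspace(T_fab, T_room, n_knots + 2)[1:-1]
opt = minimize(residual_stress, T0, bounds=bounds)
print(opt.x)                                   # optimized cool-down schedule knots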
NASA Technical Reports Server (NTRS)
Hutchens, Dale E.; Doan, Patrick A.; Boothe, Richard E.
1997-01-01
Bonding labs at both MSFC and the northern Utah production plant prepare bond test specimens which simulate or witness the production of NASA's Reusable Solid Rocket Motor (RSRM). The current process for preparing the bonding surfaces employs 1,1,1-trichloroethane vapor degreasing, which simulates the current RSRM process. Government regulations (e.g., the 1990 Amendments to the Clean Air Act) have mandated a production phase-out of a number of ozone depleting compounds (ODC), including 1,1,1-trichloroethane. In order to comply with these regulations, the RSRM Program is qualifying a spray-in-air (SIA) precision cleaning process using Brulin 1990, an aqueous blend of surfactants. Accordingly, surface preparation prior to bonding process simulation test specimens must reflect the new production cleaning process. The Bonding Lab Statistical Process Control (SPC) program monitors the progress of the lab and its capabilities, as well as certifies the bonding technicians, by periodically preparing D6AC steel tensile adhesion panels with EA-913NA epoxy adhesive using a standardized process. SPC methods are then used to ensure the process is statistically in control, thus producing reliable data for bonding studies, and to identify any problems which might develop. Since the specimen cleaning process is being changed, new SPC limits must be established. This report summarizes side-by-side testing of D6AC steel tensile adhesion witness panels and tapered double cantilevered beams (TDCBs) using both the current baseline vapor degreasing process and a lab-scale spray-in-air process. A Proceco 26-inch Typhoon dishwasher cleaned both tensile adhesion witness panels and TDCBs in a process which simulates the new production process. The tests were performed six times during 1995; subsequent statistical analysis of the data established new upper control limits (UCL) and lower control limits (LCL). The data also demonstrated that the new process was equivalent to the vapor degreasing process.
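The control-limit arithmetic behind such a re-baselining is simple; for periodic individual measurements an X-mR (individuals/moving-range) chart is the usual choice. The tensile strengths below are invented, not RSRM data:

# Individuals (X-mR) control limits of the kind used to re-baseline a lab process.
strengths = [5210, 5330, 5275, 5190, 5405, 5260]   # psi, six periodic tests (invented)

xbar = sum(strengths) / len(strengths)
mr = [abs(a - b) for a, b in zip(strengths[1:], strengths[:-1])]
mr_bar = sum(mr) / len(mr)                          # mean moving range
ucl = xbar + 2.66 * mr_bar                          # 2.66 = 3/d2, d2 = 1.128 for n = 2
lcl = xbar - 2.66 * mr_bar
print(f"center={xbar:.0f}  UCL={ucl:.0f}  LCL={lcl:.0f}")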
NASA Astrophysics Data System (ADS)
Fatimah, Siti; Setiawan, Wawan; Kusnendar, Jajang; Rasim, Junaeti, Enjun; Anggraeni, Ria
2017-05-01
Debriefing of pedagogical competence, in both theory and practice, is a requirement for prospective teachers and is delivered through micro-teaching and the teaching practice program. However, reports from partner schools stated that participants in the teaching practice program were not well prepared to implement learning in the classroom because this debriefing was lacking. In line with the development of information technology, it is now quite possible to deliver pedagogical-competence debriefing to prospective teachers through an application they can use anytime and anywhere. This study is one answer to the problem of unprepared teaching practice participants. It developed a teaching simulator: an application for learning simulation with animated film, intended to enhance the professional pedagogical competence of prospective teachers. With this teaching simulator, students preparing to become teachers can test their own pedagogic competence against learning models with varied student characteristics. The teaching simulator is equipped with features that allow users to explore the quality of the teaching techniques they employ in classroom teaching and learning activities. These features include the choice of approach, student character, learning materials, questioning techniques, discussion, and evaluation. The application makes it easy for prospective and practicing teachers to develop lessons for classroom practice, and its simulation models allow users to freely manage a lesson. Development of the application passed through stages that included needs assessment, design, coding, testing, revision, improvement, grading, and packaging. The application is also enriched with real instructional videos for comparison by the user. Two experts, a media expert and an education expert, judged the application feasible as an instrument for debriefing students who are prospective participants in the teaching practice program. Use of the application by such students showed significant increases in pedagogic competence. This study was presented at an international seminar and is in the process of publication in reputable international journals, and the application is in the process of copyright registration with the Ministry of Justice and Human Rights. Debriefing prospective teachers with the teaching simulator application can improve their mastery of pedagogy, give clear feedback, and allow repetition at any time.
CPAS Preflight Drop Test Analysis Process
NASA Technical Reports Server (NTRS)
Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.
2015-01-01
Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.
NASA Technical Reports Server (NTRS)
Burkhardt, Z.; Ramachandran, N.; Majumdar, A.
2017-01-01
Fluid transient analysis is important for the design of spacecraft propulsion systems to ensure structural stability of the system in the event of sudden closing or opening of a valve. The Generalized Fluid System Simulation Program (GFSSP), a general-purpose flow network code developed at NASA/MSFC, is capable of simulating pressure surges due to sudden opening or closing of a valve when thermodynamic properties of the real fluid are available for the entire range of simulation. Specifically, GFSSP needs an accurate representation of the pressure-density relationship in order to predict pressure surges during a fluid transient. Unfortunately, the available thermodynamic property programs such as REFPROP, GASP or GASPAK do not provide the thermodynamic properties of monomethylhydrazine (MMH). This paper illustrates the process used for building the customized table of state-variable properties and speed of sound that GFSSP requires for simulation, starting from the available property data. Good agreement was found between the simulations and measured data. This method can be adopted for modeling flow networks and systems with other fluids whose properties are not known in detail in order to obtain general technical insight. Rigorous code validation of this approach will be done and reported at a future date.
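Once the pressure-density relationship (equivalently, the speed of sound) is tabulated, a first-cut sanity check of a valve-closure surge is the Joukowsky relation, delta-p = rho * a * delta-v. The values below are rough stand-ins rather than vetted MMH properties:

# First-cut check of a valve-closure pressure surge via the Joukowsky relation.
rho = 870.0        # density, kg/m^3 (rough stand-in)
a = 1350.0         # speed of sound, m/s (rough stand-in)
dv = 3.0           # flow velocity arrested by sudden valve closure, m/s

dp = rho * a * dv  # Pa; upper bound for instantaneous closure
print(f"surge ~ {dp / 1e5:.1f} bar")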
Computational Simulation of Containment Influence on Defect Generation During Growth of GeSi
NASA Technical Reports Server (NTRS)
Motakef, Shariar; Yesilyurt, S.; Vujisic, L.
2001-01-01
This report contains results of theoretical work in conjunction with the NASA RDGS program. It is specifically focused on factors controlling the stability of detachment and the sensitivity of the detachment process to the processing and geometric parameters of the crystal growth process.
Numerical Propulsion System Simulation: An Overview
NASA Technical Reports Server (NTRS)
Lytle, John K.
2000-01-01
Implementing new technology in aerospace propulsion systems is becoming prohibitively expensive and time consuming. One of the main contributors to the high cost and lengthy time is the need to perform many large-scale hardware tests and the inability to integrate all appropriate subsystems early in the design process. The NASA Glenn Research Center is developing the technologies required to enable simulations of full aerospace propulsion systems in sufficient detail to resolve critical design issues early in the design process before hardware is built. This concept, called the Numerical Propulsion System Simulation (NPSS), is focused on the integration of multiple disciplines such as aerodynamics, structures and heat transfer with computing and communication technologies to capture complex physical processes in a timely and cost-effective manner. The vision for NPSS is to be a "numerical test cell" that enables full engine simulation overnight on cost-effective computing platforms. There are several key elements within NPSS that are required to achieve this capability: 1) clear data interfaces through the development and/or use of data exchange standards, 2) modular and flexible program construction through the use of object-oriented programming, 3) integrated multiple fidelity analysis (zooming) techniques that capture the appropriate physics at the appropriate fidelity for the engine systems, 4) multidisciplinary coupling techniques and finally 5) high performance parallel and distributed computing. The current state of development in these five areas focuses on air breathing gas turbine engines and is reported in this paper. However, many of the technologies are generic and can be readily applied to rocket based systems and combined cycles currently being considered for low-cost access-to-space applications. Recent accomplishments include: (1) the development of an industry-standard engine cycle analysis program and plug 'n play architecture, called NPSS Version 1; (2) a full engine simulation that combines a 3D low-pressure subsystem with a 0D high-pressure core simulation, demonstrating the ability to integrate analyses at different levels of detail and to aerodynamically couple components, the fan/booster and low-pressure turbine, through a 3D computational fluid dynamics simulation; (3) simulation of all of the turbomachinery in a modern turbofan engine on a parallel computing platform for rapid and cost-effective execution, a capability that can also be used to generate a full compressor map, requiring both design and off-design simulation; (4) three levels of coupling characterizing the multidisciplinary analysis under NPSS: loosely coupled, process coupled and tightly coupled. The loosely coupled and process coupled approaches require a common geometry definition to link CAD to analysis tools. The tightly coupled approach is currently validating the use of an arbitrary Lagrangian/Eulerian formulation for rotating turbomachinery. The validation includes both centrifugal and axial compression systems. The results of the validation will be reported in the paper. (5) The demonstration of significant computing cost/performance reduction for turbine engine applications using PC clusters. The NPSS Project is supported under the NASA High Performance Computing and Communications Program.
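The "zooming" idea can be made concrete with a toy component contract (hypothetical interfaces, not NPSS's actual API): any model honoring the same ports can be dropped into the cycle chain, whatever its fidelity:

# The zooming idea in miniature: components share a port contract, so a 0-D model
# can be swapped for a higher-fidelity stand-in inside the same cycle chain.
from typing import Protocol

class Compressor(Protocol):
    def run(self, p_in: float, t_in: float) -> tuple: ...

class Compressor0D:
    def __init__(self, pr=10.0, eta=0.85):
        self.pr, self.eta = pr, eta
    def run(self, p_in, t_in):
        # ideal-gas compression with gamma = 1.4, adiabatic efficiency eta
        t_out = t_in * (1 + (self.pr ** 0.2857 - 1) / self.eta)
        return p_in * self.pr, t_out

class CompressorZoomed:
    """Stand-in for a CFD-backed model honoring the same port contract."""
    def run(self, p_in, t_in):
        return 10.2 * p_in, t_in * 2.05       # pretend these came from a CFD sweep

def cycle(comp: Compressor, p0=101325.0, t0=288.15):
    return comp.run(p0, t0)

print(cycle(Compressor0D()), cycle(CompressorZoomed()))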
RENEW v3.2 user's manual, maintenance estimation simulation for Space Station Freedom Program
NASA Technical Reports Server (NTRS)
Bream, Bruce L.
1993-01-01
RENEW is a maintenance event estimation simulation program developed in support of the Space Station Freedom Program (SSFP). This simulation uses reliability and maintainability (R&M) and logistics data to estimate both average and time dependent maintenance demands. The simulation uses Monte Carlo techniques to generate failure and repair times as a function of the R&M and logistics parameters. The estimates are generated for a single type of orbital replacement unit (ORU). The simulation has been in use by the SSFP Work Package 4 prime contractor, Rocketdyne, since January 1991. The RENEW simulation gives closer estimates of performance since it uses a time dependent approach and depicts more factors affecting ORU failure and repair than steady state average calculations. RENEW gives both average and time dependent demand values. Graphs of failures over the mission period and yearly failure occurrences are generated. The average demand rate for the ORU over the mission period is also calculated. While RENEW displays the results in graphs, the results are also available in a data file for further use by spreadsheets or other programs. The process of using RENEW starts with keyboard entry of the R&M and operational data. Once entered, the data may be saved in a data file for later retrieval. The parameters may be viewed and changed after entry using RENEW. The simulation program runs the number of Monte Carlo simulations requested by the operator. Plots and tables of the results can be viewed on the screen or sent to a printer. The results of the simulation are saved along with the input data. Help screens are provided with each menu and data entry screen.
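The Monte Carlo core of such a tool reduces to repeatedly sampling failure times across the mission period. The sketch below shows that skeleton only (RENEW's repair and logistics modeling is far richer), with invented parameters:

# Minimal Monte Carlo estimate of ORU failure demands over a mission.
import random

mtbf_hours = 8000.0          # hypothetical ORU mean time between failures
mission_hours = 10 * 8760.0  # ten-year mission
trials = 10000

total = 0
for _ in range(trials):
    t, failures = 0.0, 0
    while True:
        t += random.expovariate(1.0 / mtbf_hours)   # exponential time to next failure
        if t > mission_hours:
            break
        failures += 1
    total += failures
print("average demands per mission:", total / trials)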
Computer modeling and simulators as part of university training for NPP operating personnel
NASA Astrophysics Data System (ADS)
Volman, M.
2017-01-01
This paper considers aspects of a program for training future nuclear power plant personnel developed by the NPP Department of Ivanovo State Power Engineering University. Computer modeling is used for numerical experiments on the kinetics of nuclear reactors in Mathcad. Simulation modeling is carried out on computer-based and full-scale simulators of a water-cooled power reactor to reproduce neutron-physical reactor measurements and the start-up and shutdown processes.
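A typical numerical experiment of this kind integrates the one-group point-kinetics equations; the sketch below recasts such a Mathcad exercise in Python with generic parameters:

# One-group point kinetics: n' = ((rho - beta)/Lambda) n + lambda c,
#                           c' = (beta/Lambda) n - lambda c.
import numpy as np

beta, Lam, lam = 0.0065, 1e-4, 0.08   # delayed fraction, generation time (s), decay (1/s)
rho = 0.001                            # inserted reactivity (generic, subprompt-critical)

def rhs(y):
    n, c = y
    return np.array([(rho - beta) / Lam * n + lam * c,
                     beta / Lam * n - lam * c])

y = np.array([1.0, beta / (Lam * lam)])   # start from precursor equilibrium
dt = 1e-4
for _ in range(int(5.0 / dt)):            # 5 s transient, forward Euler
    y = y + dt * rhs(y)
print("relative power after 5 s:", y[0])  # prompt jump plus slow period growth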
NASA Astrophysics Data System (ADS)
Dicker, R. J.
The main objective of this thesis is to describe the effect on cognition of the structure of CAL simulation programs used in science teaching. Four programs simulating a pond ecosystem were written so as to present a simulation model and to assist in cognition in different ways. Various clinically detailed methods of describing learning were developed and tried, including concept maps (which were found to be summative rather than formative descriptions of learning, and to be ambiguous) and hierarchical structures (which were found to be difficult to produce). From these concept maps and hierarchical structures I developed my Interaction Model of Learning, which can be used to describe the chronological events concerned with cognition. Using the Interaction Model, the nature of cognition and the effect that CAL program structure has on this process is described. Various scenarios are presented as a means of showing the possible effects of program structure on learning. Four forms of concept learning activity and their relationship to learning valid and alternative conceptions are described. The findings from the study are particularly related to the work of Driver (1983), Marton (1976) and Entwistle (1981).
Hand controller commonality evaluation process
NASA Technical Reports Server (NTRS)
Stuart, Mark A.; Bierschwale, John M.; Wilmington, Robert P.; Adam, Susan C.; Diaz, Manuel F.; Jensen, Dean G.
1990-01-01
A hand controller evaluation process has been developed to determine the appropriate hand controller configurations for supporting remotely controlled devices. These devices include remote manipulator systems (RMS), dexterous robots, and remotely-piloted free flyers. Standard interfaces were developed to evaluate six different hand controllers in three test facilities including dynamic computer simulations, kinematic computer simulations, and physical simulations. The hand controllers under consideration were six degree-of-freedom (DOF) position and rate minimaster and joystick controllers, and three-DOF rate controllers. Task performance data, subjective comments, and anthropometric data obtained during tests were used for controller configuration recommendations to the SSF Program.
Langevin, Christian D.; Shoemaker, W. Barclay; Guo, Weixing
2003-01-01
SEAWAT-2000 is the latest release of the SEAWAT computer program for simulation of three-dimensional, variable-density, transient ground-water flow in porous media. SEAWAT-2000 was designed by combining a modified version of MODFLOW-2000 and MT3DMS into a single computer program. The code was developed using the MODFLOW-2000 concept of a process, which is defined as "part of the code that solves a fundamental equation by a specified numerical method." SEAWAT-2000 contains all of the processes distributed with MODFLOW-2000 and also includes the Variable-Density Flow Process (as an alternative to the constant-density Ground-Water Flow Process) and the Integrated MT3DMS Transport Process. Processes may be active or inactive, depending on simulation objectives; however, not all processes are compatible. For example, the Sensitivity and Parameter Estimation Processes are not compatible with the Variable-Density Flow and Integrated MT3DMS Transport Processes. The SEAWAT-2000 computer code was tested with the common variable-density benchmark problems and also with problems representing evaporation from a salt lake and rotation of immiscible fluids.
The Role of Mental Models in Learning to Program.
ERIC Educational Resources Information Center
Pirolli, Peter L.; Anderson, John R.
This study reports two experiments which indicate that the processes of providing subjects with insightful representations of example programs and guiding subjects through an "ideal" problem solving strategy facilitate learning. A production system model (GRAPES) has been developed that simulates problem-solving and learning in the…
Process Control Migration of 50 LPH Helium Liquefier
NASA Astrophysics Data System (ADS)
Panda, U.; Mandal, A.; Das, A.; Behera, M.; Pal, Sandip
2017-02-01
Two helium liquefier/refrigerators are operational at VECC, one of which is dedicated to the Superconducting Cyclotron. The first helium liquefier, of 50 LPH capacity from Air Liquide, has already completed fifteen years of operation without any major trouble. This liquefier is controlled by a Eurotherm PC3000 PLC, which has been obsolete for roughly the last seven years. Though we can still manage to run the PLC system with existing spares, the risk of discontinued operation is always present due to the unavailability of spares. In order to eliminate this risk, an equivalent PLC control system based on the Siemens S7-300 was conceived. For smooth migration, all programming was done keeping the same field input and output interfaces, nomenclature and graphset. The new program is a mix of S7-300 Graph, STL and LAD languages. One-to-one program verification of the entire process graph was done manually, and the total program was run in simulation mode. A Matlab mathematical model was also used for plant control simulations, and EPICS-based SCADA was used for process monitoring. As of now the entire hardware and software is ready for direct replacement with minimum required setup time.
Development of weight/sizing design synthesis computer program. Volume 3: User Manual
NASA Technical Reports Server (NTRS)
Garrison, J. M.
1973-01-01
The user manual for the weight/sizing design synthesis program is presented. The program is applied to an analysis of the basic weight relationships for the space shuttle which contribute significant portions of the inert weight. The relationships measure the parameters of load, geometry, material, and environment. A verbal description of the processes simulated, data input procedures, output data, and values present in the program is included.
Robot graphic simulation testbed
NASA Technical Reports Server (NTRS)
Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.
1991-01-01
The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts.
Sub-half-micron contact window design with 3D photolithography simulator
NASA Astrophysics Data System (ADS)
Brainerd, Steve K.; Bernard, Douglas A.; Rey, Juan C.; Li, Jiangwei; Granik, Yuri; Boksha, Victor V.
1997-07-01
In state-of-the-art IC design and manufacturing, certain lithography layers have unique requirements. Latitudes and tolerances that apply to contacts and polysilicon gates are tight for such critical layers. Industry experts are discussing the most cost-effective ways to use feature-oriented equipment and materials already developed for these layers. Such requirements introduce new dimensions into the traditionally challenging task for the photolithography engineer when considering various combinations of multiple factors to optimize and control the process. In addition, he/she faces a rapidly increasing cost of experiments, limited time and scarce access to equipment to conduct them. All the reasons presented above support simulation as an ideal method to satisfy these demands. However, lithography engineers may be easily dissatisfied with a simulation tool when discovering disagreement between the simulation and experimental data. The problem is that several parameters used in photolithography simulation are very process specific. Calibration, i.e. matching experimental and simulation data using a specific set of procedures, allows one to effectively use the simulation tool. We present results of a simulation-based approach to optimize photolithography processes for sub-0.5 micron contact windows. Our approach consists of: (1) 3D simulation to explore different lithographic options, (2) calibration to a range of process conditions with extensive use of specifically developed optimization techniques. The choice of a 3D simulator is essential because of the 3D nature of the problem of contact window design. We use DEPICT 4.1. This program performs fast aerial image simulation as presented before. For 3D exposure the program uses an extension to three dimensions of the high numerical aperture model combined with Fast Fourier Transforms for maximum performance and accuracy. We use the Kim (U.C. Berkeley) model and the fast-marching Level Set method, respectively, for the calculation of resist development rates and resist surface movement during the development process. Calibration efforts were aimed at matching experimental results on contact windows obtained after exposure of a binary mask. Additionally, simulation was applied to conduct quantitative analysis of PSM design capabilities, optical proximity correction, and stepper parameter optimization. Extensive experiments covered exposure (ASML 5500/100D stepper), pre- and post-exposure bake and development (2.38% TMAH, puddle process) of JSR IX725D2G and TOK iP3500 photoresist films on 200 mm test wafers. 'Aquatar' was used as the top antireflective coating, and SEM pictures of developed patterns were analyzed and compared with simulation results for different values of defocus, exposure energy, numerical aperture and partial coherence.
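The simplest useful mental model of the aerial-image step is coherent imaging: low-pass the mask spectrum with a circular pupil of cutoff NA/lambda and square the field. The sketch below uses that idealization (far cruder than DEPICT's high-NA, partially coherent model) to show the corner rounding that makes contact windows hard:

# Idealized coherent aerial image of a square contact opening (illustrative only).
import numpy as np

n, pitch = 256, 2.0                      # grid points, field size in microns
mask = np.zeros((n, n))
mask[112:144, 112:144] = 1.0             # ~0.25 um square contact opening

f = np.fft.fftfreq(n, d=pitch / n)       # spatial frequencies, cycles/um
fx, fy = np.meshgrid(f, f)
cutoff = 0.5 / 0.365                     # coherent cutoff NA/lambda: NA=0.5, i-line
pupil = (fx ** 2 + fy ** 2) <= cutoff ** 2

field = np.fft.ifft2(np.fft.fft2(mask) * pupil)
image = np.abs(field) ** 2               # aerial image intensity
print(image.max(), (image > 0.3 * image.max()).sum())  # crude printed-area proxy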
Peng, Bo; Chen, Huann-Sheng; Mechanic, Leah E.; Racine, Ben; Clarke, John; Clarke, Lauren; Gillanders, Elizabeth; Feuer, Eric J.
2013-01-01
Summary: Many simulation methods and programs have been developed to simulate genetic data of the human genome. These data have been widely used, for example, to predict properties of populations retrospectively or prospectively according to mathematically intractable genetic models, and to assist the validation, statistical inference and power analysis of a variety of statistical models. However, owing to the differences in type of genetic data of interest, simulation methods, evolutionary features, input and output formats, terminologies and assumptions for different applications, choosing the right tool for a particular study can be a resource-intensive process that usually involves searching, downloading and testing many different simulation programs. Genetic Simulation Resources (GSR) is a website provided by the National Cancer Institute (NCI) that aims to help researchers compare and choose the appropriate simulation tools for their studies. This website allows authors of simulation software to register their applications and describe them with well-defined attributes, thus allowing site users to search and compare simulators according to specified features. Availability: http://popmodels.cancercontrol.cancer.gov/gsr. Contact: gsr@mail.nih.gov PMID:23435068
MPPhys—A many-particle simulation package for computational physics education
NASA Astrophysics Data System (ADS)
Müller, Thomas
2014-03-01
In a first course on classical mechanics, elementary physical processes like elastic two-body collisions, the mass-spring model, or the gravitational two-body problem are discussed in detail. The continuation to many-body systems, however, is deferred to graduate courses, although the underlying equations of motion are essentially the same and although there is strong motivation, for high-school students in particular, because of the use of particle systems in computer games. The missing link between the simple and the more complex problem is a basic introduction to solving the equations of motion numerically, which can be illustrated by means of the Euler method. The many-particle physics simulation package MPPhys offers a platform to experiment with simple particle simulations. The aim is to give a principal idea of how to implement many-particle simulations and how simulation and visualization can be combined for interactive visual explorations. Catalogue identifier: AERR_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERR_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 111327 No. of bytes in distributed program, including test data, etc.: 608411 Distribution format: tar.gz Programming language: C++, OpenGL, GLSL, OpenCL. Computer: Linux and Windows platforms with OpenGL support. Operating system: Linux and Windows. RAM: Source Code 4.5 MB Complete package 242 MB Classification: 14, 16.9. External routines: OpenGL, OpenCL Nature of problem: Integrate N-body simulations, mass-spring models Solution method: Numerical integration of N-body simulations, 3D rendering via OpenGL. Running time: Problem dependent
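The "missing link" integrator named above takes only a few lines. Here is an explicit Euler step applied to the gravitational two-body problem in normalized units; the radial drift it produces is exactly the teaching point:

# Explicit Euler integration of the relative two-body problem (GM set to 1).
import math

x, y, vx, vy = 1.0, 0.0, 0.0, 1.0        # circular-orbit initial conditions
dt = 0.01
for _ in range(1000):
    r3 = (x * x + y * y) ** 1.5
    ax, ay = -x / r3, -y / r3            # a = -GM r / |r|^3 with GM = 1
    x, y = x + dt * vx, y + dt * vy      # Euler update: position with old velocity...
    vx, vy = vx + dt * ax, vy + dt * ay  # ...then velocity with old acceleration
print(x, y, math.hypot(x, y))            # drift from radius 1 shows Euler's energy error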
1983-09-01
duplicate a continuous function on a digital computer, and thus the machine representation of the GMA is only a close approximation of the continuous … error process. Thus, the manner in which the GMA process is digitally replicated has an effect on the results of the simulation. The parameterization of …
(abstract) Generic Modeling of a Life Support System for Process Technology Comparisons
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.
A livability rating system of a high-rise housing and its computer simulation
NASA Astrophysics Data System (ADS)
Liu, Zhengrui
2017-09-01
To address the problems of choosing and purchasing housing in high-rise residential buildings, this paper considers the factors that affect livability and analyzes their relative importance through the Analytic Hierarchy Process (AHP), quantifying each factor on a 10-point scale. Accordingly, the paper puts forward a livability index for housing and validates the correctness of the indicators by simulating the housing-selection process with a computer program.
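As a hedged illustration of the AHP weighting step described above (the pairwise-comparison matrix and factor interpretation below are invented for this sketch, not taken from the paper), the priority weights are the normalized principal eigenvector of the comparison matrix:

```python
import numpy as np

# AHP: derive factor weights from a pairwise-comparison matrix via the
# principal eigenvector. The 3x3 matrix is illustrative only, e.g. comparing
# three livability factors such as lighting, noise, and floor height.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
n = A.shape[0]

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalized priority weights

ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
cr = ci / 0.58                         # consistency ratio (RI = 0.58 for n = 3)
print("weights:", w, "CR:", cr)        # CR < 0.1 indicates acceptable consistency
```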
NASA Technical Reports Server (NTRS)
Kerr, Andrew W.
1990-01-01
The utilization of advanced simulation technology in the development of the non-real-time MANPRINT design tools in the Army/NASA Aircrew-Aircraft Integration (A3I) program is described. A description is then given of the Crew Station Research and Development Facilities, the primary tool for the application of MANPRINT principles. The purpose of the A3I program is to develop a rational, predictive methodology for helicopter cockpit system design that integrates human factors engineering with other principles at an early stage in the development process, avoiding the high cost of previous system design methods. Enabling technologies such as the MIDAS work station are examined, and the potential of low-cost parallel-processing systems is indicated.
NASA Technical Reports Server (NTRS)
Follen, Gregory J.; Naiman, Cynthia G.
1999-01-01
The NASA Lewis Research Center is developing an environment for analyzing and designing aircraft engines: the Numerical Propulsion System Simulation (NPSS). NPSS will integrate multiple disciplines, such as aerodynamics, structure, and heat transfer, and will make use of numerical "zooming" on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS uses the latest computing and communication technologies to capture complex physical processes in a timely, cost-effective manner. The vision of NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Through the NASA/Industry Cooperative Effort agreement, NASA Lewis and industry partners are developing a new engine simulation called the National Cycle Program (NCP). NCP, which is the first step toward NPSS and is its initial framework, supports the aerothermodynamic system simulation process for the full life cycle of an engine. U.S. aircraft and airframe companies recognize NCP as the future industry standard common analysis tool for aeropropulsion system modeling. The estimated potential payoff for NCP is a $50 million/yr savings to industry through improved engineering productivity.
Bahreyni Toossi, M T; Moradi, H; Zare, H
2008-01-01
In this work, the general purpose Monte Carlo N-particle radiation transport computer code (MCNP-4C) was used for the simulation of X-ray spectra in diagnostic radiology. The electron's path in the target was followed until its energy was reduced to 10 keV. A user-friendly interface named 'diagnostic X-ray spectra by Monte Carlo simulation (DXRaySMCS)' was developed to facilitate the application of the MCNP-4C code for diagnostic radiology spectrum prediction. The program provides a user-friendly interface for: (i) modifying the MCNP input file, (ii) launching the MCNP program to simulate electron and photon transport and (iii) processing the MCNP output file to yield a summary of the results (relative photon number per energy bin). In this article, the development and characteristics of DXRaySMCS are outlined. As part of the validation process, output spectra for 46 diagnostic radiology system settings produced by DXRaySMCS were compared with the corresponding IPEM78 spectra. Generally, there is good agreement between the two sets of spectra. No statistically significant differences have been observed between the IPEM78 reported spectra and the simulated spectra generated in this study.
2016-09-01
...MTBCF: Mean Time Between Critical Failure; MIRV: Multiple Independently-targetable Reentry Vehicle; MK6LE: MK6 Guidance System Life Extension...The programs were the MK54 Lightweight Torpedo program, a Raytheon Radar program, and the Life Extension of the MK6 Guidance System (MK6LE) of the...activities throughout the later life-cycle phases. MBSE allowed the programs to manage the evolution of simulation capabilities, as well as to assess the...
3D Heart: a new visual training method for electrocardiographic analysis.
Olson, Charles W; Lange, David; Chan, Jack-Kang; Olson, Kim E; Albano, Alfred; Wagner, Galen S; Selvester, Ronald H S
2007-01-01
This new training method is based on developing a sound understanding of the sequence in which electrical excitation spreads through both the normal and the infarcted myocardium. The student is made aware of the cardiac electrical performance through a series of 3-dimensional pictures of the excitation process. The 3D Heart program contains a variety of activation simulations. Currently, the program enables the user to view the activation simulation for all of the following pathology examples: normal activation; large, medium, and small anterior myocardial infarction (MI); large, medium, and small posterolateral MI; and large, medium, and small inferior MI. Simulations relating to other cardiac abnormalities, such as bundle branch block, left ventricular hypertrophy, and fascicular block, are being developed as part of a National Institutes of Health (NIH) Phase 1 Small Business Innovation Research (SBIR) program.
ERIC Educational Resources Information Center
Irwin, Ruth E.
2013-01-01
It is not sufficient to just make changes in a nursing curriculum without a plan to evaluate the impact on program outcomes. This study sought to determine the outcomes of teaching the nursing process to Foundation of Nursing students in an Associate Degree Nursing program using a factorial design study. Four groups of students were taught the…
RMS active damping augmentation
NASA Technical Reports Server (NTRS)
Gilbert, Michael G.; Scott, Michael A.; Demeo, Martha E.
1992-01-01
The topics are presented in viewgraph form and include: RMS active damping augmentation; potential space station assembly benefits to CSI; LaRC/JSC bridge program; control law design process; Draper RMS simulator; MIMO acceleration control laws improve damping; potential load reduction benefit; DRS modified to model distributed accelerations; accelerometer location; Space Shuttle aft cockpit simulator; simulated shuttle video displays; SES test goals and objectives; and SES modifications to support RMS active damping augmentation.
Quench simulations for superconducting elements in the LHC accelerator
NASA Astrophysics Data System (ADS)
Sonnemann, F.; Schmidt, R.
2000-08-01
The design of the protection system for the superconducting elements in an accelerator such as the Large Hadron Collider (LHC), now under construction at CERN, requires a detailed understanding of the thermo-hydraulic and electrodynamic processes during a quench. A numerical program (SPQR - simulation program for quench research) has been developed to evaluate temperature and voltage distributions during a quench as a function of space and time. The quench process is simulated by approximating the heat balance equation with the finite difference method in the presence of variable cooling and powering conditions. The simulation predicts quench propagation along a superconducting cable, forced quenching with heaters, the impact of eddy currents induced by a magnetic field change, and heat transfer through an insulation layer into helium, an adjacent conductor or other material. The simulation studies allowed a better understanding of experimental quench data and were used for determining the adequate dimensioning and protection of the highly stabilised superconducting cables for connecting magnets (busbars), optimising the quench heater strip layout for the main magnets, and studying quench back by induced eddy currents in the superconductor. After the introduction of the theoretical approach, some applications of the simulation model for the LHC dipole and corrector magnets are presented and the outcome of the studies is compared with experimental data.
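A minimal sketch of the finite-difference approach described above (this is not SPQR; the material constants, current-sharing temperature, and Joule-heating term are illustrative assumptions): the 1D heat-balance equation is stepped explicitly, and the normal zone grows as conduction drives neighboring cells above the current-sharing temperature.

```python
import numpy as np

# Explicit finite-difference integration of a 1D heat-balance equation
# along a cable: C * dT/dt = k * d2T/dx2 + q_Joule(T).
# All material constants are illustrative, not LHC cable data.
nx, dx, dt = 200, 1e-2, 1e-5     # cells, spacing [m], step [s]; dt*k/(C*dx^2) <= 0.5
k, C = 1.0, 1.0                  # conductivity, volumetric heat capacity
T_cs = 9.0                       # current-sharing temperature [K]
T = np.full(nx, 4.2)             # helium bath temperature [K]
T[95:105] = 12.0                 # initial normal (quenched) zone

for step in range(50_000):
    lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
    q = np.where(T > T_cs, 5.0e3, 0.0)   # Joule heating where superconductivity is lost
    T += dt * (k * lap + q) / C
    T[0] = T[-1] = 4.2                   # cable ends clamped at bath temperature
```

Voltage development could be tracked alongside by integrating a temperature-dependent resistivity over the normal zone, which is the other half of what a quench code like SPQR reports.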
BASIC Simulation Programs; Volumes I and II. Biology, Earth Science, Chemistry.
ERIC Educational Resources Information Center
Digital Equipment Corp., Maynard, MA.
Computer programs which teach concepts and processes related to biology, earth science, and chemistry are presented. The seven biology problems deal with aspects of genetics, evolution and natural selection, gametogenesis, enzymes, photosynthesis, and the transport of material across a membrane. Four earth science problems concern climates, the…
Teaching About the Constitution.
ERIC Educational Resources Information Center
White, Charles S.
1988-01-01
Reviews "The U.S. Constitution Then and Now," a two-unit program using the integrated database and word processing capabilities of AppleWorks. For grades 7-12, the units simulate the constitutional convention and the principles of free speech and privacy. Concludes that with adequate time, the program can provide a potentially powerful…
A functional language approach in high-speed digital simulation
NASA Technical Reports Server (NTRS)
Ercegovac, M. D.; Lu, S.-L.
1983-01-01
A functional programming approach for a multi-microprocessor architecture is presented. The language, based on Backus FP, its intermediate form and the translation process are discussed and illustrated with an example. The approach allows performance analysis to be performed at a high level as an aid in program partitioning.
PCs: Key to the Future. Business Center Provides Sound Skills and Good Attitudes.
ERIC Educational Resources Information Center
Pay, Renee W.
1991-01-01
The Advanced Computing/Management Training Program at Jordan Technical Center (Sandy, Utah) simulates an automated office to teach five sets of skills: computer architecture and operating systems, word processing, data processing, communications skills, and management principles. (SK)
Accelerating Wright–Fisher Forward Simulations on the Graphics Processing Unit
Lawrie, David S.
2017-01-01
Forward Wright–Fisher simulations are powerful in their ability to model complex demography and selection scenarios, but suffer from slow execution on the Central Processor Unit (CPU), thus limiting their usefulness. However, the single-locus Wright–Fisher forward algorithm is exceedingly parallelizable, with many steps that are so-called “embarrassingly parallel,” consisting of a vast number of individual computations that are all independent of each other and thus capable of being performed concurrently. The rise of modern Graphics Processing Units (GPUs) and programming languages designed to leverage the inherent parallel nature of these processors have allowed researchers to dramatically speed up many programs that have such high arithmetic intensity and intrinsic concurrency. The presented GPU Optimized Wright–Fisher simulation, or “GO Fish” for short, can be used to simulate arbitrary selection and demographic scenarios while running over 250-fold faster than its serial counterpart on the CPU. Even modest GPU hardware can achieve an impressive speedup of over two orders of magnitude. With simulations so accelerated, one can not only do quick parametric bootstrapping of previously estimated parameters, but also use simulated results to calculate the likelihoods and summary statistics of demographic and selection models against real polymorphism data, all without restricting the demographic and selection scenarios that can be modeled or requiring approximations to the single-locus forward algorithm for efficiency. Further, as many of the parallel programming techniques used in this simulation can be applied to other computationally intensive algorithms important in population genetics, GO Fish serves as an exciting template for future research into accelerating computation in evolution. GO Fish is part of the Parallel PopGen Package available at: http://dl42.github.io/ParallelPopGen/. PMID:28768689
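For orientation, here is a serial sketch of the single-locus Wright-Fisher forward step that GO Fish parallelizes; the selection model, parameter values, and variable names are illustrative assumptions, not the package's API.

```python
import numpy as np

# Single-locus Wright-Fisher forward simulation with selection.
# Each generation: deterministic selection on the allele frequency,
# then binomial resampling of 2N gametes (genetic drift).
rng = np.random.default_rng(0)
N, s, generations = 1_000, 0.01, 500   # illustrative population size, selection, length
x = 0.05                               # initial derived-allele frequency

freqs = []
for g in range(generations):
    x_sel = x * (1 + s) / (1 + s * x)          # genic selection update
    x = rng.binomial(2 * N, x_sel) / (2 * N)   # drift: resample 2N gametes
    freqs.append(x)
```

Across many independent loci, the binomial draws are exactly the "embarrassingly parallel" computations the abstract describes: on a GPU each locus (or batch of loci) is updated concurrently.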
ERIC Educational Resources Information Center
Mitchell, Claudia
2010-01-01
Competency standards require baccalaureate nursing graduates to demonstrate knowledge, understanding, and the ability to solve complex problems. In an effort to achieve these program outcomes, educators seek empirical evidence related to the learning process and the effect of innovative teaching strategies, such as simulation, on the learner.…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-30
... population program HexSim. Though still at preliminary draft stage, population response simulations from this portion of the modeling process are available for public review by request from our office. These simulations do not estimate what will occur in the future, but provide comparative information on potential...
A computer program for the simulation of heat and moisture flow in soils
NASA Technical Reports Server (NTRS)
Camillo, P.; Schmugge, T. J.
1981-01-01
A computer program that simulates the flow of heat and moisture in soils is described. The space-time dependence of temperature and moisture content is described by a set of diffusion-type partial differential equations. The simulator uses a predictor/corrector to numerically integrate them, giving wetness and temperature profiles as a function of time. The simulator was used to generate solutions to diffusion-type partial differential equations for which analytical solutions are known. These equations include both constant and variable diffusivities, and both flux and constant concentration boundary conditions. In all cases, the simulated and analytic solutions agreed to within the error bounds which were imposed on the integrator. Simulations of heat and moisture flow under actual field conditions were also performed. Ground truth data were used for the boundary conditions and soil transport properties. The qualitative agreement between simulated and measured profiles is an indication that the model equations are reasonably accurate representations of the physical processes involved.
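A hedged sketch of the predictor/corrector idea the simulator uses, here applied to a constant-diffusivity moisture equation; the real model couples heat and moisture with variable diffusivities, and everything below (grid, diffusivity, boundary values) is simplified for illustration.

```python
import numpy as np

# Heun predictor/corrector step for the 1D diffusion equation
# d(theta)/dt = D * d2(theta)/dx2 with constant diffusivity D (illustrative).
def rhs(theta, D, dx):
    lap = np.zeros_like(theta)
    lap[1:-1] = (theta[2:] - 2 * theta[1:-1] + theta[:-2]) / dx**2
    return D * lap                      # boundary cells held fixed (Dirichlet)

nx, dx, dt, D = 100, 0.01, 2e-5, 1e-3
theta = np.full(nx, 0.10)               # initial volumetric wetness profile
theta[0] = 0.30                         # wet surface boundary condition

for step in range(5_000):
    f0 = rhs(theta, D, dx)
    pred = theta + dt * f0                               # predictor (Euler)
    theta = theta + 0.5 * dt * (f0 + rhs(pred, D, dx))   # corrector (trapezoidal)
```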
NASA Astrophysics Data System (ADS)
Cassidy, J.; Zheng, Z.; Xu, Y.; Betz, V.; Lilge, L.
2017-04-01
Background: The majority of de novo cancers are diagnosed in low- and middle-income countries, which often lack the resources to provide adequate therapeutic options. Non- or minimally invasive therapies such as Photodynamic Therapy (PDT) or photothermal therapies could become part of the overall treatment options in these countries. However, widespread acceptance is hindered by the current empirical training of surgeons in these optical techniques and a lack of easily usable treatment-optimizing tools. Methods: Based on the image processing program ITK-SNAP and the publicly available FullMonte light propagation software, a work plan is proposed that allows for personalized PDT treatment planning. Starting with contoured clinical CT or MRI images, followed by the generation of 3D tetrahedral models in silico, execution of the Monte Carlo simulation, and presentation of the 3D fluence rate (Φ [mW cm-2]) distribution, a treatment plan optimizing photon source placement is developed. Results: Allowing 1-2 days for the installation of the required programs, novices can generate their first fluence (H [J cm-2]) or Φ distribution in a matter of hours; this is reduced to tens of minutes with some training. Executing the photon simulation calculations is rapid and is not the performance-limiting step. The largest sources of error are uncertainties in the contouring and unknown tissue optical properties. Conclusions: The presented FullMonte simulation is the fastest tetrahedral-based photon propagation program and provides the basis for PDT treatment planning processes, enabling a faster proliferation of low-cost, minimally invasive personalized cancer therapies.
Anderson, Roberta; Armour, Elwood; Beeckler, Courtney; Briner, Valerie; Choflet, Amanda; Cox, Andrea; Fader, Amanda N; Hannah, Marie N; Hobbs, Robert; Huang, Ellen; Kiely, Marilyn; Lee, Junghoon; Morcos, Marc; McMillan, Paige E; Miller, Dave; Ng, Sook Kien; Prasad, Rashmi; Souranis, Annette; Thomsen, Robert; DeWeese, Theodore L; Viswanathan, Akila N
As a core component of a new gynecologic cancer radiation program, we envisioned, structured, and implemented a novel Interventional Radiation Oncology (IRO) unit and magnetic resonance (MR)-brachytherapy environment in an existing MR simulator. We describe the external and internal processes required over a 6-8 month time frame to develop a clinical and research program for gynecologic brachytherapy and to successfully convert an MR simulator into an IRO unit. Support of the institution and department resulted in conversion of an MR simulator to a procedural suite. Development of the MR gynecologic brachytherapy program required novel equipment, staffing, infrastructural development, and cooperative team development with anesthetists, nurses, therapists, physicists, and physicians to ensure a safe and functional environment. Creation of a separate IRO unit permitted a novel billing structure. The creation of an MR-brachytherapy environment in an MR simulator is feasible. Developing infrastructure includes several collaborative elements. Unique to the field of radiation oncology, formalizing the space as an Interventional Radiation Oncology unit permits a sustainable financial structure. Copyright © 2018 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
Comparison of cyclic correlation algorithm implemented in matlab and python
NASA Astrophysics Data System (ADS)
Carr, Richard; Whitney, James
Simulation is a necessary step for all engineering projects. Simulation gives engineers an approximation of how their devices will perform under different circumstances, without having to build, or before building, a physical prototype. This is especially true for space-bound devices, i.e., space communication systems, where the impact of system malfunction or failure is several orders of magnitude over that of terrestrial applications. Therefore, having a reliable simulation tool is key in developing these devices and systems. MathWorks Matrix Laboratory (MATLAB) is a matrix-based software package used by scientists and engineers to solve problems and perform complex simulations. MATLAB has applications in a wide variety of fields, including communications, signal processing, image processing, mathematics, economics, and physics. Because of its many uses, MATLAB has become the preferred software for many engineers; it is also very expensive, especially for students and startups. One alternative to MATLAB is Python. Python is a powerful, easy-to-use, open-source programming environment that can be used to perform many of the same functions as MATLAB, and it has been steadily gaining popularity in niche programming circles. While not as many functions are included in the base software as in MATLAB, many open-source functions have been developed and are available to download for free. This paper illustrates how Python can implement the cyclic correlation algorithm and compares the results to the cyclic correlation algorithm implemented in the MATLAB environment. Among the characteristics compared are the accuracy and precision of the results and the length of the programs. The paper demonstrates that Python is capable of performing simulations of complex algorithms such as cyclic correlation.
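The paper's own code is not reproduced here, but a standard way to implement cyclic (circular) correlation in Python is via the FFT cross-correlation identity, using only numpy; a MATLAB version would use fft/ifft the same way. The sequence lengths and test shift below are illustrative.

```python
import numpy as np

def cyclic_correlation(x, y):
    """Circular cross-correlation of two equal-length sequences via the FFT.

    Uses the correlation theorem: corr(x, y) = IFFT( conj(FFT(x)) * FFT(y) ).
    """
    X = np.fft.fft(x)
    Y = np.fft.fft(y)
    return np.fft.ifft(np.conj(X) * Y).real   # real inputs -> real correlation

# Example: correlate a sequence with a circularly shifted copy of itself;
# the peak index recovers the shift.
rng = np.random.default_rng(1)
x = rng.standard_normal(64)
y = np.roll(x, 10)
lag = np.argmax(cyclic_correlation(x, y))
print(lag)   # -> 10
```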
O'Hara, Susan
2014-01-01
Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. But the nurses' role as knowledge experts can be expanded to leading efforts to integrate the clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows for the optimal merge of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach, a standard, for integrating the sciences with real client data, to offer solutions for improving patient care.
ASTEP user's guide and software documentation
NASA Technical Reports Server (NTRS)
Gliniewicz, A. S.; Lachowski, H. M.; Pace, W. H., Jr.; Salvato, P., Jr.
1974-01-01
The Algorithm Simulation Test and Evaluation Program (ASTEP) is a modular computer program developed for the purpose of testing and evaluating methods of processing remotely sensed multispectral scanner earth resources data. ASTEP is written in FORTRAN V on the UNIVAC 1110 under the EXEC 8 operating system and may be operated in either a batch or interactive mode. The program currently contains over one hundred subroutines consisting of data classification and display algorithms, statistical analysis algorithms, utility support routines, and feature selection capability. The current program can accept data in LARSC1, LARSC2, ERTS, and Universal formats, and can output processed image or data tapes in Universal format.
Data processing for the DMSP microwave radiometer system
NASA Technical Reports Server (NTRS)
Rigone, J. L.; Stogryn, A. P.
1977-01-01
A software program was developed and tested to process microwave radiometry data to be acquired by the microwave sensor (SSM/T) on the Defense Meteorological Satellite Program spacecraft. The SSM/T 7-channel microwave radiometer and systems data will be data-linked to Air Force Global Weather Central (AFGWC) where they will be merged with ephemeris data prior to product processing for use in the AFGWC upper air data base (UADB). The overall system utilizes an integrated design to provide atmospheric temperature soundings for global applications. The fully automated processing at AFGWC was accomplished by four related computer processor programs to produce compatible UADB soundings, evaluate system performance, and update the a priori developed inversion matrices. Tests with simulated data produced results significantly better than climatology.
[Innovation in healthcare processes and patient safety using clinical simulation].
Rojo, E; Maestre, J M; Díaz-Mendi, A R; Ansorena, L; Del Moral, I
2016-01-01
Many excellent ideas are never implemented or generalised by healthcare organisations. There are two related paradigms: thinking that individuals primarily change through accumulating knowledge, and believing that the dissemination of that knowledge within the organisation is the key element to facilitate change. As an alternative, a description and evaluation of a simulation-based inter-professional team training program conducted in a Regional Health Service to promote and facilitate change is presented. The Department of Continuing Education completed the needs assessment using the proposals presented by clinical units and management. Skills and behaviors that could be learned using simulation were selected, and all personnel from the units participating were included. Experiential learning principles based on clinical simulation and debriefing, were used for the instructional design. The Kirkpatrick model was used to evaluate the program. Objectives included: a) decision-making and teamwork skills training in high prevalence diseases with a high rate of preventable complications; b) care processes reorganisation to improve efficiency, while maintaining patient safety; and, c) implementation of new complex techniques with a long learning curve, and high preventable complications rate. Thirty clinical units organised 39 training programs in the 3 public hospitals, and primary care of the Regional Health Service during 2013-2014. Over 1,559 healthcare professionals participated, including nursing assistants, nurses and physicians. Simulation in healthcare to train inter-professional teams can promote and facilitate change in patient care, and organisational re-engineering. Copyright © 2016 SECA. Publicado por Elsevier España, S.L.U. All rights reserved.
A real-time, dual processor simulation of the rotor system research aircraft
NASA Technical Reports Server (NTRS)
Mackie, D. B.; Alderete, T. S.
1977-01-01
A real-time, man-in-the-loop simulation of the rotor system research aircraft (RSRA) was conducted. The unique feature of this simulation was that two digital computers were used in parallel to solve the equations of the RSRA mathematical model. The design, development, and implementation of the simulation are documented. Program validation is discussed, and examples of data recordings are given. This simulation provided an important research tool for the RSRA project in terms of safe and cost-effective design analysis. In addition, valuable knowledge concerning parallel processing and a powerful simulation hardware and software system was gained.
Urbina-Villalba, German
2009-03-01
The first algorithm for Emulsion Stability Simulations (ESS) was presented at the V Conferencia Iberoamericana sobre Equilibrio de Fases y Diseño de Procesos [Luis, J.; García-Sucre, M.; Urbina-Villalba, G. Brownian Dynamics Simulation of Emulsion Stability In: Equifase 99. Libro de Actas, 1st Ed., Tojo J., Arce, A., Eds.; Solucion's: Vigo, Spain, 1999; Volume 2, pp. 364-369]. The former version of the program consisted of a minor modification of the Brownian Dynamics algorithm to account for the coalescence of drops. The present version of the program contains elaborate routines for time-dependent surfactant adsorption, average diffusion constants, and Ostwald ripening.
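In the spirit of the algorithm described (a Brownian Dynamics move plus a coalescence rule), here is a minimal sketch; the physical constants, drop sizes, and merge policy are illustrative assumptions, not the ESS implementation.

```python
import numpy as np

# One Brownian Dynamics move for a set of drops, followed by a coalescence
# check: overlapping drops merge, conserving volume. Constants illustrative.
rng = np.random.default_rng(2)
kT, eta, dt = 4.11e-21, 1e-3, 1e-6     # thermal energy [J], viscosity [Pa s], step [s]
pos = rng.uniform(0.0, 1e-5, (50, 3))  # drop centers [m]
radius = np.full(50, 1e-7)             # drop radii [m]

def bd_step(pos, radius):
    """Displace each drop by a free Brownian step (Stokes-Einstein diffusion)."""
    D = kT / (6.0 * np.pi * eta * radius)
    return pos + rng.standard_normal(pos.shape) * np.sqrt(2.0 * D * dt)[:, None]

def merge_first_overlap(pos, radius):
    """Merge the first overlapping pair found, conserving drop volume."""
    for i in range(len(radius)):
        for j in range(i + 1, len(radius)):
            if np.linalg.norm(pos[i] - pos[j]) < radius[i] + radius[j]:
                radius[i] = (radius[i]**3 + radius[j]**3) ** (1.0 / 3.0)
                pos[i] = 0.5 * (pos[i] + pos[j])
                return np.delete(pos, j, axis=0), np.delete(radius, j)
    return pos, radius

for step in range(1_000):
    pos = bd_step(pos, radius)
    pos, radius = merge_first_overlap(pos, radius)
```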
Sailfish: A flexible multi-GPU implementation of the lattice Boltzmann method
NASA Astrophysics Data System (ADS)
Januszewski, M.; Kostur, M.
2014-09-01
We present Sailfish, an open source fluid simulation package implementing the lattice Boltzmann method (LBM) on modern Graphics Processing Units (GPUs) using CUDA/OpenCL. We take a novel approach to GPU code implementation and use run-time code generation techniques and a high level programming language (Python) to achieve state of the art performance, while allowing easy experimentation with different LBM models and tuning for various types of hardware. We discuss the general design principles of the code, scaling to multiple GPUs in a distributed environment, as well as the GPU implementation and optimization of many different LBM models, both single component (BGK, MRT, ELBM) and multicomponent (Shan-Chen, free energy). The paper also presents results of performance benchmarks spanning the last three NVIDIA GPU generations (Tesla, Fermi, Kepler), which we hope will be useful for researchers working with this type of hardware and similar codes. Catalogue identifier: AETA_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AETA_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU Lesser General Public License, version 3 No. of lines in distributed program, including test data, etc.: 225864 No. of bytes in distributed program, including test data, etc.: 46861049 Distribution format: tar.gz Programming language: Python, CUDA C, OpenCL. Computer: Any with an OpenCL or CUDA-compliant GPU. Operating system: No limits (tested on Linux and Mac OS X). RAM: Hundreds of megabytes to tens of gigabytes for typical cases. Classification: 12, 6.5. External routines: PyCUDA/PyOpenCL, Numpy, Mako, ZeroMQ (for multi-GPU simulations), scipy, sympy Nature of problem: GPU-accelerated simulation of single- and multi-component fluid flows. Solution method: A wide range of relaxation models (LBGK, MRT, regularized LB, ELBM, Shan-Chen, free energy, free surface) and boundary conditions within the lattice Boltzmann method framework. Simulations can be run in single or double precision using one or more GPUs. Restrictions: The lattice Boltzmann method works for low Mach number flows only. Unusual features: The actual numerical calculations run exclusively on GPUs. The numerical code is built dynamically at run-time in CUDA C or OpenCL, using templates and symbolic formulas. The high-level control of the simulation is maintained by a Python process. Additional comments: !!!!! The distribution file for this program is over 45 Mbytes and therefore is not delivered directly when Download or Email is requested. Instead a html file giving details of how the program can be obtained is sent. !!!!! Running time: Problem-dependent, typically minutes (for small cases or short simulations) to hours (large cases or long simulations).
Transient state kinetics tutorial using the kinetics simulation program, KINSIM.
Wachsstock, D H; Pollard, T D
1994-01-01
This article provides an introduction to a computer tutorial on transient state kinetics. The tutorial uses our Macintosh version of the computer program, KINSIM, that calculates the time course of reactions. KINSIM is also available for other popular computers. This program allows even those investigators not mathematically inclined to evaluate the rate constants for the transitions between the intermediates in any reaction mechanism. These rate constants are one of the insights that are essential for understanding how biochemical processes work at the molecular level. The approach is applicable not only to enzyme reactions but also to any other type of process of interest to biophysicists, cell biologists, and molecular biologists in which concentrations change with time. In principle, the same methods could be used to characterize time-dependent, large-scale processes in ecology and evolution. Completion of the tutorial takes students 6-10 h. This investment is rewarded by a deep understanding of the principles of chemical kinetics and familiarity with the tools of kinetics simulation as an approach to solve everyday problems in the laboratory. PMID:7811941
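KINSIM itself is not reproduced here; the core calculation it teaches, integrating mass-action rate equations for a reaction mechanism, looks like this sketch for E + S ⇌ ES → E + P (the rate constants and initial concentrations are illustrative).

```python
import numpy as np
from scipy.integrate import solve_ivp

# Time course of E + S <=> ES -> E + P from mass-action ODEs.
# Rate constants are illustrative, not from any published mechanism.
k1, k_1, k2 = 1e6, 10.0, 1.0          # [1/(M s)], [1/s], [1/s]

def mechanism(t, y):
    E, S, ES, P = y
    v_bind = k1 * E * S - k_1 * ES     # net association flux
    v_cat = k2 * ES                    # catalytic step
    return [-v_bind + v_cat, -v_bind, v_bind - v_cat, v_cat]

y0 = [1e-6, 1e-4, 0.0, 0.0]           # initial concentrations [M]
sol = solve_ivp(mechanism, (0.0, 50.0), y0, method="LSODA", dense_output=True)
print("P at t = 50 s:", sol.y[3, -1])
```

Fitting rate constants, as the tutorial teaches, then amounts to adjusting k1, k_1, and k2 until the simulated time course matches the measured one.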
Assembly-line Simulation Program
NASA Technical Reports Server (NTRS)
Chamberlain, Robert G.; Zendejas, Silvino; Malhotra, Shan
1987-01-01
Costs and profits estimated for models based on user inputs. Standard Assembly-line Manufacturing Industry Simulation (SAMIS) program generalized to be useful for production-line manufacturing companies. Provides accurate and reliable means of comparing alternative manufacturing processes. Used to assess impact of changes in financial parameters such as cost of resources and services, inflation rates, interest rates, tax policies, and required rate of return on equity. Most important capability is ability to estimate prices manufacturer would have to receive for its products to recover all costs of production and make specified profit. Written in TURBO PASCAL.
NASA Technical Reports Server (NTRS)
Nosenchuck, D. M.; Littman, M. G.
1986-01-01
The Navier-Stokes computer (NSC) has been developed for solving problems in fluid mechanics involving complex flow simulations that require more speed and capacity than provided by current and proposed Class VI supercomputers. The machine is a parallel processing supercomputer with several new architectural elements which can be programmed to address a wide range of problems meeting the following criteria: (1) the problem is numerically intensive, and (2) the code makes use of long vectors. A simulation of two-dimensional nonsteady viscous flows is presented to illustrate the architecture, programming, and some of the capabilities of the NSC.
Flow behavior in liquid molding
NASA Technical Reports Server (NTRS)
Hunston, D.; Phelan, F.; Parnas, R.
1992-01-01
The liquid molding (LM) process for manufacturing polymer composites with structural properties has the potential to significantly lower fabrication costs and increase production rates. LM includes both resin transfer molding and structural reaction injection molding. To achieve this potential, however, the underlying science base must be improved to facilitate effective process optimization and implementation of on-line process control. The National Institute of Standards and Technology (NIST) has a major program in LM that includes materials characterization, process simulation models, on-line process monitoring and control, and the fabrication of test specimens. The results of this program are applied to real parts through cooperative projects with industry. The key feature in the effort is a comprehensive and integrated approach to the processing science aspects of LM. This paper briefly outlines the NIST program and uses several examples to illustrate the work.
CUDAEASY - a GPU accelerated cosmological lattice program
NASA Astrophysics Data System (ADS)
Sainio, J.
2010-05-01
This paper presents, to the author's knowledge, the first graphics processing unit (GPU) accelerated program that solves the evolution of interacting scalar fields in an expanding universe. We present the implementation in NVIDIA's Compute Unified Device Architecture (CUDA) and compare the performance to other similar programs in chaotic inflation models. We report speedups between one and two orders of magnitude depending on the used hardware and software while achieving small errors in single precision. Simulations that used to last roughly one day to compute can now be done in hours and this difference is expected to increase in the future. The program has been written in the spirit of LATTICEEASY and users of the aforementioned program should find it relatively easy to start using CUDAEASY in lattice simulations. The program is available at http://www.physics.utu.fi/theory/particlecosmology/cudaeasy/ under the GNU General Public License.
Simulation of Planetary Formation using Python
NASA Astrophysics Data System (ADS)
Bufkin, James; Bixler, David
2015-03-01
A program to simulate planetary formation was developed in the Python programming language. The program consists of randomly placed and massed bodies surrounding a central massive object in order to approximate a protoplanetary disk. The orbits of these bodies are time-stepped, with accelerations, velocities and new positions calculated in each step. Bodies are allowed to merge if their disks intersect. Numerous parameters (orbital distance, masses, number of particles, etc.) were varied in order to optimize the program. The program uses an iterative difference equation approach to solve the equations of motion using a kinematic model. Conservation of energy and angular momentum are not specifically forced, but conservation of momentum is forced during the merging of bodies. The initial program was created in Visual Python (VPython) but the current intention is to allow for higher particle count and faster processing by utilizing PyOpenCl and PyOpenGl. Current results and progress will be reported.
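A hedged sketch of the approach the abstract describes, in plain Python with numpy (the original used VPython; the masses, radius scaling, and merge policy here are invented for illustration): a kinematic time step under the central body's gravity, with momentum-conserving merging.

```python
import numpy as np

# Kinematic N-body step around a central mass (units G*M = 1), with
# momentum-conserving merging when two bodies come within the sum of
# their effective radii. Mutual gravity between small bodies is neglected.
rng = np.random.default_rng(3)
n, dt = 100, 1e-3
r = rng.uniform(1.0, 3.0, n)
ang = rng.uniform(0.0, 2 * np.pi, n)
pos = np.stack([r * np.cos(ang), r * np.sin(ang)], axis=1)
vc = 1.0 / np.sqrt(r)                                   # near-circular orbital speeds
vel = np.stack([-vc * np.sin(ang), vc * np.cos(ang)], axis=1)
mass = rng.uniform(1e-6, 1e-5, n)

def step(pos, vel):
    acc = -pos / np.linalg.norm(pos, axis=1, keepdims=True) ** 3
    vel = vel + dt * acc
    return pos + dt * vel, vel

def merge(pos, vel, mass, radius_scale=1e-2):
    """Merge the first colliding pair, conserving mass and momentum."""
    rad = radius_scale * mass ** (1 / 3)
    for i in range(len(mass)):
        for j in range(i + 1, len(mass)):
            if np.linalg.norm(pos[i] - pos[j]) < rad[i] + rad[j]:
                m = mass[i] + mass[j]
                vel[i] = (mass[i] * vel[i] + mass[j] * vel[j]) / m
                pos[i] = (mass[i] * pos[i] + mass[j] * pos[j]) / m
                mass[i] = m
                return np.delete(pos, j, 0), np.delete(vel, j, 0), np.delete(mass, j)
    return pos, vel, mass

for _ in range(1_000):
    pos, vel = step(pos, vel)
    pos, vel, mass = merge(pos, vel, mass)
```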
Zhmurov, A; Dima, R I; Kholodov, Y; Barsegov, V
2010-11-01
Theoretical exploration of fundamental biological processes involving the forced unraveling of multimeric proteins, the sliding motion in protein fibers and the mechanical deformation of biomolecular assemblies under physiological force loads is challenging even for distributed computing systems. Using a Cα-based coarse-grained self organized polymer (SOP) model, we implemented the Langevin simulations of proteins on graphics processing units (SOP-GPU program). We assessed the computational performance of an end-to-end application of the program, where all the steps of the algorithm are running on a GPU, by profiling the simulation time and memory usage for a number of test systems. The ∼90-fold computational speedup on a GPU, compared with an optimized central processing unit program, enabled us to follow the dynamics in the centisecond timescale, and to obtain the force-extension profiles using experimental pulling speeds (v_f = 1-10 μm/s) employed in atomic force microscopy and in optical tweezers-based dynamic force spectroscopy. We found that the mechanical molecular response critically depends on the conditions of force application and that the kinetics and pathways for unfolding change drastically even upon a modest 10-fold increase in v_f. This implies that, to resolve accurately the free energy landscape and to relate the results of single-molecule experiments in vitro and in silico, molecular simulations should be carried out under the experimentally relevant force loads. This can be accomplished in reasonable wall-clock time for biomolecules of size as large as 10^5 residues using the SOP-GPU package. © 2010 Wiley-Liss, Inc.
WE-D-204-02: Errors and Process Improvements in Radiation Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fontenla, D.
2016-06-15
Speakers in this session will present an overview and details of a specific rotation or feature of their Medical Physics Residency Program that is particularly exceptional and noteworthy. The featured rotations include foundational topics executed with exceptional acumen and innovative educational rotations perhaps not commonly found in Medical Physics Residency Programs. A site-specific clinical rotation will be described, where the medical physics resident follows the physician and medical resident for two weeks into patient consultations, simulation sessions, target contouring sessions, planning meetings with dosimetry, patient follow up visits, and tumor boards, to gain insight into the thought processes of the radiation oncologist. An incident learning rotation will be described where the resident learns about and practices evaluating clinical errors and investigates process improvements for the clinic. The residency environment at a Canadian medical physics residency program will be described, where the training and interactions with radiation oncology residents is integrated. And the first month rotation will be described, where the medical physics resident rotates through the clinical areas including simulation, dosimetry, and treatment units, gaining an overview of the clinical flow and meeting all the clinical staff to begin the residency program. This session will be of particular interest to residency programs that are interested in adopting or adapting these curricular ideas into their programs and to residency candidates who want to learn about programs already employing innovative practices. Learning Objectives: To learn about exceptional and innovative clinical rotations or program features within existing Medical Physics Residency Programs. To understand how to adopt/adapt innovative curricular designs into your own Medical Physics Residency Program, if appropriate.
Generalize aerodynamic coefficient table storage, checkout and interpolation for aircraft simulation
NASA Technical Reports Server (NTRS)
Neuman, F.; Warner, N.
1973-01-01
The set of programs described has been used for rapidly introducing, checking out and very efficiently using aerodynamic tables in complex aircraft simulations on the IBM 360. The preprocessor program reads in tables with different names and dimensions and stores them on disc storage according to the specified dimensions. The tables are read in from IBM cards in a format which is convenient for reducing the data from the original graphs. During table processing, new auxiliary tables are generated which are required for table cataloging and for efficient interpolation. In addition, DIMENSION statements for the tables as well as READ statements are punched so that they may be used in other programs for readout of the data from disc without chance of programming errors. A quick data-checking graphical output for all tables is provided in a separate program.
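As an illustration of the interpolation step such packages perform (the breakpoints, table values, and function name below are invented; the original Fortran routines are not reproduced), bilinear interpolation in a two-dimensional aerodynamic coefficient table looks like this:

```python
import numpy as np

# Bilinear interpolation in a 2D aerodynamic coefficient table, e.g.
# CL(alpha, mach). Breakpoints and table values are illustrative only.
alpha_bp = np.array([0.0, 4.0, 8.0, 12.0])     # angle of attack [deg]
mach_bp = np.array([0.2, 0.5, 0.8])            # Mach number
CL = np.array([[0.10, 0.12, 0.15],             # CL[i, j] at (alpha_bp[i], mach_bp[j])
               [0.45, 0.48, 0.55],
               [0.80, 0.85, 0.95],
               [1.05, 1.10, 1.20]])

def lookup(alpha, mach):
    # Locate the bracketing cell, clamping to the table edges.
    i = np.clip(np.searchsorted(alpha_bp, alpha) - 1, 0, len(alpha_bp) - 2)
    j = np.clip(np.searchsorted(mach_bp, mach) - 1, 0, len(mach_bp) - 2)
    ta = (alpha - alpha_bp[i]) / (alpha_bp[i + 1] - alpha_bp[i])
    tm = (mach - mach_bp[j]) / (mach_bp[j + 1] - mach_bp[j])
    return ((1 - ta) * (1 - tm) * CL[i, j] + ta * (1 - tm) * CL[i + 1, j]
            + (1 - ta) * tm * CL[i, j + 1] + ta * tm * CL[i + 1, j + 1])

print(lookup(6.0, 0.65))
```

The auxiliary tables the abstract mentions serve the same purpose as the precomputed breakpoint arrays here: they let the lookup find the bracketing cell quickly at run time.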
NASA Technical Reports Server (NTRS)
Aster, R. W.; Chamberlain, R. G.; Zendejas, S. C.; Lee, T. S.; Malhotra, S.
1986-01-01
Company-wide or process-wide production simulated. Price Estimation Guidelines (IPEG) program provides simple, accurate estimates of prices of manufactured products. Simplification of SAMIS allows analyst with limited time and computing resources to perform greater number of sensitivity studies. Although developed for photovoltaic industry, readily adaptable to standard assembly-line type of manufacturing industry. IPEG program estimates annual production price per unit. IPEG/PC program written in TURBO PASCAL.
Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias
2011-10-01
Future multiscale and multiphysics models that support research into human disease, translational medical science, and treatment can utilize the power of high-performance computing (HPC) systems. We anticipate that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, Pthreads). The objective of this study is to compare the performance of such hybrid programming models when applied to the simulation of a realistic physiological multiscale model of the heart. Our results show that the hybrid models perform favorably when compared to an implementation using only the MPI and, furthermore, that OpenMP in combination with the MPI provides a satisfactory compromise between performance and code complexity. Having the ability to use threads within MPI processes enables the sophisticated use of all processor cores for both computation and communication phases. Considering that HPC systems in 2012 will have two orders of magnitude more cores than what was used in this study, we believe that faster than real-time multiscale cardiac simulations can be achieved on these systems.
METAGUI. A VMD interface for analyzing metadynamics and molecular dynamics simulations
NASA Astrophysics Data System (ADS)
Biarnés, Xevi; Pietrucci, Fabio; Marinelli, Fabrizio; Laio, Alessandro
2012-01-01
We present a new computational tool, METAGUI, which extends the VMD program with a graphical user interface that allows constructing a thermodynamic and kinetic model of a given process simulated by large-scale molecular dynamics. The tool is specially designed for analyzing metadynamics-based simulations. The huge amount of diverse structures generated during such a simulation is partitioned into a set of microstates (i.e. structures with similar values of the collective variables). Their relative free energies are then computed by a weighted-histogram procedure and the most relevant free energy wells are identified by diagonalization of the rate matrix followed by a committor analysis. This procedure leads to a convenient representation of the metastable states and long-time kinetics of the system which can be compared with experimental data. The tool allows the user to switch seamlessly between a collective variables space representation of microstates and their atomic structure representation, which greatly facilitates the set-up and analysis of molecular dynamics simulations. METAGUI is based on the output format of the PLUMED plugin, making it compatible with a number of different molecular dynamics packages like AMBER, NAMD, GROMACS and several others. The METAGUI source files can be downloaded from the PLUMED web site ( http://www.plumed-code.org). Program summary Program title: METAGUI Catalogue identifier: AEKH_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKH_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 117 545 No. of bytes in distributed program, including test data, etc.: 8 516 203 Distribution format: tar.gz Programming language: TK/TCL, Fortran Computer: Any computer with a VMD installation and capable of running an executable produced by a gfortran compiler Operating system: Linux, Unix OS-es RAM: 1 073 741 824 bytes Classification: 23 External routines: A VMD installation ( http://www.ks.uiuc.edu/Research/vmd/) Nature of problem: Extract thermodynamic data and build a kinetic model of a given process simulated by metadynamics or molecular dynamics simulations, and provide this information on a dual representation that allows navigating and exploring the molecular structures corresponding to each point along the multi-dimensional free energy hypersurface. Solution method: Graphical-user interface linked to VMD that clusters the simulation trajectories in the space of a set of collective variables and assigns each frame to a given microstate, determines the free energy of each microstate by a weighted histogram analysis method, and identifies the most relevant free energy wells (kinetic basins) by diagonalization of the rate matrix followed by a committor analysis. Restrictions: Input format files compatible with PLUMED and all the MD engines supported by PLUMED and VMD. Running time: A few minutes.
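The free-energy step has a simple core that can be sketched as a plain Boltzmann inversion of microstate populations; METAGUI's actual weighted-histogram procedure over biased metadynamics runs is more involved, and the counts below are invented for illustration.

```python
import numpy as np

# Relative free energies of microstates from their populations:
# F_i = -kB*T * ln(p_i), shifted so the most populated state sits at 0.
# The counts are illustrative cluster populations in collective-variable space.
kB_T = 0.593                                  # kcal/mol at ~298 K
counts = np.array([5200, 1300, 240, 60])      # frames assigned to each microstate
p = counts / counts.sum()
F = -kB_T * np.log(p)
F -= F.min()
print(F)   # free energies [kcal/mol] relative to the deepest well
```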
Coon, William F.
2003-01-01
A computer model of hydrologic and water-quality processes of the Irondequoit Creek basin in Monroe and Ontario Counties, N.Y., was developed during 2000-02 to enable water-resources managers to simulate the effects of future development and stormwater-detention basins on peak flows and water quality of Irondequoit Creek and its tributaries. The model was developed with the program Hydrological Simulation Program-Fortran (HSPF) such that proposed or hypothetical land-use changes and instream stormwater-detention basins could be simulated, and their effects on peak flows and loads of total suspended solids, total phosphorus, ammonia-plus-organic nitrogen, and nitrate-plus-nitrite nitrogen could be analyzed, through an interactive computer program known as Generation and Analysis of Model Simulation Scenarios for Watersheds (GenScn). This report is a user's manual written to guide the Irondequoit Creek Watershed Collaborative in (1) the creation of land-use and flow-detention scenarios for simulation by the HSPF model, and (2) the use of GenScn to analyze the results of these simulations. These analyses can, in turn, aid the group in making basin-wide water-resources-management decisions.
Coon, William F.
2008-01-01
A computer model of hydrologic and water-quality processes of the Onondaga Lake basin in Onondaga County, N.Y., was developed during 2003-07 to assist water-resources managers in making basin-wide management decisions that could affect peak flows and the water quality of tributaries to Onondaga Lake. The model was developed with the Hydrological Simulation Program-Fortran (HSPF) and was designed to allow simulation of proposed or hypothetical land-use changes, best-management practices (BMPs), and instream stormwater-detention basins such that their effects on flows and loads of suspended sediment, orthophosphate, total phosphorus, ammonia, organic nitrogen, and nitrate could be analyzed. Extreme weather conditions, such as intense storms and prolonged droughts, can be simulated through manipulation of the precipitation record. Model results obtained from different scenarios can then be compared and analyzed through an interactive computer program known as Generation and Analysis of Model Simulation Scenarios for Watersheds (GenScn). Background information on HSPF and GenScn is presented to familiarize the user with these two programs. Step-by-step examples are provided on (1) the creation of land-use, BMP, and stormflow-detention scenarios for simulation by the HSPF model, and (2) the analysis of simulation results through GenScn.
Neural Processing of Musical and Vocal Emotions Through Cochlear Implants Simulation.
Ahmed, Duha G; Paquette, Sebastian; Zeitouni, Anthony; Lehmann, Alexandre
2018-05-01
Cochlear implants (CIs) partially restore the sense of hearing in the deaf. However, the ability to recognize emotions in speech and music is reduced due to the implant's electrical signal limitations and the patient's altered neural pathways. Electrophysiological correlations of these limitations are not yet well established. Here we aimed to characterize the effect of CIs on auditory emotion processing and, for the first time, directly compare vocal and musical emotion processing through a CI-simulator. We recorded 16 normal hearing participants' electroencephalographic activity while listening to vocal and musical emotional bursts in their original form and in a degraded (CI-simulated) condition. We found prolonged P50 latency and reduced N100-P200 complex amplitude in the CI-simulated condition. This points to a limitation in encoding sound signals processed through CI simulation. When comparing the processing of vocal and musical bursts, we found a delay in latency with the musical bursts compared to the vocal bursts in both conditions (original and CI-simulated). This suggests that despite the cochlear implants' limitations, the auditory cortex can distinguish between vocal and musical stimuli. In addition, it adds to the literature supporting the complexity of musical emotion. Replicating this study with actual CI users might lead to characterizing emotional processing in CI users and could ultimately help develop optimal rehabilitation programs or device processing strategies to improve CI users' quality of life.
Computer simulation of stochastic processes through model-sampling (Monte Carlo) techniques.
Sheppard, C W.
1969-03-01
A simple Monte Carlo simulation program is outlined which can be used for the investigation of random-walk problems, for example in diffusion, or the movement of tracers in the blood circulation. The results given by the simulation are compared with those predicted by well-established theory, and it is shown how the model can be expanded to deal with drift, and with reflexion from or adsorption at a boundary.
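A minimal sketch of such a model-sampling program, with an absorbing boundary of the kind the article mentions; the walker count, step count, and boundary position are illustrative assumptions.

```python
import numpy as np

# 1D random walk with an absorbing boundary: a crude model of tracer
# movement of the kind the article outlines. All parameters illustrative.
rng = np.random.default_rng(4)
n_walkers, n_steps, absorb_at = 2_000, 500, 40

absorbed, final = 0, []
for _ in range(n_walkers):
    x = 0
    for _ in range(n_steps):
        x += rng.choice((-1, 1))
        if x >= absorb_at:            # adsorption at the boundary
            absorbed += 1
            break
    else:
        final.append(x)

# Sanity check against theory: when absorption is rare, the displacement
# variance of surviving walkers is close to n_steps.
print(absorbed / n_walkers, np.var(final))
```

Comparing such simulated statistics with the established diffusion result, exactly as the article does, is the standard way to validate the model before adding drift or reflecting boundaries.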
Software Design for Interactive Graphic Radiation Treatment Simulation Systems*
Kalet, Ira J.; Sweeney, Christine; Jacky, Jonathan
1990-01-01
We examine issues in the design of interactive computer graphic simulation programs for radiation treatment planning (RTP), as well as expert system programs that automate parts of the RTP process, in light of ten years of experience at designing, building and using such programs. An experiment in object-oriented design using standard Pascal shows that while some advantage is gained from the design, it is still difficult to achieve modularity and to integrate expert system components. A new design based on the Common LISP Object System (CLOS) is described. This series of designs for RTP software shows that this application benefits in specific ways from object-oriented design methods and appropriate languages and tools.
Simulation of ultra-high energy photon propagation with PRESHOWER 2.0
NASA Astrophysics Data System (ADS)
Homola, P.; Engel, R.; Pysz, A.; Wilczyński, H.
2013-05-01
In this paper we describe a new release of the PRESHOWER program, a tool for Monte Carlo simulation of propagation of ultra-high energy photons in the magnetic field of the Earth. The PRESHOWER program is designed to calculate magnetic pair production and bremsstrahlung and should be used together with other programs to simulate extensive air showers induced by photons. The main new features of the PRESHOWER code include a much faster algorithm applied in the procedures of simulating the processes of gamma conversion and bremsstrahlung, an update of the geomagnetic field model, and a minor correction. The new simulation procedure increases the flexibility of the code so that it can also be applied to other magnetic field configurations such as those encountered, for example, in the vicinity of the Sun or neutron stars. Program summary Program title: PRESHOWER 2.0 Catalog identifier: ADWG_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWG_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 3968 No. of bytes in distributed program, including test data, etc.: 37198 Distribution format: tar.gz Programming language: C, FORTRAN 77. Computer: Intel-Pentium based PC. Operating system: Linux or Unix. RAM: < 100 kB Classification: 1.1. Does the new version supersede the previous version?: Yes Catalog identifier of previous version: ADWG_v1_0 Journal reference of previous version: Comput. Phys. Comm. 173 (2005) 71 Nature of problem: Simulation of a cascade of particles initiated by a UHE photon in a magnetic field. Solution method: The primary photon is tracked until its conversion into an e+ e- pair. If conversion occurs, each individual particle in the resultant preshower is checked for either bremsstrahlung radiation (electrons) or secondary gamma conversion (photons). Reasons for new version: Slow and outdated algorithm in the old version (a significant speed up is possible); Extension of the program to allow simulations also for extraterrestrial magnetic field configurations (e.g. neutron stars) and very long path lengths. Summary of revisions: A veto algorithm was introduced in the gamma conversion and bremsstrahlung tracking procedures. The length of the tracking step is now variable along the track and depends on the probability of the process expected to occur. The new algorithm significantly reduces the number of tracking steps and speeds up the execution of the program. The geomagnetic field model has been updated to IGRF-11, allowing for interpolations up to the year 2015. Numerical Recipes procedures to calculate modified Bessel functions have been replaced with an open source CERN routine DBSKA. One minor bug has been fixed. Restrictions: Gamma conversion into particles other than an electron pair is not considered. Spatial structure of the cascade is neglected. Additional comments: The following routines are supplied in the package, IGRF [1, 2], DBSKA [3], ran2 [4] Running time: 100 preshower events with primary energy 10^20 eV require about 200 sec. of CPU time on a 2.66 GHz machine; at the energy of 10^21 eV, 600 sec.
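The "veto algorithm" named in the revision summary is an instance of the thinning technique for sampling interaction points under a position-dependent rate. A generic hedged sketch follows (the rate function, its bound, and the track length are invented; this is not the PRESHOWER code): candidate steps are drawn from the maximum rate, and each candidate is accepted with probability rate(x)/rate_max, so fewer, larger steps are taken where the rate is low.

```python
import numpy as np

# Veto (thinning) algorithm: sample the first interaction point of a
# process whose rate lambda(x) varies along the track, by drawing candidate
# steps from an upper bound LAMBDA_MAX and vetoing with lambda(x)/LAMBDA_MAX.
rng = np.random.default_rng(5)

def rate(x):
    """Illustrative position-dependent conversion rate [1/m]."""
    return 1e-3 * (1.0 + np.sin(x / 1e4) ** 2)

LAMBDA_MAX = 2e-3                    # upper bound on rate(x) over the track

def sample_conversion_point(x0=0.0, x_end=1e5):
    x = x0
    while x < x_end:
        x += rng.exponential(1.0 / LAMBDA_MAX)    # candidate interaction point
        if rng.random() < rate(x) / LAMBDA_MAX:   # veto test
            return x                              # accepted: conversion here
    return None                                   # photon escapes unconverted
```

The efficiency gain over fixed-step tracking is exactly what the revision claims: the number of evaluations scales with the integrated rate rather than with the track length divided by a small fixed step.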
Using a Microcomputer in the Teaching of Gas-Phase Equilibria: A Numerical Simulation.
ERIC Educational Resources Information Center
Hayward, Roger
1995-01-01
Describes a computer program that can model the equilibrium processes in the production of ammonia from hydrogen and nitrogen, sulfur trioxide from sulfur dioxide and oxygen, and the nitrogen dioxide-dinitrogen tetroxide equilibrium. Provides information about downloading the program ChemEquilibrium from the World Wide Web. (JRH)
Landscape analysis software tools
Don Vandendriesche
2008-01-01
Recently, several new computer programs have been developed to assist in landscape analysis. The "Sequential Processing Routine for Arraying Yields" (SPRAY) program was designed to run a group of stands with particular treatment activities to produce vegetation yield profiles for forest planning. SPRAY uses existing Forest Vegetation Simulator (FVS) software coupled...
NEFP Decision Process: "A Computer Simulation for Planning School Finance Programs." User Manual.
ERIC Educational Resources Information Center
Boardman, Gerald R.; And Others
The National Educational Finance Project has developed a computerized model designed to simulate the consequences of alternative decisions regarding the financing of public elementary and secondary education. This manual describes a user's orientation to that model. The model was designed as an operational prototype for States to use in a…
A Presidential Election Simulation: Creating a School-Wide Interdisciplinary Program
ERIC Educational Resources Information Center
Joyce, Helen M.
2008-01-01
Given the low turnout among younger voters, it is important to seek innovative ways of engaging students in the electoral process, and a presidential year like this one offers exciting opportunities for doing so. In this article, the author describes her experiences with a schoolwide project designed by herself and colleagues that simulates the…
Design and Simulation of an Electrothermal Actuator Based Rotational Drive
NASA Astrophysics Data System (ADS)
Beeson, Sterling; Dallas, Tim
2008-10-01
As a participant in the Micro and Nano Device Engineering (MANDE) Research Experience for Undergraduates program at Texas Tech University, I learned how MEMS devices operate and the limits of their operation. Using specialized AutoCAD-based design software and the ANSYS simulation program, I learned the MEMS fabrication process used at Sandia National Labs, the design limitations of this process, and the abilities and drawbacks of micro devices; finally, I redesigned a MEMS device called the Chevron Torsional Ratcheting Actuator (CTRA). Motion is achieved through electrothermal actuation. The chevron (bent-beam) actuators cause a ratcheting motion on top of a hub-less gear, so that as voltage is applied the CTRA spins. The applied voltage needs to be pulsed, and the frequency of the pulses determines the angular frequency of the device. The main objective was to design electromechanical structures capable of transforming the electrical signals into mechanical motion without overheating. The design was optimized using finite element analysis in ANSYS, allowing multi-physics simulations of our model system.
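If one assumes each drive pulse advances the gear by one ratchet tooth, the pulse-rate-to-rotation-rate relation is simple bookkeeping; the tooth count below is a made-up value for illustration, not the actual CTRA geometry:

```python
import math

def angular_speed(pulse_freq_hz: float, teeth_per_rev: int) -> float:
    """Angular speed in rad/s for a ratcheting actuator that advances
    one gear tooth per drive pulse (an assumed idealization)."""
    step_angle = 2.0 * math.pi / teeth_per_rev   # rad advanced per pulse
    return step_angle * pulse_freq_hz

# Hypothetical example: a 200-tooth gear driven at 500 Hz.
print(angular_speed(500.0, 200))   # ~15.7 rad/s, i.e. about 2.5 rev/s
```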
Power processing methodology. [computerized design of spacecraft electric power systems
NASA Technical Reports Server (NTRS)
Fegley, K. A.; Hansen, I. G.; Hayden, J. H.
1974-01-01
Discussion of the interim results of a program to investigate the feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems. The object of the total program is to develop a flexible engineering tool which will allow the power processor designer to effectively and rapidly assess and analyze the tradeoffs available by providing, in one comprehensive program, a mathematical model, an analysis of expected performance, simulation, and a comparative evaluation with alternative designs. This requires an understanding of electrical power source characteristics and the effects of load control, protection, and total system interaction.
NASA Technical Reports Server (NTRS)
Maine, R. E.; Iliff, K. W.
1980-01-01
A new formulation is proposed for the problem of parameter estimation of dynamic systems with both process and measurement noise. The formulation gives estimates that are maximum likelihood asymptotically in time. The means used to overcome the difficulties encountered by previous formulations are discussed. It is then shown how the proposed formulation can be efficiently implemented in a computer program. A computer program using the proposed formulation is available in a form suitable for routine application. Examples with simulated and real data are given to illustrate that the program works well.
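Formulations of this kind are commonly implemented by running a Kalman filter for each candidate parameter value and accumulating the log-likelihood of the filter innovations. A minimal scalar illustration of that idea (a toy model with known noise variances, not the flight-test program itself):

```python
import numpy as np

def neg_log_likelihood(a, y, q, r):
    """Innovations-form negative log-likelihood of data y under the
    scalar model x[k+1] = a*x[k] + w (var q), y[k] = x[k] + v (var r),
    evaluated by a Kalman filter."""
    x_hat, p, nll = 0.0, 1.0, 0.0
    for yk in y:
        s = p + r                      # innovation variance
        e = yk - x_hat                 # innovation
        nll += 0.5 * (np.log(2 * np.pi * s) + e * e / s)
        k = p / s                      # Kalman gain
        x_hat += k * e                 # measurement update
        p -= k * p
        x_hat = a * x_hat              # time update
        p = a * a * p + q
    return nll

rng = np.random.default_rng(0)
a_true, q, r, n = 0.9, 0.04, 0.01, 500
x, y = 0.0, []
for _ in range(n):                     # simulate noisy data
    x = a_true * x + rng.normal(0, np.sqrt(q))
    y.append(x + rng.normal(0, np.sqrt(r)))

grid = np.linspace(0.5, 0.99, 50)      # crude grid search over parameter a
a_ml = grid[np.argmin([neg_log_likelihood(a, y, q, r) for a in grid])]
print(f"ML estimate of a: {a_ml:.3f}")  # close to the true 0.9
```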
NASA Astrophysics Data System (ADS)
Vienhage, Paul; Barcomb, Heather; Marshall, Karel; Black, William A.; Coons, Amanda; Tran, Hien T.; Nguyen, Tien M.; Guillen, Andy T.; Yoh, James; Kizer, Justin; Rogers, Blake A.
2017-05-01
The paper describes the MATLAB (MathWorks) programs that were developed during the REU workshop1 to implement The Aerospace Corporation's Unified Game-based Acquisition Framework and Advanced Game-based Mathematical Framework (UGAF-AGMF) and its associated War-Gaming Engine (WGE) models. Each game can be played from the perspective of the Department of Defense Acquisition Authority (DAA) or of an individual contractor (KTR). The programs also implement Aerospace's optimum "Program and Technical Baseline (PTB) and associated acquisition" strategy, which combines low Total Ownership Cost (TOC) with innovative designs while still meeting warfighter needs. The paper also describes the Bayesian Acquisition War-Gaming approach, which uses Monte Carlo simulations (a numerical technique for accounting for uncertainty in decision making) to simulate the PTB development and acquisition processes, and details the procedure of the implementation and the interactions between the games.
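As a rough illustration of the Monte Carlo element only (the cost components, distributions, and budget below are invented for the sketch and are not Aerospace's WGE models):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000   # Monte Carlo trials

# Hypothetical total-ownership-cost model with three uncertain components.
develop = rng.triangular(80, 100, 150, N)      # $M, development
produce = rng.normal(200, 25, N)               # $M, production
sustain = rng.lognormal(np.log(120), 0.2, N)   # $M, sustainment

toc = develop + produce + sustain
budget = 450.0
print(f"mean TOC: {toc.mean():.1f} $M")
print(f"P(TOC <= budget): {(toc <= budget).mean():.2%}")
```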
Using Phun to Study ``Perpetual Motion'' Machines
NASA Astrophysics Data System (ADS)
Koreš, Jaroslav
2012-05-01
The concept of "perpetual motion" has a long history. The Indian astronomer and mathematician Bhaskara II (12th century) was the first person to describe a perpetual motion (PM) machine. An example of a 13th-century PM machine is shown in Fig. 1. Although the law of conservation of energy clearly implies the impossibility of PM construction, numerous PM proposals have been made over the centuries, incorporating ever more elements of modern science. It is possible to test a variety of PM machines in the classroom using a program called Phun2 or its commercial version Algodoo.3 The programs are designed to simulate physical processes, and we can easily simulate mechanical machines using them. They provide an intuitive graphical environment controlled with a mouse; a programming language is not needed. This paper describes simulations of four different (supposed) PM machines.4
NASA Technical Reports Server (NTRS)
Meng, J. C. S.; Thomson, J. A. L.
1975-01-01
A data analysis program constructed to assess LDV system performance, to validate the simulation model, and to test various vortex location algorithms is presented. Real or simulated Doppler spectra versus range and elevation are used, and the spatial distributions of various spectral moments or other spectral characteristics are calculated and displayed. Each of the real or simulated scans can be processed by one of three different procedures: simple frequency or wavenumber filtering, matched filtering, and deconvolution filtering. The final output is displayed as contour plots in an x-y coordinate system, as well as in the form of vortex tracks deduced from the maxima of the processed data. A detailed analysis of run number 1023 and run number 2023 is presented to demonstrate the data analysis procedure. Vortex tracks and system range resolutions are compared with theoretical predictions.
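Of the three procedures, matched filtering amounts to correlating each scan with a template of the expected signature. A small synthetic illustration of the principle (the Gaussian template and noise level are arbitrary, not the LDV system's actual parameters):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 512
# Assumed line shape of the signature: a Gaussian 33 samples wide.
template = np.exp(-0.5 * (np.arange(-16, 17) / 4.0) ** 2)

# Synthetic "spectrum vs. range" trace: one buried signature plus noise.
signal = np.zeros(n)
signal[200:233] += 0.5 * template        # signature centered near index 216
trace = signal + rng.normal(0, 0.3, n)

# Matched filter = correlation of the trace with the template.
out = np.correlate(trace, template, mode="same")
print("estimated signature location:", np.argmax(out))  # near 216
```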
NASA Technical Reports Server (NTRS)
Jones, D. W.
1971-01-01
The navigation and guidance process for the Jupiter, Saturn and Uranus planetary encounter phases of the 1977 Grand Tour interior mission was simulated. Reference approach navigation accuracies were defined and the relative information content of the various observation types were evaluated. Reference encounter guidance requirements were defined, sensitivities to assumed simulation model parameters were determined and the adequacy of the linear estimation theory was assessed. A linear sequential estimator was used to provide an estimate of the augmented state vector, consisting of the six state variables of position and velocity plus the three components of a planet position bias. The guidance process was simulated using a nonspherical model of the execution errors. Computation algorithms which simulate the navigation and guidance process were derived from theory and implemented into two research-oriented computer programs, written in FORTRAN.
Casino physics in the classroom
NASA Astrophysics Data System (ADS)
Whitney, Charles A.
1986-12-01
This article describes a seminar on the elements of probability and random processes that is computer centered and focuses on Monte Carlo simulations of processes such as coin flips, random walks on a lattice, and the behavior of photons and atoms in a gas. Representative computer programs are also described.
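For instance, the random-walk exercise reduces to a few lines; a sketch along the lines such a seminar might use (written in modern Python rather than the software of the era):

```python
import random

def walk_r2(steps: int) -> int:
    """Squared end-to-end distance of one 2-D lattice random walk."""
    x = y = 0
    for _ in range(steps):
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x, y = x + dx, y + dy
    return x * x + y * y

trials, steps = 5000, 100
mean_r2 = sum(walk_r2(steps) for _ in range(trials)) / trials
print(mean_r2)   # tends toward <r^2> = steps = 100 for a simple walk
```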
NASA Astrophysics Data System (ADS)
Webb, R. M.; Wolock, D. M.; Linard, J. I.; Wieczorek, M. E.
2004-12-01
Process-based flow and transport simulation models can help increase understanding of how hydrologic flow paths affect biogeochemical mixing and reactions in watersheds. This presentation describes the Water, Energy, and Biogeochemical Model (WEBMOD), a new model designed to simulate water and chemical transport in both pristine and agricultural watersheds. WEBMOD simulates streamflow using TOPMODEL algorithms and also simulates irrigation, canopy interception, snowpack, and tile-drain flow; these are important processes for successful multi-year simulations of agricultural watersheds. In addition, the hydrologic components of the model are linked to the U.S. Geological Survey's (USGS) geochemical model PHREEQC, so that solute chemistry for the hillslopes and streams is also computed. Model development, execution, and calibration take place within the USGS Modular Modeling System. WEBMOD is being validated at ten research watersheds. Five of these watersheds are nearly pristine and comprise the USGS Water, Energy, and Biogeochemical Budget (WEBB) Program field sites: Loch Vale, Colorado; Trout Lake, Wisconsin; Sleepers River, Vermont; Panola Mountain, Georgia; and the Luquillo Experimental Forest, Puerto Rico. The remaining five watersheds contain intensely cultivated fields being studied by the USGS National Water Quality Assessment Program: Merced River, California; Granger Drain, Washington; Maple Creek, Nebraska; Sugar Creek, Indiana; and Morgan Creek, Delaware. Model calibration improved understanding of observed variations in soil moisture, solute concentrations, and stream discharge at the five WEBB watersheds, and the model is now being set up to simulate processes at the five agricultural watersheds, which are completing their first year of data collection.
Java simulations of embedded control systems.
Farias, Gonzalo; Cervin, Anton; Arzén, Karl-Erik; Dormido, Sebastián; Esquembre, Francisco
2010-01-01
This paper introduces a new Open Source Java library suited for the simulation of embedded control systems. The library is based on the ideas and architecture of TrueTime, a Matlab toolbox devoted to this topic, and allows Java programmers to simulate the performance of control processes that run in a real-time environment. Such simulations can considerably improve the learning and design of multitasking real-time systems. The choice of Java considerably increases the usability of our library, because many educators already program in this language, and because the library can be easily used by Easy Java Simulations (EJS), a popular modeling and authoring tool that is increasingly used in the field of Control Education. EJS allows instructors, students, and researchers with less programming experience to create advanced interactive simulations in Java. The paper describes the ideas, implementation, and sample use of the new library, both for pure Java programmers and for EJS users. The JTT library and some examples are available online at http://lab.dia.uned.es/jtt.
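The kind of behavior such a kernel reproduces, a control task executing with a fixed period and finite input-output latency acting on a continuous plant, can be illustrated with a stripped-down loop. This sketch is not the JTT API, just the underlying idea, with all constants chosen for illustration:

```python
# Controller task with fixed period and computation latency, acting on a
# first-order plant dx/dt = -x + u integrated much faster than the period.
dt, period, latency = 0.001, 0.05, 0.02   # s
x, u, u_next = 0.0, 0.0, 0.0
setpoint, kp = 1.0, 2.0
t_release = 0.0          # next controller job release time
t_actuate = None         # when the newly computed output takes effect

for k in range(int(5.0 / dt)):
    t = k * dt
    if t >= t_release:                   # controller job released
        u_next = kp * (setpoint - x)     # control law (P only, for brevity)
        t_actuate = t + latency          # output delayed by computation time
        t_release += period
    if t_actuate is not None and t >= t_actuate:
        u, t_actuate = u_next, None      # actuator updated
    x += dt * (-x + u)                   # plant integration (explicit Euler)

print(f"output after 5 s: {x:.3f}")      # settles near kp/(1+kp) = 0.667
```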
Computer Simulation of Electron Positron Annihilation Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Y.
2003-10-02
With the launch of the Next Linear Collider drawing closer, there is a pressing need for physicists to develop a fully integrated computer simulation of the e+e- annihilation process at a center-of-mass energy of 1 TeV. A simulation program acts as the template for future experiments. Either new physics will be discovered, or current theoretical uncertainties will shrink due to more accurate higher-order radiative correction calculations. The existence of an efficient and accurate simulation will help us understand the new data and validate (or veto) some of the theoretical models developed to explain new physics. It should handle interfaces between different sectors of physics well, e.g., interactions happening at the parton level well above the QCD scale, which are described by perturbative QCD, and interactions happening at much lower energy scales, which combine partons into hadrons. It should also achieve competitive speed in real time as the complexity of the simulation increases. This thesis contributes some tools that will be useful for the development of such simulation programs. We begin our study with the development of a new Monte Carlo algorithm intended to perform efficiently in selecting weight-1 events when multiple parameter dimensions are strongly correlated. The algorithm first seeks to model the peaks of the distribution by features, adapting these features to the function using the EM algorithm. The representation of the distribution provided by these features is then improved using the VEGAS algorithm for the Monte Carlo integration. The two strategies mesh neatly into an effective multi-channel adaptive representation. We then present a new algorithm for the simulation of parton shower processes in high energy QCD. We want an algorithm which is free of negative weights, produces its output as a set of exclusive events, and whose total rate exactly matches the full Feynman amplitude calculation. Our strategy is to create the whole QCD shower as a tree structure generated by a multiple Poisson process. Working with the whole shower allows us to include correlations between gluon emissions from different sources. QCD destructive interference is controlled by the implementation of "angular ordering," as in the HERWIG Monte Carlo program. We discuss methods for systematic improvement of the approach to include higher-order QCD effects.
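Weight-1 ("unweighted") event selection is conventionally achieved by acceptance-rejection against a bound on the event weight; a toy illustration of that step alone (the density and its bound are invented, and the real algorithm's adaptive feature modeling is omitted):

```python
import math
import random

def density(x):
    """Toy differential distribution on [0, 1] with a sharp peak."""
    return 1.0 + 50.0 * math.exp(-((x - 0.3) ** 2) / 0.001)

W_MAX = 51.0   # safe upper bound on density(x) over [0, 1]

def unweighted_event():
    """Acceptance-rejection: every returned event carries weight 1,
    and accepted events follow density(x) exactly."""
    while True:
        x = random.random()
        if random.random() * W_MAX < density(x):
            return x

sample = [unweighted_event() for _ in range(10_000)]
# Most accepted events cluster near the peak at x = 0.3, as expected.
print(sum(1 for x in sample if abs(x - 0.3) < 0.05) / len(sample))
```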
NASA Astrophysics Data System (ADS)
Mani, N. J.; Waliser, D. E.; Jiang, X.
2014-12-01
While the boreal summer monsoon intraseasonal variability (BSISV) exerts a profound influence on the south Asian monsoon, the capability of present-day dynamical models to simulate and predict the BSISV is still limited. The global model evaluation project on the vertical structure and diabatic processes of the Madden-Julian Oscillation (MJO) is a joint venture, coordinated by the Working Group on Numerical Experimentation (WGNE) MJO Task Force and the GEWEX Atmospheric System Study (GASS) program, for assessing model deficiencies in simulating the ISV and for improving our understanding of the underlying processes. In this study the simulation of the northward-propagating BSISV is investigated in 26 climate models, with special focus on the vertical diabatic heating structure and clouds. Following parallel lines of inquiry to those the MJO Task Force has pursued with the eastward-propagating MJO, we utilize previously proposed and newly developed model performance metrics and process diagnostics and apply them to the global climate model simulations of the BSISV.
JIMM: the next step for mission-level models
NASA Astrophysics Data System (ADS)
Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.
2001-09-01
The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic way of representing simulation entities, its data analysis capability, and its robust configuration management process, JIMM can support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). An MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.
2009-04-22
Implementation Issues. Another RCIP implementation risk is program management burnout. The ACRI program manager specifically identified the potential...of burnout in his program management team due to the repeated, intense Integration phases. To investigate the possibility and severity of this risk to...the ACRI simulation. This suggests that the burnout risk will be larger for RCIP than it was for ACRI. Successfully implementing a sustainable RCIP
Barlow, Paul M.; Moench, Allen F.
2011-01-01
The computer program WTAQ simulates axial-symmetric flow to a well pumping from a confined or unconfined (water-table) aquifer. WTAQ calculates dimensionless or dimensional drawdowns that can be used with measured drawdown data from aquifer tests to estimate aquifer hydraulic properties. Version 2 of the program, which is described in this report, provides an alternative analytical representation of drainage to water-table aquifers from the unsaturated zone than that which was available in the initial versions of the code. The revised drainage model explicitly accounts for hydraulic characteristics of the unsaturated zone, specifically, the moisture retention and relative hydraulic conductivity of the soil. The revised program also retains the original conceptualizations of drainage from the unsaturated zone that were available with version 1 of the program to provide alternative approaches to simulate the drainage process. Version 2 of the program includes all other simulation capabilities of the first versions, including partial penetration of the pumped well and of observation wells and piezometers, well-bore storage and skin effects at the pumped well, and delayed drawdown response of observation wells and piezometers.
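For orientation, the confined, fully penetrating limiting case that codes like WTAQ generalize is the classical Theis solution, which fits in a few lines (the parameter values below are hypothetical, and scipy is assumed available):

```python
from math import pi
from scipy.special import exp1   # well function W(u) = E1(u)

def theis_drawdown(Q, T, S, r, t):
    """Drawdown s = Q/(4*pi*T) * W(u), with u = r^2*S/(4*T*t), for a
    fully penetrating well pumping a confined aquifer at constant rate Q.
    T is transmissivity, S storativity, r radial distance, t time."""
    u = r * r * S / (4.0 * T * t)
    return Q / (4.0 * pi * T) * exp1(u)

# Hypothetical aquifer test: Q = 0.01 m^3/s, T = 5e-3 m^2/s, S = 2e-4,
# observation well at r = 30 m, evaluated after one hour of pumping.
print(f"{theis_drawdown(0.01, 5e-3, 2e-4, 30.0, 3600.0):.3f} m")  # ~0.86 m
```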
ERIC Educational Resources Information Center
Bosley, Howard E.; And Others
"Video Processes Are Changing Teacher Education" by Howard Bosley (the first of five papers comprising this document) discusses the Multi-State Teacher Education Project (M-STEP) experimentation with media; it lists various uses of video processes, concentrating specifically on microteaching and the use of simulation and critical…
50 Years of Army Computing From ENIAC to MSRC
2000-09-01
processing capability. The scientific visualization program was started in 1984 to provide tools and expertise to help researchers graphically...and materials, forces modeling, nanoelectronics, electromagnetics and acoustics, signal and image processing, and simulation and modeling. The ARL...mechanical and electrical calculating equipment, punch card data processing equipment, analog computers, and early digital machines. Before beginning, we
Holistic Nursing Simulation: A Concept Analysis.
Cohen, Bonni S; Boni, Rebecca
2018-03-01
Simulation as a technology and holistic nursing care as a philosophy are two components within nursing programs that have merged during the process of knowledge and skill acquisition in the care of patients as whole beings. Simulation provides opportunities to apply knowledge and skill through the use of simulators, standardized patients, and virtual settings. Concerns have been raised about simulation regarding the integration of the nursing process and recognition of the totality of the human being. Though simulation is useful as a technology, the nursing profession places importance on patient care, drawing on knowledge, theories, and expertise in its delivery. There is a need to promptly and comprehensively define the concept of holistic nursing simulation to provide consistency and a basis for quality application within nursing curricula. This concept analysis uses Walker and Avant's approach to define holistic nursing simulation by identifying antecedents, consequences, and empirical referents. Incorporating the concept of holism and the practice of holistic nursing into simulation requires an analysis of the concept of holistic nursing simulation, developing a language and model that provide direction for educators in the design and development of holistic nursing simulation.
Change and administrative barriers: nurse educators' perceptions concerning the use of simulators.
Abell, Cathy H; Keaster, Ric
2012-01-01
The purpose of this descriptive correlational research study was twofold: to examine the adoption of simulators in the nursing classroom and the relationship between adoption and nurse educators' perceptions of established change strategies as followed by program administrators. The use of simulators in education is important and requires many nurse educators to change their current teaching strategies. Data were collected from a purposive population using a demographic questionnaire, the nursing practice questionnaire (NPQ), and the change process survey. The overall diffusion score, as measured by the NPQ, was 2.6. A statistically significant correlation was noted between level of use and the perception of established change strategies being followed (r = .340, p < .01). Findings indicate that nurse educators adopt simulators sometimes when appropriate. Administrators of nursing programs can enhance this change by using established change strategies.
Trace contaminant control simulation computer program, version 8.1
NASA Technical Reports Server (NTRS)
Perry, J. L.
1994-01-01
The Trace Contaminant Control Simulation computer program is a tool for assessing the performance of various process technologies for removing trace chemical contamination from a spacecraft cabin atmosphere. Included in the simulation are chemical and physical adsorption by activated charcoal, chemical adsorption by lithium hydroxide, absorption by humidity condensate, and low- and high-temperature catalytic oxidation. Means are provided for simulating regenerable as well as nonregenerable systems. The program provides an overall mass balance of chemical contaminants in a spacecraft cabin given specified generation rates. Removal rates are based on device flow rates specified by the user and calculated removal efficiencies based on cabin concentration and removal technology experimental data. Versions 1.0 through 8.0 are documented in NASA TM-108409. TM-108409 also contains a source file listing for version 8.0. Changes to version 8.0 are documented in this technical memorandum and a source file listing for the modified version, version 8.1, is provided. Detailed descriptions for the computer program subprograms are extracted from TM-108409 and modified as necessary to reflect version 8.1. Version 8.1 supersedes version 8.0. Information on a separate user's guide is available from the author.
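For a single well-mixed cabin and one removal device, the overall mass balance such a program maintains reduces to dC/dt = g/V - eta*Q*C/V. A toy integration of that balance (the numbers are invented, and the real program tracks many contaminants and several removal technologies at once):

```python
# Single-contaminant, single-device cabin mass balance (hypothetical values).
V   = 100.0    # cabin free volume, m^3
g   = 0.5      # contaminant generation rate, mg/h
Q   = 15.0     # device volumetric flow, m^3/h
eta = 0.9      # single-pass removal efficiency

dt, c = 0.01, 0.0                        # time step (h), concentration (mg/m^3)
for _ in range(int(200 / dt)):           # integrate 200 hours
    c += dt * (g / V - eta * Q * c / V)  # dC/dt = g/V - eta*Q*C/V

print(f"steady-state concentration ~ {c:.4f} mg/m^3")
print(f"analytic value g/(eta*Q)   = {g / (eta * Q):.4f} mg/m^3")
```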
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, J.; Mowrey, J.
1995-12-01
This report describes the design, development and testing of process controls for selected system operations in the Browns Ferry Nuclear Plant (BFNP) Reactor Water Cleanup System (RWCU) using a Computer Simulation Platform which simulates the RWCU System and the BFNP Integrated Computer System (ICS). This system was designed to demonstrate the feasibility of the soft control (video touch screen) of nuclear plant systems through an operator console. The BFNP Integrated Computer System, which has recently been installed at BFNP Unit 2, was simulated to allow for operator control functions of the modeled RWCU system. The BFNP Unit 2 RWCU system was simulated using the RELAP5 Thermal/Hydraulic Simulation Model, which provided the steady-state and transient RWCU process variables and simulated the response of the system to control system inputs. Descriptions of the hardware and software developed are included, and the testing and acceptance program and its results are detailed. A discussion of potential installation of an actual RWCU process control system in BFNP Unit 2 is included. Finally, this report contains a section on industry issues associated with installation of process control systems in nuclear power plants.
NASA Lunar Regolith Simulant Program
NASA Technical Reports Server (NTRS)
Edmunson, J.; Betts, W.; Rickman, D.; McLemore, C.; Fikes, J.; Stoeser, D.; Wilson, S.; Schrader, C.
2010-01-01
Lunar regolith simulant production is absolutely critical to returning man to the Moon. Regolith simulant is used to test hardware exposed to the lunar surface environment, simulate health risks to astronauts, practice in situ resource utilization (ISRU) techniques, and evaluate dust mitigation strategies. Lunar regolith simulant design, production process, and management is a cooperative venture between members of the NASA Marshall Space Flight Center (MSFC) and the U.S. Geological Survey (USGS). The MSFC simulant team is a satellite of the Dust group based at Glenn Research Center. The goals of the cooperative group are to (1) reproduce characteristics of lunar regolith using simulants, (2) produce simulants as cheaply as possible, (3) produce simulants in the amount needed, and (4) produce simulants to meet users' schedules.
NASA Astrophysics Data System (ADS)
Huang, Wei-Hsing
2017-04-01
The clay barrier plays a major role in the isolation of radioactive wastes in an underground repository. This paper investigates the resaturation behavior of the clay barrier, with emphasis on the coupled effects of heat and moisture in the buffer material in the near field of a repository during groundwater intrusion. A locally available clay named "Zhisin clay" and a standard bentonite material were adopted in the laboratory program. Water uptake tests were conducted on clay specimens compacted at various densities to simulate the intrusion of groundwater into the buffer material. Soil suction of the clay specimens was measured by psychrometers embedded in the specimens and by the vapor equilibrium technique conducted at varying temperatures. Using the soil water characteristic curve, an integration scheme was introduced to estimate the hydraulic conductivity of the unsaturated clay. The finite element program ABAQUS was then employed to carry out a numerical simulation of the saturation process in the near field of a repository. Results of the numerical simulation were validated using the degree-of-saturation profiles obtained from the water uptake tests on Zhisin clay. The numerical scheme was then extended to establish a model simulating the resaturation process after the closure of a repository. It was found that, due to the variation of suction and thermal conductivity with temperature in the clay barrier material, the calculated temperature field shows a reduction as a result of incorporating the hydro-properties in the calculations.
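Closed-form relations such as the Mualem-van Genuchten model are a common way to carry out this kind of unsaturated-conductivity estimation from the soil water characteristic curve; whether the authors used this particular form is not stated, so the sketch below is purely illustrative, with made-up parameters:

```python
def vg_relative_conductivity(se: float, m: float) -> float:
    """Mualem-van Genuchten relative hydraulic conductivity Kr as a
    function of effective saturation Se (0 < Se <= 1)."""
    return se ** 0.5 * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

# Hypothetical bentonite-like retention parameter: n = 1.4, m = 1 - 1/n.
m = 1.0 - 1.0 / 1.4
for se in (0.2, 0.5, 0.8, 1.0):
    print(f"Se = {se:.1f}  Kr = {vg_relative_conductivity(se, m):.2e}")
    # Kr drops steeply as the clay desaturates and recovers to 1 at Se = 1.
```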
Video Monitoring a Simulation-Based Quality Improvement Program in Bihar, India.
Dyer, Jessica; Spindler, Hilary; Christmas, Amelia; Shah, Malay Bharat; Morgan, Melissa; Cohen, Susanna R; Sterne, Jason; Mahapatra, Tanmay; Walker, Dilys
2018-04-01
Simulation-based training has become an accepted clinical training andragogy in high-resource settings with its use increasing in low-resource settings. Video recordings of simulated scenarios are commonly used by facilitators. Beyond using the videos during debrief sessions, researchers can also analyze the simulation videos to quantify technical and nontechnical skills during simulated scenarios over time. Little is known about the feasibility and use of large-scale systems to video record and analyze simulation and debriefing data for monitoring and evaluation in low-resource settings. This manuscript describes the process of designing and implementing a large-scale video monitoring system. Mentees and Mentors were consented and all simulations and debriefs conducted at 320 Primary Health Centers (PHCs) were video recorded. The system design, number of video recordings, and inter-rater reliability of the coded videos were assessed. The final dataset included a total of 11,278 videos. Overall, a total of 2,124 simulation videos were coded and 183 (12%) were blindly double-coded. For the double-coded sample, the average inter-rater reliability (IRR) scores were 80% for nontechnical skills, and 94% for clinical technical skills. Among 4,450 long debrief videos received, 216 were selected for coding and all were double-coded. Data quality of simulation videos was found to be very good in terms of recorded instances of "unable to see" and "unable to hear" in Phases 1 and 2. This study demonstrates that video monitoring systems can be effectively implemented at scale in resource limited settings. Further, video monitoring systems can play several vital roles within program implementation, including monitoring and evaluation, provision of actionable feedback to program implementers, and assurance of program fidelity.
Measuring the Process and Quality of Informed Consent for Clinical Research: Development and Testing
Cohn, Elizabeth Gross; Jia, Haomiao; Smith, Winifred Chapman; Erwin, Katherine; Larson, Elaine L.
2013-01-01
Purpose/Objectives: To develop and assess the reliability and validity of an observational instrument, the Process and Quality of Informed Consent (P-QIC).
Design: A pilot study of the psychometrics of a tool designed to measure the quality and process of the informed consent encounter in clinical research. The study used professionally filmed, simulated consent encounters designed to vary in process and quality.
Setting: A major urban teaching hospital in the northeastern region of the United States.
Sample: 63 students enrolled in health-related programs participated in psychometric testing, 16 students participated in test-retest reliability, and 5 investigator-participant dyads were observed for actual consent encounters.
Methods: For reliability and validity testing, students watched videotaped simulations of four consent encounters intentionally varied in process and content and rated them with the proposed instrument. Test-retest reliability was established by raters watching the videotaped simulations twice. Inter-rater reliability was demonstrated by two simultaneous but independent raters observing an actual consent encounter.
Main Research Variables: The essential elements of information and communication for informed consent.
Findings: The initial testing of the P-QIC demonstrated reliable and valid psychometric properties in both the simulated standardized consent encounters and actual consent encounters in the hospital setting.
Conclusions: The P-QIC is an easy-to-use observational tool that provides a quick assessment of the areas of strength and areas that need improvement in a consent encounter. It can be used in the initial training of new investigators or consent administrators and in ongoing programs of improvement for informed consent.
Implications for Nursing: The development of a validated observational instrument will allow investigators to assess the consent process more accurately and evaluate strategies designed to improve it. PMID:21708532
Badal, Andreu; Badano, Aldo
2009-11-01
It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: The use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA™ programming model (NVIDIA Corporation, Santa Clara, CA). An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed up factor was obtained using a GPU compared to a single core CPU. The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
Data acquisition, processing and firing aid software for multichannel EMP simulation
NASA Astrophysics Data System (ADS)
Eumurian, Gregoire; Arbaud, Bruno
1986-08-01
Electromagnetic compatibility testing yields a large quantity of data for systematic analysis. An automated data acquisition system has been developed. It is based on standard EMP instrumentation which allows a pre-established program to be followed whilst orientating the measurements according to the results obtained. The system is controlled by a computer running interactive programs (multitask windows, scrollable menus, mouse, etc.) which handle the measurement channels, files, displays and process data in addition to providing an aid to firing.
Gr-GDHP: A New Architecture for Globalized Dual Heuristic Dynamic Programming.
Zhong, Xiangnan; Ni, Zhen; He, Haibo
2017-10-01
A goal representation globalized dual heuristic dynamic programming (Gr-GDHP) method is proposed in this paper. A goal neural network is integrated into the traditional GDHP method, providing an internal reinforcement signal and its derivatives to help the control and learning process. It is shown that, in the proposed architecture, the internal reinforcement signal and its derivatives can adjust themselves online over time, rather than being a fixed or predefined function as in the literature. Furthermore, the obtained derivatives contribute directly to the objective function of the critic network, whose learning process is thus simplified. Numerical simulation studies show the performance of the proposed Gr-GDHP method and compare the results with other existing adaptive dynamic programming designs. We also investigate this method on a ball-and-beam balancing system. Statistical simulation results are presented for both the Gr-GDHP and GDHP methods to demonstrate the improved learning and control performance.
Automatic mathematical modeling for real time simulation system
NASA Technical Reports Server (NTRS)
Wang, Caroline; Purinton, Steve
1988-01-01
A methodology for automatic mathematical modeling and the generation of simulation models is described. The models are verified by running in a test environment using standard profiles, with the results compared against known results. The major objective is to create a user-friendly environment for engineers to design, maintain, and verify their models, and to automatically convert the mathematical model into conventional code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine Simulation. It is written in LISP and MACSYMA and runs on a Symbolics 3670 Lisp Machine. The program provides a friendly and well-organized environment for engineers to build a knowledge base of base equations and general information. It contains an initial set of component process elements for the Space Shuttle Main Engine Simulation and a questionnaire that allows the engineer to specify a particular model by answering a set of questions. The system is then able to automatically generate the model and FORTRAN code. The future goal, which is under development, is to download the FORTRAN code to a VAX/VMS system for conventional computation. The SSME mathematical model will be verified in a test environment and the solution compared with the real data profile. The use of artificial intelligence techniques has shown that the process of simulation modeling can be simplified.
Educational Technology in Military Training Applications: A Current Assessment.
ERIC Educational Resources Information Center
Platt, William A.; Andrews, Dee H.
This chapter considers the history of instructional development (ID) in the military, with particular emphasis on the U.S. Navy. The ID process used at the Navy's Instructional Program Development Centers is presented, including the process for simulator development. An in-depth analysis of the problems encountered with educational technology…
Computer-Assisted Instruction: Authoring Languages. ERIC Digest.
ERIC Educational Resources Information Center
Reeves, Thomas C.
One of the most perplexing tasks in producing computer-assisted instruction (CAI) is the authoring process. Authoring is generally defined as the process of turning the flowcharts, control algorithms, format sheets, and other documentation of a CAI program's design into computer code that will operationalize the simulation on the delivery system.…
NASA Astrophysics Data System (ADS)
Patriarca, M.; Kuronen, A.; Robles, M.; Kaski, K.
2007-01-01
The study of crystal defects and the complex processes underlying their formation and time evolution has motivated the development of the program ALINE for interactive molecular dynamics experiments. This program couples a molecular dynamics code to a Graphical User Interface and runs on a UNIX-X11 Window System platform with the MOTIF library, which is contained in many standard Linux releases. ALINE is written in C, thus giving the user the possibility to modify the source code, and, at the same time, provides an effective and user-friendly framework for numerical experiments, in which the main parameters can be interactively varied and the system visualized in various ways. We illustrate the main features of the program through some examples of detection and dynamical tracking of point defects, linear defects, and planar defects, such as stacking faults in lattice-mismatched heterostructures.
Program summary
Title of program: ALINE
Catalogue identifier: ADYJ_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADYJ_v1_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Computers for which the program is designed and others on which it has been tested: DEC ALPHA 300, Intel i386-compatible computers, G4 Apple computers
Installations: Laboratory of Computational Engineering, Helsinki University of Technology, Helsinki, Finland
Operating systems under which the program has been tested: Tru64 UNIX, Linux-i386, Mac OS X 10.3 and 10.4
Programming language used: Standard C and MOTIF libraries
Memory required to execute with typical data: 6 Mbytes, but may be larger depending on the system size
No. of lines in distributed program, including test data, etc.: 16901
No. of bytes in distributed program, including test data, etc.: 449559
Distribution format: tar.gz
Nature of physical problem: Some phenomena involving defects take place inside three-dimensional crystals at times which can hardly be predicted. For this reason they are difficult to detect and track even within numerical experiments, especially when one is interested in studying their dynamical properties and time evolution. Furthermore, traditional simulation methods require the storage of a huge amount of data, which in turn may imply long work for their analysis.
Method of solution: Simplification of the simulation work described above also depends strongly on computer performance. It has now become possible to realize some of these simplifications thanks to the real possibility of using interactive programs. The solution proposed here is based on the development of an interactive graphical simulation program, both to avoid large storage of data and the subsequent elaboration and analysis, and to visualize and track many phenomena inside three-dimensional samples. However, the full computational power of traditional simulation programs may not be available in programs with graphical user interfaces, due to their interactive nature. Nevertheless, interactive programs can still be very useful for detecting processes that are difficult to visualize, restricting the range or making a fine tuning of the parameters, and tailoring the faster programs toward precise targets.
Restrictions on the complexity of the problem: The restrictions on the applicability of the program are related to the available computer resources. The graphical interface and interactivity demand computational resources that depend on the particular numerical simulation to be performed. To preserve a balance between speed and resources, the choice of the number of atoms to be simulated is critical. With an average current computer, simulations of systems with more than 10^5 atoms may not be easily feasible in an interactive scheme. Another restriction is related to the fact that the program was originally designed to simulate systems in the solid phase, so problems in the simulation may occur if some physical quantities are computed beyond the melting point.
Typical running time: It depends on the machine architecture, system size, and user needs.
Unusual features of the program: Besides the window in which the system is represented in real space, an additional graphical window presents the real-time distribution histogram of different physical variables (such as kinetic or potential energy). Such a tool is very useful for demonstrative numerical experiments for teaching purposes as well as for research, e.g., for detecting and tracking crystal defects. The program includes: an initial condition builder, an interactive display of the simulation, and a set of tools which allow the user to filter the information through different physical quantities (either displayed in real time or printed in the output files) and to perform an efficient search of the interesting regions of parameter space.
Glass fiber processing for the Moon/Mars program: Center director's discretionary fund final report
NASA Technical Reports Server (NTRS)
Tucker, D. S.; Ethridge, E.; Curreri, P.
1992-01-01
Glass fiber has been produced from two lunar soil simulants. These two materials simulate lunar mare soil and lunar highland soil compositions, respectively. Short fibers containing recrystallized areas were produced from the as-received simulants. Doping the highland simulant with 8 weight percent B2O3 yielded a material which could be spun continuously. The effects of lunar gravity on glass fiber formation were studied utilizing NASA's KC-135 aircraft. Gravity was found to play a major role in final fiber diameter.
Jdpd: an open java simulation kernel for molecular fragment dissipative particle dynamics.
van den Broek, Karina; Kuhn, Hubert; Zielesny, Achim
2018-05-21
Jdpd is an open Java simulation kernel for Molecular Fragment Dissipative Particle Dynamics with parallelizable force calculation, efficient caching options and fast property calculations. It is characterized by an interface and factory-pattern driven design for simple code changes and may help to avoid problems of polyglot programming. Detailed input/output communication, parallelization and process control as well as internal logging capabilities for debugging purposes are supported. The new kernel may be utilized in different simulation environments ranging from flexible scripting solutions up to fully integrated "all-in-one" simulation systems.
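The pairwise force at the heart of any DPD kernel combines conservative, dissipative, and random contributions in the standard Groot-Warren form. A sketch of that kernel in Python (Jdpd itself is written in Java, and its actual parameterization and fragment handling may differ):

```python
import numpy as np

def dpd_pair_force(rij, vij, a=25.0, gamma=4.5, kT=1.0, rc=1.0, dt=0.01,
                   rng=np.random.default_rng()):
    """Standard DPD force on particle i from j (Groot-Warren form):
    conservative, dissipative and random parts with weight w = 1 - r/rc,
    fluctuation-dissipation satisfied via sigma^2 = 2*gamma*kT."""
    r = np.linalg.norm(rij)
    if r >= rc:
        return np.zeros(3)               # beyond the cutoff: no force
    e = rij / r                          # unit vector from j to i
    w = 1.0 - r / rc                     # weight function
    sigma = np.sqrt(2.0 * gamma * kT)
    f_c = a * w * e                                   # conservative
    f_d = -gamma * w * w * np.dot(e, vij) * e         # dissipative
    f_r = sigma * w * rng.normal() / np.sqrt(dt) * e  # random
    return f_c + f_d + f_r

# Example: two particles separated by 0.5 rc with a small relative velocity.
print(dpd_pair_force(np.array([0.5, 0.0, 0.0]), np.array([0.0, 0.1, 0.0])))
```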
Modeling and Analysis of Power Processing Systems (MAPPS), initial phase 2
NASA Technical Reports Server (NTRS)
Yu, Y.; Lee, F. C.; Wangenheim, H.; Warren, D.
1977-01-01
The overall objective of the program is to provide the engineering tools to reduce the analysis, design, and development effort, and thus the cost, in achieving the required performances for switching regulators and dc-dc converter systems. The program was both tutorial and application oriented. Various analytical methods were described in detail and supplemented with examples, and those with standardization appeals were reduced into computer-based subprograms. Major program efforts included those concerning small and large signal control-dependent performance analysis and simulation, control circuit design, power circuit design and optimization, system configuration study, and system performance simulation. Techniques including discrete time domain, conventional frequency domain, Lagrange multiplier, nonlinear programming, and control design synthesis were employed in these efforts. To enhance interactive conversation between the modeling and analysis subprograms and the user, a working prototype of the Data Management Program was also developed to facilitate expansion as future subprogram capabilities increase.
ms2: A molecular simulation tool for thermodynamic properties
NASA Astrophysics Data System (ADS)
Deublein, Stephan; Eckl, Bernhard; Stoll, Jürgen; Lishchuk, Sergey V.; Guevara-Carrion, Gabriela; Glass, Colin W.; Merker, Thorsten; Bernreuther, Martin; Hasse, Hans; Vrabec, Jadran
2011-11-01
This work presents the molecular simulation program ms2, which is designed for the calculation of thermodynamic properties of bulk fluids in equilibrium consisting of small electro-neutral molecules. ms2 features the two main molecular simulation techniques, molecular dynamics (MD) and Monte-Carlo. It supports the calculation of vapor-liquid equilibria of pure fluids and multi-component mixtures described by rigid molecular models on the basis of the grand equilibrium method. Furthermore, it is capable of sampling various classical ensembles and yields numerous thermodynamic properties. To evaluate the chemical potential, Widom's test molecule method and gradual insertion are implemented. Transport properties are determined by equilibrium MD simulations following the Green-Kubo formalism. ms2 is designed to meet the requirements of academia and industry, particularly achieving short response times and straightforward handling. It is written in Fortran90 and optimized for fast execution on a broad range of computer architectures, spanning from single-processor PCs over PC clusters and vector computers to high-end parallel machines. The standard Message Passing Interface (MPI) is used for parallelization, and ms2 is therefore easily portable to different computing platforms. Feature tools facilitate interaction with the code and the interpretation of input and output files. The accuracy and reliability of ms2 has been shown for a large variety of fluids in preceding work.
Program summary
Program title: ms2
Catalogue identifier: AEJF_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJF_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Special Licence supplied by the authors
No. of lines in distributed program, including test data, etc.: 82794
No. of bytes in distributed program, including test data, etc.: 793705
Distribution format: tar.gz
Programming language: Fortran90
Computer: The simulation tool ms2 is usable on a wide variety of platforms, from single-processor machines over PC clusters and vector computers to vector-parallel architectures. (Tested with Fortran compilers: gfortran, Intel, PathScale, Portland Group and Sun Studio.)
Operating system: Unix/Linux, Windows
Has the code been vectorized or parallelized?: Yes, via the Message Passing Interface (MPI) protocol.
Scalability: Excellent scalability up to 16 processors for molecular dynamics and >512 processors for Monte-Carlo simulations.
RAM: ms2 runs on single processors with 512 MB RAM. The memory demand rises with the number of processors used per node and the number of molecules.
Classification: 7.7, 7.9, 12
External routines: Message Passing Interface (MPI)
Nature of problem: Calculation of application-oriented thermodynamic properties for rigid electro-neutral molecules: vapor-liquid equilibria, thermal and caloric data, as well as transport properties of pure fluids and multi-component mixtures.
Solution method: Molecular dynamics, Monte-Carlo, various classical ensembles, grand equilibrium method, Green-Kubo formalism.
Restrictions: The system size is user-defined. Typical problems addressed by ms2 can be solved by simulating systems containing 2000 molecules or less.
Unusual features: Feature tools are available for creating input files, analyzing simulation results and visualizing molecular trajectories.
Additional comments: Sample makefiles for multiple operating platforms are provided. Documentation is provided with the installation package and is available at http://www.ms-2.de.
Running time: The running time of ms2 depends on the problem set, the system size and the number of processes used in the simulation. Running four processes on a "Nehalem" processor, simulations calculating VLE data take between two and twelve hours; calculating transport properties takes between six and 24 hours.
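As an illustration of the Green-Kubo route mentioned above, the self-diffusion coefficient follows from the time integral of the velocity autocorrelation function. The sketch below checks the formula on a synthetic Langevin trajectory, whose exact answer is kT/gamma, rather than on ms2 output:

```python
import numpy as np

rng = np.random.default_rng(3)
dt, n, gamma, kT = 0.01, 200_000, 1.0, 1.0

# Synthetic 1-D velocity trajectory from a Langevin process (unit mass),
# whose exact diffusion coefficient is kT/gamma = 1.
v = np.zeros(n)
for k in range(1, n):
    v[k] = v[k-1] * (1 - gamma * dt) + np.sqrt(2 * gamma * kT * dt) * rng.normal()

# Green-Kubo (1-D): D = integral over t of <v(0) v(t)>.
max_lag = 1000   # 10 time units, several correlation times
acf = np.array([np.mean(v[:n - l] * v[l:]) for l in range(max_lag)])
D = acf.sum() * dt
print(f"Green-Kubo D ~ {D:.2f} (exact kT/gamma = {kT / gamma:.2f})")
```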
Creating a sustainable, interprofessional-team training program: initial results.
Riggall, Virginia K; Smith, Charlene M
2015-01-01
The purpose of this program evaluation was to explore whether incorporating deliberate learning concepts, through the use of simulated patient scenarios to teach interprofessional collaboration skills to a healthcare team on one acute-care hospital unit, would improve the resuscitation response in the first 5 minutes on that unit. This was a pilot program evaluation utilizing a unit-based clinical nurse specialist in the deployment of an interprofessional educational program involving simulation on an acute medical floor in a large tertiary-care hospital. Eighty-four staff members participated in 17 simulations. The sample included first-year internal-medicine residents, registered nurses, respiratory therapists, and patient care technicians. This was a program evaluation that used the TeamSTEPPS Teamwork Perceptions Questionnaire (T-TPQ) (Classroom slides: TeamSTEPPS essentials; http://www.ahrq.gov/professionals/education/curriculum-tools/teamstepps/instructor/essentials/slessentials.html#s3) during the presimulation/postsimulation sessions to assess the participants' perceptions of teamwork. Expected intervention behaviors were collected through observations of participants in the simulations and compared with the American Heart Association guidelines (Circulation 2010;122:S685-S670, S235-S337). Common perceptions of participants regarding the experience were obtained through open-ended evaluation questions. Fifty-three participants completed the pre- and post-T-TPQ. Mean scores in the leadership category of the T-TPQ decreased significantly (P = .003) from the pretest (median, 2.167) to the posttest (median, 2.566). Only 35% of the groups administered a defibrillation during the ventricular fibrillation simulation scenario, and only 1 group delivered this shock within the American Heart Association's recommended time frame of 2 minutes (Circulation 2010;122:S235-S337). A single resuscitation simulation did not provide a sufficient interventional dose for staff to improve the resuscitation process. A longitudinal study should be conducted to determine the effectiveness of the program after staff members have repeated it multiple times. A unit-based quality-improvement simulation training program could help improve the first-5-minute response and resuscitation skills of staff by increasing the frequency of unit-based training overseen by the unit's clinical nurse specialist.
Modeling of materials supply, demand and prices
NASA Technical Reports Server (NTRS)
1982-01-01
The societal, economic, and policy tradeoffs associated with materials processing and utilization are discussed. The materials system provides the materials engineer with the system analysis required to formulate sound materials processing, utilization, and resource development policies and strategies. The materials system simulation and modeling research program, including assessments of materials substitution dynamics, public policy implications, and materials process economics, was expanded. This effort includes several collaborative programs with materials engineers, economists, and policy analysts. The technical and socioeconomic issues of materials recycling, input-output analysis, and technological change and productivity are examined. The major thrust areas in materials systems research are outlined.
U-10Mo Baseline Fuel Fabrication Process Description
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hubbard, Lance R.; Arendt, Christina L.; Dye, Daniel F.
This document provides a description of the U.S. High Power Research Reactor (USHPRR) low-enriched uranium (LEU) fuel fabrication process. This document is intended to be used in conjunction with the baseline process flow diagram (PFD) presented in Appendix A. The baseline PFD is used to document the fabrication process, communicate gaps in technology or manufacturing capabilities, convey alternatives under consideration, and serve as the basis for a dynamic simulation model of the fabrication process. The simulation model allows for the assessment of production rates, costs, and manufacturing requirements (manpower, fabrication space, numbers and types of equipment, etc.) throughout the lifecycle of the USHPRR program. This document, along with the accompanying PFD, is updated regularly.
A kinetics database and scripts for PHREEQC
NASA Astrophysics Data System (ADS)
Hu, B.; Zhang, Y.; Teng, Y.; Zhu, C.
2017-12-01
The kinetics of geochemical reactions are increasingly used in numerical models to simulate coupled flow, mass transport, and chemical reactions. However, the kinetic data are scattered throughout the literature, and assembling a kinetic dataset for a modeling project is an intimidating task for most. To facilitate the application of kinetics in geochemical modeling, we assembled kinetic parameters into a database for the geochemical simulation program PHREEQC (version 3.0). Kinetic data were collected from the literature; our database includes kinetic data for over 70 minerals. The rate equations are also programmed into scripts in the Basic language. Using the new kinetic database, we simulated reaction paths during albite dissolution using various rate equations from the literature. The simulations with three different rate equations gave different reaction paths at different time scales. Another application involves a coupled reactive transport model simulating the advance of an acid plume at an acid mine drainage site associated with the Bear Creek Uranium tailings pond. Geochemical reactions involving calcite, gypsum, and illite were simulated with PHREEQC using the new kinetic database. The simulation results successfully demonstrated the utility of the new kinetic database.
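Rate equations in such databases are typically written in transition-state-theory form, rate = k * A * (1 - omega), where omega is the saturation ratio IAP/K. A toy integration of that form (all constants below are illustrative placeholders, not values from the database, and the saturation ratio is deliberately simplified):

```python
# Transition-state-theory style rate law of the kind used in PHREEQC
# Basic rate scripts:  rate = k * A * (1 - omega),  omega = IAP/K.
k = 1e-12   # intrinsic rate constant, mol m^-2 s^-1 (placeholder)
A = 10.0    # reactive surface area, m^2 per kg of water (placeholder)
K = 1e-8    # toy solubility constant, so equilibrium at conc = K

conc, dt = 0.0, 50.0                     # mol/kgw, s
for _ in range(200):                     # integrate 10,000 s
    omega = conc / K                     # simplified saturation ratio
    conc += k * A * (1.0 - omega) * dt   # dissolution slows near omega = 1

print(f"omega after 10,000 s: {conc / K:.3f}")   # approaches 1 (equilibrium)
```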
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zuo, Wangda; McNeil, Andrew; Wetter, Michael
2013-05-23
Building designers are increasingly relying on complex fenestration systems to reduce energy consumed for lighting and HVAC in low energy buildings. Radiance, a lighting simulation program, has been used to conduct daylighting simulations for complex fenestration systems. Depending on the configurations, the simulation can take hours or even days using a personal computer. This paper describes how to accelerate the matrix multiplication portion of a Radiance three-phase daylight simulation by conducting parallel computing on the heterogeneous hardware of a personal computer. The algorithm was optimized and the computational part was implemented in parallel using OpenCL. The speed of the new approach was evaluated using various daylighting simulation cases on a multicore central processing unit and a graphics processing unit. Based on the measurements and analysis of the time usage for the Radiance daylighting simulation, further speedups can be achieved by using fast I/O devices and storing the data in a binary format.
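The matrix chain being accelerated is the three-phase relation i = V T D s, with view, transmission, and daylight matrices acting on a sky vector. A plain NumPy sketch shows the structure and why evaluation order matters; the matrix sizes are typical of Klems/Reinhart discretizations but the values are random placeholders:

```python
import numpy as np

rng = np.random.default_rng(7)

# Three-phase daylight coefficient method: i = V @ T @ D @ s, where
# V: sensor-to-window "view" matrix, T: BSDF transmission matrix,
# D: window-to-sky "daylight" matrix, s: sky radiance vector.
n_sensors, n_klems, n_sky = 10_000, 145, 2306
V = rng.random((n_sensors, n_klems))
T = rng.random((n_klems, n_klems))
D = rng.random((n_klems, n_sky))
s = rng.random(n_sky)

# Multiplying right-to-left keeps every intermediate result a vector,
# which is far cheaper than forming the full V @ T @ D matrix first.
illum = V @ (T @ (D @ s))
print(illum.shape)   # (10000,): one illuminance value per sensor point
```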
Simulating an underwater vehicle self-correcting guidance system with Simulink
NASA Astrophysics Data System (ADS)
Fan, Hui; Zhang, Yu-Wen; Li, Wen-Zhe
2008-09-01
Underwater vehicles have already adopted self-correcting directional guidance algorithms based on multi-beam self-guidance systems, even though research has yet to determine the most effective algorithms. The main challenges facing research on these guidance systems have been effective modeling of the guidance algorithm and a means to analyze the simulation results. A simulation structure based on Simulink that dealt with both issues was proposed. Initially, a mathematical model of relative motion between the vehicle and the target was developed, which was then encapsulated as a subsystem. Next, steps for constructing a model of the self-correcting guidance algorithm based on the Stateflow module were examined in detail. Finally, a 3-D model of the vehicle and target was created in VRML, and by processing the mathematical results, the model was shown moving in a visual environment, which gives more intuitive results for analyzing the simulation. The results showed that the simulation structure performs well. The simulation program makes heavy use of modularization and encapsulation, so it has broad applicability to simulations of other dynamic systems.
A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine
Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun
2017-01-01
In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184
Saxton, Michael J
2007-01-01
Modeling obstructed diffusion is essential to the understanding of diffusion-mediated processes in the crowded cellular environment. Simple Monte Carlo techniques for modeling obstructed random walks are explained and related to Brownian dynamics and more complicated Monte Carlo methods. Random number generation is reviewed in the context of random walk simulations. Programming techniques and event-driven algorithms are discussed as ways to speed simulations.
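A minimal Monte Carlo sketch of the obstructed-walk idea, assuming a square lattice with randomly placed immobile obstacles, blocked moves rejected, and unwrapped coordinates used for the mean-square displacement:

    import numpy as np

    rng = np.random.default_rng(1)
    L, obstacle_fraction, n_steps = 256, 0.3, 100000
    blocked = rng.random((L, L)) < obstacle_fraction

    x = y = L // 2
    blocked[x, y] = False            # make sure the start site is free
    ux = uy = 0                      # unwrapped displacement for the MSD
    moves = ((1, 0), (-1, 0), (0, 1), (0, -1))
    msd = []
    for step in range(1, n_steps + 1):
        dx, dy = moves[rng.integers(4)]
        nx, ny = (x + dx) % L, (y + dy) % L   # periodic boundaries
        if not blocked[nx, ny]:               # moves onto obstacles are rejected
            x, y, ux, uy = nx, ny, ux + dx, uy + dy
        if step % 1000 == 0:
            msd.append((step, ux * ux + uy * uy))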
NASA Astrophysics Data System (ADS)
Babu, K.; Prasanna Kumar, T. S.
2014-08-01
An indigenous, non-linear, coupled finite element (FE) program has been developed to predict the temperature field and phase evolution during heat treatment of steels. The diffusional transformations during continuous cooling of steels were modeled using the Johnson-Mehl-Avrami-Kolmogorov equation, and the diffusionless transformation was modeled using the Koistinen-Marburger equation. Cylindrical quench probes made of AISI 4140 steel, 20 mm in diameter and 50 mm long, were heated to 1123 K (850 °C), quenched in water, and cooled in air. The temperature history during continuous cooling was recorded at selected interior locations of the quench probes. The probes were then sectioned at the mid-plane and the resultant microstructures were observed. The water quenching and air cooling of the AISI 4140 steel probes were simulated with a heat flux boundary condition in the FE program. The heat flux for the air cooling process was calculated through the inverse heat conduction method using the cooling curve measured during air cooling of a stainless steel 304L probe as an input. The heat flux for the water quenching process was calculated from a surface heat flux model proposed for quenching simulations. The isothermal transformation start and finish times of the different phases were taken from published TTT data and were also calculated using the Kirkaldy and Li models, then used in the FE program. The simulated cooling curves and phases based on the published TTT data were in good agreement with the experimentally measured values. The computational results revealed that the use of published TTT data was more reliable in predicting the phase transformations during heat treatment of low-alloy steels than the use of the Kirkaldy or Li model.
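The two transformation laws named above are compact enough to state directly. The sketch below uses placeholder kinetic constants; a real model fits k and n to TTT data and takes the martensite start temperature Ms from the alloy chemistry.

    import math

    def jmak_fraction(k, n, t):
        """Johnson-Mehl-Avrami-Kolmogorov fraction transformed at time t."""
        return 1.0 - math.exp(-k * t ** n)

    def km_martensite(ms, temp, alpha=0.011):
        """Koistinen-Marburger martensite fraction for quenching below Ms."""
        return 0.0 if temp >= ms else 1.0 - math.exp(-alpha * (ms - temp))

    # Placeholder numbers chosen for the shape of the curves, not 4140 data.
    print(jmak_fraction(k=1e-3, n=2.5, t=60.0))   # diffusional product
    print(km_martensite(ms=615.0, temp=300.0))    # athermal martensite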
Validation studies of the DOE-2 Building Energy Simulation Program. Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, R.; Winkelmann, F.
1998-06-01
This report documents many of the validation studies (Table 1) of the DOE-2 building energy analysis simulation program that have taken place since 1981. Results for several versions of the program are presented, with the most recent study conducted in 1996 on version DOE-2.1E and the earliest conducted in 1981 on version DOE-1.3. This work is part of an effort related to continued development of DOE-2, particularly in its use as a simulation engine for new specialized versions of the program such as the recently released RESFEN 3.1, a program specifically dealing with analyzing the energy performance of windows in residential buildings. The intent in providing the results of these validation studies is to give potential users of the program a high degree of confidence in the calculated results. Validation studies in which calculated simulation data are compared to measured data have been conducted throughout the development of the DOE-2 program. Discrepancies discovered during the course of such work have resulted in improvements in the simulation algorithms. Table 2 provides a listing of additions and modifications that have been made to various versions of the program since version DOE-2.1A. One of the most significant recent changes occurred with version DOE-2.1E: an improved algorithm for calculating the outside surface film coefficient was implemented, and integration of the WINDOW 4 program improved the ability to analyze window energy performance. Validation and verification of a program as sophisticated as DOE-2 must necessarily be limited because of the approximations inherent in the program. For example, the most accurate model of the heat transfer processes in a building would include a three-dimensional analysis. To justify such detailed algorithmic procedures would correspondingly require detailed information describing the building and/or HVAC system and energy plant parameters. Until building simulation programs can get these data directly from CAD programs, such detail would negate the usefulness of the program for the practicing engineers and architects who currently use it. In addition, the validation studies discussed herein indicate that such detail is really unnecessary. The comparison of calculated and measured quantities has resulted in a level of confidence sufficient for continued use of the DOE-2 program. However, additional validation is warranted, particularly at the component level, to further improve the program.
2004 research briefs :Materials and Process Sciences Center.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cieslak, Michael J.
2004-01-01
This report is the latest in a continuing series that highlights the recent technical accomplishments associated with the work being performed within the Materials and Process Sciences Center. Our research and development activities primarily address the materials-engineering needs of Sandia's Nuclear-Weapons (NW) program. In addition, we have significant efforts that support programs managed by the other laboratory business units. Our wide range of activities occurs within six thematic areas: Materials Aging and Reliability, Scientifically Engineered Materials, Materials Processing, Materials Characterization, Materials for Microsystems, and Materials Modeling and Simulation. We believe these highlights collectively demonstrate the importance that a strong materials-science base has on the ultimate success of the NW program and the overall DOE technology portfolio.
Tornado detection data reduction and analysis
NASA Technical Reports Server (NTRS)
Davisson, L. D.
1977-01-01
Data processing and analysis were provided in support of tornado detection through analysis of radio-frequency interference in various frequency bands. Sea state determination data from short pulse radar measurements were also processed and analyzed. A backscatter simulation was implemented to predict radar performance as a function of wind velocity. Computer programs were developed for the various data processing and analysis goals of the effort.
Standardized input for Hanford environmental impact statements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Napier, B.A.
1981-05-01
Models and computer programs for simulating the environmental behavior of radionuclides and the resulting radiation dose to humans have been developed over the years by the Environmental Analysis Section staff, Ecological Sciences Department, at the Pacific Northwest Laboratory (PNL). Methodologies have evolved for calculating radiation doses from many exposure pathways for any type of release mechanism. Depending on the situation or process being simulated, different sets of computer programs, assumptions, and modeling techniques must be used. This report is a compilation of recommended computer programs and necessary input information for use in calculating doses to members of the general public for environmental impact statements prepared for DOE activities to be conducted on or near the Hanford Reservation.
NASA Technical Reports Server (NTRS)
Scaffidi, C. A.; Stocklin, F. J.; Feldman, M. B.
1971-01-01
An L-band telemetry system designed to provide the capability of near-real-time processing of calibration data is described. The system also provides the capability of performing computerized spacecraft simulations, with the aircraft as a data source, and evaluating the network response. The salient characteristics of a telemetry analysis and simulation program (TASP) are discussed, together with the results of TASP testing. The results of the L-band system testing successfully demonstrated the capability of near-real-time processing of telemetry test data, control of the ground-received signal to within ±0.5 dB, and computer generation of test signals.
High-order finite difference formulations for the incompressible Navier-Stokes equations on the CM-5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tafti, D.
1995-12-01
The paper describes the features and implementation of a general-purpose, high-order accurate finite difference computer program for direct and large-eddy simulations of turbulence on the CM-5 in the data-parallel mode. Benchmarking studies for a direct simulation of turbulent channel flow are discussed. Performance of up to 8.8 GFLOPS is obtained for the high-order formulations on 512 processing nodes of the CM-5. The execution time for a simulation with 24 million nodes in a domain with two periodic directions is in the range of 0.2 μs per time step per degree of freedom on 512 processing nodes of the CM-5.
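For orientation, a minimal illustration of the kind of high-order stencil involved, assuming a standard fourth-order central difference on a periodic direction; numpy's shifted arrays stand in here for the CM-5's data-parallel array operations.

    import numpy as np

    def ddx4(f, dx):
        # Fourth-order central first derivative, periodic in x.
        return (-np.roll(f, -2) + 8 * np.roll(f, -1)
                - 8 * np.roll(f, 1) + np.roll(f, 2)) / (12 * dx)

    x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
    err = np.abs(ddx4(np.sin(x), x[1] - x[0]) - np.cos(x)).max()
    print(f"max error: {err:.2e}")   # O(dx**4) accuracy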
NASA Technical Reports Server (NTRS)
Fletcher, Lauren E.; Aldridge, Ann M.; Wheelwright, Charles; Maida, James
1997-01-01
Task illumination has a major impact on human performance: What a person can perceive in his environment significantly affects his ability to perform tasks, especially in space's harsh environment. Training for lighting conditions in space has long depended on physical models and simulations to emulate the effect of lighting, but such tests are expensive and time-consuming. To evaluate lighting conditions not easily simulated on Earth, personnel at NASA Johnson Space Center's (JSC) Graphics Research and Analysis Facility (GRAF) have been developing computerized simulations of various illumination conditions using the ray-tracing program, Radiance, developed by Greg Ward at Lawrence Berkeley Laboratory. Because these computer simulations are only as accurate as the data used, accurate information about the reflectance properties of materials and light distributions is needed. JSC's Lighting Environment Test Facility (LETF) personnel gathered material reflectance properties for a large number of paints, metals, and cloths used in the Space Shuttle and Space Station programs, and processed these data into reflectance parameters needed for the computer simulations. They also gathered lamp distribution data for most of the light sources used, and validated the ability to accurately simulate lighting levels by comparing predictions with measurements for several ground-based tests. The result of this study is a database of material reflectance properties for a wide variety of materials, and lighting information for most of the standard light sources used in the Shuttle/Station programs. The combination of the Radiance program and GRAF's graphics capability form a validated computerized lighting simulation capability for NASA.
Discrete time modelization of human pilot behavior
NASA Technical Reports Server (NTRS)
Cavalli, D.; Soulatges, D.
1975-01-01
This modelization starts from the following hypotheses: the pilot's behavior is a time-discrete process, the pilot can perform only one task at a time, and the operating mode depends on the flight subphase under consideration. Pilot behavior was observed using an electro-oculometer and a simulator cockpit. A FORTRAN program was developed using two strategies. The first is a Markov process in which the successive instrument readings are governed by a matrix of conditional probabilities. In the second, the strategy is a heuristic process, and the concepts of mental load and performance are described. The results of the two approaches were compared with simulation data.
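The first strategy amounts to sampling successive instrument fixations from a transition matrix. A tiny sketch with hypothetical instruments and probabilities follows; the paper estimated such a matrix from eye-movement recordings.

    import numpy as np

    rng = np.random.default_rng(2)
    instruments = ["attitude", "airspeed", "altimeter", "heading"]
    # Row i gives the probability of the next fixation given fixation i;
    # all values below are hypothetical.
    P = np.array([[0.10, 0.40, 0.30, 0.20],
                  [0.50, 0.10, 0.20, 0.20],
                  [0.45, 0.25, 0.10, 0.20],
                  [0.40, 0.30, 0.20, 0.10]])
    assert np.allclose(P.sum(axis=1), 1.0)

    state, scan = 0, []
    for _ in range(12):
        state = rng.choice(len(instruments), p=P[state])
        scan.append(instruments[state])
    print(" -> ".join(scan))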
Microstructure Modeling of 3rd Generation Disk Alloy
NASA Technical Reports Server (NTRS)
Jou, Herng-Jeng
2008-01-01
The objective of this initiative, funded by NASA's Aviation Safety Program, is to model, validate, and predict, with high fidelity, the microstructural evolution of third-generation high-refractory Ni-based disc superalloys during heat treating and service conditions. This initiative is a natural extension of the DARPA-AIM (Accelerated Insertion of Materials) initiative with GE/Pratt-Whitney and with other process simulation tools. Strong collaboration with the NASA Glenn Research Center (GRC) is a key component of this initiative, and the focus of this program is on industrially relevant disk alloys and heat treatment processes identified by GRC. Employing QuesTek's Computational Materials Dynamics technology and the PrecipiCalc precipitation simulator, physics-based models are being used to achieve high predictive accuracy and precision. Combining these models with experimental data and probabilistic analysis, "virtual alloy design" can be performed. The predicted microstructures can be optimized to promote desirable features and concurrently eliminate nondesirable phases that can limit the reliability and durability of the alloys. The well-calibrated and well-integrated software tools being applied under the proposed program will help gas turbine disk alloy manufacturers, processing facilities, and NASA to efficiently and effectively improve the performance of current and future disk materials.
Particle-In-Cell simulations of high pressure plasmas using graphics processing units
NASA Astrophysics Data System (ADS)
Gebhardt, Markus; Atteln, Frank; Brinkmann, Ralf Peter; Mussenbrock, Thomas; Mertmann, Philipp; Awakowicz, Peter
2009-10-01
Particle-in-cell (PIC) simulations are widely used to understand the fundamental phenomena in low-temperature plasmas; plasmas at very low gas pressures, in particular, are studied using PIC methods. The inherent drawback of these methods is that they are very time consuming, because certain stability conditions have to be satisfied. This holds even more for the PIC simulation of high-pressure plasmas due to the very high collision rates. The simulations take a very long time to run on standard computers and require the help of computer clusters or supercomputers. Recent advances in the field of graphics processing units (GPUs) provide every personal computer with a highly parallel multiprocessor architecture for very little money. This architecture is freely programmable and can be used to implement a wide class of problems. In this paper we present the concepts of a fully parallel PIC simulation of high-pressure plasmas using the benefits of GPU programming.
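For orientation, a minimal one-dimensional electrostatic PIC step in Python, collisionless and in normalized units; the high-pressure case targeted here would add a Monte Carlo collision step after the push, and a GPU version runs the same deposit/solve/push phases as data-parallel kernels.

    import numpy as np

    rng = np.random.default_rng(3)
    ng, npart, L, dt = 64, 10000, 2 * np.pi, 0.1
    dx = L / ng
    x = rng.uniform(0, L, npart)              # electron positions
    v = 0.1 * rng.normal(size=npart)          # electron velocities

    for step in range(100):
        # 1) deposit: cloud-in-cell charge weighting onto the grid
        g = x / dx
        i0 = np.floor(g).astype(int) % ng
        w = g - np.floor(g)
        dep = np.bincount(i0, 1 - w, ng) + np.bincount((i0 + 1) % ng, w, ng)
        rho = 1.0 - dep * ng / npart          # net charge: +1 ions minus electrons

        # 2) solve: periodic Poisson equation via FFT, then E = -dphi/dx
        k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
        k[0] = 1.0                            # dummy value; mean mode zeroed below
        phi_hat = np.fft.fft(rho) / k ** 2
        phi_hat[0] = 0.0
        E = -np.real(np.fft.ifft(1j * k * phi_hat))

        # 3) gather and push: interpolate E to particles, advance electrons
        Ep = (1 - w) * E[i0] + w * E[(i0 + 1) % ng]
        v -= Ep * dt                          # electron charge -1, mass 1
        x = (x + v * dt) % L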
pWeb: A High-Performance, Parallel-Computing Framework for Web-Browser-Based Medical Simulation.
Halic, Tansel; Ahn, Woojin; De, Suvranu
2014-01-01
This work presents pWeb, a new language and compiler for parallelization of client-side compute-intensive web applications such as surgical simulations. The recently introduced HTML5 standard has enabled creating unprecedented applications on the web. The low performance of the web browser, however, remains the bottleneck for computationally intensive applications, including visualization of complex scenes, real-time physical simulation, and image processing, compared to native applications. The new proposed language is built upon web workers for multithreaded programming in HTML5. The language provides the fundamental functionality of parallel programming languages as well as the fork/join parallel model, which is not supported by web workers. The language compiler automatically generates an equivalent parallel script that complies with the HTML5 standard. A case study on realistic rendering for surgical simulations demonstrates enhanced performance with a compact set of instructions.
Granheim, Benedikte M; Shaw, Julie M; Mansah, Martha
2018-03-01
To identify how simulation and interprofessional learning are used together in undergraduate nursing programs and undertaken in schools of nursing to address interprofessional communication and collaboration. An integrative literature review. The databases CINAHL, ProQuest, PubMed, Scopus, PsycInfo and Science Direct were searched to identify articles from 2006 to 2016 that reported on the use of IPL and simulation together in undergraduate nursing education. Whittemore and Knafl's five-step process was used to guide the integrative review of quantitative and qualitative literature. Only peer-reviewed articles written in English that addressed undergraduate nursing studies were included in the review. Articles that did not aim to improve communication and collaboration were excluded. All articles selected were examined to determine their contribution to knowledge of interprofessional learning and simulation in undergraduate nursing. The faculties of nursing used interprofessional learning and simulation in undergraduate nursing programs that in some cases were connected to a specific course. A total of nine articles, eight research papers and one narrative report, that focused on collaboration and communication were selected for this review. Studies predominantly used nursing and medical student participants. None of the included studies identified prior student experience with interprofessional learning and simulation. Four key themes were identified: communication, collaboration/teamwork, learning in practice, and understanding of roles. This review highlights the identified research relating to the combined teaching strategy of interprofessional learning and simulation that addressed communication and collaboration in undergraduate nursing programs. Further research into the implementation of interprofessional learning and simulation may benefit the emergent challenges. Information drawn from this review can be used to inform education and educational development in the future. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
1987-01-01
The proceedings of the conference are presented. The objective was to provide a forum for the discussion of the structure and status of existing computer programs which are used to simulate the dynamics of a variety of tether applications in space. A major topic was different simulation models and the process of validating them. Guidance on future work in these areas was obtained from a panel discussion; the panel was composed of resource and technical managers and dynamic analysts in the tether field. The conclusions of this panel are also presented.
Selected bibliography on the modeling and control of plant processes
NASA Technical Reports Server (NTRS)
Viswanathan, M. M.; Julich, P. M.
1972-01-01
A bibliography of information pertinent to the problem of simulating plants is presented. Detailed simulations of constituent pieces are necessary to justify simple models which may be used for analysis. Thus, this area of study is necessary to support the Earth Resources Program. The report sums up the present state of the problem of simulating vegetation. This area holds the hope of major benefits to mankind through understanding the ecology of a region and in improving agricultural yield.
Role of in-situ simulation for training in healthcare: opportunities and challenges.
Kurup, Viji; Matei, Veronica; Ray, Jessica
2017-12-01
Simulation has now been acknowledged as an important part of training in healthcare, and most academic hospitals have a dedicated simulation center. In-situ simulation occurs in patient care units with scenarios involving healthcare professionals in their actual working environment. The purpose of this review is to describe the process of putting together the components of in-situ simulation for training programs and to review outcomes studied, and challenges with this approach. In-situ simulation has been used to 'test-drive' new centers, train personnel in new procedures in existing centers, for recertification training and to uncover latent threats in clinical care areas. It has also emerged as an attractive alternative to traditional simulations for institutions that do not have their own simulation center. In-situ simulation can be used to improve reliability and safety especially in areas of high risk, and in high-stress environments. It is also a reasonable and attractive alternative for programs that want to conduct interdisciplinary simulations for their trainees and faculty, and for those who do not have access to a fully functional simulation center. Further research needs to be done in assessing effectiveness of training using this method and the effect of such training on clinical outcomes.
Isaak, Robert S; Chen, Fei; Arora, Harendra; Martinelli, Susan M; Zvara, David A; Stiegler, Marjorie P
2017-09-01
Anesthesiology residency programs may need new simulation-based programs to prepare residents for the new Objective Structured Clinical Examination (OSCE) component of the American Board of Anesthesiology (ABA) primary certification process. The design of such programs may require significant resources, including faculty time, expertise, and funding, as are currently needed for structured oral examination (SOE) preparation. This survey analyzed the current state of US-based anesthesiology residency programs regarding simulation-based educational programming for SOE and OSCE preparation. An online survey was distributed to every anesthesiology residency program director in the United States. The survey included 15 to 46 questions, depending on each respondent's answers. The survey queried current practices and future plans regarding resident preparation specifically for the ABA APPLIED examination, with emphasis on the OSCE. Descriptive statistics were summarized. χ² and Fisher exact tests were used to test the differences in proportions across groups. Spearman rank correlation was used to examine the association between ordinal variables. The responding 66 programs (49%) were a representative sample of all anesthesiology residencies (N = 136) in terms of geographical location (χ², P = .58). There was a low response rate from small programs that have 12 or fewer clinical anesthesia residents. Ninety-one percent (95% confidence interval [CI], 84%-95%) of responders agreed that it is the responsibility of the program to specifically prepare residents for primary certification, and most agreed that it is important to practice SOEs (94%; 95% CI, 88%-97%) and OSCEs (89%; 95% CI, 83%-94%). While 100% of respondents reported providing mock SOEs, only 31% (95% CI, 24%-40%) of respondents provided mock OSCE experiences. Of those without an OSCE program, 75% (95% CI, 64%-83%) reported plans to start one. The most common reasons for not having an OSCE program already in place, and the perceived challenges for implementing one, were the same: lack of time (faculty and residents), expertise in OSCE development and assessment, and funding. The results provide data that residency programs can use for benchmarking their simulation curricula and ABA APPLIED Examination preparation offerings. Despite agreement that residency programs should prepare residents for the ABA APPLIED Examination, many programs have yet to implement an OSCE preparation program, in part due to lack of financial resources, faculty expertise, and time. Additionally, in contrast to the SOE, the OSCE is a new format for ABA primary certification. As a result, the lack of consensus concerning preparation needs could be related to the amount of information available regarding the examination content and assessment process.
NASA Astrophysics Data System (ADS)
Peter, Daniel; Videau, Brice; Pouget, Kevin; Komatitsch, Dimitri
2015-04-01
Improving the resolution of tomographic images is crucial to answering important questions on the nature of Earth's subsurface structure and internal processes. Seismic tomography is the most prominent approach, in which seismic signals from ground-motion records are used to infer physical properties of internal structures such as compressional- and shear-wave speeds, anisotropy and attenuation. Recent advances in regional- and global-scale seismic inversions move towards full-waveform inversions, which require accurate simulations of seismic wave propagation in complex 3D media, providing access to the full 3D seismic wavefields. However, these numerical simulations are computationally very expensive and need high-performance computing (HPC) facilities for further improving the current state of knowledge. During recent years, many-core architectures such as graphics processing units (GPUs) have been added to available large HPC systems. Such GPU-accelerated computing, together with advances in multi-core central processing units (CPUs), can greatly accelerate scientific applications. There are mainly two choices of language support for GPU cards: the CUDA programming environment and the OpenCL language standard. CUDA software development targets NVIDIA graphics cards, while OpenCL was adopted mainly by AMD graphics cards. In order to employ such hardware accelerators for seismic wave propagation simulations, we incorporated the code generation tool BOAST into the existing spectral-element code package SPECFEM3D_GLOBE. This allows us to use meta-programming of computational kernels and to generate optimized source code for both CUDA and OpenCL, running simulations on either CUDA or OpenCL hardware accelerators. We show applications of forward and adjoint seismic wave propagation on CUDA/OpenCL GPUs, validating results and comparing performances for different simulations and hardware usages.
Opticks: GPU Optical Photon Simulation for Particle Physics using NVIDIA® OptiX™
NASA Astrophysics Data System (ADS)
Blyth, Simon C.
2017-10-01
Opticks is an open source project that integrates the NVIDIA OptiX GPU ray tracing engine with Geant4 toolkit based simulations. Massive parallelism brings drastic performance improvements with optical photon simulation speedup expected to exceed 1000 times Geant4 when using workstation GPUs. Optical photon simulation time becomes effectively zero compared to the rest of the simulation. Optical photons from scintillation and Cherenkov processes are allocated, generated and propagated entirely on the GPU, minimizing transfer overheads and allowing CPU memory usage to be restricted to optical photons that hit photomultiplier tubes or other photon detectors. Collecting hits into standard Geant4 hit collections then allows the rest of the simulation chain to proceed unmodified. Optical physics processes of scattering, absorption, scintillator reemission and boundary processes are implemented in CUDA OptiX programs based on the Geant4 implementations. Wavelength dependent material and surface properties as well as inverse cumulative distribution functions for reemission are interleaved into GPU textures providing fast interpolated property lookup or wavelength generation. Geometry is provided to OptiX in the form of CUDA programs that return bounding boxes for each primitive and ray geometry intersection positions. Some critical parts of the geometry such as photomultiplier tubes have been implemented analytically with the remainder being tessellated. OptiX handles the creation and application of a choice of acceleration structures such as boundary volume hierarchies and the transparent use of multiple GPUs. OptiX supports interoperation with OpenGL and CUDA Thrust that has enabled unprecedented visualisations of photon propagations to be developed using OpenGL geometry shaders to provide interactive time scrubbing and CUDA Thrust photon indexing to enable interactive history selection.
GPU-accelerated phase-field simulation of dendritic solidification in a binary alloy
NASA Astrophysics Data System (ADS)
Yamanaka, Akinori; Aoki, Takayuki; Ogawa, Satoi; Takaki, Tomohiro
2011-03-01
The phase-field simulation for dendritic solidification of a binary alloy has been accelerated by using a graphics processing unit (GPU). To perform the phase-field simulation of alloy solidification on the GPU, a program code was developed with the compute unified device architecture (CUDA). In this paper, the implementation technique of the phase-field model on the GPU is presented. We also evaluated the acceleration performance of the three-dimensional solidification simulation by using a single NVIDIA TESLA C1060 GPU and the developed program code. The results showed that the GPU calculation for 576³ computational grid points achieved a performance of 170 GFLOPS by utilizing the shared memory as a software-managed cache. Furthermore, the computation with the GPU is 100 times faster than that with a single CPU core. From the obtained results, we confirmed the feasibility of realizing a real-time, fully three-dimensional phase-field simulation of microstructure evolution on a personal desktop computer.
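The inner loop of such a simulation is a stencil sweep. A deliberately simplified Allen-Cahn-style update in Python illustrates the structure; it omits the anisotropy and the coupled solute field a real dendrite model needs, and on the GPU each grid point maps to one thread, with tiles staged through shared memory as described above.

    import numpy as np

    n, dx, dt, eps2, tau = 256, 0.1, 1e-4, 2.5e-3, 1.0
    phi = np.zeros((n, n))
    phi[n // 2 - 4:n // 2 + 4, n // 2 - 4:n // 2 + 4] = 1.0   # seed crystal

    def laplacian(f):
        # Five-point stencil with periodic boundaries.
        return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4.0 * f) / dx ** 2

    for step in range(2000):
        # Explicit Euler relaxation toward a double-well free energy.
        phi += (dt / tau) * (eps2 * laplacian(phi)
                             + phi * (1.0 - phi) * (phi - 0.5))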
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naitoh, Masanori; Ujita, Hiroshi; Nagumo, Hiroichi
1997-07-01
The Nuclear Power Engineering Corporation (NUPEC) has initiated a long-term program to develop the simulation system "IMPACT" for analysis of hypothetical severe accidents in nuclear power plants. IMPACT employs advanced methods of physical modeling and numerical computation, and can simulate a wide spectrum of scenarios ranging from normal operation to hypothetical, beyond-design-basis-accident events. Designed as a large-scale system of interconnected, hierarchical modules, IMPACT's distinguishing features include mechanistic models based on first principles and high-speed simulation on parallel processing computers. The present plan is a ten-year program starting from 1993, consisting of an initial year of preparatory work followed by three technical phases: Phase-1 for development of a prototype system; Phase-2 for completion of the simulation system, incorporating new achievements from basic studies; and Phase-3 for refinement through extensive verification and validation against test results and available real plant data.
A Virtual Environment for Process Management. A Step by Step Implementation
ERIC Educational Resources Information Center
Mayer, Sergio Valenzuela
2003-01-01
This paper presents a virtual organizational environment conceived through the integration of three computer programs: a manufacturing simulation package, business process automation (workflow) software, and business intelligence (Balanced Scorecard) software. It was created as a supporting tool for teaching IE; its purpose is to give…
NASA Astrophysics Data System (ADS)
Hu, G. F.; Damanpack, A. R.; Bodaghi, M.; Liao, W. H.
2017-12-01
The main objective of this paper is to introduce a 4D printing method to program shape memory polymers (SMPs) during the fabrication process. Fused deposition modeling (FDM), a filament-based printing method, is employed to program the SMPs while the material is being deposited. This method is used to fabricate complicated polymeric structures with self-bending features without the need for any post-programming. Experiments are conducted to demonstrate the feasibility of 1D-to-2D and 2D-to-3D self-bending. It is shown that 3D-printed plate structures can transform into masonry-inspired 3D curved shell structures by simple heating. Good reliability of SMP programming during the printing process is also demonstrated. A 3D macroscopic constitutive model is established to simulate the thermo-mechanical features of the printed SMPs. Governing equations are also derived to simulate the programming mechanism during the printing process and the shape change of self-bending structures. In this respect, a finite element formulation is developed considering von-Kármán geometric nonlinearity and solved by implementing an iterative Newton-Raphson scheme. The accuracy of the computational approach is checked against experimental results, and it is demonstrated that the theoretical model is able to replicate the main characteristics observed in the experiments. This research is likely to advance the state of the art of FDM 4D printing, and provide pertinent results and a computational tool that are instrumental in the design of smart materials and structures with self-bending features.
NASA Technical Reports Server (NTRS)
Jones, L. D.
1979-01-01
The Space Environment Test Division Post-Test Data Reduction Program processes data from test history tapes generated on the Flexible Data System in the Space Environment Simulation Laboratory at the National Aeronautics and Space Administration/Lyndon B. Johnson Space Center. The program reads the tape's data base records to retrieve the item directory conversion file, the item capture file and the process link file to determine the active parameters. The desired parameter names are read in by lead cards after which the periodic data records are read to determine parameter data level changes. The data is considered to be compressed rather than full sample rate. Tabulations and/or a tape for generating plots may be output.
Just, Sarah; Toschkoff, Gregor; Funke, Adrian; Djuric, Dejan; Scharrer, Georg; Khinast, Johannes; Knop, Klaus; Kleinebudde, Peter
2013-03-01
Coating of solid dosage forms is an important unit operation in the pharmaceutical industry. In recent years, numerical simulations of drug manufacturing processes have been gaining interest as process analytical technology tools. The discrete element method (DEM) in particular is suitable to model tablet-coating processes. For the development of accurate simulations, information on the material properties of the tablets is required. In this study, the mechanical parameters Young's modulus, coefficient of restitution (CoR), and coefficients of friction (CoF) of gastrointestinal therapeutic systems (GITS) and of active-coated GITS were measured experimentally. The dynamic angle of repose of these tablets in a drum coater was investigated to revise the CoF. The resulting values were used as input data in DEM simulations to compare simulation and experiment. A mean value of Young's modulus of 31.9 MPa was determined by the uniaxial compression test. The CoR was found to be 0.78. For both tablet-steel and tablet-tablet friction, active-coated GITS showed a higher CoF compared with GITS. According to the values of the dynamic angle of repose, the CoF was adjusted to obtain consistent tablet motion in the simulation and in the experiment. On the basis of this experimental characterization, mechanical parameters are integrated into DEM simulation programs to perform numerical analysis of coating processes.
Poeter, Eileen E.; Hill, Mary C.; Banta, Edward R.; Mehl, Steffen; Christensen, Steen
2006-01-01
This report documents the computer codes UCODE_2005 and six post-processors. Together the codes can be used with existing process models to perform sensitivity analysis, data needs assessment, calibration, prediction, and uncertainty analysis. Any process model or set of models can be used; the only requirements are that models have numerical (ASCII or text only) input and output files, that the numbers in these files have sufficient significant digits, that all required models can be run from a single batch file or script, and that simulated values are continuous functions of the parameter values. Process models can include pre-processors and post-processors as well as one or more models related to the processes of interest (physical, chemical, and so on), making UCODE_2005 extremely powerful. An estimated parameter can be a quantity that appears in the input files of the process model(s), or a quantity used in an equation that produces a value that appears in the input files. In the latter situation, the equation is user-defined. UCODE_2005 can compare observations and simulated equivalents. The simulated equivalents can be any simulated value written in the process-model output files or can be calculated from simulated values with user-defined equations. The quantities can be model results, or dependent variables. For example, for ground-water models they can be heads, flows, concentrations, and so on. Prior, or direct, information on estimated parameters also can be considered. Statistics are calculated to quantify the comparison of observations and simulated equivalents, including a weighted least-squares objective function. In addition, data-exchange files are produced that facilitate graphical analysis. UCODE_2005 can be used fruitfully in model calibration through its sensitivity analysis capabilities and its ability to estimate parameter values that result in the best possible fit to the observations. Parameters are estimated using nonlinear regression: a weighted least-squares objective function is minimized with respect to the parameter values using a modified Gauss-Newton method or a double-dogleg technique. Sensitivities needed for the method can be read from files produced by process models that can calculate sensitivities, such as MODFLOW-2000, or can be calculated by UCODE_2005 using a more general, but less accurate, forward- or central-difference perturbation technique. Problems resulting from inaccurate sensitivities and solutions related to the perturbation techniques are discussed in the report. Statistics are calculated and printed for use in (1) diagnosing inadequate data and identifying parameters that probably cannot be estimated; (2) evaluating estimated parameter values; and (3) evaluating how well the model represents the simulated processes. Results from UCODE_2005 and codes RESIDUAL_ANALYSIS and RESIDUAL_ANALYSIS_ADV can be used to evaluate how accurately the model represents the processes it simulates. Results from LINEAR_UNCERTAINTY can be used to quantify the uncertainty of model simulated values if the model is sufficiently linear. Results from MODEL_LINEARITY and MODEL_LINEARITY_ADV can be used to evaluate model linearity and, thereby, the accuracy of the LINEAR_UNCERTAINTY results. UCODE_2005 can also be used to calculate nonlinear confidence and predictions intervals, which quantify the uncertainty of model simulated values when the model is not linear. 
CORFAC_PLUS can be used to produce factors that allow intervals to account for model intrinsic nonlinearity and small-scale variations in system characteristics that are not explicitly accounted for in the model or the observation weighting. The six post-processing programs are independent of UCODE_2005 and can use the results of other programs that produce the required data-exchange files. UCODE_2005 and the other six codes are intended for use on any computer operating system.
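The core regression scheme can be sketched briefly. The Python below implements an undamped Gauss-Newton step on a weighted least-squares objective with forward-difference sensitivities, standing in for the perturbation option described above; the real code adds damping safeguards, convergence tests, and the double-dogleg alternative, and a toy two-parameter model takes the place of a process model.

    import numpy as np

    def gauss_newton(run_model, p0, obs, weights, n_iter=8, rel_step=0.01):
        p = np.asarray(p0, dtype=float)
        W = np.diag(np.asarray(weights, dtype=float))
        for _ in range(n_iter):
            sim = run_model(p)
            r = obs - sim                          # weighted LSQ residuals
            J = np.empty((obs.size, p.size))
            for j in range(p.size):                # forward-difference sensitivities
                dp = np.zeros_like(p)
                dp[j] = rel_step * max(abs(p[j]), 1e-8)
                J[:, j] = (run_model(p + dp) - sim) / dp[j]
            # Normal equations of the weighted problem: (J'WJ) step = J'W r
            p = p + np.linalg.solve(J.T @ W @ J, J.T @ W @ r)
        return p

    t = np.linspace(0.0, 10.0, 20)
    model = lambda p: p[0] * np.exp(-p[1] * t)     # toy "process model"
    obs = model(np.array([3.0, 0.4]))
    print(gauss_newton(model, [2.0, 0.5], obs, np.ones_like(t)))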
STEPS: Modeling and Simulating Complex Reaction-Diffusion Systems with Python
Wils, Stefan; Schutter, Erik De
2008-01-01
We describe how the use of the Python language improved the user interface of the program STEPS. STEPS is a simulation platform for modeling and stochastic simulation of coupled reaction-diffusion systems with complex 3-dimensional boundary conditions. Setting up such models is a complicated process that consists of many phases. Initial versions of STEPS relied on a static input format that did not cleanly separate these phases, limiting modelers in how they could control the simulation and becoming increasingly complex as new features and new simulation algorithms were added. We solved all of these problems by tightly integrating STEPS with Python, using SWIG to expose our existing simulation code. PMID:19623245
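The underlying stochastic simulation step is easy to show in isolation. Below is a well-mixed Gillespie direct-method loop for a reversible association reaction; STEPS extends this kind of SSA to coupled reaction-diffusion across subvolumes, and the counts and rate constants here are arbitrary illustrative values.

    import math
    import random

    random.seed(5)
    x = {"A": 100, "B": 80, "C": 0}          # molecule counts
    kf, kb = 0.005, 0.1                      # stochastic rate constants
    t, t_end = 0.0, 50.0
    while t < t_end:
        a1 = kf * x["A"] * x["B"]            # propensity of A + B -> C
        a2 = kb * x["C"]                     # propensity of C -> A + B
        a0 = a1 + a2
        if a0 == 0.0:
            break
        t += -math.log(1.0 - random.random()) / a0   # exponential waiting time
        if random.random() * a0 < a1:
            x["A"] -= 1; x["B"] -= 1; x["C"] += 1
        else:
            x["A"] += 1; x["B"] += 1; x["C"] -= 1
    print(t, x)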
Programming PHREEQC calculations with C++ and Python: a comparative study
Charlton, Scott R.; Parkhurst, David L.; Muller, Mike
2011-01-01
The new IPhreeqc module provides an application programming interface (API) to facilitate coupling of other codes with the U.S. Geological Survey geochemical model PHREEQC. Traditionally, loose coupling of PHREEQC with other applications required methods to create PHREEQC input files, start external PHREEQC processes, and process PHREEQC output files. IPhreeqc eliminates most of this effort by providing direct access to PHREEQC capabilities through a component object model (COM), a library, or a dynamically linked library (DLL). Input and calculations can be specified through internally programmed strings, and all data exchange between an application and the module can occur in computer memory. This study compares simulations programmed in C++ and Python that are tightly coupled with IPhreeqc modules to the traditional simulations that are loosely coupled to PHREEQC. The study compares performance, quantifies effort, and evaluates lines of code and the complexity of the design. The comparisons show that IPhreeqc offers a more powerful and simpler approach for incorporating PHREEQC calculations into transport models and other applications that need to perform PHREEQC calculations. The IPhreeqc module facilitates the design of coupled applications and significantly reduces run times. Even a moderate knowledge of one of the supported programming languages allows more efficient use of PHREEQC than the traditional loosely coupled approach.
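For the Python side, a coupling sketch in the spirit of the comparison above; it assumes the third-party phreeqpy wrapper around the IPhreeqc library, and the module path, database filename, and method names should be checked against the wrapper actually installed.

    from phreeqpy.iphreeqc.phreeqc_dll import IPhreeqc   # assumed wrapper

    ip = IPhreeqc()
    ip.load_database("phreeqc.dat")          # database shipped with PHREEQC
    ip.run_string("""
    SOLUTION 1
        temp 25
        pH 7
        Ca 1.0
        C(4) 2.0
    EQUILIBRIUM_PHASES 1
        Calcite 0 10
    SELECTED_OUTPUT
        -totals Ca
    END
    """)
    # All exchange happens in memory; no PHREEQC input/output files needed.
    print(ip.get_selected_output_array())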
14 CFR 60.5 - Quality management system.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) AIRMEN FLIGHT SIMULATION TRAINING DEVICE INITIAL AND CONTINUING QUALIFICATION AND USE § 60.5 Quality... regular basis as described in QPS appendix E of this part. (b) The QMS program must provide a process for...
A Simulation Based Methodology to Examine the B-1B’s AN/ALQ-161 Maintenance Process
2010-03-01
“The Air Force does not think of, or advertise, bombers as interchangeable. The B-1, B-2 and B-52 all have a specific mission area.” Techniques referenced include Critical Path Modeling (CPM), Goal Programming, EOQ, and Nonlinear Programming.
Markov Chains For Testing Redundant Software
NASA Technical Reports Server (NTRS)
White, Allan L.; Sjogren, Jon A.
1990-01-01
A preliminary design was developed for a validation experiment that addresses problems unique to assuring extremely high quality of multiple-version programs in process-control software. The approach takes into account the inertia of the controlled system, in the sense that more than one failure of the control program is required to cause the controlled system to fail. The verification procedure consists of two steps: experimentation (numerical simulation) and computation, with a Markov model for each step.
Water Hammer Simulations of Monomethylhydrazine Propellant
NASA Technical Reports Server (NTRS)
Burkhardt, Zachary; Ramachandran, N.; Majumdar, A.
2017-01-01
Fluid transient analysis is important in the design of spacecraft propulsion systems to ensure the structural stability of the system in the event of sudden valve closing or opening. The Generalized Fluid System Simulation Program (GFSSP), a general-purpose flow network code developed at NASA/MSFC, is capable of simulating the pressure surge due to sudden opening or closing of a valve when thermodynamic properties of the real fluid are available for the entire range of simulation. Specifically, GFSSP needs an accurate representation of the pressure-density relationship in order to predict the pressure surge during a fluid transient. Unfortunately, the available thermodynamic property programs such as REFPROP, GASP, or GASPAK do not provide the thermodynamic properties of monomethylhydrazine (MMH). This work illustrates the process used to build, from the available properties and the speed of sound, the customized table of state-variable properties that GFSSP requires for simulation. Good agreement was found between the simulations and measured data. This method can be adopted for modeling flow networks and systems with other fluids whose properties are not known in detail, in order to obtain general technical insight.
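The table construction leans on the thermodynamic relation that, along an isentrope, drho/dP = 1/a². A sketch of building such a table in Python follows, with an assumed reference density and a made-up sound-speed curve standing in for the MMH data assembled by the authors.

    import numpy as np

    p = np.linspace(1.0e5, 2.0e7, 200)       # pressure grid, Pa
    a = 1510.0 + 1.0e-6 * (p - 1.0e5)        # assumed speed of sound, m/s
    rho = np.empty_like(p)
    rho[0] = 874.0                           # assumed reference density, kg/m3
    for i in range(1, p.size):
        # d(rho)/dP = 1 / a**2, integrated with a simple forward step
        rho[i] = rho[i - 1] + (p[i] - p[i - 1]) / a[i - 1] ** 2
    table = np.column_stack([p, rho])        # pressure-density pairs for the solver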
NASA Astrophysics Data System (ADS)
Destyanto, A. R.; Putri, O. A.; Hidayatno, A.
2017-11-01
Because of the advantages that serious simulation games offer, many areas of study, including energy, have used them as instruments. However, serious simulation games in the field of energy transition have still received little attention. In this study, a serious simulation game is developed and tested as a public-education activity about energy transition, here a program of conversion from oil to natural gas. The aim of the game is to create understanding and awareness of the importance of energy transition for society, in order to accelerate the energy transition in Indonesia: since 1987 the energy transition program has not achieved its conversion target, owing to the lack of education about energy transition for society. Developed as a digital serious simulation game following the framework of integrated game design, the Transergy game has been tested with 15 users and then analysed. The results of verification and validation show that Transergy helps users understand and recognize the need for oil-to-natural-gas conversion.
Cryotherapy simulator for localized prostate cancer.
Hahn, James K; Manyak, Michael J; Jin, Ge; Kim, Dongho; Rewcastle, John; Kim, Sunil; Walsh, Raymond J
2002-01-01
Cryotherapy is a treatment modality that uses a technique to selectively freeze tissue and thereby cause controlled tissue destruction. The procedure involves placement of multiple small diameter probes through the perineum into the prostate tissue at selected spatial intervals. Transrectal ultrasound is used to properly position the cylindrical probes before activation of the liquid Argon cooling element, which lowers the tissue temperature below -40 degrees Centigrade. Tissue effect is monitored by transrectal ultrasound changes as well as thermocouples placed in the tissue. The computer-based cryotherapy simulation system mimics the major surgical steps involved in the procedure. The simulated real-time ultrasound display is generated from 3-D ultrasound datasets where the interaction of the ultrasound with the instruments as well as the frozen tissue is simulated by image processing. The thermal and mechanical simulations of the tissue are done using a modified finite-difference/finite-element method optimized for real-time performance. The simulator developed is a part of a comprehensive training program, including a computer-based learning system and hands-on training program with a proctor, designed to familiarize the physician with the technique and equipment involved.
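A stripped-down finite-difference sketch of the thermal side of such a simulator, assuming a 2-D tissue grid, a fixed-temperature probe node, and the -40 degree lethal threshold mentioned above; the real simulator adds latent heat, perfusion, and real-time constraints, and the probe temperature below is an assumed value.

    import numpy as np

    n, dx, dt, alpha = 100, 1.0e-3, 0.01, 1.4e-7   # grid, m, s, m2/s (tissue-like)
    T = np.full((n, n), 37.0)                      # body temperature, deg C
    probe = (slice(n // 2 - 1, n // 2 + 1),) * 2   # small probe cross-section

    for step in range(5000):
        T[probe] = -160.0                          # cryoprobe tip (assumed value)
        lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
               np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4.0 * T) / dx ** 2
        T += alpha * dt * lap                      # explicit heat-equation step
        T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = 37.0   # warm far field

    print((T <= -40.0).sum(), "cells inside the lethal isotherm")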
Nonlinear relaxation algorithms for circuit simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saleh, R.A.
Circuit simulation is an important computer-aided design (CAD) tool in the design of integrated circuits (ICs). However, the standard techniques used in programs such as SPICE result in very long computer run times when applied to large problems. In order to reduce the overall run time, a number of new approaches to circuit simulation were developed and are described. These methods are based on nonlinear relaxation techniques and exploit the relative inactivity of large circuits. Simple waveform-processing techniques are described to determine the maximum possible speed improvement that can be obtained by exploiting this property of large circuits. Three simulation algorithms are described, two of which are based on the Iterated Timing Analysis (ITA) method and a third based on the Waveform-Relaxation Newton (WRN) method. New programs that incorporate these techniques were developed and used to simulate a variety of industrial circuits, and the results from these simulations are provided. The techniques are shown to be much faster than the standard approach. In addition, a number of parallel aspects of these algorithms are described, and a general space-time model of parallel-task scheduling is developed.
The Distributed Diagonal Force Decomposition Method for Parallelizing Molecular Dynamics Simulations
Boršnik, Urban; Miller, Benjamin T.; Brooks, Bernard R.; Janežič, Dušanka
2011-01-01
Parallelization is an effective way to reduce the computational time needed for molecular dynamics simulations. We describe a new parallelization method, the distributed-diagonal force decomposition method, with which we extend and improve the existing force decomposition methods. Our new method requires less data communication during molecular dynamics simulations than replicated data and current force decomposition methods, increasing the parallel efficiency. It also dynamically load-balances the processors' computational load throughout the simulation. The method is readily implemented in existing molecular dynamics codes and it has been incorporated into the CHARMM program, allowing its immediate use in conjunction with the many molecular dynamics simulation techniques that are already present in the program. We also present the design of the Force Decomposition Machine, a cluster of personal computers and networks that is tailored to running molecular dynamics simulations using the distributed diagonal force decomposition method. The design is expandable and provides various degrees of fault resilience. This approach is easily adaptable to computers with Graphics Processing Units because it is independent of the processor type being used. PMID:21793007
Numerical simulation of the processes in the normal incidence tube for high acoustic pressure levels
NASA Astrophysics Data System (ADS)
Fedotov, E. S.; Khramtsov, I. V.; Kustov, O. Yu.
2016-10-01
Numerical simulation of the acoustic processes in an impedance tube at high levels of acoustic pressure is one way to address the problem of noise suppression by liners. These studies used a liner specimen consisting of a single cylindrical Helmholtz resonator. The real and imaginary parts of the liner's acoustic impedance and its sound absorption coefficient were evaluated for sound pressure levels of 130, 140, and 150 dB. The numerical simulation used experimental data obtained in an impedance tube with normal-incidence waves. At the first stage of the numerical simulation, the linearized Navier-Stokes equations were used; they describe the imaginary part of the liner impedance well whatever the sound pressure level. These equations were solved by the finite element method in the COMSOL Multiphysics program in an axisymmetric formulation. At the second stage, the complete Navier-Stokes equations were solved by direct numerical simulation in ANSYS CFX in an axisymmetric formulation. As a result, acceptable agreement between numerical simulation and experiment was obtained.
Numerical simulation of controlled directional solidification under microgravity conditions
NASA Astrophysics Data System (ADS)
Holl, S.; Roos, D.; Wein, J.
The computer-assisted simulation of solidification processes influenced by gravity has gained importance in recent years in ground-based as well as microgravity research. Depending on the specific needs of the investigator, the simulation model ideally covers a broad spectrum of applications. These primarily include the optimization of furnace design in interaction with selected process parameters to meet the desired crystallization conditions. Different approaches concerning the complexity of the simulation models as well as their dedicated applications will be discussed in this paper. Special emphasis will be put on the potential of software tools to increase the scientific quality and cost-efficiency of microgravity experimentation. The results gained so far from TEXUS, FSLP, D-1, and D-2 (preparatory program) experiments will be discussed, highlighting their simulation-supported preparation and evaluation. An outlook will then be given on the possibilities for enhancing the efficiency of pre-industrial research in the Columbus era through the incorporation of suitable simulation methods and tools.
BIOASPEN: System for technology development
NASA Technical Reports Server (NTRS)
1986-01-01
The public version of ASPEN was installed on the VAX 11/750 computer. To examine the idea of BIOASPEN, a test example (the manufacture of acetone, butanol, and ethanol through a biological route) was chosen for simulation. Previous reports on the BIOASPEN project revealed the limitations of ASPEN in modeling this process. To overcome some of the difficulties, modules were written for the acid and enzyme hydrolyzers, the fermentor, and a sterilizer. Information required for these modules was obtained from the literature whenever possible. Additional support modules necessary for interfacing with ASPEN were also written. Some ASPEN subroutines were themselves altered in order to ensure the correct running of the simulation program. After testing of these additions and changes was completed, the Acetone-Butanol-Ethanol (ABE) process was simulated. A release of ASPEN that contained the Economic Subsystem was obtained and installed. This subsystem was tested and numerous changes were made in the FORTRAN code. Capital investment and operating cost studies were performed on the ABE process. Some alternatives in certain steps of the ABE simulation were investigated in order to elucidate their effects on the overall economics of the process.
Remote control system for high-performance computer simulation of crystal growth by the PFC method
NASA Astrophysics Data System (ADS)
Pavlyuk, Evgeny; Starodumov, Ilya; Osipov, Sergei
2017-04-01
Modeling of the crystallization process by the phase field crystal (PFC) method is one of the important directions of modern computational materials science. In this paper, the practical side of computer simulation of the crystallization process by the PFC method is investigated. To solve problems using this method, it is necessary to use high-performance computing clusters, data storage systems, and other often expensive and complex computer systems. Access to such resources is often limited, unstable, and accompanied by various administrative problems. In addition, the variety of software and settings across different computing clusters sometimes prevents researchers from using a unified program code; the code must be adapted to each configuration of the computing complex. The practical experience of the authors has shown that the creation of a special control system for computing, with the possibility of remote use, can greatly simplify the implementation of simulations and increase the productivity of scientific research. In the current paper we present the principal idea of such a system and justify its efficiency.
A Case Study Using Modeling and Simulation to Predict Logistics Supply Chain Issues
NASA Technical Reports Server (NTRS)
Tucker, David A.
2007-01-01
Optimization of critical supply chains to deliver thousands of parts, materials, sub-assemblies, and vehicle structures as needed is vital to the success of the Constellation Program. Thorough analysis needs to be performed on the integrated supply chain processes to plan, source, make, deliver, and return critical items efficiently. Process modeling provides simulation technology-based, predictive solutions for supply chain problems which enable decision makers to reduce costs, accelerate cycle time and improve business performance. For example, United Space Alliance, LLC utilized this approach in late 2006 to build simulation models that recreated shuttle orbiter thruster failures and predicted the potential impact of thruster removals on logistics spare assets. The main objective was the early identification of possible problems in providing thruster spares for the remainder of the Shuttle Flight Manifest. After extensive analysis the model results were used to quantify potential problems and led to improvement actions in the supply chain. Similarly the proper modeling and analysis of Constellation parts, materials, operations, and information flows will help ensure the efficiency of the critical logistics supply chains and the overall success of the program.
A constitutive model and numerical simulation of sintering processes at macroscopic level
NASA Astrophysics Data System (ADS)
Wawrzyk, Krzysztof; Kowalczyk, Piotr; Nosewicz, Szymon; Rojek, Jerzy
2018-01-01
This paper presents modelling of both single- and double-phase powder sintering processes at the macroscopic level. In particular, its constitutive formulation, numerical implementation, and numerical tests are described. The macroscopic constitutive model is based on the assumption that the sintered material is a continuous medium. The parameters of the constitutive model for material under sintering are determined by simulation of sintering at the microscopic level using a micro-scale model. Numerical tests were carried out for a cylindrical specimen under hydrostatic and uniaxial pressure. Results of the macroscopic analysis are compared against the microscopic model results. Moreover, the numerical simulations are validated by comparison with experimental results. The simulations and preparation of the model are carried out in Abaqus FEA, a software package for finite element analysis and computer-aided engineering. The mechanical model is defined by the user subroutine "VUMAT", developed by the first author in the Fortran programming language. The modelling presented in the paper can be used to optimize the process and to understand it better.
GTKDynamo: a PyMOL plug-in for QC/MM hybrid potential simulations
Bachega, José Fernando R.; Timmers, Luís Fernando S.M.; Assirati, Lucas; Bachega, Leonardo R.; Field, Martin J.; Wymore, Troy
2014-01-01
Hybrid quantum chemical (QC)/molecular mechanical (MM) potentials are very powerful tools for molecular simulation. They are especially useful for studying processes in condensed phase systems, such as chemical reactions, that involve a relatively localized change in electronic structure and where the surrounding environment contributes to these changes but can be represented with more computationally efficient functional forms. Despite their utility, however, these potentials are not always straightforward to apply since the extent of significant electronic structure changes occurring in the condensed phase process may not be intuitively obvious. To facilitate their use we have developed an open-source graphical plug-in, GTKDynamo, that links the PyMOL visualization program and the pDynamo QC/MM simulation library. This article describes the implementation of GTKDynamo and its capabilities and illustrates its application to QC/MM simulations. PMID:24137667
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koopman, D.
2011-07-14
A program was conducted to systematically evaluate potential impacts of the proposed Small Column Ion Exchange (SCIX) process on the Defense Waste Processing Facility (DWPF) Chemical Processing Cell (CPC). The program involved a series of interrelated tasks. Past studies of the impact of crystalline silicotitanate (CST) and monosodium titanate (MST) on DWPF were reviewed. Paper studies and material balance calculations were used to establish reasonable bounding levels of CST and MST in sludge. Following the paper studies, Sludge Batch 10 (SB10) simulant was modified to have both bounding and intermediate levels of MST and ground CST. The SCIX flow sheet includes grinding of the CST, which is larger than DWPF frit when not ground. Nominal ground CST was not yet available; therefore, a similar CST ground previously at the Savannah River National Laboratory (SRNL) was used. It was believed that this CST was over-ground and that it would bound the impact of nominal CST on sludge slurry properties. Lab-scale simulations of the DWPF CPC were conducted using SB10 simulants with no, intermediate, and bounding levels of CST and MST. Tests included both the Sludge Receipt and Adjustment Tank (SRAT) and Slurry Mix Evaporator (SME) cycles. Simulations were performed at high and low acid stoichiometry. A demonstration of the extended CPC flowsheet was made that included streams from the site interim salt processing operations. A simulation using irradiated CST and MST was also completed. An extensive set of rheological measurements was made to search for potential adverse consequences of CST and MST on slurry rheology in the CPC. The SCIX CPC impact program was conducted in parallel with a program to evaluate the impact of SCIX on the final DWPF glass waste form and on DWPF melter throughput. The studies must be considered together when evaluating the full impact of SCIX on DWPF. Because the alternative flowsheet for DWPF has not been selected, this study did not consider the impact of proposed future alternative DWPF CPC flowsheets. The impact of the SCIX streams on DWPF processing using the selected flowsheet needs to be considered as part of the technical baseline studies for coupled processing with that flowsheet. In addition, the downstream impact of aluminum dissolution on waste containing CST and MST has not yet been evaluated. The current baseline would not subject CST to the aluminum dissolution process, and technical concerns with performing the dissolution with CST have been expressed. Should this option become feasible, the downstream impact should be considered. The main area of concern for DWPF from aluminum dissolution is an impact on rheology. The SCIX project is planning for SRNL to complete MST, CST, and sludge rheology testing to evaluate any expected changes. The impact of ground CST transport and flush water on the DWPF CPC feed tank (and the potential need for decanting) has not been defined or studied.
Water recovery and management test support modeling for Space Station Freedom
NASA Technical Reports Server (NTRS)
Mohamadinejad, Habib; Bacskay, Allen S.
1990-01-01
The water-recovery and management (WRM) subsystem proposed for the Space Station Freedom program is outlined, and its computerized modeling and simulation based on a Computer Aided System Engineering and Analysis (CASE/A) program are discussed. A WRM test model consisting of a pretreated urine processing (TIMES), hygiene water processing (RO), RO brine processing using TIMES, and hygiene water storage is presented. Attention is drawn to such end-user equipment characteristics as the shower, dishwasher, clotheswasher, urine-collection facility, and handwash. The transient behavior of pretreated-urine, RO waste-hygiene, and RO brine tanks is assessed, as well as the total input/output to or from the system. The model is considered to be beneficial for pretest analytical predictions as a program cost-saving feature.
A Tool to Simulate the Transmission, Reception, and Execution of Interactive TV Applications
Kulesza, Raoni; Rodrigues, Thiago; Machado, Felipe A. L.; Santos, Celso A. S.
2017-01-01
The emergence of Interactive Digital Television (iDTV) opened a set of technological possibilities that go beyond those offered by conventional TV. Among these opportunities we can highlight interactive content that runs together with the linear TV program (a television service where the viewer has to watch a scheduled TV program at the particular time it is offered and on the particular channel it is presented on). However, developing interactive content for this new platform is not as straightforward as, for example, developing Internet applications. One of the options to make this development process easier and safer is to use an iDTV simulator. However, after having investigated some of the existing iDTV simulation environments, we found a limitation: these simulators mainly present solutions focused on the TV receiver, whose interactive content must be loaded in advance by the programmer into a local repository (e.g., hard drive, USB). Therefore, in this paper we propose a tool, named BiS (Broadcast iDTV content Simulator), which makes possible a broader solution for the simulation of interactive content. It allows simulating the transmission of interactive content along with the linear TV program (simulating the transmission of content over the air and in broadcast to the receivers). To enable this, we defined a generic and easy-to-customize communication protocol that was implemented in the tool. The proposed environment differs from others because it allows simulating the reception of both linear content and interactive content while running Java applications to allow such content presentation. PMID:28280770
History of the numerical aerodynamic simulation program
NASA Technical Reports Server (NTRS)
Peterson, Victor L.; Ballhaus, William F., Jr.
1987-01-01
The Numerical Aerodynamic Simulation (NAS) program has reached a milestone with the completion of the initial operating configuration of the NAS Processing System Network. This achievement is the first major milestone in the continuing effort to provide a state-of-the-art supercomputer facility for the national aerospace community and to serve as a pathfinder for the development and use of future supercomputer systems. The underlying factors that motivated the initiation of the program are first identified and then discussed. These include the emergence and evolution of computational aerodynamics as a powerful new capability in aerodynamics research and development, the computer power required for advances in the discipline, the complementary nature of computation and wind tunnel testing, and the need for the government to play a pathfinding role in the development and use of large-scale scientific computing systems. Finally, the history of the NAS program is traced from its inception in 1975 to the present time.
Cosmic dust analog simulation in a microgravity environment: The STARDUST program
NASA Technical Reports Server (NTRS)
Ferguson, F.; Lilleleht, L. U.; Nuth, J.; Stephens, J. R.; Bussoletti, E.; Carotenuto, L.; Colangeli, L.; Dell'aversana, P.; Mele, F.; Mennella, V.
1995-01-01
We have undertaken a project called STARDUST, a collaboration between Italian and American investigators. The goals of this program are to study the condensation and coagulation of refractory materials from the vapor and to study the properties of the resulting grains as analogs to cosmic dust particles. To reduce thermal convective currents and to develop valuable experience in designing an experiment for the Gas-Grain Simulation Facility aboard Space Station Freedom, we have built and flown a new chamber to study these processes during the periods of microgravity available on NASA's KC-135 Research Aircraft. Preliminary results from flights with magnesium and zinc are discussed.
NASA Technical Reports Server (NTRS)
Salas, Manuel D.
2007-01-01
The research program of the aerodynamics, aerothermodynamics and plasmadynamics discipline of NASA's Hypersonic Project is reviewed. Details are provided for each of its three components: 1) development of physics-based models of non-equilibrium chemistry, surface catalytic effects, turbulence, transition and radiation; 2) development of advanced simulation tools to enable increased spatial and time accuracy, increased geometrical complexity, grid adaptation, increased physical-processes complexity, uncertainty quantification and error control; and 3) establishment of experimental databases from ground and flight experiments to develop better understanding of high-speed flows and to provide data to validate and guide the development of simulation tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Badal, Andreu; Badano, Aldo
Purpose: It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: the use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). Methods: A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). Results: An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed-up factor was obtained using a GPU compared to a single-core CPU. Conclusions: The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
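The independence of photon histories is what makes this class of simulation map so naturally onto thousands of GPU threads. As a minimal, hedged illustration of the underlying principle (not the authors' PENELOPE/CUDA code), the following Python sketch samples exponential free paths for a batch of photons through a homogeneous slab and compares the transmitted fraction with the analytic attenuation law; all parameters are hypothetical.

```python
# Toy illustration of why photon-transport Monte Carlo parallelizes so well:
# every history is independent.  Here free paths through a homogeneous slab
# are sampled from the exponential attenuation law and the transmitted
# fraction is compared with exp(-mu*d).  All parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

def transmitted_fraction(mu, thickness, n_photons):
    s = -np.log(rng.random(n_photons)) / mu   # sampled free paths [cm]
    return np.mean(s > thickness)             # photons crossing uncollided

mu, d = 0.2, 5.0                              # attenuation [1/cm], slab [cm]
mc = transmitted_fraction(mu, d, 1_000_000)
print(f"Monte Carlo: {mc:.4f}   analytic exp(-mu*d): {np.exp(-mu*d):.4f}")
```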
NASA Astrophysics Data System (ADS)
Destyanto, A. R.; Silalahi, T. D.; Hidayatno, A.
2017-11-01
System dynamics modeling is widely used to predict and simulate energy systems in several countries. One of the applications of system dynamics is the evaluation of national energy policy alternatives and energy efficiency analysis. Using system dynamics modeling, this research evaluates the energy transition policy implemented in Indonesia in the past conversion program from kerosene to LPG for household cooking fuel, which is considered a successful energy transition program and has been implemented since 2007. This research is important because Indonesia is considered not yet to have succeeded in executing another energy transition program, the conversion from oil fuel to gas fuel for transportation, which started in 1989. The aim of this research is to explore which policy interventions contributed significantly to supporting, or even blocking, the conversion program. Findings from the simulation show that the policy intervention of withdrawing the kerosene supply, together with a government push to increase the production capacity of the supporting equipment industries (gas stoves, regulators, and LPG cylinders), was the main influence on the success of the conversion program.
PyCOOL — A Cosmological Object-Oriented Lattice code written in Python
NASA Astrophysics Data System (ADS)
Sainio, J.
2012-04-01
There are a number of different phenomena in the early universe that have to be studied numerically with lattice simulations. This paper presents a graphics processing unit (GPU) accelerated Python program called PyCOOL that solves the evolution of scalar fields in a lattice with very precise symplectic integrators. The program has been written with the intention to hit a sweet spot of speed, accuracy and user friendliness. This has been achieved by using the Python language with the PyCUDA interface to make a program that is easy to adapt to different scalar field models. In this paper we derive the symplectic dynamics that govern the evolution of the system and then present the implementation of the program in Python and PyCUDA. The functionality of the program is tested in a chaotic inflation preheating model, a single field oscillon case and in a supersymmetric curvaton model which leads to Q-ball production. We have also compared the performance of a consumer graphics card to a professional Tesla compute card in these simulations. We find that the program is not only accurate but also very fast. To further increase the usefulness of the program we have equipped it with numerous post-processing functions that provide useful information about the cosmological model. These include various spectra and statistics of the fields. The program can be additionally used to calculate the generated curvature perturbation. The program is publicly available under GNU General Public License at https://github.com/jtksai/PyCOOL. Some additional information can be found from http://www.physics.utu.fi/tiedostot/theory/particlecosmology/pycool/.
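As a rough illustration of the kind of update such a lattice code performs (not PyCOOL's actual integrators, which are higher order and GPU-resident), the sketch below evolves a single scalar field on a periodic lattice with a second-order symplectic kick-drift-kick scheme; the quartic potential and all parameters are assumptions for the example.

```python
# Sketch of a second-order symplectic (kick-drift-kick) update for a single
# scalar field on a periodic lattice.  The quartic potential and all
# parameters are assumptions for the example, not a PyCOOL model file.
import numpy as np

N, dx, dt, m2, lam = 32, 1.0, 0.01, 1.0, 0.1
rng = np.random.default_rng(0)
phi = 1e-3 * rng.standard_normal((N, N, N))   # scalar field
pi = np.zeros_like(phi)                       # conjugate momentum

def laplacian(f):
    """Nearest-neighbour lattice Laplacian with periodic boundaries."""
    return sum(np.roll(f, s, axis=a) for a in range(3) for s in (1, -1)) - 6.0 * f

def force(f):
    """Gradient term minus dV/dphi for V = m2/2 phi^2 + lam/4 phi^4."""
    return laplacian(f) / dx**2 - m2 * f - lam * f**3

for step in range(1000):
    pi += 0.5 * dt * force(phi)   # half kick
    phi += dt * pi                # drift
    pi += 0.5 * dt * force(phi)   # half kick

print("field variance after evolution:", float(phi.var()))
```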
Design-based research in designing the model for educating simulation facilitators.
Koivisto, Jaana-Maija; Hannula, Leena; Bøje, Rikke Buus; Prescott, Stephen; Bland, Andrew; Rekola, Leena; Haho, Päivi
2018-03-01
The purpose of this article is to introduce the concept of design-based research, its appropriateness in creating education-based models, and to describe the process of developing such a model. The model was designed as part of the Nurse Educator Simulation based learning project, funded by the EU's Lifelong Learning program (2013-1-DK1-LEO05-07053). The project partners were VIA University College, Denmark, the University of Huddersfield, UK and Metropolia University of Applied Sciences, Finland. As an outcome of the development process, "the NESTLED model for educating simulation facilitators" (NESTLED model) was generated. This article also illustrates five design principles that could be applied to other pedagogies. Copyright © 2018 Elsevier Ltd. All rights reserved.
Urology technical and non-technical skills development: the emerging role of simulation.
Rashid, Prem; Gianduzzo, Troy R J
2016-04-01
To review the emerging role of technical and non-technical simulation in urological education and training. A review was conducted to examine the current role of simulation in urology training. A PUBMED search of the terms 'urology training', 'urology simulation' and 'urology education' revealed 11,504 titles. Three hundred and fifty-seven abstracts were identified as English-language, peer-reviewed papers pertaining to the role of simulation in urology and related topics. Key papers were used to explore themes. Some cross-referenced papers were also included. There is an ongoing need to ensure that training time is efficiently utilised while ensuring that optimal technical and non-technical skills are achieved. Changing working conditions and the need to minimise patient harm from inadvertent errors must be taken into account. Simulation models for specific technical aspects have been the mainstay of graduated, step-wise, low- and high-fidelity training. Whole-scenario environments as well as non-technical aspects can be slowly incorporated into the curriculum. Doing so should also help define competencies that have been challenging to teach and evaluate. Dedicated time, resources and trainer up-skilling are important. Concurrent studies are needed to help evaluate the effectiveness of introducing step-wise simulation for technical and non-technical competencies. Simulation-based learning remains the best avenue for advancing surgical education. Technical and non-technical simulation could be used in the selection process. There are good economic, logistic and safety reasons to pursue the ongoing development of simulation co-curricula. While the role of simulation is assured, its progress will depend on a structured program that takes advantage of what can be delivered via this medium. Overall, simulation can be developed further for urological training programs to encompass technical and non-technical skill development at all stages, including recertification. © 2015 The Authors BJU International © 2015 BJU International Published by John Wiley & Sons Ltd.
Massively Parallel Processing for Fast and Accurate Stamping Simulations
NASA Astrophysics Data System (ADS)
Gress, Jeffrey J.; Xu, Siguang; Joshi, Ramesh; Wang, Chuan-tao; Paul, Sabu
2005-08-01
The competitive automotive market drives automotive manufacturers to speed up vehicle development cycles and reduce lead time. Fast tooling development is one of the key areas supporting fast and short vehicle development programs (VDP). In the past ten years, stamping simulation has become the most effective validation tool for predicting and resolving potential formability and quality problems before the dies are physically made. Stamping simulation and formability analysis have become a critical business segment in the GM math-based die engineering process. As simulation has become one of the major production tools in the engineering factory, simulation speed and accuracy are two of the most important measures of stamping simulation technology. The speed and time-in-system of forming analysis become even more critical to support fast VDPs and tooling readiness. Since 1997, the General Motors Die Center has been working jointly with our software vendor to develop and implement a parallel version of simulation software for mass-production analysis applications. By 2001, this technology had matured in the form of distributed memory processing (DMP) of draw die simulations in a networked distributed-memory computing environment. In 2004, this technology was refined to massively parallel processing (MPP) and extended to line die forming analysis (draw, trim, flange, and associated spring-back) running on a dedicated computing environment. The evolution of this technology, the insight gained through the implementation of DMP/MPP technology, and performance benchmarks are discussed in this publication.
An Evaluative Review of Simulated Dynamic Smart 3D Objects
NASA Astrophysics Data System (ADS)
Romeijn, H.; Sheth, F.; Pettit, C. J.
2012-07-01
Three-dimensional (3D) modelling of plants can be an asset for creating agriculture-based visualisation products. The continuum of 3D plant models ranges from static to dynamic objects, the latter also known as smart 3D objects. There is an increasing requirement for smarter simulated 3D objects that are attributed mathematically and/or from biological inputs. A systematic approach to plant simulation offers significant advantages to applications in agricultural research, particularly in simulating plant behaviour and the influences of external environmental factors. The continuum of 3D plant object visualisation runs from plants visualised with photographed billboarded images to more advanced procedural models that come closer to simulating realistic virtual plants. However, few programs model the physical reactions of plants to external factors, and even fewer are able to grow plants based on mathematical and/or biological parameters. In this paper, we undertake an evaluation of currently available plant-based object simulation programs, with a focus on the components and techniques involved in producing these objects. Through an analytical review process we consider the strengths and weaknesses of several program packages, the features and use of these programs, and the possible opportunities in deploying them to create smart 3D plant-based objects that support agricultural research and natural resource management. In creating smart 3D objects, the model needs to be informed by both plant physiology and phenology. Expert knowledge frames the parameters and procedures that attribute the object and allow the simulation of dynamic virtual plants. Ultimately, biologically smart 3D virtual plants that react to changes within an environment could be an effective medium for visually representing landscapes and communicating land management scenarios and practices to planners and decision-makers.
The Communication Link and Error ANalysis (CLEAN) simulator
NASA Technical Reports Server (NTRS)
Ebel, William J.; Ingels, Frank M.; Crowe, Shane
1993-01-01
During the period July 1, 1993 through December 30, 1993, significant developments to the Communication Link and Error ANalysis (CLEAN) simulator were completed, including: (1) soft-decision Viterbi decoding; (2) node synchronization for the soft-decision Viterbi decoder; (3) insertion/deletion error programs; (4) a convolutional encoder; (5) programs to investigate new convolutional codes; (6) a pseudo-noise sequence generator; (7) a soft-decision data generator; (8) RICE compression/decompression (integration of RICE code generated by Pen-Shu Yeh at Goddard Space Flight Center); (9) Markov chain channel modeling; (10) a percent-complete indicator shown when a program is executed; (11) header documentation; and (12) a help utility. The CLEAN simulation tool is now capable of simulating a very wide variety of satellite communication links, including the TDRSS downlink with RFI. The RICE compression/decompression schemes allow studies to be performed on the effects of errors on RICE-decompressed data. The Markov chain modeling programs allow channels with memory to be simulated. Memory results from filtering, forward error correction encoding/decoding, differential encoding/decoding, channel RFI, nonlinear transponders, and many other satellite system processes. Besides the development of the simulator, a study was performed to determine whether the PCI provides a performance improvement for the TDRSS downlink. RFI sources with several duty cycles exist for the TDRSS downlink. We conclude that the PCI does not improve performance for any of these interferers except possibly one, which occurs for the TDRS East. Therefore, the usefulness of the PCI is a function of the time spent transmitting data to the WSGT through the TDRS East transponder.
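As an example of one of the simpler components listed above, a convolutional encoder can be written in a few lines. The sketch below implements the standard rate-1/2, constraint-length-7 code (generators 171/133 octal) in Python; whether CLEAN used these particular generators is an assumption made purely for illustration.

```python
# Sketch of a rate-1/2, constraint-length-7 convolutional encoder using the
# classic generators 171/133 (octal).  Whether CLEAN used these particular
# generators is an assumption made purely for illustration.
G1, G2 = 0o171, 0o133          # 7-bit generator tap masks

def conv_encode(bits):
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0x7F           # 7-bit shift register
        out.append(bin(state & G1).count("1") & 1)  # parity of G1 taps
        out.append(bin(state & G2).count("1") & 1)  # parity of G2 taps
    return out

print(conv_encode([1, 0, 1, 1, 0, 0, 1]))  # two output bits per input bit
```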
Discrete element simulation of charging and mixed layer formation in the ironmaking blast furnace
NASA Astrophysics Data System (ADS)
Mitra, Tamoghna; Saxén, Henrik
2016-11-01
The burden distribution in the ironmaking blast furnace plays an important role for the operation as it affects the gas flow distribution, heat and mass transfer, and chemical reactions in the shaft. This work studies certain aspects of burden distribution by small-scale experiments and numerical simulation by the discrete element method (DEM). Particular attention is focused on the complex layer-formation process and the problems associated with estimating the burden layer distribution by burden profile measurements. The formation of mixed layers is studied, and a computational method for estimating the extent of the mixed layer, as well as its voidage, is proposed and applied on the results of the DEM simulations. In studying a charging program and its resulting burden distribution, the mixed layers of coke and pellets were found to show lower voidage than the individual burden layers. The dynamic evolution of the mixed layer during the charging process is also analyzed. The results of the study can be used to gain deeper insight into the complex charging process of the blast furnace, which is useful in the design of new charging programs and for mathematical models that do not consider the full behavior of the particles in the burden layers.
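A voidage estimate of the kind described can be computed directly from DEM particle data. The Python sketch below evaluates the voidage of horizontal slices from particle centres and radii; the particle positions, radii, and slice geometry are random placeholders rather than output of the authors' simulations.

```python
# Sketch of a slice-wise voidage estimate from DEM output: voidage is one
# minus the solids fraction of (fully contained) spheres in a horizontal
# slice.  Particle positions and radii here are random placeholders, not the
# authors' simulation data.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
xyz = rng.uniform(0.0, 1.0, size=(n, 3))   # particle centres [m]
r = rng.choice([0.008, 0.012], size=n)     # two size classes, e.g. coke/pellet [m]

def slice_voidage(z_lo, z_hi, area=1.0):
    inside = (xyz[:, 2] - r >= z_lo) & (xyz[:, 2] + r <= z_hi)
    solids = np.sum(4.0 / 3.0 * np.pi * r[inside] ** 3)
    return 1.0 - solids / (area * (z_hi - z_lo))

for z in np.arange(0.0, 1.0, 0.2):
    print(f"slice {z:.1f}-{z + 0.2:.1f} m: voidage = {slice_voidage(z, z + 0.2):.3f}")
```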
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-29
... aquifer (U.S. EPA, 1987, Sole Source Aquifer Designation Decision Process, Petition Review Guidance... the petition; U.S. Geological Survey, 2011, Conceptual Model and Numerical Simulation of the...
Computer Science Techniques Applied to Parallel Atomistic Simulation
NASA Astrophysics Data System (ADS)
Nakano, Aiichiro
1998-03-01
Recent developments in parallel processing technology and multiresolution numerical algorithms have established large-scale molecular dynamics (MD) simulations as a new research mode for studying materials phenomena such as fracture. However, this requires large system sizes and long simulated times. We have developed: i) Space-time multiresolution schemes; ii) fuzzy-clustering approach to hierarchical dynamics; iii) wavelet-based adaptive curvilinear-coordinate load balancing; iv) multilevel preconditioned conjugate gradient method; and v) spacefilling-curve-based data compression for parallel I/O. Using these techniques, million-atom parallel MD simulations are performed for the oxidation dynamics of nanocrystalline Al. The simulations take into account the effect of dynamic charge transfer between Al and O using the electronegativity equalization scheme. The resulting long-range Coulomb interaction is calculated efficiently with the fast multipole method. Results for temperature and charge distributions, residual stresses, bond lengths and bond angles, and diffusivities of Al and O will be presented. The oxidation of nanocrystalline Al is elucidated through immersive visualization in virtual environments. A unique dual-degree education program at Louisiana State University will also be discussed in which students can obtain a Ph.D. in Physics & Astronomy and a M.S. from the Department of Computer Science in five years. This program fosters interdisciplinary research activities for interfacing High Performance Computing and Communications with large-scale atomistic simulations of advanced materials. This work was supported by NSF (CAREER Program), ARO, PRF, and Louisiana LEQSF.
Leake, S.A.; Prudic, David E.
1988-01-01
The process of permanent compaction is not routinely included in simulations of groundwater flow. To simulate storage changes from both elastic and inelastic compaction, a computer program was written for use with the U. S. Geological Survey modular finite-difference groundwater flow model. The new program is called the Interbed-Storage Package. In the Interbed-Storage Package, elastic compaction or expansion is assumed to be proportional to change in head. The constant of proportionality is the product of skeletal component of elastic specific storage and thickness of the sediments. Similarly, inelastic compaction is assumed to be proportional to decline in head. The constant of proportionality is the product of the skeletal component of inelastic specific storage and the thickness of the sediments. Storage changes are incorporated into the groundwater flow model by adding an additional term to the flow equation. Within a model time step, the package appropriately apportions storage changes between elastic and inelastic components on the basis of the relation of simulated head to the previous minimum head. Another package that allows for a time-varying specified-head boundary is also documented. This package was written to reduce the data requirements for test simulations of the Interbed-Storage Package. (USGS)
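The apportionment logic described above can be illustrated compactly. The following Python sketch (not the package's actual Fortran) splits the storage change for one time step into elastic and inelastic parts by tracking the previous minimum (preconsolidation) head; the specific-storage values, thickness, and head series are hypothetical.

```python
# Sketch (in Python, not the package's Fortran) of apportioning a step's
# storage change between elastic and inelastic compaction by tracking the
# previous minimum (preconsolidation) head.  Specific-storage values,
# thickness, and the head series are hypothetical.
Sske, Sskv, b = 1e-6, 1e-4, 20.0   # elastic/inelastic skeletal Ss [1/m], thickness [m]

def compaction_step(h_old, h_new, h_min):
    """Return (elastic, inelastic) compaction [m] and the updated minimum head."""
    elastic = inelastic = 0.0
    if h_new >= h_min:                              # fully recoverable range
        elastic = Sske * b * (h_old - h_new)
    else:
        elastic = Sske * b * (h_old - h_min)        # elastic down to h_min
        inelastic = Sskv * b * (h_min - h_new)      # permanent below h_min
        h_min = h_new                               # new preconsolidation head
    return elastic, inelastic, h_min

h, h_min, total = 50.0, 50.0, 0.0
for h_new in [49.0, 47.5, 48.5, 46.0]:              # hypothetical heads [m]
    e, i, h_min = compaction_step(h, h_new, h_min)
    total += e + i
    h = h_new
print(f"cumulative compaction: {total * 1000:.2f} mm")
```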
Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...
NASA Technical Reports Server (NTRS)
1976-01-01
Contractural requirements, project planning, equipment specifications, and technical data for space shuttle biological experiment payloads are presented. Topics discussed are: (1) urine collection and processing on the space shuttle, (2) space processing of biochemical and biomedical materials, (3) mission simulations, and (4) biomedical equipment.
Kim, Sunghee; Shin, Gisoo
2016-02-01
Since previous studies on simulation-based education have been focused on fundamental nursing skills for nursing students in South Korea, there is little research available that focuses on clinical nurses in simulation-based training. Further, there is a paucity of research literature related to the integration of the nursing process into simulation training particularly in the emergency nursing care of high-risk maternal and neonatal patients. The purpose of this study was to identify the effects of nursing process-based simulation on knowledge, attitudes, and skills for maternal and child emergency nursing care in clinical nurses in South Korea. Data were collected from 49 nurses, 25 in the experimental group and 24 in the control group, from August 13 to 14, 2013. This study was an equivalent control group pre- and post-test experimental design to compare the differences in knowledge, attitudes, and skills for maternal and child emergency nursing care between the experimental group and the control group. The experimental group was trained by the nursing process-based simulation training program, while the control group received traditional methods of training for maternal and child emergency nursing care. The experimental group was more likely to improve knowledge, attitudes, and skills required for clinical judgment about maternal and child emergency nursing care than the control group. Among five stages of nursing process in simulation, the experimental group was more likely to improve clinical skills required for nursing diagnosis and nursing evaluation than the control group. These results will provide valuable information on developing nursing process-based simulation training to improve clinical competency in nurses. Further research should be conducted to verify the effectiveness of nursing process-based simulation with more diverse nurse groups on more diverse subjects in the future. Copyright © 2015 Elsevier Ltd. All rights reserved.
Numerical simulation of the SAGD process coupled with geomechanical behavior
NASA Astrophysics Data System (ADS)
Li, Pingke
Canada has vast oil sands resources. While a large portion of this resource can be recovered by surface mining techniques, the majority is located at depths requiring the application of in situ recovery technologies. Although a number of in situ recovery technologies exist, the steam-assisted gravity drainage (SAGD) process has emerged as one of the most promising technologies for developing in situ oil sands resources. During SAGD operations, saturated steam is continuously injected into the oil sands reservoir, which induces pore pressure and stress variations. As a result, reservoir parameters and processes may also vary, particularly when tensile and shear failure occur. This geomechanical effect is pronounced for oil sands material because oil sands have an in situ interlocked fabric. Conventional reservoir simulation generally does not take this coupled mechanism into consideration. Therefore, this research aims to improve the reservoir simulation techniques for the SAGD process applied in the development of oil sands and heavy oil reservoirs. Analyses of decoupled reservoir geomechanical simulation results show that the geomechanical behavior in SAGD has an obvious impact on reservoir parameters, such as absolute permeability. The issues with coupled reservoir geomechanical simulations of the SAGD process have been clarified and the permeability variations due to geomechanical behavior in the SAGD process investigated. A methodology for sequentially coupled reservoir geomechanical simulation was developed based on the reservoir simulator EXOTHERM and the geomechanical simulator FLAC. In addition, a representative geomechanical model of oil sands material is summarized in this research. Finally, this reservoir geomechanical simulation methodology was verified against the UTF Phase A SAGD project and applied to a SAGD operation with gas-over-bitumen geometry. Based on this methodology, the geomechanical effect on SAGD production performance can be quantified. This research program involves analyses of laboratory testing results obtained from the literature; no new laboratory testing was conducted in the course of this research.
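A toy version of such a sequentially coupled loop is sketched below in Python: a stand-in flow solve updates pressure, the pressure change is converted to volumetric strain and porosity, and permeability is rescaled with a Kozeny-Carman-type relation before the next flow step. The 1-D grid, coupling coefficient, and update rules are hypothetical stand-ins for the EXOTHERM/FLAC workflow, not the thesis methodology itself.

```python
# Toy one-way sequential coupling loop: a stand-in flow solve updates
# pressure, the pressure change becomes volumetric strain and porosity, and
# permeability is rescaled with a Kozeny-Carman-type relation before the next
# flow step.  The 1-D grid and every coefficient are hypothetical stand-ins
# for the EXOTHERM/FLAC workflow.
import numpy as np

n = 50
p = np.full(n, 2.0e6)          # pore pressure [Pa]
phi = np.full(n, 0.33)         # porosity
k = np.full(n, 5e-12)          # permeability [m^2]
cm = 1e-8                      # strain per unit pressure change [1/Pa]

def reservoir_step(p, k):
    """Stand-in flow solve: permeability-weighted pressure smoothing."""
    p_new = p.copy()
    p_new[0] = 4.0e6                                            # injector
    p_new[1:-1] += 0.25 * k[1:-1] / k.max() * (p[2:] - 2 * p[1:-1] + p[:-2])
    return p_new

for cycle in range(20):
    p_old = p
    p = reservoir_step(p, k)              # 1. flow with frozen geomechanics
    eps_v = cm * (p - p_old)              # 2. volumetric strain from dilation
    phi_new = np.clip(phi + eps_v, 0.05, 0.45)
    k *= (phi_new / phi) ** 3             # 3. Kozeny-Carman-style update
    phi = phi_new

print(f"permeability near the injector grew by x{k[1] / 5e-12:.2f}")
```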
Generic Modeling of a Life Support System for Process Technology Comparison
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support system and process technology options for a Lunar Base with a crew size of 4 and mission lengths of 90 and 600 days. System configurations to minimize the life support system weight and power are explored.
Testing and Analytical Modeling for Purging Process of a Cryogenic Line
NASA Technical Reports Server (NTRS)
Hedayat, A.; Mazurkivich, P. V.; Nelson, M. A.; Majumdar, A. K.
2015-01-01
To gain confidence in developing analytical models of the purging process for the cryogenic main propulsion systems of an upper stage, two test series were conducted. The test article, a 3.35-m-long inclined line with a diameter of 20 cm, was filled with liquid hydrogen (LH2) or gaseous hydrogen (GH2) and then purged with gaseous helium (GHe). A total of 10 tests were conducted. The influences of GHe flow rates and initial temperatures were evaluated. The Generalized Fluid System Simulation Program (GFSSP), an in-house general-purpose fluid system analyzer, was utilized to model and simulate selected tests.
MPI implementation of PHOENICS: A general purpose computational fluid dynamics code
NASA Astrophysics Data System (ADS)
Simunovic, S.; Zacharia, T.; Baltas, N.; Spalding, D. B.
1995-03-01
PHOENICS is a suite of computational analysis programs that are used for simulation of fluid flow, heat transfer, and dynamical reaction processes. The parallel version of the solver EARTH for the Computational Fluid Dynamics (CFD) program PHOENICS has been implemented using Message Passing Interface (MPI) standard. Implementation of MPI version of PHOENICS makes this computational tool portable to a wide range of parallel machines and enables the use of high performance computing for large scale computational simulations. MPI libraries are available on several parallel architectures making the program usable across different architectures as well as on heterogeneous computer networks. The Intel Paragon NX and MPI versions of the program have been developed and tested on massively parallel supercomputers Intel Paragon XP/S 5, XP/S 35, and Kendall Square Research, and on the multiprocessor SGI Onyx computer at Oak Ridge National Laboratory. The preliminary testing results of the developed program have shown scalable performance for reasonably sized computational domains.
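The parallelization pattern behind an MPI port of a CFD solver of this kind is domain decomposition with halo exchange. The mpi4py sketch below shows that pattern for a 1-D strip decomposition with a Jacobi smoothing step; it is a conceptual illustration only, not PHOENICS/EARTH code, and the grid size and iteration count are arbitrary.

```python
# Conceptual sketch of the domain-decomposition pattern behind an MPI port of
# a CFD solver: each rank owns a strip of the grid and exchanges one layer of
# halo cells with its neighbours every iteration.  Requires mpi4py and an MPI
# launcher (e.g. mpiexec -n 4 python halo_demo.py); this is not PHOENICS code.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 50
u = np.zeros(n_local + 2)                      # interior cells plus 2 halo cells
left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for it in range(500):
    # exchange halo layers with neighbours (combined blocking send/receive)
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    u[1:-1] = 0.5 * (u[:-2] + u[2:])           # Jacobi smoothing step
    if rank == 0:
        u[1] = 1.0                             # fixed boundary value on rank 0

print(f"rank {rank}: mean interior value = {u[1:-1].mean():.4f}")
```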
Advective transport observations with MODPATH-OBS--documentation of the MODPATH observation process
Hanson, R.T.; Kauffman, L.K.; Hill, M.C.; Dickinson, J.E.; Mehl, S.W.
2013-01-01
The MODPATH-OBS computer program described in this report is designed to calculate simulated equivalents for observations related to advective groundwater transport that can be represented in a quantitative way by using simulated particle-tracking data. The simulated equivalents supported by MODPATH-OBS are (1) distance from a source location at a defined time, or proximity to an observed location; (2) time of travel from an initial location to defined locations, areas, or volumes of the simulated system; (3) concentrations used to simulate groundwater age; and (4) percentages of water derived from contributing source areas. Although particle tracking only simulates the advective component of conservative transport, effects of non-conservative processes such as retardation can be approximated through manipulation of the effective-porosity value used to calculate velocity, based on the properties of selected conservative tracers. The program can also account for simple decay or production, but it cannot account for diffusion. Dispersion can be represented through direct simulation of subsurface heterogeneity and the use of many particles. MODPATH-OBS acts as a postprocessor to MODPATH, so that the sequence of model runs generally required is MODFLOW, MODPATH, and MODPATH-OBS. The versions of MODFLOW and MODPATH that support the version of MODPATH-OBS presented in this report are MODFLOW-2005 or MODFLOW-LGR, and MODPATH-LGR. MODFLOW-LGR is derived from MODFLOW-2005, MODPATH 5, and MODPATH 6 and supports local grid refinement. MODPATH-LGR is derived from MODPATH 5. It supports the forward and backward tracking of particles through locally refined grids and provides the output needed for MODPATH-OBS. For a single grid and no observations, MODPATH-LGR results are equivalent to MODPATH 5. MODPATH-LGR and MODPATH-OBS simulations can use nearly all of the capabilities of MODFLOW-2005 and MODFLOW-LGR; for example, simulations may be steady-state, transient, or a combination. Though the program name MODPATH-OBS specifically refers to observations, the program can also be used to calculate model predictions of observations. MODPATH-OBS is primarily intended for use with separate programs that conduct sensitivity analysis, data needs assessment, parameter estimation, and uncertainty analysis, such as UCODE_2005 and PEST. In many circumstances, refined grids in selected parts of a model are important to simulated hydraulics, detailed inflows and outflows, or other system characteristics. MODFLOW-LGR and MODPATH-LGR support accurate local grid refinement in which both mass (flows) and energy (head) are conserved across the local grid boundary. MODPATH-OBS is designed to take advantage of these capabilities. For example, particles tracked between a pumping well and a nearby stream, which are simulated poorly if the river and well are located in a single large grid cell, can be simulated with improved accuracy using a locally refined grid in MODFLOW-LGR, MODPATH-LGR, and MODPATH-OBS. The locally-refined-grid approach can provide more accurate simulated equivalents to observed transport between the well and the river. The documentation presented here includes a brief discussion of previous work, a description of the methods, and detailed descriptions of the required input files and how the output files are typically used.
Taylor, Charles J.; Williamson, Tanja N.; Newson, Jeremy K.; Ulery, Randy L.; Nelson, Hugh L.; Cinotto, Peter J.
2012-01-01
This report describes Phase II modifications made to the Water Availability Tool for Environmental Resources (WATER), which applies the process-based TOPMODEL approach to simulate or predict stream discharge in surface basins in the Commonwealth of Kentucky. The previous (Phase I) version of WATER did not provide a means of identifying sinkhole catchments or accounting for the effects of karst (internal) drainage in a TOPMODEL-simulated basin. In the Phase II version of WATER, sinkhole catchments are automatically identified and delineated as internally drained subbasins, and a modified TOPMODEL approach (called the sinkhole drainage process, or SDP-TOPMODEL) is applied that calculates mean daily discharges for the basin based on summed area-weighted contributions from sinkhole drainage (SD) areas and non-karstic topographically drained (TD) areas. Results obtained using the SDP-TOPMODEL approach were evaluated for 12 karst test basins located in each of the major karst terrains in Kentucky. Visual comparison of simulated hydrographs and flow-duration curves, along with statistical measures applied to the simulated discharge data (bias, correlation, root mean square error, and Nash-Sutcliffe efficiency coefficients), indicates that the SDP-TOPMODEL approach provides acceptably accurate estimates of discharge for most flow conditions and typically provides more accurate simulation of stream discharge in karstic basins compared to the standard TOPMODEL approach. Additional programming modifications made to the Phase II version of WATER included implementation of a point-and-click graphical user interface (GUI), which fully automates the delineation of simulation-basin boundaries and improves the speed of input-data processing. The Phase II version of WATER enables the user to select a pour point anywhere on a stream reach of interest, and the program will automatically delineate all upstream areas that contribute drainage to that point. This capability enables automatic delineation of a simulation basin of any size (area) and any level of stream-network complexity. WATER then automatically identifies the presence of sinkhole catchments within the simulation-basin boundaries; extracts and compiles the necessary climatic, topographic, and basin-characteristics datasets; and runs the SDP-TOPMODEL approach to estimate daily mean discharges (streamflow).
Accelerating 3D Hall MHD Magnetosphere Simulations with Graphics Processing Units
NASA Astrophysics Data System (ADS)
Bard, C.; Dorelli, J.
2017-12-01
The resolution required to simulate planetary magnetospheres with Hall magnetohydrodynamics results in program sizes approaching several hundred million grid cells. Such runs would take years on a single computational core and require hundreds or thousands of cores to complete in a reasonable time, which in turn requires access to the largest supercomputers. Graphics processing units (GPUs) provide a viable alternative: one GPU can do the work of roughly 100 cores, bringing Hall MHD simulations of Ganymede within reach of modest GPU clusters (~8 GPUs). We report our progress in developing a GPU-accelerated, three-dimensional Hall magnetohydrodynamic code and present Hall MHD simulation results for both Ganymede (run on 8 GPUs) and Mercury (56 GPUs). We benchmark our Ganymede simulation against previous results for the Galileo G8 flyby, namely that adding the Hall term to ideal MHD simulations changes the global convection pattern within the magnetosphere. Additionally, we present new results for the G1 flyby as well as initial results from Hall MHD simulations of Mercury, and compare them with the corresponding ideal MHD runs.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in U.S. college and university undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and discusses in detail the gap-analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
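A minimal sketch of the kind of Monte Carlo prediction model described, written in Python: driver variables of a satisfaction index are sampled from assumed distributions and the distribution of the predicted score is summarized. The weights, distributions, and threshold are hypothetical; the actual study's model is built on the ACSI framework and industry benchmarks.

```python
# Hedged sketch of a Monte Carlo prediction of a customer satisfaction index:
# assumed driver distributions are sampled and the predicted score's
# distribution is summarized.  Weights, distributions, and the threshold are
# hypothetical, not the ACSI model's actual coefficients.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
quality = rng.normal(80, 5, n)       # perceived-quality driver
value = rng.normal(75, 7, n)         # perceived-value driver
complaints = rng.poisson(2, n)       # complaint counts

score = 0.55 * quality + 0.35 * value - 1.5 * complaints
lo, hi = np.percentile(score, [2.5, 97.5])
print(f"predicted index: mean={score.mean():.1f}, 95% interval=({lo:.1f}, {hi:.1f})")
print(f"P(score >= 75) = {(score >= 75).mean():.2%}")
```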
ERIC Educational Resources Information Center
Xiang, Lin
2011-01-01
This is a collective case study seeking to develop detailed descriptions of how programming an agent-based simulation influences a group of 8th grade students' model-based inquiry (MBI) by examining students' agent-based programmable modeling (ABPM) processes and the learning outcomes. The context of the present study was a biology unit on…
NASA Astrophysics Data System (ADS)
Czerepicki, A.; Koniak, M.
2017-06-01
The paper presents a method for modelling the aging processes of lithium-ion batteries, its implementation as a computer application, and results for battery state estimation. The authors use a previously developed behavioural battery model, built from battery operating characteristics obtained experimentally. This model was implemented in the form of a computer program using a database to store the battery characteristics. The battery aging process is a new extended functionality of the model. The computer simulation algorithm uses real measurements of battery capacity as a function of the number of battery charge and discharge cycles. The simulation takes into account incomplete charge or discharge cycles, which are characteristic of electrically powered transport. The developed model was used to simulate battery state estimation for different load profiles, obtained by measuring the movement of selected means of transport.
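The cycle-counting idea described can be illustrated with a short sketch. In the Python example below, partial state-of-charge swings are summed into equivalent full cycles and capacity is read from a simple fade law; the fade rate, floor, and load profile are illustrative assumptions, not the authors' measured characteristics.

```python
# Sketch of the cycle bookkeeping described: partial state-of-charge swings
# are summed into equivalent full cycles and capacity is read from a simple
# fade law.  The fade rate, floor, and load profile are illustrative
# assumptions, not the authors' measured characteristics.
def equivalent_full_cycles(swings):
    """swings: state-of-charge swings per cycle, e.g. 0.25 = a 25% depth cycle."""
    return sum(swings)

def capacity(c_nominal_ah, efc, fade_per_cycle=2.0e-4):
    """Linear capacity fade per equivalent full cycle, floored at 60% of nominal."""
    return c_nominal_ah * max(0.6, 1.0 - fade_per_cycle * efc)

# A transport-style daily profile: many shallow cycles plus one deep cycle.
daily_swings = [0.25] * 6 + [0.8]
efc_per_day = equivalent_full_cycles(daily_swings)
for days in (100, 365, 1000):
    print(f"day {days:4d}: capacity = {capacity(40.0, efc_per_day * days):.1f} Ah")
```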
U.S. Climate Change Technology Program: Strategic Plan
2006-09-01
and Long Term, provides details on the 85 technologies in the R&D portfolio (Figure 2-1). Continuing Process: The United States, in partnership with... locations may be centered near or in residential locations, and work processes and products may be more commonly communicated or delivered via digital... chemical properties, along with advanced methods to simulate processes, will stem from advances in computational technology. Current Portfolio: The current
Automatic Data Processing Equipment (ADPE) acquisition plan for the medical sciences
NASA Technical Reports Server (NTRS)
1979-01-01
An effective mechanism for meeting the SLSD/MSD data handling/processing requirements for Shuttle is discussed. The ability to meet these requirements depends upon the availability of a general purpose high speed digital computer system. This system is expected to implement those data base management and processing functions required across all SLSD/MSD programs during training, laboratory operations/analysis, simulations, mission operations, and post mission analysis/reporting.
Parkhurst, David L.; Kipp, Kenneth L.; Engesgaard, Peter; Charlton, Scott R.
2004-01-01
The computer program PHAST simulates multi-component, reactive solute transport in three-dimensional saturated ground-water flow systems. PHAST is a versatile ground-water flow and solute-transport simulator with capabilities to model a wide range of equilibrium and kinetic geochemical reactions. The flow and transport calculations are based on a modified version of HST3D that is restricted to constant fluid density and constant temperature. The geochemical reactions are simulated with the geochemical model PHREEQC, which is embedded in PHAST. PHAST is applicable to the study of natural and contaminated ground-water systems at a variety of scales ranging from laboratory experiments to local and regional field scales. PHAST can be used in studies of migration of nutrients, inorganic and organic contaminants, and radionuclides; in projects such as aquifer storage and recovery or engineered remediation; and in investigations of the natural rock-water interactions in aquifers. PHAST is not appropriate for unsaturated-zone flow, multiphase flow, density-dependent flow, or waters with high ionic strengths. A variety of boundary conditions are available in PHAST to simulate flow and transport, including specified-head, flux, and leaky conditions, as well as the special cases of rivers and wells. Chemical reactions in PHAST include (1) homogeneous equilibria using an ion-association thermodynamic model; (2) heterogeneous equilibria between the aqueous solution and minerals, gases, surface complexation sites, ion exchange sites, and solid solutions; and (3) kinetic reactions with rates that are a function of solution composition. The aqueous model (elements, chemical reactions, and equilibrium constants), minerals, gases, exchangers, surfaces, and rate expressions may be defined or modified by the user. A number of options are available to save results of simulations to output files. The data may be saved in three formats: a format suitable for viewing with a text editor; a format suitable for exporting to spreadsheets and post-processing programs; or in Hierarchical Data Format (HDF), which is a compressed binary format. Data in the HDF file can be visualized on Windows computers with the program Model Viewer and extracted with the utility program PHASTHDF; both programs are distributed with PHAST. Operator splitting of the flow, transport, and geochemical equations is used to separate the three processes into three sequential calculations. No iterations between transport and reaction calculations are implemented. A three-dimensional Cartesian coordinate system and finite-difference techniques are used for the spatial and temporal discretization of the flow and transport equations. The non-linear chemical equilibrium equations are solved by a Newton-Raphson method, and the kinetic reaction equations are solved by a Runge-Kutta or an implicit method for integrating ordinary differential equations. The PHAST simulator may require large amounts of memory and long Central Processing Unit (CPU) times. To reduce the long CPU times, a parallel version of PHAST has been developed that runs on a multiprocessor computer or on a collection of computers that are networked. The parallel version requires Message Passing Interface, which is currently (2004) freely available. The parallel version is effective in reducing simulation times. This report documents the use of the PHAST simulator, including running the simulator, preparing the input files, selecting the output files, and visualizing the results. 
It also presents four examples that verify the numerical method and demonstrate the capabilities of the simulator. PHAST requires three input files. Only the flow and transport file is described in detail in this report. The other two files, the chemistry data file and the database file, are identical to PHREEQC files and the detailed description of these files is found in the PHREEQC documentation.
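To make the operator-splitting idea above concrete, here is a minimal Python sketch of one split time step: transport is advanced first, then chemistry is equilibrated cell by cell, with no iteration back to transport. The 1-D upwind advection and the single-mineral "chemistry" are invented stand-ins; PHAST itself uses three-dimensional finite differences and the full PHREEQC reaction engine.

    # Minimal sketch of sequential operator splitting (transport, then reaction);
    # toy example only, NOT the PHAST algorithm.
    import numpy as np

    nx, dx, dt, v = 50, 1.0, 0.5, 1.0    # cells, spacing, time step, pore velocity
    c = np.zeros(nx)                     # aqueous concentration
    s = np.zeros(nx)                     # mineral (solid) concentration
    ceq = 0.8                            # hypothetical solubility limit

    def transport_step(c):
        # explicit first-order upwind advection (stable for v*dt/dx <= 1)
        cn = c.copy()
        cn[1:] -= v * dt / dx * (c[1:] - c[:-1])
        cn[0] = 1.0                      # specified-concentration inflow boundary
        return cn

    def reaction_step(c, s):
        # equilibrium with one mineral: excess above ceq precipitates, and the
        # mineral redissolves when c < ceq (linear toy chemistry, so no Newton
        # iteration is needed here)
        excess = np.maximum(c - ceq, 0.0)
        redis = np.maximum(np.minimum(ceq - c, s), 0.0)
        return c - excess + redis, s + excess - redis

    for step in range(100):
        c = transport_step(c)            # 1) flow/transport calculation
        c, s = reaction_step(c, s)       # 2) geochemistry, no iteration back
    print(c.round(3))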
JETSPIN: A specific-purpose open-source software for simulations of nanofiber electrospinning
NASA Astrophysics Data System (ADS)
Lauricella, Marco; Pontrelli, Giuseppe; Coluzza, Ivan; Pisignano, Dario; Succi, Sauro
2015-12-01
We present the open-source computer program JETSPIN, specifically designed to simulate the electrospinning process of nanofibers. Its capabilities are shown with proper reference to the underlying model, as well as a description of the relevant input variables and associated test-case simulations. The various interactions included in the electrospinning model implemented in JETSPIN are discussed in detail. The code is designed to exploit different computational architectures, from single to parallel processor workstations. This paper provides an overview of JETSPIN, focusing primarily on its structure, parallel implementations, functionality, performance, and availability.
Analysis of simulated image sequences from sensors for restricted-visibility operations
NASA Technical Reports Server (NTRS)
Kasturi, Rangachar
1991-01-01
A real time model of the visible output from a 94 GHz sensor, based on a radiometric simulation of the sensor, was developed. A sequence of images as seen from an aircraft as it approaches for landing was simulated using this model. Thirty frames from this sequence of 200 x 200 pixel images were analyzed to identify and track objects in the image using the Cantata image processing package within the visual programming environment provided by the Khoros software system. The image analysis operations are described.
Version 4.0 of code Java for 3D simulation of the CCA model
NASA Astrophysics Data System (ADS)
Fan, Linyu; Liao, Jianwei; Zuo, Junsen; Zhang, Kebo; Li, Chao; Xiong, Hailing
2018-07-01
This paper presents a new version of the Java code for the three-dimensional simulation of the Cluster-Cluster Aggregation (CCA) model, replacing the previous version. Many redundant traversals of the cluster list in the program were eliminated, so the simulation time consumed is significantly reduced. In order to show the aggregation process in a more intuitive way, we have labeled different clusters with varied colors. In addition, a new function was added for outputting the particle coordinates of aggregates to a file, to make it easier to couple our model with other models.
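For readers unfamiliar with the model, the sketch below is a toy 2-D cluster-cluster aggregation loop in Python (not the authors' Java code): clusters random-walk as rigid bodies, contact is detected through a site hash table rather than a scan of the cluster list, and each cluster carries an integer label playing the role of the paper's colors.

    # Toy 2-D CCA sketch with periodic boundaries; invented parameters.
    import random
    random.seed(2)

    L = 40                                   # lattice size
    clusters, sites, cid = {}, {}, 0         # id -> particle list; (x, y) -> id
    while len(clusters) < 60:                # seed single-particle clusters
        p = (random.randrange(L), random.randrange(L))
        if p not in sites:
            clusters[cid] = [p]; sites[p] = cid; cid += 1

    def neighbors(p):
        x, y = p
        return [((x+1) % L, y), ((x-1) % L, y), (x, (y+1) % L), (x, (y-1) % L)]

    for step in range(200000):
        if len(clusters) == 1:
            break
        mover = random.choice(list(clusters))
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        old = clusters[mover]
        for p in old:
            del sites[p]
        new = [((x + dx) % L, (y + dy) % L) for x, y in old]
        if any(p in sites for p in new):     # move blocked by another cluster
            new = old
        for p in new:
            sites[p] = mover
        clusters[mover] = new
        # contact detection via the site hash table: no traversal of the
        # whole cluster list is needed
        touching = {sites[q] for p in new for q in neighbors(p)
                    if q in sites and sites[q] != mover}
        for other in touching:               # merge labels on contact
            for p in clusters[other]:
                sites[p] = mover
            clusters[mover].extend(clusters.pop(other))
    print(len(clusters), "cluster(s) left after", step, "steps")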
Hybrid and electric advanced vehicle systems (heavy) simulation
NASA Technical Reports Server (NTRS)
Hammond, R. A.; Mcgehee, R. K.
1981-01-01
A computer program to simulate hybrid and electric advanced vehicle systems (HEAVY) is described. It is intended for use early in the design process: concept evaluation, alternative comparison, preliminary design, control and management strategy development, component sizing, and sensitivity studies. It allows the designer to quickly, conveniently, and economically predict the performance of a proposed drive train. The user defines the system to be simulated using a library of predefined component models that may be connected to represent a wide variety of propulsion systems. The development of three models is discussed as an example.
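The "library of connectable component models" idea can be illustrated with a few hypothetical Python classes (invented names and efficiencies, not the HEAVY code): a power demand at the wheels is propagated backwards through the drive train to the energy source.

    # Sketch of a component-library drive-train simulation; all values invented.
    class Component:
        def __init__(self, efficiency):
            self.efficiency = efficiency
        def input_power(self, output_power):
            return output_power / self.efficiency    # power needed upstream

    class Battery:
        def __init__(self, energy_kwh):
            self.energy = energy_kwh * 3.6e6         # joules
        def draw(self, power_w, dt_s):
            self.energy -= power_w * dt_s

    battery = Battery(20.0)
    drivetrain = [Component(0.90),                   # motor/controller
                  Component(0.95)]                   # gearbox
    road_load_w = 8000.0                             # demand at the wheels
    for t in range(3600):                            # 1-hour run, 1 s steps
        p = road_load_w
        for comp in reversed(drivetrain):            # propagate demand to the source
            p = comp.input_power(p)
        battery.draw(p, 1.0)
    print("battery energy remaining: %.1f kWh" % (battery.energy / 3.6e6))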
Feinstein, Robert E; Yager, Joel
2017-10-30
Violence in psychiatric outpatient settings is a ubiquitous concern. This article describes the development, implementation, and evaluation of a live threat violence simulation exercise, designed to reduce the risk of future outpatient clinic violence and minimize the effects of future incidents on staff. The psychiatric outpatient clinic at the University of Colorado Hospital developed, implemented, and evaluated a 4-hour live violence threat simulation exercise as a companion to a 7-hour violence prevention program. The simulation includes an orientation, two threat simulation scenarios, three debriefings, satisfaction surveys, problem identification, action plans, and annual safety and process improvements. The authors have conducted live violence simulation exercises from 2011-2016, and have collected survey data about the annual simulation exercise from 2014-2016. Each year ≥ 52% of participants responded; each year ≥ 90% of respondents rated the simulation as "very helpful/helpful", ≥ 86% believed themselves to be "much better/better" prepared to deal with violent episodes, and < 2% of participants experienced post-simulation side effects such as worries about past trauma, anxiety, sleep problems, or increased workplace concerns. From 2011-2016, the clinic experienced 4 major violent episodes and 36 episodes of potential violence, with no staff injuries and minimal psychological sequelae to one staff member. Violence prevention efforts and the development of close police/staff relationships may have contributed to these fortunate outcomes. Satisfaction surveys suggest that the simulations are very helpful/helpful, with participants feeling much better/better prepared to manage violence. The exercises led the authors to initiate staff safety-related behavioral changes as well as physical space and safety process improvements. The violence prevention program and simulation exercises have promoted excellent relationships with police and a consistent safety record over six years. This approach may be useful for other psychiatric outpatient departments.
Using Simulation to Implement an OR Cardiac Arrest Crisis Checklist.
Dagey, Darleen
2017-01-01
Crisis checklists are cognitive aids used to coordinate care during critical events. Simulation training is a method to validate process improvement initiatives such as checklist implementation. In response to concerns staff members expressed regarding their comfort level when responding to infrequent occurrences such as cardiac arrest and other OR emergencies, the OR Comprehensive Unit-based Safety Program team at our facility decided to institute the use of crisis checklists in the OR during critical events. We provided 90-minute education sessions, simulation opportunities, and debriefings to help staff members become more comfortable using these checklists. Based on program evaluations, 80% of staff members who participated in the training expressed an increased comfort level when caring for a patient in cardiac arrest. Copyright © 2017 AORN, Inc. Published by Elsevier Inc. All rights reserved.
NASA/FAA helicopter simulator workshop
NASA Technical Reports Server (NTRS)
Larsen, William E. (Editor); Randle, Robert J., Jr. (Editor); Bray, Richard S. (Editor); Zuk, John (Editor)
1992-01-01
A workshop was convened by the FAA and NASA for the purpose of providing a forum at which leading designers, manufacturers, and users of helicopter simulators could initiate and participate in a development process that would facilitate the formulation of qualification standards by the regulatory agency. Formal papers were presented, special topics were discussed in breakout sessions, and a draft FAA advisory circular defining specifications for helicopter simulators was presented and discussed. A working group of volunteers was formed to work with the National Simulator Program Office to develop a final version of the circular. The workshop attracted 90 individuals from a constituency of simulator manufacturers, training organizations, the military, civil regulators, research scientists, and five foreign countries.
NASA Astrophysics Data System (ADS)
Abdul Ghani, B.
2005-09-01
"TEA CO 2 Laser Simulator" has been designed to simulate the dynamic emission processes of the TEA CO 2 laser based on the six-temperature model. The program predicts the behavior of the laser output pulse (power, energy, pulse duration, delay time, FWHM, etc.) depending on the physical and geometrical input parameters (pressure ratio of gas mixture, reflecting area of the output mirror, media length, losses, filling and decay factors, etc.). Program summaryTitle of program: TEA_CO2 Catalogue identifier: ADVW Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADVW Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer: P.IV DELL PC Setup: Atomic Energy Commission of Syria, Scientific Services Department, Mathematics and Informatics Division Operating system: MS-Windows 9x, 2000, XP Programming language: Delphi 6.0 No. of lines in distributed program, including test data, etc.: 47 315 No. of bytes in distributed program, including test data, etc.:7 681 109 Distribution format:tar.gz Classification: 15 Laser Physics Nature of the physical problem: "TEA CO 2 Laser Simulator" is a program that predicts the behavior of the laser output pulse by studying the effect of the physical and geometrical input parameters on the characteristics of the output laser pulse. The laser active medium consists of a CO 2-N 2-He gas mixture. Method of solution: Six-temperature model, for the dynamics emission of TEA CO 2 laser, has been adapted in order to predict the parameters of laser output pulses. A simulation of the laser electrical pumping was carried out using two approaches; empirical function equation (8) and differential equation (9). Typical running time: The program's running time mainly depends on both integration interval and step; for a 4 μs period of time and 0.001 μs integration step (defaults values used in the program), the running time will be about 4 seconds. Restrictions on the complexity: Using a very small integration step might leads to stop the program run due to the huge number of calculating points and to a small paging file size of the MS-Windows virtual memory. In such case, it is recommended to enlarge the paging file size to the appropriate size, or to use a bigger value of integration step.
Kumar, Arunaz; Nestel, Debra; East, Christine; Hay, Margaret; Lichtwark, Irene; McLelland, Gayle; Bentley, Deidre; Hall, Helen; Fernando, Shavi; Hobson, Sebastian; Larmour, Luke; Dekoninck, Philip; Wallace, Euan M
2018-02-01
Simulation-based programs are increasingly being used to teach obstetrics and gynaecology examinations, but it is difficult to establish what student learning they produce. Assessment may test student learning, but its role in learning itself is rarely recognised. We undertook this study to assess medical and midwifery student learning through a simulation program using a pre-test and post-test design, and also to evaluate the use of assessment as a method of learning. The interprofessional simulation education program consisted of a brief pre-reading document, a lecture, a video demonstration and a hands-on workshop. Over a 24-month period, 405 medical and 104 midwifery students participated in the study and were assessed before and after the program. Numerical data were analysed using the paired t-test and one-way analysis of variance. Students' perceptions of the role of assessment in learning were qualitatively analysed. The post-test scores were significantly higher than the pre-test scores (P < 0.001), with improvements in both the medical and midwifery groups. Students described three benefits of assessment for learning: preparation for the assessment, reinforcement of learning during the assessment itself, and post-assessment reflection on performance that cemented previous learning. Both medical and midwifery students demonstrated a significant improvement in their test scores, and for most students the examination process itself was a positive learning experience. © 2017 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.
Hydraulic Characteristics Of Two Bicycle-Safe Grate Inlet Designs
DOT National Transportation Integrated Search
1988-12-01
Expert Systems are computer programs designed to include a simulation of the reasoning and decision-making processes of human experts. This report provides a set of general guidelines for the development and distribution of highway related expert sys...
The 2nd Symposium on the Frontiers of Massively Parallel Computations
NASA Technical Reports Server (NTRS)
Mills, Ronnie (Editor)
1988-01-01
Programming languages, computer graphics, neural networks, massively parallel computers, SIMD architecture, algorithms, digital terrain models, sort computation, simulation of charged particle transport on the massively parallel processor and image processing are among the topics discussed.
Microstructure Modeling of 3rd Generation Disk Alloys
NASA Technical Reports Server (NTRS)
Jou, Herng-Jeng
2010-01-01
The objective of this program is to model, validate, and predict the precipitation microstructure evolution, using the PrecipiCalc (QuesTek Innovations LLC) software, for 3rd generation Ni-based gas turbine disc superalloys during processing and service, with a set of logical and consistent experiments and characterizations. Furthermore, within this program, the originally research-oriented microstructure simulation tool will be further improved and implemented to be a useful and user-friendly engineering tool. In this report, the key accomplishments achieved during the second year (2008) of the program are summarized. The activities of this year include the final selection of multicomponent thermodynamics and mobility databases; determination of precipitate surface energy from nucleation experiments; a multiscale comparison of predicted versus measured intragrain precipitation microstructures in quenched samples, showing good agreement; isothermal coarsening experiments and the interaction of grain-boundary and intergrain precipitates; the primary microstructure of subsolvus treatment; and the software implementation plan for the third year of the project. In the following year, the calibrated models and simulation tools will be validated against an independently developed experimental data set with actual disc heat treatment process conditions. Furthermore, software integration and implementation will be developed to provide materials engineers valuable information for optimizing the processing of the 3rd generation gas turbine disc alloys.
MODFLOW-2005 : the U.S. Geological Survey modular ground-water model--the ground-water flow process
Harbaugh, Arlen W.
2005-01-01
This report presents MODFLOW-2005, which is a new version of the finite-difference ground-water model commonly called MODFLOW. Ground-water flow is simulated using a block-centered finite-difference approach. Layers can be simulated as confined or unconfined. Flow associated with external stresses, such as wells, areal recharge, evapotranspiration, drains, and rivers, also can be simulated. The report includes detailed explanations of physical and mathematical concepts on which the model is based, an explanation of how those concepts are incorporated in the modular structure of the computer program, instructions for using the model, and details of the computer code. The modular structure consists of a MAIN Program and a series of highly independent subroutines. The subroutines are grouped into 'packages.' Each package deals with a specific feature of the hydrologic system that is to be simulated, such as flow from rivers or flow into drains, or with a specific method of solving the set of simultaneous equations resulting from the finite-difference method. Several solution methods are incorporated, including the Preconditioned Conjugate-Gradient method. The division of the program into packages permits the user to examine specific hydrologic features of the model independently. This also facilitates development of additional capabilities because new packages can be added to the program without modifying the existing packages. The input and output systems of the computer program also are designed to permit maximum flexibility. The program is designed to allow other capabilities, such as transport and optimization, to be incorporated, but this report is limited to describing the ground-water flow capability. The program is written in Fortran 90 and will run without modification on most computers that have a Fortran 90 compiler.
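A minimal sketch of the block-centered finite-difference idea follows: steady confined flow on a small grid with specified heads on two sides, no-flow on the others, and one pumping well, solved with a conjugate-gradient method (unpreconditioned here, unlike MODFLOW's PCG package). All parameters are invented and the assembly is far simpler than MODFLOW's package structure.

    # Toy steady-state finite-difference groundwater flow solve; not MODFLOW input.
    import numpy as np
    from scipy.sparse import lil_matrix
    from scipy.sparse.linalg import cg

    n, T, Q = 20, 100.0, -50.0               # cells per side, transmissivity (m2/d), well (m3/d)
    fixed = {0: 100.0, n - 1: 95.0}          # specified heads on the left/right columns
    idx = lambda i, j: i * n + j
    A = lil_matrix((n * n, n * n))
    b = np.zeros(n * n)
    for i in range(n):
        for j in range(n):
            k = idx(i, j)
            if j in fixed:                   # specified-head cell: identity row
                A[k, k], b[k] = 1.0, fixed[j]
                continue
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ii, jj = i + di, j + dj
                if not 0 <= ii < n:          # no-flow across the top/bottom edges
                    continue
                A[k, k] += T                 # uniform T: conductances collapse to T
                if jj in fixed:
                    b[k] += T * fixed[jj]    # known head moves to the right-hand side
                else:
                    A[k, idx(ii, jj)] = -T
    b[idx(n // 2, n // 2)] += Q              # withdrawal well at the grid center
    h, info = cg(A.tocsr(), b)               # conjugate-gradient solve; info == 0 on success
    print("head at the well cell: %.2f m" % h[idx(n // 2, n // 2)])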
WE-D-204-04: Learning the Ropes: Clinical Immersion in the First Month of Residency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dieterich, S.
Speakers in this session will present overviews and details of a specific rotation or feature of their Medical Physics Residency Program that is particularly exceptional and noteworthy. The featured rotations include foundational topics executed with exceptional acumen and innovative educational rotations perhaps not commonly found in Medical Physics Residency Programs. A site-specific clinical rotation will be described, where the medical physics resident follows the physician and medical resident for two weeks into patient consultations, simulation sessions, target contouring sessions, planning meetings with dosimetry, patient follow-up visits, and tumor boards, to gain insight into the thought processes of the radiation oncologist. An incident learning rotation will be described, where the resident learns about and practices evaluating clinical errors and investigates process improvements for the clinic. The residency environment at a Canadian medical physics residency program will be described, where the training and interactions with radiation oncology residents are integrated. And the first-month rotation will be described, where the medical physics resident rotates through the clinical areas, including simulation, dosimetry, and treatment units, gaining an overview of the clinical flow and meeting all the clinical staff to begin the residency program. This session will be of particular interest to residency programs that are interested in adopting or adapting these curricular ideas into their programs and to residency candidates who want to learn about programs already employing innovative practices. Learning Objectives: To learn about exceptional and innovative clinical rotations or program features within existing Medical Physics Residency Programs. To understand how to adopt/adapt innovative curricular designs into your own Medical Physics Residency Program, if appropriate.
WE-D-204-00: Session in Memory of Franca Kuchnir: Excellence in Medical Physics Residency Education
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Speakers in this session will present overviews and details of a specific rotation or feature of their Medical Physics Residency Program that is particularly exceptional and noteworthy. The featured rotations include foundational topics executed with exceptional acumen and innovative educational rotations perhaps not commonly found in Medical Physics Residency Programs. A site-specific clinical rotation will be described, where the medical physics resident follows the physician and medical resident for two weeks into patient consultations, simulation sessions, target contouring sessions, planning meetings with dosimetry, patient follow-up visits, and tumor boards, to gain insight into the thought processes of the radiation oncologist. An incident learning rotation will be described, where the resident learns about and practices evaluating clinical errors and investigates process improvements for the clinic. The residency environment at a Canadian medical physics residency program will be described, where the training and interactions with radiation oncology residents are integrated. And the first-month rotation will be described, where the medical physics resident rotates through the clinical areas, including simulation, dosimetry, and treatment units, gaining an overview of the clinical flow and meeting all the clinical staff to begin the residency program. This session will be of particular interest to residency programs that are interested in adopting or adapting these curricular ideas into their programs and to residency candidates who want to learn about programs already employing innovative practices. Learning Objectives: To learn about exceptional and innovative clinical rotations or program features within existing Medical Physics Residency Programs. To understand how to adopt/adapt innovative curricular designs into your own Medical Physics Residency Program, if appropriate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parker, W.
Speakers in this session will present overviews and details of a specific rotation or feature of their Medical Physics Residency Program that is particularly exceptional and noteworthy. The featured rotations include foundational topics executed with exceptional acumen and innovative educational rotations perhaps not commonly found in Medical Physics Residency Programs. A site-specific clinical rotation will be described, where the medical physics resident follows the physician and medical resident for two weeks into patient consultations, simulation sessions, target contouring sessions, planning meetings with dosimetry, patient follow-up visits, and tumor boards, to gain insight into the thought processes of the radiation oncologist. An incident learning rotation will be described, where the resident learns about and practices evaluating clinical errors and investigates process improvements for the clinic. The residency environment at a Canadian medical physics residency program will be described, where the training and interactions with radiation oncology residents are integrated. And the first-month rotation will be described, where the medical physics resident rotates through the clinical areas, including simulation, dosimetry, and treatment units, gaining an overview of the clinical flow and meeting all the clinical staff to begin the residency program. This session will be of particular interest to residency programs that are interested in adopting or adapting these curricular ideas into their programs and to residency candidates who want to learn about programs already employing innovative practices. Learning Objectives: To learn about exceptional and innovative clinical rotations or program features within existing Medical Physics Residency Programs. To understand how to adopt/adapt innovative curricular designs into your own Medical Physics Residency Program, if appropriate.
WE-D-204-01: Site-Specific Clinical Rotation: Into the Minds of the Radiation Oncologists
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendrickson, K.
2016-06-15
Speakers in this session will present overviews and details of a specific rotation or feature of their Medical Physics Residency Program that is particularly exceptional and noteworthy. The featured rotations include foundational topics executed with exceptional acumen and innovative educational rotations perhaps not commonly found in Medical Physics Residency Programs. A site-specific clinical rotation will be described, where the medical physics resident follows the physician and medical resident for two weeks into patient consultations, simulation sessions, target contouring sessions, planning meetings with dosimetry, patient follow-up visits, and tumor boards, to gain insight into the thought processes of the radiation oncologist. An incident learning rotation will be described, where the resident learns about and practices evaluating clinical errors and investigates process improvements for the clinic. The residency environment at a Canadian medical physics residency program will be described, where the training and interactions with radiation oncology residents are integrated. And the first-month rotation will be described, where the medical physics resident rotates through the clinical areas, including simulation, dosimetry, and treatment units, gaining an overview of the clinical flow and meeting all the clinical staff to begin the residency program. This session will be of particular interest to residency programs that are interested in adopting or adapting these curricular ideas into their programs and to residency candidates who want to learn about programs already employing innovative practices. Learning Objectives: To learn about exceptional and innovative clinical rotations or program features within existing Medical Physics Residency Programs. To understand how to adopt/adapt innovative curricular designs into your own Medical Physics Residency Program, if appropriate.
Residency Training in Robotic General Surgery: A Survey of Program Directors
George, Lea C.; O'Neill, Rebecca
2018-01-01
Objective: Robotic surgery continues to expand in minimally invasive surgery; however, the literature is insufficient to understand the current training process for general surgery residents. Therefore, the objectives of this study were to identify the current approach to and perspectives on robotic surgery training. Methods: An electronic survey was distributed to general surgery program directors identified by the Accreditation Council for Graduate Medical Education website. Multiple choice and open-ended questions regarding current practices and opinions on robotic surgery training in general surgery residency programs were used. Results: 20 program directors were surveyed, a majority being from medium-sized programs (4–7 graduating residents per year). Most respondents (73.68%) had a formal robotic surgery curriculum at their institution, with 63.16% incorporating simulation training. Approximately half of the respondents believe that more time should be dedicated to robotic surgery training (52.63%), with simulation training prior to console use (84.21%). About two-thirds of the respondents (63.16%) believe that a formal robotic surgery curriculum should be established as a part of general surgery residency, with more than half believing that exposure should occur in postgraduate year one (55%). Conclusion: A formal robotics curriculum with simulation training and early surgical exposure for general surgery residents should be given consideration in surgical residency training. PMID:29854454
Residency Training in Robotic General Surgery: A Survey of Program Directors.
George, Lea C; O'Neill, Rebecca; Merchant, Aziz M
2018-01-01
Robotic surgery continues to expand in minimally invasive surgery; however, the literature is insufficient to understand the current training process for general surgery residents. Therefore, the objectives of this study were to identify the current approach to and perspectives on robotic surgery training. An electronic survey was distributed to general surgery program directors identified by the Accreditation Council for Graduate Medical Education website. Multiple choice and open-ended questions regarding current practices and opinions on robotic surgery training in general surgery residency programs were used. 20 program directors were surveyed, a majority being from medium-sized programs (4-7 graduating residents per year). Most respondents (73.68%) had a formal robotic surgery curriculum at their institution, with 63.16% incorporating simulation training. Approximately half of the respondents believe that more time should be dedicated to robotic surgery training (52.63%), with simulation training prior to console use (84.21%). About two-thirds of the respondents (63.16%) believe that a formal robotic surgery curriculum should be established as a part of general surgery residency, with more than half believing that exposure should occur in postgraduate year one (55%). A formal robotics curriculum with simulation training and early surgical exposure for general surgery residents should be given consideration in surgical residency training.
LACIE performance predictor FOC users manual
NASA Technical Reports Server (NTRS)
1976-01-01
The LACIE Performance Predictor (LPP) is a computer simulation of the LACIE process for predicting worldwide wheat production. The simulation provides for the introduction of various errors into the system and provides estimates based on these errors, thus allowing the user to determine the impact of selected error sources. The FOC LPP simulates the acquisition of the sample segment data by the LANDSAT satellite (DAPTS), the classification of the agricultural area within the sample segment (CAMS), the estimation of the wheat yield (YES), and the production estimation and aggregation (CAS). These elements include data acquisition characteristics, environmental conditions, classification algorithms, and the LACIE aggregation and data adjustment procedures. The operational structure for simulating these elements consists of the following key programs: (1) LACIE Utility Maintenance Process, (2) System Error Executive, (3) Ephemeris Generator, (4) Access Generator, (5) Acquisition Selector, (6) LACIE Error Model (LEM), and (7) Post Processor.
Parallel grid library for rapid and flexible simulation development
NASA Astrophysics Data System (ADS)
Honkonen, I.; von Alfthan, S.; Sandroos, A.; Janhunen, P.; Palmroth, M.
2013-04-01
We present an easy-to-use and flexible grid library for developing highly scalable parallel simulations. The distributed cartesian cell-refinable grid (dccrg) supports adaptive mesh refinement and allows an arbitrary C++ class to be used as cell data. The amount of data in grid cells can vary both in space and time, allowing dccrg to be used in very different types of simulations, for example in fluid and particle codes. Dccrg transfers the data between neighboring cells on different processes transparently and asynchronously, allowing one to overlap computation and communication. This enables excellent scalability at least up to 32k cores in magnetohydrodynamic tests, depending on the problem and hardware. In the version of dccrg presented here, part of the mesh metadata is replicated between MPI processes, reducing the scalability of adaptive mesh refinement (AMR) to between 200 and 600 processes. Dccrg is free software that anyone can use, study and modify and is available at https://gitorious.org/dccrg. Users are also kindly requested to cite this work when publishing results obtained with dccrg. Catalogue identifier: AEOM_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOM_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU Lesser General Public License version 3 No. of lines in distributed program, including test data, etc.: 54975 No. of bytes in distributed program, including test data, etc.: 974015 Distribution format: tar.gz Programming language: C++. Computer: PC, cluster, supercomputer. Operating system: POSIX. The code has been parallelized using MPI and tested with 1-32768 processes RAM: 10 MB-10 GB per process Classification: 4.12, 4.14, 6.5, 19.3, 19.10, 20. External routines: MPI-2 [1], boost [2], Zoltan [3], sfc++ [4] Nature of problem: Grid library supporting arbitrary data in grid cells, parallel adaptive mesh refinement, transparent remote neighbor data updates and load balancing. Solution method: The simulation grid is represented by an adjacency list (graph) with vertices stored in a hash table and edges in contiguous arrays. The Message Passing Interface standard is used for parallelization. Cell data is given as a template parameter when instantiating the grid. Restrictions: Logically cartesian grid. Running time: Running time depends on the hardware, problem and the solution method. Small problems can be solved in under a minute and very large problems can take weeks. The examples and tests provided with the package take less than about one minute using default options. In the version of dccrg presented here the speed of adaptive mesh refinement is at most of the order of 10^6 total created cells per second. [1] http://www.mpi-forum.org/ [2] http://www.boost.org/ [3] K. Devine, E. Boman, R. Heaphy, B. Hendrickson, C. Vaughan, Zoltan data management services for parallel dynamic applications, Comput. Sci. Eng. 4 (2002) 90-97. http://dx.doi.org/10.1109/5992.988653 [4] https://gitorious.org/sfc++
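The storage scheme described under "Solution method" can be caricatured in a few lines of Python (dccrg itself is C++ templates over MPI): cells live in a hash table keyed by cell id, cell data is an arbitrary user object, and copies of remote neighbors are refreshed before each compute step. The two "ranks" below are plain dictionaries standing in for MPI processes.

    # Sketch of hash-table cell storage with remote-neighbor copies; toy 1-D grid.
    NX = 8

    def neighbors(cid):
        return [c for c in (cid - 1, cid + 1) if 0 <= c < NX]

    # two fake "processes", each owning half the cells; any object is cell data
    owned = {0: {c: {"rho": float(c)} for c in range(NX // 2)},
             1: {c: {"rho": float(c)} for c in range(NX // 2, NX)}}
    remote = {0: {}, 1: {}}

    def update_remote_neighbor_data():
        # in dccrg this is an asynchronous MPI exchange; here it is a copy
        for rank in (0, 1):
            other = 1 - rank
            for cid in owned[rank]:
                for nb in neighbors(cid):
                    if nb in owned[other]:
                        remote[rank][nb] = dict(owned[other][nb])

    def solve(rank):
        # toy diffusion step using local cells plus remote neighbor copies
        get = lambda c: owned[rank].get(c) or remote[rank][c]
        new = {c: sum(get(nb)["rho"] for nb in neighbors(c)) / len(neighbors(c))
               for c in owned[rank]}
        for c, v in new.items():
            owned[rank][c]["rho"] = v

    for step in range(5):
        update_remote_neighbor_data()
        for rank in (0, 1):
            solve(rank)
    print(owned)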
NASA Technical Reports Server (NTRS)
Sweeney, Christopher; Bunnell, John; Chung, William; Giovannetti, Dean; Mikula, Julie; Nicholson, Bob; Roscoe, Mike
2001-01-01
Joint Shipboard Helicopter Integration Process (JSHIP) is a Joint Test and Evaluation (JT&E) program sponsored by the Office of the Secretary of Defense (OSD). Under the JSHIP program is a simulation effort referred to as the Dynamic Interface Modeling and Simulation System (DIMSS). The purpose of DIMSS is to develop and test the processes and mechanisms that facilitate ship-helicopter interface testing via man-in-the-loop ground-based flight simulators. Specifically, the DIMSS charter is to develop an accredited process for using a flight simulator to determine the wind-over-the-deck (WOD) launch and recovery flight envelope for the UH-60A ship/helicopter combination. DIMSS is a collaborative effort between the NASA Ames Research Center and OSD. OSD determines the T&E and warfighter training requirements, provides the programmatics and dynamic interface T&E experience, and conducts ship/aircraft interface tests for validating the simulation. NASA provides the research and development element, the simulation facility, and simulation technical experience. This paper will highlight the benefits of the NASA/JSHIP collaboration and detail achievements of the project in terms of modeling and simulation. The Vertical Motion Simulator (VMS) at NASA Ames Research Center offers the capability to simulate a wide range of simulation cueing configurations, including visual, aural, and body-force cueing devices. The system's flexibility enables switching configurations to allow back-to-back evaluation and comparison of different levels of cueing fidelity in determining minimum training requirements. The investigation required development and integration of several major simulation systems at the VMS. A new UH-60A BlackHawk interchangeable cab that provides an out-the-window (OTW) field-of-view (FOV) of 220 degrees in azimuth and 70 degrees in elevation was built. Modeling efforts involved integrating Computational Fluid Dynamics (CFD) generated data of an LHA ship airwake and integrating a real-time ship motion model developed from a batch model from the Naval Surface Warfare Center. Engineering development and integration of a three degrees-of-freedom (DOF) dynamic seat, which simulates high-frequency rotor-dynamics-dependent motion cues for use in conjunction with the large motion system, was accomplished. An LHA visual model at several different levels of resolution and an aural cueing system with three selectable fidelity levels were developed. VMS also integrated a PC-based E&S simFUSION system to investigate cost-effective IG alternatives. The DIMSS project consists of three phases that follow an approved Verification, Validation and Accreditation (VV&A) process. The first phase will support the accreditation of the individual subsystems and models. The second will follow the verification and validation of the integrated subsystems and models, and will address fidelity requirements of the integrated models and subsystems. The third and final phase will allow the verification and validation of the full system integration. This VV&A process will address the utility of the simulated WOD launch and recovery envelope. Simulations supporting the first two phases have been completed, and the data are currently being reviewed and analyzed.
Design of neurophysiologically motivated structures of time-pulse coded neurons
NASA Astrophysics Data System (ADS)
Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Lobodzinska, Raisa F.
2009-04-01
A common methodology for the biologically motivated concept of building sensor processing systems with parallel input, picture-operand processing, and time-pulse coding is described in this paper. The advantages of such coding for creating programmable parallel 2D-array structures for next-generation digital computers, which require nontraditional numerical systems for processing analog, digital, hybrid, and neuro-fuzzy operands, are shown. Simulation results for the optoelectronic time-pulse coded intelligent neural elements (OETPCINE), and implementation results for a wide set of neuro-fuzzy logic operations, are considered. The simulation results confirm the engineering advantages, intelligence, and circuit flexibility of OETPCINE for creating advanced 2D structures. The developed equivalentor-nonequivalentor neural element has a power consumption of 10 mW and a processing time of about 10-100 µs.
VPython: Writing Real-time 3D Physics Programs
NASA Astrophysics Data System (ADS)
Chabay, Ruth
2001-06-01
VPython (http://cil.andrew.cmu.edu/projects/visual) combines the Python programming language with an innovative 3D graphics module called Visual, developed by David Scherer. Designed to make 3D physics simulations accessible to novice programmers, VPython allows the programmer to write a purely computational program without any graphics code, and produces an interactive realtime 3D graphical display. In a program 3D objects are created and their positions modified by computational algorithms. Running in a separate thread, the Visual module monitors the positions of these objects and renders them many times per second. Using the mouse, one can zoom and rotate to navigate through the scene. After one hour of instruction, students in an introductory physics course at Carnegie Mellon University, including those who have never programmed before, write programs in VPython to model the behavior of physical systems and to visualize fields in 3D. The Numeric array processing module allows the construction of more sophisticated simulations and models as well. VPython is free and open source. The Visual module is based on OpenGL, and runs on Windows, Linux, and Macintosh.
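A program of the kind the abstract describes looks like the following sketch, written against the current vpython package (the 2001 module was imported as visual): every line is computational, and the Visual layer renders and animates the scene on its own.

    # A bouncing ball in VPython: no graphics code, only physics.
    from vpython import sphere, box, vector, color, rate

    floor = box(pos=vector(0, 0, 0), size=vector(4, 0.1, 4))
    ball = sphere(pos=vector(0, 2, 0), radius=0.25, color=color.red)
    v = vector(0, 0, 0)                      # ball velocity
    g = vector(0, -9.8, 0)                   # gravitational acceleration
    dt = 0.005
    for _ in range(4000):                    # ~20 s of simulated motion
        rate(200)                            # at most 200 loop iterations per second
        v = v + g * dt                       # purely computational Euler step
        ball.pos = ball.pos + v * dt         # the Visual layer redraws automatically
        if ball.pos.y < floor.pos.y + ball.radius and v.y < 0:
            v.y = -v.y                       # elastic bounce off the floor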
NASA Technical Reports Server (NTRS)
Fritsch, J. Michael; Kain, John S.
1996-01-01
Research efforts focused on numerical simulations of two convective systems with the Penn State/NCAR mesoscale model. The first of these systems was tropical cyclone Irma, which occurred in 1987 in Australia's Gulf of Carpentaria during the AMEX field program. Comparison simulations of this system were done with two different convective parameterization schemes (CPSs), the Kain-Fritsch (KF) and the Betts-Miller (BM) schemes. The second system was the June 10-11, 1985 squall line that occurred over the Kansas-Oklahoma region during the PRE-STORM experiment. Simulations of this system using the KF scheme were examined in detail.
Experimental Simulations to Understand the Lunar and Martian Surficial Processes
NASA Astrophysics Data System (ADS)
Zhao, Y. Y. S.; Li, X.; Tang, H.; Li, Y.; Zeng, X.; Chang, R.; Li, S.; Zhang, S.; Jin, H.; Mo, B.; Li, R.; Yu, W.; Wang, S.
2016-12-01
In support of China's Lunar and Mars exploration programs and beyond, our center is dedicated to understanding the surficial processes and environments of planetary bodies. Over the last several years, we have designed, built, and optimized experimental simulation facilities and utilized them to test hypotheses and evaluate affecting mechanisms under controlled conditions particularly relevant to the Moon and Mars. Among the fundamental questions to address, we emphasize five major areas: (1) Micrometeorite bombardment simulation to evaluate the formation mechanisms of np-Fe0, which is found in lunar samples, and the possible sources of Fe. (2) Solar wind implantation simulation to evaluate alteration, amorphization, and OH or H2O formation on the surface of target minerals or rocks. (3) Dust mobility characteristics on the Moon and other planetary bodies, studied by exciting different types of dust particles and measuring their movements. (4) Mars basaltic soil simulant development (e.g., the Jining Martian Soil Simulant (JMSS-1)) and applications for scientific/engineering experiments. (5) Halogen (Cl and Br) and life-essential-element (C, H, O, N, P, and S) distribution and speciation on Mars during surficial processes such as sedimentary- and photochemical-related processes. Depending on the variables of interest, the simulation systems provide the flexibility to vary the source of energy, temperature, pressure, and ambient gas composition in the reaction chambers. Also, simulation products can be observed or analyzed in situ by various analyzer components inside the chamber, without interrupting the experimental conditions. In addition, the behavior of elements and isotopes during certain surficial processes (e.g., evaporation, dissolution, etc.) can be theoretically predicted by our theoretical geochemistry group with thermodynamic-kinetic calculation and modeling, which supports experiment design and result interpretation.
NASA Technical Reports Server (NTRS)
Dubos, Gregory F.; Cornford, Steven
2012-01-01
While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can, however, become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedules as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".
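A minimal discrete-event sketch of this approach, with invented numbers rather than anything from the F6 tool: a launch milestone slips by a random delay, on-orbit operation accrues value month by month, and a random failure can end the mission, so repeated runs give a value distribution for a candidate design.

    # Toy discrete-event simulation of schedule slip and on-orbit failure.
    import heapq, random
    random.seed(1)

    def one_run():
        events = []
        # nominal 12-month development plus an exponential schedule slip
        heapq.heappush(events, (12.0 + random.expovariate(1 / 3.0), "launch"))
        value, end_of_program = 0.0, 60.0    # months
        while events:
            t, kind = heapq.heappop(events)
            if t > end_of_program:
                break
            if kind == "launch":
                heapq.heappush(events, (t + 1.0, "operate"))
            elif kind == "operate":
                value += 1.0                 # one month of delivered value
                if random.random() < 0.01:   # random on-orbit failure ends operations
                    continue
                heapq.heappush(events, (t + 1.0, "operate"))
        return value

    runs = [one_run() for _ in range(1000)]
    print("mean delivered value: %.1f months of operation" % (sum(runs) / len(runs)))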
Computational Control Workstation: Users' perspectives
NASA Technical Reports Server (NTRS)
Roithmayr, Carlos M.; Straube, Timothy M.; Tave, Jeffrey S.
1993-01-01
A Workstation has been designed and constructed for rapidly simulating motions of rigid and elastic multibody systems. We examine the Workstation from the point of view of analysts who use the machine in an industrial setting. Two aspects of the device distinguish it from other simulation programs. First, one uses a series of windows and menus on a computer terminal, together with a keyboard and mouse, to provide a mathematical and geometrical description of the system under consideration. The second hallmark is a facility for animating simulation results. An assessment of the amount of effort required to numerically describe a system to the Workstation is made by comparing the process to that used with other multibody software. The apparatus for displaying results as a motion picture is critiqued as well. In an effort to establish confidence in the algorithms that derive, encode, and solve equations of motion, simulation results from the Workstation are compared to answers obtained with other multibody programs. Our study includes measurements of computational speed.
Comas, Mercè; Arrospide, Arantzazu; Mar, Javier; Sala, Maria; Vilaprinyó, Ester; Hernández, Cristina; Cots, Francesc; Martínez, Juan; Castells, Xavier
2014-01-01
To assess the budgetary impact of switching from screen-film mammography to full-field digital mammography in a population-based breast cancer screening program. A discrete-event simulation model was built to reproduce the breast cancer screening process (biennial mammographic screening of women aged 50 to 69 years) combined with the natural history of breast cancer. The simulation started with 100,000 women and, during a 20-year simulation horizon, new women were dynamically entered according to the aging of the Spanish population. Data on screening were obtained from Spanish breast cancer screening programs. Data on the natural history of breast cancer were based on US data adapted to our population. A budget impact analysis comparing digital with screen-film screening mammography was performed in a sample of 2,000 simulation runs. A sensitivity analysis was performed for crucial screening-related parameters. Distinct scenarios for recall and detection rates were compared. Statistically significant savings were found for overall costs, treatment costs and the costs of additional tests in the long term. The overall cost saving was 1,115,857€ (95%CI from 932,147 to 1,299,567) in the 10th year and 2,866,124€ (95%CI from 2,492,610 to 3,239,638) in the 20th year, representing 4.5% and 8.1% of the overall cost associated with screen-film mammography. The sensitivity analysis showed net savings in the long term. Switching to digital mammography in a population-based breast cancer screening program saves long-term budget expense, in addition to providing technical advantages. Our results were consistent across distinct scenarios representing the different results obtained in European breast cancer screening programs.
Comas, Mercè; Arrospide, Arantzazu; Mar, Javier; Sala, Maria; Vilaprinyó, Ester; Hernández, Cristina; Cots, Francesc; Martínez, Juan; Castells, Xavier
2014-01-01
Objective: To assess the budgetary impact of switching from screen-film mammography to full-field digital mammography in a population-based breast cancer screening program. Methods: A discrete-event simulation model was built to reproduce the breast cancer screening process (biennial mammographic screening of women aged 50 to 69 years) combined with the natural history of breast cancer. The simulation started with 100,000 women and, during a 20-year simulation horizon, new women were dynamically entered according to the aging of the Spanish population. Data on screening were obtained from Spanish breast cancer screening programs. Data on the natural history of breast cancer were based on US data adapted to our population. A budget impact analysis comparing digital with screen-film screening mammography was performed in a sample of 2,000 simulation runs. A sensitivity analysis was performed for crucial screening-related parameters. Distinct scenarios for recall and detection rates were compared. Results: Statistically significant savings were found for overall costs, treatment costs and the costs of additional tests in the long term. The overall cost saving was 1,115,857€ (95%CI from 932,147 to 1,299,567) in the 10th year and 2,866,124€ (95%CI from 2,492,610 to 3,239,638) in the 20th year, representing 4.5% and 8.1% of the overall cost associated with screen-film mammography. The sensitivity analysis showed net savings in the long term. Conclusions: Switching to digital mammography in a population-based breast cancer screening program saves long-term budget expense, in addition to providing technical advantages. Our results were consistent across distinct scenarios representing the different results obtained in European breast cancer screening programs. PMID:24832200
1988-06-01
became apparent. ESC originally planned to confect a dedicated model, i.e., one specifically designed to address Korea. However, it reconsidered the...s) and should not be construed as an official US Department of the Army position, policy, or decision unless so designated by other official...model based on object-oriented programming design techniques, and uses the process view of simulation to achieve its purpose. As a direct con
Range Image Processing for Local Navigation of an Autonomous Land Vehicle.
1986-09-01
such as doing long term exploration missions on the surface of the planets which mankind may wish to investigate. Certainly, mankind will soon return...intelligence programming, walking technology, and vision sensors to name but a few. The purpose of this thesis will be to investigate, by simulation...bitmap graphics, both of which are important to this simulation. Finally, the methodology for displaying the symbolic information generated by the
NASA Technical Reports Server (NTRS)
Dexter, Daniel E.; Varesic, Tony E.
2015-01-01
This document describes the design of the Integrated Mission Simulation (IMSim) federate multiphase initialization process. The main goal of multiphase initialization is to allow for data interdependencies during the federate initialization process. IMSim uses the High Level Architecture (HLA) IEEE 1516 [1] to provide the communication and coordination between the distributed parts of the simulation. They are implemented using the Runtime Infrastructure (RTI) from Pitch Technologies AB. This document assumes a basic understanding of IEEE 1516 HLA, and C++ programming. In addition, there are several subtle points in working with IEEE 1516 and the Pitch RTI that need to be understood, which are covered in Appendix A. Please note the C++ code samples shown in this document are for the IEEE 1516-2000 standard.
NASA Technical Reports Server (NTRS)
Cassidy, J. J., III
1978-01-01
NASCAP simulates the charging process for a complex object in either tenuous plasma (geosynchronous orbit) or ground test (electron gun source) environment. Program control words, the structure of user input files, and various user options available are described in this computer programmer's user manual.
Simulation and analysis of main steam control system based on heat transfer calculation
NASA Astrophysics Data System (ADS)
Huang, Zhenqun; Li, Ruyan; Feng, Zhongbao; Wang, Songhan; Li, Wenbo; Cheng, Jiwei; Jin, Yingai
2018-05-01
In this paper, the boiler of a 300 MW thermal power plant was studied. MATLAB was used to write a calculation program for the heat transfer process between the main steam and the boiler flue gas, and the amount of water needed to keep the main steam temperature at its target value was calculated. The heat transfer calculation program was then introduced into the Simulink simulation platform, on which a control system based on multiple-model switching and heat transfer calculation was built. The results show that the multiple-model switching control system based on heat transfer calculation not only overcomes the large-inertia and large-hysteresis characteristics of the main steam temperature, but also adapts to changes in boiler load.
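The "amount of water" calculation can be illustrated by a spray-attemperator energy balance (illustrative enthalpies and flows; the paper's MATLAB program computes the gas-side heat transfer rather than assuming the steam enthalpy):

    # Back-of-envelope spray water calculation; all values are assumed.
    def spray_flow(m_steam, h_steam, h_target, h_water):
        # mixing balance: m_s*h_s + m_w*h_w = (m_s + m_w)*h_target
        return m_steam * (h_steam - h_target) / (h_target - h_water)

    m_steam = 250.0      # kg/s main steam flow for a ~300 MW unit (assumed)
    h_steam = 3450.0     # kJ/kg, steam running hotter than the target
    h_target = 3400.0    # kJ/kg, enthalpy at the target temperature
    h_water = 700.0      # kJ/kg, spray water enthalpy
    print("required spray flow: %.1f kg/s"
          % spray_flow(m_steam, h_steam, h_target, h_water))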
Simulation validation and management
NASA Astrophysics Data System (ADS)
Illgen, John D.
1995-06-01
Illgen Simulation Technologies, Inc., has been working on interactive verification and validation programs for the past six years. As a result, the company has evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs a unique set of computer-assisted software engineering (CASE) tools to reverse engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper will describe the ISTI-developed methodology and how CASE tools are used in its support. Case studies will be discussed.
NASA Technical Reports Server (NTRS)
Wu, S. T. (Editor); Christensen, D. L.; Head, R. R.
1978-01-01
Demonstration projects, systems-subsystems simulation programs, applications (heating, cooling, agricultural, industrial), and climatic data testing (standards, economics, institutional) are the topics of the book. Economics of preheating water for commercial use and collecting, processing, and dissemination of data for the national demonstration program are discussed. Computer simulation of a solar energy system and graphical representation of solar collector performance are considered. Attention is given to solar driven heat pumps, solar cooling equipment, hybrid passive/active solar systems, and solar farm buildings. Evaluation of a thermographic scanning device for solar energy and conservation applications, use of meteorological data in system evaluation, and biomass conversion potential are presented.
Computer-aided engineering of semiconductor integrated circuits
NASA Astrophysics Data System (ADS)
Meindl, J. D.; Dutton, R. W.; Gibbons, J. F.; Helms, C. R.; Plummer, J. D.; Tiller, W. A.; Ho, C. P.; Saraswat, K. C.; Deal, B. E.; Kamins, T. I.
1980-07-01
Economical procurement of small quantities of high performance custom integrated circuits for military systems is impeded by inadequate process, device and circuit models that handicap low cost computer aided design. The principal objective of this program is to formulate physical models of fabrication processes, devices and circuits to allow total computer-aided design of custom large-scale integrated circuits. The basic areas under investigation are (1) thermal oxidation, (2) ion implantation and diffusion, (3) chemical vapor deposition of silicon and refractory metal silicides, (4) device simulation and analytic measurements. This report discusses the fourth year of the program.
Simulating Operations at a Spaceport
NASA Technical Reports Server (NTRS)
Nevins, Michael R.
2007-01-01
SPACESIM is a computer program for detailed simulation of operations at a spaceport. SPACESIM is being developed to greatly improve existing spaceports and to aid in designing, building, and operating future spaceports, given that there is a worldwide trend in spaceport operations from very expensive, research- oriented launches to more frequent commercial launches. From an operational perspective, future spaceports are expected to resemble current airports and seaports, for which it is necessary to resolve issues of safety, security, efficient movement of machinery and people, cost effectiveness, timeliness, and maximizing effectiveness in utilization of resources. Simulations can be performed, for example, to (1) simultaneously analyze launches of reusable and expendable rockets and identify bottlenecks arising from competition for limited resources or (2) perform what-if scenario analyses to identify optimal scenarios prior to making large capital investments. SPACESIM includes an object-oriented discrete-event-simulation engine. (Discrete- event simulation has been used to assess processes at modern seaports.) The simulation engine is built upon the Java programming language for maximum portability. Extensible Markup Language (XML) is used for storage of data to enable industry-standard interchange of data with other software. A graphical user interface facilitates creation of scenarios and analysis of data.
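A what-if scenario of the bottleneck-identification kind is easy to sketch with the SimPy library standing in for SPACESIM's Java engine (all durations invented): two launch campaigns compete for a single pad, and the reported waiting times expose the contention for the shared resource.

    # Toy spaceport resource-contention model; numbers and names are invented.
    import simpy

    def campaign(env, pad, name, prep_days, n_flights):
        for k in range(n_flights):
            with pad.request() as req:           # compete for the shared pad
                queued = env.now
                yield req                        # wait if the other program holds it
                yield env.timeout(prep_days)     # pad occupied during launch prep
                print("day %5.1f  %s flight %d (waited %.1f days)"
                      % (env.now, name, k + 1, env.now - queued - prep_days))

    env = simpy.Environment()
    pad = simpy.Resource(env, capacity=1)
    env.process(campaign(env, pad, "reusable", prep_days=10, n_flights=5))
    env.process(campaign(env, pad, "expendable", prep_days=25, n_flights=3))
    env.run()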
Modeling Engineered Nanomaterials (ENMs) Fate and ...
Under the Toxic Substances Control Act (TSCA), the Environmental Protection Agency (EPA) is required to perform new chemical reviews of engineered nanomaterials (ENMs) identified in pre-manufacture notices. However, environmental fate models developed for traditional contaminants are limited in their ability to simulate the environmental behavior of nanomaterials due to incomplete understanding and representation of the processes governing nanomaterial distribution in the environment and by scarce empirical data quantifying the interaction of nanomaterials with environmental surfaces. We have updated the Water Quality Analysis Simulation Program (WASP), version 8, to incorporate nanomaterials as an explicitly simulated state variable. WASP8 now has the capability to simulate nanomaterial fate and transport in surface waters and sediments using heteroaggregation, the kinetic process governing the attachment of nanomaterials to particles and subsequently ENM distribution in the aqueous and sediment phases. Unlike dissolved chemicals, which use equilibrium partition coefficients, heteroaggregation consists of a particle collision rate and an attachment efficiency (αhet) that generally acts as a one-directional process. To demonstrate, we used an αhet value derived from sediment attachment studies to parameterize WASP for simulation of multi-walled carbon nanotube (MWCNT) transport in Brier Creek, a coastal plain river located in central eastern Georgia, USA and a tr
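The contrast drawn above between equilibrium partitioning and one-directional heteroaggregation kinetics can be shown in a few lines (illustrative rate constants, not WASP parameters): free nanomaterial attaches to suspended solids at a rate αhet × k_coll and does not detach.

    # One-directional heteroaggregation kinetics; illustrative values only.
    import math

    alpha_het = 0.01                 # attachment efficiency (dimensionless, assumed)
    k_coll = 50.0                    # collision rate with solids (1/day, assumed)
    c_free, c_attached = 1.0, 0.0    # ug/L of free vs particle-bound ENM
    dt = 0.01                        # days
    for step in range(500):          # 5 simulated days, explicit Euler
        transfer = alpha_het * k_coll * c_free * dt
        c_free -= transfer           # attachment only: no back-reaction term
        c_attached += transfer
    print("free: %.3f  attached: %.3f  (analytic free: %.3f)"
          % (c_free, c_attached, math.exp(-alpha_het * k_coll * 5.0)))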
Solid State Division progress report for period ending September 30, 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, P.H.; Hinton, L.W.
1994-08-01
This report covers research progress in the Solid State Division from April 1, 1992, to September 30, 1993. During this period, the division conducted a broad, interdisciplinary materials research program with emphasis on theoretical solid state physics, neutron scattering, synthesis and characterization of materials, ion beam and laser processing, and the structure of solids and surfaces. This research effort was enhanced by new capabilities in atomic-scale materials characterization, new emphasis on the synthesis and processing of materials, and increased partnering with industry and universities. The theoretical effort included a broad range of analytical studies, as well as a new emphasis on numerical simulation stimulated by advances in high-performance computing and by strong interest in related division experimental programs. Superconductivity research continued to advance on a broad front from fundamental mechanisms of high-temperature superconductivity to the development of new materials and processing techniques. The Neutron Scattering Program was characterized by a strong scientific user program and growing diversity represented by new initiatives in complex fluids and residual stress. The national emphasis on materials synthesis and processing was mirrored in division research programs in thin-film processing, surface modification, and crystal growth. Research on advanced processing techniques such as laser ablation, ion implantation, and plasma processing was complemented by strong programs in the characterization of materials and surfaces including ultrahigh resolution scanning transmission electron microscopy, atomic-resolution chemical analysis, synchrotron x-ray research, and scanning tunneling microscopy.
SCE&G Cope Station simulator training program development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stottlemire, J.L.; Fabry, R.
1996-11-01
South Carolina Electric and Gas Company made a significant investment into meeting the needs of their customers in designing and building the new fossil Generating Station near Cope, South Carolina. Cope Station is a state-of-the-art, 385 MW plant, with equipment and design features that will provide the plant with the capabilities of achieving optimum availability and capability. SCE&G has also implemented a team concept approach to plant organization at Cope Station. The modern plant design, operating philosophy, and introduction of a large percentage of new operations personnel presented a tremendous challenge in preparing for plant commissioning and commercial operation. SCE&G's answer to this challenge was to hire an experienced operations trainer and implement a comprehensive training program. An important part of the training investment was the procurement of a plant-specific control room simulator. SCE&G, through tailored collaboration with the Electric Power Research Institute (EPRI), developed a specification for a simulator with the features necessary for training the initial plant staff as well as advanced operator training. The high-fidelity CRT-based training simulator is a stimulated system that completely and accurately simulates the various plant systems, process startups, shutdowns, normal operating scenarios, and malfunctions. The process model stimulates a Foxboro Distributed Control System consisting of twelve control processors, five WP51 workstations, and one AW51 file server. The workstations, file server, and support hardware and software necessary to interface with ESSCOR's FSIM4 software were provided by Foxboro.
NASA Astrophysics Data System (ADS)
Manninen, L. M.
1993-12-01
The document describes TKKMOD, a simulation model developed at Helsinki University of Technology for a specific wind-diesel system layout, with special emphasis on the battery submodel and its use in simulation. The model has been included in the European wind-diesel modeling software package WDLTOOLS under the CEC JOULE project 'Engineering Design Tools for Wind-Diesel Systems' (JOUR-0078). WDLTOOLS serves as the user interface and processes the input and output data of the different logistic simulation models developed by the project participants. TKKMOD cannot be run without this shell. The report describes only the simulation principles and model-specific parameters of TKKMOD and gives model-specific user instructions. The input and output data processing performed outside this model is described in the documentation of the shell. The simulation model is utilized for calculation of the long-term performance of the reference system configuration for given wind and load conditions. The main results are energy flows, losses in the system components, diesel fuel consumption, and the number of diesel engine starts.
Fast simulation of Proton Induced X-Ray Emission Tomography using CUDA
NASA Astrophysics Data System (ADS)
Beasley, D. G.; Marques, A. C.; Alves, L. C.; da Silva, R. C.
2013-07-01
A new 3D Proton Induced X-Ray Emission Tomography (PIXE-T) and Scanning Transmission Ion Microscopy Tomography (STIM-T) simulation software has been developed in Java and uses NVIDIA™ Common Unified Device Architecture (CUDA) to calculate the X-ray attenuation for large detector areas. A challenge with PIXE-T is to get sufficient counts while retaining a small beam spot size. Therefore a high geometric efficiency is required. However, as the detector solid angle increases, the calculations required for accurate reconstruction of the data increase substantially. To overcome this limitation, the CUDA parallel computing platform was used, which enables general-purpose programming of NVIDIA graphics processing units (GPUs) to perform computations traditionally handled by the central processing unit (CPU). For simulation performance evaluation, the results of a CPU- and a CUDA-based simulation of a phantom are presented. Furthermore, a comparison with the simulation code in the PIXE-Tomography reconstruction software DISRA (A. Sakellariou, D.N. Jamieson, G.J.F. Legge, 2001) is also shown. Compared to a CPU implementation, the CUDA-based simulation is approximately 30× faster.
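The computation being accelerated is, at its core, a Beer-Lambert attenuation sum along many source-to-detector paths. A schematic numpy version of that inner computation (not the DISRA or CUDA code, and with made-up geometry and coefficients) might look like:

```python
import numpy as np

# Schematic X-ray attenuation over many detector pixels at once
# (Beer-Lambert: I = I0 * exp(-sum(mu * dl)) along each path). The CUDA
# code described above evaluates such sums in parallel on the GPU; this
# numpy sketch only illustrates the per-path computation.
rng = np.random.default_rng(0)
n_paths, n_steps = 10_000, 256          # one path per detector pixel
mu = rng.uniform(5.0, 50.0, (n_paths, n_steps))  # attenuation coeff. (1/cm)
dl = 1e-4                                        # path step length (cm)
transmission = np.exp(-(mu * dl).sum(axis=1))    # vectorized over all paths
print(f"mean transmission: {transmission.mean():.3f}")
```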
NASA Astrophysics Data System (ADS)
Faber, Tracy L.; Garcia, Ernest V.; Lalush, David S.; Segars, W. Paul; Tsui, Benjamin M.
2001-05-01
The spline-based Mathematical Cardiac Torso (MCAT) phantom is a realistic software simulation designed to simulate single photon emission computed tomographic (SPECT) data. It incorporates a heart model of known size and shape; thus, it is invaluable for measuring accuracy of acquisition, reconstruction, and post-processing routines. New functionality has been added by replacing the standard heart model with left ventricular (LV) epicardial and endocardial surface points detected from actual patient SPECT perfusion studies. LV surfaces detected from standard post-processing quantitation programs are converted through interpolation in space and time into new B-spline models. Perfusion abnormalities are added to the model based on results of standard perfusion quantification. The new LV is translated and rotated to fit within existing atria and right ventricular models, which are scaled based on the size of the LV. Simulations were created for five different patients with myocardial infarctions who had undergone SPECT perfusion imaging. Shape, size, and motion of the resulting activity map were compared visually to the original SPECT images. In all cases, size, shape and motion of simulated LVs matched well with the original images. Thus, realistic simulations with known physiologic and functional parameters can be created for evaluating efficacy of processing algorithms.
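A hedged sketch of the interpolation step, fitting a closed B-spline through detected contour points with SciPy; the points below are synthetic, and the MCAT implementation details are not given in the abstract:

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Fit a closed B-spline through detected endocardial contour points (one
# short-axis slice), illustrating the conversion of detected SPECT surface
# points into a smooth parametric spline model. Points are synthetic.
theta = np.linspace(0, 2 * np.pi, 24, endpoint=False)
r = 25 + 2 * np.sin(3 * theta)              # fake detected radii (mm)
x, y = r * np.cos(theta), r * np.sin(theta)

tck, _ = splprep([x, y], s=1.0, per=True)   # periodic (closed) B-spline
u_fine = np.linspace(0, 1, 200)
xs, ys = splev(u_fine, tck)                 # smooth resampled contour
print(f"resampled contour with {len(xs)} points")
```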
Rizal, Datu; Tani, Shinichi; Nishiyama, Kimitoshi; Suzuki, Kazuhiko
2006-10-11
In this paper, a novel methodology for batch plant safety and reliability analysis is proposed using a dynamic simulator. A batch process involving several safety objects (e.g., sensors, controllers, valves) is activated during the operational stage. The performance of the safety objects is evaluated by dynamic simulation and a fault propagation model is generated. By using the fault propagation model, an improved fault tree analysis (FTA) method using switching signal mode (SSM) is developed for estimating the probability of failures. The time-dependent failures can be considered as unavailability of safety objects that can cause accidents in a plant. Finally, the rank of each safety object is formulated as a performance index (PI) and can be estimated using importance measures. PI shows the prioritization of safety objects that should be investigated in a safety improvement program in the plants. The output of this method can be used for optimal policy in safety object improvement and maintenance. The dynamic simulator was constructed using Visual Modeler (VM, the plant simulator developed by Omega Simulation Corp., Japan). A case study is focused on a loss of containment (LOC) incident at a polyvinyl chloride (PVC) batch process which consumes the hazardous material vinyl chloride monomer (VCM).
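As a toy illustration of ranking safety objects by an importance measure, consider a two-component fault tree; the probabilities are illustrative and the paper's SSM-based FTA is considerably more elaborate:

```python
# Minimal fault-tree sketch: failure probability of a two-object system
# (sensor OR valve failure causes the top event) and Birnbaum importance,
# one common importance measure for ranking components. Numbers are
# illustrative stand-ins for unavailabilities from dynamic simulation.
def p_or(p_a, p_b):              # top event if either object is unavailable
    return p_a + p_b - p_a * p_b

p_sensor, p_valve = 0.02, 0.05

# Birnbaum importance: P(top | i failed) - P(top | i works)
I_sensor = p_or(1.0, p_valve) - p_or(0.0, p_valve)
I_valve = p_or(p_sensor, 1.0) - p_or(p_sensor, 0.0)
ranking = sorted([("sensor", I_sensor), ("valve", I_valve)],
                 key=lambda kv: kv[1], reverse=True)
print(ranking)   # prioritization of safety objects for improvement
```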
Advanced physical-chemical life support systems research
NASA Technical Reports Server (NTRS)
Evanich, Peggy L.
1988-01-01
A proposed NASA space research and technology development program will provide adequate data for designing closed-loop life support systems for long-duration manned space missions. This program, referred to as the Pathfinder Physical-Chemical Closed Loop Life Support Program, aims to identify and develop critical chemical engineering technologies for the closure of air and water loops within spacecraft, surface habitats, or mobility devices. Computerized simulation can be used both as a research and a management tool. Validated models will guide the selection of the best known applicable processes and the development of new processes. For the integration of the habitat system, a biological subsystem would be introduced to provide food production and to enhance the physical-chemical life support functions on an ever-increasing basis.
Markov chains for testing redundant software
NASA Technical Reports Server (NTRS)
White, Allan L.; Sjogren, Jon A.
1988-01-01
A preliminary design for a validation experiment has been developed that addresses several problems unique to assuring the extremely high quality of multiple-version programs in process-control software. The procedure uses Markov chains to model the error states of the multiple version programs. The programs are observed during simulated process-control testing, and estimates are obtained for the transition probabilities between the states of the Markov chain. The experimental Markov chain model is then expanded into a reliability model that takes into account the inertia of the system being controlled. The reliability of the multiple version software is computed from this reliability model at a given confidence level using confidence intervals obtained for the transition probabilities during the experiment. An example demonstrating the method is provided.
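A minimal sketch of the estimation step, counting observed transitions between error states and attaching a normal-approximation confidence interval to the estimates; the states, probabilities, and restart rule below are synthetic, not taken from the experiment design:

```python
import numpy as np

# Estimate Markov-chain transition probabilities from observed error states
# of multiple-version programs during simulated testing. States here are
# illustrative: 0 = versions agree, 1 = one version diverges, 2 = failure.
rng = np.random.default_rng(1)
true_P = np.array([[0.97, 0.02, 0.01],
                   [0.50, 0.45, 0.05],
                   [0.00, 0.00, 1.00]])

counts = np.zeros((3, 3))
state = 0
for _ in range(20_000):                    # observed transitions during testing
    nxt = rng.choice(3, p=true_P[state])
    counts[state, nxt] += 1
    state = 0 if nxt == 2 else nxt         # restart the run after a failure

P_hat = counts / counts.sum(axis=1, keepdims=True)
n0 = counts[0].sum()
half_width = 1.96 * np.sqrt(P_hat[0] * (1 - P_hat[0]) / n0)  # row-0 CIs
print(np.round(P_hat, 3))
print("95% CI half-widths, row 0:", np.round(half_width, 4))
```

The estimated matrix and its confidence intervals would then feed the reliability model described above, which folds in the inertia of the controlled system.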
Legacy model integration for enhancing hydrologic interdisciplinary research
NASA Astrophysics Data System (ADS)
Dozier, A.; Arabi, M.; David, O.
2013-12-01
Many challenges are introduced to interdisciplinary research in and around the hydrologic science community due to advances in computing technology and modeling capabilities in different programming languages, across different platforms and frameworks by researchers in a variety of fields with a variety of experience in computer programming. Many new hydrologic models as well as optimization, parameter estimation, and uncertainty characterization techniques are developed in scripting languages such as Matlab, R, Python, or in newer languages such as Java and the .Net languages, whereas many legacy models have been written in FORTRAN and C, which complicates inter-model communication for two-way feedbacks. However, most hydrologic researchers and industry personnel have little knowledge of the computing technologies that are available to address the model integration process. Therefore, the goal of this study is to address these new challenges by utilizing a novel approach based on a publish-subscribe-type system to enhance modeling capabilities of legacy socio-economic, hydrologic, and ecologic software. Enhancements include massive parallelization of executions and access to legacy model variables at any point during the simulation process by another program without having to compile all the models together into an inseparable 'super-model'. Thus, this study provides two-way feedback mechanisms between multiple different process models that can be written in various programming languages and can run on different machines and operating systems. Additionally, a level of abstraction is given to the model integration process that allows researchers and other technical personnel to perform more detailed and interactive modeling, visualization, optimization, calibration, and uncertainty analysis without requiring deep understanding of inter-process communication. To be compatible, a program must be written in a programming language with bindings to a common implementation of the message passing interface (MPI), which includes FORTRAN, C, Java, the .NET languages, Python, R, Matlab, and many others. The system is tested on a longstanding legacy hydrologic model, the Soil and Water Assessment Tool (SWAT), to observe and enhance speed-up capabilities for various optimization, parameter estimation, and model uncertainty characterization techniques, which is particularly important for computationally intensive hydrologic simulations. Initial results indicate that the legacy extension system significantly decreases developer time, computation time, and the cost of purchasing commercial parallel processing licenses, while enhancing interdisciplinary research by providing detailed two-way feedback mechanisms between various process models with minimal changes to legacy code.
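A minimal sketch of the two-way feedback pattern over MPI using mpi4py (run with `mpiexec -n 2 python two_models.py`); the exchanged variables and the two model stand-ins are illustrative, not the project's actual interfaces or the SWAT API:

```python
from mpi4py import MPI

# Two ranks exchange state each timestep: rank 0 stands in for a legacy
# hydrologic model (e.g. SWAT), rank 1 for a socio-economic model. This
# illustrates only the message-passing feedback pattern described above.
comm = MPI.COMM_WORLD
rank = comm.Get_rank()

streamflow, demand = 100.0, 20.0
for step in range(5):
    if rank == 0:                                # "hydrologic model"
        comm.send(streamflow, dest=1, tag=step)
        demand = comm.recv(source=1, tag=step)   # feedback from rank 1
        streamflow = 0.9 * streamflow - 0.1 * demand
    elif rank == 1:                              # "socio-economic model"
        flow = comm.recv(source=0, tag=step)
        demand = min(demand, 0.3 * flow)         # adapt demand to flow
        comm.send(demand, dest=0, tag=step)

if rank == 0:
    print(f"final streamflow {streamflow:.1f}, demand {demand:.1f}")
```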
Composite Study Of Aerosol Long-Range Transport Events From East Asia And North America
NASA Astrophysics Data System (ADS)
Jiang, X.; Waliser, D. E.; Guan, B.; Xavier, P.; Petch, J.; Klingaman, N. P.; Woolnough, S.
2011-12-01
While the Madden-Julian Oscillation (MJO) exerts pronounced influences on global climate and weather systems, current general circulation models (GCMs) exhibit rather limited capability in representing this prominent tropical variability mode. Meanwhile, the fundamental physics of the MJO are still elusive. Given the central role of the diabatic heating for prevailing MJO theories and demands for reducing the model deficiencies in simulating the MJO, a global model inter-comparison project on diabatic processes and vertical heating structure associated with the MJO has been coordinated through a joint effort by the WCRP-WWRP/THORPEX YOTC MJO Task Force and GEWEX GASS Program. In this presentation, progress of this model inter-comparison project will be reported, with main focus on climate simulations from about 27 atmosphere-only and coupled GCMs. Vertical structures of heating and diabatic processes associated with the MJO based on multi-model simulations will be presented along with their reanalysis and satellite estimate counterparts. Key processes possibly responsible for a realistic simulation of the MJO, including moisture-convection interaction, gross moist stability, ocean coupling, and surface heat flux, will be discussed.
Unifying Model-Based and Reactive Programming within a Model-Based Executive
NASA Technical Reports Server (NTRS)
Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)
1999-01-01
Real-time, model-based, deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.
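A sketch of the hidden-state belief update such an executive performs over a Markov model of the plant; the two-state model and its numbers are illustrative, not an RMPL/PHCA encoding:

```python
import numpy as np

# Belief-state update over a hidden-state Markov model:
#   b'(s') ∝ P(o | s') * sum_s P(s' | s) * b(s)
T = np.array([[0.95, 0.05],     # P(s' | s): nominal -> {nominal, failed}
              [0.00, 1.00]])    #            failed stays failed
O = np.array([[0.9, 0.1],       # P(o | s): columns are obs "ok", "alarm"
              [0.3, 0.7]])

belief = np.array([1.0, 0.0])   # start: certainly nominal
for obs in [0, 0, 1, 1]:        # observations: ok, ok, alarm, alarm
    belief = O[:, obs] * (T.T @ belief)   # predict, then weight by evidence
    belief /= belief.sum()                # renormalize
    print(np.round(belief, 3))
```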
Cognitive simulation as a tool for cognitive task analysis.
Roth, E M; Woods, D D; Pople, H E
1992-10-01
Cognitive simulations are runnable computer programs that represent models of human cognitive activities. We show how one cognitive simulation built as a model of some of the cognitive processes involved in dynamic fault management can be used in conjunction with small-scale empirical data on human performance to uncover the cognitive demands of a task, to identify where intention errors are likely to occur, and to point to improvements in the person-machine system. The simulation, called Cognitive Environment Simulation or CES, has been exercised on several nuclear power plant accident scenarios. Here we report one case to illustrate how a cognitive simulation tool such as CES can be used to clarify the cognitive demands of a problem-solving situation as part of a cognitive task analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Originally developed in 1999, EnergyPlus™ received an updated version 8.8.0 with bug fixes on September 30th, 2017. EnergyPlus is a whole building energy simulation program that engineers, architects, and researchers use to model both energy consumption—for heating, cooling, ventilation, lighting and plug and process loads—and water use in buildings. EnergyPlus is a console-based program that reads input and writes output to text files. It ships with a number of utilities including IDF-Editor for creating input files using a simple spreadsheet-like interface, EP-Launch for managing input and output files and performing batch simulations, and EP-Compare for graphically comparing the results of two or more simulations. Several comprehensive graphical interfaces for EnergyPlus are also available. DOE does most of its work with EnergyPlus using the OpenStudio® software development kit and suite of applications. DOE releases major updates to EnergyPlus twice annually.
Mathematical model of salt cavern leaching for gas storage in high-insoluble salt formations.
Li, Jinlong; Shi, Xilin; Yang, Chunhe; Li, Yinping; Wang, Tongtao; Ma, Hongling
2018-01-10
A mathematical model is established to predict salt cavern development during leaching in high-insoluble salt formations. The salt-brine mass transfer rate is introduced, and the effects of the insoluble sediments on the development of the cavern are included. Considering salt mass conservation in the cavern, the coupled equations of the cavern shape, brine concentration, and brine velocity are derived. According to the falling and accumulating rules of the insoluble particles, the governing equations of the insoluble sediments are deduced. A computer program written in VC++ is developed to obtain the numerical solution of these equations. To verify the proposed model, the leaching processes of two salt caverns of the Jintan underground gas storage are simulated by the program, using the actual geological and technological parameters. The same simulation is performed by the current mainstream leaching software in China. The simulation results of the two programs are compared with the available field data. It shows that the proposed software is more accurate in predicting the shape of the cavern bottom and roof, which demonstrates the reliability and applicability of the model.
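A schematic of the kind of coupled update involved, reduced to a single cylindrical cell; the rates, geometry, and the crude fresh-water dilution term are illustrative only and are not the paper's equations:

```python
import math

# Single-cell caricature of leaching: wall recession driven by a salt-brine
# mass transfer rate, brine enriched by mass conservation, plus a crude
# dilution term standing in for fresh-water injection. Constants invented.
k = 2.0e-6                        # mass-transfer coefficient (m/s)
c_sat, rho_salt = 315.0, 2160.0   # saturation conc., salt density (kg/m3)
r, h, c = 5.0, 50.0, 50.0         # radius (m), height (m), brine conc. (kg/m3)
dt = 3600.0                       # hourly steps
for _ in range(24 * 30):          # one month
    dr = k * (c_sat - c) / c_sat * dt                 # wall recession
    dissolved = rho_salt * 2 * math.pi * r * h * dr   # salt mass added (kg)
    r += dr
    c += dissolved / (math.pi * r**2 * h)             # brine enrichment
    c = min(c_sat, c) * 0.99                          # crude injection dilution
print(f"radius after 30 days: {r:.2f} m, brine concentration: {c:.0f} kg/m3")
```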
A new algorithm for modeling friction in dynamic mechanical systems
NASA Technical Reports Server (NTRS)
Hill, R. E.
1988-01-01
A method of modeling friction forces that impede the motion of parts of dynamic mechanical systems is described. Conventional methods in which the friction effect is assumed a constant force, or torque, in a direction opposite to the relative motion, are applicable only to those cases where applied forces are large in comparison to the friction, and where there is little interest in system behavior close to the times of transitions through zero velocity. An algorithm is described that provides accurate determination of friction forces over a wide range of applied force and velocity conditions. The method avoids the simulation errors resulting from a finite integration interval used in connection with a conventional friction model, as is the case in many digital computer-based simulations. The algorithm incorporates a predictive calculation based on initial conditions of motion, externally applied forces, inertia, and integration step size. The predictive calculation in connection with an external integration process provides an accurate determination of both static and Coulomb friction forces and resulting motions in dynamic simulations. Accuracy of the results is improved over that obtained with conventional methods and a relatively large integration step size is permitted. A function block for incorporation in a specific simulation program is described. The general form of the algorithm facilitates implementation with various programming languages such as FORTRAN or C, as well as with other simulation programs.
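A generic predictive friction step in this spirit (the paper's exact function block is not reproduced in the abstract, so the decision logic below is a common stick-slip treatment, not necessarily the author's):

```python
# Predictive stick-slip friction step: before integrating, predict whether
# velocity would cross zero this step; if it would, and the applied force
# cannot break static friction, hold the body stuck instead of letting the
# solution chatter around v = 0 with a finite integration interval.
def friction_step(v, f_applied, m, dt, f_static, f_coulomb):
    if v == 0.0 and abs(f_applied) <= f_static:
        return 0.0                                 # stuck: friction cancels load
    f_fric = -f_coulomb if v > 0 else f_coulomb if v < 0 else 0.0
    v_pred = v + (f_applied + f_fric) / m * dt     # predictive calculation
    if v != 0.0 and v * v_pred < 0.0:              # would cross zero this step
        if abs(f_applied) <= f_static:
            return 0.0                             # transition to sticking
        sign = 1.0 if f_applied > 0 else -1.0      # breakaway from rest
        return (f_applied - sign * f_static) / m * dt
    return v_pred

v = 1.0
for _ in range(2000):
    v = friction_step(v, f_applied=0.5, m=1.0, dt=0.01,
                      f_static=2.0, f_coulomb=1.5)
print(v)   # settles exactly at 0 instead of oscillating around it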
NASA Technical Reports Server (NTRS)
Stevens, N. J.
1979-01-01
Cases where the charged-particle environment acts on the spacecraft (e.g., spacecraft charging phenomena) and cases where a system on the spacecraft causes the interaction (e.g., high voltage space power systems) are considered. Both categories were studied in ground simulation facilities to understand the processes involved and to measure the pertinent parameters. Computer simulations are based on the NASA Charging Analyzer Program (NASCAP) code. Analytical models are developed in this code and verified against the experimental data. Extrapolation from the small test samples to space conditions are made with this code. Typical results from laboratory and computer simulations are presented for both types of interactions. Extrapolations from these simulations to performance in space environments are discussed.
NIGHTHAWK simulates the fate and transport of biogeochemically reactive contaminants in the saturated subsurface. Version 1.2 supports batch and one- dimensional advective-dispersive-reactive transport involving a number of biogeochemical processes, including: microbially-mediate...
NASA Astrophysics Data System (ADS)
Xiang, Lin
This is a collective case study seeking to develop detailed descriptions of how programming an agent-based simulation influences a group of 8th grade students' model-based inquiry (MBI) by examining students' agent-based programmable modeling (ABPM) processes and the learning outcomes. The context of the present study was a biology unit on natural selection implemented in a charter school of a major California city during the spring semester of 2009. Eight 8th grade students, two boys and six girls, participated in this study. All of them were of low socioeconomic status (SES). English was a second language for all of them, but they had been identified as fluent English speakers at least a year before the study. None of them had learned either natural selection or programming before the study. The study spanned 7 weeks and was comprised of two study phases. In phase one the subject students learned natural selection in the science classroom and how to program in NetLogo, an ABPM tool, in a computer lab; in phase two, the subject students were asked to program a simulation of adaptation based on the natural selection model in NetLogo. Both qualitative and quantitative data were collected in this study. The data sources included (1) a pre- and post-test questionnaire, (2) student in-class worksheets, (3) programming planning sheets, (4) code-conception matching sheets, (5) student NetLogo projects, (6) videotaped programming processes, (7) final interviews, and (8) the investigator's field notes. Both qualitative and quantitative approaches were applied to analyze the gathered data. The findings suggested that students made progress on understanding adaptation phenomena and natural selection at the end of ABPM-supported MBI learning, but the progress was limited. These students still held some misconceptions in their conceptual models, such as the idea that animals need to "learn" to adapt into the environment. Besides, their models of natural selection appeared to be incomplete and many relationships among the model ideas had not been well established by the end of the study. Most of them did not treat the natural selection model as a whole but only focused on some ideas within the model. Very few of them could scientifically apply the natural selection model to interpret other evolutionary phenomena. The findings about participating students' programming processes revealed these processes were composed of consecutive programming cycles. The cycle typically included posing a task, constructing and running program codes, and examining the resulting simulation. Students held multiple ideas and applied various programming strategies in these cycles. Students were involved in MBI at each step of a cycle. Three types of ideas, six programming strategies, and ten MBI actions were identified from the processes. The relationships among these ideas, strategies, and actions were also identified and described. Findings suggested that ABPM activities could support MBI by (1) exposing students' personal models and understandings, (2) provoking and supporting a series of model-based inquiry activities, such as elaborating target phenomena, abstracting patterns, and revising conceptual models, and (3) provoking and supporting tangible and productive conversations among students, as well as between the instructor and students.
Findings also revealed three programming behaviors that appeared to impede productive MBI, including (1) solely phenomenon-orientated programming, (2) transplanting program codes, and (3) blindly running procedures. Based on the findings, I propose a general modeling process in ABPM activities, summarize the ways in which MBI can be supported in ABPM activities and constrained by multiple factors, and suggest the implications of this study in the future ABPM-assisted science instructional design and research.
Talk the Talk: Implementing a Communication Curriculum for Surgical Residents.
Newcomb, Anna B; Trickey, Amber W; Porrey, Melissa; Wright, Jeffrey; Piscitani, Franco; Graling, Paula; Dort, Jonathan
The Accreditation Council for Graduate Medical Education milestones provide a framework of specific interpersonal and communication skills that surgical trainees should aim to master. However, training and assessment of resident nontechnical skills remains challenging. We aimed to develop and implement a curriculum incorporating interactive learning principles such as group discussion and simulation-based scenarios to formalize instruction in patient-centered communication skills, and to identify best practices when building such a program. The curriculum is presented in quarterly modules over a 2-year cycle. Using our surgical simulation center for the training, we focused on proven strategies for interacting with patients and other providers. We trained and used former patients as standardized participants (SPs) in communication scenarios. Surgical simulation center in a 900-bed tertiary care hospital. Program learners were general surgery residents (postgraduate year 1-5). Trauma Survivors Network volunteers served as SPs in simulation scenarios. We identified several important lessons: (1) designing and implementing a new curriculum is a challenging process with multiple barriers and complexities; (2) several readily available facilitators can ease the implementation process; (3) with the right approach, learners, faculty, and colleagues are enthusiastic and engaged participants; (4) learners increasingly agree that communication skills can be improved with practice and appreciate the curriculum value; (5) patient SPs can be valuable members of the team; and importantly (6) the culture of patient-physician communication appears to shift with the implementation of such a curriculum. Our approach using Trauma Survivors Network volunteers as SPs could be reproduced in other institutions with similar programs. Faculty enthusiasm and support is strong, and learner participation is active. Continued focus on patient and family communication skills would enhance patient care for institutions providing such education as well as for institutions where residents continue on in fellowships or begin their surgical practice. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Exploiting parallel computing with limited program changes using a network of microcomputers
NASA Technical Reports Server (NTRS)
Rogers, J. L., Jr.; Sobieszczanski-Sobieski, J.
1985-01-01
Network computing and multiprocessor computers are two discernible trends in parallel processing. The computational behavior of an iterative distributed process in which some subtasks are completed later than others because of an imbalance in computational requirements is of significant interest. The effects of asynchronous processing were studied. A small existing program was converted to perform finite element analysis by distributing substructure analysis over a network of four Apple IIe microcomputers connected to a shared disk, simulating a parallel computer. The substructure analysis uses an iterative, fully stressed, structural resizing procedure. A framework of beams divided into three substructures is used as the finite element model. The effects of asynchronous processing on the convergence of the design variables are determined by not resizing particular substructures on various iterations.
Simulation Training in Obstetrics and Gynaecology Residency Programs in Canada.
Sanders, Ari; Wilson, R Douglas
2015-11-01
The integration of simulation into residency programs has been slower in obstetrics and gynaecology than in other surgical specialties. The goal of this study was to evaluate the current use of simulation in obstetrics and gynaecology residency programs in Canada. A 19-question survey was developed and distributed to all 16 active and accredited obstetrics and gynaecology residency programs in Canada. The survey was sent to program directors initially, but on occasion was redirected to other faculty members involved in resident education or to senior residents. Survey responses were collected over an 18-month period. Twelve programs responded to the survey (11 complete responses). Eleven programs (92%) reported introducing an obstetrics and gynaecology simulation curriculum into their residency education. All respondents (100%) had access to a simulation centre. Simulation was used to teach various obstetrical and gynaecological skills using different simulation modalities. Barriers to simulation integration were primarily the costs of equipment and space and the need to ensure dedicated time for residents and educators. The majority of programs indicated that it was a priority for them to enhance their simulation curriculum and transition to competency-based resident assessment. Simulation training has increased in obstetrics and gynaecology residency programs. The development of formal simulation curricula for use in obstetrics and gynaecology resident education is in early development. A standardized national simulation curriculum would help facilitate the integration of simulation into obstetrics and gynaecology resident education and aid in the shift to competency-based resident assessment. Obstetrics and gynaecology residency programs need national collaboration (between centres and specialties) to develop a standardized simulation curriculum for use in obstetrics and gynaecology residency programs in Canada.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiong, Yi; Fakcharoenphol, Perapon; Wang, Shihao
2013-12-01
TOUGH2-EGS-MP is a parallel numerical simulation program coupling geomechanics with fluid and heat flow in fractured and porous media, and is applicable for simulation of enhanced geothermal systems (EGS). TOUGH2-EGS-MP is based on the TOUGH2-MP code, the massively parallel version of TOUGH2. In TOUGH2-EGS-MP, the fully-coupled flow-geomechanics model is developed from linear elastic theory for thermo-poro-elastic systems and is formulated in terms of mean normal stress as well as pore pressure and temperature. Reservoir rock properties such as porosity and permeability depend on rock deformation, and the relationships between these two, obtained from poro-elasticity theories and empirical correlations, are incorporated into the simulation. This report provides the user with detailed information on the TOUGH2-EGS-MP mathematical model and instructions for using it for Thermal-Hydrological-Mechanical (THM) simulations. The mathematical model includes the fluid and heat flow equations, geomechanical equation, and discretization of those equations. In addition, the parallel aspects of the code, such as domain partitioning and communication between processors, are also included. Although TOUGH2-EGS-MP has the capability for simulating fluid and heat flows coupled with geomechanical effects, it is up to the user to select the specific coupling process, such as THM or only TH, in a simulation. There are several example problems illustrating applications of this program. These example problems are described in detail and their input data are presented. Their results demonstrate that this program can be used for field-scale geothermal reservoir simulation in porous and fractured media with fluid and heat flow coupled with geomechanical effects.
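The porosity-stress and permeability-porosity couplings referred to above are commonly of the following empirical shape; the functional forms and constants here are generic illustrations, not the correlations documented in the report:

```python
import math

# Generic poro-elastic coupling of the kind such simulators incorporate:
# porosity shrinks with effective mean stress, and permeability follows
# porosity. Functional forms and constants are illustrative only.
def porosity(sigma_eff, phi0=0.10, phi_r=0.03, a=5.0e-8):
    """Exponential porosity-stress correlation (stress in Pa)."""
    return phi_r + (phi0 - phi_r) * math.exp(-a * sigma_eff)

def permeability(phi, k0=1.0e-15, phi0=0.10, n=3.0):
    """Power-law permeability-porosity relation (m^2)."""
    return k0 * (phi / phi0) ** n

for sigma in (0.0, 10e6, 30e6):          # effective mean stress (Pa)
    phi = porosity(sigma)
    print(f"{sigma/1e6:5.1f} MPa: phi={phi:.4f}, k={permeability(phi):.2e} m^2")
```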
NASA Technical Reports Server (NTRS)
Goltz, G.; Kaiser, L. M.; Weiner, H.
1977-01-01
A computer program has been developed for designing and analyzing the performance of solar array/battery power systems for the U.S. Coast Guard Navigational Aids. This program is called the Design Synthesis/Performance Analysis (DSPA) Computer Program. The basic function of the Design Synthesis portion of the DSPA program is to evaluate functional and economic criteria to provide specifications for viable solar array/battery power systems. The basic function of the Performance Analysis portion of the DSPA program is to simulate the operation of solar array/battery power systems under specific loads and environmental conditions. This document establishes the software requirements for the DSPA computer program, discusses the processing that occurs within the program, and defines the necessary interfaces for operation.
1974-12-01
incineration of chemical agent mustard and pesticides are presented. 1. EDGEWOOD ARSENAL INCINERATION PROGRAM The name of the program which we... only 5 elements to a compound read. This was fine for mustard, but had to be altered when we wished to simulate the incineration of the nerve agent VX... input data to this program. A process flow sheet of the scrubber system is shown in Figure 1. The incinerator burns mustard agent. The off gas from
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1988-01-01
The Rubber Airplane program, which combines two symbolic processing techniques with a component-based database of design knowledge, is proposed as a computer aid for conceptual design. Using object-oriented programming, programs are organized around the objects and behavior to be simulated, and using constraint propagation, declarative statements designate mathematical relationships among all the equation variables. It is found that the additional level of organizational structure resulting from the arrangement of the design information in terms of design components provides greater flexibility and convenience.
LACIE performance predictor final operational capability program description, volume 3
NASA Technical Reports Server (NTRS)
1976-01-01
The requirements and processing logic for the LACIE Error Model program (LEM) are described. This program is an integral part of the Large Area Crop Inventory Experiment (LACIE) system. LEM is that portion of the LPP (LACIE Performance Predictor) which simulates the sample segment classification, strata yield estimation, and production aggregation. LEM controls repetitive Monte Carlo trials based on input error distributions to obtain statistical estimates of the wheat area, yield, and production at different levels of aggregation. LEM interfaces with the rest of the LPP through a set of data files.
Processing EOS MLS Level-2 Data
NASA Technical Reports Server (NTRS)
Snyder, W. Van; Wu, Dong; Read, William; Jiang, Jonathan; Wagner, Paul; Livesey, Nathaniel; Schwartz, Michael; Filipiak, Mark; Pumphrey, Hugh; Shippony, Zvi
2006-01-01
A computer program performs level-2 processing of thermal-microwave-radiance data from observations of the limb of the Earth by the Earth Observing System (EOS) Microwave Limb Sounder (MLS). The purpose of the processing is to estimate the composition and temperature of the atmosphere versus altitude from 8 to 90 km. "Level-2" as used here is a specialist's term signifying both vertical profiles of geophysical parameters along the measurement track of the instrument and processing performed by this or other software to generate such profiles. Designed to be flexible, the program is controlled via a configuration file that defines all aspects of processing, including contents of state and measurement vectors, configurations of forward models, measurement and calibration data to be read, and the manner of inverting the models to obtain the desired estimates. The program can operate in a parallel form in which one instance of the program acts as a master, coordinating the work of multiple slave instances on a cluster of computers, each slave operating on a portion of the data. Optionally, the configuration file can be made to instruct the software to produce files of simulated radiances based on state vectors formed from sets of geophysical data-product files taken as input.
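A minimal sketch of the master/slave chunking pattern using Python's multiprocessing (the retrieval step below is a placeholder, not the MLS forward-model inversion):

```python
from multiprocessing import Pool

# Master splits the along-track measurements into chunks; workers process
# their chunk independently; the master reassembles the profiles. This
# illustrates only the coordination pattern described above.
def retrieve_chunk(chunk):
    # stand-in for estimating profiles from the radiances in one chunk
    return [x * 2.0 for x in chunk]

if __name__ == "__main__":
    measurements = list(range(100))
    chunks = [measurements[i:i + 25] for i in range(0, 100, 25)]
    with Pool(processes=4) as pool:          # one "slave" per chunk
        results = pool.map(retrieve_chunk, chunks)
    profiles = [p for chunk in results for p in chunk]  # master reassembles
    print(f"{len(profiles)} profiles retrieved")
```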
PPM Receiver Implemented in Software
NASA Technical Reports Server (NTRS)
Gray, Andrew; Kang, Edward; Lay, Norman; Vilnrotter, Victor; Srinivasan, Meera; Lee, Clement
2010-01-01
A computer program has been written as a tool for developing optical pulse-position-modulation (PPM) receivers in which photodetector outputs are fed to analog-to-digital converters (ADCs) and all subsequent signal processing is performed digitally. The program can be used, for example, to simulate an all-digital version of the PPM receiver described in Parallel Processing of Broad-Band PPM Signals (NPO-40711), which appears elsewhere in this issue of NASA Tech Briefs. The program can also be translated into a design for digital PPM receiver hardware. The most notable innovation embodied in the software and the underlying PPM-reception concept is a digital processing subsystem that performs synchronization of PPM time slots, even though the digital processing is, itself, asynchronous in the sense that no attempt is made to synchronize it with the incoming optical signal a priori and there is no feedback to analog signal processing subsystems or ADCs. Functions performed by the software receiver include time-slot synchronization, symbol synchronization, coding preprocessing, and diagnostic functions. The program is written in the MATLAB and Simulink software system. The software receiver is highly parameterized and, hence, programmable: for example, slot- and symbol-synchronization filters have programmable bandwidths.
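The core digital decision reduces to integrating ADC samples in each time slot and choosing the maximum-energy slot as the symbol value. A toy numpy version, assuming slot boundaries are already known (the receiver described above also has to estimate them via slot synchronization):

```python
import numpy as np

# Toy M-ary PPM demodulation: integrate samples within each of M slots per
# symbol and pick the max-energy slot. Signal level, noise, and sampling
# are illustrative; this is not the MATLAB/Simulink receiver itself.
rng = np.random.default_rng(2)
M, samples_per_slot, n_symbols = 16, 8, 1000
true_syms = rng.integers(0, M, n_symbols)

x = rng.normal(0.0, 1.0, (n_symbols, M, samples_per_slot))   # noise
for i, s in enumerate(true_syms):
    x[i, s, :] += 2.0                     # optical pulse in the true slot

slot_energy = x.sum(axis=2)               # integrate within each slot
decisions = slot_energy.argmax(axis=1)    # max-energy slot = symbol estimate
print("symbol error rate:", np.mean(decisions != true_syms))
```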
Analysis of the possibilities and limits of the Moldflow method
NASA Astrophysics Data System (ADS)
Brierre, M.
1982-01-01
The Moldflow information and computation service is presented. Moldflow is a computer program and data bank available as a computer aid to dimensioning thermoplastic injection molding equipment and processes. It is based on the simultaneous solution of thermal and rheological equations and is intended to completely simulate the injection process. The Moldflow system is described and algorithms are discussed, based on Moldflow listings.
ROMPS critical design review. Volume 1: Hardware
NASA Technical Reports Server (NTRS)
Dobbs, M. E.
1992-01-01
Topics concerning the Robot-Operated Material Processing in Space (ROMPS) Program are presented in viewgraph form and include the following: a systems overview; servocontrol and servomechanisms; testbed and simulation results; system V controller; robot module; furnace module; SCL experiment supervisor; SCL script sample processing control; SCL experiment supervisor fault handling; block diagrams; hitchhiker interfaces; battery systems; watchdog timers; mechanical/thermal systems; and fault conditions and recovery.
System-Wide Water Resources Program Nutrient Sub-Model (SWWRP-NSM) Version 1.1
2008-09-01
species including crops, native grasses, and trees. The process descriptions utilize a single plant growth model to simulate all types of land covers... characteristics: • Multi-species, multi-phase, and multi-reaction system • Fast (equilibrium-based) and slow (non-equilibrium-based or rate-based)... Transformation and loading of N and P species in the overland flow • Simulation of the N and P cycle in the water column (both overland and
Rapid Prediction of Unsteady Three-Dimensional Viscous Flows in Turbopump Geometries
NASA Technical Reports Server (NTRS)
Dorney, Daniel J.
1998-01-01
A program is underway to improve the efficiency of a three-dimensional Navier-Stokes code and generalize it for nozzle and turbopump geometries. Code modifications will include the implementation of parallel processing software, incorporating new physical models and generalizing the multi-block capability to allow the simultaneous simulation of nozzle and turbopump configurations. The current report contains details of code modifications, numerical results of several flow simulations and the status of the parallelization effort.
SimBRS: A University/Industry Consortium Focused on Simulation Based Solutions for Ground Vehicles
2009-07-29
plan is to use the SimBRS contract mechanism to streamline a process that applies research funds into a managed program, that is cognizant to the... designs. Therefore, the challenge for the SimBRS team is to establish an approach based on the capacity of measured data and simulations to support... by systematically relating appropriate results from measurements and applied research in engineering and science. In turn, basic research and
LSPC is the Loading Simulation Program in C++, a watershed modeling system that includes streamlined Hydrologic Simulation Program Fortran (HSPF) algorithms for simulating hydrology, sediment, and general water quality.
Simulation of the hyperspectral data from multispectral data using Python programming language
NASA Astrophysics Data System (ADS)
Tiwari, Varun; Kumar, Vinay; Pandey, Kamal; Ranade, Rigved; Agarwal, Shefali
2016-04-01
Multispectral remote sensing (MRS) sensors have proved their potential in acquiring and retrieving information on Land Use Land Cover (LULC) features in the past few decades. These MRS sensors generally acquire data in a limited number of broad spectral bands, i.e., 3 to 10 bands. The limited number of bands and broad spectral bandwidth of MRS sensors become a limitation in detailed LULC studies because they cannot distinguish spectrally similar LULC features. In contrast, the detailed information available in hyperspectral (HRS) data is spectrally overdetermined and able to distinguish spectrally similar materials on the earth's surface. However, the availability of HRS sensors is presently limited because of the requirement for sensitive detectors and large storage capacity, which makes acquisition and processing cumbersome and expensive. There is therefore a need to utilize available MRS data for detailed LULC studies. The spectral reconstruction approach is one technique used for simulating hyperspectral data from available multispectral data. In the present study, the spectral reconstruction approach is utilized for the simulation of hyperspectral data using EO-1 ALI multispectral data. The technique is implemented in the Python programming language, which is open source and has support for advanced image processing libraries and utilities. In all, 70 bands have been simulated and validated using visual interpretation, statistical, and classification approaches.
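As a toy version of the idea, one can resample a multispectral pixel onto many narrow band centers by interpolation along wavelength; the band centers below are loosely based on EO-1 ALI, and the study's actual spectral reconstruction is more involved than plain interpolation:

```python
import numpy as np

# Simplified spectral-reconstruction sketch: interpolate a pixel's broad-band
# reflectances onto 70 narrow band centers. Reflectance values are invented.
ali_centers = np.array([443, 482, 565, 660, 790, 868, 1250, 1650, 2215])  # nm
pixel = np.array([0.08, 0.09, 0.12, 0.10, 0.35, 0.38, 0.30, 0.22, 0.12])

hrs_centers = np.linspace(430, 2300, 70)           # 70 simulated narrow bands
hrs_pixel = np.interp(hrs_centers, ali_centers, pixel)
print(hrs_pixel.shape)   # (70,) simulated hyperspectral spectrum per pixel
```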
Mantle Convection on Modern Supercomputers
NASA Astrophysics Data System (ADS)
Weismüller, J.; Gmeiner, B.; Huber, M.; John, L.; Mohr, M.; Rüde, U.; Wohlmuth, B.; Bunge, H. P.
2015-12-01
Mantle convection is the cause for plate tectonics, the formation of mountains and oceans, and the main driving mechanism behind earthquakes. The convection process is modeled by a system of partial differential equations describing the conservation of mass, momentum and energy. Characteristic to mantle flow is the vast disparity of length scales from global to microscopic, turning mantle convection simulations into a challenging application for high-performance computing. As system size and technical complexity of the simulations continue to increase, design and implementation of simulation models for next generation large-scale architectures is handled successfully only in an interdisciplinary context. A new priority program - named SPPEXA - by the German Research Foundation (DFG) addresses this issue, and brings together computer scientists, mathematicians and application scientists around grand challenges in HPC. Here we report on the TERRA-NEO project, which is part of the high visibility SPPEXA program, and a joint effort of four research groups. TERRA-NEO develops algorithms for future HPC infrastructures, focusing on high computational efficiency and resilience in next generation mantle convection models. We present software that can resolve the Earth's mantle with up to 10^12 grid points and scales efficiently to massively parallel hardware with more than 50,000 processors. We use our simulations to explore the dynamic regime of mantle convection and assess the impact of small scale processes on global mantle flow.
Hamman, William R; Beaudin-Seiler, Beth M; Beaubien, Jeffrey M
2010-09-01
In the report "Five Years After 'To Err is Human'", it was noted that "the combination of complexity, professional fragmentation, and a tradition of individualism, enhanced by a well-entrenched hierarchical authority structure and diffuse accountability, forms a daunting barrier to creating the habits and beliefs of common purpose, teamwork, and individual accountability for successful interdependence that a safe culture requires". Training physicians, nurses, and other professionals to work in teams is a concept that has been promoted by many patient safety experts. However, the model of teamwork in healthcare is diffusely defined, no clear performance metrics have been established, and the use of simulation to train teams has been suboptimal. This paper reports on the first three years of work performed in the Michigan Economic Development Corporation (MEDC) Tri-Corridor life science grant to apply concepts and processes of simulation design that were developed in the air carrier industry to understand and train healthcare teams. This work has been monitored by the American Association for the Advancement of Science (AAAS) and is based on concepts designed in the Advanced Qualification Program (AQP) from the air carrier industry, which trains and assesses teamwork skills in the same manner as technical skills. This grant has formed the foundation for the Center of Excellence for Simulation Education and Research (CESR).
Simulation Use in Paramedic Education Research (SUPER): A Descriptive Study
McKenna, Kim D.; Carhart, Elliot; Bercher, Daniel; Spain, Andrew; Todaro, John; Freel, Joann
2015-01-01
Objectives. The purpose of this research was to characterize the use of simulation in initial paramedic education programs in order to assist stakeholders' efforts to target educational initiatives and resources. This group sought to provide a snapshot of what simulation resources programs have or have access to and how they are used; faculty perceptions about simulation; whether program characteristics, resources, or faculty training influence simulation use; and if simulation resources are uniform for patients of all ages. Methods. This was a cross-sectional census survey of paramedic programs that were accredited or had a Letter of Review from the Committee on Accreditation of Educational Programs for the EMS Professions at the time of the study. The data were analyzed using descriptive statistics and chi-square analyses. Results. Of the 638 surveys sent, 389 valid responses (61%) were analyzed. Paramedic programs reported they have or have access to a wide range of simulation resources (task trainers [100%], simple manikins [100%], intermediate manikins [99%], advanced/fully programmable manikins [91%], live simulated patients [83%], computer-based [71%], and virtual reality [19%]); however, they do not consistently use them, particularly advanced (71%), live simulated patients (66%), computer-based (games, scenarios) (31%), and virtual reality (4%). Simulation equipment (of any type) reportedly sits idle and unused in 31% of programs. Lack of training was cited as the most common reason. Personnel support specific to simulation was available in 44% of programs. Programs reported using simulation to replace skills more frequently than to replace field or clinical hours. Simulation goals included assessment, critical thinking, and problem-solving most frequently, and patient and crew safety least often. Programs using advanced manikins report manufacturers as their primary means of training (87%) and that 19% of faculty had no training specific to those manikins. Many (78%) respondents felt they should use more simulation. Conclusions. Paramedic programs have and have access to diverse simulation resources; however, faculty training and other program resources appear to influence their use.
Computer model to simulate testing at the National Transonic Facility
NASA Technical Reports Server (NTRS)
Mineck, Raymond E.; Owens, Lewis R., Jr.; Wahls, Richard A.; Hannon, Judith A.
1995-01-01
A computer model has been developed to simulate the processes involved in the operation of the National Transonic Facility (NTF), a large cryogenic wind tunnel at the Langley Research Center. The simulation was verified by comparing the simulated results with previously acquired data from three experimental wind tunnel test programs in the NTF. The comparisons suggest that the computer model simulates reasonably well the processes that determine the liquid nitrogen (LN2) consumption, electrical consumption, fan-on time, and the test time required to complete a test plan at the NTF. From these limited comparisons, it appears that the results from the simulation model are generally within about 10 percent of the actual NTF test results. The use of actual data acquisition times in the simulation produced better estimates of the LN2 usage, as expected. Additional comparisons are needed to refine the model constants. The model will typically produce optimistic results since the times and rates included in the model are typically the optimum values. Any deviation from the optimum values will lead to longer times or increased LN2 and electrical consumption for the proposed test plan. Computer code operating instructions and listings of sample input and output files have been included.
Kim, Sung Bong; Park, Chulhwan; Kim, Seung Wook
2014-11-01
Biorefinery processes producing bioethanol from lignocellulosic biomass with dilute acid pretreatment were designed and simulated using the SuperPro Designer program. To improve the efficiency of biomass use and the economics of the biorefinery, additional pretreatment processes were designed and evaluated, in which a combined process of dilute acid and aqueous ammonia pretreatments was used, and waste media containing xylose were used for the production of 7-aminocephalosporanic acid. Finally, the productivity and economics of the designed processes were compared. Copyright © 2014 Elsevier Ltd. All rights reserved.
Fast Simulation of Electromagnetic Showers in the ATLAS Calorimeter: Frozen Showers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barberio, E.; /Melbourne U.; Boudreau, J.
2011-11-29
One of the most time-consuming processes in simulating pp interactions in the ATLAS detector at the LHC is the simulation of electromagnetic showers in the calorimeter. In order to speed up the event simulation, several parametrisation methods are available in ATLAS. In this paper we present a short description of the frozen shower technique, together with some recent benchmarks and a comparison with full simulation. The expected high rate of proton-proton collisions in the ATLAS detector at the LHC requires large samples of simulated events (Monte Carlo) to study various physics processes. A detailed simulation of particle reactions ('full simulation') in the ATLAS detector is based on GEANT4 and is very accurate. However, due to the complexity of the detector, high particle multiplicity, and GEANT4 itself, the average CPU time spent to simulate a typical QCD event in a pp collision is 20 or more minutes on modern computers. During detector simulation, most of the time is spent in the calorimeters (up to 70%), the bulk of which is required for electromagnetic particles in the electromagnetic (EM) part of the calorimeters. This is the motivation for fast simulation approaches which reduce the simulation time without affecting the accuracy. Several fast simulation methods available within the ATLAS simulation framework (the standard Athena-based simulation program) are discussed here with a focus on the novel frozen shower library (FS) technique. The results obtained with FS are presented here as well.
Spectral-element Seismic Wave Propagation on CUDA/OpenCL Hardware Accelerators
NASA Astrophysics Data System (ADS)
Peter, D. B.; Videau, B.; Pouget, K.; Komatitsch, D.
2015-12-01
Seismic wave propagation codes are essential tools to investigate a variety of wave phenomena in the Earth. Furthermore, they can now be used for seismic full-waveform inversions in regional- and global-scale adjoint tomography. Although these seismic wave propagation solvers are crucial ingredients to improve the resolution of tomographic images to answer important questions about the nature of Earth's internal processes and subsurface structure, their practical application is often limited due to high computational costs. They thus need high-performance computing (HPC) facilities to improve the current state of knowledge. At present, numerous large HPC systems embed many-core architectures such as graphics processing units (GPUs) to enhance numerical performance. Such hardware accelerators can be programmed using either the CUDA programming environment or the OpenCL language standard. CUDA software development targets NVIDIA graphics cards while OpenCL was adopted by additional hardware accelerators, e.g. AMD graphics cards, ARM-based processors, and Intel Xeon Phi coprocessors. For seismic wave propagation simulations using the open-source spectral-element code package SPECFEM3D_GLOBE, we incorporated an automatic source-to-source code generation tool (BOAST) which allows us to use meta-programming of all computational kernels for forward and adjoint runs. Using our BOAST kernels, we generate optimized source code for both CUDA and OpenCL languages within the source code package. Thus, seismic wave simulations are now able to fully utilize CUDA and OpenCL hardware accelerators. We show benchmarks of forward seismic wave propagation simulations using SPECFEM3D_GLOBE on CUDA/OpenCL GPUs, validating results and comparing performances for different simulations and hardware usages.
NASA Technical Reports Server (NTRS)
Fritsch, J. Michael (Principal Investigator); Kain, John S.
1995-01-01
Research efforts during the first year focused on numerical simulations of two convective systems with the Penn State/NCAR mesoscale model. The first of these systems was tropical cyclone Irma, which occurred in 1987 in Australia's Gulf of Carpentaria during the AMEX field program. Comparison simulations of this system were performed with two different convective parameterization schemes (CPSs): the Kain-Fritsch (KF; Kain and Fritsch 1993) and the Betts-Miller (BM; Betts 1986) schemes. The second system was the June 10-11, 1985 squall line that occurred over the Kansas-Oklahoma region during the PRE-STORM experiment. Simulations of this system using the KF scheme were examined in detail.
Baker, John [Walnut Creek, CA; Archer, Daniel E [Knoxville, TN; Luke, Stanley John [Pleasanton, CA; Decman, Daniel J [Livermore, CA; White, Gregory K [Livermore, CA
2009-06-23
A tailpulse signal generating/simulating apparatus, system, and method designed to produce electronic pulses which simulate the tailpulses produced by a gamma radiation detector, including the pileup effect caused by the characteristic exponential decay of the detector pulses and the random Poisson-distributed pulse timing of radioactive materials. A digital signal processor (DSP) is programmed and configured to produce digital values corresponding to pseudo-randomly selected pulse amplitudes and pseudo-randomly selected Poisson timing intervals of the tailpulses. Pulse amplitude values decay exponentially while the digital value is output to a digital-to-analog converter (DAC), and the amplitudes of new pulses are added to still-decaying pulses to simulate the pileup effect for enhanced realism in the simulation.
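A minimal sketch of the described pulse-generation loop, with illustrative parameter values (the function name, rates, and constants are assumptions, not taken from the patent): inter-arrival times drawn from an exponential distribution yield Poisson event timing, every output sample decays exponentially, and newly arriving pulses add on top of the decaying level to produce pileup:

```python
import math
import random

def simulate_tailpulses(rate_hz, decay_tau_s, sample_dt_s, duration_s,
                        amp_min=0.1, amp_max=1.0, seed=42):
    """Sampled tailpulse stream: Poisson timing, random amplitudes,
    exponential decay, and additive pileup."""
    rng = random.Random(seed)
    decay = math.exp(-sample_dt_s / decay_tau_s)  # per-sample decay factor
    samples, level, t = [], 0.0, 0.0
    next_event = rng.expovariate(rate_hz)  # exponential gaps -> Poisson events
    for _ in range(int(duration_s / sample_dt_s)):
        level *= decay                     # exponential tail decay
        while next_event <= t:             # pileup: new pulse rides the tail
            level += rng.uniform(amp_min, amp_max)
            next_event += rng.expovariate(rate_hz)
        samples.append(level)              # in hardware: write sample to DAC
        t += sample_dt_s
    return samples

# e.g. 10 kHz count rate, 50 us decay constant, 1 MHz sampling, 10 ms window
trace = simulate_tailpulses(1e4, 50e-6, 1e-6, 10e-3)
```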
A piezoelectric shock-loading response simulator for piezoelectric-based device developers
NASA Astrophysics Data System (ADS)
Rastegar, J.; Feng, Z.
2017-04-01
Pulsed loading of piezoelectric transducers occurs in many applications, such as munitions firing, or when a mechanical system is subjected to impact-type loading. In this paper, an electronic simulator is presented that can be programmed to generate the electrical charge that a piezoelectric transducer produces as it is subjected to various shock-loading profiles. The simulator provides close-to-realistic outputs, so that a circuit designer can test the developed system under near-realistic conditions without the costly and time-consuming process of performing actual tests. The design of the electronic simulator and the results of its testing are presented.
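For orientation, a rough sketch of the kind of charge profile such a simulator must reproduce (the half-sine shock shape and all constants are assumptions, not values from the paper): under axial force the generated charge is approximately q(t) = d33 * F(t):

```python
import math

D33 = 300e-12     # piezoelectric charge coefficient in C/N (typical PZT, assumed)
MASS = 0.01       # effective proof mass in kg (assumed)
PEAK_G = 1.0e4    # peak shock acceleration in g (assumed)
WIDTH_S = 0.5e-3  # half-sine pulse width in seconds (assumed)

def charge_profile(dt_s=1e-6):
    """Charge-vs-time samples for a half-sine shock on a piezo element."""
    out = []
    for k in range(int(WIDTH_S / dt_s)):
        t = k * dt_s
        accel = PEAK_G * 9.81 * math.sin(math.pi * t / WIDTH_S)
        force = MASS * accel             # F = m * a on the piezo stack
        out.append((t, D33 * force))     # q = d33 * F
    return out
```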
Djukic, Tijana; Mandic, Vesna; Filipovic, Nenad
2013-12-01
Medical education, training, and preoperative diagnostics can be drastically improved with advanced technologies such as virtual reality. The method proposed in this paper enables medical doctors and students to visualize and manipulate three-dimensional models created from CT or MRI scans, and to analyze the results of fluid flow simulations. Fluid flow is simulated using the finite element method in order to compute the shear stress on the artery walls; simulated motion through the artery is also supported. The virtual reality system proposed here could shorten the length of training programs and make the education process more effective. © 2013 Published by Elsevier Ltd.
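As a sanity check on the quantity being computed (not the paper's finite-element solver), fully developed laminar flow in a straight circular vessel has the closed-form wall shear stress tau_w = 4*mu*Q/(pi*R^3), which is often compared against FEM wall-shear estimates in idealized artery segments; the parameter values below are assumed, blood-like numbers:

```python
import math

def poiseuille_wall_shear(flow_m3_per_s, radius_m, viscosity_pa_s):
    """Analytic wall shear stress for Poiseuille flow in a circular tube:
    tau_w = 4 * mu * Q / (pi * R^3)."""
    return 4.0 * viscosity_pa_s * flow_m3_per_s / (math.pi * radius_m ** 3)

# e.g. Q = 5 mL/s, R = 2 mm, mu = 3.5 mPa*s (assumed, blood-like)
tau = poiseuille_wall_shear(5e-6, 2e-3, 3.5e-3)
print(f"wall shear stress ~ {tau:.2f} Pa")  # ~2.8 Pa, a plausible arterial value
```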
Artificial intelligence (AI) based tactical guidance for fighter aircraft
NASA Technical Reports Server (NTRS)
Mcmanus, John W.; Goodrich, Kenneth H.
1990-01-01
A research program investigating the use of artificial intelligence (AI) techniques to aid in the development of a Tactical Decision Generator (TDG) for Within Visual Range air combat engagements is discussed. The application of AI programming and problem-solving methods in the development and implementation of the Computerized Logic For Air-to-Air Warfare Simulations (CLAWS), a second-generation TDG, is presented. The knowledge-based systems used by CLAWS to aid the tactical decision-making process are outlined in detail, and results are presented from tests evaluating the performance of CLAWS against a baseline TDG developed in FORTRAN to run in real time in the Langley Differential Maneuvering Simulator. To date, these test results have shown significant performance gains over the baseline TDG in one-versus-one air combat engagements, and the AI-based TDG software has proven much easier to modify and maintain than the baseline FORTRAN TDG programs.
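As a purely illustrative sketch of the knowledge-based style of decision logic (the rules, state variables, and maneuver names below are invented and bear no relation to the actual CLAWS rule base):

```python
def select_maneuver(state):
    """Tiny ordered rule set choosing an air-combat maneuver from a
    relative-geometry state; the first matching rule wins."""
    rules = [
        (lambda s: s["opponent_behind"],                  "break_turn"),
        (lambda s: s["range_nm"] > 5.0,                   "accelerate_to_close"),
        (lambda s: abs(s["angle_off_deg"]) < 30
                   and s["closure_kt"] > 100,             "lag_pursuit"),
        (lambda s: True,                                  "pure_pursuit"),  # default
    ]
    for condition, maneuver in rules:
        if condition(state):
            return maneuver

print(select_maneuver({"range_nm": 2.0, "angle_off_deg": 20,
                       "closure_kt": 150, "opponent_behind": False}))
# -> lag_pursuit
```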