Cook, Tessa S; Hernandez, Jessica; Scanlon, Mary; Langlotz, Curtis; Li, Chun-Der L
2016-07-01
Despite its increasing use in training other medical specialties, high-fidelity simulation to prepare diagnostic radiology residents for call remains an underused educational resource. To characterize the barriers to adoption of this technology, we conducted a survey of academic radiologists and radiology trainees. An Institutional Review Board-approved survey was distributed to Association of University Radiologists members via e-mail. Survey results were collected electronically, tabulated, and analyzed. A total of 68 survey responses representing 51 programs were received from program directors, department chairs, chief residents, and program administrators. The most common form of educational activity for resident call preparation was lectures. Faculty-supervised "baby call" was also widely reported. Actual simulated call environments were rare, with only three programs reporting this type of educational activity. Barriers to the use of simulation included lack of faculty time, lack of faculty expertise, and lack of perceived need. High-fidelity simulation can mimic the high-stress, high-stakes independent call environment that the typical radiology resident encounters during the second year of training, and can provide objective data for program directors to assess the Accreditation Council for Graduate Medical Education milestones. We predict that this technology will begin to supplement traditional diagnostic radiology teaching methods and to improve patient care and safety in the next decade. Published by Elsevier Inc.
Performance Analysis of an Actor-Based Distributed Simulation
NASA Technical Reports Server (NTRS)
Schoeffler, James D.
1998-01-01
Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.
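As an illustrative sketch only (not the paper's code), the actor pattern the abstract contrasts with passive objects can be expressed as active objects that each own a mailbox and a thread of control, communicating solely by asynchronous messages; the `Accumulator` example and its message protocol are hypothetical:

```python
# Sketch of "actors": active objects with their own thread, talking via messages.
import queue
import threading

class Actor:
    """An active object: owns a mailbox and a thread that drains it."""
    def __init__(self):
        self.mailbox = queue.Queue()
        self.thread = threading.Thread(target=self._run, daemon=True)
        self.thread.start()

    def send(self, msg):
        self.mailbox.put(msg)          # asynchronous message, not a method call

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:            # sentinel: stop the actor
                break
            self.receive(msg)

    def receive(self, msg):
        raise NotImplementedError

    def stop(self):
        self.mailbox.put(None)
        self.thread.join()

class Accumulator(Actor):
    """Hypothetical example actor that sums the numbers it is sent."""
    def __init__(self):
        self.total = 0
        super().__init__()

    def receive(self, msg):
        self.total += msg

acc = Accumulator()
for n in (1, 2, 3):
    acc.send(n)
acc.stop()                             # join, so total is final
print(acc.total)                       # 6
```

Because each actor processes its mailbox on its own thread, several actors execute concurrently, which is what makes distribution across address spaces or machines natural.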
Development of a multilayer interference simulation program for MSS systems
NASA Technical Reports Server (NTRS)
Izadian, Jamal S.
1993-01-01
This paper discusses the development of a multilayer interference analysis and simulation program which is used to evaluate interference between non-geostationary and geostationary satellites. In addition to evaluating interference, this program can be used in the development of sharing criteria and coordination among various Mobile Satellite Services (MSS) systems. A C++/Windows implementation of this program, called Globalstar Interference Simulation Program (GISP), has been developed.
Flight dynamics analysis and simulation of heavy lift airships. Volume 5: Programmer's manual
NASA Technical Reports Server (NTRS)
Ringland, R. F.; Tischler, M. B.; Jex, H. R.; Emmen, R. D.; Ashkenas, I. L.
1982-01-01
The Programmer's Manual contains explanations of the logic embodied in the various program modules, a dictionary of program variables, a subroutine listing, subroutine/common block/cross reference listing, and a calling/called subroutine cross reference listing.
Radar target classification studies: Software development and documentation
NASA Astrophysics Data System (ADS)
Kamis, A.; Garber, F.; Walton, E.
1985-09-01
Three computer programs were developed to process and analyze calibrated radar returns. The first program, called DATABASE, was developed to create and manage a random-access data base. The second program, called FTRAN DB, was developed to process horizontally and vertically polarized radar returns into different formats (i.e., time domain, circular polarizations, and polarization parameters). The third program, called RSSE, was developed to simulate a variety of radar systems and to evaluate their ability to identify radar returns. Complete computer listings are included in the appendix volumes.
Introducing Computer Simulation into the High School: An Applied Mathematics Curriculum.
ERIC Educational Resources Information Center
Roberts, Nancy
1981-01-01
A programming language called DYNAMO, developed especially for writing simulation models, is promoted. Details of six self-teaching curriculum packages recently developed for simulation-oriented instruction are provided. (MP)
Kelly, Michelle M; Blunt, Elizabeth; Nestor, Kelly
2017-12-01
Few nurse practitioner (NP) programs include an after-hours/on-call component in their clinical preparation of NP students. This role is expected in many primary and specialty care practices, and is one that students feel unprepared to competently navigate. Utilizing simulated callers as patients or parents, NP students participated in a simulated after-hours/on-call experience that included receiving the call, managing the patient, and submitting documentation of the encounter. Students completed pre- and postparticipation evaluations, and were evaluated by the simulated patient callers and faculty using standardized evaluation tools. NP students rated the experience as educationally valuable despite feeling anxious and nervous about it. Several essential skills were identified, including critical thinking, clear communication, self-confidence, and access to resources. After participation, NP students were more receptive to an NP position with an on-call component. Inclusion of a simulated on-call experience is a feasible component of NP education and should be added to the NP curriculum. ©2017 American Association of Nurse Practitioners.
ERIC Educational Resources Information Center
Shultz, Gary
This chapter describes the development of a set of programs called "History Comes Alive," a series of historical simulations and interactive experiences for students at heritage sites in Ontario. The programs allow students from Ontario and New York to relive the past by spending 3 days and 2 nights in a simulated historical setting. In…
Molecular Dynamics Simulations of Chemical Reactions for Use in Education
ERIC Educational Resources Information Center
Xie, Qian; Tinker, Robert
2006-01-01
One of the simulation engines of an open-source program called the Molecular Workbench, which can simulate thermodynamics of chemical reactions, is described. This type of real-time, interactive simulation and visualization of chemical reactions at the atomic scale could help students understand the connections between chemical reaction equations…
NASA Technical Reports Server (NTRS)
Fortenbaugh, R. L.
1980-01-01
Instructions are given for using the Vertical Attitude Takeoff and Landing Aircraft Simulation (VATLAS), a digital simulation program for vertical attitude takeoff and landing (VATOL) aircraft developed for installation on the NASA Ames CDC 7600 computer system. The framework for VATLAS is the Off-Line Simulation (OLSIM) routine. The OLSIM routine provides a flexible framework and standardized modules which facilitate the development of off-line aircraft simulations. OLSIM runs under the control of VTOLTH, the main program, which calls the proper modules for executing user-specified options. These options include trim, stability derivative calculation, time history generation, and various input-output options.
Analyzing Spacecraft Telecommunication Systems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric
2004-01-01
Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
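The link analyses a tool like MMTAT performs rest on standard link-budget arithmetic. The sketch below is not MMTAT's API; the function names and the example parameter values are illustrative assumptions showing the kind of computation such a tool automates:

```python
# Hedged sketch of link-budget arithmetic: received power = EIRP + receive
# gain - free-space path loss - system losses; margin is the excess over the
# power required to close the link. All numbers below are hypothetical.
import math

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

def link_margin_db(eirp_dbw, rx_gain_db, distance_m, freq_hz,
                   system_losses_db, required_power_dbw):
    """Received power minus the power required to close the link."""
    received = (eirp_dbw + rx_gain_db
                - free_space_path_loss_db(distance_m, freq_hz)
                - system_losses_db)
    return received - required_power_dbw

# Example: an X-band downlink from 2e9 m, with made-up link parameters.
margin = link_margin_db(eirp_dbw=60.0, rx_gain_db=74.0,
                        distance_m=2.0e9, freq_hz=8.4e9,
                        system_losses_db=3.0, required_power_dbw=-155.0)
print(f"{margin:.1f} dB")
```

Sweeping any one input of such a function over a range is what produces the performance-versus-parameter graphs the abstract describes.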
Heat simulation via Scilab programming
NASA Astrophysics Data System (ADS)
Hasan, Mohammad Khatim; Sulaiman, Jumat; Karim, Samsul Arifin Abdul
2014-07-01
This paper discusses the use of an open-source software package called Scilab to develop a heat simulator. The heat equation was used to simulate heat behavior in an object. The simulator was developed using the finite difference method. Numerical experiment outputs show that Scilab can produce a good simulation of heat behavior, with excellent visual output, using only simple computer code.
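The paper's simulator was written in Scilab; as a hedged sketch of the same approach, the 1-D heat equation u_t = alpha * u_xx can be marched with an explicit finite-difference (FTCS) scheme (grid sizes and coefficients below are illustrative, not the paper's):

```python
# Explicit finite-difference solver for 1-D heat flow on a rod with
# fixed-temperature ends. Stable when r = alpha*dt/dx**2 <= 0.5.
def heat_1d(u0, alpha, dx, dt, steps):
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable for this dt/dx"
    u = list(u0)
    for _ in range(steps):
        nxt = u[:]                       # boundary values stay fixed
        for i in range(1, len(u) - 1):
            nxt[i] = u[i] + r * (u[i-1] - 2*u[i] + u[i+1])
        u = nxt
    return u

# A hot spot in the middle of a cold rod diffuses outward over time.
u0 = [0.0]*5 + [100.0] + [0.0]*5
u = heat_1d(u0, alpha=1.0, dx=1.0, dt=0.4, steps=50)
print([round(x, 1) for x in u])
```

Plotting `u` after successive step counts reproduces the kind of visual heat-behavior output the abstract refers to.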
NASA Technical Reports Server (NTRS)
1995-01-01
As a Jet Propulsion Laboratory astronomer, John D. Callahan developed a computer program called Multimission Interactive Planner (MIP) to help astronomers analyze scientific and optical data collected on the Voyager's Grand Tour. The commercial version of the program called XonVu is published by XonTech, Inc. Callahan has since developed two more advanced programs based on MIP technology, Grand Tour and Jovian Traveler, which simulate Voyager and Giotto missions. The software allows astronomers and space novices to view the objects seen by the spacecraft, manipulating perspective, distance and field of vision.
SIMULATION GAMING FOR MANAGEMENT DEVELOPMENT.
ERIC Educational Resources Information Center
MCKENNEY, JAMES L.
The present Harvard Business School management simulation game was developed as a teaching device for classes of 20 or more students grouped into four- and five-man teams called "firms." Each firm competes with others in an "industry," an economic abstraction of a consumer goods market programmed to be simulated on an electronic…
ERIC Educational Resources Information Center
Haji, Faizal A.; Hoppe, Daniel J.; Morin, Marie-Paule; Giannoulakis, Konstantine; Koh, Jansen; Rojas, David; Cheung, Jeffrey J. H.
2014-01-01
Rapid technological advances and concern for patient safety have increased the focus on simulation as a pedagogical tool for educating health care providers. To date, simulation research scholarship has focused on two areas: evaluating instructional designs of simulation programs, and the integration of simulation into a broader educational…
The Creation of a CPU Timer for High Fidelity Programs
NASA Technical Reports Server (NTRS)
Dick, Aidan A.
2011-01-01
Using the C and C++ programming languages, a tool was developed that measures the efficiency of a program by recording the amount of CPU time that various functions consume. By inserting the tool between lines of code in the program, one can receive a detailed report of the absolute and relative time consumption associated with each section. After adapting the generic tool for a high-fidelity launch vehicle simulation program called MAVERIC, the components of a frequently used function called "derivatives()" were measured. Of the 34 sub-functions in "derivatives()", the top 8 accounted for 83.1% of the total time spent. To decrease the overall run time of MAVERIC, a change was implemented in the sub-function "Event_Controller()". Reformatting "Event_Controller()" led to a 36.9% decrease in the total CPU time spent by that sub-function and a 3.2% decrease in the total CPU time spent by the overarching function "derivatives()".
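The original tool was written in C/C++; the following is a minimal Python sketch of the same idea, accumulating per-function CPU time and reporting absolute and relative totals (the instrumented functions here are hypothetical stand-ins, not MAVERIC code):

```python
# Accumulate CPU time per function and report absolute and relative shares.
import time
from collections import defaultdict

cpu_totals = defaultdict(float)

def timed(fn):
    """Decorator that records CPU time spent in each call to fn."""
    def wrapper(*args, **kwargs):
        start = time.process_time()
        try:
            return fn(*args, **kwargs)
        finally:
            cpu_totals[fn.__name__] += time.process_time() - start
    return wrapper

@timed
def busy(n):
    return sum(i * i for i in range(n))   # stand-in for a heavy sub-function

@timed
def idle():
    return 0                              # stand-in for a cheap sub-function

busy(200_000)
idle()

total = sum(cpu_totals.values()) or 1.0
for name, t in sorted(cpu_totals.items(), key=lambda kv: -kv[1]):
    print(f"{name:8s} {t:.4f}s  {100 * t / total:.1f}%")
```

The percentage column is what identifies hot spots such as the top 8 sub-functions the abstract mentions.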
Generalized dynamic engine simulation techniques for the digital computer
NASA Technical Reports Server (NTRS)
Sellers, J.; Teren, F.
1974-01-01
Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program, DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design-point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar all-digital programs on future engine simulation philosophy is also discussed.
Generalized dynamic engine simulation techniques for the digital computers
NASA Technical Reports Server (NTRS)
Sellers, J.; Teren, F.
1975-01-01
Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program, DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar digital programs on future engine simulation philosophy is also discussed.
(abstract) Generic Modeling of a Life Support System for Process Technology Comparisons
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.
An Agent-Based Cockpit Task Management System
NASA Technical Reports Server (NTRS)
Funk, Ken
1997-01-01
An agent-based program to facilitate Cockpit Task Management (CTM) in commercial transport aircraft is developed and evaluated. The agent-based program called the AgendaManager (AMgr) is described and evaluated in a part-task simulator study using airline pilots.
Kalet, Adina; Zabar, Sondra; Szyld, Demian; Yavner, Steven D; Song, Hyuksoon; Nick, Michael W; Ng, Grace; Pusic, Martin V; Denicola, Christine; Blum, Cary; Eliasz, Kinga L; Nicholson, Joey; Riles, Thomas S
2017-01-01
Transitioning medical students are anxious about their readiness-for-internship, as are their residency program directors and teaching hospital leadership responsible for care quality and patient safety. A readiness-for-internship assessment program could contribute to ensuring optimal quality and safety and be a key element in implementing competency-based, time-variable medical education. In this paper, we describe the development of the Night-onCall program (NOC), a 4-hour readiness-for-internship simulation event using multiple instructional methods. NOC was designed and implemented over the course of 3 years to provide an authentic "night on call" experience for near-graduating students and to build measurements of students' readiness for this transition, framed by the Association of American Medical Colleges' Core Entrustable Professional Activities for Entering Residency. The NOC is a product of a program of research focused on questions related to enabling individualized pathways through medical training. The lessons learned and modifications made to create a feasible, acceptable, flexible, and educationally rich NOC are shared to inform the discussion about transition-to-residency curricula and best practices regarding educational handoffs from undergraduate to graduate education.
DOT National Transportation Integrated Search
1982-07-01
In order to examine specific automated guideway transit (AGT) developments and concepts, UMTA undertook a program of studies and technology investigations called Automated Guideway Transit Technology (AGTT) Program. The objectives of one segment of t...
Computer simulator for a mobile telephone system
NASA Technical Reports Server (NTRS)
Schilling, D. L.
1981-01-01
A software simulator was developed to assist NASA in the design of the land mobile satellite service. Structured programming techniques were used: the algorithm was developed in an ALGOL-like pseudo-language and then encoded in FORTRAN IV. The basic input data to the system is a sine wave signal, although future plans call for actual sampled voice as the input signal. The simulator is capable of studying all the possible combinations of types and modes of calls through the use of five communication scenarios: single hop systems; double hop, single gateway systems; double hop, double gateway systems; mobile to wireline systems; and wireline to mobile systems. The transmitter, fading channel, and interference source simulation are also discussed.
ERIC Educational Resources Information Center
Zillesen, P. G. van Schaick; And Others
Instructional feedback given to the learners during computer simulation sessions may be greatly improved by integrating educational computer simulation programs with hypermedia-based computer-assisted learning (CAL) materials. A prototype of a learning environment of this type called BRINE PURIFICATION was developed for use in corporate training…
A microcomputer model for simulating pressurized flow in a storm sewer system : final report.
DOT National Transportation Integrated Search
1989-01-01
A review was made of several computer programs capable of simulating sewer flows under surcharge or pressurized flow conditions. A modified version of the EXTRAN module of the SWMM model, called PFSM, was developed and attached to the FHYA Pooled Fun...
NASA Astrophysics Data System (ADS)
Szidarovszky, Tamás; Jono, Maho; Yamanouchi, Kaoru
2018-07-01
A user-friendly, cross-platform software package called the Laser-Induced Molecular Alignment and Orientation simulator (LIMAO) has been developed. The program can be used to simulate, within the rigid-rotor approximation, the rotational dynamics of gas-phase molecules induced by linearly polarized intense laser fields at a given temperature. The software is implemented in the Java and Mathematica programming languages. The primary aim of LIMAO is to aid experimental scientists in predicting and analyzing experimental data representing laser-induced spatial alignment and orientation of molecules.
Haji, Faizal A; Hoppe, Daniel J; Morin, Marie-Paule; Giannoulakis, Konstantine; Koh, Jansen; Rojas, David; Cheung, Jeffrey J H
2014-05-01
Rapid technological advances and concern for patient safety have increased the focus on simulation as a pedagogical tool for educating health care providers. To date, simulation research scholarship has focused on two areas: evaluating instructional designs of simulation programs, and the integration of simulation into a broader educational context. However, these two categories of research currently exist under a single label: Simulation-Based Medical Education. In this paper we argue that introducing a more refined nomenclature within which to frame simulation research is necessary, both for researchers, to appropriately design research studies and describe their findings, and for end-point users (such as program directors and educators), to more appropriately understand and utilize this evidence.
NASA Technical Reports Server (NTRS)
Sandy, Michael
2015-01-01
The Regolith Advanced Surface Systems Operations Robot (RASSOR) Phase 2 is an excavation robot for mining regolith on a planet like Mars. The robot is programmed using the Robot Operating System (ROS), and it also uses a physical simulation program called Gazebo. This internship focused on various functions of the program in order to make the robot more professional and efficient. During the internship, another project, the Smart Autonomous Sand-Swimming Excavator, was also worked on. This is a robot designed to dig through sand and extract sample material. The intern worked on programming the Sand-Swimming robot and on designing the electrical system to power and control it.
Social Emotional Optimization Algorithm for Nonlinear Constrained Optimization Problems
NASA Astrophysics Data System (ADS)
Xu, Yuechun; Cui, Zhihua; Zeng, Jianchao
The nonlinear programming problem is an important branch of operations research and has been successfully applied to various real-life problems. In this paper, a new approach called the Social Emotional Optimization Algorithm (SEOA) is used to solve this problem; it is a new swarm intelligence technique that simulates human behavior guided by emotion. Simulation results show that the social emotional optimization algorithm proposed in this paper is effective and efficient for nonlinear constrained programming problems.
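The abstract does not give SEOA's update rules, so the sketch below is only a generic swarm-style stand-in for the same task: minimizing a nonlinear objective under a constraint by penalizing violations and nudging candidates toward the best solution found so far. Every name and constant here is an illustrative assumption, not the authors' algorithm:

```python
# Generic penalty-based swarm search (NOT the authors' SEOA update rules).
import random

def solve(objective, constraint, dim, iters=300, pop=30, seed=1):
    """Minimize objective(x); constraint(x) <= 0 means feasible."""
    rng = random.Random(seed)
    def fitness(x):
        return objective(x) + 1e3 * max(0.0, constraint(x))**2
    swarm = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    best = min(swarm, key=fitness)
    for _ in range(iters):
        for k, x in enumerate(swarm):
            # move toward the best-so-far with a small random perturbation
            cand = [xi + 0.5*(bi - xi) + rng.gauss(0, 0.1)
                    for xi, bi in zip(x, best)]
            if fitness(cand) < fitness(x):
                swarm[k] = cand
                if fitness(cand) < fitness(best):
                    best = cand
    return best

# Example: minimize x^2 + y^2 subject to x + y >= 1 (optimum near x = y = 0.5).
best = solve(lambda v: v[0]**2 + v[1]**2,
             lambda v: 1.0 - (v[0] + v[1]),   # <= 0 when x + y >= 1
             dim=2)
print([round(v, 2) for v in best])
```

The quadratic penalty converts the constrained problem into an unconstrained one, which is a common way swarm methods handle nonlinear constraints.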
ISS Robotic Student Programming
NASA Technical Reports Server (NTRS)
Barlow, J.; Benavides, J.; Hanson, R.; Cortez, J.; Le Vasseur, D.; Soloway, D.; Oyadomari, K.
2016-01-01
The SPHERES facility is a set of three free-flying satellites launched in 2006. In addition to scientists and engineers, middle- and high-school students program the SPHERES during the annual Zero Robotics programming competition. Zero Robotics conducts virtual competitions via simulator and on SPHERES aboard the ISS, with students doing the programming. A web interface allows teams to submit code, receive results, collaborate, and compete in simulator-based initial rounds and semi-final rounds. The final round of each competition is conducted with SPHERES aboard the ISS. At the end of 2017, a new robotic platform called Astrobee will launch, providing new game elements and new ground support for even more student interaction.
Application of a neural network to simulate analysis in an optimization process
NASA Technical Reports Server (NTRS)
Rogers, James L.; Lamarsh, William J., II
1992-01-01
A new experimental software package called NETS/PROSSS aimed at reducing the computing time required to solve a complex design problem is described. The software combines a neural network for simulating the analysis program with an optimization program. The neural network is applied to approximate results of a finite element analysis program to quickly obtain a near-optimal solution. Results of the NETS/PROSSS optimization process can also be used as an initial design in a normal optimization process and make it possible to converge to an optimum solution with significantly fewer iterations.
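The surrogate idea behind NETS/PROSSS can be sketched as: sample the expensive analysis at a few design points, fit a cheap approximation, and optimize over that instead. (NETS/PROSSS fit a neural network to finite element results; a simple Lagrange interpolant stands in for it here, and all function names and values are illustrative assumptions.)

```python
# Surrogate-assisted optimization sketch: few expensive calls, many cheap ones.
def expensive_analysis(x):
    """Stand-in for a costly finite element run (assumed shape)."""
    return (x - 2.0)**2 + 3.0

# 1. Evaluate the expensive analysis at a handful of design points.
samples = [(x, expensive_analysis(x)) for x in (0.0, 1.5, 4.0)]

# 2. Build a cheap surrogate: Lagrange interpolation through the samples
#    (the original work trained a neural network for this role).
def surrogate(x):
    total = 0.0
    for i, (xi, yi) in enumerate(samples):
        w = 1.0
        for j, (xj, _) in enumerate(samples):
            if i != j:
                w *= (x - xj) / (xi - xj)
        total += w * yi
    return total

# 3. Optimize over the surrogate: thousands of cheap calls, zero expensive ones.
grid = [i / 100.0 for i in range(0, 401)]
x_star = min(grid, key=surrogate)

# 4. The surrogate optimum becomes the starting design for a final check,
#    mirroring how a near-optimal surrogate solution can seed a normal
#    optimization run and cut its iteration count.
print(x_star, expensive_analysis(x_star))
```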
EDIN0613P weight estimating program. [for launch vehicles
NASA Technical Reports Server (NTRS)
Hirsch, G. N.
1976-01-01
The weight estimating relationships and program developed for space power system simulation are described. The program was developed to size a two-stage launch vehicle for the space power system. The program is actually part of an overall simulation technique called the EDIN (Engineering Design and Integration) system. The program sizes the overall vehicle, generates major component weights, and derives a large amount of overall vehicle geometry. The program is written in FORTRAN V and is designed for use on the Univac Exec 8 (1110). Used with an awareness of the limits that generalized input imposes on output depth and accuracy, this flexible program can be a useful estimating tool at the conceptual design stage of a launch vehicle.
Generic Modeling of a Life Support System for Process Technology Comparison
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support system and process technology options for a Lunar Base with a crew size of 4 and mission lengths of 90 and 600 days. System configurations to minimize the life support system weight and power are explored.
The TeraShake Computational Platform for Large-Scale Earthquake Simulations
NASA Astrophysics Data System (ADS)
Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas
Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes at ever-larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
“Kicking the Tires” of the energy balance routine within the CROPGRO crop growth models of DSSAT
USDA-ARS?s Scientific Manuscript database
Two decades ago, a routine called ETPHOT was written to compute evaporation, transpiration, and photosynthesis in the CROPGRO crop simulation programs for grain legumes such as soybean. These programs are part of DSSAT (the Decision Support System for Agrotechnology Transfer), which has been widely us...
High Powered Rocketry: Design, Construction, and Launching Experience and Analysis
ERIC Educational Resources Information Center
Paulson, Pryce; Curtis, Jarret; Bartel, Evan; Cyr, Waycen Owens; Lamsal, Chiranjivi
2018-01-01
In this study, the nuts and bolts of designing and building a high powered rocket have been presented. A computer simulation program called RockSim was used to design the rocket. Simulation results are consistent with time variations of altitude, velocity, and acceleration obtained in the actual flight. The actual drag coefficient was determined…
Computer programs for generation and evaluation of near-optimum vertical flight profiles
NASA Technical Reports Server (NTRS)
Sorensen, J. A.; Waters, M. H.; Patmore, L. C.
1983-01-01
Two extensive computer programs were developed. The first, called OPTIM, generates a reference near-optimum vertical profile, and it contains control options so that the effects of various flight constraints on cost performance can be examined. The second, called TRAGEN, is used to simulate an aircraft flying along an optimum or any other vertical reference profile. TRAGEN is used to verify OPTIM's output, examine the effects of uncertainty in the values of parameters (such as prevailing wind) which govern the optimum profile, or compare the cost performance of profiles generated by different techniques. A general description of these programs, the efforts to add special features to them, and sample results of their usage are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilke, Jeremiah J; Kenny, Joseph P.
2015-02-01
Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, a design can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e., to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics, such as call graphs, to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.
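As a minimal sketch (not SST code), the discrete event core such simulators are built on is a time-ordered event heap whose handlers may schedule further events; the "hop" workload below is a made-up example:

```python
# Minimal discrete-event simulation core: pop the earliest event, advance the
# clock to its timestamp, and run its handler (which may schedule more events).
import heapq

class Simulator:
    def __init__(self):
        self.now = 0.0
        self._events = []            # heap of (time, seq, handler)
        self._seq = 0                # tie-breaker keeps same-time ordering stable

    def schedule(self, delay, handler):
        heapq.heappush(self._events, (self.now + delay, self._seq, handler))
        self._seq += 1

    def run(self):
        while self._events:
            self.now, _, handler = heapq.heappop(self._events)
            handler(self)

# Hypothetical workload: a message hops across three links, 1.5 time units each.
log = []
def hop(remaining):
    def handler(sim):
        log.append(sim.now)
        if remaining > 1:
            sim.schedule(1.5, hop(remaining - 1))
    return handler

sim = Simulator()
sim.schedule(1.5, hop(3))
sim.run()
print(log)                            # arrival time of each hop
```

Simulated time jumps directly from event to event, which is why such a core can model far larger scales than wall-clock execution would allow.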
The development of an interim generalized gate logic software simulator
NASA Technical Reports Server (NTRS)
Mcgough, J. G.; Nemeroff, S.
1985-01-01
A proof-of-concept computer program called IGGLOSS (Interim Generalized Gate Logic Software Simulator) was developed and is discussed. The simulator engine was designed to perform stochastic estimation of the self-test coverage (fault-detection latency times) of digital computers or systems. A major attribute of IGGLOSS is its high-speed simulation: 9.5 million gates/CPU-second for nonfaulted circuits and 4.4 million gates/CPU-second for faulted circuits on a VAX 11/780 host computer.
Interactive grid generation for turbomachinery flow field simulations
NASA Technical Reports Server (NTRS)
Choo, Yung K.; Eiseman, Peter R.; Reno, Charles
1988-01-01
The control point form of algebraic grid generation presented here provides the means needed to generate well-structured grids for turbomachinery flow simulations. It uses a sparse collection of control points distributed over the flow domain. The shape and position of coordinate curves can be adjusted from these control points while the grid conforms precisely to all boundaries. An interactive program called TURBO, which uses the control point form, is being developed. Basic features of the code are discussed and sample grids are presented. A finite volume LU implicit scheme is used to simulate flow in a turbine cascade on the grid generated by the program.
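A hedged illustration of algebraic grid generation in the same spirit: the sketch below uses plain linear transfinite interpolation between two boundary curves rather than TURBO's control-point formulation, so the grid conforms to the boundaries but lacks the interior control-point adjustment the abstract describes. All names and the sample boundary curves are invented:

```python
def tfi_grid(bottom, top, ni, nj):
    """Linear transfinite interpolation between two boundary curves.
    bottom/top: functions mapping s in [0, 1] to an (x, y) point."""
    grid = []
    for j in range(nj):
        t = j / (nj - 1)
        row = []
        for i in range(ni):
            s = i / (ni - 1)
            xb, yb = bottom(s)
            xt, yt = top(s)
            # Blend the two boundary curves linearly in t.
            row.append(((1 - t) * xb + t * xt, (1 - t) * yb + t * yt))
        grid.append(row)
    return grid

bottom = lambda s: (s, 0.1 * s * (1 - s))   # gently curved lower boundary
top = lambda s: (s, 1.0)                    # flat upper boundary
grid = tfi_grid(bottom, top, ni=5, nj=3)
print(grid[0][0], grid[-1][-1])   # (0.0, 0.0) (1.0, 1.0)
```

Because grid points on j = 0 and j = nj-1 evaluate the boundary functions exactly, the grid conforms precisely to both boundaries; a control-point method adds interior handles to reshape the coordinate curves between them.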
NASA Technical Reports Server (NTRS)
Yanosy, James L.
1988-01-01
This manual describes how to use the Emulation Simulation Computer Model (ESCM). Based on G189A, ESCM computes the transient performance of a Space Station atmospheric revitalization subsystem (ARS) with CO2 removal provided by a solid amine water desorbed subsystem called SAWD. Many performance parameters are computed, some of which are cabin CO2 partial pressure, relative humidity, temperature, O2 partial pressure, and dew point. The program allows the user to simulate various possible combinations of man loading, metabolic profiles, cabin volumes, and certain hypothesized failures that could occur.
NASA Technical Reports Server (NTRS)
Bergeron, H. P.; Haynie, A. T.; Mcdede, J. B.
1980-01-01
A general aviation single-pilot instrument flight rules simulation capability was developed. Problems experienced by single pilots flying in IFR conditions were investigated. The simulation required a three-dimensional spatial navaid environment of a flight navigational area. A computer simulation of all the navigational aids plus 12 selected airports located in the Washington/Norfolk area was developed. All programmed locations in the list were referenced to a Cartesian coordinate system with the origin located at a specified airport's reference point. All navigational aids with their associated frequencies, call letters, locations, and orientations plus runways and true headings are included in the data base. The simulation included a TV displayed out-the-window visual scene of country and suburban terrain and a scaled model runway complex. Any of the programmed runways, with all its associated navaids, can be referenced to a runway on the airport in this visual scene. This allows a simulation of a full mission scenario including breakout and landing.
Haji, Faizal A; Da Silva, Celina; Daigle, Delton T; Dubrowski, Adam
2014-08-01
Presently, health care simulation research is largely conducted on a study-by-study basis. Although such "project-based" research generates a plethora of evidence, it can be chaotic and contradictory. A move toward sustained, thematic, theory-based programs of research is necessary to advance knowledge in the field. Recognizing that simulation is a complex intervention, we present a framework for developing research programs in simulation-based education adapted from the Medical Research Council (MRC) guidance. This framework calls for an iterative approach to developing, refining, evaluating, and implementing simulation interventions. The adapted framework guidance emphasizes: (1) identification of theory and existing evidence; (2) modeling and piloting interventions to clarify active ingredients and identify mechanisms linking the context, intervention, and outcomes; and (3) evaluation of intervention processes and outcomes in both the laboratory and real-world setting. The proposed framework will aid simulation researchers in developing more robust interventions that optimize simulation-based education and advance our understanding of simulation pedagogy.
Solar heating and cooling system design and development
NASA Technical Reports Server (NTRS)
1978-01-01
The progress of the program during the sixth program quarter is reported. The program calls for the development and delivery of eight prototype solar heating and cooling systems for installation and operational test. The William O'Brien single-family heating system was installed and is operational. The New Castle single-family heating residence is under construction. The Kansas University (KU) system is in the final design stages. The 25 ton cooling subsystem for KU is in the debugging stage. Pressure drops greater than anticipated were encountered. The 3 ton simulation work is being finalized, and the design parameters for the Rankine system were determined from simulation output.
ERIC Educational Resources Information Center
Dickinson, J. Barry; Dickinson, Carleen D.
2012-01-01
This study examines the impact that experienced mentoring has on business decisions in a higher education business school. Students, arranged in teams, were given the opportunity to operate virtual companies in a well-known, business simulation program called Capsim. They were required to make decisions concerning marketing, production, finance,…
NASA Technical Reports Server (NTRS)
Sang, Janche
2003-01-01
Within NASA's Aviation Safety Program, NASA GRC participates in the Modeling and Simulation Project called ASMM. NASA GRC's focus is to characterize propulsion system performance from a fleet management and maintenance perspective by modeling, and through simulation to predict the characteristics of two classes of commercial engines (CFM56 and GE90). In prior years, the High Performance Computing and Communication (HPCC) program funded NASA Glenn to develop a large-scale, detailed simulation for the analysis and design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). Three major aspects of this modeling (the integration of different engine components, the coupling of multiple disciplines, and engine component zooming at the appropriate level of fidelity) require relatively tight coupling of different analysis codes. Most of these codes in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase their reusability. Aviation Safety's modeling and simulation use in characterizing fleet management has similar needs. The modeling and simulation of these propulsion systems use existing Fortran and C codes that are instrumental in determining the performance of the fleet. The research centers on building a CORBA-based development environment for programmers to easily wrap and couple legacy Fortran codes. This environment consists of a C++ wrapper library to hide the details of CORBA and an efficient remote variable scheme to facilitate data exchange between the client and the server model. Additionally, a Web Service model should also be constructed for evaluation of this technology's use over the next two to three years.
NASA Technical Reports Server (NTRS)
Plankey, B.
1981-01-01
A computer program called ECPVER (Energy Consumption Program - Verification) was developed to simulate all energy loads for any number of buildings. The program computes simulated daily, monthly, and yearly energy consumption, which can be compared with actual meter readings for the same time period. Such comparison can lead to validation of the model under a variety of conditions, which allows it to be used to predict future energy savings due to energy conservation measures. Predicted energy savings can then be compared with actual savings to verify the effectiveness of those energy conservation changes. This verification procedure is planned to be an important advancement in the Deep Space Network Energy Project, which seeks to reduce energy cost and consumption at all DSN Deep Space Stations.
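The verification step described here (comparing simulated consumption against meter readings) might look like this in outline; the monthly figures and the 10% acceptance threshold are invented for illustration:

```python
# Hypothetical monthly energy figures, kWh
simulated = {"Jan": 1200.0, "Feb": 1100.0, "Mar": 950.0}
metered   = {"Jan": 1265.0, "Feb": 1080.0, "Mar": 990.0}

def percent_error(sim, actual):
    """Signed percent deviation of the simulation from the meter reading."""
    return 100.0 * (sim - actual) / actual

report = {m: round(percent_error(simulated[m], metered[m]), 1) for m in simulated}
print(report)

# A model might be accepted as "verified" if every month is within, say, 10%.
verified = all(abs(e) <= 10.0 for e in report.values())
print(verified)   # True
```

Once the model passes such a check against history, the same comparison run against post-retrofit meter data quantifies the actual savings from conservation measures.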
An Integrated Development Environment for Adiabatic Quantum Programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humble, Travis S; McCaskey, Alex; Bennink, Ryan S
2014-01-01
Adiabatic quantum computing is a promising route to the computational power afforded by quantum information processing. The recent availability of adiabatic hardware raises the question of how well quantum programs perform. Benchmarking behavior is challenging since the multiple steps to synthesize an adiabatic quantum program are highly tunable. We present an adiabatic quantum programming environment called JADE that provides control over all the steps taken during program development. JADE captures the workflow needed to rigorously benchmark performance while also allowing a variety of problem types, programming techniques, and processor configurations. We have also integrated JADE with a quantum simulation engine that enables program profiling using numerical calculation. The computational engine supports plug-ins for simulation methodologies tailored to various metrics and computing resources. We present the design, integration, and deployment of JADE and discuss its use for benchmarking adiabatic quantum programs.
GASP-PL/I Simulation of Integrated Avionic System Processor Architectures. M.S. Thesis
NASA Technical Reports Server (NTRS)
Brent, G. A.
1978-01-01
A development study sponsored by NASA was completed in July 1977 which proposed a complete integration of all aircraft instrumentation into a single modular system. Instead of using the current single-function aircraft instruments, computers compiled and displayed inflight information for the pilot. A processor architecture called the Team Architecture was proposed; this is a hardware/software approach to high-reliability computer systems. A follow-up study of the proposed Team Architecture is reported. GASP-PL/I simulation models are used to evaluate the operating characteristics of the Team Architecture. The problem, model development, simulation programs, and results are presented at length. Also included are program input formats, outputs, and listings.
Dynamics of flexible bodies in tree topology - A computer oriented approach
NASA Technical Reports Server (NTRS)
Singh, R. P.; Vandervoort, R. J.; Likins, P. W.
1984-01-01
An approach suited for automatic generation of the equations of motion for large mechanical systems (i.e., large space structures, mechanisms, robots, etc.) is presented. The system topology is restricted to a tree configuration. The tree is defined as an arbitrary set of rigid and flexible bodies connected by hinges characterizing relative translations and rotations of two adjoining bodies. The equations of motion are derived via Kane's method. The resulting equation set is of minimum dimension. The dynamical equations are embedded in a computer program called TREETOPS. Extensive control simulation capability is built into the TREETOPS program. The simulation is driven by an interactive set-up program, resulting in an easy-to-use analysis tool.
NASA Technical Reports Server (NTRS)
Benyo, Theresa L.
2002-01-01
Integration of a supersonic inlet simulation with a computer aided design (CAD) system is demonstrated. The integration is performed using the Project Integration Architecture (PIA). PIA provides a common environment for wrapping many types of applications. Accessing geometry data from CAD files is accomplished by incorporating appropriate function calls from the Computational Analysis Programming Interface (CAPRI). CAPRI is a CAD-vendor-neutral programming interface that aids in acquiring geometry data directly from CAD files. The benefits of wrapping a supersonic inlet simulation into PIA using CAPRI are: direct access to geometry data, accurate capture of geometry data, automatic conversion of data units, CAD-vendor-neutral operation, and on-line interactive history capture. This paper describes PIA and the CAPRI wrapper and details the supersonic inlet simulation demonstration.
Development of the functional simulator for the Galileo attitude and articulation control system
NASA Technical Reports Server (NTRS)
Namiri, M. K.
1983-01-01
A simulation program for verifying and checking the performance of the Galileo Spacecraft's Attitude and Articulation Control Subsystem's (AACS) flight software is discussed. The program, called Functional Simulator (FUNSIM), provides a simple method of interfacing user-supplied mathematical models, coded in FORTRAN, that describe spacecraft dynamics, sensors, and actuators with the AACS flight software, coded in HAL/S (High-level Advanced Language/Shuttle). It is thus able to simulate the AACS flight software accurately to the HAL/S statement level in the environment of a mainframe computer system. FUNSIM also has a command and data subsystem (CDS) simulator. It is noted that the input/output data and timing are simulated with the same precision as the flight microprocessor. FUNSIM uses a variable stepsize numerical integration algorithm, complete with individual error bound control on the state variables, to solve the equations of motion. The program has been designed to provide both line printer and matrix dot plotting of the variables requested in the run section and to provide error diagnostics.
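Variable-stepsize integration with error-bound control can be illustrated schematically. The sketch below uses step doubling with a forward-Euler base method, far cruder than what a flight-software simulator would use, but it shows the accept/reject-and-resize logic; all names and tolerances are invented:

```python
import math

def integrate_adaptive(f, y0, t0, t1, tol):
    """Variable-stepsize integration sketch: compare one full Euler step
    against two half steps, accept the step when the difference is inside
    the error bound, and grow or shrink the stepsize accordingly."""
    t, y, h = t0, y0, (t1 - t0) / 10.0
    while t < t1:
        h = min(h, t1 - t)
        full = y + h * f(t, y)                        # one Euler step
        half = y + h / 2 * f(t, y)
        two_half = half + h / 2 * f(t + h / 2, half)  # two half steps
        err = abs(two_half - full)                    # local error estimate
        if err <= tol:
            t, y = t + h, two_half
            h *= 1.5                                  # accepted: grow step
        else:
            h *= 0.5                                  # rejected: shrink step
    return y

# Solve y' = -y from y(0) = 1 to t = 2; exact answer is exp(-2).
y = integrate_adaptive(lambda t, y: -y, 1.0, 0.0, 2.0, 1e-4)
print(round(y, 3), round(math.exp(-2.0), 3))
```

A production integrator would use a higher-order embedded pair and, as the abstract notes, apply the error bound per state variable rather than to a single scalar.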
An Evaluation of Operation Moonvigil. Number 14.
ERIC Educational Resources Information Center
Virgin, Albert E.
An evaluation assessed the simulation game called "Operation Moonvigil". The program consisted of eight daily five minute telecasts followed by 30 minutes of classroom activities based on information communicated during the telecast. Teachers, pupils, and non-participant observers provided data through questionnaires, diaries, and…
Unique medical education programs at Nippon Medical School.
Shimura, Toshiro; Yoshimura, Akinobu; Saito, Takuya; Aso, Ryoko
2008-08-01
In an attempt to improve the content of the educational programs offered by Nippon Medical School and to better prepare our students to work in the rapidly changing world of medicine, the school has recently revamped its teaching methodology. Particular emphasis has been placed on 1) simulator-based education involving the evaluation of students and residents in a new clinical simulation laboratory; 2) improving communication skills with the extensive help of simulated patients; 3) improving medical English education; 4) providing early clinical exposure with a one-week clinical nursing program for first-year students to increase student motivation at an early stage in their studies; and 5) a new program called Novel Medical Science, which aims to introduce first-year students to the school's fundamental educational philosophy and thereby increase their motivation to become ideal physicians. The programs have been designed in line with 2006 guidelines issued by the Ministry of Education, Culture, Sports, Science and Technology to allow flexibility for students to take part in education outside their own departments and year groups as part of the Ministry's program to encourage distinctive education at Japanese universities.
Simulation of car collision with an impact block
NASA Astrophysics Data System (ADS)
Kostek, R.; Aleksandrowicz, P.
2017-10-01
This article presents the experimental results of a crash test of a Fiat Cinquecento performed by the Allgemeiner Deutscher Automobil-Club (ADAC) and the simulation results obtained with a program called V-SIM using default settings. At the next stage, a wheel was blocked and the parameters of contact between the vehicle and the barrier were changed to better match the experimental results. The following contact parameters were identified: stiffness in the compression phase, stiffness in the restitution phase, and the coefficients of restitution and friction. The changes lead to various post-impact positions, which shows the sensitivity of the results to the contact parameters. V-SIM is commonly used by expert witnesses, who tend to use default settings; therefore, the companies offering simulation programs should identify those parameters with due diligence.
NASA Technical Reports Server (NTRS)
Betts, W. S., Jr.
1972-01-01
A computer program called HOPI was developed to predict reorientation flow dynamics, wherein liquid moves from one end of a closed, partially filled, rigid container to the other under the influence of container acceleration. The program uses the simplified marker-and-cell numerical technique and, using explicit finite differencing, solves the Navier-Stokes equations for an incompressible viscous fluid. The effects of turbulence are also simulated in the program. HOPI can consider curved as well as straight-walled boundaries. Both free-surface and confined flows can be calculated. The program was used to simulate five liquid reorientation cases. Three of these cases simulated actual NASA LeRC drop tower test conditions, while two cases simulated full-scale Centaur tank conditions. It was concluded that while HOPI can be used to analytically determine the fluid motion in a typical settling problem, there is a current need to optimize it, both by reducing the computer usage time and by reducing the core storage required for a given problem size.
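A full marker-and-cell Navier-Stokes solver is far too long to sketch here, but the explicit finite-differencing style of time advancement such a code uses can be shown on a 1D diffusion problem. The grid size, viscosity, and timestep below are arbitrary; note the explicit-scheme stability limit dt <= dx^2 / (2 * nu):

```python
def diffuse_explicit(u, nu, dx, dt, steps):
    """Forward-Euler explicit finite-difference diffusion update, the same
    style of time advancement a MAC-type code uses for viscous terms.
    Stability requires dt <= dx**2 / (2 * nu)."""
    u = list(u)
    for _ in range(steps):
        un = list(u)                      # snapshot of the previous step
        for i in range(1, len(u) - 1):    # boundaries held fixed at zero
            u[i] = un[i] + nu * dt / dx**2 * (un[i + 1] - 2 * un[i] + un[i - 1])
    return u

u0 = [0.0] * 21
u0[10] = 1.0                              # initial spike at the midpoint
u = diffuse_explicit(u0, nu=0.1, dx=0.1, dt=0.04, steps=50)
print(round(sum(u), 6))
```

The spike spreads symmetrically and its total can only decrease through the fixed boundaries; violating the stability limit would make the solution blow up, which is one reason explicit codes like the one described here are costly to run.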
DOT National Transportation Integrated Search
2009-11-01
The Oregon Department of Transportation and Portland State University evaluated the seismic : vulnerability of state highway bridges in western Oregon. The study used a computer program : called REDARS2 that simulated the damage to bridges within a t...
Context Switching with Multiple Register Windows: A RISC Performance Study
NASA Technical Reports Server (NTRS)
Konsek, Marion B.; Reed, Daniel A.; Watcharawittayakul, Wittaya
1987-01-01
Although previous studies have shown that a large file of overlapping register windows can greatly reduce procedure call/return overhead, the effects of register windows in a multiprogramming environment are poorly understood. This paper investigates the performance of multiprogrammed, reduced instruction set computers (RISCs) as a function of window management strategy. Using an analytic model that reflects context switch and procedure call overheads, we analyze the performance of simple, linearly self-recursive programs. For more complex programs, we present the results of a simulation study. These studies show that a simple strategy that saves all windows prior to a context switch, but restores only a single window following a context switch, performs near optimally.
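The window-management comparison can be modeled very simply: walk a call/return/context-switch trace against a window file of fixed size and count save/restore memory operations. The model below is a deliberate simplification of the paper's analytic model, contrasting an eager save-everything switch with a lazy spill-on-demand one; the trace and window-file size are invented:

```python
def window_traffic(trace, nwindows, save_all=True):
    """Count window save/restore memory operations for a call/return/
    context-switch trace. 'save_all' stores every resident window at a
    switch; otherwise only one is handled and the rest spill lazily."""
    resident, ops = 1, 0
    for ev in trace:
        if ev == "call":
            if resident == nwindows:
                ops += 1              # overflow: spill oldest window
            else:
                resident += 1
        elif ev == "ret":
            if resident == 1:
                ops += 1              # underflow: restore caller's window
            else:
                resident -= 1
        elif ev == "switch":
            ops += resident if save_all else 1   # eager vs lazy switch cost
            resident = 1              # new context starts with one window
    return ops

trace = ["call"] * 6 + ["switch"] + ["ret"] * 6 + ["switch"]
print(window_traffic(trace, 8, save_all=True),
      window_traffic(trace, 8, save_all=False))
```

Running both strategies over the same trace makes the trade-off concrete: the eager strategy pays more at each switch, while the lazy one defers cost to later underflows, which is exactly the balance the paper's save-all/restore-one strategy tunes.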
Using Phun to Study ``Perpetual Motion'' Machines
NASA Astrophysics Data System (ADS)
Koreš, Jaroslav
2012-05-01
The concept of "perpetual motion" has a long history. The Indian astronomer and mathematician Bhaskara II (12th century) was the first person to describe a perpetual motion (PM) machine. An example of a 13th-century PM machine is shown in Fig. 1. Although the law of conservation of energy clearly implies the impossibility of PM construction, over the centuries numerous proposals for PM have been made, involving ever more elements of modern science in their construction. It is possible to test a variety of PM machines in the classroom using a program called Phun or its commercial version, Algodoo. The programs are designed to simulate physical processes and we can easily simulate mechanical machines using them. They provide an intuitive graphical environment controlled with a mouse; a programming language is not needed. This paper describes simulations of four different (supposed) PM machines.
Construction of a parallel processor for simulating manipulators and other mechanical systems
NASA Technical Reports Server (NTRS)
Hannauer, George
1991-01-01
This report summarizes the results of NASA Contract NAS5-30905, awarded under phase 2 of the SBIR Program, for a demonstration of the feasibility of a new high-speed parallel simulation processor, called the Real-Time Accelerator (RTA). The principal goals were met, and EAI is now proceeding with phase 3: development of a commercial product. This product is scheduled for commercial introduction in the second quarter of 1992.
Simulation of a master-slave event set processor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Comfort, J.C.
1984-03-01
Event set manipulation may consume a considerable amount of the computation time spent in performing a discrete-event simulation. One way of minimizing this time is to allow event set processing to proceed in parallel with the remainder of the simulation computation. The paper describes a multiprocessor simulation computer, in which all non-event-set processing is performed by the principal processor (called the host). Event set processing is coordinated by a front-end processor (the master) and actually performed by several other functionally identical processors (the slaves). A trace-driven simulation program modeling this system was constructed and run with trace output taken from two different simulation programs. Output from this simulation suggests that a significant reduction in run time may be realized by this approach. Sensitivity analysis was performed on the significant parameters of the system (number of slave processors, relative processor speeds, and interprocessor communication times). A comparison between actual and simulated run times for a one-processor system was used to assist in the validation of the simulation.
Cosmic dust analog simulation in a microgravity environment: The STARDUST program
NASA Technical Reports Server (NTRS)
Ferguson, F.; Lilleleht, L. U.; Nuth, J.; Stephens, J. R.; Bussoletti, E.; Carotenuto, L.; Colangeli, L.; Dell'aversana, P.; Mele, F.; Mennella, V.
1995-01-01
We have undertaken a project called STARDUST, a collaboration between Italian and American investigators. The goals of this program are to study the condensation and coagulation of refractory materials from the vapor and to study the properties of the resulting grains as analogs to cosmic dust particles. To reduce thermal convective currents and to develop valuable experience in designing an experiment for the Gas-Grain Simulation Facility aboard Space Station Freedom, we have built and flown a new chamber to study these processes under the periods of microgravity available on NASA's KC-135 Research Aircraft. Preliminary results from flights with magnesium and zinc are discussed.
Universal programming interface with concurrent access
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alferov, Oleg
2004-10-07
There exist a number of devices with a positioning nature of operation, such as mechanical linear stages, temperature controllers, or filter wheels with discrete states, and most of them have different programming interfaces. The Universal Positioner software offers a way to handle all of them with a single approach: a particular hardware driver is created from a template by translating the actual commands used by the hardware to and from the universal programming interface. The software contains the universal API module itself, a demo simulation of hardware, and front-end programs to help developers write their own software drivers, along with example drivers for actual hardware controllers. The software allows user application programs to call devices simultaneously without race conditions (multitasking and concurrent access). The template suggested in this package permits developers to integrate various devices easily into their applications using the same API. The drivers can be stacked; i.e., they can call each other via the same interface.
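The pattern described here, one uniform API with a per-device translation layer and race-free concurrent access, is a classic template-method-plus-lock design. The sketch below is illustrative only; the class and method names are not the actual Universal Positioner interface:

```python
import threading

class Positioner:
    """Template for a uniform positioning API (names are hypothetical).
    A lock serializes access so concurrent callers cannot race."""
    def __init__(self):
        self._lock = threading.Lock()
        self._pos = 0.0

    def move_to(self, pos):
        with self._lock:            # concurrent access without races
            self._pos = self._translate(pos)

    def position(self):
        with self._lock:
            return self._pos

    def _translate(self, pos):
        """Subclasses translate universal units into hardware commands."""
        return pos

class FilterWheel(Positioner):
    """A discrete-state device: positions snap to one of 6 slots."""
    def _translate(self, pos):
        return int(pos) % 6

wheel = FilterWheel()
threads = [threading.Thread(target=wheel.move_to, args=(i,)) for i in range(10)]
for t in threads: t.start()
for t in threads: t.join()
print(wheel.position() in range(6))   # True
```

Stacked drivers fall out of the same shape: a driver's `_translate` can itself call another `Positioner` through the identical interface.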
NASA Technical Reports Server (NTRS)
Baker, L. R.; Sulyma, P. R.; Tevepaugh, J. A.; Penny, M. M.
1976-01-01
Since exhaust plumes affect vehicle base environment (pressure and heat loads) and the orbiter vehicle aerodynamic control surface effectiveness, an intensive program involving detailed analytical and experimental investigations of the exhaust plume/vehicle interaction was undertaken as a pertinent part of the overall space shuttle development program. The program, called the Plume Technology program, has as its objective the determination of the criteria for simulating rocket engine (in particular, space shuttle propulsion system) plume-induced aerodynamic effects in a wind tunnel environment. The comprehensive experimental program was conducted using test facilities at NASA's Marshall Space Flight Center and Ames Research Center. A post-test examination of some of the experimental results obtained from NASA-MSFC's 14 x 14-inch trisonic wind tunnel is presented. A description is given of the test facility, simulant gas supply system, nozzle hardware, test procedure and test matrix. Analysis of exhaust plume flow fields and comparison of analytical and experimental exhaust plume data are presented.
Advanced Simulation and Computing: A Summary Report to the Director's Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, M G; Peck, T
2003-06-01
It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, was reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress on all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called "Advanced Simulation and Computing." Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management; such is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures to more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates. Each measure has an evaluation plan and identified documentation expected to be included in the "Assessment File".
Using Immersive Virtual Reality for Electrical Substation Training
ERIC Educational Resources Information Center
Tanaka, Eduardo H.; Paludo, Juliana A.; Cordeiro, Carlúcio S.; Domingues, Leonardo R.; Gadbem, Edgar V.; Euflausino, Adriana
2015-01-01
Usually, distribution electricians are called upon to solve technical problems found in electrical substations. In this project, we apply problem-based learning to a training program for electricians, with the help of a virtual reality environment that simulates a real substation. Using this virtual substation, users may safely practice maneuvers…
Enhancing Teacher Education with Simulations
ERIC Educational Resources Information Center
Kaufman, David; Ireland, Alice
2016-01-01
As calls for accountability in our schools increase, teaching quality faces scrutiny and, often, criticism. These realities challenge teacher education programs to find new ways to ensure that their graduates will be effective in highly demanding work settings. In this article the authors draw on literature and practice examples to highlight ways…
Students As Environmental Consultants Simulating Life Science Problems
ERIC Educational Resources Information Center
Roberts, Megan; Zydney, Janet Mannheimer
2004-01-01
This article describes a project in which eighth graders at East Side Middle School in New York City used an interactive multimedia program called "Pollution Solution" in a science unit on environmental pollution. Students assumed the role of environmental consultants working at fictional corporations which were being investigated for…
Optical 3-Way Handshake (O3WHS) Protocol Simulation in OMNeT++
Mishra, Vinod K
2017-06-01
The simulation uses a popular program called OMNeT++ for that purpose. It is an open-source discrete event simulator tool written in the C++ language.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1977-05-01
HELIAKI is a FORTRAN computer program which simulates the optical/thermal performance of a central receiver solar thermal power plant for the dynamic conversion of solar-generated heat to electricity. The solar power plant which this program simulates consists of a field of individual sun tracking mirror units, or heliostats, redirecting sunlight into a cavity, called the receiver, mounted atop a tower. The program calculates the power retained by that cavity receiver at any point in time or the energy into the receiver over a year's time using a Monte Carlo ray trace technique to solve the multiple integral equations. An artist's concept of this plant is shown.
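The Monte Carlo ray-trace integration mentioned here can be illustrated with a toy intercept calculation: perturb each reflected ray by a random pointing error and count the fraction that lands inside the receiver aperture. All parameters below are invented, and a real heliostat code additionally tracks sun position, mirror geometry, and shading:

```python
import math, random

def intercept_fraction(n_rays, beam_error_rad, aperture_rad, distance, rng):
    """Monte Carlo estimate of the fraction of reflected rays entering a
    receiver aperture of radius aperture_rad at the given distance."""
    hits = 0
    for _ in range(n_rays):
        # Gaussian pointing error in two transverse directions
        ex = rng.gauss(0.0, beam_error_rad)
        ey = rng.gauss(0.0, beam_error_rad)
        miss = distance * math.hypot(ex, ey)   # landing offset at receiver
        if miss <= aperture_rad:
            hits += 1
    return hits / n_rays

rng = random.Random(1)
perfect = intercept_fraction(2000, 0.0, 1.0, 500.0, rng)       # no error
realistic = intercept_fraction(2000, 0.002, 1.0, 500.0, rng)   # 2 mrad error
print(perfect, round(realistic, 2))
```

With zero pointing error every ray enters the aperture; with a 2 mrad error at 500 m the transverse offset per axis is about 1 m, so only roughly 40% of rays are intercepted, the kind of loss term the yearly energy integral has to capture.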
Petascale Simulation Initiative Tech Base: FY2007 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
May, J; Chen, R; Jefferson, D
The Petascale Simulation Initiative began as an LDRD project in the middle of Fiscal Year 2004. The goal of the project was to develop techniques to allow large-scale scientific simulation applications to better exploit the massive parallelism that will come with computers running at petaflops per second. One of the major products of this work was the design and prototype implementation of a programming model and a runtime system that lets applications extend data-parallel applications to use task parallelism. By adopting task parallelism, applications can use processing resources more flexibly, exploit multiple forms of parallelism, and support more sophisticated multiscale and multiphysics models. Our programming model was originally called the Symponents Architecture but is now known as Cooperative Parallelism, and the runtime software that supports it is called Coop. (However, we sometimes refer to the programming model as Coop for brevity.) We have documented the programming model and runtime system in a submitted conference paper [1]. This report focuses on the specific accomplishments of the Cooperative Parallelism project (as we now call it) under Tech Base funding in FY2007. Development and implementation of the model under LDRD funding alone proceeded to the point of demonstrating a large-scale materials modeling application using Coop on more than 1300 processors by the end of FY2006. Beginning in FY2007, the project received funding from both LDRD and the Computation Directorate Tech Base program. Later in the year, after the three-year term of the LDRD funding ended, the ASC program supported the project with additional funds. The goal of the Tech Base effort was to bring Coop from a prototype to a production-ready system that a variety of LLNL users could work with.
Specifically, the major tasks that we planned for the project were: (1) Port SARS [former name of the Coop runtime system] to another LLNL platform, probably Thunder or Peloton (depending on when Peloton becomes available); (2) Improve SARS's robustness and ease-of-use, and develop user documentation; and (3) Work with LLNL code teams to help them determine how Symponents could benefit their applications. The original funding request was $296,000 for the year, and we eventually received $252,000. The remainder of this report describes our efforts and accomplishments for each of the goals listed above.
Simulating synchrotron radiation in accelerators including diffuse and specular reflections
Dugan, G.; Sagan, D.
2017-02-24
An accurate calculation of the synchrotron radiation flux within the vacuum chamber of an accelerator is needed for a number of applications. These include simulations of electron cloud effects and the design of radiation masking systems. To properly simulate the synchrotron radiation, it is important to include the scattering of the radiation at the vacuum chamber walls. To this end, a program called synrad3d has been developed which simulates the production and propagation of synchrotron radiation using a collection of photons. Photons generated by a charged particle beam are tracked from birth until they strike the vacuum chamber wall, where the photon is either absorbed or scattered. Both specular and diffuse scattering are simulated. If a photon is scattered, it is further tracked through multiple encounters with the wall until it is finally absorbed. This paper describes the synrad3d program, with a focus on the details of its scattering model, and presents some examples of the program's use.
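The tracking loop this abstract describes (propagate to the wall, then absorb, reflect specularly, or reflect diffusely) can be sketched in a toy 2D form. The circular chamber, the absorption probability, and the specular fraction below are illustrative assumptions, not synrad3d's actual geometry or scattering model:

```python
import math
import random

def trace_photon(absorb_prob=0.3, specular_frac=0.5, max_bounces=1000, rng=None):
    """Toy 2D photon tracker: a photon bounces inside a circular chamber of
    unit radius until absorbed. Returns the number of wall encounters."""
    rng = rng or random.Random()
    x, y = 0.0, 0.0                            # start at the chamber center
    theta = rng.uniform(0.0, 2.0 * math.pi)    # random initial direction
    for bounce in range(1, max_bounces + 1):
        # Propagate to the wall: solve |(x,y) + t*(cos,sin)| = 1 for t > 0.
        dx, dy = math.cos(theta), math.sin(theta)
        b = x * dx + y * dy
        c = x * x + y * y - 1.0
        t = -b + math.sqrt(b * b - c)
        x, y = x + t * dx, y + t * dy
        if rng.random() < absorb_prob:
            return bounce                      # photon absorbed at the wall
        normal = math.atan2(y, x)              # outward wall normal angle
        if rng.random() < specular_frac:
            # Specular: reflect the direction about the wall normal.
            theta = 2.0 * normal + math.pi - theta
        else:
            # Diffuse: new direction drawn over the inward hemisphere.
            theta = normal + math.pi + rng.uniform(-math.pi / 2, math.pi / 2)
    return max_bounces
```

With an absorption probability of 0.3 per encounter, the mean number of encounters before absorption is about 1/0.3.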
NASA Astrophysics Data System (ADS)
Korol, Roman; Kilgour, Michael; Segal, Dvira
2018-03-01
We present our in-house quantum transport package, ProbeZT. This program provides linear response coefficients: electrical and electronic thermal conductances, as well as the thermopower of molecular junctions in which electrons interact with the surrounding thermal environment. Calculations are performed based on the Büttiker probe method, which introduces decoherence, energy exchange, and dissipation effects phenomenologically using virtual electrode terminals called probes. The program implements different types of probes, each introducing different environmental effects, including elastic and inelastic scattering of electrons. The molecular system is described by an arbitrary tight-binding Hamiltonian, allowing the study of different geometries beyond simple one-dimensional wires. Applications of the program to study the thermoelectric performance of molecular junctions are illustrated. The program also has a built-in functionality to simulate electron transport in double-stranded DNA molecules based on a tight-binding (ladder) description of the junction.
Managing human error in aviation.
Helmreich, R L
1997-05-01
Crew resource management (CRM) programs were developed to address team and leadership aspects of piloting modern airplanes. The goal is to reduce errors through teamwork. Human factors research and social, cognitive, and organizational psychology are used to develop programs tailored for individual airlines. Flight crews study accident case histories, group dynamics, and human error. Simulators provide pilots with the opportunity to solve complex flight problems. CRM in the simulator is called line-oriented flight training (LOFT). In automated cockpits, CRM promotes the idea of automation as a crew member. Cultural aspects of aviation include professional, business, and national culture. The aviation CRM model has been adapted for training surgeons and operating room staff in human factors.
Cunningham, Charles E; Rimas, Heather; Chen, Yvonne; Deal, Ken; McGrath, Patrick; Lingley-Pottie, Patricia; Reid, Graham J; Lipman, Ellen; Corkum, Penny
2015-01-01
Using a discrete choice conjoint experiment, we explored the design of parenting programs as an interim strategy for families waiting for children's mental health treatment. Latent class analysis yielded 4 segments with different design preferences. Simulations predicted the Fast-Paced Personal Contact segment, 22.1% of the sample, would prefer weekly therapist-led parenting groups. The Moderate-Paced Personal Contact segment (24.7%) preferred twice-monthly therapist-led parenting groups with twice-monthly lessons. The Moderate-Paced E-Contact segment (36.3%) preferred weekly to twice-monthly contacts, e-mail networking, and a program combining therapist-led sessions with the support of a computerized telephone e-coach. The Slow-Paced E-Contact segment (16.9%) preferred an approach combining monthly therapist-led sessions, e-coaching, and e-mail networking with other parents. Simulations predicted 45.3% of parents would utilize an option combining 5 therapist coaching calls with 5 e-coaching calls, a model that could reduce costs and extend the availability of interim services. Although 41.0% preferred weekly pacing, 58% were predicted to choose an interim parenting service conducted at a twice-monthly to monthly pace. The results of this study suggest that developing interim services reflecting parental preferences requires a choice of formats that includes parenting groups, telephone-coached distance programs, and e-coaching options conducted at a flexible pace.
Teaching Note-Teaching Student Interviewing Competencies through Second Life
ERIC Educational Resources Information Center
Tandy, Cynthia; Vernon, Robert; Lynch, Darlene
2017-01-01
A prototype standardized client was created and programmed to respond to students in the 3D virtual world of Second Life. This automaton, called a "chatbot," was repeatedly interviewed by beginning MSW students in a practice course as a learning exercise. Initial results were positive and suggest the use of simulated clients in virtual…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goltz, G.; Weiner, H.
A computer program has been developed for designing and analyzing the performance of solar array/battery power systems for the U. S. Coast Guard Navigational Aids. This program is called the Design Synthesis/Performance Analysis (DSPA) Computer Program. The basic function of the Design Synthesis portion of the DSPA program is to evaluate functional and economic criteria to provide specifications for viable solar array/battery power systems. The basic function of the Performance Analysis portion of the DSPA program is to simulate the operation of solar array/battery power systems under specific loads and environmental conditions. This document provides all the information necessary to access the DSPA programs, to input required data, and to generate appropriate Design Synthesis or Performance Analysis output.
Pool, René; Heringa, Jaap; Hoefling, Martin; Schulz, Roland; Smith, Jeremy C; Feenstra, K Anton
2012-05-05
We report on a python interface to the GROMACS molecular simulation package, GromPy (available at https://github.com/GromPy). This application programming interface (API) uses the ctypes python module that allows function calls to shared libraries, for example, written in C. To the best of our knowledge, this is the first reported interface to the GROMACS library that uses direct library calls. GromPy can be used for extending the current GROMACS simulation and analysis modes. In this work, we demonstrate that the interface enables hybrid Monte-Carlo/molecular dynamics (MD) simulations in the grand-canonical ensemble, a simulation mode that is currently not implemented in GROMACS. For this application, the interplay between GromPy and GROMACS requires only minor modifications of the GROMACS source code, not affecting the operation, efficiency, and performance of the GROMACS applications. We validate the grand-canonical application against MD in the canonical ensemble by comparison of equations of state. The results of the grand-canonical simulations are in complete agreement with MD in the canonical ensemble. The python overhead of the grand-canonical scheme is minimal. Copyright © 2012 Wiley Periodicals, Inc.
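The mechanism GromPy relies on, ctypes calls directly into a shared library, can be illustrated with the standard C math library rather than the GROMACS library itself (loading the actual GROMACS shared library would require a local build). The library name fallback is a Linux assumption:

```python
import ctypes
import ctypes.util

# Locate and load the C math library; fall back to the common Linux soname
# if ctypes.util cannot resolve it (e.g., in minimal containers).
libname = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(libname)

# Declare the C signature so ctypes marshals arguments correctly;
# GromPy does the same for the GROMACS entry points it calls.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

result = libm.cos(0.0)   # a direct library call, no Python wrapper layer
```

Declaring `argtypes`/`restype` is essential: without them ctypes defaults to `int`, silently corrupting `double` arguments and return values.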
Aerodynamics model for a generic ASTOVL lift-fan aircraft
NASA Technical Reports Server (NTRS)
Birckelbaw, Lourdes G.; Mcneil, Walter E.; Wardwell, Douglas A.
1995-01-01
This report describes the aerodynamics model used in a simulation model of an advanced short takeoff and vertical landing (ASTOVL) lift-fan fighter aircraft. The simulation model was developed for use in piloted evaluations of transition and hover flight regimes, so that only low speed (M approximately 0.2) aerodynamics are included in the mathematical model. The aerodynamic model includes the power-off aerodynamic forces and moments and the propulsion system induced aerodynamic effects, including ground effects. The power-off aerodynamics data were generated using the U.S. Air Force Stability and Control Digital DATCOM program and a NASA Ames in-house graphics program called VORVIEW which allows the user to easily analyze arbitrary conceptual aircraft configurations using the VORLAX program. The jet-induced data were generated using the prediction methods of R. E. Kuhn et al., as referenced in this report.
Reservoir property grids improve with geostatistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vogt, J.
1993-09-01
Visualization software, reservoir simulators, and many other E and P software applications need reservoir property grids as input. Using geostatistics, as compared to other gridding methods, to produce these grids leads to the best output from the software programs. For the purposes of this discussion, geostatistics is simply two types of gridding methods. Mathematically, these methods are based on minimizing or duplicating certain statistical properties of the input data. One geostatistical method, called kriging, is used when the highest possible point-by-point accuracy is desired. The other method, called conditional simulation, is used when one wants the statistics and texture of the resulting grid to be the same as for the input data. In the following discussion, each method is explained, compared to other gridding methods, and illustrated through example applications. Proper use of geostatistical data in flow simulations, use of geostatistical data for history matching, and situations where geostatistics has no significant advantage over other methods also will be covered.
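The kriging approach mentioned above can be sketched minimally as ordinary kriging with an assumed exponential semivariogram. The range parameter (50) is arbitrary, and this omits the variogram fitting, nugget/sill handling, and search neighborhoods used in real gridding packages:

```python
import numpy as np

def ordinary_krige(xy, values, grid_pts,
                   variogram=lambda h: 1.0 - np.exp(-h / 50.0)):
    """Estimate values at grid_pts from scattered data (xy, values) by
    ordinary kriging with a given semivariogram model. Illustrative only."""
    n = len(xy)
    # Kriging system: semivariances between data points, plus a Lagrange
    # row/column that forces the weights to sum to 1 (unbiasedness).
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0
    out = []
    for p in grid_pts:
        b = np.ones(n + 1)
        b[:n] = variogram(np.linalg.norm(xy - p, axis=1))
        w = np.linalg.solve(A, b)[:n]   # kriging weights for this grid node
        out.append(float(w @ values))
    return np.array(out)
```

A useful sanity check, reflecting the "point-by-point accuracy" property in the abstract: kriging is an exact interpolator, so estimating at a data location reproduces the data value.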
The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.
2005-12-01
The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships, and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME is stored in a digital library system, the Storage Resource Broker (SRB). This system provides a robust and secure system for maintaining the association between the data sets and their metadata.
To provide an easy-to-use system for constructing SHA computations, a browser-based workflow assembly web portal has been developed. Users can compose complex SHA calculations, specifying SCEC/CME data sets as inputs to calculations, and calling SCEC/CME computational programs to process the data and the output. Knowledge-based software tools have been implemented that use ontological descriptions of SHA software and data to validate workflows created with this pathway assembly tool. Data visualization software developed by the collaboration supports analysis and validation of data sets. Several programs have been developed to visualize SCEC/CME data, including GMT-based map making software for PSHA codes, 4D wavefield propagation visualization software based on OpenGL, and 3D Geowall-based visualization of earthquakes, faults, and seismic wave propagation. The SCEC/CME Project also helps to sponsor the SCEC UseIT Intern program. The UseIT Intern Program provides research opportunities in both Geosciences and Information Technology to undergraduate students in a variety of fields. The UseIT group has developed a 3D data visualization tool, called SCEC-VDO, as a part of this undergraduate research program.
Cognitive simulation as a tool for cognitive task analysis.
Roth, E M; Woods, D D; Pople, H E
1992-10-01
Cognitive simulations are runnable computer programs that represent models of human cognitive activities. We show how one cognitive simulation built as a model of some of the cognitive processes involved in dynamic fault management can be used in conjunction with small-scale empirical data on human performance to uncover the cognitive demands of a task, to identify where intention errors are likely to occur, and to point to improvements in the person-machine system. The simulation, called Cognitive Environment Simulation or CES, has been exercised on several nuclear power plant accident scenarios. Here we report one case to illustrate how a cognitive simulation tool such as CES can be used to clarify the cognitive demands of a problem-solving situation as part of a cognitive task analysis.
NASA Technical Reports Server (NTRS)
Platt, M. E.; Lewis, E. E.; Boehm, F.
1991-01-01
A Monte Carlo Fortran computer program was developed that uses two variance reduction techniques for computing system reliability, applicable to solving very large, highly reliable fault-tolerant systems. The program is consistent with the hybrid automated reliability predictor (HARP) code, which employs behavioral decomposition and complex fault-error handling models. This new capability is called MC-HARP, which efficiently solves reliability models with non-constant failure rates (Weibull). Common-mode failure modeling is also supported.
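The core idea, Monte Carlo estimation of system reliability with non-constant (Weibull) failure rates, can be sketched for a simple k-out-of-n redundant system. The parameters and the k-out-of-n structure are illustrative assumptions; they are not taken from HARP/MC-HARP, and no variance reduction is applied here:

```python
import random

def system_unreliability(n_trials, mission_time, shape, scale,
                         n_components, k_required, rng=None):
    """Estimate the probability that fewer than k_required of n_components
    survive the mission, with Weibull-distributed component lifetimes."""
    rng = rng or random.Random()
    failures = 0
    for _ in range(n_trials):
        # weibullvariate(scale, shape) samples a Weibull failure time;
        # shape != 1 gives a time-varying (non-constant) hazard rate.
        times = [rng.weibullvariate(scale, shape) for _ in range(n_components)]
        surviving = sum(t > mission_time for t in times)
        if surviving < k_required:
            failures += 1
    return failures / n_trials
```

With shape = 1 the Weibull reduces to the exponential, so a single-component system has unreliability 1 - exp(-t/scale), a handy analytic check.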
NASA Technical Reports Server (NTRS)
Carlson, C. R.
1981-01-01
The user documentation of the SYSGEN model and its links with other simulations is described. SYSGEN is a production costing and reliability model of electric utility systems. Hydroelectric, storage, and time-dependent generating units are modeled in addition to conventional generating plants. Input variables, modeling options, output variables, and report formats are explained. SYSGEN also can be run interactively by using a program called FEPS (Front End Program for SYSGEN). A format for SYSGEN input variables which is designed for use with FEPS is presented.
A Modular Three-Dimensional Finite-Difference Ground-Water Flow Model
McDonald, Michael G.; Harbaugh, Arlen W.; Guo, Weixing; Lu, Guoping
1988-01-01
This report presents a finite-difference model and its associated modular computer program. The model simulates flow in three dimensions. The report includes detailed explanations of physical and mathematical concepts on which the model is based and an explanation of how those concepts are incorporated in the modular structure of the computer program. The modular structure consists of a Main Program and a series of highly independent subroutines called 'modules.' The modules are grouped into 'packages.' Each package deals with a specific feature of the hydrologic system which is to be simulated, such as flow from rivers or flow into drains, or with a specific method of solving linear equations which describe the flow system, such as the Strongly Implicit Procedure or Slice-Successive Overrelaxation. The division of the program into modules permits the user to examine specific hydrologic features of the model independently. This also facilitates the development of additional capabilities because new packages can be added to the program without modifying the existing packages. The input and output systems of the computer program are also designed to permit maximum flexibility. Ground-water flow within the aquifer is simulated using a block-centered finite-difference approach. Layers can be simulated as confined, unconfined, or a combination of confined and unconfined. Flow associated with external stresses, such as wells, areal recharge, evapotranspiration, drains, and streams, can also be simulated. The finite-difference equations can be solved using either the Strongly Implicit Procedure or Slice-Successive Overrelaxation. The program is written in FORTRAN 77 and will run without modification on most computers that have a FORTRAN 77 compiler. For each program module, this report includes a narrative description, a flow chart, a list of variables, and a module listing.
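The block-centered finite-difference idea the model applies in three dimensions can be shown in its simplest one-dimensional steady-state form: fixed heads at both ends, uniform transmissivity, and areal recharge. The grid size, transmissivity, and recharge values are illustrative assumptions, and this direct dense solve stands in for the iterative solvers (SIP, SSOR) the program actually uses:

```python
import numpy as np

def steady_1d_heads(n=10, dx=100.0, T=500.0, recharge=0.0,
                    h_left=10.0, h_right=8.0):
    """Solve steady 1-D confined flow on n block-centered cells:
    T*(h[i-1] - 2h[i] + h[i+1])/dx^2 + recharge = 0, with fixed heads
    h_left/h_right at ghost nodes outside the grid. Units illustrative."""
    A = np.zeros((n, n))
    b = np.full(n, -recharge * dx * dx / T)
    for i in range(n):
        A[i, i] = -2.0
        if i > 0:
            A[i, i - 1] = 1.0
        if i < n - 1:
            A[i, i + 1] = 1.0
    # Move the known boundary heads to the right-hand side.
    b[0] -= h_left
    b[-1] -= h_right
    return np.linalg.solve(A, b)
```

With zero recharge the solution is the straight line between the boundary heads; with recharge the heads mound above that line, both easy checks on the discretization.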
A Conceptual View of the Officer Procurement Model (TOPOPS). Technical Report No. 73-73.
ERIC Educational Resources Information Center
Akman, Allan; Nordhauser, Fred
This report presents the conceptual design of a computer-based linear programming model of the Air Force officer procurement system called TOPOPS. The TOPOPS model is an aggregate model which simulates officer accession and training and is directed at optimizing officer procurement in terms of either minimizing cost or maximizing accession quality…
NASA Technical Reports Server (NTRS)
Goltz, G.; Kaiser, L. M.; Weiner, H.
1977-01-01
A computer program has been developed for designing and analyzing the performance of solar array/battery power systems for the U.S. Coast Guard Navigational Aids. This program is called the Design Synthesis/Performance Analysis (DSPA) Computer Program. The basic function of the Design Synthesis portion of the DSPA program is to evaluate functional and economic criteria to provide specifications for viable solar array/battery power systems. The basic function of the Performance Analysis portion of the DSPA program is to simulate the operation of solar array/battery power systems under specific loads and environmental conditions. This document establishes the software requirements for the DSPA computer program, discusses the processing that occurs within the program, and defines the necessary interfaces for operation.
NASA Technical Reports Server (NTRS)
Dubos, Gregory F.; Cornford, Steven
2012-01-01
While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can however become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedule as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".
DOE Office of Scientific and Technical Information (OSTI.GOV)
G.A. Pope; K. Sephernoori; D.C. McKinney
1996-03-15
This report describes the application of distributed-memory parallel programming techniques to a compositional simulator called UTCHEM. The University of Texas Chemical Flooding reservoir simulator (UTCHEM) is a general-purpose vectorized chemical flooding simulator that models the transport of chemical species in three-dimensional, multiphase flow through permeable media. The parallel version of UTCHEM addresses solving large-scale problems by reducing the amount of time that is required to obtain the solution as well as providing a flexible and portable programming environment. In this work, the original parallel version of UTCHEM was modified and ported to CRAY T3D and CRAY T3E distributed-memory multiprocessor computers using CRAY-PVM as the interprocessor communication library. Also, the data communication routines were modified such that the portability of the original code across different computer architectures was made possible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goltz, G.; Weiner, H.
A computer program has been developed for designing and analyzing the performance of solar array/battery power systems for the U.S. Coast Guard Navigational Aids. This program is called the Design Synthesis/Performance Analysis (DSPA) Computer Program. The basic function of the Design Synthesis portion of the DSPA program is to evaluate functional and economic criteria to provide specifications for viable solar array/battery power systems. The basic function of the Performance Analysis portion of the DSPA program is to simulate the operation of solar array/battery power systems under specific loads and environmental conditions. This document provides a detailed description of the DSPA Computer Program system and its subprograms. This manual will assist the programmer in revising or updating the several subprograms.
VPython: Writing Real-time 3D Physics Programs
NASA Astrophysics Data System (ADS)
Chabay, Ruth
2001-06-01
VPython (http://cil.andrew.cmu.edu/projects/visual) combines the Python programming language with an innovative 3D graphics module called Visual, developed by David Scherer. Designed to make 3D physics simulations accessible to novice programmers, VPython allows the programmer to write a purely computational program without any graphics code, and produces an interactive real-time 3D graphical display. In a program, 3D objects are created and their positions modified by computational algorithms. Running in a separate thread, the Visual module monitors the positions of these objects and renders them many times per second. Using the mouse, one can zoom and rotate to navigate through the scene. After one hour of instruction, students in an introductory physics course at Carnegie Mellon University, including those who have never programmed before, write programs in VPython to model the behavior of physical systems and to visualize fields in 3D. The Numeric array processing module allows the construction of more sophisticated simulations and models as well. VPython is free and open source. The Visual module is based on OpenGL, and runs on Windows, Linux, and Macintosh.
NASA Technical Reports Server (NTRS)
Yanosy, James L.
1988-01-01
Emulation/Simulation Computer Model (ESCM) computes the transient performance of a Space Station air revitalization subsystem with carbon dioxide removal provided by a solid amine water desorbed subsystem called SAWD. This manual describes the mathematical modeling and equations used in the ESCM. For the system as a whole and for each individual component, the fundamental physical and chemical laws which govern their operations are presented. Assumptions are stated, and when necessary, data is presented to support empirically developed relationships.
A human operator simulator model of the NASA Terminal Configured Vehicle (TCV)
NASA Technical Reports Server (NTRS)
Glenn, F. A., III; Doane, S. M.
1981-01-01
A generic operator model called HOS was used to simulate the behavior and performance of a pilot flying a transport airplane during instrument approach and landing operations in order to demonstrate the applicability of the model to problems associated with interfacing a crew with a flight system. The model which was installed and operated on NASA Langley's central computing system is described. Preliminary results of its application to an investigation of an innovative display system under development in Langley's terminal configured vehicle program are considered.
Leake, S.A.; Prudic, David E.
1988-01-01
The process of permanent compaction is not routinely included in simulations of groundwater flow. To simulate storage changes from both elastic and inelastic compaction, a computer program was written for use with the U.S. Geological Survey modular finite-difference groundwater flow model. The new program is called the Interbed-Storage Package. In the Interbed-Storage Package, elastic compaction or expansion is assumed to be proportional to change in head. The constant of proportionality is the product of the skeletal component of elastic specific storage and the thickness of the sediments. Similarly, inelastic compaction is assumed to be proportional to decline in head. The constant of proportionality is the product of the skeletal component of inelastic specific storage and the thickness of the sediments. Storage changes are incorporated into the groundwater flow model by adding an additional term to the flow equation. Within a model time step, the package appropriately apportions storage changes between elastic and inelastic components on the basis of the relation of simulated head to the previous minimum head. Another package that allows for a time-varying specified-head boundary is also documented. This package was written to reduce the data requirements for test simulations of the Interbed-Storage Package. (USGS)
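The elastic/inelastic apportioning rule described above can be sketched as a per-time-step update: head changes above the previous minimum head are elastic (recoverable), while declines below it are inelastic (permanent). The variable names (Sske, Sskv) and sign convention are illustrative assumptions, not the package's actual interface:

```python
def interbed_storage_change(h_old, h_new, h_min, Sske, Sskv, thickness):
    """Apportion compaction between elastic and inelastic components.
    h_min is the previous minimum head (so h_old >= h_min by construction);
    Sske/Sskv are skeletal elastic/inelastic specific storages (1/m).
    Returns (compaction, updated h_min); positive compaction = thinning."""
    if h_new >= h_min:
        # Head stays above the previous minimum: fully elastic response
        # (compaction if head drops, expansion if head recovers).
        compaction = Sske * thickness * (h_old - h_new)
    else:
        # Head declines below the previous minimum: elastic response down
        # to h_min, inelastic (permanent) compaction below it.
        compaction = (Sske * thickness * (h_old - h_min)
                      + Sskv * thickness * (h_min - h_new))
    return compaction, min(h_min, h_new)
```

Because Sskv typically exceeds Sske by orders of magnitude, declines below the previous minimum head dominate the simulated subsidence.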
National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion
NASA Technical Reports Server (NTRS)
Follen, G.; Naiman, C.; Evans, A.
1999-01-01
Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. The National Cycle Program (NCP) was written following the Object Oriented Paradigm (C++, CORBA). The software development process used was also based on the Object Oriented paradigm. Software reviews, configuration management, test plans, requirements, and design were all a part of the process used in developing NCP. Due to the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.
NASA Astrophysics Data System (ADS)
Bagli, Enrico; Guidi, Vincenzo
2013-08-01
A toolkit for the simulation of coherent interactions between high-energy charged particles and complex crystal structures, called DYNECHARM++, has been developed. The code is written in C++, taking advantage of object-oriented programming. The code is capable of evaluating the electrical characteristics of complex atomic structures and of simulating and tracking particle trajectories within them. The electrical characteristics are calculated via their expansion in Fourier series. Two different approaches to simulating the interaction have been adopted, relying on full integration of particle trajectories under the continuum potential approximation and on the definition of cross-sections of coherent processes. Finally, the code has been shown to reproduce experimental results and to simulate the interaction of charged particles with complex structures.
Nishizawa, Hiroaki; Nishimura, Yoshifumi; Kobayashi, Masato; Irle, Stephan; Nakai, Hiromi
2016-08-05
The linear-scaling divide-and-conquer (DC) quantum chemical methodology is applied to the density-functional tight-binding (DFTB) theory to develop a massively parallel program that achieves on-the-fly molecular reaction dynamics simulations of huge systems from scratch. The functions to perform large-scale geometry optimization and molecular dynamics with the DC-DFTB potential energy surface are implemented in the program, called DC-DFTB-K. A novel interpolation-based algorithm is developed for parallelizing the determination of the Fermi level in the DC method. The performance of the DC-DFTB-K program is assessed using a laboratory computer and the K computer. Numerical tests show the high efficiency of the DC-DFTB-K program: a single-point energy gradient calculation of a one-million-atom system is completed within 60 s using 7290 nodes of the K computer. © 2016 Wiley Periodicals, Inc.
Cockpit Resource Management (CRM) training in the 1550th combat crew training wing
NASA Technical Reports Server (NTRS)
Fiedler, Michael T.
1987-01-01
The training program that the 1550th Combat Crew Training Wing at Kirtland Air Force Base, New Mexico, implemented in September 1985 is discussed. The program is called Aircrew Coordination Training (ACT), and it is designed specifically to help aircrew members work more effectively as a team in their respective aircraft and, hopefully, to reduce human factors-related accidents. The scope of the 1550th CCTW's training responsibilities is described, along with the structure of the program and a brief look at the content of the academic part of the course. The Mission-Oriented Simulator Training (MOST) program, which is similar to Line Oriented Flight Training (LOFT) programs, is then discussed. Finally, future plans for the Aircrew Coordination Training Program at the 1550th are discussed.
The Navy/NASA Engine Program (NNEP89): A user's manual
NASA Technical Reports Server (NTRS)
Plencner, Robert M.; Snyder, Christopher A.
1991-01-01
An engine simulation computer code called NNEP89 was written to perform 1-D steady state thermodynamic analysis of turbine engine cycles. By using a very flexible method of input, a set of standard components are connected at execution time to simulate almost any turbine engine configuration that the user could imagine. The code was used to simulate a wide range of engine cycles from turboshafts and turboprops to air turborockets and supersonic cruise variable cycle engines. Off design performance is calculated through the use of component performance maps. A chemical equilibrium model is incorporated to adequately predict chemical dissociation as well as model virtually any fuel. NNEP89 is written in standard FORTRAN77 with clear structured programming and extensive internal documentation. The standard FORTRAN77 programming allows it to be installed onto most mainframe computers and workstations without modification. The NNEP89 code was derived from the Navy/NASA Engine program (NNEP). NNEP89 provides many improvements and enhancements to the original NNEP code and incorporates features which make it easier to use for the novice user. This is a comprehensive user's guide for the NNEP89 code.
Admixture Aberration Analysis: Application to Mapping in Admixed Population Using Pooled DNA
NASA Astrophysics Data System (ADS)
Bercovici, Sivan; Geiger, Dan
Admixture mapping is a gene mapping approach used for the identification of genomic regions harboring disease susceptibility genes in recently admixed populations such as African Americans. We present a novel method for admixture mapping, called admixture aberration analysis (AAA), that uses a DNA pool of affected admixed individuals. We demonstrate through simulations that AAA is a powerful and economical mapping method under a range of scenarios, including complex human diseases such as hypertension and end-stage kidney disease. The method has a low false-positive rate and is robust to deviation from model assumptions. Finally, we apply AAA to 600 prostate cancer-affected African Americans, replicating a known risk locus. Simulation results indicate that the method can yield over a 96% reduction in genotyping. Our method is implemented as a Java program called AAAmap and is freely available.
JMSS-1: a new Martian soil simulant
NASA Astrophysics Data System (ADS)
Zeng, Xiaojia; Li, Xiongyao; Wang, Shijie; Li, Shijie; Spring, Nicole; Tang, Hong; Li, Yang; Feng, Junming
2015-05-01
It is important to develop Martian soil simulants that can be used in Mars exploration programs and Mars research. A new Martian soil simulant, called Jining Martian Soil Simulant (JMSS-1), was developed at the Lunar and Planetary Science Research Center at the Institute of Geochemistry, Chinese Academy of Sciences. The raw materials of JMSS-1 are Jining basalt and Fe oxides (magnetite and hematite). JMSS-1 was produced by mechanically crushing Jining basalt with the addition of small amounts of magnetite and hematite. The properties of this simulant, including chemical composition, mineralogy, particle size, mechanical properties, reflectance spectra, dielectric properties, volatile content, and hygroscopicity, have been analyzed. On the basis of these test results, it was demonstrated that JMSS-1 is an ideal Martian soil simulant in terms of chemical composition, mineralogy, and physical properties. JMSS-1 would therefore be an appropriate Martian soil simulant for future scientific and engineering experiments in China's Mars exploration program.
Communicating Value in Simulation: Cost-Benefit Analysis and Return on Investment.
Asche, Carl V; Kim, Minchul; Brown, Alisha; Golden, Antoinette; Laack, Torrey A; Rosario, Javier; Strother, Christopher; Totten, Vicken Y; Okuda, Yasuharu
2018-02-01
Value-based health care requires a balancing of medical outcomes with economic value. Administrators need to understand both the clinical and the economic effects of potentially expensive simulation programs to rationalize the costs. Given the often-disparate priorities of clinical educators relative to health care administrators, justifying the value of simulation requires the use of economic analyses few physicians have been trained to conduct. Clinical educators need to be able to present thorough economic analyses demonstrating returns on investment and cost-effectiveness to effectively communicate with administrators. At the 2017 Academic Emergency Medicine Consensus Conference "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes," our breakout session critically evaluated the cost-benefit and return on investment of simulation. In this paper we provide an overview of some of the economic tools that a clinician may use to present the value of simulation training to financial officers and other administrators in the economic terms they understand. We also define three themes as a call to action for research related to cost-benefit analysis in simulation as well as four specific research questions that will help guide educators and hospital leadership to make decisions on the value of simulation for their system or program. © 2017 by the Society for Academic Emergency Medicine.
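The arithmetic behind return on investment and benefit-cost comparisons is straightforward; the sketch below uses invented dollar figures (not drawn from the paper) to show the form such an analysis takes:

```python
# Illustrative cost-benefit arithmetic; all dollar figures are invented
# and are not taken from the consensus-conference paper.
def roi(total_benefit, total_cost):
    """Return on investment as a fraction: (benefit - cost) / cost."""
    return (total_benefit - total_cost) / total_cost

def benefit_cost_ratio(total_benefit, total_cost):
    """Benefit-cost ratio: a program pays off when this exceeds 1.0."""
    return total_benefit / total_cost

costs = {"simulator": 50_000, "faculty_time": 20_000, "supplies": 5_000}
benefits = {"averted_adverse_events": 60_000, "reduced_training_hours": 30_000}
tc, tb = sum(costs.values()), sum(benefits.values())
summary = f"ROI = {roi(tb, tc):.0%}, BCR = {benefit_cost_ratio(tb, tc):.2f}"
```

Presenting results in exactly these terms (ROI as a percentage, benefits per dollar spent) matches the vocabulary financial officers use.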
Case-based tutoring from a medical knowledge base.
Chin, H L; Cooper, G F
1989-01-01
The past decade has seen the emergence of programs that make use of large knowledge bases to assist physicians in diagnosis within the general field of internal medicine. One such program, Internist-I, contains knowledge about over 600 diseases, covering a significant proportion of internal medicine. This paper describes the process of converting a subset of this knowledge base--in the area of cardiovascular diseases--into a probabilistic format, and the use of this resulting knowledge base to teach medical diagnostic knowledge. The system (called KBSimulator--for Knowledge-Based patient Simulator) generates simulated patient cases and uses these cases as a focal point from which to teach medical knowledge. This project demonstrates the feasibility of building an intelligent, flexible instructional system that uses a knowledge base constructed primarily for medical diagnosis.
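Case generation from a probabilistic knowledge base can be sketched roughly as follows; the diseases, priors, and conditional probabilities are invented placeholders, not values from the Internist-I knowledge base:

```python
import random

# Hypothetical probabilistic knowledge base: disease priors and
# P(finding | disease). All names and numbers are invented placeholders.
KB = {
    "aortic_stenosis": {
        "prior": 0.02,
        "findings": {"systolic_murmur": 0.9, "syncope": 0.4},
    },
    "mitral_regurgitation": {
        "prior": 0.03,
        "findings": {"systolic_murmur": 0.8, "dyspnea": 0.6},
    },
}

def simulate_case(kb, rng):
    """Sample one disease according to its prior, then sample each
    finding independently given the disease."""
    diseases, priors = zip(*((d, v["prior"]) for d, v in kb.items()))
    disease = rng.choices(diseases, weights=priors)[0]
    findings = [f for f, p in kb[disease]["findings"].items() if rng.random() < p]
    return disease, findings

disease, findings = simulate_case(KB, random.Random(42))
```

A tutoring loop can then present the sampled findings to the student and compare the student's hypotheses against the disease actually drawn.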
Emile: Software-Realized Scaffolding for Science Learners Programming in Mixed Media
NASA Astrophysics Data System (ADS)
Guzdial, Mark Joseph
Emile is a computer program that facilitates students using programming to create models of kinematics (physics of motion without forces) and executing these models as simulations. Emile facilitates student programming and model-building with software-realized scaffolding (SRS). Emile integrates a range of SRS and provides mechanisms to fade (or diminish) most scaffolding. By fading Emile's SRS, students can adapt support to their individual needs. Programming in Emile involves graphic and text elements (as compared with more traditional text-based programming). For example, students create graphical objects which can be dragged on the screen, and when dropped, fall as if in a gravitational field. Emile supports a simplified, HyperCard-like mixed media programming framework. Scaffolding is defined as support which enables student performance (called the immediate benefit of scaffolding) and which facilitates student learning (called the lasting benefit of scaffolding). Scaffolding provides this support through three methods: Modeling, coaching, and eliciting articulation. For example, Emile has tools to structure the programming task (modeling), menus identify the next step in the programming and model-building process (coaching), and prompts for student plans and predictions (eliciting articulation). Five students used Emile in a summer workshop (45 hours total) focusing on creating kinematics simulations and multimedia demonstrations. Evaluation of Emile's scaffolding addressed use of scaffolding and the expected immediate and lasting benefits. Emile created records of student interactions (log files) which were analyzed to determine how students used Emile's SRS and how they faded that scaffolding. Student projects and articulations about those projects were analyzed to assess success of student's model-building and programming activities. 
Clinical interviews were conducted before and after the workshop to determine students' conceptualizations of kinematics and programming and how they changed. The results indicate that students were successful at model-building and programming, learned physics and programming, and used and faded Emile's scaffolding over time. These results are from a small sample who were self-selected and highly motivated. Nonetheless, this study provides a theory and operationalization for SRS, an example of a successful model-building environment, and a description of student use of mixed media programming.
49 CFR 198.37 - State one-call damage prevention program.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 3 2010-10-01 2010-10-01 false State one-call damage prevention program. 198.37... REGULATIONS FOR GRANTS TO AID STATE PIPELINE SAFETY PROGRAMS Adoption of One-Call Damage Prevention Program § 198.37 State one-call damage prevention program. A State must adopt a one-call damage prevention...
Web-based training: a new paradigm in computer-assisted instruction in medicine.
Haag, M; Maylein, L; Leven, F J; Tönshoff, B; Haux, R
1999-01-01
Computer-assisted instruction (CAI) programs based on internet technologies, especially on the world wide web (WWW), provide new opportunities in medical education. The aim of this paper is to examine different aspects of such programs, which we call 'web-based training (WBT) programs', and to differentiate them from conventional CAI programs. First, we will distinguish five different interaction types: presentation; browsing; tutorial dialogue; drill and practice; and simulation. In contrast to conventional CAI, there are four architectural types of WBT programs: client-based; remote data and knowledge; distributed teaching; and server-based. We will discuss the implications of the different architectures for developing WBT software. WBT programs have to meet other requirements than conventional CAI programs. The most important tools and programming languages for developing WBT programs will be listed and assigned to the architecture types. For the future, we expect a trend from conventional CAI towards WBT programs.
Takano, Yu; Nakata, Kazuto; Yonezawa, Yasushige; Nakamura, Haruki
2016-05-05
A massively parallel program for quantum mechanical-molecular mechanical (QM/MM) molecular dynamics simulation, called Platypus (PLATform for dYnamic Protein Unified Simulation), was developed to elucidate protein functions. The speedup and the parallelization ratio of Platypus in the QM and QM/MM calculations were assessed for a bacteriochlorophyll dimer in the photosynthetic reaction center (DIMER) on the K computer, a massively parallel computer achieving 10 PetaFLOPs with 705,024 cores. Platypus exhibited increasing speedup up to 20,000 cores for the HF/cc-pVDZ and B3LYP/cc-pVDZ calculations, and up to 10,000 cores for the CASCI(16,16)/6-31G** calculations. We also performed excited-state QM/MM-MD simulations on the chromophore of Sirius (SIRIUS) in water. Sirius is a pH-insensitive and photo-stable ultramarine fluorescent protein. Platypus accelerated on-the-fly excited-state QM/MM-MD simulations for SIRIUS in water, using over 4000 cores. In addition, it also succeeded in 50-ps (200,000-step) on-the-fly excited-state QM/MM-MD simulations for SIRIUS in water. © 2016 The Authors. Journal of Computational Chemistry Published by Wiley Periodicals, Inc.
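One common way to relate a measured speedup to a parallelization ratio is Amdahl's law; this is a generic sketch for interpreting such numbers, not the analysis actually used in the paper, and the parallel fraction below is invented:

```python
# Generic Amdahl's-law bookkeeping for relating speedup to a parallel
# fraction; the numbers are illustrative, not Platypus's measurements.
def speedup(p, n):
    """Ideal speedup on n cores when a fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

def parallel_fraction(s, n):
    """Invert Amdahl's law to estimate the parallelization ratio from a
    measured speedup s on n cores."""
    return (1.0 - 1.0 / s) / (1.0 - 1.0 / n)

s = speedup(0.99, 10_000)       # even 1% serial work caps speedup near 100x
p = parallel_fraction(s, 10_000)
```

The inversion follows because 1 - 1/s = p(1 - 1/n), so the estimate round-trips exactly when the model holds.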
ALCF Data Science Program: Productive Data-centric Supercomputing
NASA Astrophysics Data System (ADS)
Romero, Nichols; Vishwanath, Venkatram
The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCF's forthcoming 8.5-petaflops Intel/Cray system. The program will transition to the 200-petaflops Aurora supercomputing system when it becomes available. In 2016, four projects were selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals, with the next call expected in Spring 2017 (http://www.alcf.anl.gov/alcf-data-science-program). This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.
A uniform input data convention for the Calspan 3-D crash victim simulation
NASA Astrophysics Data System (ADS)
Shaibani, S. J.
1982-07-01
Logical schemes for the labelling of planes (cards D) and functions (cards E) in the input decks used for the Calspan 3-D Crash Victim Simulation (CVS) program are proposed. One benefit of introducing such a standardized format for these inputs would be to greatly facilitate the interchange of data for different vehicles. A further advantage would be that the table of allowed contacts (cards F) could remain largely unaltered. It is hoped that the uniformity of the convention described by these schemes would help to promote the exchange of readily usable data among CVS users.
A Fast-Time Simulation Environment for Airborne Merging and Spacing Research
NASA Technical Reports Server (NTRS)
Bussink, Frank J. L.; Doble, Nathan A.; Barmore, Bryan E.; Singer, Sharon
2005-01-01
As part of NASA's Distributed Air/Ground Traffic Management (DAG-TM) effort, NASA Langley Research Center is developing concepts and algorithms for merging multiple aircraft arrival streams and precisely spacing aircraft over the runway threshold. An airborne tool has been created for this purpose, called Airborne Merging and Spacing for Terminal Arrivals (AMSTAR). To evaluate the performance of AMSTAR and complement human-in-the-loop experiments, a simulation environment has been developed that enables fast-time studies of AMSTAR operations. The environment is based on TMX, a multiple aircraft desktop simulation program created by the Netherlands National Aerospace Laboratory (NLR). This paper reviews the AMSTAR concept, discusses the integration of the AMSTAR algorithm into TMX and the enhancements added to TMX to support fast-time AMSTAR studies, and presents initial simulation results.
MODFLOW-2005 : the U.S. Geological Survey modular ground-water model--the ground-water flow process
Harbaugh, Arlen W.
2005-01-01
This report presents MODFLOW-2005, which is a new version of the finite-difference ground-water model commonly called MODFLOW. Ground-water flow is simulated using a block-centered finite-difference approach. Layers can be simulated as confined or unconfined. Flow associated with external stresses, such as wells, areal recharge, evapotranspiration, drains, and rivers, also can be simulated. The report includes detailed explanations of physical and mathematical concepts on which the model is based, an explanation of how those concepts are incorporated in the modular structure of the computer program, instructions for using the model, and details of the computer code. The modular structure consists of a MAIN Program and a series of highly independent subroutines. The subroutines are grouped into 'packages.' Each package deals with a specific feature of the hydrologic system that is to be simulated, such as flow from rivers or flow into drains, or with a specific method of solving the set of simultaneous equations resulting from the finite-difference method. Several solution methods are incorporated, including the Preconditioned Conjugate-Gradient method. The division of the program into packages permits the user to examine specific hydrologic features of the model independently. This also facilitates development of additional capabilities because new packages can be added to the program without modifying the existing packages. The input and output systems of the computer program also are designed to permit maximum flexibility. The program is designed to allow other capabilities, such as transport and optimization, to be incorporated, but this report is limited to describing the ground-water flow capability. The program is written in Fortran 90 and will run without modification on most computers that have a Fortran 90 compiler.
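The block-centered finite-difference idea can be illustrated with a minimal one-dimensional steady-state head computation; this is a sketch of the numerical approach only, not MODFLOW's Fortran code, and the cell count and boundary heads are invented:

```python
# Minimal 1-D steady-state finite-difference head computation in the
# spirit of a block-centered scheme; a sketch, not MODFLOW's code.
def solve_heads(n_cells, h_left, h_right, n_iter=5000):
    """Relax Laplace's equation with fixed heads at both ends; the
    interior converges to the linear head gradient."""
    h = [h_left] + [0.0] * (n_cells - 2) + [h_right]
    for _ in range(n_iter):
        for i in range(1, n_cells - 1):
            h[i] = 0.5 * (h[i - 1] + h[i + 1])  # Gauss-Seidel update
    return h

heads = solve_heads(11, h_left=100.0, h_right=90.0)
```

MODFLOW's packages layer stresses (wells, rivers, recharge) onto the same finite-difference equations and solve them with more capable methods such as the Preconditioned Conjugate-Gradient solver.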
Existing Fortran interfaces to Trilinos in preparation for exascale ForTrilinos development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Katherine J.; Young, Mitchell T.; Collins, Benjamin S.
This report summarizes the current state of Fortran interfaces to the Trilinos library within several key applications of the Exascale Computing Program (ECP), with the aim of informing developers about strategies to develop ForTrilinos, an exascale-ready, Fortran interface software package within Trilinos. The two software projects assessed within are the DOE Office of Science's Accelerated Climate Model for Energy (ACME) atmosphere component, CAM, and the DOE Office of Nuclear Energy's core-simulator portion of VERA, a nuclear reactor simulation code. Trilinos is an object-oriented, C++ based software project, and spans a collection of algorithms and other enabling technologies such as uncertainty quantification and mesh generation. To date, Trilinos has enabled these codes to achieve large-scale simulation results, however the simulation needs of CAM and VERA-CS will approach exascale over the next five years. A Fortran interface to Trilinos that enables efficient use of programming models and more advanced algorithms is necessary. Where appropriate, the needs of the CAM and VERA-CS software to achieve their simulation goals are called out specifically. With this report, a design document and execution plan for ForTrilinos development can proceed.
NASA Technical Reports Server (NTRS)
Follen, Gregory J.; Naiman, Cynthia G.
1999-01-01
The NASA Lewis Research Center is developing an environment for analyzing and designing aircraft engines: the Numerical Propulsion System Simulation (NPSS). NPSS will integrate multiple disciplines, such as aerodynamics, structure, and heat transfer, and will make use of numerical "zooming" on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS uses the latest computing and communication technologies to capture complex physical processes in a timely, cost-effective manner. The vision of NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Through the NASA/Industry Cooperative Effort agreement, NASA Lewis and industry partners are developing a new engine simulation called the National Cycle Program (NCP). NCP, which is the first step toward NPSS and is its initial framework, supports the aerothermodynamic system simulation process for the full life cycle of an engine. U.S. aircraft and airframe companies recognize NCP as the future industry standard common analysis tool for aeropropulsion system modeling. The estimated potential payoff for NCP is a $50 million/yr savings to industry through improved engineering productivity.
Java-based Graphical User Interface for MAVERIC-II
NASA Technical Reports Server (NTRS)
Seo, Suk Jai
2005-01-01
A computer program entitled "Marshall Aerospace Vehicle Representation in C II, (MAVERIC-II)" is a vehicle flight simulation program written primarily in the C programming language. It is written by James W. McCarter at NASA/Marshall Space Flight Center. The goal of the MAVERIC-II development effort is to provide a simulation tool that facilitates the rapid development of high-fidelity flight simulations for launch, orbital, and reentry vehicles of any user-defined configuration for all phases of flight. MAVERIC-II has been found invaluable in performing flight simulations for various Space Transportation Systems. The flexibility provided by MAVERIC-II has allowed several different launch vehicles, including the Saturn V, a Space Launch Initiative Two-Stage-to-Orbit concept and a Shuttle-derived launch vehicle, to be simulated during ascent and portions of on-orbit flight in an extremely efficient manner. It was found that MAVERIC-II provided the high fidelity vehicle and flight environment models as well as the program modularity to allow efficient integration, modification and testing of advanced guidance and control algorithms. In addition to serving as an analysis tool for technology development, many researchers have found MAVERIC-II to be an efficient, powerful analysis tool that evaluates guidance, navigation, and control designs, vehicle robustness, and requirements. MAVERIC-II is currently designed to execute in a UNIX environment. The input to the program is composed of three segments: 1) the vehicle models such as propulsion, aerodynamics, and guidance, navigation, and control; 2) the environment models such as atmosphere and gravity; and 3) a simulation framework which is responsible for executing the vehicle and environment models, propagating the vehicle's states forward in time, and handling user input/output. MAVERIC users prepare data files for the above models and run the simulation program.
They can see the output on screen and/or store it in files and examine the output data later. Users can also view the output stored in output files by calling a plotting program such as gnuplot. A typical scenario of the use of MAVERIC consists of three steps: editing existing input data files, running MAVERIC, and plotting output results.
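The division of labor described above (vehicle models, environment models, and a framework that executes them and propagates states forward in time) can be sketched generically; the model names and simple forward-Euler integrator below are illustrative assumptions, not MAVERIC-II internals:

```python
# Generic simulation-framework loop: run environment and vehicle models,
# then propagate states forward in time. Model names and the forward-Euler
# integrator are illustrative assumptions, not MAVERIC-II code.
def run(state, models, dt, n_steps):
    """state: dict of state variables; each model maps state -> derivatives."""
    for _ in range(n_steps):
        derivs = {}
        for model in models:              # framework executes every model
            derivs.update(model(state))
        for var, d in derivs.items():     # then propagates the states
            state[var] += dt * d
    return state

gravity = lambda s: {"v": -9.81}          # environment model (invented)
kinematics = lambda s: {"h": s["v"]}      # vehicle model (invented)
final = run({"h": 0.0, "v": 100.0}, [gravity, kinematics], dt=0.1, n_steps=10)
```

Keeping models as interchangeable callables is what lets a framework of this shape swap in different vehicles or environments without touching the propagation loop.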
Design and Simulation of an Electrothermal Actuator Based Rotational Drive
NASA Astrophysics Data System (ADS)
Beeson, Sterling; Dallas, Tim
2008-10-01
As a participant in the Micro and Nano Device Engineering (MANDE) Research Experience for Undergraduates program at Texas Tech University, I learned how MEMS devices operate and the limits of their operation. Using specialized AutoCAD-based design software and the ANSYS simulation program, I learned the MEMS fabrication process used at Sandia National Labs, the design limitations of this process, the abilities and drawbacks of micro devices, and finally, I redesigned a MEMS device called the Chevron Torsional Ratcheting Actuator (CTRA). Motion is achieved through electrothermal actuation. The chevron (bent-beam) actuators cause a ratcheting motion on top of a hub-less gear so that as voltage is applied the CTRA spins. The voltage applied needs to be pulsed, and the frequency of the pulses determines the angular frequency of the device. The main objective was to design electromechanical structures capable of transforming the electrical signals into mechanical motion without overheating. The design was optimized using finite element analysis in ANSYS allowing multi-physics simulations of our model system.
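Because the ratchet advances a fixed angle per voltage pulse, the angular speed scales linearly with pulse frequency. The step angle below is an invented illustration, not the CTRA's actual geometry:

```python
# The ratchet advances a fixed angle per voltage pulse, so angular speed
# scales linearly with pulse frequency. The step angle is an invented
# illustration, not the CTRA's actual geometry.
def angular_speed_deg_per_s(pulse_hz, deg_per_step=1.5):
    """Rotation rate in degrees/second for a given pulse frequency."""
    return pulse_hz * deg_per_step

w = angular_speed_deg_per_s(200.0)  # 200 Hz pulse train
```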
NASA Technical Reports Server (NTRS)
Wilkenfeld, J. M.; Judge, R. J. R.; Harlacher, B. L.
1982-01-01
A combined experimental and analytical program to develop system electrical test procedures for the qualification of spacecraft against damage produced by space-electron-induced discharges (EID) occurring on spacecraft dielectric outer surfaces is described. The data on the response of a simple satellite model, called CAN, to electron-induced discharges is presented. The experimental results were compared to predicted behavior and to the response of the CAN to electrical injection techniques simulating blowoff and arc discharges. Also included is a review of significant results from other ground tests and the P78-2 program to form a data base from which is specified those test procedures which optimally simulate the response of spacecraft to EID. The electrical and electron spraying test data were evaluated to provide a first-cut determination of the best methods for performance of electrical excitation qualification tests from the point of view of simulation fidelity.
NASA Astrophysics Data System (ADS)
Lourderaj, Upakarasamy; Sun, Rui; Kohale, Swapnil C.; Barnes, George L.; de Jong, Wibe A.; Windus, Theresa L.; Hase, William L.
2014-03-01
The interface for VENUS and NWChem, and the resulting software package for direct dynamics simulations are described. The coupling of the two codes is considered to be a tight coupling since the two codes are compiled and linked together and act as one executable with data being passed between the two codes through routine calls. The advantages of this type of coupling are discussed. The interface has been designed to have as little interference as possible with the core codes of both VENUS and NWChem. VENUS is the code that propagates the direct dynamics trajectories and, therefore, is the program that drives the overall execution of VENUS/NWChem. VENUS has remained an essentially sequential code, which uses the highly parallel structure of NWChem. Subroutines of the interface that accomplish the data transmission and communication between the two computer programs are described. Recent examples of the use of VENUS/NWChem for direct dynamics simulations are summarized.
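The tight-coupling pattern, in which the dynamics driver obtains a gradient by routine call at every integration step, can be sketched with a toy potential standing in for NWChem; all names and numbers below are illustrative assumptions:

```python
# Sketch of the tight-coupling pattern: the dynamics driver requests a
# gradient by routine call at every step. The harmonic "qm_gradient"
# stands in for NWChem; all numbers are illustrative.
def qm_gradient(x, k=1.0):
    """Toy electronic-structure call: gradient of V(x) = 0.5*k*x**2."""
    return k * x

def propagate(x, v, dt, n_steps, mass=1.0):
    """Velocity-Verlet trajectory propagation, calling into the 'QM code'
    once per step, as a VENUS-style driver does with NWChem."""
    g = qm_gradient(x)
    for _ in range(n_steps):
        v -= 0.5 * dt * g / mass
        x += dt * v
        g = qm_gradient(x)     # routine call replaces file-based coupling
        v -= 0.5 * dt * g / mass
    return x, v

x, v = propagate(1.0, 0.0, dt=0.01, n_steps=628)  # about one oscillator period
```

Passing data through routine calls within one executable avoids the file I/O and process launches of loosely coupled interfaces, which matters when a gradient is needed every step.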
NASA Astrophysics Data System (ADS)
Himr, D.
2013-04-01
This article describes the simulation of unsteady flow during water hammer with two programs that use different numerical approaches to solve the one-dimensional differential equations describing the dynamics of hydraulic elements and pipes. The first is Matlab-Simulink-SimHydraulics, a commercial software package developed to solve the dynamics of general hydraulic systems, which defines them with block elements. The other software is called HYDRA and is based on the Lax-Wendroff numerical method, which serves as a tool to solve the momentum and continuity equations. This program was developed in Matlab by Brno University of Technology. Experimental measurements were performed on a simple test rig, which consists of an elastic pipe with strong damping connecting two reservoirs. Water hammer is induced by rapidly closing the valve. Physical properties of the liquid and pipe elasticity parameters were considered in both simulations, which are in very good agreement with each other, and differences in comparison with experimental data are minimal.
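For reference, the Lax-Wendroff update can be shown on the scalar advection equation u_t + a u_x = 0; the water-hammer momentum and continuity equations form a coupled hyperbolic system of the same family, so this scalar sketch is an illustration of the scheme only, not HYDRA's code:

```python
# Lax-Wendroff update on the scalar advection equation u_t + a*u_x = 0
# with periodic boundaries; an illustration of the scheme, not HYDRA.
def lax_wendroff_step(u, c):
    """One step with Courant number c = a*dt/dx."""
    n = len(u)
    return [
        u[i]
        - 0.5 * c * (u[(i + 1) % n] - u[i - 1])
        + 0.5 * c * c * (u[(i + 1) % n] - 2.0 * u[i] + u[i - 1])
        for i in range(n)
    ]

# Square pulse advected once around a 40-cell periodic domain.
u0 = [1.0 if 4 <= i < 8 else 0.0 for i in range(40)]
u = u0[:]
for _ in range(40):
    u = lax_wendroff_step(u, c=1.0)  # at c = 1 the scheme is exact
```

The second-difference term is what gives Lax-Wendroff its second-order accuracy; at a Courant number of exactly 1 the update reduces to a pure shift, so the pulse returns to its starting position after one trip around the domain.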
Platform-Independence and Scheduling In a Multi-Threaded Real-Time Simulation
NASA Technical Reports Server (NTRS)
Sugden, Paul P.; Rau, Melissa A.; Kenney, P. Sean
2001-01-01
Aviation research often relies on real-time, pilot-in-the-loop flight simulation as a means to develop new flight software, flight hardware, or pilot procedures. Often these simulations become so complex that a single processor is incapable of performing the necessary computations within a fixed time-step. Threads are an elegant means to distribute the computational work-load when running on a symmetric multi-processor machine. However, programming with threads often requires operating system specific calls that reduce code portability and maintainability. While a multi-threaded simulation allows a significant increase in the simulation complexity, it also increases the workload of a simulation operator by requiring that the operator determine which models run on which thread. To address these concerns an object-oriented design was implemented in the NASA Langley Standard Real-Time Simulation in C++ (LaSRS++) application framework. The design provides a portable and maintainable means to use threads and also provides a mechanism to automatically load balance the simulation models.
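Automatic load balancing of models across threads can be sketched as a greedy assignment problem; the longest-processing-time-first policy and model costs below are standard illustrative choices, not necessarily what LaSRS++ implements:

```python
import heapq

# Greedy longest-processing-time-first assignment of models to threads:
# a standard load-balancing heuristic used here for illustration, not
# necessarily the policy LaSRS++ implements. Model costs are invented.
def balance(models, n_threads):
    """models: (name, cost) pairs. Returns a list of (load, thread_id,
    assigned_names); each model goes to the lightest-loaded thread."""
    heap = [(0.0, t, []) for t in range(n_threads)]
    heapq.heapify(heap)
    for name, cost in sorted(models, key=lambda m: -m[1]):
        load, t, names = heapq.heappop(heap)  # lightest-loaded thread
        names.append(name)
        heapq.heappush(heap, (load + cost, t, names))
    return sorted(heap)

models = [("aero", 5.0), ("engines", 4.0), ("gear", 3.0),
          ("actuators", 3.0), ("atmosphere", 2.0), ("avionics", 1.0)]
threads = balance(models, 2)
```

Automating this assignment is exactly what spares the simulation operator from deciding by hand which models run on which thread.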
Electron tunneling in proteins program.
Hagras, Muhammad A; Stuchebrukhov, Alexei A
2016-06-05
We developed a unique integrated software package (called the Electron Tunneling in Proteins Program, or ETP) which provides an environment with different capabilities, such as tunneling current calculation, semi-empirical quantum mechanical calculation, and molecular modeling simulation, for the calculation and analysis of electron transfer reactions in proteins. The ETP program is developed as a cross-platform client-server program in which all the different calculations are conducted on the server side while the client terminal displays the resulting calculation outputs in the different supported representations. The ETP program is integrated with a set of well-known computational software packages including Gaussian, BALLVIEW, Dowser, pKip, and APBS. In addition, the ETP program supports various visualization methods for the tunneling calculation results that assist in a more comprehensive understanding of the tunneling process. © 2016 Wiley Periodicals, Inc.
Communicating Value in Simulation: Cost Benefit Analysis and Return on Investment.
Asche, Carl V; Kim, Minchul; Brown, Alisha; Golden, Antoinette; Laack, Torrey A; Rosario, Javier; Strother, Christopher; Totten, Vicken Y; Okuda, Yasuharu
2017-10-26
Value-based health care requires a balancing of medical outcomes with economic value. Administrators need to understand both the clinical and economic effects of potentially expensive simulation programs to rationalize the costs. Given the often-disparate priorities of clinical educators relative to health care administrators, justifying the value of simulation requires the use of economic analyses few physicians have been trained to conduct. Clinical educators need to be able to present thorough economic analyses demonstrating returns on investment and cost effectiveness to effectively communicate with administrators. At the 2017 Academic Emergency Medicine Consensus Conference "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes", our breakout session critically evaluated the cost benefit and return on investment of simulation. In this paper we provide an overview of some of the economic tools that a clinician may use to present the value of simulation training to financial officers and other administrators in the economic terms they understand. We also define three themes as a call to action for research related to cost benefit analysis in simulation as well as four specific research questions that will help guide educators and hospital leadership to make decisions on the value of simulation for their system or program. This article is protected by copyright. All rights reserved.
Adaptation of a program for nonlinear finite element analysis to the CDC STAR 100 computer
NASA Technical Reports Server (NTRS)
Pifko, A. B.; Ogilvie, P. L.
1978-01-01
The conversion of a nonlinear finite element program to the CDC STAR 100 pipeline computer is discussed. The program, called DYCAST, was developed for the crash simulation of structures. Initial results with the STAR 100 computer indicated that significant gains in computation time are possible for operations on global arrays. However, for element-level computations that do not lend themselves easily to long vector processing, the STAR 100 was slower than comparable scalar computers. On this basis it is concluded that in order for pipeline computers to impact the economic feasibility of large nonlinear analyses, it is absolutely essential that algorithms be devised to improve the efficiency of element-level computations.
Effect of Spatial Locality Prefetching on Structural Locality
1991-12-01
Pollution module calculates the SLC and CAM cache pollution percentages. And finally, the Generate Reference Frequency List module produces the output... Each program module in the structure chart is mapped into an Ada package. By performing this encapsulation... -- call routine to generate reference frequency list -- end if -- end loop -- close input, output, and reference files -- end Cache Simulator (Figure 3.5)
Preliminary Guidelines for Installation Product Line Land Management Suite (LMS) Product Developers
2005-01-01
land use patterns might call a storm simulation model available as a CDF service to evaluate the ability of the pattern to maintain water quality ... [Figure 10: A Fort Future DIAS model GUI, showing Analysis GIS data, server-internal and external DIAS objects, and external CDF services] ... Are Programs that Analyze Data Being Developed as CDF Services
ALMA from the Users' Perspective
NASA Astrophysics Data System (ADS)
Johnson, Kelsey
2010-05-01
After decades of dreaming and preparation, the call for early science with ALMA is just around the corner. The goal of this talk is to illustrate the process of preparing and carrying out a research program with ALMA. This presentation will step through the user interface for proposal preparation, proposal review, project tracking, data acquisition, and post-processing. Examples of the software tools, including the simulator and spectral line catalog, will be included.
Prudic, David E.
1989-01-01
Computer models are widely used to simulate groundwater flow for evaluating and managing the groundwater resources of many aquifers, but few are designed to also account for surface flow in streams. A computer program was written for use in the U.S. Geological Survey modular finite-difference groundwater flow model to account for the amount of flow in streams and to simulate the interaction between surface streams and groundwater. The new program is called the Streamflow-Routing Package. The Streamflow-Routing Package is not a true surface-water flow model, but rather an accounting program that tracks the flow in one or more streams which interact with groundwater. The program limits the amount of groundwater recharge to the available streamflow. It permits two or more streams to merge into one, with flow in the merged stream equal to the sum of the tributary flows. The program also permits diversions from streams. The groundwater flow model with the Streamflow-Routing Package has an advantage over the analytical solution in simulating the interaction between aquifer and stream because it can be used to simulate complex systems that cannot be readily solved analytically. The Streamflow-Routing Package does not include a time function for streamflow; rather, streamflow entering the modeled area is assumed to be instantly available to downstream reaches during each time period. This assumption is generally reasonable because of the relatively slow rate of groundwater flow. Another assumption is that leakage between streams and aquifers is instantaneous. This assumption may not be reasonable if the streams and aquifers are separated by a thick unsaturated zone. Documentation of the Streamflow-Routing Package includes data input instructions; flow charts, narratives, and listings of the computer program for each of four modules; and input data sets and printed results for two test problems and one example problem.
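The accounting logic the abstract describes (recharge capped at the available streamflow, tributaries summing at a junction, diversions) can be sketched in a few lines. This is an illustrative sketch only; the function names, units, and numbers are invented and are not taken from the USGS package.

```python
# Hypothetical sketch of a streamflow-accounting step: flow is routed
# reach by reach, tributaries sum at junctions, and leakage to the
# aquifer is capped at the flow actually available in the stream.

def route_reach(inflow, leakage_demand, diversion=0.0):
    """Return (outflow, actual_leakage) for one stream reach.

    Leakage (groundwater recharge) is limited to the streamflow left
    after any diversion, so the stream never loses more than it carries.
    """
    available = max(inflow - diversion, 0.0)
    leakage = min(leakage_demand, available)
    return available - leakage, leakage

# Two tributaries merge: flow in the merged stream is the sum of the
# tributary outflows.
out_a, _ = route_reach(10.0, leakage_demand=3.0)
out_b, _ = route_reach(5.0, leakage_demand=8.0)   # demand exceeds flow
merged_inflow = out_a + out_b

out_c, leak_c = route_reach(merged_inflow, leakage_demand=2.0, diversion=1.0)
```

Note how the second reach leaks only the 5.0 units it carries even though 8.0 were demanded, which is the "recharge limited to available streamflow" rule in the abstract.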
EngineSim: Turbojet Engine Simulator Adapted for High School Classroom Use
NASA Technical Reports Server (NTRS)
Petersen, Ruth A.
2001-01-01
EngineSim is an interactive educational computer program that allows users to explore the effect of engine operation on total aircraft performance. The software is supported by a basic propulsion web site called the Beginner's Guide to Propulsion, which includes educator-created, web-based activities for the classroom use of EngineSim. In addition, educators can schedule videoconferencing workshops in which EngineSim's creator demonstrates the software and discusses its use in the educational setting. This software is a product of NASA Glenn Research Center's Learning Technologies Project, an educational outreach initiative within the High Performance Computing and Communications Program.
Simulating the impacts of fire: A computer program
NASA Astrophysics Data System (ADS)
Ffolliott, Peter F.; Guertin, D. Phillip; Rasmussen, William D.
1988-11-01
Recurrent fire has played a dominant role in the ecology of southwestern ponderosa pine forests. To assess the benefits or losses of fire in these forests, a computer simulation model, called BURN, considers vegetation (mortality, regeneration, and production of herbaceous vegetation), wildlife (populations and habitats), and hydrology (streamflow and water quality). In the formulation of the model, graphical representations (time-trend response curves) of increases or losses (compared to an unburned control) after the occurrence of fire are converted to fixed-term annual ratios, and then to annuities for the simulation components. Annuity values higher than 1.0 indicate benefits, while annuity values lower than 1.0 indicate losses. Studies in southwestern ponderosa pine forests utilized in the development of BURN are described briefly.
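The ratio-to-annuity step can be illustrated with a small sketch. The discount rate, the ratio series, and the use of a discounted average are assumptions made for illustration; the actual BURN formulation is not reproduced here.

```python
# Illustrative sketch (not the actual BURN code): a fixed-term series of
# annual post-fire/control ratios is collapsed into one annuity value.
# A discounted average is assumed, so a result above 1.0 indicates a net
# benefit over the term and below 1.0 a net loss. Rate is hypothetical.

def annuity(annual_ratios, rate=0.04):
    """Discounted average of annual response ratios over a fixed term."""
    pv = sum(r / (1 + rate) ** t for t, r in enumerate(annual_ratios, start=1))
    # Annuity factor: present value of 1.0 per year over the same term.
    factor = sum(1 / (1 + rate) ** t for t in range(1, len(annual_ratios) + 1))
    return pv / factor

herbage = annuity([1.4, 1.2, 1.1, 1.0, 1.0])    # early post-fire flush
wildlife = annuity([0.6, 0.8, 0.9, 1.0, 1.0])   # temporary habitat loss
```

An all-1.0 series (no change from the unburned control) yields an annuity of exactly 1.0, which matches the benefit/loss threshold described in the abstract.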
NASA Technical Reports Server (NTRS)
Reznick, Steve
1988-01-01
Transonic Euler/Navier-Stokes computations are accomplished for wing-body flow fields using a computer program called Transonic Navier-Stokes (TNS). The wing-body grids are generated using a program called ZONER, which subdivides a coarse grid about a fighter-like aircraft configuration into smaller zones, which are tailored to local grid requirements. These zones can be either finely clustered for capture of viscous effects, or coarsely clustered for inviscid portions of the flow field. Different equation sets may be solved in the different zone types. This modular approach also affords the opportunity to modify a local region of the grid without recomputing the global grid. This capability speeds up the design optimization process when quick modifications to the geometry definition are desired. The solution algorithm embodied in TNS is implicit, and is capable of capturing pressure gradients associated with shocks. The algebraic turbulence model employed has proven adequate for viscous interactions with moderate separation. Results confirm that the TNS program can successfully be used to simulate transonic viscous flows about complicated 3-D geometries.
Additions to Mars Global Reference Atmospheric Model (MARS-GRAM)
NASA Technical Reports Server (NTRS)
Justus, C. G.; James, Bonnie
1992-01-01
Three major additions or modifications were made to the Mars Global Reference Atmospheric Model (Mars-GRAM): (1) in addition to the interactive version, a new batch version is available, which uses NAMELIST input, and is completely modular, so that the main driver program can easily be replaced by any calling program, such as a trajectory simulation program; (2) both the interactive and batch versions now have an option for treating local-scale dust storm effects, rather than just the global-scale dust storms in the original Mars-GRAM; and (3) the Zurek wave perturbation model was added, to simulate the effects of tidal perturbations, in addition to the random (mountain wave) perturbation model of the original Mars-GRAM. A minor modification was also made which allows heights to go 'below' local terrain height and return 'realistic' pressure, density, and temperature, and not the surface values, as returned by the original Mars-GRAM. This feature will allow simulations of Mars rover paths which might go into local 'valley' areas which lie below the average height of the present, rather coarse-resolution, terrain height data used by Mars-GRAM. Sample input and output of both the interactive and batch versions of Mars-GRAM are presented.
Additions to Mars Global Reference Atmospheric Model (Mars-GRAM)
NASA Technical Reports Server (NTRS)
Justus, C. G.
1991-01-01
Three major additions or modifications were made to the Mars Global Reference Atmospheric Model (Mars-GRAM): (1) in addition to the interactive version, a new batch version is available, which uses NAMELIST input, and is completely modular, so that the main driver program can easily be replaced by any calling program, such as a trajectory simulation program; (2) both the interactive and batch versions now have an option for treating local-scale dust storm effects, rather than just the global-scale dust storms in the original Mars-GRAM; and (3) the Zurek wave perturbation model was added, to simulate the effects of tidal perturbations, in addition to the random (mountain wave) perturbation model of the original Mars-GRAM. A minor modification has also been made which allows heights to go below local terrain height and return realistic pressure, density, and temperature (not the surface values) as returned by the original Mars-GRAM. This feature will allow simulations of Mars rover paths which might go into local valley areas which lie below the average height of the present, rather coarse-resolution, terrain height data used by Mars-GRAM. Sample input and output of both the interactive and batch version of Mars-GRAM are presented.
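The modular-driver design described in both Mars-GRAM abstracts, where the main driver can be replaced by any calling program such as a trajectory simulation, can be sketched as follows. The exponential density profile, the constants, and the function names are stand-ins for illustration, not the real Mars-GRAM physics or interface.

```python
# Illustrative sketch of the modular-driver idea: the atmosphere model
# is a plain function of position, so the interactive driver can be
# swapped for any calling program. Profile constants are assumed values.
import math

def atmosphere(height_m):
    """Toy density/temperature profile (invented, not Mars-GRAM)."""
    rho0, scale_h = 0.020, 11000.0      # surface density kg/m^3, scale height m
    return {"density": rho0 * math.exp(-height_m / scale_h),
            "temperature": 210.0 - 0.001 * height_m}

def trajectory_driver(heights):
    """Any calling program can drive the model: here, a descent profile."""
    return [atmosphere(h)["density"] for h in heights]

densities = trajectory_driver([30000.0, 10000.0, 0.0])
```

Because the model is just a callable, the same function serves an interactive driver, a batch driver, or a rover-path simulation without modification, which is the point of the batch/NAMELIST restructuring the abstracts describe.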
Ligand-protein docking using a quantum stochastic tunneling optimization method.
Mancera, Ricardo L; Källblad, Per; Todorov, Nikolay P
2004-04-30
A novel hybrid optimization method called quantum stochastic tunneling has recently been introduced. Here, we report its implementation within a new docking program called EasyDock and a validation with the CCDC/Astex data set of ligand-protein complexes, using the PLP score to represent the ligand-protein potential energy surface and ScreenScore to score the ligand-protein binding energies. When taking the top energy-ranked ligand binding pose, we were able to predict the correct crystallographic ligand binding mode in up to 75% of the cases. By using this novel optimization method, run times for typical docking simulations are significantly shortened. Copyright 2004 Wiley Periodicals, Inc. J Comput Chem 25: 858-864, 2004
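For readers unfamiliar with the underlying idea, classical stochastic tunneling (the non-quantum ancestor of the method named in the abstract) can be sketched as a Metropolis search on a transformed energy surface: mapping the energy through f = 1 - exp(-gamma * (E - E_best)) flattens all barriers above the best energy found so far. Everything below, including the parameters and the double-well test function, is illustrative and is not the EasyDock implementation.

```python
# Hedged sketch of classical stochastic tunneling: Metropolis sampling
# on a surface transformed relative to the running best energy, so the
# walker can cross high barriers between minima. Parameters are invented.
import math, random

def stun_minimize(energy, x0, steps=5000, gamma=1.0, beta=2.0, step=0.5, seed=1):
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for _ in range(steps):
        xn = x + rng.uniform(-step, step)
        en = energy(xn)
        if en < best_e:
            best_x, best_e = xn, en
        # Transformed energies, measured from the best energy so far
        f_old = 1.0 - math.exp(-gamma * (e - best_e))
        f_new = 1.0 - math.exp(-gamma * (en - best_e))
        if f_new <= f_old or rng.random() < math.exp(-beta * (f_new - f_old)):
            x, e = xn, en
    return best_x, best_e

# Double-well test function with minima near x = +/-2
def well(x):
    return (x * x - 4.0) ** 2 + 0.5 * x

bx, be = stun_minimize(well, x0=2.0)
```

Because the transformation saturates near 1.0 well above the current best, regions far above the best minimum look nearly flat to the sampler, which is what shortens search times on rugged surfaces such as docking energy landscapes.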
To call or not to call--that is the question (while driving).
Tractinsky, Noam; Ram, Efrat Soffer; Shinar, David
2013-07-01
We studied whether decisions to engage in cell phone conversation while driving and the consequences of such decisions are related to the driver's age, to the road conditions (demands of the driving task), and to the driver's role in initiating the phone call (i.e. the driver as caller vs. as receiver). Two experiments were performed in a driving simulator in which driver age, road conditions and phone conversation, as a secondary task, were manipulated. Engagement in cell phone conversations, performance in the driving and the conversation tasks, and subjective effort assessment were recorded. In general, drivers were more willing to accept incoming calls than to initiate calls. In addition, older and younger drivers were more susceptible to the deleterious effects of phone conversations while driving than middle aged/experienced drivers. While older drivers were aware of this susceptibility by showing sensitivity to road conditions before deciding whether to engage in a call or not, young drivers showed no such sensitivity. The results can guide the development of young driver training programs and point at the need to develop context-aware management systems of in-vehicle cell phone conversations. Copyright © 2013 Elsevier Ltd. All rights reserved.
Drewes, Rich; Zou, Quan; Goodman, Philip H
2009-01-01
Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading "glue" tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS.
Drewes, Rich; Zou, Quan; Goodman, Philip H.
2008-01-01
Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading “glue” tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS. PMID:19506707
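The "glue" role the Brainlab abstract describes, plain Python data structures expanded into simulator runs whose results are collected for analysis, can be illustrated with a toy sketch. The parameter names and the `run_stub` function are invented; the real NCS interface is not shown.

```python
# Hypothetical sketch of Python-as-glue experiment management: a
# parameter grid of model variants is expanded into per-run settings,
# each "run" is a stand-in for launching the simulator, and results are
# gathered into one list for analysis.
import itertools

def make_jobs(grid):
    """Expand a parameter grid into a list of per-run settings dicts."""
    keys = sorted(grid)
    return [dict(zip(keys, vals))
            for vals in itertools.product(*(grid[k] for k in keys))]

def run_stub(params):
    # Placeholder for configuring and launching the simulator.
    return {"params": params, "spikes": params["cells"] * params["rate_hz"]}

grid = {"cells": [100, 200], "rate_hz": [5, 10]}
results = [run_stub(p) for p in make_jobs(grid)]
```

Keeping variants, stimuli, and analysis in one scripting language is exactly the complexity-management argument the abstract makes for Python.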
Markstrom, Steven L.
2012-01-01
A software program, called P2S, has been developed which couples the daily stream temperature simulation capabilities of the U.S. Geological Survey Stream Network Temperature model with the watershed hydrology simulation capabilities of the U.S. Geological Survey Precipitation-Runoff Modeling System. The Precipitation-Runoff Modeling System is a modular, deterministic, distributed-parameter, physical-process watershed model that simulates hydrologic response to various combinations of climate and land use. Stream Network Temperature was developed to help aquatic biologists and engineers predict the effects of changes that hydrology and energy have on water temperatures. P2S will allow scientists and watershed managers to evaluate the effects of historical climate and projected climate change, landscape evolution, and resource management scenarios on watershed hydrology and in-stream water temperature.
Development of thyroid anthropomorphic phantoms for use in nuclear medicine
NASA Astrophysics Data System (ADS)
Cerqueira, R. A. D.; Maia, A. F.
2014-02-01
The objective of this study was to develop thyroid anthropomorphic phantoms to be used in control tests of medical images in scintillation cameras. The main difference between the phantoms was the neck shape: in the first, called OSCT, it was geometrically shaped, while in the second, called OSAP, it was anthropomorphically shaped. In both phantoms, thyroid gland prototypes, which were made of acrylic and anthropomorphically shaped, were constructed to allow the simulation of a healthy thyroid and of thyroids with hyperthyroidism and hypothyroidism. Images of these thyroid anthropomorphic phantoms were obtained using iodine-131 with an activity of 8.695 MBq. Iodine-131 was chosen because it is widely used in studies of thyroid scintigraphy. The images obtained proved the effectiveness of the phantoms in simulating normal or abnormal thyroid function. These phantoms can be used in medical imaging quality control programs and also in the training of professionals involved in the analysis of images in nuclear medicine centers.
Robison, Weston; Patel, Sonya K; Mehta, Akshat; Senkowski, Tristan; Allen, John; Shaw, Eric; Senkowski, Christopher K
2018-03-01
To study the effects of fatigue on general surgery residents' performance on the da Vinci Skills Simulator (dVSS). 15 General Surgery residents from various postgraduate training years (PGY2, PGY3, PGY4, and PGY5) performed 5 simulation tasks on the dVSS as recommended by the Robotic Training Network (RTN). The General Surgery residents had no prior experience with the dVSS. Participants were assigned to either the Pre-call group or Post-call group based on call schedule. As a measure of subjective fatigue, residents were given the Epworth Sleepiness Scale (ESS) prior to their dVSS testing. The dVSS MScore™ software recorded various metrics (Objective Structured Assessment of Technical Skills, OSATS) that were used to evaluate the performance of each resident to compare the robotic simulation proficiency between the Pre-call and Post-call groups. Six general surgery residents were stratified into the Pre-call group and nine into the Post-call group. These residents were also stratified into Fatigued (10) or Nonfatigued (5) groups, as determined by their reported ESS scores. A statistically significant difference was found between the Pre-call and Post-call reported sleep hours (p = 0.036). There was no statistically significant difference between the Pre-call and Post-call groups or between the Fatigued and Nonfatigued groups in time to complete exercise, number of attempts, and high MScore™ score. Despite variation in fatigue levels, there was no effect on the acquisition of robotic simulator skills.
Chung, Catherine; Cooper, Simon J; Cant, Robyn P; Connell, Cliff; McKay, Angela; Kinsman, Leigh; Gazula, Swapnali; Boyle, Jayne; Cameron, Amanda; Cash, Penny; Evans, Lisa; Kim, Jeong-Ah; Masud, Rana; McInnes, Denise; Norman, Lisa; Penz, Erika; Rotter, Thomas; Tanti, Erin; Breakspear, Tom
2018-05-01
There are international concerns relating to the management of patient deterioration. The "failure to rescue" literature identifies that nursing staff miss cues of deterioration and often fail to call for assistance. Simulation-based educational approaches may improve nurses' recognition and management of patient deterioration. The aim was to investigate the educational impact of the First2Act web-based (WB) and face-to-face (F2F) simulation programs, in a mixed-methods interventional cohort trial with nursing staff working in four public and private hospital medical wards in the State of Victoria, Australia. In 2016, ward nursing staff (n = 74) from a public and a private hospital completed three F2F laboratory-based team simulations with a patient actor, in teams of three; 56 nursing staff from another public and private hospital individually completed a three-scenario WB simulation program (First2ActWeb), a 91% participation rate. Validated tools were used to measure knowledge (multiple-choice questionnaire), competence (checklist of actions) and confidence (self-rated) before and after the intervention. Both WB and F2F participants' knowledge, competence and confidence increased significantly after training (p ≤ 0.001). Skill performance for the WB group increased significantly from 61% to 74% (p ≤ 0.05) and correlated significantly with post-test knowledge (p = 0.014). No change was seen in the F2F group's performance scores. Course evaluations were positive, with median ratings of 4/5 (WB) and 5/5 (F2F). The F2F program received significantly more positive evaluations than the WB program (p < 0.05), particularly with regard to quality of feedback. WB and F2F simulation are effective education strategies, with both programs demonstrating positive learning outcomes. WB programs increase ease of access to training, whilst F2F enables the development of tactile hands-on skills and teamwork. A combined blended learning education strategy is recommended to enhance competence and patient safety. Copyright © 2018 Elsevier Ltd. All rights reserved.
Numerical Propulsion System Simulation: An Overview
NASA Technical Reports Server (NTRS)
Lytle, John K.
2000-01-01
Implementing new technology in aerospace propulsion systems is becoming prohibitively expensive and time consuming. One of the main contributors to the high cost and lengthy time is the need to perform many large-scale hardware tests and the inability to integrate all appropriate subsystems early in the design process. The NASA Glenn Research Center is developing the technologies required to enable simulations of full aerospace propulsion systems in sufficient detail to resolve critical design issues early in the design process, before hardware is built. This concept, called the Numerical Propulsion System Simulation (NPSS), is focused on the integration of multiple disciplines such as aerodynamics, structures and heat transfer with computing and communication technologies to capture complex physical processes in a timely and cost-effective manner. The vision for NPSS, as illustrated, is to be a "numerical test cell" that enables full engine simulation overnight on cost-effective computing platforms. There are several key elements within NPSS that are required to achieve this capability: 1) clear data interfaces through the development and/or use of data exchange standards, 2) modular and flexible program construction through the use of object-oriented programming, 3) integrated multiple-fidelity analysis (zooming) techniques that capture the appropriate physics at the appropriate fidelity for the engine systems, 4) multidisciplinary coupling techniques and finally 5) high-performance parallel and distributed computing. The current state of development in these five areas focuses on air-breathing gas turbine engines and is reported in this paper. However, many of the technologies are generic and can be readily applied to rocket-based systems and combined cycles currently being considered for low-cost access-to-space applications.
Recent accomplishments include: (1) the development of an industry-standard engine cycle analysis program and plug 'n play architecture, called NPSS Version 1; (2) a full engine simulation that combines a 3D low-pressure subsystem with a 0D high-pressure core simulation, demonstrating the ability to integrate analyses at different levels of detail and to aerodynamically couple components, the fan/booster and low-pressure turbine, through a 3D computational fluid dynamics simulation; (3) simulation of all of the turbomachinery in a modern turbofan engine on a parallel computing platform for rapid and cost-effective execution. This capability can also be used to generate a full compressor map, requiring both design and off-design simulation. (4) Three levels of coupling characterize the multidisciplinary analysis under NPSS: loosely coupled, process coupled and tightly coupled. The loosely coupled and process coupled approaches require a common geometry definition to link CAD to analysis tools. The tightly coupled approach is currently validating the use of an arbitrary Lagrangian/Eulerian formulation for rotating turbomachinery. The validation includes both centrifugal and axial compression systems, and its results are reported in the paper. (5) The demonstration of significant computing cost/performance reduction for turbine engine applications using PC clusters. The NPSS Project is supported under the NASA High Performance Computing and Communications Program.
Graphical programming interface: A development environment for MRI methods.
Zwart, Nicholas R; Pipe, James G
2015-11-01
To introduce a multiplatform, Python-based development environment called graphical programming interface for prototyping MRI techniques. The interface allows developers to interact with their scientific algorithm prototypes visually in an event-driven environment, making tasks such as parameterization, algorithm testing, data manipulation, and visualization an integrated part of the workflow. Algorithm developers extend the built-in functionality through simple code interfaces designed to facilitate rapid implementation. This article shows several examples of algorithms developed in graphical programming interface, including the non-Cartesian MR reconstruction algorithms for PROPELLER and spiral, as well as spin simulation and trajectory visualization of a FLORET example. The graphical programming interface framework is shown to be a versatile prototyping environment for developing numeric algorithms used in the latest MR techniques. © 2014 Wiley Periodicals, Inc.
Lien, Rebecca K; Schillo, Barbara A; Mast, Jay L; Lukowski, Amy V; Greenseid, Lija O; Keith, Jennifer D; Keller, Paula A
2016-01-01
Tobacco users in all 50 states have access to quitline telephone counseling and cessation medications. While studies show multiple calls relate to quit success, most participants do not complete a full call series. To date, quitline program use studies have analyzed single factors-such as number of calls or counseling minutes. This study combines multiple factors of quitline program use across 2 states to describe how participants use a 5-call program; assess whether intensity of program use is associated with participant subgroups; and assess whether key outcomes (quitting, satisfaction) are associated with intensity. This observational study examines data for quitline participants in Minnesota (n = 2844) and Pennsylvania (n = 14 359) in 2011 and 2012. A subset of participants was surveyed 7 months after registration to assess key outcomes (response rates: Minnesota 65%; Pennsylvania 60%). Quitline utilization data were used to identify program use variables: nicotine replacement therapy provision, number of counseling calls, number of counseling minutes, days from first to last counseling call, and days from registration to first counseling call. Ten program use groups were created using all 5 program use variables, from lowest (1) to highest (10) intensity. Results were similar for both states. Only 11% of Minnesota and 8% of Pennsylvania participants completed all 5 calls. Intensity of quitline program use was associated with several participant characteristics including health conditions and age. Both quit status and program satisfaction were associated with program use intensity. Quit rates peaked in group 9, participants who received the full 5-call program. Quitlines should focus on engaging participants in multiple calls to improve quit outcomes. In addition, it is important to leverage multiple program use factors for a fuller understanding of how quitline participants use a program.
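The construction of ordered intensity groups from several program-use variables can be illustrated with a simplified sketch. The composite ordering, variable names, and example data below are invented for illustration and do not reproduce the study's grouping method.

```python
# Hypothetical sketch of building ordered program-use intensity groups
# from multiple measures (calls, minutes, NRT provision, span of days),
# as in the study's 10 groups from lowest (1) to highest (10) intensity.

def intensity_groups(participants, n_groups=10):
    """Rank participants by a composite of use measures into n groups."""
    def composite(p):
        return (p["calls"], p["minutes"], p["nrt"], p["span_days"])
    ranked = sorted(participants, key=composite)
    for i, p in enumerate(ranked):
        p["group"] = 1 + (i * n_groups) // len(ranked)
    return ranked

people = [
    {"calls": c, "minutes": 12 * c, "nrt": c >= 2, "span_days": 7 * c}
    for c in range(1, 11)
]
grouped = intensity_groups(people)
```

Combining all five use variables into one ordering, rather than analyzing calls or minutes alone, is the "multiple program use factors" point the abstract closes on.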
Carolan-Olah, Mary; Kruger, Gina; Brown, Vera; Lawton, Felicity; Mazzarino, Melissa; Vasilevski, Vidanka
2018-03-01
Midwifery students feel unprepared to deal with commonly encountered emergencies, such as neonatal resuscitation. Clinical simulation of emergencies may provide a safe forum for students to develop necessary skills. A simulation exercise for neonatal resuscitation was developed and evaluated using qualitative methods. Pre- and post-simulation questions focussed on student confidence and knowledge of resuscitation. Data were analysed using a thematic analysis approach. Pre-simulation questions revealed that most students considered themselves not very confident or unsure about their level of confidence in undertaking neonatal resuscitation. Most correctly identified features of the neonate requiring resuscitation. Post-simulation, students indicated that their confidence and knowledge of neonatal resuscitation had improved. Themes included: gaining confidence; understanding when to call for help; understanding the principles of resuscitation; tailoring simulation/education approaches to student needs. Student benefits included improved knowledge, confidence and skills. Participants unanimously suggested a program of simulation exercises, over a longer period of time, to reinforce knowledge and confidence gains. Ideally, students would like to actively participate in the simulation, rather than observe. Copyright © 2017. Published by Elsevier Ltd.
A software bus for thread objects
NASA Technical Reports Server (NTRS)
Callahan, John R.; Li, Dehuai
1995-01-01
The authors have implemented a software bus for lightweight threads in an object-oriented programming environment that allows for rapid reconfiguration and reuse of thread objects in discrete-event simulation experiments. While previous research in object-oriented, parallel programming environments has focused on direct communication between threads, our lightweight software bus, called the MiniBus, provides a means to isolate threads from their contexts of execution by restricting communications between threads to message-passing via their local ports only. The software bus maintains a topology of connections between these ports. It routes, queues, and delivers messages according to this topology. This approach allows for rapid reconfiguration and reuse of thread objects in other systems without making changes to the specifications or source code. A layered approach that provides the needed transparency to developers is presented. Examples of using the MiniBus are given, and the value of bus architectures in building and conducting simulations of discrete-event systems is discussed.
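The MiniBus idea, threads that communicate only through named ports while the bus routes messages according to a connection topology, can be sketched as follows. The class and method names are illustrative, not the actual MiniBus API, and queueing is simplified.

```python
# Minimal sketch of a software bus: components never address each other
# directly; they send to and receive from their own local ports, and the
# bus routes messages between ports according to a topology, so threads
# can be rewired or reused without changing their code.
from collections import defaultdict, deque

class MiniBus:
    def __init__(self):
        self.topology = defaultdict(list)   # out-port -> list of in-ports
        self.queues = defaultdict(deque)    # in-port -> pending messages

    def connect(self, out_port, in_port):
        self.topology[out_port].append(in_port)

    def send(self, out_port, message):
        # Route and queue the message for every connected in-port.
        for in_port in self.topology[out_port]:
            self.queues[in_port].append(message)

    def receive(self, in_port):
        return self.queues[in_port].popleft() if self.queues[in_port] else None

bus = MiniBus()
bus.connect("gen.out", "sim.in")
bus.connect("gen.out", "log.in")            # fan-out lives in the topology
bus.send("gen.out", {"event": "arrival", "t": 1.5})
msg = bus.receive("sim.in")
```

Because the sender names only its own out-port, reconfiguring an experiment is a matter of changing `connect` calls, which is the reuse argument the abstract makes.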
[Improving communication skills of physicians caring for adolescents by simulation].
Reister, Gad; Stoffman, Nava
2011-04-01
Although the unique characteristics and abilities of youth were noted in antiquity, the process of adolescence was only studied and understood much later. Adolescents are considered a healthy population when compared to younger children and adults. However, unlike other age groups, the morbidity and mortality of adolescents have not decreased in recent decades, probably due to risk-taking behaviors. Since the 1950s, the need for a special medical and health approach to treating adolescents has been recognized. Yet only a few countries incorporate such approaches when educating and training students, residents and fellows in physician training programs. Youth are treated by physicians of many disciplines, despite the fact that only a minority were trained in adolescent medicine. Simulation of medical situations with standardized patients has become a significant tool for improving the communication skills of healthcare providers. The article in this edition of Harefuah describes the use of a simulated-patient-based education system in improving the communication skills of physicians in different fields. The authors present the positive feedback of the participants in the program and demonstrate that it had a positive influence on their practice when dealing with adolescents. We call for incorporating the teaching of adolescent medicine at all levels, starting in medical school. The simulation tool is very helpful in improving the communication skills of medical personnel.
Operative team communication during simulated emergencies: Too busy to respond?
Davis, W Austin; Jones, Seth; Crowell-Kuhnberg, Adrianna M; O'Keeffe, Dara; Boyle, Kelly M; Klainer, Suzanne B; Smink, Douglas S; Yule, Steven
2017-05-01
Ineffective communication among members of a multidisciplinary team is associated with operative error and failure to rescue. We sought to measure operative team communication in a simulated emergency using an established communication framework called "closed loop communication." We hypothesized that communication directed at a specific recipient would be more likely to elicit a check back or closed loop response and that this relationship would vary with changes in patients' clinical status. We used the closed loop communication framework to code retrospectively the communication behavior of 7 operative teams (each comprising 2 surgeons, anesthesiologists, and nurses) during response to a simulated, postanesthesia care unit "code blue." We identified call outs, check backs, and closed loop episodes and applied descriptive statistics and a mixed-effects negative binomial regression to describe characteristics of communication in individuals and in different specialties. We coded a total of 662 call outs. The frequency and type of initiation and receipt of communication events varied between clinical specialties (P < .001). Surgeons and nurses initiated fewer and received more communication events than anesthesiologists. For the average participant, directed communication increased the likelihood of check back by at least 50% (P = .021) in periods preceding acute changes in the clinical setting, and exerted no significant effect in periods after acute changes in the clinical situation. Communication patterns vary by specialty during a simulated operative emergency, and the effect of directed communication in eliciting a response depends on the clinical status of the patient. Operative training programs should emphasize the importance of quality communication in the period immediately after an acute change in the clinical setting of a patient and recognize that communication patterns and needs vary between members of multidisciplinary operative teams. 
Copyright © 2016 Elsevier Inc. All rights reserved.
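The closed-loop coding scheme described above, in which a call out may receive a check back and a closed loop is a call out that does, can be illustrated with a small sketch. The event structure and the rates computed below are invented for illustration, not taken from the study's data.

```python
# Hypothetical coding of communication events: each call out is marked
# with whether it was directed at a named recipient and whether it
# received a check back, so closed-loop rates can be compared between
# directed and undirected communication.

def closed_loop_rate(events):
    """Fraction of call outs that were answered with a check back."""
    callouts = [e for e in events if e["type"] == "callout"]
    closed = [e for e in callouts if e.get("check_back")]
    return len(closed) / len(callouts) if callouts else 0.0

events = [
    {"type": "callout", "directed": True,  "check_back": True},
    {"type": "callout", "directed": True,  "check_back": True},
    {"type": "callout", "directed": False, "check_back": False},
    {"type": "callout", "directed": False, "check_back": True},
]
directed_rate = closed_loop_rate([e for e in events if e["directed"]])
undirected_rate = closed_loop_rate([e for e in events if not e["directed"]])
```

Comparing the two rates across clinical phases is the kind of analysis the abstract reports when it notes that directed communication raised the likelihood of a check back before, but not after, acute changes.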
2009.1 Revision of the Evaluated Nuclear Data Library (ENDL2009.1)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, I. J.; Beck, B.; Descalles, M. A.
LLNL's Computational Nuclear Data and Theory Group have created a 2009.1 revised release of the Evaluated Nuclear Data Library (ENDL2009.1). This library is designed to support LLNL's current and future nuclear data needs and will be employed in nuclear reactor, nuclear security and stockpile stewardship simulations with ASC codes. The ENDL2009 database was the most complete nuclear database for Monte Carlo and deterministic transport of neutrons and charged particles. It was assembled with strong support from the ASC PEM and Attribution programs, leveraged with support from Campaign 4 and the DOE/Office of Science's US Nuclear Data Program. This document lists the revisions and fixes made in a new release called ENDL2009.1, by comparing with the existing data in the original release, which is now called ENDL2009.0. These changes are made in conjunction with the revisions for ENDL2011.1, so that both .1 releases are as free as possible of known defects.
System support software for the Space Ultrareliable Modular Computer (SUMC)
NASA Technical Reports Server (NTRS)
Hill, T. E.; Hintze, G. C.; Hodges, B. C.; Austin, F. A.; Buckles, B. P.; Curran, R. T.; Lackey, J. D.; Payne, R. E.
1974-01-01
The highly transportable programming system designed and implemented to support the development of software for the Space Ultrareliable Modular Computer (SUMC) is described. The SUMC system support software consists of program modules called processors. The initial set of processors consists of the supervisor, the general purpose assembler for SUMC instruction and microcode input, linkage editors, an instruction level simulator, a microcode grid print processor, and user oriented utility programs. A FORTRAN 4 compiler is undergoing development. The design facilitates the addition of new processors with a minimum effort and provides the user quasi host independence on the ground based operational software development computer. Additional capability is provided to accommodate variations in the SUMC architecture without consequent major modifications in the initial processors.
NASA Technical Reports Server (NTRS)
Kenner, B. G.; Lincoln, N. R.
1979-01-01
The manual is intended to show the revisions and additions to the current STAR FORTRAN. The changes are made to incorporate an FMP (Flow Model Processor) for use in the Numerical Aerodynamic Simulation Facility (NASF) for the purpose of simulating fluid flow over three-dimensional bodies in wind tunnel environments and in free space. The FORTRAN programming language for the STAR-100 computer contains both CDC and unique STAR extensions to the standard FORTRAN. Several of the STAR FORTRAN extensions to standard FORTRAN allow the FORTRAN user to exploit the vector processing capabilities of the STAR computer. In STAR FORTRAN, vectors can be expressed with an explicit notation, functions are provided that return vector results, and special call statements enable access to any machine instruction.
NY TBO Research: Integrated Demand Management (IDM): IDM Concept, Tools, and Training Package
NASA Technical Reports Server (NTRS)
Smith, Nancy
2016-01-01
A series of human-in-the-loop simulation sessions was conducted in the Airspace Operations Laboratory (AOL) to evaluate a new traffic management concept called Integrated Demand Management (IDM). The simulation explored how to address the chronic equity, throughput, and delay issues associated with New York's high-volume airports by operationally integrating three current and NextGen capabilities: the Collaborative Trajectory Options Program (CTOP), Time-Based Flow Management (TBFM), and Required Time of Arrival (RTA), in order to better manage traffic demand within the National Air Traffic System. A package of presentation slides was developed to describe the concept, tools, and training materials used in the simulation sessions. The package will be used to brief stakeholders, both through oral presentations and by disseminating the materials via e-mail.
Accelerating Wright–Fisher Forward Simulations on the Graphics Processing Unit
Lawrie, David S.
2017-01-01
Forward Wright–Fisher simulations are powerful in their ability to model complex demography and selection scenarios, but suffer from slow execution on the Central Processing Unit (CPU), thus limiting their usefulness. However, the single-locus Wright–Fisher forward algorithm is exceedingly parallelizable, with many steps that are so-called “embarrassingly parallel,” consisting of a vast number of individual computations that are all independent of each other and thus capable of being performed concurrently. The rise of modern Graphics Processing Units (GPUs) and programming languages designed to leverage the inherent parallel nature of these processors have allowed researchers to dramatically speed up many programs that have such high arithmetic intensity and intrinsic concurrency. The presented GPU Optimized Wright–Fisher simulation, or “GO Fish” for short, can be used to simulate arbitrary selection and demographic scenarios while running over 250-fold faster than its serial counterpart on the CPU. Even modest GPU hardware can achieve an impressive speedup of over two orders of magnitude. With simulations so accelerated, one can not only do quick parametric bootstrapping of previously estimated parameters, but also use simulated results to calculate the likelihoods and summary statistics of demographic and selection models against real polymorphism data, all without restricting the demographic and selection scenarios that can be modeled or requiring approximations to the single-locus forward algorithm for efficiency. Further, as many of the parallel programming techniques used in this simulation can be applied to other computationally intensive algorithms important in population genetics, GO Fish serves as an exciting template for future research into accelerating computation in evolution. GO Fish is part of the Parallel PopGen Package available at: http://dl42.github.io/ParallelPopGen/. PMID:28768689
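The per-locus independence that makes the forward algorithm "embarrassingly parallel" can be sketched in a few lines. This toy (the function names and the additive-selection form are illustrative, not GO Fish's API) evolves each locus with its own independent binomial draw, exactly the step a GPU code would assign to one thread per locus:

```python
import random

def wf_step(freq, n, s=0.0, rng=random):
    """One Wright-Fisher generation at a single locus.

    freq: current allele frequency, n: diploid population size,
    s: additive selection coefficient.  Returns the new frequency."""
    # Expected frequency after selection (genic model: fitness 1+s).
    p = freq * (1 + s) / (1 + s * freq)
    # Binomial sampling of 2n gametes, via 2n Bernoulli draws
    # (a real GPU code would use a native binomial sampler).
    draws = sum(1 for _ in range(2 * n) if rng.random() < p)
    return draws / (2 * n)

def simulate(num_loci, n, generations, seed=1):
    """Each locus evolves independently of every other locus --
    the 'embarrassingly parallel' structure the abstract describes."""
    rng = random.Random(seed)
    freqs = [0.5] * num_loci
    for _ in range(generations):
        freqs = [wf_step(f, n, rng=rng) for f in freqs]
    return freqs
```

Because no locus reads another locus's state, the inner list comprehension can be replaced by one GPU kernel launch with no synchronization between loci.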
ERIC Educational Resources Information Center
Samani, Ebrahim; Baki, Roselan; Razali, Abu Bakar
2014-01-01
Success in implementation of computer-assisted language learning (CALL) programs depends on the teachers' understanding of the roles of CALL programs in education. Consequently, it is also important to understand the barriers teachers face in the use of computer-assisted language learning (CALL) programs. The current study was conducted on 14…
Simulation of spacecraft attitude dynamics using TREETOPS and model-specific computer Codes
NASA Technical Reports Server (NTRS)
Cochran, John E.; No, T. S.; Fitz-Coy, Norman G.
1989-01-01
The simulation of spacecraft attitude dynamics and control using the generic, multi-body code called TREETOPS and other codes written especially to simulate particular systems is discussed. Differences in the methods used to derive equations of motion--Kane's method for TREETOPS and the Lagrangian and Newton-Euler methods, respectively, for the other two codes--are considered. Simulation results from the TREETOPS code are compared with those from the other two codes for two example systems. One system is a chain of rigid bodies; the other consists of two rigid bodies attached to a flexible base body. Since the computer codes were developed independently, consistent results serve as a verification of the correctness of all the programs. Differences in the results are discussed. Results for the two-rigid-body, one-flexible-body system are useful also as information on multi-body, flexible, pointing payload dynamics.
NASA Astrophysics Data System (ADS)
Canal, Fernando; Garcia-Mateos, Jorge; Rodriguez-Larena, Jorge; Rivera, Alejandro; Aparicio, E.
2000-12-01
Medical therapeutic applications of lasers involve understanding the light-tissue interaction, in particular the rate of photochemical and thermal reactions. Tissue is composed of a mix of turbid media. Light propagation in turbid media can be described by the so-called Equation of Radiative Transfer, an integro-differential equation in which scattering, absorption, and internal reflection are significant factors in determining the light distribution in tissue. The Equation of Radiative Transfer, however, cannot commonly be solved analytically. In order to visualize and simulate the effects of laser light on heart tissue (myocardium) in relation to the treatment of irregular heart rhythms, or so-called arrhythmias, a fast interactive computer program has been developed in Java.
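For reference, a standard time-independent form of the Equation of Radiative Transfer for the radiance L(r, ŝ), with absorption coefficient μ_a, scattering coefficient μ_s, and scattering phase function p, is:

```latex
(\hat{s} \cdot \nabla)\, L(\mathbf{r}, \hat{s})
  = -(\mu_a + \mu_s)\, L(\mathbf{r}, \hat{s})
  + \mu_s \int_{4\pi} p(\hat{s}, \hat{s}')\, L(\mathbf{r}, \hat{s}')\, d\Omega'
```

The integral over all incoming directions ŝ' couples every direction to every other, which is why closed-form solutions are rare and Monte Carlo or diffusion approximations are used in practice.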
F77NNS - A FORTRAN-77 NEURAL NETWORK SIMULATOR
NASA Technical Reports Server (NTRS)
Mitchell, P. H.
1994-01-01
F77NNS (A FORTRAN-77 Neural Network Simulator) simulates the popular back error propagation neural network. F77NNS is an ANSI-77 FORTRAN program designed to take advantage of vectorization when run on machines having this capability, but it will run on any computer with an ANSI-77 FORTRAN compiler. Artificial neural networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to biological nerve cells. Problems which involve pattern matching or system modeling readily fit the class of problems which F77NNS is designed to solve. The program's formulation trains a neural network using Rumelhart's back-propagation algorithm. Typically the nodes of a network are grouped together into clumps called layers. A network will generally have an input layer through which the various environmental stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to features of the problem being solved. Other layers, which form intermediate stages between the input and output layers, are called hidden layers. The back-propagation training algorithm can require massive computational resources to implement a large network such as a network capable of learning text-to-phoneme pronunciation rules as in the famous Sejnowski experiment. The Sejnowski neural network learns to pronounce 1000 common English words. The standard input data defines the specific inputs that control the type of run to be made, and input files define the NN in terms of the layers and nodes, as well as the input/output (I/O) pairs. The program has a restart capability so that a neural network can be solved in stages suitable to the user's resources and desires. F77NNS allows the user to customize the patterns of connections between layers of a network.
The size of the neural network to be solved is limited only by the amount of random access memory (RAM) available to the user. The program has a memory requirement of about 900K. The standard distribution medium for this package is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. F77NNS was developed in 1989.
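The back-propagation scheme described above can be sketched as a pure-Python toy with one hidden layer (this is an illustrative minimal network, not the F77NNS formulation; the class name and learning rate are invented for the example):

```python
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyNet:
    """A minimal one-hidden-layer back-propagation network."""
    def __init__(self, n_in, n_hid, n_out, seed=0):
        rng = random.Random(seed)
        r = lambda: rng.uniform(-1, 1)
        self.w1 = [[r() for _ in range(n_in + 1)] for _ in range(n_hid)]  # +1 bias
        self.w2 = [[r() for _ in range(n_hid + 1)] for _ in range(n_out)]

    def forward(self, x):
        self.h = [sigmoid(sum(w * v for w, v in zip(ws, x + [1.0]))) for ws in self.w1]
        self.o = [sigmoid(sum(w * v for w, v in zip(ws, self.h + [1.0]))) for ws in self.w2]
        return self.o

    def train(self, x, target, lr=0.5):
        """One gradient step; returns the squared error before the step."""
        o = self.forward(x)
        # Output deltas: squared-error loss, sigmoid derivative o*(1-o).
        do = [(t - y) * y * (1 - y) for t, y in zip(target, o)]
        # Hidden deltas: errors back-propagated through the old w2.
        dh = [h * (1 - h) * sum(d * ws[j] for d, ws in zip(do, self.w2))
              for j, h in enumerate(self.h)]
        for d, ws in zip(do, self.w2):
            for j, v in enumerate(self.h + [1.0]):
                ws[j] += lr * d * v
        for d, ws in zip(dh, self.w1):
            for j, v in enumerate(x + [1.0]):
                ws[j] += lr * d * v
        return sum((t - y) ** 2 for t, y in zip(target, o))
```

Training it on the XOR patterns for a few thousand epochs drives the total squared error down, the same behavior (at toy scale) that the text-to-phoneme network exhibits at large scale.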
Validation of the Chemistry Module for the Euler Solver in Unified Flow Solver
2012-03-01
traveling through the atmosphere there are three types of flow regimes that exist; the first is the continuum regime, second is the rarefied regime and...The second method has been used in a program called Unified Flow Solver (UFS). UFS is currently being developed under collaborative efforts the Air...thermal non-equilibrium case and finally to a thermo-chemical non-equilibrium case. The data from the simulations will be compared to a second code
Social factors in space station interiors
NASA Technical Reports Server (NTRS)
Cranz, Galen; Eichold, Alice; Hottes, Klaus; Jones, Kevin; Weinstein, Linda
1987-01-01
Using the example of the chair, which is often written into space station planning but which serves no non-cultural function in zero gravity, difficulties in overcoming cultural assumptions are discussed. An experimental approach is called for which would allow designers to separate cultural assumptions from logistic, social and psychological necessities. Simulations, systematic doubt and monitored brainstorming are recommended as part of basic research so that the designer will approach the problems of space module design with a complete program.
Unifying Model-Based and Reactive Programming within a Model-Based Executive
NASA Technical Reports Server (NTRS)
Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)
1999-01-01
Real-time, model-based, deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.
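The belief-state update such an executive performs can be illustrated with a standard discrete predict-then-correct filter over hidden modes (a generic sketch, not RMPL's actual semantics; the state and observation names below are hypothetical):

```python
def belief_update(belief, transition, obs_likelihood):
    """One step of belief-state update for a hidden-state Markov model.

    belief: dict state -> probability
    transition: dict state -> dict of next_state -> probability
    obs_likelihood: dict state -> P(current observation | state)"""
    # Predict: push the belief through the transition model.
    predicted = {}
    for s, p in belief.items():
        for s2, pt in transition[s].items():
            predicted[s2] = predicted.get(s2, 0.0) + p * pt
    # Correct: weight by the observation likelihood, then renormalize.
    posterior = {s: p * obs_likelihood.get(s, 0.0) for s, p in predicted.items()}
    z = sum(posterior.values())
    return {s: p / z for s, p in posterior.items()}
```

With a two-mode model (nominal vs. failed component), one fault-consistent observation is enough to shift most of the belief onto the failed mode, which is the information a reactive executive acts on.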
Apollo experience report: Real-time auxiliary computing facility development
NASA Technical Reports Server (NTRS)
Allday, C. E.
1972-01-01
The Apollo real time auxiliary computing function and facility were an extension of the facility used during the Gemini Program. The facility was expanded to include support of all areas of flight control, and computer programs were developed for mission and mission-simulation support. The scope of the function was expanded to include prime mission support functions in addition to engineering evaluations, and the facility became a mandatory mission support facility. The facility functioned as a full scale mission support activity until after the first manned lunar landing mission. After the Apollo 11 mission, the function and facility gradually reverted to a nonmandatory, offline, on-call operation because the real time program flexibility was increased and verified sufficiently to eliminate the need for redundant computations. The evaluation of the facility and function and recommendations for future programs are discussed in this report.
High powered rocketry: design, construction, and launching experience and analysis
NASA Astrophysics Data System (ADS)
Paulson, Pryce; Curtis, Jarret; Bartel, Evan; Owens Cyr, Waycen; Lamsal, Chiranjivi
2018-01-01
In this study, the nuts and bolts of designing and building a high powered rocket have been presented. A computer simulation program called RockSim was used to design the rocket. Simulation results are consistent with the time variations of altitude, velocity, and acceleration obtained in the actual flight. The actual drag coefficient was determined by using the altitude back-tracking method and found to be 0.825. The speed of the exhaust was determined to be 2.5 km s-1 by analyzing the thrust curve of the rocket. Acceleration in the coasting phase of the flight, represented by a second-degree polynomial with a small leading coefficient, has been found to approach ‘-g’ asymptotically.
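The asymptotic approach of the coasting-phase acceleration to -g can be reproduced with a few lines of numerical integration. Only the drag coefficient 0.825 below comes from the study; the mass, frontal area, and air density are hypothetical round numbers:

```python
def coast(v0, cd, area, mass, dt=0.01, rho=1.2, g=9.81):
    """Integrate the unpowered coasting ascent:
        a = -g - (rho * cd * area / (2 * mass)) * v**2   while climbing.
    Returns a list of (velocity, acceleration) samples up to apogee."""
    k = rho * cd * area / (2.0 * mass)
    v, samples = v0, []
    while v > 0:
        a = -g - k * v * v            # drag always opposes the climb
        samples.append((v, a))
        v += a * dt
    return samples
```

At burnout the drag term dominates, so the deceleration is much stronger than g; as the velocity decays toward apogee the quadratic drag term vanishes and the acceleration settles onto -g, the asymptote noted in the abstract.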
Simulation laboratories for training in obstetrics and gynecology.
Macedonia, Christian R; Gherman, Robert B; Satin, Andrew J
2003-08-01
Simulations have been used by the military, airline industry, and our colleagues in other medical specialties to educate, evaluate, and prepare for rare but life-threatening scenarios. Work hour limits for residents in obstetrics and gynecology and decreased patient availability for teaching of students and residents require us to think creatively and practically on how to optimize their education. Medical simulations may address scenarios in clinical practice that are considered important to know or understand. Simulations can take many forms, including computer programs, models or mannequins, virtual reality data immersion caves, and a combination of formats. The purpose of this commentary is to call attention to a potential role for medical simulation in obstetrics and gynecology. We briefly describe an example of how simulation may be incorporated into obstetric and gynecologic residency training. It is our contention that educators in obstetrics and gynecology should be aware of the potential for simulation in education. We hope this commentary will stimulate interest in the field, lead to validation studies, and improve training in and the practice of obstetrics and gynecology.
GEO2D - Two-Dimensional Computer Model of a Ground Source Heat Pump System
James Menart
2013-06-07
This file contains a zipped file that contains many files required to run GEO2D. GEO2D is a computer code for simulating ground source heat pump (GSHP) systems in two dimensions. GEO2D performs a detailed finite difference simulation of the heat transfer occurring within the working fluid, the tube wall, the grout, and the ground. Both horizontal and vertical wells can be simulated with this program, but it should be noted that the vertical well is modeled as a single tube. This program also models the heat pump in conjunction with the heat transfer occurring: GEO2D simulates the heat pump and ground loop as a system. Many results are produced by GEO2D as a function of time and position, such as heat transfer rates, temperatures, and heat pump performance. On top of this, GEO2D provides an economic comparison between the simulated geothermal system and a comparable air heat pump system, or a comparable gas, oil, or propane heating system with a vapor-compression air conditioner. The version of GEO2D in the attached file has been coupled to the DOE heating and cooling load software called ENERGYPLUS. This is a great convenience for the user because heating and cooling loads are an input to GEO2D. GEO2D is a user-friendly program that uses a graphical user interface for inputs and outputs. These make entering data simple, and they produce many plotted results that are easy to understand. In order to run GEO2D, access to MATLAB is required. If this program is not available on your computer, you can download the program MCRInstaller.exe, the 64-bit version, from the MATLAB website or from this geothermal depository. This is a free download which will enable you to run GEO2D.
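The kind of explicit finite-difference update at the heart of such a conduction simulation can be sketched as follows (a generic 2-D heat-conduction step, not GEO2D's actual discretization; the grid size and material properties in the usage below are illustrative):

```python
def heat_step(T, alpha, dx, dt):
    """One explicit finite-difference step of 2-D heat conduction on a
    uniform grid with fixed (Dirichlet) boundary temperatures.
    Explicit stability requires dt <= dx**2 / (4 * alpha)."""
    ny, nx = len(T), len(T[0])
    r = alpha * dt / dx ** 2
    new = [row[:] for row in T]           # boundaries stay fixed
    for i in range(1, ny - 1):
        for j in range(1, nx - 1):
            new[i][j] = T[i][j] + r * (T[i+1][j] + T[i-1][j]
                                       + T[i][j+1] + T[i][j-1] - 4 * T[i][j])
    return new
```

Seeding a cool ground grid with one warm "borehole" cell and stepping repeatedly shows the warm spot diffusing outward while all temperatures stay between the initial extremes, the discrete maximum principle that a stable explicit scheme obeys.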
Design of object-oriented distributed simulation classes
NASA Technical Reports Server (NTRS)
Schoeffler, James D. (Principal Investigator)
1995-01-01
Distributed simulation of aircraft engines as part of a computer aided design package is being developed by NASA Lewis Research Center for the aircraft industry. The project is called NPSS, an acronym for 'Numerical Propulsion Simulation System'. NPSS is a flexible object-oriented simulation of aircraft engines requiring high computing speed. It is desirable to run the simulation on a distributed computer system with multiple processors executing portions of the simulation in parallel. The purpose of this research was to investigate object-oriented structures such that individual objects could be distributed. The set of classes used in the simulation must be designed to facilitate parallel computation. Since the portions of the simulation carried out in parallel are not independent of one another, there is a need for communication among the parallel executing processors, which in turn implies a need for their synchronization. Communication and synchronization can lead to decreased throughput as parallel processors wait for data or synchronization signals from other processors. As a result of this research, the following have been accomplished. The design and implementation of a set of simulation classes which result in a distributed simulation control program have been completed. The design is based upon the MIT 'Actor' model of a concurrent object and uses 'connectors' to structure dynamic connections between simulation components. Connectors may be dynamically created according to the distribution of objects among machines at execution time without any programming changes. Measurements of the basic performance have been carried out, with the result that communication overhead of the distributed design is swamped by the computation time of modules unless modules have very short execution times per iteration or time step. An analytical performance model based upon queuing network theory has been designed and implemented.
Its application to realistic configurations has not been carried out.
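The actor-with-connectors structure described above, active objects with their own thread of control communicating only by messages, with connectors created dynamically to route between them, can be sketched with standard-library threads and queues (the class names are illustrative, not the NPSS code):

```python
import threading, queue

class Actor:
    """An active object with its own thread and mailbox; all
    communication is by message, never by direct method call."""
    def __init__(self, name, behavior):
        self.name, self.behavior = name, behavior
        self.mailbox = queue.Queue()
        self.outputs = []                      # connectors attached at run time
        self.thread = threading.Thread(target=self._run, daemon=True)

    def start(self):
        self.thread.start()

    def send(self, msg):
        self.mailbox.put(msg)

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:                    # poison pill: shut down
                break
            result = self.behavior(msg)
            for connector in self.outputs:     # forward to downstream actors
                connector.send(result)

class Connector:
    """Routes messages between actors; because connectors are created
    dynamically, the distribution of actors across machines can change
    without programming changes (here both ends are in-process)."""
    def __init__(self, target):
        self.target = target
    def send(self, msg):
        self.target.send(msg)
```

Wiring a "doubler" actor to a "sink" actor through a connector and sending one message exercises the whole path: mailbox, behavior, connector, downstream mailbox. In a distributed deployment the connector's `send` would serialize the message to a remote machine instead of enqueuing locally, which is exactly why the overhead measurements in the abstract matter.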
Coupling of Noah-MP and the High Resolution CI-WATER ADHydro Hydrological Model
NASA Astrophysics Data System (ADS)
Moreno, H. A.; Goncalves Pureza, L.; Ogden, F. L.; Steinke, R. C.
2014-12-01
ADHydro is a physics-based, high-resolution, distributed hydrological model suitable for simulating large watersheds in a massively parallel computing environment. It simulates important processes such as rainfall and infiltration, snowfall and snowmelt in complex terrain, vegetation and evapotranspiration, soil heat flux and freezing, overland flow, channel flow, groundwater flow, and water management. For the vegetation and evapotranspiration processes, ADHydro uses the validated community land surface model (LSM) Noah-MP. Noah-MP uses multiple options for key land-surface hydrology and was developed to facilitate climate predictions with physically based ensembles. This presentation discusses the lessons learned in coupling Noah-MP to ADHydro. Noah-MP is delivered with a main driver program and not as a library with a clear interface to be called from other codes. This required some investigation to determine the correct functions to call and the appropriate parameter values. ADHydro runs Noah-MP as a point process on each mesh element and provides initialization and forcing data for each element. Modeling data are acquired from various sources including the Soil Survey Geographic Database (SSURGO), the Weather Research and Forecasting (WRF) model, and internal ADHydro simulation states. Despite these challenges in coupling Noah-MP to ADHydro, the use of Noah-MP provides the benefits of a supported community code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Starodumov, Ilya; Kropotin, Nikolai
2016-08-10
We investigate the three-dimensional mathematical model of crystal growth called PFC (Phase Field Crystal) in a hyperbolic modification. This model is also called the modified PFC model (the original PFC model is formulated in parabolic form) and allows both slow and rapid crystallization processes to be described on atomic length scales and on diffusive time scales. The modified PFC model is described by a partial differential equation of sixth order in space and second order in time. The solution of this equation is possible only by numerical methods. Previously, the authors created a software package for the solution of the Phase Field Crystal problem, based on the method of isogeometric analysis (IGA) and the PetIGA program library. During further investigation it was found that the quality of the solution can depend strongly on the discretization parameters of the numerical method. In this report, we show the features that should be taken into account when constructing the computational grid for the numerical simulation.
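For reference, a commonly cited form of the hyperbolic (modified) PFC equation, written here from the general literature rather than from this report, and exhibiting the sixth-order spatial and second-order temporal derivatives the abstract mentions, is:

```latex
\tau \frac{\partial^2 \phi}{\partial t^2} + \frac{\partial \phi}{\partial t}
  = \nabla^2 \left[ \left(1 + \nabla^2\right)^2 \phi + \phi^3 - \varepsilon\, \phi \right]
```

where φ is the atomic density field, τ a relaxation time, and ε an undercooling parameter; expanding (1 + ∇²)²φ under the outer ∇² produces the ∇⁶φ term responsible for the sixth spatial order, while the τ ∂²φ/∂t² term supplies the hyperbolic (rapid-solidification) character.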
Vane Pump Casing Machining of Dumpling Machine Based on CAD/CAM
NASA Astrophysics Data System (ADS)
Huang, Yusen; Li, Shilong; Li, Chengcheng; Yang, Zhen
The automatic dumpling forming machine, also called a dumpling machine, makes dumplings through mechanical motions. This paper adopts a stuffing delivery mechanism featuring an improved and specially designed vane pump casing, which contributes to the formation of dumplings. Its 3D modeling in Pro/E software, machining process planning, milling path optimization, simulation based on UG, and post-processor program compilation were introduced and verified. The results indicated that adoption of CAD/CAM offers firms the potential to pursue new innovative strategies.
PLASIM: A computer code for simulating charge exchange plasma propagation
NASA Technical Reports Server (NTRS)
Robinson, R. S.; Deininger, W. D.; Winder, D. R.; Kaufman, H. R.
1982-01-01
The propagation of the charge exchange plasma for an electrostatic ion thruster is crucial in determining the interaction of that plasma with the associated spacecraft. A model that describes this plasma and its propagation is described, together with a computer code based on this model. The structure and calling sequence of the code, named PLASIM, is described. An explanation of the program's input and output is included, together with samples of both. The code is written in ANSI Standard FORTRAN.
Development of the electronic health records for nursing education (EHRNE) software program.
Kowitlawakul, Yanika; Wang, Ling; Chan, Sally Wai-Chi
2013-12-01
This paper outlines preliminary research of an innovative software program that enables the use of an electronic health record in a nursing education curriculum. The software application program is called EHRNE, which stands for Electronic Health Record for Nursing Education. The aim of EHRNE is to enhance student's learning of health informatics when they are working in the simulation laboratory. Integrating EHRNE into the nursing curriculum exposes students to electronic health records before they go into the workplace. A qualitative study was conducted using focus group interviews of nine nursing students. Nursing students' perceptions of using the EHRNE application were explored. The interviews were audio-taped and transcribed verbatim. The data was analyzed following the Colaizzi (1978) guideline. Four main categories that related to the EHRNE application were identified from the interviews: functionality, data management, timing and complexity, and accessibility. The analysis of the data revealed advantages and limitations of using EHRNE in the classroom setting. Integrating the EHRNE program into the curriculum will promote students' awareness of electronic documentation and enhance students' learning in the simulation laboratory. Preliminary findings suggested that before integrating the EHRNE program into the nursing curriculum, educational sessions for both students and faculty outlining the software's purpose, advantages, and limitations were needed. Following the educational sessions, further investigation of students' perceptions and learning using the EHRNE program is recommended. Copyright © 2012 Elsevier Ltd. All rights reserved.
Numerical Propulsion System Simulation: A Common Tool for Aerospace Propulsion Being Developed
NASA Technical Reports Server (NTRS)
Follen, Gregory J.; Naiman, Cynthia G.
2001-01-01
The NASA Glenn Research Center is developing an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). This simulation is initially being used to support aeropropulsion in the analysis and design of aircraft engines. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the Aviation Safety Program and Advanced Space Transportation. NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes using the Common Object Request Broker Architecture (CORBA) in the NPSS Developer's Kit to facilitate collaborative engineering. The NPSS Developer's Kit will provide the tools to develop custom components and to use the CORBA capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities will extend NPSS from a zero-dimensional simulation tool to a multifidelity, multidiscipline system-level simulation tool for the full life cycle of an engine.
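The "zooming" idea, coupling analyses at different levels of detail behind one component interface, can be sketched as follows. This is an illustrative design sketch, not the NPSS API: the class names are invented, and the compressor relations use the standard ideal-gas temperature-rise formula with an assumed γ = 1.4 (exponent (γ-1)/γ ≈ 0.2857):

```python
class Component:
    """Common interface every engine component exposes; zooming swaps
    implementations of different fidelity behind this interface."""
    def evaluate(self, inlet):
        raise NotImplementedError

class CompressorLumped(Component):
    """Zero-dimensional surrogate: one overall pressure ratio and efficiency."""
    def __init__(self, pr=10.0, eff=0.85):
        self.pr, self.eff = pr, eff
    def evaluate(self, inlet):
        # T2 = T1 * (1 + (PR^((gamma-1)/gamma) - 1) / eta), gamma = 1.4
        t2 = inlet["t"] * (1 + (self.pr ** 0.2857 - 1) / self.eff)
        return {"p": inlet["p"] * self.pr, "t": t2}

class CompressorStageStack(Component):
    """Higher-fidelity stand-in: chains per-stage models; a real zoom
    would call out to a meanline or CFD code at this point."""
    def __init__(self, stage_prs, eff=0.85):
        self.stages = [CompressorLumped(pr, eff) for pr in stage_prs]
    def evaluate(self, inlet):
        state = inlet
        for stage in self.stages:
            state = stage.evaluate(state)
        return state
```

Because both fidelities satisfy the same `evaluate` contract, the system-level simulation can zoom one component to higher fidelity without touching its neighbors, which is the flexibility the NPSS architecture is after.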
Multiaxial Cyclic Thermoplasticity Analysis with Besseling's Subvolume Method
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1983-01-01
A modification was formulated to Besseling's Subvolume Method to allow it to use multilinear stress-strain curves which are temperature dependent to perform cyclic thermoplasticity analyses. This method automatically reproduces certain aspects of real material behavior important in the analysis of Aircraft Gas Turbine Engine (AGTE) components. These include the Bauschinger effect, cross-hardening, and memory. This constitutive equation was implemented in a finite element computer program called CYANIDE. Subsequently, classical time dependent plasticity (creep) was added to the program. Since its inception, this program was assessed against laboratory and component testing and engine experience. The ability of this program to simulate AGTE material response characteristics was verified by this experience and its utility in providing data for life analyses was demonstrated. In this area of life analysis, the multiaxial thermoplasticity capabilities of the method have proved a match for the actual AGTE life experience.
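The subvolume (fraction) idea, decomposing the material into parallel elastic-perfectly-plastic fractions whose weighted sum reproduces a multilinear curve with kinematic memory, can be sketched in one dimension. This is a generic overlay-model toy, not the CYANIDE implementation, and the modulus and yield strains in the usage are illustrative:

```python
def make_subvolumes(E, yield_strains, weights):
    """Each subvolume shares the modulus E but has its own yield
    strain; the weights should sum to 1."""
    return [{"E": E, "ey": ey, "w": w, "ep": 0.0}
            for ey, w in zip(yield_strains, weights)]

def stress(subvols, strain):
    """Total stress = weighted sum of subvolume stresses; each
    subvolume keeps its own plastic strain (kinematic memory),
    so calls must follow the loading history in order."""
    total = 0.0
    for sv in subvols:
        e_el = strain - sv["ep"]
        if abs(e_el) > sv["ey"]:             # yielded: clip elastic strain
            sv["ep"] = strain - sv["ey"] * (1 if e_el > 0 else -1)
            e_el = strain - sv["ep"]
        total += sv["w"] * sv["E"] * e_el
    return total
```

Loading into the plastic range and then unloading to zero strain leaves a residual stress of the opposite sign, because the subvolume with the smaller yield strain re-yields early on reversal, which is precisely the Bauschinger effect the abstract says the method reproduces automatically.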
Opening the door to coordination of care through teachable moments.
Berg, Gregory D; Korn, Allan M; Thomas, Eileen; Klemka-Walden, Linda; Bigony, Marysanta D; Newman, John F
2007-10-01
The challenge for care coordination is to identify members at a moment in time when they are receptive to intervention and provide the appropriate care management services. This manuscript describes a pilot program using inbound nurse advice calls from members to engage them in a care management program including disease management (DM). Annual medical claims diagnoses were used to identify members and their associated disease conditions. For each condition group for each year, nurse advice call data were used to calculate inbound nurse advice service call rates for each group. A pilot program was set up to engage inbound nurse advice callers in a broader discussion of their health concerns and refer them to a care management program. Among the program results, both the call rate by condition group and the correlation between average costs and call rates show that higher cost groups of members call the nurse advice service disproportionately more than lower cost members. Members who entered the DM programs through the nurse advice service were more likely to stay in the program than those who participated in the standard opt-in program. The results of this pilot program suggest that members who voluntarily call in to the nurse advice service for triage are at a "teachable moment" and highly motivated to participate in appropriate care management programs. The implication is that the nurse advice service may well be an innovative and effective way to enhance participation in a variety of care management programs including DM.
An investigation of a sterile access technique for the repair and adjustment of sterile spacecraft
NASA Technical Reports Server (NTRS)
Farmer, F. H.; Fuller, H. V.; Hueschen, R. M.
1973-01-01
A description is presented of a unique system for the sterilization and sterile repair of spacecraft and the results of a test program designed to assess the biological integrity and engineering reliability of the system. This trailer-mounted system, designated the model assembly sterilizer for testing (MAST), is capable of the dry-heat sterilization of spacecraft and/or components less than 2.3 meters in diameter at temperatures up to 433 K and the steam sterilization of components less than 0.724 meter in diameter. Sterile access to spacecraft is provided by two tunnel suits, called the bioisolator suit systems (BISS), which are contiguous with the walls of the sterilization chambers. The test program was designed primarily to verify the biological and engineering reliability of the MAST system by processing simulated space hardware. Each test cycle simulated the initial sterilization of a spacecraft, sterile repair of a failed component, removal of the spacecraft from the MAST for mating with the bus, and a sterile recycle repair.
A Universal Model for Solar Eruptions
NASA Astrophysics Data System (ADS)
Wyper, Peter; Antiochos, Spiro K.; DeVore, C. Richard
2017-08-01
We present a universal model for solar eruptions that spans coronal mass ejections (CMEs) at one end of the scale and coronal jets at the other. The model is a natural extension of the Magnetic Breakout model for large-scale fast CMEs. Using high-resolution adaptive mesh MHD simulations conducted with the ARMS code, we show that so-called blowout or mini-filament coronal jets can be explained as one realisation of the breakout process. We also demonstrate the robustness of this “breakout-jet” model by studying three realisations in simulations with different ambient field inclinations. We conclude that magnetic breakout supports both large-scale fast CMEs and small-scale coronal jets, and by inference eruptions at scales in between. Thus, magnetic breakout provides a unified model for solar eruptions. P.F.W was supported in this work by an award of a RAS Fellowship and an appointment to the NASA Postdoctoral Program. C.R.D and S.K.A were supported by NASA’s LWS TR&T and H-SR programs.
Cooper, Simon J; Kinsman, Leigh; Chung, Catherine; Cant, Robyn; Boyle, Jayne; Bull, Loretta; Cameron, Amanda; Connell, Cliff; Kim, Jeong-Ah; McInnes, Denise; McKay, Angela; Nankervis, Katrina; Penz, Erika; Rotter, Thomas
2016-09-07
There are international concerns in relation to the management of patient deterioration which has led to a body of evidence known as the 'failure to rescue' literature. Nursing staff are known to miss cues of deterioration and often fail to call for assistance. Medical Emergency Teams (Rapid Response Teams) do improve the management of acutely deteriorating patients, but first responders need the requisite skills to impact on patient safety. In this study we aim to address these issues in a mixed methods interventional trial with the objective of measuring and comparing the cost and clinical impact of face-to-face and web-based simulation programs on the management of patient deterioration and related patient outcomes. The education programs, known as 'FIRST(2)ACT', have been found to have an impact on education and will be tested in four hospitals in the State of Victoria, Australia. Nursing staff will be trained in primary (the first 8 min) responses to emergencies in two medical wards using a face-to-face approach and in two medical wards using a web-based version FIRST(2)ACTWeb. The impact of these interventions will be determined through quantitative and qualitative approaches, cost analyses and patient notes review (time series analyses) to measure quality of care and patient outcomes. In this 18 month study it is hypothesised that both simulation programs will improve the detection and management of deteriorating patients but that the web-based program will have lower total costs. The study will also add to our overall understanding of the utility of simulation approaches in the preparation of nurses working in hospital wards. (ACTRN12616000468426, retrospectively registered 8.4.2016).
Computer simulation of a pilot in V/STOL aircraft control loops
NASA Technical Reports Server (NTRS)
Vogt, William G.; Mickle, Marlin H.; Zipf, Mark E.; Kucuk, Senol
1989-01-01
The objective was to develop a computerized adaptive pilot model for the computer model of the research aircraft, the Harrier II AV-8B V/STOL, with special emphasis on propulsion control. In fact, two versions of the adaptive pilot are given. The first, simply called the Adaptive Control Model (ACM) of a pilot, includes a parameter estimation algorithm for the parameters of the aircraft and an adaptation scheme based on the root locus of the poles of the pilot-controlled aircraft. The second, called the Optimal Control Model of the pilot (OCM), includes an adaptation algorithm and an optimal control algorithm. These computer simulations were developed as a part of the ongoing research program in pilot model simulation supported by NASA Lewis from April 1, 1985 to August 30, 1986 under NASA Grant NAG 3-606 and from September 1, 1986 through November 30, 1988 under NASA Grant NAG 3-729. Once installed, these pilot models permitted the computer simulation of the pilot model to close all of the control loops normally closed by a pilot actually manipulating the control variables. The current version has permitted a baseline comparison of various qualitative and quantitative performance indices for propulsion control, the control loops, and the workload on the pilot. Actual data for an aircraft flown by a human pilot, furnished by NASA, were compared to the outputs furnished by the computerized pilot and found to be favorable.
Accelerating simulation for the multiple-point statistics algorithm using vector quantization
NASA Astrophysics Data System (ADS)
Zuo, Chen; Pan, Zhibin; Liang, Hao
2018-03-01
Multiple-point statistics (MPS) is a prominent algorithm for simulating categorical variables based on a sequential simulation procedure. Assuming training images (TIs) as prior conceptual models, MPS extracts patterns from TIs using a template and records their occurrences in a database. However, complex patterns increase the size of the database and require considerable time to retrieve the desired elements. In order to speed up simulation and improve simulation quality over state-of-the-art MPS methods, we propose an accelerated simulation method for MPS based on vector quantization (VQ), called VQ-MPS. First, a variable representation is presented to make categorical variables applicable for vector quantization. Second, we adopt a tree-structured VQ to compress the database so that stationary simulations are realized. Finally, a transformed template and classified VQ are used to address nonstationarity. A two-dimensional (2D) stationary channelized reservoir image is used to validate the proposed VQ-MPS. In comparison with several existing MPS programs, our method exhibits significantly better performance in terms of computational time, pattern reproduction, and spatial uncertainty. Further demonstrations consist of a 2D four-facies simulation, two 2D nonstationary channel simulations, and a three-dimensional (3D) rock simulation. The results reveal that our proposed method is also capable of handling multifacies, nonstationary, and 3D simulations based on 2D TIs.
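As a toy illustration of the compression idea in the abstract above (a flat k-means codebook rather than the paper's tree-structured or classified VQ, and with hypothetical function names), a pattern database can be replaced by a small set of codewords, with retrieval matching only the informed cells of a data event:

```python
import numpy as np

def build_codebook(patterns, n_codes, n_iter=10, seed=0):
    """Compress a pattern database with plain k-means vector quantization.
    Each codeword stands in for many similar training-image patterns."""
    rng = np.random.default_rng(seed)
    codes = patterns[rng.choice(len(patterns), n_codes, replace=False)].astype(float)
    for _ in range(n_iter):
        # assign every pattern to its nearest codeword
        d = ((patterns[:, None, :] - codes[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for k in range(n_codes):
            members = patterns[labels == k]
            if len(members):
                codes[k] = members.mean(axis=0)  # recenter codeword
    return codes

def retrieve(codes, query):
    """Return the codeword closest to a partially informed data event;
    NaN cells (not yet simulated) are ignored, as in sequential simulation."""
    mask = ~np.isnan(query)
    d = ((codes[:, mask] - query[mask]) ** 2).sum(axis=1)
    return codes[d.argmin()]
```

Retrieval against a codebook of size K costs O(K) per data event instead of scanning the full database, which is the source of the speed-up the paper pursues (and which its tree-structured variant reduces further, to roughly O(log K)).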
Multipole Algorithms for Molecular Dynamics Simulation on High Performance Computers.
NASA Astrophysics Data System (ADS)
Elliott, William Dewey
1995-01-01
A fundamental problem in modeling large molecular systems with molecular dynamics (MD) simulations is the underlying N-body problem of computing the interactions between all pairs of N atoms. The simplest algorithm to compute pair-wise atomic interactions scales in runtime O(N^2), making it impractical for interesting biomolecular systems, which can contain millions of atoms. Recently, several algorithms have become available that solve the N-body problem by computing the effects of all pair-wise interactions while scaling in runtime less than O(N^2). One algorithm, which scales O(N) for a uniform distribution of particles, is called the Greengard-Rokhlin Fast Multipole Algorithm (FMA). This work describes an FMA-like algorithm called the Molecular Dynamics Multipole Algorithm (MDMA). The algorithm contains several features that are new to N-body algorithms. MDMA uses new, efficient series expansion equations to compute general 1/r^n potentials to arbitrary accuracy. In particular, the 1/r Coulomb potential and the 1/r^6 portion of the Lennard-Jones potential are implemented. The new equations are based on multivariate Taylor series expansions. In addition, MDMA uses a cell-to-cell interaction region of cells that is closely tied to worst case error bounds. The worst case error bounds for MDMA are derived in this work also. These bounds apply to other multipole algorithms as well. Several implementation enhancements are described which apply to MDMA as well as other N-body algorithms such as FMA and tree codes. The mathematics of the cell-to-cell interactions are converted to the Fourier domain for reduced operation count and faster computation. A relative indexing scheme was devised to locate cells in the interaction region which allows efficient pre-computation of redundant information and prestorage of much of the cell-to-cell interaction.
Also, MDMA was integrated into the MD program SIgMA to demonstrate the performance of the program over several simulation timesteps. One MD application described here highlights the utility of including long range contributions to Lennard-Jones potential in constant pressure simulations. Another application shows the time dependence of long range forces in a multiple time step MD simulation.
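The far-field idea behind multipole methods such as FMA and MDMA can be illustrated at lowest order: a cluster of charges is replaced by its monopole and dipole moments about the cell center, so the potential at a distant point costs O(1) instead of one term per source. This is a sketch of the general principle, not MDMA's actual multivariate Taylor machinery:

```python
import math

def direct_potential(target, sources):
    """Exact pairwise sum of q/r at a target point; sources are (x, y, z, q)."""
    return sum(q / math.dist(target, (x, y, z)) for (x, y, z, q) in sources)

def far_field_potential(target, sources, center):
    """Monopole + dipole expansion about the cluster center: the lowest-order
    version of the per-cell series a multipole algorithm precomputes."""
    Q = sum(s[3] for s in sources)                                     # total charge
    D = [sum(s[3] * (s[i] - center[i]) for s in sources) for i in range(3)]
    r = [target[i] - center[i] for i in range(3)]
    rn = math.dist(target, center)
    # phi ~ Q/r + (D . r)/r^3, with error falling off as (cluster size / r)^2
    return Q / rn + sum(D[i] * r[i] for i in range(3)) / rn ** 3
```

For a cluster of radius a evaluated at distance r, the truncation error scales as (a/r)^2, which is why multipole codes tie their interaction regions to worst-case error bounds as the abstract describes.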
NASA Astrophysics Data System (ADS)
Chun, Tae Yoon; Lee, Jae Young; Park, Jin Bae; Choi, Yoon Ho
2018-06-01
In this paper, we propose two multirate generalised policy iteration (GPI) algorithms applied to discrete-time linear quadratic regulation problems. The proposed algorithms are extensions of the existing GPI algorithm that consists of the approximate policy evaluation and policy improvement steps. The two proposed schemes, named heuristic dynamic programming (HDP) and dual HDP (DHP), based on multirate GPI, use multi-step estimation (M-step Bellman equation) at the approximate policy evaluation step for estimating the value function and its gradient, called the costate, respectively. Then, we show that these two methods with the same update horizon can be considered equivalent in the iteration domain. Furthermore, monotonically increasing and decreasing convergences, the so-called value iteration (VI)-mode and policy iteration (PI)-mode convergences, are proved to hold for the proposed multirate GPIs. General convergence properties in terms of eigenvalues are also studied. The data-driven online implementation methods for the proposed HDP and DHP are demonstrated and, finally, we present the results of numerical simulations performed to verify the effectiveness of the proposed methods.
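For intuition, the PI-mode limit of such schemes on a known model reduces to classical policy iteration for discrete-time LQR: an exact policy evaluation (a Lyapunov fixed point for the cost matrix P) followed by a greedy policy improvement. A minimal model-based sketch under that assumption, not the paper's data-driven multirate variant:

```python
import numpy as np

def policy_iteration_lqr(A, B, Q, R, K0, n_sweeps=30):
    """Policy iteration for x' = Ax + Bu with stage cost x'Qx + u'Ru.
    K0 must be stabilizing; returns the converged gain K and cost matrix P."""
    K = K0
    for _ in range(n_sweeps):
        Acl = A - B @ K
        # policy evaluation: fixed point of P = Q + K'RK + Acl' P Acl
        P = np.zeros_like(Q)
        for _ in range(500):
            P = Q + K.T @ R @ K + Acl.T @ P @ Acl
        # policy improvement: greedy gain for the evaluated cost
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    return K, P
```

At convergence P satisfies the discrete algebraic Riccati equation, which is the monotone PI-mode limit point the paper's eigenvalue analysis concerns; the VI-mode corresponds to truncating the inner evaluation loop to one step.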
49 CFR 198.39 - Qualifications for operation of one-call notification system.
Code of Federal Regulations, 2010 CFR
2010-10-01
49 Transportation 3: PIPELINE SAFETY REGULATIONS FOR GRANTS TO AID STATE PIPELINE SAFETY PROGRAMS; Adoption of One-Call Damage Prevention Program; § 198.39 Qualifications for operation of one-call notification system. A one-call...
PyCOOL — A Cosmological Object-Oriented Lattice code written in Python
NASA Astrophysics Data System (ADS)
Sainio, J.
2012-04-01
There are a number of different phenomena in the early universe that have to be studied numerically with lattice simulations. This paper presents a graphics processing unit (GPU) accelerated Python program called PyCOOL that solves the evolution of scalar fields in a lattice with very precise symplectic integrators. The program has been written with the intention to hit a sweet spot of speed, accuracy and user friendliness. This has been achieved by using the Python language with the PyCUDA interface to make a program that is easy to adapt to different scalar field models. In this paper we derive the symplectic dynamics that govern the evolution of the system and then present the implementation of the program in Python and PyCUDA. The functionality of the program is tested in a chaotic inflation preheating model, a single field oscillon case and in a supersymmetric curvaton model which leads to Q-ball production. We have also compared the performance of a consumer graphics card to a professional Tesla compute card in these simulations. We find that the program is not only accurate but also very fast. To further increase the usefulness of the program we have equipped it with numerous post-processing functions that provide useful information about the cosmological model. These include various spectra and statistics of the fields. The program can be additionally used to calculate the generated curvature perturbation. The program is publicly available under GNU General Public License at https://github.com/jtksai/PyCOOL. Some additional information can be found from http://www.physics.utu.fi/tiedostot/theory/particlecosmology/pycool/.
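The symplectic integration the abstract emphasizes can be illustrated on a single degree of freedom: the kick-drift-kick (leapfrog) splitting bounds the energy error over arbitrarily long runs, which is why lattice field evolvers favour it. This is a one-oscillator sketch of the idea, not PyCOOL's higher-order GPU lattice scheme:

```python
def leapfrog(phi, p, force, dt, n_steps):
    """Kick-drift-kick symplectic integrator for a field value phi and its
    conjugate momentum p, given force(phi) = -dV/dphi."""
    for _ in range(n_steps):
        p += 0.5 * dt * force(phi)   # half kick
        phi += dt * p                # drift
        p += 0.5 * dt * force(phi)   # half kick
    return phi, p
```

For a harmonic potential (force = -phi) the energy 0.5*(p**2 + phi**2) oscillates within O(dt^2) of its initial value instead of drifting, in contrast to non-symplectic schemes such as forward Euler.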
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sainio, J., E-mail: jani.sainio@utu.fi; Department of Physics and Astronomy, University of Turku, FI-20014 Turku
A fuzzy call admission control scheme in wireless networks
NASA Astrophysics Data System (ADS)
Ma, Yufeng; Gong, Shenguang; Hu, Xiulin; Zhang, Yunyu
2007-11-01
Scarcity of the spectrum resource and mobility of users make quality of service (QoS) provision a critical issue in wireless networks. This paper presents a fuzzy call admission control scheme to meet QoS requirements. A performance measure is formed as a weighted linear function of the new call and handoff call blocking probabilities. Simulations compare the proposed fuzzy scheme with an adaptive channel reservation scheme. The results show that the fuzzy scheme is more robust in terms of the average blocking criterion.
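The flavour of a fuzzy admission decision can be sketched with triangular membership functions and a tiny rule base. The rules and thresholds below are illustrative assumptions, not the paper's rule base; the common design goal is to block new calls more aggressively when load and handoff traffic are high, protecting handoff calls:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def admit_new_call(channel_load, handoff_rate):
    """Toy Mamdani-style admission decision on normalized inputs in [0, 1]."""
    low_load  = tri(channel_load, -0.5, 0.0, 0.6)
    high_load = tri(channel_load,  0.4, 1.0, 1.5)
    low_ho    = tri(handoff_rate, -0.5, 0.0, 0.6)
    high_ho   = tri(handoff_rate,  0.4, 1.0, 1.5)
    # rule strengths: min acts as fuzzy AND, max as fuzzy OR
    accept = max(low_load, min(low_ho, 1.0 - high_load))
    reject = min(high_load, high_ho)
    return accept >= reject
```

Handoff calls would be admitted under a laxer rule set, which is how such schemes trade new-call blocking against handoff blocking in the weighted performance measure.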
DIVWAG Model Documentation. Volume II. Programmer/Analyst Manual. Part 3. Chapter 9 Through 12.
1976-07-01
entered through a routine, NAM2, that calls the segment controlling routine NBARAS. (4) Segment 3, controlled by the routine NFIRE, simulates round...nuclear fire, NAM calls in sequence the routines NFIRE (segment 3), ASUNIT (segment 2), SASSMT (segment 4), and NFIRE (segment 3). These calls simulate...this is a call to NFIRE (ISEG equals one or two), control goes to block L2. (2) Block 2. If this is to assess a unit passing through a nuclear barrier
ETARA PC version 3.3 user's guide: Reliability, availability, maintainability simulation model
NASA Technical Reports Server (NTRS)
Hoffman, David J.; Viterna, Larry A.
1991-01-01
A user's manual is presented for an interactive, menu-driven, personal-computer-based Monte Carlo reliability, availability, and maintainability simulation program called Event Time Availability Reliability (ETARA). Given a reliability block diagram representation of a system, ETARA simulates the behavior of the system over a specified period of time, using Monte Carlo methods to generate block failure and repair intervals as a function of exponential and/or Weibull distributions. Availability parameters such as equivalent availability, state availability (percentage of time at a particular output state capability), continuous state duration, and number of state occurrences can be calculated. Initial spares allotment and spares replenishment on a resupply cycle can be simulated. Block failures are tabulated both individually and by block type, as are total downtime, repair time, and time waiting for spares. In addition, maintenance man-hours per year and system reliability, with or without repair, at or above a particular output capability can be calculated over a cumulative period of time or at specific points in time.
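The sampling at the heart of such a tool is easy to sketch for a single repairable block with exponential failure and repair times; ETARA additionally supports Weibull draws, multi-block diagrams, and spares logic. A minimal single-block estimate of availability over a mission horizon:

```python
import random

def simulate_availability(mtbf, mttr, horizon, n_runs=2000, seed=1):
    """Monte Carlo estimate of availability for one repairable block that
    starts in the up state, with exponential failure and repair intervals."""
    rng = random.Random(seed)
    up_total = 0.0
    for _ in range(n_runs):
        t, up = 0.0, 0.0
        while t < horizon:
            ttf = rng.expovariate(1.0 / mtbf)      # draw time to failure
            up += min(ttf, horizon - t)            # credit uptime within horizon
            t += ttf
            if t >= horizon:
                break
            t += rng.expovariate(1.0 / mttr)       # draw repair interval
        up_total += up
    return up_total / (n_runs * horizon)
```

For long horizons the estimate approaches the steady-state value MTBF / (MTBF + MTTR), which gives a quick sanity check on any such simulator.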
Modeling structural change in spatial system dynamics: A Daisyworld example.
Neuwirth, C; Peck, A; Simonović, S P
2015-03-01
System dynamics (SD) is an effective approach for helping reveal the temporal behavior of complex systems. Although there have been recent developments in expanding SD to include systems' spatial dependencies, most applications have been restricted to the simulation of diffusion processes; this is especially true for models on structural change (e.g. LULC modeling). To address this shortcoming, a Python program is proposed to tightly couple SD software to a Geographic Information System (GIS). The approach provides the required capacities for handling bidirectional and synchronized interactions of operations between SD and GIS. In order to illustrate the concept and the techniques proposed for simulating structural changes, a fictitious environment called Daisyworld has been recreated in a spatial system dynamics (SSD) environment. The comparison of spatial and non-spatial simulations emphasizes the importance of considering spatio-temporal feedbacks. Finally, practical applications of structural change models in agriculture and disaster management are proposed.
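For reference, the non-spatial model that the SSD version extends can be written in a few lines; the constants follow the standard Watson and Lovelock (1983) parameterisation. This is an illustrative sketch, not the authors' coupled SD-GIS code:

```python
def daisyworld_step(aw, ab, lum, dt=0.05):
    """One Euler step of classic non-spatial Daisyworld: aw and ab are the
    area fractions of white and black daisies, lum the relative luminosity."""
    S, sigma, q, k = 917.0, 5.67e-8, 2.06e9, 3.265e-3   # standard constants
    T_opt, death = 295.5, 0.3
    bare = 1.0 - aw - ab
    A = 0.5 * bare + 0.75 * aw + 0.25 * ab              # planetary albedo
    Te4 = S * lum * (1.0 - A) / sigma                   # effective temperature^4
    Tw = (q * (A - 0.75) + Te4) ** 0.25                 # local temp over white daisies
    Tb = (q * (A - 0.25) + Te4) ** 0.25                 # local temp over black daisies
    bw = max(0.0, 1.0 - k * (T_opt - Tw) ** 2)          # parabolic growth rates
    bb = max(0.0, 1.0 - k * (T_opt - Tb) ** 2)
    aw += dt * aw * (bare * bw - death)
    ab += dt * ab * (bare * bb - death)
    return aw, ab, Te4 ** 0.25
```

Iterating this step shows the hallmark behaviour the paper's spatial version probes further: the daisy populations settle to a coexistence state that regulates the planetary temperature near the growth optimum across a range of luminosities.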
NASA Technical Reports Server (NTRS)
Chawner, David M.; Gomez, Ray J.
2010-01-01
In the Applied Aerosciences and CFD branch at Johnson Space Center, computational simulations are run that face many challenges, two of which are the ability to customize software for specialized needs and the need to run simulations as fast as possible. There are many different tools used for running these simulations, and each one has its own pros and cons. Once these simulations are run, there needs to be software capable of visualizing the results in an appealing manner. Some of this software is called open source, meaning that anyone can edit the source code, make modifications, and distribute it to all other users in a future release. This is very useful, especially in this branch, where many different tools are being used. File readers can be written to load any file format into a program, to ease the bridging from one tool to another. Programming such a reader requires knowledge of the file format being read as well as the equations necessary to obtain the derived values after loading. When these CFD simulations are run, extremely large files are loaded and many values are calculated. These simulations usually take a few hours to complete, even on the fastest machines. Graphics processing units (GPUs) are typically used to render graphics for computers; however, in recent years, GPUs have been used for more generic applications because of the speed of these processors. Applications run on GPUs have been known to run up to forty times faster than they would on normal central processing units (CPUs). If these CFD programs are extended to run on GPUs, the amount of time they require to complete would be much less. This would allow more simulations to be run in the same amount of time and possibly permit more complex computations.
New York area and worldwide: call-in radio program on HIV.
1999-07-16
Treatment activist Jules Levin, founder of the National AIDS Treatment Advocacy Group, has begun a weekly radio program called "Living Well with HIV". Listeners can call in with questions for experts featured on the show. Programs on hepatitis and AIDS have already been scheduled. Contact information is provided.
System IDentification Programs for AirCraft (SIDPAC)
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
2002-01-01
A collection of computer programs for aircraft system identification is described and demonstrated. The programs, collectively called System IDentification Programs for AirCraft, or SIDPAC, were developed in MATLAB as m-file functions. SIDPAC has been used successfully at NASA Langley Research Center with data from many different flight test programs and wind tunnel experiments. SIDPAC includes routines for experiment design, data conditioning, data compatibility analysis, model structure determination, equation-error and output-error parameter estimation in both the time and frequency domains, real-time and recursive parameter estimation, low order equivalent system identification, estimated parameter error calculation, linear and nonlinear simulation, plotting, and 3-D visualization. An overview of SIDPAC capabilities is provided, along with a demonstration of the use of SIDPAC with real flight test data from the NASA Glenn Twin Otter aircraft. The SIDPAC software is available without charge to U.S. citizens by request to the author, contingent on the requestor completing a NASA software usage agreement.
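Of the routines listed, equation-error parameter estimation is the simplest to sketch: a measured state derivative is regressed on candidate regressors by ordinary least squares. The regressor names below are illustrative; SIDPAC's MATLAB implementation adds data conditioning, model structure determination, and parameter error bounds:

```python
import numpy as np

def equation_error_fit(X, z):
    """Least-squares (equation-error) parameter estimation: find theta
    minimizing ||X theta - z||, where z holds a measured state derivative
    (e.g. pitch acceleration) and X the regressors (e.g. alpha, q, delta_e)."""
    theta, *_ = np.linalg.lstsq(X, z, rcond=None)
    return theta
```

On noise-free synthetic data the true stability and control derivatives are recovered exactly, which is the standard first check before moving to output-error methods on real flight data.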
Sci—Fri PM: Topics — 05: Experience with linac simulation software in a teaching environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlone, Marco; Harnett, Nicole; Jaffray, David
Medical linear accelerator education is usually restricted to use of academic textbooks and supervised access to accelerators. To facilitate the learning process, simulation software was developed to reproduce the effect of medical linear accelerator beam adjustments on resulting clinical photon beams. The purpose of this report is to briefly describe the method of operation of the software as well as the initial experience with it in a teaching environment. To first and higher orders, all components of medical linear accelerators can be described by analytical solutions. When appropriate calibrations are applied, these analytical solutions can accurately simulate the performance of all linear accelerator sub-components. Grouped together, an overall medical linear accelerator model can be constructed. Fifteen expressions in total were coded using MATLAB v 7.14. The program was called SIMAC. The SIMAC program was used in an accelerator technology course offered at our institution; 14 delegates attended the course. The professional breakdown of the participants was: 5 physics residents, 3 accelerator technologists, 4 regulators and 1 physics associate. The course consisted of didactic lectures supported by labs using SIMAC. At the conclusion of the course, eight of thirteen delegates were able to successfully perform advanced beam adjustments after two days of theory and use of the linac simulator program. We suggest that this demonstrates good proficiency in understanding of the accelerator physics, which we hope will translate to a better ability to understand real-world beam adjustments on a functioning medical linear accelerator.
Intelligent controller of novel design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou Qi Jian; Bai Jian Kuo
1983-01-01
This paper presents the authors' attempt to combine control engineering principles with human intelligence to form a new control algorithm. The hybrid system thus formed is both analog and logical in its actions and is called the intelligent controller (IC). With the help of cybernetics principles, the operator's intelligent control actions are programmed into the controller, and the system is thus taught to act like an intelligent being within the prescribed range. Remarkable results were obtained from experiments conducted on an electronic model simulating the above-mentioned system. Stability studies and system analysis are presented. 12 references.
NASA Astrophysics Data System (ADS)
Johnson, Sylvester, IV
A CAE (Computer Aided Engineering) tool called SEEL (Simulation of Electron Energy Loss) is described in detail. SEEL simulates in any material the energy loss and trajectories of electrons in the complex, multilayered nanostructures typical of ULSI, at beam energies from 1 to 50 keV. Structures and materials are defined in the input file rather than in the source code of the program, for which flowcharts are included in addition to an explanation of the algorithms implemented. Satisfactory comparisons of simulated with experimental results are made of both secondary electron (SE) and backscattered electron (BSE) linescans across an array of MOS gate structures capped by rough oxide. Many other comparisons are made. The effects of varying line edge slopes on SE linescan peak shape are simulated and analyzed. A data library containing the simulated variation of the FWHM, peak height, and peak location with slope for different materials, line heights or trench depths, widths, beam energies, and nominal diameters could be used to find the edge location relative to the peak for improvement of the accuracy of linewidth measurement algorithms. An investigation indicates that the use of such a library would be complicated by the effect of surface roughness on the SE signal at the edge of a feature. SEEL can be used as the first module in a series of programs that simulate energy deposition in resist structures and correct the exposure of a circuit pattern. Pixel by pixel convolution for prediction of the proximity effect is time-consuming. Another method of proximity effect prediction based on the reciprocity of the RED is described. Such programs could be used to reduce the number of iterations in the lab required to optimize resist structures and exposure parameters. 
For both smooth and rough interfaces between a bottom layer of PMMA in a multilayer resist structure and a W film, the simulated exposure contrast declines from that with an oxide film beneath the structure. A comparison of Auger peak to background ratios resulting from simulation of smooth and rough surfaces indicates that roughening of an Al surface on a small scale could result in a smaller ratio.
Yule, Steven; Parker, Sarah Henrickson; Wilkinson, Jill; McKinley, Aileen; MacDonald, Jamie; Neill, Adrian; McAdam, Tim
2015-01-01
To investigate the effect of coaching on non-technical skills and performance during laparoscopic cholecystectomy in a simulated operating room (OR). Non-technical skills (situation awareness, decision making, teamwork, and leadership) underpin technical ability and are critical to the success of operations and the safety of patients in the OR. The rate of developing assessment tools in this area has outpaced development of workable interventions to improve non-technical skills in surgical training and beyond. A randomized trial was conducted with senior surgical residents (n = 16). Participants were randomized to receive either non-technical skills coaching (intervention) or to self-reflect (control) after each of 5 simulated operations. Coaching was based on the Non-Technical Skills For Surgeons (NOTSS) behavior observation system. Surgeon-coaches trained in this method coached participants in the intervention group for 10 minutes after each simulation. Primary outcome measure was non-technical skills, assessed from video by a surgeon using the NOTSS system. Secondary outcomes were time to call for help during bleeding, operative time, and path length of laparoscopic instruments. Non-technical skills improved in the intervention group from scenario 1 to scenario 5 compared with those in the control group (p = 0.04). The intervention group was faster to call for help when faced with unstoppable bleeding in the final scenario (no. 5; p = 0.03). Coaching improved residents' non-technical skills in the simulated OR compared with those in the control group. Important next steps are to implement non-technical skills coaching in the real OR and assess effect on clinically important process measures and patient outcomes. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
A high-fidelity satellite ephemeris program for Earth satellites in eccentric orbits
NASA Technical Reports Server (NTRS)
Simmons, David R.
1990-01-01
A mission-planning program called the Analytic Satellite Ephemeris Program (ASEP) produces projected data for orbits that remain fairly close to the Earth. ASEP does not take into account lunar and solar perturbations. These perturbations are accounted for in another program, called GRAVE, which incorporates more flexible means of input for initial data, provides additional kinds of output information, and makes use of structured programming techniques to make the program more understandable and reliable. GRAVE was revised, and a new program called ORBIT was developed. It is divided into three major phases: initialization, integration, and output. Results of the program development are presented.
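The integration phase of such a program, stripped of the lunar and solar perturbations GRAVE adds, is just numerical propagation of the two-body equations. A minimal RK4 sketch (an illustration of the phase, not ORBIT's actual integrator):

```python
import math

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def accel(r):
    """Two-body point-mass acceleration, -mu * r / |r|^3."""
    d = math.sqrt(r[0] ** 2 + r[1] ** 2 + r[2] ** 2)
    return [-MU * x / d ** 3 for x in r]

def rk4_step(r, v, dt):
    """One classical Runge-Kutta step of the state [r, v]."""
    def deriv(s):
        return s[3:] + accel(s[:3])  # d(position)/dt = v, d(velocity)/dt = a
    s = list(r) + list(v)
    k1 = deriv(s)
    k2 = deriv([s[i] + 0.5 * dt * k1[i] for i in range(6)])
    k3 = deriv([s[i] + 0.5 * dt * k2[i] for i in range(6)])
    k4 = deriv([s[i] + dt * k3[i] for i in range(6)])
    s = [s[i] + dt * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i]) / 6 for i in range(6)]
    return s[:3], s[3:]
```

Propagating a circular orbit and checking that the radius stays constant is the standard smoke test before perturbations, eccentric cases, and output formatting are layered on.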
Simulation of major space particles toward selected materials in a near-equatorial low earth orbit
NASA Astrophysics Data System (ADS)
Suparta, Wayan; Zulkeple, Siti Katrina
2017-05-01
A low earth orbit near the equator (LEO-NEqO) is exposed to the highest energies from galactic cosmic rays (GCR) and from trapped protons with a wide range of energies. Moreover, GCR fluxes were highest in 2009 to 2010, when communication with the RazakSAT-1 satellite was believed to have been lost. Hence, this study aimed to determine the influence of the space environment on the operation of LEO-NEqO satellites by investigating the interaction of major space particles with satellite materials. The space environment considered comprised GCR protons and trapped protons. Their fluxes were obtained from the Space Environment Information System (SPENVIS) and their tracks were simulated through three materials using a simulation program called Geometry and Tracking (Geant4). The materials included aluminum (Al), gallium arsenide (GaAs) and silicon (Si). The total ionizing dose (TID) and non-ionizing energy loss (NIEL) dose were then calculated for a three-year period. Simulations showed that GCR traveled along longer tracks and produced more secondary radiation than trapped protons. Al received the lowest total dose, while GaAs was shown to be more susceptible to GCR than Si. However, trapped protons contributed the most to spacecraft doses, with Si receiving the highest doses. Finally, the comparison between two Geant4 programs revealed that the estimated doses differed by <18%.
2011-01-01
Background: This observational study assessed the relation between mass media campaigns and service volume for a statewide tobacco cessation quitline and stand-alone web-based cessation program. Methods: Multivariate regression analysis was used to identify how weekly calls to a cessation quitline and weekly registrations to a web-based cessation program are related to levels of broadcast media, media campaigns, and media types, controlling for the impact of external and earned media events. Results: There was a positive relation between weekly broadcast targeted rating points and the number of weekly calls to a cessation quitline and the number of weekly registrations to a web-based cessation program. Additionally, print secondhand smoke ads and online cessation ads were positively related to weekly quitline calls. Television and radio cessation ads and radio smoke-free law ads were positively related to web program registration levels. There was a positive relation between the number of web registrations and the number of calls to the cessation quitline, with increases in registrations to the web in 1 week corresponding to increases in calls to the quitline in the subsequent week. Web program registration levels were more highly influenced by earned media and other external events than were quitline call volumes. Conclusion: Overall, broadcast advertising had a greater impact on registrations for the web program than calls to the quitline. Furthermore, registrations for the web program influenced calls to the quitline. These two findings suggest the evolving roles of web-based cessation programs and Internet-use practices should be considered when creating cessation programs and media campaigns to promote them.
Additionally, because different types of media and campaigns were positively associated with calls to the quitline and web registrations, developing mass media campaigns that offer a variety of messages and communicate through different types of media to motivate tobacco users to seek services appears important to reach tobacco users. Further research is needed to better understand the complexities and opportunities involved in simultaneous promotion of quitline and web-based cessation services. PMID:22177237
Schillo, Barbara A; Mowery, Andrea; Greenseid, Lija O; Luxenberg, Michael G; Zieffler, Andrew; Christenson, Matthew; Boyle, Raymond G
2011-12-16
Exploring Scholarship and the Emergency Medicine Educator: A Workforce Study.
Jordan, Jaime; Coates, Wendy C; Clarke, Samuel; Runde, Daniel P; Fowlkes, Emilie; Kurth, Jacqueline; Yarris, Lalena M
2017-01-01
Recent literature calls for initiatives to improve the quality of education studies and support faculty in approaching educational problems in a scholarly manner. Understanding the emergency medicine (EM) educator workforce is a crucial precursor to developing policies to support educators and promote education scholarship in EM. This study aims to illuminate the current workforce model for the academic EM educator. Program leadership at EM training programs completed an online survey consisting of multiple choice, completion, and free-response type items. We calculated and reported descriptive statistics. 112 programs participated. Mean number of core faculty/program: 16.02 ± 7.83 [14.53-17.5]. Mean number of faculty full-time equivalents (FTEs)/program dedicated to education is 6.92 ± 4.92 [5.87-7.98], including (mean FTE): Vice chair for education (0.25); director of medical education (0.13); education fellowship director (0.2); residency program director (0.83); associate residency director (0.94); assistant residency director (1.1); medical student clerkship director (0.8); assistant/associate clerkship director (0.28); simulation fellowship director (0.11); simulation director (0.42); director of faculty development (0.13). Mean number of FTEs/program for education administrative support is 2.34 ± 1.1 [2.13-2.61]. Determination of clinical hours varied; 38.75% of programs had personnel with education research expertise. Education faculty represent about 43% of the core faculty workforce. Many programs do not have the full spectrum of education leadership roles and educational faculty divide their time among multiple important academic roles. Clinical requirements vary. Many departments lack personnel with expertise in education research. This information may inform interventions to promote education scholarship.
Orzol, Leonard L.; McGrath, Timothy S.
1992-01-01
This report documents modifications to the U.S. Geological Survey modular, three-dimensional, finite-difference, ground-water flow model, commonly called MODFLOW, so that it can read and write files used by a geographic information system (GIS). The modified model program is called MODFLOWARC. Simulation programs such as MODFLOW generally require large amounts of input data and produce large amounts of output data. Viewing data graphically, generating head contours, and creating or editing model data arrays such as hydraulic conductivity are examples of tasks that currently are performed either by the use of independent software packages or by tedious manual editing, manipulating, and transferring of data. GIS programs are commonly used to facilitate preparation of the model input data and analysis of model output data; however, auxiliary programs are frequently required to translate data between programs. Data translations are required when different programs use different data formats. Thus, the user might use GIS techniques to create model input data, run a translation program to convert input data into a format compatible with the ground-water flow model, run the model, run a translation program to convert the model output into the correct format for GIS, and use GIS to display and analyze this output. MODFLOWARC avoids the two translation steps by transferring data directly to and from the ground-water flow model. This report documents the design and use of MODFLOWARC and includes instructions for data input/output of the Basic, Block-centered flow, River, Recharge, Well, Drain, Evapotranspiration, General-head boundary, and Streamflow-routing packages. Modifications to MODFLOW and the Streamflow-Routing package were minimized. Flow charts and computer-program code describe the modifications to the original computer codes for each of these packages. Appendix A contains a discussion of the operation of MODFLOWARC using a sample problem.
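The translation step that MODFLOWARC eliminates can be illustrated in a few lines of Python. The sketch below (file name, grid origin, cell size, and head values are all hypothetical) writes a 2-D model array in the ESRI ASCII grid format that a GIS can import:

```python
# Illustrative sketch of a model-to-GIS translation: export a 2-D array of
# simulated heads as an ESRI ASCII grid. All values here are made up.

def write_ascii_grid(path, array, xll=0.0, yll=0.0, cellsize=100.0,
                     nodata=-9999.0):
    """Write a row-major 2-D array in ESRI ASCII raster format."""
    nrows, ncols = len(array), len(array[0])
    with open(path, "w") as f:
        f.write(f"ncols {ncols}\n")
        f.write(f"nrows {nrows}\n")
        f.write(f"xllcorner {xll}\n")
        f.write(f"yllcorner {yll}\n")
        f.write(f"cellsize {cellsize}\n")
        f.write(f"NODATA_value {nodata}\n")
        for row in array:
            f.write(" ".join(f"{v:.3f}" for v in row) + "\n")

heads = [[10.2, 10.1],
         [10.0, 9.8]]          # toy 2x2 head array
write_ascii_grid("heads.asc", heads)
```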
Intelligent call admission control for multi-class services in mobile cellular networks
NASA Astrophysics Data System (ADS)
Ma, Yufeng; Hu, Xiulin; Zhang, Yunyu
2005-11-01
Scarcity of the spectrum resource and mobility of users make quality of service (QoS) provision a critical issue in mobile cellular networks. This paper presents a fuzzy call admission control scheme to meet the QoS requirement. A performance measure is formed as a weighted linear function of the new call and handoff call blocking probabilities of each service class. Simulations compare the proposed fuzzy scheme with complete sharing and guard channel policies. Simulation results show that the fuzzy scheme has more robust performance in terms of the average blocking criterion.
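The guard channel policy used here as a baseline can be sketched with a small Monte Carlo simulation. The parameters below (channel count, guard size, arrival and departure probabilities) are assumed for illustration, not taken from the paper:

```python
# Minimal sketch of the guard-channel admission policy: new calls are
# blocked once free channels drop below a guard threshold reserved for
# handoff calls. All numeric parameters are illustrative.

import random

def simulate(channels=10, guard=2, n_events=50_000,
             p_handoff=0.3, p_depart=0.15, seed=1):
    random.seed(seed)
    busy = 0
    new = handoff = new_blocked = handoff_blocked = 0
    for _ in range(n_events):
        # each busy channel finishes its call with probability p_depart
        busy -= sum(random.random() < p_depart for _ in range(busy))
        if random.random() < p_handoff:        # arrival is a handoff call
            handoff += 1
            if busy < channels:
                busy += 1
            else:
                handoff_blocked += 1
        else:                                   # arrival is a new call
            new += 1
            if busy < channels - guard:         # guard channels reserved
                busy += 1
            else:
                new_blocked += 1
    return new_blocked / new, handoff_blocked / handoff

pb_new, pb_handoff = simulate()
```

With guard channels reserved, the handoff blocking probability comes out lower than the new-call blocking probability, which is the trade-off fuzzy admission control tries to balance across service classes.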
Decision blocks: A tool for automating decision making in CLIPS
NASA Technical Reports Server (NTRS)
Eick, Christoph F.; Mehta, Nikhil N.
1991-01-01
The human capability of making complex decisions is one of the most fascinating facets of human intelligence, especially when vague, judgemental, default, or uncertain knowledge is involved. Unfortunately, most existing rule-based forward-chaining languages are not well suited to simulating this aspect of human intelligence, because they lack support for the approximate reasoning techniques needed for this task and lack specific constructs to facilitate the coding of frequently recurring decision blocks. To provide better support for the design and implementation of rule-based decision support systems, a language called BIRBAL, defined on top of CLIPS for the specification of decision blocks, is introduced. Empirical experiments comparing the length of a CLIPS program with the corresponding BIRBAL program for three different applications are surveyed. The results of these experiments suggest that for decision-making-intensive applications, a CLIPS program tends to be about three times longer than the corresponding BIRBAL program.
Rotordynamics on the PC: Transient Analysis With ARDS
NASA Technical Reports Server (NTRS)
Fleming, David P.
1997-01-01
Personal computers can now do many jobs that formerly required a large mainframe computer. An example is NASA Lewis Research Center's program Analysis of RotorDynamic Systems (ARDS), which uses the component mode synthesis method to analyze the dynamic motion of up to five rotating shafts. As originally written in the early 1980's, this program was considered large for the mainframe computers of the time. ARDS, which was written in Fortran 77, has been successfully ported to a 486 personal computer. Plots appear on the computer monitor via calls programmed for the original CALCOMP plotter; plots can also be output on a standard laser printer. The executable code, which uses the full array sizes of the mainframe version, easily fits on a high-density floppy disk. The program runs under DOS with an extended memory manager. In addition to transient analysis of blade loss, step turns, and base acceleration, with simulation of squeeze-film dampers and rubs, ARDS calculates natural frequencies and unbalance response.
Learning directed acyclic graphs from large-scale genomics data.
Nikolay, Fabio; Pesavento, Marius; Kritikos, George; Typas, Nassos
2017-09-20
In this paper, we consider the problem of learning the genetic interaction map, i.e., the topology of a directed acyclic graph (DAG) of genetic interactions, from noisy double-knockout (DK) data. Based on a set of well-established biological interaction models, we detect and classify the interactions between genes. We propose a novel linear integer optimization program called the Genetic-Interactions-Detector (GENIE) to identify the complex biological dependencies among genes and to compute the DAG topology that best matches the DK measurements. Furthermore, we extend the GENIE program by incorporating genetic interaction profile (GI-profile) data to further enhance the detection performance. In addition, we propose a sequential scalability technique for large sets of genes under study, in order to provide statistically significant results for real measurement data. Finally, we show via numerical simulations that the GENIE program and the GI-profile-data-extended GENIE (GI-GENIE) program clearly outperform conventional techniques, and we present real-data results for our proposed sequential scalability technique.
Beaked Whales Respond to Simulated and Actual Navy Sonar
2011-03-14
Acoustic exposure and behavioral reactions of beaked whales were measured during one controlled exposure each of simulated military sonar, killer whale calls, and band-limited noise. The beaked whales reacted to all three sound playbacks.
An Adaptive Instability Suppression Controls Method for Aircraft Gas Turbine Engine Combustors
NASA Technical Reports Server (NTRS)
Kopasakis, George; DeLaat, John C.; Chang, Clarence T.
2008-01-01
An adaptive controls method for instability suppression in gas turbine engine combustors has been developed and successfully tested with a realistic aircraft engine combustor rig. This testing was part of a program that demonstrated, for the first time, successful active combustor instability control in an aircraft gas turbine engine-like environment. The controls method is called Adaptive Sliding Phasor Averaged Control. Testing of the control method has been conducted in an experimental rig with different configurations designed to simulate combustors with instabilities of about 530 and 315 Hz. Results demonstrate the effectiveness of this method in suppressing combustor instabilities. In addition, a dramatic improvement in suppression of the instability was achieved by focusing control on the second harmonic of the instability. This is believed to be due to a phenomenon discovered and reported earlier, the so-called Intra-Harmonic Coupling. These results may have implications for future research in combustor instability control.
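The "phasor" such a method tracks can be illustrated with a single-bin discrete Fourier sum: the pressure signal is demodulated at the instability frequency to recover its amplitude and phase. The sample rate and synthetic signal below are made up; only the 530 Hz figure comes from the rig tests mentioned above:

```python
# Sketch of phasor extraction at a known instability frequency. The signal
# and sample rate are synthetic; this is not the control law itself, just
# the demodulation step that yields an amplitude/phase estimate.

import cmath
import math

def phasor(signal, fs, f0):
    """Complex phasor of `signal` at frequency f0 via a single-bin DFT."""
    n = len(signal)
    acc = sum(x * cmath.exp(-2j * math.pi * f0 * k / fs)
              for k, x in enumerate(signal))
    return 2 * acc / n   # scaled so |phasor| equals the component amplitude

fs = 8000.0   # sample rate, Hz (assumed)
f0 = 530.0    # instability frequency from the rig configuration
sig = [0.7 * math.sin(2 * math.pi * f0 * (k / fs) + 0.4)
       for k in range(8000)]   # one second of a 0.7-amplitude tone

p = phasor(sig, fs, f0)
amplitude, phase = abs(p), cmath.phase(p)   # amplitude ≈ 0.7
```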
Martin-Brennan, Cindy; Joshi, Jitendra
2003-12-01
Space life sciences research activities are reviewed for 2003. Many life sciences experiments were lost with the tragic loss of STS-107. Life sciences experiments continue to fly as small payloads to the International Space Station (ISS) via the Russian Progress vehicle. Health-related studies continued with the Martian Radiation Environment Experiment (MARIE) aboard the Odyssey spacecraft, collecting data on the radiation environment in Mars orbit. NASA Ames increased nanotechnology research in all areas, including fundamental biology, bioastronautics, life support systems, and homeland security. Plant research efforts continued at NASA Kennedy, testing candidate crops for ISS. Research included plant growth studies at different light intensities, varying carbon dioxide concentrations, and different growth media. Education and outreach efforts included development of a NASA/USDA program called Space Agriculture in the Classroom. Canada sponsored a project called Tomatosphere, with classrooms across North America exposing seeds to a simulated Mars environment for growth studies. NASA's Office of Biological and Physical Research released an updated strategic research plan.
NASA Astrophysics Data System (ADS)
Griesse-Nascimento, Sarah; Bridger, Joshua; Brown, Keith; Westervelt, Robert
2011-03-01
Interactive computer simulations increase students' understanding of difficult concepts and their ability to explain complex ideas. We created a module of eight interactive programs and accompanying lesson plans for teaching the fundamental concepts of Nuclear Magnetic Resonance (NMR) and Magnetic Resonance Imaging (MRI) that we call interactive NMR (iNMR). We begin with an analogy between nuclear spins and metronomes to start to build intuition about the dynamics of spins in a magnetic field. We continue to explain T1, T2, and pulse sequences with the metronome analogy. The final three programs are used to introduce and explain the Magnetic Resonance Switch, a recent diagnostic technique based on NMR. A modern relevant application is useful to generate interest in the topic and confidence in the students' ability to apply their knowledge. The iNMR module was incorporated into a high school AP physics class. In a preliminary evaluation of implementation, students expressed enthusiasm and demonstrated enhanced understanding of the material relative to the previous year. Funded by NSF PHY-0646094 grant.
NASA Technical Reports Server (NTRS)
Putnam, L. E.
1979-01-01
A Neumann solution for inviscid external flow was coupled to a modified Reshotko-Tucker integral boundary-layer technique, the control volume method of Presz for calculating flow in the separated region, and an inviscid one-dimensional solution for the jet exhaust flow in order to predict axisymmetric nozzle afterbody pressure distributions and drag. The viscous and inviscid flows are solved iteratively until convergence is obtained. A computer program implementing this procedure, called DONBOL, was written; a description of the program and a guide to its use are given. Comparisons of the predictions of this method with experiments show that the method accurately predicts the pressure distributions of boattail afterbodies which have the jet exhaust flow simulated by solid bodies. For nozzle configurations which have the jet exhaust simulated by high-pressure air, the present method significantly underpredicts the magnitude of nozzle pressure drag. This deficiency results because the method neglects the effects of jet plume entrainment. The method is limited to subsonic free-stream Mach numbers below that for which the flow over the body of revolution becomes sonic.
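The viscous-inviscid iteration pattern described here can be sketched abstractly. The two surrogate functions below stand in for the inviscid (Neumann) solve and the boundary-layer update; they are not DONBOL's actual equations, only a minimal fixed-point loop with the same structure:

```python
# Hedged sketch of viscous-inviscid coupling: alternate an "inviscid" solve
# and a "boundary-layer" update until the displacement correction converges.
# Both model functions are invented stand-ins chosen to be contractive.

def inviscid_pressure(delta):
    """Toy surrogate: surface pressure from a displacement thickness."""
    return 1.0 / (1.0 + delta)

def boundary_layer(pressure):
    """Toy surrogate: displacement thickness from a pressure level."""
    return 0.3 * pressure

def couple(tol=1e-10, max_iter=100):
    delta = 0.0
    for it in range(max_iter):
        p = inviscid_pressure(delta)       # inviscid pass
        new_delta = boundary_layer(p)      # viscous pass
        if abs(new_delta - delta) < tol:   # converged fixed point
            return p, new_delta, it
        delta = new_delta
    raise RuntimeError("viscous-inviscid iteration did not converge")

p, delta, iters = couple()
```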
NASA Technical Reports Server (NTRS)
Lopez, Isaac; Follen, Gregory J.; Gutierrez, Richard; Foster, Ian; Ginsburg, Brian; Larsson, Olle; Martin, Stuart; Tuecke, Steven; Woodford, David
2000-01-01
This paper describes a project to evaluate the feasibility of combining Grid and Numerical Propulsion System Simulation (NPSS) technologies, with a view to leveraging the numerous advantages of commodity technologies in a high-performance Grid environment. A team from the NASA Glenn Research Center and Argonne National Laboratory has been studying three problems: a desktop-controlled parameter study using Excel (Microsoft Corporation); a multicomponent application using ADPAC, NPSS, and a controller program; and an aviation safety application running about 100 jobs in near real time. The team has successfully demonstrated (1) a Common-Object-Request-Broker-Architecture- (CORBA-) to-Globus resource manager gateway that allows CORBA remote procedure calls to be used to control the submission and execution of programs on workstations and massively parallel computers, (2) a gateway from the CORBA Trader service to the Grid information service, and (3) a preliminary integration of CORBA and Grid security mechanisms. We have applied these technologies to two applications related to NPSS, namely a parameter study and a multicomponent simulation.
Numerical simulation of liquid-layer breakup on a moving wall due to an impinging jet
NASA Astrophysics Data System (ADS)
Yu, Taejong; Moon, Hojoon; You, Donghyun; Kim, Dokyun; Ovsyannikov, Andrey
2014-11-01
Jet wiping, which is a hydrodynamic method for controlling the liquid film thickness in coating processes, is constrained by a rather violent film instability called splashing. The instability is characterized by the ejection of droplets from the runback flow and results in an explosion of the film. The splashing phenomenon degrades the final coating quality. In the present research, a volume-of-fluid (VOF)-based method developed at Cascade Technologies is employed to simulate the air-liquid multiphase flow dynamics. The numerical method is based on an unstructured-grid unsplit geometric VOF scheme and guarantees strict conservation of mass in two-phase flow. The simulation results are compared with experimental measurements such as the liquid-film thickness before and after the jet wiping, and the wall pressure and shear stress distributions. The trajectories of liquid droplets entrained by the gas-jet operation are also qualitatively compared with experimental visualization. Physical phenomena observed during the liquid-layer breakup due to an impinging jet are characterized in order to develop ideas for controlling the liquid-layer instability and the resulting splash generation and propagation. Supported by the Grant NRF-2012R1A1A2003699, the Brain Korea 21+ program, POSCO, and the 2014 CTR Summer Program.
Solute and heat transport model of the Henry and Hilleke laboratory experiment
Langevin, C.D.; Dausman, A.M.; Sukop, M.C.
2010-01-01
SEAWAT is a coupled version of MODFLOW and MT3DMS designed to simulate variable-density ground water flow and solute transport. The most recent version of SEAWAT, called SEAWAT Version 4, includes new capabilities to represent simultaneous multispecies solute and heat transport. To test the new features in SEAWAT, the laboratory experiment of Henry and Hilleke (1972) was simulated. Henry and Hilleke used warm fresh water to recharge a large sand-filled glass tank. A cold salt water boundary was represented on one side. Adjustable heating pads were used to heat the bottom and left sides of the tank. In the laboratory experiment, Henry and Hilleke observed both salt water and fresh water flow systems separated by a narrow transition zone. After minor tuning of several input parameters with a parameter estimation program, results from the SEAWAT simulation show good agreement with the experiment. SEAWAT results suggest that heat loss to the room was more than expected by Henry and Hilleke, and that multiple thermal convection cells are the likely cause of the widened transition zone near the hot end of the tank. Other computer programs with similar capabilities may benefit from benchmark testing with the Henry and Hilleke laboratory experiment. Journal Compilation © 2009 National Ground Water Association.
Automated simulation as part of a design workstation
NASA Technical Reports Server (NTRS)
Cantwell, Elizabeth; Shenk, T.; Robinson, P.; Upadhye, R.
1990-01-01
A development project for a design workstation for advanced life-support systems (called the DAWN Project, for Design Assistant Workstation) incorporating qualitative simulation required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulation such that simulation capabilities are maximized without duplication; to produce design solutions to a system goal, the behavior of the system in both steady and perturbed states must be represented. The Qualitative Simulation Tool (QST); an expert-system-like model-building and simulation interface tool called ScratchPad (SP); and the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components are discussed.
A theoretical stochastic control framework for adapting radiotherapy to hypoxia
NASA Astrophysics Data System (ADS)
Saberian, Fatemeh; Ghate, Archis; Kim, Minsun
2016-10-01
Hypoxia, that is, insufficient oxygen partial pressure, is a known cause of reduced radiosensitivity in solid tumors, and especially in head-and-neck tumors. It is thus believed to adversely affect the outcome of fractionated radiotherapy. Oxygen partial pressure varies spatially and temporally over the treatment course and exhibits inter-patient and intra-tumor variation. Emerging advances in non-invasive functional imaging offer the future possibility of adapting radiotherapy plans to this uncertain spatiotemporal evolution of hypoxia over the treatment course. We study the potential benefits of such adaptive planning via a theoretical stochastic control framework using computer-simulated evolution of hypoxia on computer-generated test cases in head-and-neck cancer. The exact solution of the resulting control problem is computationally intractable. We develop an approximation algorithm, called certainty equivalent control, that calls for the solution of a sequence of convex programs over the treatment course; dose-volume constraints are handled using a simple constraint generation method. These convex programs are solved using an interior point algorithm with a logarithmic barrier via Newton’s method and backtracking line search. Convexity of various formulations in this paper is guaranteed by a sufficient condition on radiobiological tumor-response parameters. This condition is expected to hold for head-and-neck tumors and for other similarly responding tumors where the linear dose-response parameter is larger than the quadratic dose-response parameter. We perform numerical experiments on four test cases by using a first-order vector autoregressive process with exponential and rational-quadratic covariance functions from the spatiotemporal statistics literature to simulate the evolution of hypoxia. Our results suggest that dynamic planning could lead to a considerable improvement in the number of tumor cells remaining at the end of the treatment course. 
Through these simulations, we also gain insights into when and why dynamic planning is likely to yield the largest benefits.
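The inner machinery described above (a logarithmic barrier, Newton steps, a backtracking step to stay feasible, then a sharpened barrier) can be shown on a one-variable toy problem. This is a hedged sketch of the generic interior-point pattern, not the paper's radiotherapy planning model:

```python
# Log-barrier interior-point sketch on a toy problem:
#   minimize (x - 3)**2  subject to  x <= 1   (true optimum: x = 1).
# The barrier objective is f(x) = t*(x - 3)**2 - log(1 - x); we take damped
# Newton steps, backtrack to keep the iterate strictly feasible, and then
# multiply t by mu to sharpen the barrier.

def solve(t=1.0, mu=10.0, x=0.0, outers=12, tol=1e-12):
    for _ in range(outers):
        for _ in range(50):                        # Newton iterations
            g = 2 * t * (x - 3) + 1 / (1 - x)      # f'(x)
            h = 2 * t + 1 / (1 - x) ** 2           # f''(x) > 0 (convex)
            step = -g / h
            alpha = 1.0                            # backtracking, reduced
            while x + alpha * step >= 1:           # here to a feasibility
                alpha *= 0.5                       # check for brevity
            x += alpha * step
            if abs(alpha * step) < tol:            # Newton step converged
                break
        t *= mu                                    # sharpen the barrier
    return x

x_opt = solve()   # approaches the constrained optimum x = 1 from below
```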
Design for dependability: A simulation-based approach. Ph.D. Thesis, 1993
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.
1994-01-01
This research addresses issues in simulation-based system-level dependability analysis of fault-tolerant computer systems. The issues and difficulties of providing a general simulation-based approach for system-level analysis are discussed, and a methodology that addresses them is presented. The proposed methodology is designed to permit the study of a wide variety of architectures under various fault conditions. It permits detailed functional modeling of architectural features such as sparing policies, repair schemes, routing algorithms, and other fault-tolerant mechanisms, and it allows the execution of actual application software. One key benefit of this approach is that the behavior of a system under faults does not have to be pre-defined, as is normally done. Instead, a system can be simulated in detail and injected with faults to determine its failure modes. The thesis describes how object-oriented design is used to incorporate this methodology into a general-purpose design and fault injection package called DEPEND. A software model is presented that uses abstractions of application programs to study the behavior and effect of software on hardware faults in the early design stage, when actual code is not available. Finally, an acceleration technique that combines hierarchical simulation, time acceleration algorithms, and hybrid simulation to reduce simulation time is introduced.
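The idea of discovering failure modes by injection rather than pre-defining them can be shown with a toy model. The sketch below is not the DEPEND API; it is a minimal triple-modular-redundant component with random bit-flip injection:

```python
# Toy simulation-based fault injection: flip bits in a component model's
# state and observe which failure modes emerge, instead of assuming them.

import random

class TripleModularRedundant:
    """Simple component model: three replicas with majority voting."""
    def __init__(self, value):
        self.replicas = [value, value, value]
    def inject(self, replica, bit):
        self.replicas[replica] ^= (1 << bit)   # transient bit flip
    def read(self):
        a, b, c = self.replicas
        return a if a in (b, c) else b          # majority vote

random.seed(0)
failures = 0
for _ in range(1000):
    tmr = TripleModularRedundant(0b1010)
    tmr.inject(random.randrange(3), random.randrange(4))  # single fault
    if tmr.read() != 0b1010:
        failures += 1
# injection confirms every single fault is masked (failures == 0) ...

# ... while a second fault hitting the same bit in another replica defeats
# the voter -- a failure mode revealed by injection:
tmr = TripleModularRedundant(0b1010)
tmr.inject(0, 2)
tmr.inject(1, 2)
double_fault_value = tmr.read()   # voter returns the corrupted value
```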
A hardware-in-the-loop simulation program for ground-based radar
NASA Astrophysics Data System (ADS)
Lam, Eric P.; Black, Dennis W.; Ebisu, Jason S.; Magallon, Julianna
2011-06-01
A radar system created using an embedded computer system needs testing. The way to test an embedded computer system differs from the debugging approaches used on desktop computers. One way to test a radar system is to feed it artificial inputs and analyze the outputs of the radar. Often, not all of the building blocks of the radar system are available to test; this requires the engineer to test parts of the radar system using a "black box" approach. A common way to test software code in a desktop simulation is to use breakpoints so that it pauses after each cycle through its calculations, and the outputs are compared against the expected values. This requires the engineer to use valid test scenarios. We present a hardware-in-the-loop simulator that allows the embedded system to think it is operating with real-world inputs and outputs. From the embedded system's point of view, it is operating in real time. The hardware-in-the-loop simulation is based on our Desktop PC Simulation (PCS) testbed. In the past, PCS was used for ground-based radars. This embedded simulation, called Embedded PCS, allows rapid simulated evaluation of ground-based radar performance in a laboratory environment.
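The black-box pattern can be sketched in a few lines: the code under test sees only the same read/report calls it would use in the field, while the test bench supplies scripted inputs and captures outputs. The function names, threshold, and sample values below are hypothetical, not the Embedded PCS interface:

```python
# Conceptual hardware-in-the-loop sketch: the "embedded" processing loop is
# a black box driven through injected I/O callbacks, so the test bench can
# feed synthetic radar returns and record detections.

def radar_under_test(read_sample, report_detection, threshold=5.0):
    """Stand-in for the embedded processing loop under test."""
    while True:
        s = read_sample()          # would read hardware in the field
        if s is None:              # end of scripted scenario
            return
        if s > threshold:
            report_detection(s)    # would drive hardware in the field

# Test-bench side: scripted inputs in, captured outputs out.
samples = iter([1.0, 2.0, 9.5, 3.0, 7.2])
detections = []
radar_under_test(lambda: next(samples, None), detections.append)
# detections == [9.5, 7.2]
```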
Ruano, M V; Ribes, J; Seco, A; Ferrer, J
2011-01-01
This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables easy design of control systems and strategies applied to wastewater treatment plants (WWTPs). Although the control systems are developed and evaluated by simulation, this tool aims to facilitate direct implementation of the designed control system on the PC of the full-scale WWTP. The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the DSC incorporates an OPC (OLE for Process Control) server, which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant, with only a few modifications to improve the control performance. With the DSC tool, control system performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.
Bai, Qifeng; Shao, Yonghua; Pan, Dabo; Zhang, Yang; Liu, Huanxiang; Yao, Xiaojun
2014-01-01
We designed a program called MolGridCal that can be used to screen small molecule database in grid computing on basis of JPPF grid environment. Based on MolGridCal program, we proposed an integrated strategy for virtual screening and binding mode investigation by combining molecular docking, molecular dynamics (MD) simulations and free energy calculations. To test the effectiveness of MolGridCal, we screened potential ligands for β2 adrenergic receptor (β2AR) from a database containing 50,000 small molecules. MolGridCal can not only send tasks to the grid server automatically, but also can distribute tasks using the screensaver function. As for the results of virtual screening, the known agonist BI-167107 of β2AR is ranked among the top 2% of the screened candidates, indicating MolGridCal program can give reasonable results. To further study the binding mode and refine the results of MolGridCal, more accurate docking and scoring methods are used to estimate the binding affinity for the top three molecules (agonist BI-167107, neutral antagonist alprenolol and inverse agonist ICI 118,551). The results indicate agonist BI-167107 has the best binding affinity. MD simulation and free energy calculation are employed to investigate the dynamic interaction mechanism between the ligands and β2AR. The results show that the agonist BI-167107 also has the lowest binding free energy. This study can provide a new way to perform virtual screening effectively through integrating molecular docking based on grid computing, MD simulations and free energy calculations. The source codes of MolGridCal are freely available at http://molgridcal.codeplex.com. PMID:25229694
Havens: Explicit Reliable Memory Regions for HPC Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hukerikar, Saurabh; Engelmann, Christian
2016-01-01
Supporting error resilience in future exascale-class supercomputing systems is a critical challenge. Due to transistor scaling trends and increasing memory density, scientific simulations are expected to experience more interruptions caused by transient errors in the system memory. Existing hardware-based detection and recovery techniques will be inadequate to manage the presence of high memory fault rates. In this paper we propose a partial memory protection scheme based on region-based memory management. We define the concept of regions called havens that provide fault protection for program objects. We provide reliability for the regions through a software-based parity protection mechanism. Our approach enables critical program objects to be placed in these havens. The fault coverage provided by our approach is application agnostic, unlike algorithm-based fault tolerance techniques.
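The software parity idea can be illustrated with a toy XOR scheme: keep a parity word updated on every write, and reconstruct a word known to be corrupted from the parity of the remaining words. This is a conceptual sketch only, not the Havens interface:

```python
# Conceptual sketch of software parity over a memory region (a "haven"):
# an XOR checksum maintained incrementally on writes lets one corrupted
# word (at a known index) be reconstructed. Class and method names invented.

class Haven:
    def __init__(self, size):
        self.words = [0] * size
        self.parity = 0                    # XOR of all words in the region

    def write(self, i, value):
        self.parity ^= self.words[i] ^ value   # incremental parity update
        self.words[i] = value

    def recover(self, bad_index):
        """Rebuild one corrupted word from the parity of the others."""
        acc = 0
        for j, w in enumerate(self.words):
            if j != bad_index:
                acc ^= w
        return acc ^ self.parity

h = Haven(8)
for i in range(8):
    h.write(i, i * 17)
h.words[3] ^= 0b100                        # simulate a transient bit flip
restored = h.recover(3)                    # recovers the original value, 51
```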
Collaborative gaming and competition for CS-STEM education using SPHERES Zero Robotics
NASA Astrophysics Data System (ADS)
Nag, Sreeja; Katz, Jacob G.; Saenz-Otero, Alvar
2013-02-01
There is widespread investment of resources in the fields of Computer Science, Science, Technology, Engineering, and Mathematics (CS-STEM) education to improve STEM interests and skills. This paper addresses the goal of revolutionizing student education using collaborative gaming and competition, both in virtual simulation environments and on real hardware in space. The concept is demonstrated using the SPHERES Zero Robotics (ZR) Program, which is a robotics programming competition. The robots are miniature satellites called SPHERES, an experimental test bed developed by the MIT SSL on the International Space Station (ISS) to test navigation, formation flight and control algorithms in microgravity. The participants compete to win a technically challenging game by programming their strategies into the SPHERES satellites, entirely from a web browser. The programs are demonstrated in simulation, on ground hardware and then in a final competition when an astronaut runs the student software aboard the ISS. ZR had a pilot event in 2009 with 10 High School (HS) students, a nationwide pilot tournament in 2010 with over 200 HS students from 19 US states, a summer tournament in 2010 with ˜150 middle school students and an open-registration tournament in 2011 with over 1000 HS students from the USA and Europe. The influence of collaboration was investigated by (1) building new web infrastructure and an Integrated Development Environment where intensive inter-participant collaboration is possible, (2) designing and programming a game to solve a relevant formation flight problem that is collaborative in nature, and (3) structuring a tournament such that inter-team collaboration is mandated. This paper introduces the ZR web tools, assesses the educational value delivered by the program using space and games, and evaluates the utility of collaborative gaming within this framework.
Three types of collaboration were treated as variables: within matches (to achieve game objectives), inter-team alliances, and unstructured communication on online forums. Simulation competition scores, website usage statistics and post-competition surveys are used to evaluate educational impact and the effect of collaboration.
ShareSync: A Solution for Deterministic Data Sharing over Ethernet
NASA Technical Reports Server (NTRS)
Dunn, Daniel J., II; Koons, William A.; Kennedy, Richard D.; Davis, Philip A.
2007-01-01
As part of upgrading the Contact Dynamics Simulation Laboratory (CDSL) at the NASA Marshall Space Flight Center (MSFC), a simple, cost-effective method was needed to communicate data among the networked simulation machines and I/O controllers used to run the facility. To fill this need and similar applicable situations, a generic protocol was developed, called ShareSync. ShareSync is a lightweight, real-time, publish-subscribe Ethernet protocol for simple and deterministic data sharing across diverse machines and operating systems. ShareSync provides a simple Application Programming Interface (API) for simulation programmers to incorporate into their code. The protocol is compatible with virtually all Ethernet-capable machines, is flexible enough to support a variety of applications, is fast enough to provide soft real-time determinism, and is a low-cost resource for distributed simulation development, deployment, and maintenance. The first design cycle iteration of ShareSync has been completed, and the protocol has undergone several testing procedures, including endurance and benchmarking tests, and approaches the 2001ts data synchronization design goal for the CDSL.
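At the application level, a publish-subscribe API of the kind described above can be sketched in a few lines. The in-process Python below uses invented names and omits everything that makes ShareSync interesting (Ethernet transport, determinism across machines); it only shows the programming model a simulation author would see:

```python
# Toy publish-subscribe registry, illustrating the API style of a
# pub-sub data-sharing layer. Names are hypothetical, not ShareSync's.

class Bus:
    def __init__(self):
        self.subs = {}                      # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subs.setdefault(topic, []).append(callback)

    def publish(self, topic, value):
        for cb in self.subs.get(topic, []):
            cb(value)                       # deliver to every subscriber

bus = Bus()
received = []
bus.subscribe("sim/joint_torque", received.append)
bus.publish("sim/joint_torque", 12.5)       # received now holds [12.5]
```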
HI and Low Metal Ions at the Intersection of Galaxies and the CGM
NASA Astrophysics Data System (ADS)
Oppenheimer, Benjamin
2017-08-01
Over 1000 COS orbits have revealed a surprisingly complex picture of circumgalactic gas flows surrounding the diversity of galaxies in the evolved Universe. Cosmological hydrodynamic simulations have only begun to confront the vast amount of galaxy formation physics, chemistry, and dynamics revealed in the multi-ion CGM datasets. We propose the next generation of EAGLE zoom simulations, called EAGLE Cosmic Origins, to model HI and low metal ions (C II, Mg II, & Si II) throughout not just the CGM but also within the galaxies themselves. We will employ a novel chemistry solver, CHIMES, to follow time-dependent ionization, chemistry, and cooling of 157 ionic and molecular species, and include multiple ionization sources from the extra-galactic background, episodic AGN, and star formation. Our aim is to understand the complete baryon cycle of inflows, outflows, and gas recycling traced over 10 decades of HI column densities, as well as the complex kinematic information encoded in low-ion absorption spectroscopy. This simulation project represents a pilot program for a larger suite of zoom simulations, which will be publicly released and lead to additional publications.
Roze, E; Flamand-Roze, C; Méneret, A; Ruiz, M; Le Liepvre, H; Duguet, A; Renaud, M-C; Alamowitch, S; Steichen, O
2016-01-01
Neurological disorders are frequently being managed by general practitioners. It is therefore critical that future physicians become comfortable with neurological examination and physical diagnosis. Graduating medical students often consider neurological examination to be one of the clinical skills they are least comfortable with, and they even tend to be neurophobic. One way to improve the learning of neurological semiology is to design innovative, learner-friendly educational methods, including simulation training. The feasibility of mime-based roleplaying was tested with a simulation training program in neurological semiology called 'The Move'. The program was proposed to third-year medical students at Pierre and Marie Curie University in Paris during their neurology rotation. Students were trained to roleplay patients by miming various neurological syndromes (pyramidal, vestibular, cerebellar, parkinsonian) as well as distal axonopathy, chorea and tonic-clonic seizures. Using an anonymous self-administered questionnaire, the students' and teachers' emotional experience and views on the impact of the program were then investigated. A total of 223/365 students (61%) chose to participate in the study. Both students and teachers felt their participation was pleasant. Students stated that The Move increased their motivation to learn neurological semiology (78%), and improved both their understanding of the subject (77%) and their long-term memorization of the teaching content (86%). Although only a minority thought The Move was likely to improve their performance on their final medical examination (32%), a clear majority (77%) thought it would be useful for their future clinical practice. Both students (87%) and teachers (95%) thought The Move should be included in the medical curriculum. Mime-based roleplaying simulation may be a valuable tool for training medical students in neurological semiology, and may also help them to overcome neurophobia.
Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Melzer, S M; Poole, S R
1999-08-01
To describe the operating characteristics, financial performance, and perceived value of computerized children's hospital-based telephone triage and advice (TTA) programs. A written survey of all 32 children's hospital-based TTA programs in the United States that used the same proprietary pediatric TTA software product for at least 6 months. The expense, revenues, and perceived value of children's hospital-based TTA programs. Of the 30 programs (94%) responding, 27 (90%) were eligible for the study and reported on their experience with nearly 1.3 million TTA calls over a 12-month period. Programs provided pediatric TTA services for 1560 physicians, serving an average of 82 physicians (range, 10-340 physicians) and answering 38,880 calls (range, 8500-140,000 calls) annually. The mean call duration was 11.3 minutes and the estimated mean total expense per call was $12.45. Of the programs charging fees for TTA services, 16 (59%) used a per-call fee and 7 (26%) used a monthly service fee. All respondents indicated that fees did not cover all associated costs. Telephone triage and advice programs, when examined on a stand-alone basis, were all operating with annual deficits (mean, $447,000; median, $325,000; range, $74,000-$1.3 million), supported by the sponsoring children's hospitals and their companion programs. Using a 3-point Likert scale, the TTA program managers rated the value of the TTA program very highly as a mechanism for marketing to physicians (2.85) and increasing physician (2.92) and patient (2.80) satisfaction. Children's hospital-based TTA programs operate at substantial financial deficits. Ongoing support of these programs may derive from the perception that they are a valuable mechanism for marketing and for increasing patient and physician satisfaction. Children's hospitals should develop strategies to ensure the long-term financial viability of TTA programs or they may have to discontinue these services.
Implementing an Education and Outreach Program for the Gemini Observatory in Chile.
NASA Astrophysics Data System (ADS)
Garcia, M. A.
2006-08-01
In 2001, the Gemini Observatory began the development of an innovative and aggressive education and outreach program at its Southern Hemisphere site in northern Chile. A principal focus of this effort is centered on local education and outreach to communities surrounding the observatory and its base facility in La Serena, Chile. Programs are now established with local schools using two portable StarLab planetaria, an internet-based teacher exchange called StarTeachers, and multiple partnerships with local educational institutions. Other elements include a CD-ROM-based virtual tour that allows students, teachers and the public to experience the observatory's sites in Chile and Hawaii. This virtual environment allows interaction using a variety of immersive scenarios, such as a simulated observation using real data from Gemini. Pilot projects like "Live from Gemini" are currently being developed which use internet videoconferencing technologies to bring the observatory's facilities into classrooms at universities and remote institutions. Lessons learned from the implementation of these and other programs will be introduced, and the challenges of developing educational programming in a developing country will be shared.
Optimization strategies for molecular dynamics programs on Cray computers and scalar work stations
NASA Astrophysics Data System (ADS)
Unekis, Michael J.; Rice, Betsy M.
1994-12-01
We present results of timing runs and different optimization strategies for a prototype molecular dynamics program that simulates shock waves in a two-dimensional (2-D) model of a reactive energetic solid. The performance of the program may be improved substantially by simple changes to the Fortran or by employing various vendor-supplied compiler optimizations. The optimum strategy varies among the machines used and will vary depending upon the details of the program. The effect of various compiler options and vendor-supplied subroutine calls is demonstrated. Comparison is made between two scalar workstations (IBM RS/6000 Model 370 and Model 530) and several Cray supercomputers (X-MP/48, Y-MP8/128, and C-90/16256). We find that for a scientific application program dominated by sequential, scalar statements, a relatively inexpensive high-end workstation such as the IBM RS/6000 RISC series will outperform single-processor performance of the Cray X-MP/48 and perform competitively with single-processor performance of the Y-MP8/128 and C-90/16256.
NASA Astrophysics Data System (ADS)
Pantale, O.; Caperaa, S.; Rakotomalala, R.
2004-07-01
During the last 50 years, the development of better numerical methods and more powerful computers has been a major enterprise for the scientific community. At the same time, the finite element method has become a widely used tool for researchers and engineers. Recent advances in computational software have made it possible to solve more physical and complex problems such as coupled problems, nonlinearities, and high-strain and high-strain-rate problems. In this field, an accurate analysis of the large-deformation inelastic problems occurring in metal-forming or impact simulations is extremely important as a consequence of the large amount of plastic flow. In this presentation, the object-oriented implementation, using the C++ language, of an explicit finite element code called DynELA is presented. Object-oriented programming (OOP) leads to better-structured codes for the finite element method and facilitates their development, maintainability and expandability. The most significant advantage of OOP is in the modeling of complex physical systems such as deformation processing, where the overall complex problem is partitioned into individual sub-problems based on physical, mathematical or geometric reasoning. We first focus on the advantages of OOP for the development of scientific programs. Specific aspects of OOP, such as the inheritance mechanism, operator overloading and the use of template classes, are detailed. Then we present the approach used for the development of our finite element code through the presentation of the kinematics, conservative and constitutive laws and their respective implementation in C++. Finally, the efficiency and accuracy of our finite element program are investigated using a number of benchmark tests relative to metal forming and impact simulations.
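The partitioning the authors describe, a common interface with specialized material behaviors, is easiest to see in a small example. The sketch below uses Python rather than C++ and invented class names; it only illustrates the inheritance and operator-overloading ideas, not DynELA's actual design:

```python
# Illustrative OOP partitioning for a finite element code: a base
# constitutive-law interface with specialized materials, plus operator
# overloading on a small vector type. Names and values are invented.

class Vec2:
    def __init__(self, x, y):
        self.x, self.y = x, y
    def __add__(self, other):               # operator overloading: v1 + v2
        return Vec2(self.x + other.x, self.y + other.y)

class ConstitutiveLaw:
    def stress(self, strain):               # interface each material implements
        raise NotImplementedError

class LinearElastic(ConstitutiveLaw):
    def __init__(self, E):
        self.E = E
    def stress(self, strain):
        return self.E * strain

class PerfectlyPlastic(ConstitutiveLaw):
    def __init__(self, E, yield_stress):
        self.E, self.sy = E, yield_stress
    def stress(self, strain):               # elastic response capped at yield
        return max(-self.sy, min(self.sy, self.E * strain))

laws = [LinearElastic(200e9), PerfectlyPlastic(200e9, 250e6)]
stresses = [law.stress(0.002) for law in laws]   # same call, different behavior
v = Vec2(1.0, 2.0) + Vec2(3.0, 4.0)              # overloaded '+' at work
```

Each element can hold any `ConstitutiveLaw` and call `stress` without knowing the material, which is the sub-problem partitioning the abstract argues for.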
The application of virtual reality systems as a support of digital manufacturing and logistics
NASA Astrophysics Data System (ADS)
Golda, G.; Kampa, A.; Paprocka, I.
2016-08-01
Modern trends in the development of computer-aided techniques are heading toward the integration of the design of competitive products with so-called "digital manufacturing and logistics", supported by computer simulation software. All phases of the product lifecycle: starting from the design of a new product, through planning and control of manufacturing, assembly, internal logistics and repairs, quality control, distribution to customers and after-sale service, up to its recycling or utilization, should be aided and managed by advanced packages of product lifecycle management software. Important problems in providing an efficient flow of materials in supply chain management across the whole product lifecycle, using computer simulation, are described in this paper. The authors pay attention to the processes of acquiring relevant information and correct data, necessary for virtual modeling and computer simulation of integrated manufacturing and logistics systems. The article describes possible applications of virtual reality software for modeling and simulating production and logistics processes in an enterprise in different aspects of product lifecycle management. The authors demonstrate an effective method of creating computer simulations for digital manufacturing and logistics and show modeled and programmed examples and solutions. They pay attention to development trends and show options of the applications that go beyond a single enterprise.
Safety Verification of a Fault Tolerant Reconfigurable Autonomous Goal-Based Robotic Control System
NASA Technical Reports Server (NTRS)
Braman, Julia M. B.; Murray, Richard M.; Wagner, David A.
2007-01-01
Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbolic model checkers. An example task is simulated in MDS and successfully verified using HyTech, a symbolic model checking software for linear hybrid systems.
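The safety question being checked can be illustrated at toy scale: enumerate every mode reachable from the initial goal, including failure transitions, and confirm no unsafe mode appears. Real verification with HyTech operates on linear hybrid automata with continuous dynamics; this discrete sketch, with made-up mode names, only conveys the reachability idea:

```python
# Toy reachability check over a discrete mode graph, standing in for symbolic
# model checking of a goal network converted to a hybrid system. The modes
# and transitions below are hypothetical, not from MDS.

def reachable(transitions, start):
    """Return the set of modes reachable from `start` via any transition."""
    seen, stack = {start}, [start]
    while stack:
        for nxt in transitions.get(stack.pop(), ()):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

transitions = {
    "drive": ["arrived", "wheel_fault"],   # nominal goal plus a failure mode
    "wheel_fault": ["safe_halt"],          # reconfiguration on failure
    "arrived": [],
    "safe_halt": [],
}
modes = reachable(transitions, "drive")
is_safe = "uncontrolled" not in modes      # the unsafe mode is unreachable
```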
NASA Technical Reports Server (NTRS)
Sternfeld, H., Jr.; Doyle, L. B.
1978-01-01
The relationship between the internal noise environment of helicopters and the ability of personnel to understand commands and instructions was studied. A test program was conducted to relate speech intelligibility to a standard measurement called the Articulation Index. An acoustical simulator was used to provide noise environments typical of Army helicopters. Speech materials (command sentences and phonetically balanced word lists) were presented at several voice levels in each helicopter environment. Recommended helicopter internal noise criteria, based on speech communication, were derived, and the effectiveness of hearing protection devices was evaluated.
NASA Technical Reports Server (NTRS)
Bailey, F. R.; Kutler, Paul
1988-01-01
Discussed are the capabilities of NASA's Numerical Aerodynamic Simulation (NAS) Program and its application as an advanced supercomputing system for computational fluid dynamics (CFD) research. First, the paper describes the NAS computational system, called the NAS Processing System Network, and the advanced computational capabilities it offers as a consequence of carrying out the NAS pathfinder objective. Second, it presents examples of pioneering CFD research accomplished during NAS's first operational year. Examples are included which illustrate CFD applications for predicting fluid phenomena, complementing and supplementing experimentation, and aiding in design. Finally, pacing elements and future directions for CFD and NAS are discussed.
Rapid Assessment of Agility for Conceptual Design Synthesis
NASA Technical Reports Server (NTRS)
Biezad, Daniel J.
1996-01-01
This project consists of designing and implementing a real-time graphical interface for a workstation-based flight simulator. It is capable of creating a three-dimensional out-the-window scene of the aircraft's flying environment, with extensive information about the aircraft's state displayed in the form of a heads-up-display (HUD) overlay. The code, written in the C programming language, makes calls to Silicon Graphics' Graphics Library (GL) to draw the graphics primitives. Included in this report is a detailed description of the capabilities of the code, including graphical examples, as well as a printout of the code itself.
NASA Astrophysics Data System (ADS)
Abbasi, R. U.; Abu-Zayyad, T.; Amman, J. F.; Archbold, G. C.; Bellido, J. A.; Belov, K.; Belz, J. W.; Bergman, D. R.; Cao, Z.; Clay, R. W.; Cooper, M. D.; Dai, H.; Dawson, B. R.; Everett, A. A.; Girard, J. H. V.; Gray, R. C.; Hanlon, W. F.; Hoffman, C. M.; Holzscheiter, M. H.; Hüntemeyer, P.; Jones, B. F.; Jui, C. C. H.; Kieda, D. B.; Kim, K.; Kirn, M. A.; Loh, E. C.; Manago, N.; Marek, L. J.; Martens, K.; Martin, G.; Matthews, J. A. J.; Matthews, J. N.; Meyer, J. R.; Moore, S. A.; Morrison, P.; Moosman, A. N.; Mumford, J. R.; Munro, M. W.; Painter, C. A.; Perera, L.; Reil, K.; Riehle, R.; Roberts, M.; Sarracino, J. S.; Schnetzer, S.; Shen, P.; Simpson, K. M.; Sinnis, G.; Smith, J. D.; Sokolsky, P.; Song, C.; Springer, R. W.; Stokes, B. T.; Thomas, S. B.; Thompson, T. N.; Thomson, G. B.; Tupa, D.; Westerhoff, S.; Wiencke, L. R.; VanderVeen, T. D.; Zech, A.; Zhang, X.
2005-03-01
We have measured the spectrum of UHE cosmic rays using the Flash ADC (FADC) detector (called HiRes-II) of the High Resolution Fly's Eye experiment running in monocular mode. We describe in detail the data analysis, development of the Monte Carlo simulation program, and results. We also describe the results of the HiRes-I detector. We present our measured spectra and compare them with a model incorporating galactic and extragalactic cosmic rays. Our combined spectra provide strong evidence for the existence of the spectral feature known as the "ankle."
Energy-efficient container handling using hybrid model predictive control
NASA Astrophysics Data System (ADS)
Xin, Jianbin; Negenborn, Rudy R.; Lodewijks, Gabriel
2015-11-01
The performance of container terminals needs to be improved to accommodate the growth in container volumes while maintaining sustainability. This paper provides a methodology for determining the trajectories of three key interacting machines for carrying out the so-called bay handling task, which involves transporting containers between a vessel and the stacking area in an automated container terminal. The behaviours of the interacting machines are modelled as a collection of interconnected hybrid systems. Hybrid model predictive control (MPC) is proposed to achieve optimal performance, balancing handling capacity and energy consumption. The underlying control problem is hereby formulated as a mixed-integer linear programming problem. Simulation studies illustrate that a higher penalty on energy consumption indeed leads to improved sustainability through lower energy use. Moreover, simulations illustrate how the proposed energy-efficient hybrid MPC controller performs under different types of uncertainties.
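The capacity/energy trade-off behind the formulation can be mimicked with a toy discrete search. The paper poses it properly as a mixed-integer linear program; the snippet below just enumerates discrete crane-speed plans with invented speeds, energies and weights, to show how a heavier energy penalty shifts the optimum toward slower, cheaper operation:

```python
# Toy illustration of the handling-time vs. energy trade-off (exhaustive
# search over discrete speed choices, not the paper's MILP). All numbers
# are made up.

from itertools import product

SPEEDS = {1: 1.0, 2: 2.5}             # containers/step -> energy/step

def best_plan(n_containers, horizon, energy_weight):
    """Minimize completion time + energy_weight * energy over speed plans."""
    best = None
    for plan in product(SPEEDS, repeat=horizon):
        moved = energy = done_at = 0
        for t, s in enumerate(plan, 1):
            moved += s
            energy += SPEEDS[s]
            if moved >= n_containers:
                done_at = t
                break
        if moved < n_containers:
            continue                   # plan never finishes the job
        cost = done_at + energy_weight * energy
        if best is None or cost < best[0]:
            best = (cost, plan[:done_at])
    return best[1]

fast = best_plan(6, 8, energy_weight=0.0)   # no energy penalty: full speed
eco = best_plan(6, 8, energy_weight=3.0)    # heavy penalty: slower, less energy
```

With no penalty the optimizer runs at top speed for 3 steps; with a heavy penalty it takes 6 slow steps, the same qualitative effect the simulation studies report.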
Examination of various turbulence models for application in liquid rocket thrust chambers
NASA Technical Reports Server (NTRS)
Hung, R. J.
1991-01-01
There is a large variety of turbulence models available. These models include direct numerical simulation, large eddy simulation, the Reynolds stress/flux model, the zero-equation model, the one-equation model, the two-equation k-epsilon model, the multiple-scale model, etc. Each turbulence model contains different physical assumptions and requirements. Turbulence is characterized by randomness, irregularity, diffusivity and dissipation. The capabilities of the turbulence models, including physical strengths, weaknesses and limitations, as well as numerical and computational considerations, are reviewed. Recommendations are made for the potential application of a turbulence model in thrust chamber and performance prediction programs. The full Reynolds stress model is recommended. In a workshop specifically called for the assessment of turbulence models for applications in liquid rocket thrust chambers, most of the experts present were also in favor of the recommendation of the Reynolds stress model.
NASA Astrophysics Data System (ADS)
Humeniuk, Alexander; Mitrić, Roland
2017-12-01
A software package called DFTBaby is presented, which provides the electronic structure needed for running non-adiabatic molecular dynamics simulations at the level of tight-binding DFT. A long-range correction is incorporated to avoid spurious charge-transfer states. Excited state energies, their analytic gradients and scalar non-adiabatic couplings are computed using tight-binding TD-DFT. These quantities are fed into a molecular dynamics code, which integrates Newton's equations of motion for the nuclei together with the electronic Schrödinger equation. Non-adiabatic effects are included by surface hopping. As an example, the program is applied to the optimization of excited states and the non-adiabatic dynamics of polyfluorene. The Python and Fortran source code is available at http://www.dftbaby.chemie.uni-wuerzburg.de.
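Surface hopping rests on a per-timestep hopping probability. One common form of Tully's fewest-switches expression is sketched below; conventions and signs vary between implementations, and this is generic textbook material, not code from DFTBaby:

```python
# Fewest-switches surface hopping probability (one common convention):
#   g(i->j) = max(0, -2 * dt * Re(rho_ij * d_ij.v) / rho_ii), clipped to [0, 1]
# where rho_ii is the population of the current state, rho_ij the coherence,
# and d_ij.v the nonadiabatic coupling contracted with the nuclear velocity.

def hop_probability(pop_i, coherence_ij, coupling_term, dt):
    if pop_i <= 0.0:
        return 0.0
    g = -2.0 * dt * (coherence_ij * coupling_term).real / pop_i
    return min(1.0, max(0.0, g))            # negative values mean no hop

p = hop_probability(pop_i=0.8, coherence_ij=0.1, coupling_term=-0.5, dt=0.1)
```

At each nuclear step the probability is compared against a uniform random number to decide whether the trajectory switches electronic surfaces.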
NASA Technical Reports Server (NTRS)
Hiltner, Dale W.
2000-01-01
This report presents the assessment of an analytical tool developed as part of the NASA/FAA Tailplane Icing Program. The analytical tool is a specialized simulation program called TAILSIM, which was developed to model the effects of tailplane icing on the flight dynamics of the Twin Otter Icing Research Aircraft. This report compares the responses of the TAILSIM program directly to flight test data. The comparisons should be useful to potential users of TAILSIM. The comparisons show that the TAILSIM program qualitatively duplicates the flight test aircraft response during maneuvers with ice on the tailplane. TAILSIM is shown to be quantitatively "in the ballpark" in predicting when Ice Contaminated Tailplane Stall will occur during pushover and thrust transition maneuvers. As such, TAILSIM proved its usefulness to the flight test program by providing a general indication of the aircraft configurations and flight conditions of concern. The aircraft dynamics are shown to be modeled correctly by the equations of motion used in TAILSIM. However, the general accuracy of the TAILSIM responses is shown to be less than desired, primarily due to inaccuracies in the aircraft database. The high sensitivity of the TAILSIM program responses to small changes in load factor command input is also shown to be a factor in the accuracy of the responses. A pilot model is shown to allow TAILSIM to produce more accurate responses and contributes significantly to the usefulness of the program. Suggestions to improve the accuracy of the TAILSIM responses are to further refine the database representation of the aircraft aerodynamics and tailplane flowfield and to explore a more realistic definition of the pilot model.
ERIC Educational Resources Information Center
Ali, Azad; Smith, David
2014-01-01
This paper presents a debate between two faculty members regarding the teaching of the legacy programming course (COBOL) in a Computer Science (CS) program. Among the two faculty members, one calls for the continuation of teaching this language and the other calls for replacing it with another modern language. Although CS programs are notorious…
Fortran interface layer of the framework for developing particle simulator FDPS
NASA Astrophysics Data System (ADS)
Namekata, Daisuke; Iwasawa, Masaki; Nitadori, Keigo; Tanikawa, Ataru; Muranushi, Takayuki; Wang, Long; Hosono, Natsuki; Nomura, Kentaro; Makino, Junichiro
2018-06-01
Numerical simulations based on particle methods have been widely used in various fields including astrophysics. To date, various versions of simulation software have been developed by individual researchers or research groups in each field, through a huge amount of time and effort, even though the numerical algorithms used are very similar. To improve the situation, we have developed a framework, called FDPS (Framework for Developing Particle Simulators), which enables researchers to develop massively parallel particle simulation codes for arbitrary particle methods easily. Until version 3.0, FDPS provided an API (application programming interface) for the C++ programming language only. This limitation comes from the fact that FDPS is developed using the template feature in C++, which is essential to support arbitrary data types of particle. However, there are many researchers who use Fortran to develop their codes. Thus, the previous versions of FDPS required such people to invest much time to learn C++, which is inefficient. To cope with this problem, we developed a Fortran interface layer in FDPS, which provides an API for Fortran. In order to support arbitrary data types of particle in Fortran, we designed the Fortran interface layer as follows. Based on a given derived data type in Fortran representing a particle, a Python script provided by us automatically generates a library that manipulates the C++ core part of FDPS. This library is seen as a Fortran module providing an API of FDPS from the Fortran side and uses C programs internally to interoperate Fortran with C++. In this way, we have overcome several technical issues in emulating a 'template' in Fortran. Using the Fortran interface, users can develop all parts of their codes in Fortran. We show that the overhead of the Fortran interface part is sufficiently small and a code written in Fortran shows a performance practically identical to one written in C++.
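The generation step can be sketched at toy scale: map a described Fortran derived type onto a C-compatible declaration. This stand-in uses invented names and does a tiny fraction of what the real FDPS script does, but it shows the flavor of generating interoperability glue from a type description:

```python
# Toy code generator: given (name, Fortran kind) pairs describing a particle
# derived type that uses ISO_C_BINDING kinds, emit a matching C struct.
# A stand-in for the FDPS generation script; names are invented.

FTN_TO_C = {"real(c_double)": "double", "integer(c_int)": "int"}

def gen_c_struct(type_name, fields):
    lines = [f"struct {type_name} {{"]
    for fname, ftype in fields:
        lines.append(f"    {FTN_TO_C[ftype]} {fname};")
    lines.append("};")
    return "\n".join(lines)

c_decl = gen_c_struct("full_particle",
                      [("mass", "real(c_double)"),
                       ("pos_x", "real(c_double)"),
                       ("id", "integer(c_int)")])
```

Because C-interoperable Fortran kinds have fixed C counterparts, a script can emit this glue mechanically for any user-defined particle type, which is how the layer emulates a C++ template.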
DEPEND: A simulation-based environment for system level dependability analysis
NASA Technical Reports Server (NTRS)
Goswami, Kumar; Iyer, Ravishankar K.
1992-01-01
The design and evaluation of highly reliable computer systems is a complex issue. Designers mostly develop such systems based on prior knowledge and experience and occasionally from analytical evaluations of simplified designs. A simulation-based environment called DEPEND, which is especially geared toward the design and evaluation of fault-tolerant architectures, is presented. DEPEND is unique in that it exploits the properties of object-oriented programming to provide a flexible framework with which a user can rapidly model and evaluate various fault-tolerant systems. The key features of the DEPEND environment are described, and its capabilities are illustrated with a detailed analysis of a real design. In particular, DEPEND is used to simulate the Unix-based Tandem Integrity fault-tolerant system and evaluate how well it handles near-coincident errors caused by correlated and latent faults. Issues such as memory scrubbing, re-integration policies, and workload-dependent repair times, which affect how the system handles near-coincident errors, are also evaluated. Additional topics, such as the method used by DEPEND to simulate error latency and the time-acceleration technique that provides enormous simulation speed-up, are also discussed. Unlike any other simulation-based dependability studies, the use of these approaches and the accuracy of the simulation model are validated by comparing the results of the simulations with measurements obtained from fault injection experiments conducted on a production Tandem Integrity machine.
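The scrubbing question DEPEND evaluates can be caricatured with a seeded Monte Carlo run: a latent fault becomes a near-coincident pair only if a second fault lands before the scrubber clears the first. The rates, intervals and the model itself below are invented for illustration and bear no relation to DEPEND's internals:

```python
# Toy Monte Carlo of latent faults vs. memory scrubbing. A fault landing
# while an earlier fault is still latent counts as a near-coincident pair;
# periodic scrubbing clears latent faults first. All rates are arbitrary.

import random

def double_fault_rate(scrub_interval, fault_rate=0.01, horizon=100_000, seed=1):
    """Fraction of steps on which a fault finds an earlier fault still latent."""
    rng = random.Random(seed)
    latent = False
    doubles = 0
    for t in range(horizon):
        if scrub_interval and t % scrub_interval == 0:
            latent = False                 # scrubber repairs the latent fault
        if rng.random() < fault_rate:
            if latent:
                doubles += 1               # near-coincident error pair
                latent = False
            else:
                latent = True
    return doubles / horizon

no_scrub = double_fault_rate(scrub_interval=0)
scrubbed = double_fault_rate(scrub_interval=10)   # strictly fewer pairs
```

With the same seeded fault stream, frequent scrubbing sharply cuts the pair rate, the qualitative effect such a dependability study quantifies.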
Simulation of diurnal thermal energy storage systems: Preliminary results
NASA Astrophysics Data System (ADS)
Katipamula, S.; Somasundaram, S.; Williams, H. R.
1994-12-01
This report describes the results of a simulation of thermal energy storage (TES) integrated with a simple-cycle gas turbine cogeneration system. Integrating TES with cogeneration can serve the electrical and thermal loads independently while firing all fuel in the gas turbine. The detailed engineering and economic feasibility of diurnal TES systems integrated with cogeneration systems has been described in two previous PNL reports. The objective of this study was to lay the groundwork for optimization of TES system designs using a simulation tool called TRNSYS (TRaNsient SYstem Simulation). TRNSYS is a transient simulation program with a sequential-modular structure developed at the Solar Energy Laboratory, University of Wisconsin-Madison. The two TES systems selected for the base-case simulations were: (1) a one-tank storage model to represent the oil/rock TES system; and (2) a two-tank storage model to represent the molten nitrate salt TES system. Results of the study clearly indicate that an engineering optimization of the TES system using TRNSYS is possible. The one-tank stratified oil/rock storage model described here is a good starting point for parametric studies of a TES system. Further developments to the TRNSYS library of available models (economizer, evaporator, gas turbine, etc.) are recommended so that the phase-change processes are accurately treated.
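A TRNSYS-style component boils down to a state equation stepped through time. The toy single-node tank below, with invented parameters and far simpler than the report's stratified one-tank model, shows the kind of transient calculation such a component performs:

```python
# Toy transient tank model (not TRNSYS code): explicit hourly time-stepping
# of a single-node storage tank under a charging heat input and ambient
# losses. All parameter values are illustrative.

def tank_temperature(hours, q_charge_kw, t_amb=20.0, ua_kw_per_k=0.05,
                     mass_kg=50_000.0, cp_kwh_per_kg_k=0.00116):
    t = t_amb
    history = []
    for _ in range(hours):
        q_loss = ua_kw_per_k * (t - t_amb)                  # kW lost to ambient
        t += (q_charge_kw - q_loss) / (mass_kg * cp_kwh_per_kg_k)  # 1 h step
        history.append(t)
    return history

temps = tank_temperature(hours=24, q_charge_kw=500.0)       # one charging day
```

Chaining such components (tank, economizer, evaporator, turbine) in a sequential-modular loop is essentially what the TRNSYS structure described above provides.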
Mechanical discrete simulator of the electro-mechanical lift with n:1 roping
NASA Astrophysics Data System (ADS)
Alonso, F. J.; Herrera, I.
2016-05-01
The design process of new products in lift engineering is a difficult task due mainly to the complexity and slenderness of the lift system, demanding a predictive tool for the lift mechanics. A mechanical ad-hoc discrete simulator, as an alternative to ‘general purpose’ mechanical simulators, is proposed. Firstly, the synthesis and experimentation process that has led to a suitable model capable of accurately simulating the response of the electromechanical lift is discussed. Then, the equations of motion are derived. The model comprises a discrete system of 5 vertically displaceable masses (car, counterweight, car frame, passengers/loads and lift drive), an inertial mass of the assembly tension pulley-rotor shaft which can rotate about the machine axis, and 6 mechanical connectors with 1:1 suspension layout. The model is extended to any n:1 roping lift by setting 6 equivalent mechanical components (suspension systems for car and counterweight, lift drive silent blocks, tension pulley-lift drive stator and passengers/load equivalent spring-damper) by inductive inference from the 1:1 and generalized 2:1 roping systems. Application to real elevator systems is achieved by numeric time integration of the governing equations using the Kutta-Meden algorithm, implemented in a computer program for ad-hoc elevator simulation called ElevaCAD.
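Each spring-damper connector in such a discrete model reduces, element by element, to a second-order ODE. Below is a minimal sketch of one 1-DOF element integrated with semi-implicit Euler; it is illustrative only, not the ElevaCAD model, the parameters are hypothetical, and the simple scheme stands in for the paper's integrator.

```python
# Minimal 1-DOF mass-spring-damper element, illustrating the kind of
# spring-damper connector used in discrete lift models. Sketch only:
# not ElevaCAD, and all numeric parameters are hypothetical.

def step(x, v, m, k, c, f_ext, dt):
    """Advance position x and velocity v by one time step."""
    a = (f_ext - k * x - c * v) / m   # Newton's second law for the element
    v += a * dt                       # update velocity first (semi-implicit Euler)
    x += v * dt                       # then position, using the new velocity
    return x, v

# Free decay of an initially stretched suspension element over 2 s.
x, v = 0.05, 0.0                      # 5 cm initial deflection, at rest
for _ in range(2000):
    x, v = step(x, v, m=400.0, k=2.0e5, c=1.5e3, f_ext=0.0, dt=1e-3)
print(abs(x) < 0.05)                  # deflection has decayed
```

The full model couples several such elements (car, counterweight, frame, drive) plus a rotational equation for the pulley-rotor assembly into one system integrated together.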
Clouds of different colors: A prospective look at head and neck surgical resident call experience.
Melzer, Jonathan
2017-12-01
Graduate medical education programs typically set up call under the assumption that residents will have similar experiences. The terms black cloud and white cloud have frequently been used to describe residents with more difficult (black) or less difficult (white) call experiences. This study followed residents in the department of head and neck surgery during call to determine whether certain residents have a significantly different call experience than the norm. It is a prospective observational study conducted over 16 months in a tertiary care center with a resident training program in otolaryngology. Resident call data on total pages, consults, and operative interventions were examined, as well as subjective survey data about sleep and perceived difficulty of resident call. Analysis showed no significant difference in call activity (pages, consults, operative interventions) among residents. However, data from the resident call surveys revealed perceived disparities in call difficulty that were significant. Two residents were clearly labeled as black clouds compared to the rest. These residents did not have the highest average number of pages, consults, or operative interventions. This study suggests that factors affecting call perception are outside the objective, absolute workload. These results may be used to improve resident education on sleep training and nighttime patient management in the field of otolaryngology and may influence otolaryngology residency programs.
NASA Astrophysics Data System (ADS)
Delgado, Francisco; Saha, Abhijit; Chandrasekharan, Srinivasan; Cook, Kem; Petry, Catherine; Ridgway, Stephen
2014-08-01
The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://www.lsst.org) allows the planning of LSST observations that obey explicit science-driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site-specific conditions, as well as additional scheduled and unscheduled downtime. It has a detailed model to simulate the external conditions with real weather history data from the site, a fully parameterized kinematic model for the internal conditions of the telescope, camera and dome, and serves as a prototype for an automatic scheduler for the real-time survey operations with LSST. The Simulator is a critical tool that has been key since very early in the project to help validate the design parameters of the observatory against the science requirements and the goals from specific science programs. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. Software to efficiently compare the efficacy of different survey strategies for a wide variety of science applications using such a growing set of metrics is under development.
A recent restructuring of the code allows us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator is being used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities and assist with performance margin investigations of the LSST system.
75 FR 25255 - Structure and Practices of the Video Relay Service Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
... Video Relay Service Program AGENCY: Federal Communications Commission. ACTION: Notice. SUMMARY: In this... compensability from the Interstate TRS Fund (Fund) of certain types of calls made through Video Relay Service... CA, after the VRS user has initiated the video call to the CA, call back the VRS user on a voice...
Jamaican Call-In Radio: A Uses and Gratification Analysis.
ERIC Educational Resources Information Center
Surlin, Stuart H.
Noting that radio call-in programs seem to contain the elements for active audience involvement and participation, a study was conducted to examine the hypothesis that information gain and surveillance are the primary gratifications sought through call-in radio programs, especially in a culture that has a strong oral tradition and relatively few…
49 CFR 198.35 - Grants conditioned on adoption of one-call damage prevention program.
Code of Federal Regulations, 2010 CFR
2010-10-01
... considers whether a State has adopted or is seeking to adopt a one-call damage prevention program in accordance with § 198.37. If a State has not adopted or is not seeking to adopt such program, the State...
NASA Technical Reports Server (NTRS)
Rule, William Keith
1991-01-01
A computer program called BALLIST that is intended to be a design tool for engineers is described. BALLIST empirically predicts the bumper thickness required to prevent perforation of the Space Station pressure wall by a projectile (such as orbital debris) as a function of the projectile's velocity. 'Ballistic' limit curves (bumper thickness vs. projectile velocity) are calculated and are displayed on the screen as well as being stored in an ASCII file. A Whipple style of spacecraft wall configuration is assumed. The predictions are based on a database of impact test results. NASA/Marshall Space Flight Center currently has the capability to generate such test results. Numerical simulation results of impact conditions that cannot be tested (high velocities or large particles) can also be used for predictions.
Differing types of cellular phone conversations and dangerous driving.
Dula, Chris S; Martin, Benjamin A; Fox, Russell T; Leonard, Robin L
2011-01-01
This study sought to investigate the relationship between cell phone conversation type and dangerous driving behaviors. It was hypothesized that more emotional phone conversations engaged in while driving would produce greater frequencies of dangerous driving behaviors in a simulated environment than more mundane conversation or no phone conversation at all. Participants were semi-randomly assigned to one of three conditions: (1) no call, (2) mundane call, and, (3) emotional call. While driving in a simulated environment, participants in the experimental groups received a phone call from a research confederate who either engaged them in innocuous conversation (mundane call) or argued the opposite position of a deeply held belief of the participant (emotional call). Participants in the no call and mundane call groups differed significantly only on percent time spent speeding and center line crossings, though the mundane call group consistently engaged in more of all dangerous driving behaviors than did the no call participants. Participants in the emotional call group engaged in significantly more dangerous driving behaviors than participants in both the no call and mundane call groups, with the exception of traffic light infractions, where there were no significant group differences. Though there is need for replication, the authors concluded that whereas talking on a cell phone while driving is risky to begin with, having emotionally intense conversations is considerably more dangerous. Copyright © 2010 Elsevier Ltd. All rights reserved.
Outsourcing an Effective Postdischarge Call Program
Meek, Kevin L.; Williams, Paula; Unterschuetz, Caryn J.
2018-01-01
To improve patient satisfaction ratings and decrease readmissions, many organizations utilize internal staff to complete postdischarge calls to recently released patients. Developing, implementing, monitoring, and sustaining an effective call program can be challenging and have eluded some of the renowned medical centers in the country. Using collaboration with an outsourced vendor to bring state-of-the-art call technology and staffed with specially trained callers, health systems can achieve elevated levels of engagement and satisfaction for their patients postdischarge. PMID:29494453
Using genetic algorithm to solve a new multi-period stochastic optimization model
NASA Astrophysics Data System (ADS)
Zhang, Xin-Li; Zhang, Ke-Cun
2009-09-01
This paper presents a new asset allocation model based on the CVaR risk measure and transaction costs. Institutional investors manage their strategic asset mix over time to achieve favorable returns subject to various uncertainties, policy and legal constraints, and other requirements. One may use a multi-period portfolio optimization model in order to determine an optimal asset mix. Recently, an alternative stochastic programming model with simulated paths was proposed by Hibiki [N. Hibiki, A hybrid simulation/tree multi-period stochastic programming model for optimal asset allocation, in: H. Takahashi, (Ed.) The Japanese Association of Financial Econometrics and Engineering, JAFFE Journal (2001) 89-119 (in Japanese); N. Hibiki A hybrid simulation/tree stochastic optimization model for dynamic asset allocation, in: B. Scherer (Ed.), Asset and Liability Management Tools: A Handbook for Best Practice, Risk Books, 2003, pp. 269-294], which was called a hybrid model. However, transaction costs were not considered in that paper. In this paper, we improve Hibiki's model in the following aspects: (1) the risk measure CVaR is introduced to control the wealth-loss risk while maximizing the expected utility; (2) typical market imperfections, such as short-sale constraints and proportional transaction costs, are considered simultaneously; (3) applying a genetic algorithm to solve the resulting model is discussed in detail. Numerical results show the suitability and feasibility of our methodology.
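The CVaR measure used above has a simple sample-based form: for confidence level alpha, it is the average of the worst (1 - alpha) fraction of simulated scenario losses. A minimal sketch follows; the scenario data are illustrative, not from the paper's asset-allocation model.

```python
# Sample-based CVaR (expected shortfall): the mean loss in the worst
# (1 - alpha) tail of the scenario loss distribution. Illustrative only.

def cvar(losses, alpha=0.95):
    """Average of the worst (1 - alpha) fraction of scenario losses."""
    ordered = sorted(losses, reverse=True)                 # worst losses first
    n_tail = max(1, int(round(len(ordered) * (1 - alpha))))
    return sum(ordered[:n_tail]) / n_tail

# 100 equally likely scenario losses: 0, 1, ..., 99.
losses = list(range(100))
print(cvar(losses, alpha=0.95))   # mean of the 5 worst losses (95..99) -> 97.0
```

In the optimization model this quantity enters as a constraint or penalty on the wealth-loss distribution while the expected utility is maximized.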
Generic Kalman Filter Software
NASA Technical Reports Server (NTRS)
Lisano, Michael E., II; Crues, Edwin Z.
2005-01-01
The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains code for a generic Kalman filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data.
The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on the basis of the aforementioned templates. The GKF software can be used to develop many different types of unfactorized Kalman filters. A developer can choose to implement either a linearized or an extended Kalman filter algorithm, without having to modify the GKF software. Control dynamics can be taken into account or neglected in the filter-dynamics model. Filter programs developed by use of the GKF software can be made to propagate equations of motion for linear or nonlinear dynamical systems that are deterministic or stochastic. In addition, filter programs can be made to operate in user-selectable "covariance analysis" and "propagation-only" modes that are useful in design and development stages.
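The predict/update cycle that the GKF templates generalize can be shown in its simplest scalar form. This is the textbook linear algorithm, not GKF code; the random-walk state model and all numeric values below are illustrative.

```python
# A minimal scalar linear Kalman filter: one state/covariance predict
# step followed by one measurement update. Textbook algorithm only;
# the random-walk model and the noise values Q, R are illustrative.

def kf_step(x, P, z, Q, R):
    """One predict/update cycle for a scalar random-walk state."""
    # Predict: the state is modeled as constant, so only covariance grows.
    x_pred, P_pred = x, P + Q
    # Update: blend prediction and measurement z by the Kalman gain K.
    K = P_pred / (P_pred + R)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

# Estimate a constant true value of 10.0 from four noisy measurements.
x, P = 0.0, 1.0e6                     # vague prior: large initial covariance
for z in [10.2, 9.8, 10.1, 9.9]:
    x, P = kf_step(x, P, z, Q=1e-4, R=0.04)
print(round(x, 2))
```

The GKF's linearized and extended variants replace the constant-state prediction with a user-supplied dynamics subfunction and the scalar arithmetic with matrix operations, but the predict/update structure is the same.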
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-25
... UNITED STATES INSTITUTE OF PEACE Call for Proposals for a Micro Support Program on International Conflict Resolution and Peacebuilding For Immediate Release AGENCY: United States Institute of Peace. ACTION: Notice. SUMMARY: Micro Support Program on International Conflict Resolution and Peacebuilding...
ERIC Educational Resources Information Center
Fukuzawa, Jeannette L.; Lubin, Jan M.
Five computer programs for the Macintosh that are geared for Computer-Assisted Language Learning (CALL) are described. All five programs allow the teacher to input material. The first program allows entry of new vocabulary lists including definition, a sentence in which the exact word is used, a fill-in-the-blank exercise, and the word's phonetics…
Framework for Development of Object-Oriented Software
NASA Technical Reports Server (NTRS)
Perez-Poveda, Gus; Ciavarella, Tony; Nieten, Dan
2004-01-01
The Real-Time Control (RTC) Application Framework is a high-level software framework written in C++ that supports the rapid design and implementation of object-oriented application programs. This framework provides built-in functionality that solves common software development problems within distributed client-server, multi-threaded, and embedded programming environments. When using the RTC Framework to develop software for a specific domain, designers and implementers can focus entirely on the details of the domain-specific software rather than on creating custom solutions, utilities, and frameworks for the complexities of the programming environment. The RTC Framework was originally developed as part of a Space Shuttle Launch Processing System (LPS) replacement project called Checkout and Launch Control System (CLCS). As a result of the framework's development, CLCS software development time was reduced by 66 percent. The framework is generic enough for developing applications outside of the launch-processing system domain. Other applicable high-level domains include command and control systems and simulation/training systems.
NASA Technical Reports Server (NTRS)
1976-01-01
The program called CTRANS, which was designed to perform radiative transfer computations in an atmosphere with horizontal inhomogeneities (clouds), is described. Since the atmosphere-ground system was to be richly detailed, the Monte Carlo method was employed. This means that results are obtained through direct modeling of the physical process of radiative transport. The effects of atmospheric or ground albedo pattern detail are essentially built up from their impact upon the transport of individual photons. The CTRANS program actually tracks the photons backwards through the atmosphere, initiating them at a receiver and following them backwards along their path to the Sun. The pattern of incident photons generated through backwards tracking automatically reflects the importance to the receiver of each region of the sky. Further, through backwards tracking, the impact of the finite field of view of the receiver and variations in its response over the field of view can be directly simulated.
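The direct-modeling idea behind such Monte Carlo transport can be illustrated at toy scale: sample exponential free paths and count the photons that traverse a purely absorbing slab, which converges to the Beer-Lambert transmittance exp(-tau). This sketch omits scattering, geometry, and the backward weighting that CTRANS performs; it only shows how a physical answer emerges from tracking individual photons.

```python
# Toy Monte Carlo transport: the fraction of photons whose exponential
# free path exceeds the slab optical depth tau estimates exp(-tau).
# Illustrative only; no scattering, geometry, or backward weighting.

import math
import random

def transmittance(tau, n_photons, seed=1):
    """Monte Carlo fraction of photons crossing an absorbing slab of depth tau."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_photons):
        path = -math.log(1.0 - rng.random())   # exponential free path, mean 1
        if path > tau:                         # photon exits without absorption
            survived += 1
    return survived / n_photons

est = transmittance(tau=1.0, n_photons=200_000)
print(abs(est - math.exp(-1.0)) < 0.01)        # close to Beer-Lambert value
```

Backward tracking reverses the same sampling: paths start at the receiver, so computational effort is spent only on photons that matter to the detector.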
Development of EnergyPlus Utility to Batch Simulate Building Energy Performance on a National Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valencia, Jayson F.; Dirks, James A.
2008-08-29
EnergyPlus is a simulation program that requires a large number of details to fully define and model a building. Hundreds or even thousands of lines in a text file are needed to run the EnergyPlus simulation depending on the size of the building. To manually create these files is a time consuming process that would not be practical when trying to create input files for thousands of buildings needed to simulate national building energy performance. To streamline the process needed to create the input files for EnergyPlus, two methods were created to work in conjunction with the National Renewable Energy Laboratory (NREL) Preprocessor; this reduced the hundreds of inputs needed to define a building in EnergyPlus to a small set of high-level parameters. The first method uses Java routines to perform all of the preprocessing on a Windows machine while the second method carries out all of the preprocessing on the Linux cluster by using an in-house built utility called Generalized Parametrics (GPARM). A comma delimited (CSV) input file is created to define the high-level parameters for any number of buildings. Each method then takes this CSV file and uses the data entered for each parameter to populate an extensible markup language (XML) file used by the NREL Preprocessor to automatically prepare EnergyPlus input data files (idf) using automatic building routines and macro templates. Using a Linux utility called “make”, the idf files can then be automatically run through the Linux cluster and the desired data from each building can be aggregated into one table to be analyzed. Creating a large number of EnergyPlus input files results in the ability to batch simulate building energy performance and scale the result to national energy consumption estimates.
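The CSV-to-input-file expansion step can be sketched compactly: one row of high-level parameters per building, expanded through a template into one input file per row. The template and field names below are hypothetical stand-ins; the real GPARM/Preprocessor pipeline goes through an XML intermediate and emits complete idf files.

```python
# Sketch of CSV-driven batch expansion: each CSV row of high-level
# parameters becomes one (drastically simplified) EnergyPlus-style
# input text. Template and field names are hypothetical.

import csv
import io

TEMPLATE = """Building,
  {name},             !- Name
  {orientation},      !- North Axis {{deg}}
  {floor_area};       !- Hypothetical floor-area field {{m2}}
"""

def expand(csv_text):
    """Return a dict mapping building name -> generated input text."""
    out = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        out[row["name"]] = TEMPLATE.format(**row)
    return out

rows = "name,orientation,floor_area\nstore_01,0,500\noffice_02,90,2000\n"
files = expand(rows)
print(sorted(files))   # one generated file per CSV row
```

A driver (the article uses the Linux `make` utility) then submits each generated file to the simulation engine and aggregates the per-building results.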
Automatic mathematical modeling for space application
NASA Technical Reports Server (NTRS)
Wang, Caroline K.
1987-01-01
A methodology for automatic mathematical modeling is described. The major objective is to create a very friendly environment for engineers to design, maintain and verify their model and also automatically convert the mathematical model into FORTRAN code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine simulation mathematical model called Propulsion System Automatic Modeling (PSAM). PSAM provides a very friendly and well organized environment for engineers to build a knowledge base for base equations and general information. PSAM contains an initial set of component process elements for the Space Shuttle Main Engine simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. PSAM is then able to automatically generate the model and the FORTRAN code. A future goal is to download the FORTRAN code to the VAX/VMS system for conventional computation.
Online Simulation of Radiation Track Structure Project
NASA Technical Reports Server (NTRS)
Plante, Ianik
2015-01-01
Space radiation comprises protons, helium, and high charge and energy (HZE) particles. High-energy particles are a concern for human space flight, because there are no known options for shielding astronauts from them. When these ions interact with matter, they damage molecules and create radiolytic species. The pattern of energy deposition and positions of the radiolytic species, called radiation track structure, is highly dependent on the charge and energy of the ion. The radiolytic species damage biological molecules, which may lead to several long-term health effects such as cancer. Because of the importance of heavy ions, the radiation community is very interested in the interaction of HZE particles with DNA, notably with regards to the track structure. A desktop program named RITRACKS was developed to simulate radiation track structure. The goal of this project is to create a web interface to allow registered internal users to use RITRACKS remotely.
Orion Service Module Umbilical (OSMU) Testing Complete
2016-10-19
Testing of the Orion Service Module Umbilical (OSMU) was completed at the Launch Equipment Test Facility at NASA’s Kennedy Space Center in Florida. The OSMU was attached to Vehicle Motion Simulator 1 for a series of simulated launch tests to validate it for installation on the mobile launcher. The mobile launcher tower will be equipped with a number of lines, called umbilicals, that will connect to the Space Launch System rocket and Orion spacecraft for Exploration Mission-1 (EM-1). The OSMU will be located high on the mobile launcher tower and, prior to launch, will transfer liquid coolant for the electronics and air for the Environmental Control System to the Orion service module that houses these critical systems to support the spacecraft. Kennedy's Engineering Directorate is providing support to the Ground Systems Development and Operations Program for testing of the OSMU. EM-1 is scheduled to launch in 2018.
Computational strategies in the dynamic simulation of constrained flexible MBS
NASA Technical Reports Server (NTRS)
Amirouche, F. M. L.; Xie, M.
1993-01-01
This research focuses on the computational dynamics of flexible constrained multibody systems. At first a recursive mapping formulation of the kinematical expressions in a minimum dimension as well as the matrix representation of the equations of motion are presented. The method employs Kane's equation, FEM, and concepts of continuum mechanics. The generalized active forces are extended to include the effects of high temperature conditions, such as creep, thermal stress, and elastic-plastic deformation. The time variant constraint relations for rolling/contact conditions between two flexible bodies are also studied. The constraints for validation of MBS simulation of gear meshing contact using a modified Timoshenko beam theory are also presented. The last part deals with minimization of vibration/deformation of the elastic beam in multibody systems making use of time variant boundary conditions. The above methodologies and computational procedures developed are being implemented in a program called DYAMUS.
Statistical Inference in Hidden Markov Models Using k-Segment Constraints
Titsias, Michalis K.; Holmes, Christopher C.; Yau, Christopher
2016-01-01
Hidden Markov models (HMMs) are one of the most widely used statistical methods for analyzing sequence data. However, the reporting of output from HMMs has largely been restricted to the presentation of the most-probable (MAP) hidden state sequence, found via the Viterbi algorithm, or the sequence of most probable marginals using the forward–backward algorithm. In this article, we expand the amount of information we could obtain from the posterior distribution of an HMM by introducing linear-time dynamic programming recursions that, conditional on a user-specified constraint on the number of segments, allow us to (i) find MAP sequences, (ii) compute posterior probabilities, and (iii) simulate sample paths. We collectively call these recursions k-segment algorithms and illustrate their utility using simulated and real examples. We also highlight the prospective and retrospective use of k-segment constraints for fitting HMMs or exploring existing model fits. Supplementary materials for this article are available online. PMID:27226674
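The baseline that the k-segment recursions extend is the standard Viterbi dynamic program for the MAP state path. A minimal two-state sketch follows; the toy model is illustrative and is not the paper's implementation.

```python
# Standard Viterbi recursion: V[t][s] is the probability of the best
# state path ending in state s at time t; back pointers recover the path.
# The two-state "sticky" toy model below is illustrative only.

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most probable hidden state sequence for obs."""
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prev, p = max(((r, V[t - 1][r] * trans_p[r][s]) for r in states),
                          key=lambda rp: rp[1])
            V[t][s] = p * emit_p[s][obs[t]]
            back[t][s] = prev
    # Trace back from the best final state.
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

states = ("low", "high")
start = {"low": 0.5, "high": 0.5}
trans = {"low": {"low": 0.9, "high": 0.1}, "high": {"low": 0.1, "high": 0.9}}
emit = {"low": {0: 0.8, 1: 0.2}, "high": {0: 0.2, 1: 0.8}}
print(viterbi([0, 0, 1, 1, 1], states, start, trans, emit))
```

The k-segment algorithms augment exactly this table with a segment-count index, so the maximization can be restricted to paths with a given number of state changes.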
A mathematical and numerical model is developed to simulate the transport and fate of NAPLs (Non-Aqueous Phase Liquids) in near-surface granular soils. The resulting three-dimensional, three phase simulator is called NAPL. The simulator accommodates three mobile phases: water, NA...
NASA Technical Reports Server (NTRS)
1996-01-01
Under the Enabling Propulsion Materials (EPM) program - a partnership between NASA, Pratt & Whitney, and GE Aircraft Engines - the Materials and Structures Divisions of the NASA Lewis Research Center are involved in developing a fan-containment system for the High-Speed Civil Transport (HSCT). The program calls for a baseline system to be designed by the end of 1995, with subsequent testing of innovative concepts. Five metal candidate materials are currently being evaluated for the baseline system in the Structures Division's Ballistic Impact Facility. This facility was developed to provide the EPM program with cost-efficient and timely impact test data. At the facility, material specimens are impacted at speeds up to 350 m/sec by projectiles of various sizes and shapes to assess the specimens' ability to absorb energy and withstand impact. The tests can be conducted at either room or elevated temperatures. Posttest metallographic analysis is conducted to improve understanding of the failure modes. A dynamic finite element program is used to simulate the events and both guide the testing as well as aid in designing the fan-containment system.
Genotype calling from next-generation sequencing data using haplotype information of reads
Zhi, Degui; Wu, Jihua; Liu, Nianjun; Zhang, Kui
2012-01-01
Motivation: Low coverage sequencing provides an economic strategy for whole genome sequencing. When sequencing a set of individuals, genotype calling can be challenging due to low sequencing coverage. Linkage disequilibrium (LD) based refinement of genotyping calling is essential to improve the accuracy. Current LD-based methods use read counts or genotype likelihoods at individual potential polymorphic sites (PPSs). Reads that span multiple PPSs (jumping reads) can provide additional haplotype information overlooked by current methods. Results: In this article, we introduce a new Hidden Markov Model (HMM)-based method that can take into account jumping reads information across adjacent PPSs and implement it in the HapSeq program. Our method extends the HMM in Thunder and explicitly models jumping reads information as emission probabilities conditional on the states of adjacent PPSs. Our simulation results show that, compared to Thunder, HapSeq reduces the genotyping error rate by 30%, from 0.86% to 0.60%. The results from the 1000 Genomes Project show that HapSeq reduces the genotyping error rate by 12 and 9%, from 2.24% and 2.76% to 1.97% and 2.50% for individuals with European and African ancestry, respectively. We expect our program can improve genotyping qualities of the large number of ongoing and planned whole genome sequencing projects. Contact: dzhi@ms.soph.uab.edu; kzhang@ms.soph.uab.edu Availability: The software package HapSeq and its manual can be found and downloaded at www.ssg.uab.edu/hapseq/. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22285565
FLY MPI-2: a parallel tree code for LSS
NASA Astrophysics Data System (ADS)
Becciani, U.; Comparato, M.; Antonuccio-Delogu, V.
2006-04-01
New version program summaryProgram title: FLY 3.1 Catalogue identifier: ADSC_v2_0 Licensing provisions: yes Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADSC_v2_0 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland No. of lines in distributed program, including test data, etc.: 158 172 No. of bytes in distributed program, including test data, etc.: 4 719 953 Distribution format: tar.gz Programming language: Fortran 90, C Computer: Beowulf cluster, PC, MPP systems Operating system: Linux, Aix RAM: 100M words Catalogue identifier of previous version: ADSC_v1_0 Journal reference of previous version: Comput. Phys. Comm. 155 (2003) 159 Does the new version supersede the previous version?: yes Nature of problem: FLY is a parallel collisionless N-body code for the calculation of the gravitational force Solution method: FLY is based on the hierarchical oct-tree domain decomposition introduced by Barnes and Hut (1986) Reasons for the new version: The new version of FLY is implemented by using the MPI-2 standard: the distributed version 3.1 was developed by using the MPICH2 library on a PC Linux cluster. Today the FLY performance allows us to consider the FLY code among the most powerful parallel codes for tree N-body simulations. Another important new feature regards the availability of an interface with hydrodynamical Paramesh based codes. Simulations must follow a box large enough to accurately represent the power spectrum of fluctuations on very large scales so that we may hope to compare them meaningfully with real data. The number of particles then sets the mass resolution of the simulation, which we would like to make as fine as possible. 
The idea of building an interface between two codes that have different and complementary cosmological tasks allows us to execute complex cosmological simulations with FLY, specialized for DM evolution, and a code specialized for hydrodynamical components that uses a Paramesh block structure. Summary of revisions: The parallel communication schema was totally changed. The new version adopts the MPICH2 library; now FLY can be executed on all Unix systems having an MPI-2 standard library. The main data structure is declared in a module procedure of FLY (fly_h.F90 routine). FLY creates an MPI window object for one-sided communication for each of the shared arrays, with a call like the following: CALL MPI_WIN_CREATE(POS, SIZE, REAL8, MPI_INFO_NULL, MPI_COMM_WORLD, WIN_POS, IERR). The following main window objects are created: win_pos, win_vel, win_acc (particle positions, velocities, and accelerations) and win_pos_cell, win_mass_cell, win_quad, win_subp, win_grouping (cell positions, masses, quadrupole momenta, tree structure, and grouping cells). Other windows are created for dynamic load balance and global counters. Restrictions: The program uses the leapfrog integration scheme, but this could be changed by the user. Unusual features: FLY uses the MPI-2 standard; the MPICH2 library on Linux systems was adopted. To run this version of FLY, the working directory must be shared among all the processors that execute FLY. Additional comments: Full documentation for the program is included in the distribution in the form of a README file, a User Guide, and a Reference manuscript. Running time: the IBM Linux Cluster 1350 at Cineca (512 nodes with 2 processors per node and 2 GB RAM per processor) was used for performance tests. Processor type: Intel Xeon Pentium IV 3.0 GHz with 512 KB cache (128 nodes have Nocona processors). Internal network: Myricom LAN Card "C" Version and "D" Version. Operating system: Linux SuSE SLES 8.
The code was compiled with the mpif90 compiler (version 8.1) using basic optimization options, so that the measured performance can be meaningfully compared with that of other generic clusters.
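The leapfrog scheme mentioned under Restrictions can be sketched as a kick-drift-kick loop. This is a generic illustration with a hypothetical harmonic-oscillator force, not FLY's gravity kernel; its appeal for N-body work is that it is symplectic, so energy errors stay bounded over long runs.

```python
def leapfrog(x, v, accel, dt, steps):
    """Kick-drift-kick leapfrog integration of dx/dt = v, dv/dt = accel(x)."""
    a = accel(x)
    for _ in range(steps):
        v += 0.5 * dt * a   # half kick
        x += dt * v         # drift
        a = accel(x)
        v += 0.5 * dt * a   # half kick
    return x, v

# Harmonic oscillator a = -x, initial energy 0.5; leapfrog should conserve
# the energy to O(dt^2) even over many periods.
x, v = leapfrog(1.0, 0.0, lambda q: -q, dt=0.01, steps=10_000)
energy = 0.5 * v * v + 0.5 * x * x
```

Swapping the `accel` callback for a tree-based gravity evaluation (as in the Barnes–Hut scheme) turns the same loop into a collisionless N-body integrator.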
Management simulations for Lean healthcare: exploiting the potentials of role-playing.
Barnabè, Federico; Giorgino, Maria Cleofe; Guercini, Jacopo; Bianciardi, Caterina; Mezzatesta, Vincenzo
2018-04-09
Purpose The purpose of this paper is to investigate the potential of role-playing (RP) both in training healthcare (HC) professionals to implement tools and improvement actions based on Lean principles, and in supporting group discussion and the sharing of different competencies for the development of Lean HC. Design/methodology/approach The paper presents the case study of an RP simulation called LEAN HEALTHCARE LAB, which is used to train HC professionals at Siena University Hospital. The paper reports and discusses the results of a specific two-day simulation session and of a questionnaire distributed to gather feedback from the participants. Findings The paper confirms the potential of RP as a powerful educational and training tool, able to stimulate HC participants to apply Lean thinking principles and to share their competencies in collaborative decision-making processes. Research limitations/implications The study provides data from a single simulation session, although the game has already been applied several times in different HC organizations with very similar outcomes. Moreover, a more in-depth analysis of players' perceptions and decisions could be performed using tools in addition to the adopted questionnaire. Practical implications RP games (RPGs) are effective training and educational tools for HC professionals, offering benefits and learning conditions distinctly different from those of more conventional education programs. Originality/value While previous studies have extensively discussed the potential of RPGs and simulations in training programs, only a few articles have discussed adopting RP for Lean thinking, and even fewer have used it to educate HC professionals on Lean principles and tools.
NASA Astrophysics Data System (ADS)
de Brum, A. G. V.; da Cruz, F. C.; Hetem, A., Jr.
2015-10-01
To assist in the investigation of the triple asteroid system 2001-SN263, the deep space mission ASTER will carry onboard a laser altimeter. The instrument is named ALR and its development is now in progress. To help in the instrument design, with a view to creating software to control the instrument, a package of computer programs was produced to simulate the operation of a pulsed laser altimeter whose operating principle is the measurement of the time of flight of the travelling pulse. This software simulator, called ALR_Sim, produces results representing the return signal expected when laser pulses are fired toward a target, reflect from it, and return to be detected by the instrument. The program was successfully tested in some of the most common situations expected, and now constitutes the main workbench for creating and testing the control software to be embarked in the ALR. In addition, the simulator is an important tool to assist the creation of software to be used on Earth for processing and analyzing the data received from the instrument. This work presents results for the special case of modeling a surface with a crater, along with simulation of the instrument operating above this type of terrain. The study shows that comparing the waveform of the return signal after reflection of the laser pulse on the crater surface with the return signal expected from a flat, homogeneous surface is a useful method for terrain detail extraction.
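The time-of-flight principle underlying ALR_Sim reduces to range = c·t/2 for the two-way trip. A minimal sketch (the numbers are illustrative, not ALR's actual parameters):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_from_range(r_meters):
    """Two-way travel time of a pulse to a target at the given range."""
    return 2.0 * r_meters / C

def range_from_tof(t_seconds):
    """Invert the measurement: one-way range from the two-way time of flight."""
    return C * t_seconds / 2.0

# A target 15 km away returns the pulse after roughly 100 microseconds.
t = tof_from_range(15_000.0)
r = range_from_tof(t)

# Timing resolution maps directly to range resolution: each nanosecond of
# timing uncertainty corresponds to about 0.15 m of range.
range_resolution = C * 1e-9 / 2.0
```

Terrain features such as a crater slope spread the echo in time, which is why the abstract compares the returned waveform against the flat-surface expectation.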
Gas-Expanded Liquids: Synergism of Experimental and Computational Determinations of Local Structure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Charles A. Eckert; Charles L. Liotta; Rigoberto Hernandez
2007-06-26
This project focuses on the characterization of a new class of solvent systems called gas-expanded liquids (GXLs), targeted for green-chemistry processing. The collaboration has adopted a synergistic approach combining molecular dynamics (MD) simulation and spectroscopic experiments to explore local solvent behavior that could not be studied by simulation or experiment alone. The major accomplishments of this project are: • Applied MD simulations to explore the non-uniform structure of CO2/methanol and CO2/acetone GXLs and studied their dynamic behavior with self-diffusion coefficients and correlation functions • Studied local solvent structure and solvation behavior with a combination of spectroscopy and MD simulations • Measured transport properties of heterocyclic solutes in GXLs through Taylor-Aris diffusion techniques and compared these findings to those of MD simulations • Probed local polarity and specific solute-solvent interactions with Diels-Alder and SN2 reaction studies. The broader scientific impact of the research activities under this contract has been recognized by two recent awards: the Presidential Green Chemistry Award (Eckert & Liotta) and a fellowship in the American Association for the Advancement of Science (Hernandez). In addition to the technical aspects of this contract, the investigators have been engaged in a number of programs extending the broader impacts of this project. The project has directly supported the development of two postdoctoral researchers, four graduate students, and five undergraduate students. Several of the undergraduate students were co-funded by a Georgia Tech program, the Presidential Undergraduate Research Award. Another student, an African-American female, graduated from Georgia Tech in December 2005 and was co-funded through an NSF Research Experiences for Undergraduates (REU) award.
Validation of Robotic Surgery Simulator (RoSS).
Kesavadas, Thenkurussi; Stegemann, Andrew; Sathyaseelan, Gughan; Chowriappa, Ashirwad; Srimathveeravalli, Govindarajan; Seixas-Mikelus, Stéfanie; Chandrasekhar, Rameella; Wilding, Gregory; Guru, Khurshid
2011-01-01
Recent growth of the daVinci Robotic Surgical System as a minimally invasive surgery tool has led to a call for better training of future surgeons. In this paper, a new virtual reality simulator, called RoSS, is presented. Initial results from two studies, of face and content validity, are very encouraging. 90% of the cohort of expert robotic surgeons felt that the simulator was excellent or somewhat close to the touch and feel of the daVinci console. Content validity of the simulator received 90% approval in some cases. These studies demonstrate that RoSS has the potential of becoming an important training tool for the daVinci surgical robot.
Simulation and Preliminary Design of a Cold Stream Experiment on Omega EP
NASA Astrophysics Data System (ADS)
Coffing, Shane; Angulo, Adrianna; Trantham, Matt; Malamud, Guy; Kuranz, Carolyn; Drake, R. P.
2017-10-01
Galaxies form within dark matter halos, accreting gas that may clump and eventually form stars. Infalling matter gradually increases the density of the halo and, if cooling is insufficient, rising pressure forms a shock that slows the infalling gas, reducing star formation. However, galaxies with sufficient cooling become prolific star formers. A recent theory suggests that so-called "stream-fed galaxies" are able to acquire steady streams of cold gas via galactic "filaments" that penetrate the halo. The cold, dense filament flowing into a hot, less dense environment is potentially Kelvin-Helmholtz unstable. This instability may hinder the ability of the stream to deliver gas deeply enough into the halo. To study this process, we have begun preliminary design of a well-scaled laser experiment on Omega EP. We present here early simulation results and the physics involved. This work is funded by the U.S. Department of Energy, through the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, Grant Number DE-NA0002956, and the National Laser User Facility Program, Grant Number DE-NA0002719, and through the Laboratory for Laser Energetics, University of Rochester by the NNSA/OICF under Cooperative Agreement No. DE-NA0001944.
Outsourcing an Effective Postdischarge Call Program: A Collaborative Approach.
Meek, Kevin L; Williams, Paula; Unterschuetz, Caryn J
To improve patient satisfaction ratings and decrease readmissions, many organizations utilize internal staff to complete postdischarge calls to recently released patients. Developing, implementing, monitoring, and sustaining an effective call program can be challenging, and has eluded even some renowned medical centers in the country. By collaborating with an outsourced vendor that brings state-of-the-art call technology and specially trained callers, health systems can achieve elevated levels of engagement and satisfaction for their patients postdischarge.
Collaborative Employee Wellness: Living Healthy With Diabetes.
Hovatter, Joan McGarvev; Cooke, Catherine E; de Bittner, Magaly Rodriguez
Innovative approaches to managing an employee population with a high prevalence of type 2 diabetes mellitus can mitigate costs for employers by improving employees' health. This article describes such an approach at McCormick & Company, Inc., where participants had statistically significant improvements in weight, average plasma glucose concentration (also called glycated hemoglobin or A1c), and cholesterol. A simulation analysis applying the findings of the study population to Maryland employees with a baseline A1c of greater than 6.0% showed that participation in the program could improve glycemic control in these patients, reducing the A1c by 0.24% on average, with associated cost savings for the employer.
GPS Software Packages Deliver Positioning Solutions
NASA Technical Reports Server (NTRS)
2010-01-01
To determine a spacecraft's position, the Jet Propulsion Laboratory (JPL) developed an innovative software program called the GPS (global positioning system)-Inferred Positioning System and Orbit Analysis Simulation Software, abbreviated GIPSY-OASIS, and also developed Real-Time GIPSY (RTG) for certain time-critical applications. First featured in Spinoff 1999, the software has since been licensed by JPL hundreds of times, including to Longmont, Colorado-based DigitalGlobe. Using the technology, DigitalGlobe produces satellite imagery with highly precise latitude and longitude coordinates and supplies it for uses within defense and intelligence, civil agencies, mapping and analysis, environmental monitoring, oil and gas exploration, infrastructure management, Internet portals, and navigation technology.
Climatic Models Ensemble-based Mid-21st Century Runoff Projections: A Bayesian Framework
NASA Astrophysics Data System (ADS)
Achieng, K. O.; Zhu, J.
2017-12-01
There are a number of North American Regional Climate Change Assessment Program (NARCCAP) climatic models that have been used to project surface runoff in the mid-21st century. Statistical model selection techniques are often used to select the model that best fits the data; however, different selection techniques often lead to different conclusions. In this study, ten models are averaged in a Bayesian paradigm to project runoff. Bayesian Model Averaging (BMA) is used to project runoff and to identify the effect of model uncertainty on future projections. Baseflow separation, using a two-parameter recursive digital filter (also called the Eckhardt filter), is used to separate USGS streamflow (total runoff) into two components: baseflow and surface runoff. We use this surface runoff as the a priori runoff when conducting BMA of runoff simulated from the ten RCM models. The primary objective of this study is to evaluate how well RCM multi-model ensembles simulate surface runoff in a Bayesian framework. Specifically, we investigate and discuss the following questions: How well does the ten-model RCM ensemble jointly simulate surface runoff when averaged using BMA, given a priori surface runoff? What are the effects of model uncertainty on surface runoff simulation?
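The Eckhardt filter mentioned in the abstract has a standard two-parameter recursive form, b_k = [(1 − BFImax)·a·b_(k−1) + (1 − a)·BFImax·y_k] / (1 − a·BFImax), with the constraint that baseflow never exceeds total streamflow. A minimal sketch with illustrative parameter values (not necessarily those used in the study):

```python
def eckhardt_baseflow(streamflow, alpha=0.98, bfi_max=0.80):
    """Eckhardt two-parameter recursive digital filter.

    Splits a total-streamflow series into baseflow and surface (direct)
    runoff. alpha is the recession constant; bfi_max is the maximum
    baseflow index. Both values here are common defaults, not the study's.
    """
    base = [streamflow[0] * bfi_max]   # simple initialization choice
    for y in streamflow[1:]:
        bk = ((1 - bfi_max) * alpha * base[-1]
              + (1 - alpha) * bfi_max * y) / (1 - alpha * bfi_max)
        base.append(min(bk, y))        # baseflow cannot exceed total flow
    surface = [y - b for y, b in zip(streamflow, base)]
    return base, surface

# A storm hydrograph: baseflow rises slowly while the surface-runoff
# component captures the flood peak.
q = [10.0, 50.0, 40.0, 30.0, 20.0, 15.0, 12.0, 11.0, 10.0, 10.0]
base, surface = eckhardt_baseflow(q)
```

The resulting `surface` series is the "a priori runoff" role described above: the quantity against which the BMA-weighted ensemble of simulated runoff is compared.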
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G.A.; Lake, L.W.; Sepehrnoori, K.
1988-11-01
The objective of this research is to develop, validate, and apply a comprehensive chemical flooding simulator for chemical recovery processes involving surfactants, polymers, and alkaline chemicals in various combinations. This integrated program includes laboratory experiments, physical property modelling, scale-up theory, and numerical analysis as necessary and integral components of the simulation activity. Development, testing, and application of the flooding simulator (UTCHEM) on a wide variety of laboratory and reservoir problems involving tracers, polymers, polymer gels, surfactants, and alkaline agents has continued. Improvements in both the physical-chemical and numerical aspects of UTCHEM have been made which enhance its versatility, accuracy, and speed. Supporting experimental studies during the past year include relative permeability and trapping of microemulsion, tracer flow studies, oil recovery in cores using alcohol-free surfactant slugs, and microemulsion viscosity measurements. These have enabled model improvement and simulator testing. Another code, called PROPACK, has also been developed and is used as a preprocessor for UTCHEM; specifically, it evaluates input to UTCHEM by computing and plotting key physical properties such as phase behavior and interfacial tension.
X-38 Experimental Controls Laws
NASA Technical Reports Server (NTRS)
Munday, Steve; Estes, Jay; Bordano, Aldo J.
2000-01-01
X-38 is a NASA JSC/DFRC experimental flight test program developing a series of prototypes for an International Space Station (ISS) Crew Return Vehicle, often called an ISS "lifeboat." X-38 Vehicle 132 Free Flight 3, currently scheduled for the end of this month, will be the first flight test of a modern FCS architecture called Multi-Application Control-Honeywell (MACH), originally developed by the Honeywell Technology Center. MACH wraps classical P&I outer attitude loops around a modern dynamic inversion attitude rate loop. The dynamic inversion process requires that the flight computer carry an onboard model of the expected vehicle dynamics, based upon the aerodynamic database. Dynamic inversion is computationally intensive, so some timing modifications were made to implement MACH on the slower flight computers of the subsonic test vehicles. In addition to linear stability margin analyses and high-fidelity 6-DOF simulation, hardware-in-the-loop testing is used to verify the implementation of MACH and its robustness to aerodynamic and environmental uncertainties and disturbances.
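The dynamic inversion idea at the core of MACH can be sketched for a scalar rate loop: given modeled dynamics p_dot = f(p) + b·u, solve for the control u that produces the rate derivative commanded by the outer loop. The model and gains below are made-up toy values, not the X-38 aerodynamic database or MACH's actual loop structure.

```python
def dynamic_inversion_step(p, p_dot_desired, f, b):
    """Invert the modeled rate dynamics p_dot = f(p) + b*u for the control u
    that achieves the commanded rate derivative."""
    return (p_dot_desired - f(p)) / b

# Hypothetical roll-rate model: p_dot = -0.5*p + 2.0*u (coefficients invented).
f = lambda p: -0.5 * p
b = 2.0

# An outer proportional loop (standing in for MACH's classical outer loops)
# commands a rate derivative toward the target rate; the inversion converts
# that command into a control deflection.
p, dt, target_rate = 0.0, 0.01, 1.0
for _ in range(1000):
    nu = 5.0 * (target_rate - p)              # commanded p_dot
    u = dynamic_inversion_step(p, nu, f, b)
    p += dt * (f(p) + b * u)                  # plant matches the model exactly
```

When the onboard model matches the plant, the inverted loop behaves as a pure integrator driven by the outer loop, and p converges to the target rate; model mismatch is exactly what the robustness testing described above probes.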
77 FR 72364 - National Institute of Allergy and Infectious Diseases; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-05
... Conference Call). Contact Person: Lynn Rust, Ph.D., Scientific Review Officer, Scientific Review Program... Call). Contact Person: Lynn Rust, Ph.D., Scientific Review Officer, Scientific Review Program, Division...
76 FR 3653 - Alaska Region's Subsistence Resource Commission (SRC) Program; Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-20
... subsistence management issues. The NPS SRC program is authorized under Title VIII, Section 808 of the Alaska...: 1. Call to order. 2. SRC Roll Call and Confirmation of Quorum. 3. Welcome and Introductions. 4.... c. Resource Management Program Update. 14. Public and other Agency Comments. 15. SRC Work Session...
A Call for Reformation of Teacher Preparation Programs in the United States
ERIC Educational Resources Information Center
Dann, Ashley Ireland
2014-01-01
Although current research, educational theorists, and international comparisons demonstrate a need for reform, the United States' teacher preparation programs are failing. The following paper will call for the reform of teacher preparation programs in three distinct areas. Examination of current data, application of educational theorists'…
Exploring the Dynamics of Cell Processes through Simulations of Fluorescence Microscopy Experiments
Angiolini, Juan; Plachta, Nicolas; Mocskos, Esteban; Levi, Valeria
2015-01-01
Fluorescence correlation spectroscopy (FCS) methods are powerful tools for unveiling the dynamical organization of cells. For simple cases, such as molecules passively moving in a homogeneous medium, FCS analysis yields analytical functions that can be fitted to the experimental data to recover the phenomenological rate parameters. Unfortunately, many dynamical processes in cells do not follow these simple models, and in many instances it is not possible to obtain an analytical function through a theoretical analysis of a more complex model. In such cases, experimental analysis can be combined with Monte Carlo simulations to aid in interpretation of the data. In response to this need, we developed a method called FERNET (Fluorescence Emission Recipes and Numerical routines Toolkit) based on Monte Carlo simulations and the MCell-Blender platform, which was designed to treat the reaction-diffusion problem under realistic scenarios. This method enables us to set complex geometries of the simulation space, distribute molecules among different compartments, and define interspecies reactions with selected kinetic constants, diffusion coefficients, and species brightness. We apply this method to simulate single- and multiple-point FCS, photon-counting histogram analysis, raster image correlation spectroscopy, and two-color fluorescence cross-correlation spectroscopy. We believe that this new program could be very useful for predicting and understanding the output of fluorescence microscopy experiments. PMID:26039162
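The Monte Carlo approach behind FERNET can be illustrated in miniature: let particles diffuse, convert their positions into a fluorescence intensity through a Gaussian observation profile, and autocorrelate the resulting trace. This is a schematic 1-D sketch with invented parameters, far simpler than FERNET's MCell-Blender machinery, but it reproduces the essential FCS observable: a correlation that decays on the diffusion timescale through the observation volume.

```python
import math
import random

random.seed(2)  # fixed seed so the sketch is reproducible

def simulate_fcs(n_particles=50, steps=4000, box=10.0, step_sigma=0.1, w=0.5):
    """Brownian particles in a periodic 1-D box; a Gaussian observation
    profile of waist w converts positions into one intensity per time step."""
    xs = [random.uniform(0.0, box) for _ in range(n_particles)]
    center = box / 2.0
    trace = []
    for _ in range(steps):
        for i in range(n_particles):
            xs[i] = (xs[i] + random.gauss(0.0, step_sigma)) % box
        trace.append(sum(math.exp(-((x - center) ** 2) / (2.0 * w * w))
                         for x in xs))
    return trace

def autocorrelation(trace, max_lag):
    """Normalized autocorrelation of intensity fluctuations, G(lag)."""
    mean = sum(trace) / len(trace)
    fluct = [f - mean for f in trace]
    var = sum(d * d for d in fluct) / len(fluct)
    return [sum(fluct[t] * fluct[t + lag] for t in range(len(fluct) - lag))
            / ((len(fluct) - lag) * var)
            for lag in range(max_lag)]

trace = simulate_fcs()
G = autocorrelation(trace, max_lag=200)   # G[0] = 1 by construction; decays
```

For the processes the abstract targets (reactions, compartments, anomalous transport), no closed-form G exists, which is exactly why simulated traces like this one are fitted or compared against experiment instead.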
Ota, Ken S; Beutler, David S; Sheikh, Hassam; Weiss, Jessica L; Parkinson, Dallin; Nguyen, Peter; Gerkin, Richard D; Loli, Akil I
2013-10-01
This study investigated the trend of phone calls in the Banner Good Samaritan Medical Center (BGSMC) Heart Failure Transitional Care Program (HFTCP). The primary goal of the HFTCP is to reduce 30-day readmissions for heart failure patients using a multi-pronged approach. This study included 104 patients in the HFTCP discharged over a 51-week period who had around-the-clock telephone access to the Transitionalist. Cellular phone records were reviewed, and the length and timing of calls were evaluated. A total of 4398 telephone calls were recorded, of which 39% were inbound and 61% were outbound, averaging 86 calls per week. Eighty-five percent of the total calls were made during the "Weekday Daytime" period. There were 229 calls during the "Weekday Nights" period, with 1.5 inbound calls per week. "Total Weekend" calls were 10.2% of the total, a weekly average of 8.8. Our experience is that direct physician-patient telephone contact is feasible with a panel of around 100 HF patients for one provider. If proper financial reimbursement is provided, physicians may be apt to participate in similar transitional care programs. Likewise, third-party payers will benefit from the reduction in unnecessary emergency room visits and hospitalizations.
Programs and Place: Risk and Asset Mapping for Fall Prevention
Smith, Matthew Lee; Towne, Samuel D.; Motlagh, Audry S.; Smith, Donald R.; Boolani, Ali; Horel, Scott A.; Ory, Marcia G.
2017-01-01
Identifying ways to measure access, availability, and utilization of health-care services, relative to at-risk areas or populations, is critical in providing practical and actionable information to key stakeholders. This study identified the prevalence and geospatial distribution of fall-related emergency medical services (EMS) calls in relation to the delivery of an evidence-based fall prevention program in Tarrant County, Texas over a 3-year period. It aims to educate public health professionals and EMS first responders about the application of geographic information system programs to identify risk-related "hot spots," service gaps, and community assets to reduce falls among older adults. On average, 96.09 (±108.65) calls were received per ZIP Code (range: 0 to 386). Average EMS calls per ZIP Code increased from 30.80 (±34.70) in 2009 to 33.75 (±39.58) in 2011, a modest annual increase over the 3-year study period. The percent of ZIP Codes offering A Matter of Balance/Volunteer Lay Leader Model (AMOB/VLL) workshops increased from 27.3% in 2009 to 34.5% in 2011. On average, AMOB/VLL workshops were offered in ZIP Codes with more fall-related EMS calls over the 3-year study period. Findings suggest that the study community was providing evidence-based fall prevention programming (AMOB/VLL workshops) in higher-risk areas. Opportunities for strategic service expansion were revealed through the identification of fall-related hot spots and asset mapping. PMID:28361049
Current status of endoscopic simulation in gastroenterology fellowship training programs.
Jirapinyo, Pichamol; Thompson, Christopher C
2015-07-01
Recent guidelines have encouraged gastroenterology and surgical training programs to integrate simulation into their core endoscopic curricula. However, the role that simulation currently has within training programs is unknown. This study aims to assess the current status of simulation among gastroenterology fellowship programs. The questionnaire study consisted of 38 fields divided into two sections. The first section queried program directors' experience with simulation and assessed its current status at their institution. The second portion surveyed their opinion on the potential role of simulation in the training curriculum. The study was conducted at the 2013 American Gastroenterological Association Training Directors' Workshop in Phoenix, Arizona. The participants were program directors from Accreditation Council for Graduate Medical Education accredited gastroenterology training programs who attended the workshop. The questionnaire was returned by 69 of 97 program directors (response rate of 71%). 42% of programs had an endoscopic simulator. Computerized simulators (61.5%) were the most common, followed by mechanical (30.8%) and animal tissue (7.7%) simulators. Eleven programs (15%) required fellows to use simulation prior to clinical cases, and only one program set a minimum number of hours of simulation training. Current simulators are deemed easy to use (76%) and good educational tools (65%); the main problems are cost (72%) and accessibility (69%). The majority of program directors believe there is a need for endoscopic simulator training, with only 8% disagreeing, and a majority believe there is a role for simulation prior to initiation of clinical cases, with 15% disagreeing. Gastroenterology fellowship program directors widely recognize the importance of simulation.
Nevertheless, simulation is used by only 42% of programs and only 15% of programs require that trainees use simulation prior to clinical cases. No programs currently use simulation as part of the evaluation process.
Measuring Pilot Workload in a Moving-base Simulator. Part 2: Building Levels of Workload
NASA Technical Reports Server (NTRS)
Kantowitz, B. H.; Hart, S. G.; Bortolussi, M. R.; Shively, R. J.; Kantowitz, S. C.
1984-01-01
Studies of pilot behavior in flight simulators often use a secondary task as an index of workload. It is routine to regard flying as the primary task and some less complex task as the secondary task. While this assumption is quite reasonable for most secondary tasks used to study mental workload in aircraft, treating flight through a carefully crafted scenario as a unitary task is less justified. The present research acknowledges that total mental workload depends upon the specific nature of the sub-tasks a pilot must complete. As a first approximation, flight tasks were divided into three levels of complexity. The simplest level (called the Base Level) requires elementary maneuvers that do not utilize all the degrees of freedom of which an aircraft, or a moving-base simulator, is capable. The second level (called the Paired Level) requires the pilot to simultaneously execute two Base Level tasks. The third level (called the Complex Level) imposes three simultaneous constraints upon the pilot.
Computer programs for calculating potential flow in propulsion system inlets
NASA Technical Reports Server (NTRS)
Stockman, N. O.; Button, S. L.
1973-01-01
In the course of designing inlets, particularly for VTOL and STOL propulsion systems, a calculational procedure utilizing three computer programs evolved. The chief program is the Douglas axisymmetric potential flow program called EOD, which calculates the incompressible potential flow about arbitrary axisymmetric bodies. The other two programs, original with Lewis, are called SCIRCL and COMBYN. Program SCIRCL generates input for EOD from various specified analytic shapes for the inlet components. Program COMBYN takes basic solutions output by EOD, combines them into solutions of interest, and applies a compressibility correction.
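COMBYN's role rests on the linearity of incompressible potential flow: basic solutions superpose component-wise into new solutions. A 2-D analogue of that idea (EOD/COMBYN work with axisymmetric solutions; the names and example here are purely illustrative) combines a uniform stream with a point source, which yields the classical Rankine half-body with a stagnation point at x = −m/(2πU):

```python
import math

def uniform_flow(U):
    """Velocity field (u, v) of a uniform stream of speed U along x."""
    return lambda x, y: (U, 0.0)

def source(strength, x0=0.0, y0=0.0):
    """2-D point source of volume flux `strength` located at (x0, y0)."""
    def vel(x, y):
        dx, dy = x - x0, y - y0
        c = strength / (2.0 * math.pi * (dx * dx + dy * dy))
        return (c * dx, c * dy)
    return vel

def combine(*fields):
    """Linear superposition: potential-flow solutions add component-wise,
    which is the principle behind combining basic solutions."""
    def vel(x, y):
        return (sum(f(x, y)[0] for f in fields),
                sum(f(x, y)[1] for f in fields))
    return vel

# Uniform stream + source: with U = 1 and m = 2*pi, the stagnation point
# sits one unit upstream of the source, at (-1, 0).
U, m = 1.0, 2.0 * math.pi
flow = combine(uniform_flow(U), source(m))
u, v = flow(-1.0, 0.0)
```

A compressibility correction, as COMBYN applies, would then rescale the combined incompressible result rather than re-solve the flow.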
The North American Amphibian Monitoring Program. [abstract
Griffin, J.
1998-01-01
The North American Amphibian Monitoring Program has been under development for the past three years. The monitoring strategy for NAAMP has five main prongs: terrestrial salamander surveys, calling surveys, aquatic surveys, western surveys, and atlassing. Of these five, calling surveys were selected as one of the first implementation priorities due to their friendliness to volunteers of varying knowledge levels, relatively low cost, and the fact that several groups had already pioneered the techniques involved. While some states and provinces, such as Wisconsin and Illinois, had implemented calling surveys prior to NAAMP, most states and provinces had little or no history of state- or province-wide amphibian monitoring. Thus, the majority of calling survey programs were initiated in the past two years. To assess the progress of this pilot phase, a program review was conducted on the status of the NAAMP calling survey program, and the results of that review will be presented at the meeting. Topics to be discussed include: who is doing what where, extent of route coverage, the continuing random route discussions, quality assurance, strengths and weaknesses of calling surveys, reliability of data, and directions for the future. In addition, a brief overview of the DISPro project will be included. DISPro is a new amphibian monitoring program in National Parks, funded by the Demonstration of Intensive Sites Program (DISPro) through the EPA and NPS. It will begin this year at Big Bend and Shenandoah National Parks. The purpose of the DISPro Amphibian Project will be to investigate relationships between environmental factors and stressors and the distribution, abundance, and health of amphibians in these National Parks. At each Park, amphibian long-term monitoring protocols will be tested, distributions and abundance of amphibians will be mapped, and field research experiments will be conducted to examine stressor effects on amphibians (e.g., ultraviolet radiation, contaminants, acidification).
47 CFR 22.921 - 911 call processing procedures; 911-only calling mode.
Code of Federal Regulations, 2010 CFR
2010-10-01
... programming in the mobile unit that determines the handling of a non-911 call and permit the call to be... CARRIER SERVICES PUBLIC MOBILE SERVICES Cellular Radiotelephone Service § 22.921 911 call processing procedures; 911-only calling mode. Mobile telephones manufactured after February 13, 2000 that are capable of...
McInnes, Colin W; Vorstenbosch, Joshua; Chard, Ryan; Logsetty, Sarvesh; Buchel, Edward W; Islur, Avinash
2018-02-01
The impact of resident work hour restrictions on training and patient care remains a highly controversial topic, and to date there has been no formal assessment as it pertains to Canadian plastic surgery residents. This study aimed to characterize the work hour profile of Canadian plastic surgery residents and to assess the perspectives of residents and program directors regarding work hour restrictions as they relate to surgical competency, resident wellness, and patient safety. An anonymous online survey developed by the authors was sent to all Canadian plastic surgery residents and program directors, and basic summary statistics were calculated. Eighty (53%) residents and 10 (77%) program directors responded. Residents reported working an average of 73 hours in hospital per week, with 8 call shifts per month, and sleeping 4.7 hours per night while on call. Most residents (88%) reported averaging 0 post-call days off per month, and 61% will work post-call without any sleep. The majority want the option of working post-call (63%) and oppose an 80-hour weekly maximum (77%). Surgical and medical errors attributed to post-call fatigue were self-reported by 26% and 49% of residents, respectively. Residents and program directors expressed concern about the ability to master surgical skills without working post-call. The majority of respondents oppose duty hour restrictions. The reason is likely multifactorial, including the desire of residents to meet perceived expectations and to master their surgical skills while supervised. If duty hour restrictions are aggressively implemented, many respondents feel that an increased duration of training may be necessary.
A Prototyping Effort for the Integrated Spacecraft Analysis System
NASA Technical Reports Server (NTRS)
Wong, Raymond; Tung, Yu-Wen; Maldague, Pierre
2011-01-01
Computer modeling and simulation has recently become an essential technique for predicting and validating spacecraft performance. However, most computer models examine only individual spacecraft subsystems, and the independent nature of the models creates integration problems, limiting the possibility of simulating a spacecraft as an integrated unit despite the desire for this type of analysis. A new project, called Integrated Spacecraft Analysis, was proposed to serve as a framework for an integrated simulation environment. The project is still in its infancy, but a software prototype will help future developers assess design issues. The prototype explores a service-oriented design paradigm that in principle allows programs written in different languages to communicate with one another. It includes a uniform interface to the SPICE libraries, so that different in-house tools such as APGEN or SEQGEN can exchange information with it without much change. A service-oriented system may be slower than a single application, and more research needs to be done on the available technologies, but a service-oriented approach could increase long-term maintainability and extensibility.
Test of high-energy hadronic interaction models with high-altitude cosmic-ray data
NASA Astrophysics Data System (ADS)
Haungs, A.; Kempa, J.
2003-09-01
Emulsion experiments placed at high mountain altitudes register hadrons and high-energy γ-rays with an energy threshold in the TeV region. These secondary shower particles are produced in the forward direction of interactions, mainly of primary protons and alpha-particles, in the Earth's atmosphere. Single γ's and hadrons are mainly produced by the interactions of primary cosmic-ray nuclei of energy below 10^15 eV. The measurements are therefore sensitive to the physics of high-energy hadronic interaction models, e.g., as implemented in the Monte Carlo air shower simulation program CORSIKA. Using detailed simulations invoking various models for the hadronic interactions, we compare the predictions for the single-particle spectra with data of the Pamir experiment. For higher primary energies, characteristics of so-called gamma-ray families are used for the comparisons. Including detailed simulations of the Pamir detector, we find that the data are incompatible with the HDPM and SIBYLL 1.6 models, but are in agreement with QGSJET, NEXUS, and VENUS.
Using simulation to educate hospital staff about casemix.
Cromwell, D A; Priddis, D; Hindle, D
1998-10-01
When the Australian government funded a casemix development program, few hospital clinicians or staff knew much about casemix classifications like Diagnosis Related Groups (DRGs). Although the concepts behind casemix are essentially simple, it is not a trivial task to explain the logic used to assign patients to classes, or the use of casemix data for management or funding. Therefore, as part of a project to create educational material, a computer-based management game, built around a simulation model of a hospital, was developed. The game was designed for use in a workshop setting, to allow participants to test their understanding of the casemix information presented to them. The simulation mimicked the operation of a hospital, with a player taking the role of a hospital manager. It aimed to demonstrate how AN-DRGs might be used for funding; how patient costs are influenced by hospital activity; and how casemix data can assist in monitoring the use of resources. The game, called Dragon, proved to be very successful, and is now distributed as part of the National Casemix Education series.
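The grouping logic the abstract alludes to can be illustrated with a toy sketch: assign each patient episode to a DRG-like class, then cost-weight the resulting casemix. The DRG codes, cost weights, and age split below are hypothetical; a real AN-DRG grouper is far more elaborate.

```python
# Toy sketch of casemix grouping: map each episode to a DRG-like class
# from diagnosis and age, then compute a simple casemix index.
# Codes and weights are invented for illustration only.

def assign_drg(diagnosis, age):
    """Map an episode to a hypothetical DRG code."""
    if diagnosis == "pneumonia":
        return "E62A" if age >= 65 else "E62B"   # complexity split by age
    if diagnosis == "appendicitis":
        return "G07B"
    return "Z99Z"  # ungroupable / other

COST_WEIGHTS = {"E62A": 1.8, "E62B": 1.1, "G07B": 1.3, "Z99Z": 0.5}

def casemix_index(episodes):
    """Average cost weight over all episodes (a simple casemix index)."""
    weights = [COST_WEIGHTS[assign_drg(d, a)] for d, a in episodes]
    return sum(weights) / len(weights)

episodes = [("pneumonia", 70), ("pneumonia", 40), ("appendicitis", 25)]
print(round(casemix_index(episodes), 2))
```

A casemix-based funder would pay a hospital in proportion to such an index, which is exactly the kind of relationship the Dragon game let workshop participants explore.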
Flight Simulator and Training Human Factors Validation
NASA Technical Reports Server (NTRS)
Glaser, Scott T.; Leland, Richard
2009-01-01
Loss of control has been identified as the leading cause of aircraft accidents in recent years. Efforts have been made to better equip pilots to deal with these types of events, commonly referred to as upsets. A major challenge in these endeavors has been recreating the motion environments found in flight as the majority of upsets take place well beyond the normal operating envelope of large aircraft. The Environmental Tectonics Corporation has developed a simulator motion base, called GYROLAB, that is capable of recreating the sustained accelerations, or G-forces, and motions of flight. A two part research study was accomplished that coupled NASA's Generic Transport Model with a GYROLAB device. The goal of the study was to characterize physiological effects of the upset environment and to demonstrate that a sustained motion based simulator can be an effective means for upset recovery training. Two groups of 25 Air Transport Pilots participated in the study. The results showed reliable signs of pilot arousal at specific stages of similar upsets. Further validation also demonstrated that sustained motion technology was successful in improving pilot performance during recovery following an extensive training program using GYROLAB technology.
Two-dimensional nonsteady viscous flow simulation on the Navier-Stokes computer miniNode
NASA Technical Reports Server (NTRS)
Nosenchuck, Daniel M.; Littman, Michael G.; Flannery, William
1986-01-01
The needs of large-scale scientific computation are outpacing the growth in performance of mainframe supercomputers. In particular, problems in fluid mechanics involving complex flow simulations require far more speed and capacity than that provided by current and proposed Class VI supercomputers. To address this concern, the Navier-Stokes Computer (NSC) was developed. The NSC is a parallel-processing machine, comprised of individual Nodes, each comparable in performance to current supercomputers. The global architecture is that of a hypercube, and a 128-Node NSC has been designed. New architectural features, such as a reconfigurable many-function ALU pipeline and a multifunction memory-ALU switch, have provided the capability to efficiently implement a wide range of algorithms. Efficient algorithms typically involve numerically intensive tasks, which often include conditional operations. These operations may be efficiently implemented on the NSC without, in general, sacrificing vector-processing speed. To illustrate the architecture, programming, and several of the capabilities of the NSC, the simulation of two-dimensional, nonsteady viscous flows on a prototype Node, called the miniNode, is presented.
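The conditional operations mentioned above are commonly handled in vector processors with masks, so the pipeline never branches. This pure-Python sketch shows the idea only; the NSC implemented it in reconfigurable hardware, so everything here is a software stand-in.

```python
# Sketch of mask-based conditional vector processing: compute a 0/1 mask,
# then blend the "then" and "else" results in a single vector pass, with
# no data-dependent branch inside the loop body.

def masked_update(u, threshold, damping):
    """Apply damping only where |u| exceeds threshold, in one vector pass."""
    mask = [1.0 if abs(x) > threshold else 0.0 for x in u]
    # blend: mask selects the damped value, (1 - mask) keeps the original
    return [m * (x * damping) + (1.0 - m) * x for m, x in zip(mask, u)]

u = [0.1, 2.0, -3.0, 0.5]
print(masked_update(u, 1.0, 0.5))   # only the two large entries are damped
```

In a real vector pipeline both candidate values stream through the ALU at full rate, which is why conditionals need not sacrifice vector-processing speed.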
Advancing the LSST Operations Simulator
NASA Astrophysics Data System (ADS)
Saha, Abhijit; Ridgway, S. T.; Cook, K. H.; Delgado, F.; Chandrasekharan, S.; Petry, C. E.; Operations Simulator Group
2013-01-01
The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions (including weather and seeing), as well as additional scheduled and unscheduled downtime. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history database are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. This poster reports recent work which has focussed on an architectural restructuring of the code that will allow us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator will be used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities, and assist with performance margin investigations of the LSST system.
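A merit function over a recorded observing history might look like the following sketch: a query-like pass over observation records that scores a science case by how many fields reach a target number of visits in good seeing. The record fields (`field`, `seeing`) and thresholds are assumptions for illustration, not SSTAR's actual schema.

```python
# Sketch of an SSTAR-style metric over a simulated observing history:
# fraction of fields that accumulate enough visits under a seeing cut.
# Field names and thresholds are hypothetical.

def visit_depth_metric(observations, seeing_max=1.0, target_visits=2):
    """Fraction of fields with >= target_visits visits at seeing <= seeing_max."""
    counts = {}
    for obs in observations:
        if obs["seeing"] <= seeing_max:
            counts[obs["field"]] = counts.get(obs["field"], 0) + 1
    fields = {obs["field"] for obs in observations}
    good = sum(1 for f in fields if counts.get(f, 0) >= target_visits)
    return good / len(fields)

history = [
    {"field": "A", "seeing": 0.8}, {"field": "A", "seeing": 0.9},
    {"field": "B", "seeing": 1.4}, {"field": "B", "seeing": 0.7},
]
print(visit_depth_metric(history))   # field A qualifies, field B does not
```

In practice such metrics are run against the MySQL observation database for each simulation run, so competing scheduling strategies can be compared on the same science criteria.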
uPy: a ubiquitous CG Python API with biological-modeling applications.
Autin, Ludovic; Johnson, Graham; Hake, Johan; Olson, Arthur; Sanner, Michel
2012-01-01
The uPy Python extension module provides a uniform abstraction of the APIs of several 3D computer graphics programs (called hosts), including Blender, Maya, Cinema 4D, and DejaVu. A plug-in written with uPy can run in all uPy-supported hosts. Using uPy, researchers have created complex plug-ins for molecular and cellular modeling and visualization. uPy can simplify programming for many types of projects (not solely science applications) intended for multihost distribution. It's available at http://upy.scripps.edu. The first featured Web extra is a video that shows interactive analysis of a calcium dynamics simulation. YouTube URL: http://youtu.be/wvs-nWE6ypo. The second featured Web extra is a video that shows rotation of the HIV virus. YouTube URL: http://youtu.be/vEOybMaRoKc.
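The single-plug-in, many-hosts design can be sketched as a simple adapter layer: host-independent plug-in code talks only to a uniform helper, and one adapter per host translates to that host's own calls. The class and method names below are illustrative and do not reflect the real uPy API.

```python
# Sketch of the abstraction idea behind uPy: the plug-in calls a uniform
# helper interface; per-host adapters translate to host-specific calls.
# All names here are hypothetical stand-ins.

class HostAdapter:
    def add_sphere(self, name, radius):
        raise NotImplementedError

class BlenderAdapter(HostAdapter):
    def add_sphere(self, name, radius):
        return f"blender: mesh '{name}' r={radius}"

class MayaAdapter(HostAdapter):
    def add_sphere(self, name, radius):
        return f"maya: polySphere '{name}' r={radius}"

def build_molecule(helper):
    """Host-independent plug-in code: only talks to the adapter."""
    return helper.add_sphere("atom_C", 0.7)

print(build_molecule(BlenderAdapter()))
print(build_molecule(MayaAdapter()))
```

The same `build_molecule` runs unchanged under every adapter, which is the property that lets one uPy plug-in run in all supported hosts.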
NASA Technical Reports Server (NTRS)
Haimes, Robert; Follen, Gregory J.
1998-01-01
CAPRI is a CAD-vendor neutral application programming interface designed for the construction of analysis and design systems. By allowing access to the geometry from within all modules (grid generators, solvers and post-processors) such tasks as meshing on the actual surfaces, node enrichment by solvers and defining which mesh faces are boundaries (for the solver and visualization system) become simpler. The overall reliance on file 'standards' is minimized. This 'Geometry Centric' approach makes multi-physics (multi-disciplinary) analysis codes much easier to build. By using the shared (coupled) surface as the foundation, CAPRI provides a single call to interpolate grid-node based data from the surface discretization in one volume to another. Finally, design systems are possible where the results can be brought back into the CAD system (and therefore manufactured) because all geometry construction and modification are performed using the CAD system's geometry kernel.
Parallel Evolutionary Optimization for Neuromorphic Network Training
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuman, Catherine D; Disney, Adam; Singh, Susheela
One of the key impediments to the success of current neuromorphic computing architectures is the issue of how best to program them. Evolutionary optimization (EO) is one promising programming technique; in particular, its wide applicability makes it especially attractive for neuromorphic architectures, which can have many different characteristics. In this paper, we explore different facets of EO on a spiking neuromorphic computing model called DANNA. We focus on the performance of EO in the design of our DANNA simulator, and on how to structure EO on both multicore and massively parallel computing systems. We evaluate how our parallel methods impact the performance of EO on Titan, the U.S.'s largest open science supercomputer, and BOB, a Beowulf-style cluster of Raspberry Pis. We also focus on how to improve the EO by evaluating commonality in higher-performing neural networks, and present the results of a study that evaluates the EO performed by Titan.
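A minimal evolutionary-optimization loop of the kind the paper studies can be sketched as follows: keep the fitter half of the population, refill it with mutated copies, and repeat. The real-valued genome, toy fitness function, and mutation scheme are stand-ins for illustration, not DANNA's spiking-network parameters.

```python
# Minimal evolutionary-optimization loop: truncation selection plus
# Gaussian mutation, with elitism (parents survive each generation).
# Toy genome and fitness, not DANNA's.

import random

def fitness(genome):
    # toy objective: genomes closer to the all-0.5 vector are fitter
    return -sum((g - 0.5) ** 2 for g in genome)

def evolve(pop_size=20, genome_len=5, generations=40, sigma=0.05, seed=1):
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]              # truncation selection
        children = [[g + rng.gauss(0, sigma) for g in p] for p in parents]
        pop = parents + children                    # elitist: parents survive
    return max(pop, key=fitness)

best = evolve()
print(-fitness(best))   # residual error of the best genome found
```

Because each candidate's fitness can be evaluated independently, the fitness loop is the natural place to parallelize, whether across Raspberry Pi nodes or Titan's processors.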
Two-Dimensional Finite Element Ablative Thermal Response Analysis of an Arcjet Stagnation Test
NASA Technical Reports Server (NTRS)
Dec, John A.; Laub, Bernard; Braun, Robert D.
2011-01-01
The finite element ablation and thermal response (FEAtR, henceforth called FEAR) design and analysis program simulates the one-, two-, or three-dimensional ablation, internal heat conduction, thermal decomposition, and pyrolysis gas flow of thermal protection system materials. As part of a code validation study, this paper compares two-dimensional axisymmetric results from FEAR to thermal response data obtained from an arcjet stagnation test. The results from FEAR are also compared to two-dimensional axisymmetric computations from the two-dimensional implicit thermal response and ablation program under the same arcjet conditions. The ablating material used in this arcjet test is phenolic impregnated carbon ablator with an LI-2200 insulator as the backup material. The test was performed at the NASA Ames Research Center Interaction Heating Facility. Spatially distributed computational fluid dynamics solutions for the flow field around the test article are used for the surface boundary conditions.
Penalty dynamic programming algorithm for dim targets detection in sensor systems.
Huang, Dayu; Xue, Anke; Guo, Yunfei
2012-01-01
In order to detect and track multiple maneuvering dim targets in sensor systems, an improved dynamic programming track-before-detect algorithm (DP-TBD), called penalty DP-TBD (PDP-TBD), is proposed. The performance of the tracking techniques is used as feedback to the detection stage. The feedback is constructed as a penalty term in the merit function; the penalty term is a function of the possible target state estimate, which can be obtained by the tracking methods. With this feedback, the algorithm combines traditional tracking techniques with DP-TBD and can be applied to simultaneously detect and track maneuvering dim targets. Meanwhile, a reasonable constraint, that a sensor measurement can originate from only one target or from clutter, is proposed to minimize track separation. Thus, the algorithm can be used in multi-target situations with unknown target numbers. The efficiency and advantages of PDP-TBD compared with two existing methods are demonstrated by several simulations.
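The DP-TBD merit recursion with a penalty feedback term can be sketched on a 1-D position grid: accumulate frame amplitudes along feasible tracks, subtracting a penalty for straying from the tracker's predicted state. The data, motion bound, and linear penalty form below are toy assumptions, not the paper's exact formulation.

```python
# Sketch of DP track-before-detect with a penalty feedback term
# (the PDP-TBD idea) on a 1-D position grid. Toy data and penalty.

def dp_tbd(frames, max_step=1, predicted=None, penalty=0.5):
    """Return (best merit, final cell) over all feasible tracks."""
    n = len(frames[0])
    merit = list(frames[0])
    for k, frame in enumerate(frames[1:], start=1):
        new = []
        for i in range(n):
            # best predecessor within the allowed per-frame motion
            prev = max(merit[max(0, i - max_step): i + max_step + 1])
            score = prev + frame[i]
            if predicted is not None:           # tracking feedback penalty
                score -= penalty * abs(i - predicted[k])
            new.append(score)
        merit = new
    best = max(merit)
    return best, merit.index(best)

frames = [[0, 1, 5, 1], [0, 1, 6, 1], [1, 1, 1, 7]]
print(dp_tbd(frames, predicted=[2, 2, 3]))
```

Without the penalty term this reduces to plain DP-TBD; the penalty discourages tracks that disagree with the tracker's state estimate, which is the feedback coupling the abstract describes.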
Comparison of Calibration of Sensors Used for the Quantification of Nuclear Energy Rate Deposition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brun, J.; Reynard-Carette, C.; Tarchalski, M.
This work is part of a collaborative program called GAMMA-MAJOR, 'Development and qualification of a deterministic scheme for the evaluation of GAMMA heating in MTR reactors with exploitation as example MARIA reactor and Jules Horowitz Reactor', between the National Centre for Nuclear Research of Poland, the French Atomic Energy and Alternative Energies Commission, and Aix-Marseille University. One of the main objectives of this program is to optimize the quantification of nuclear heating through calculations validated against experimental measurements of radiation energy deposition carried out in irradiation reactors. Nuclear heating is a key input, especially for the thermal and mechanical design and sizing of experimental irradiation devices under specific irradiation conditions and at specific locations. It is usually determined with differential calorimeters and gamma thermometers, such as those used in the experimental multi-sensor device called CARMEN ('Calorimetrie en Reacteur et Mesures des Emissions Nucleaires'). In the framework of the GAMMA-MAJOR program, a new calorimeter, called KAROLINA, was designed for quantifying nuclear energy deposition. It is a single-cell calorimeter and was recently tested during an irradiation campaign in the MARIA reactor in Poland. This new single-cell calorimeter differs from previous CALMOS- or CARMEN-type differential calorimeters on three main points: its geometry, its preliminary out-of-pile calibration, and its in-pile measurement method. The differential calorimeter, which is made of two identical cells containing heaters, is calibrated from the steady thermal states reached when the nuclear energy deposited in the calorimeter sample is simulated by the Joule effect; the single-cell calorimeter, which has no heater, is instead calibrated from the transient thermal response of the sensor (heating and cooling steps).
The paper will address these two kinds of calorimetric sensors, focusing in particular on studies of their out-of-pile calibrations. Firstly, the characteristics of the sensor designs will be detailed (geometry, dimensions, sample material, assembly, instrumentation). Then the out-of-pile calibration methods will be described. Furthermore, numerical results obtained from 2D axisymmetric thermal simulations (Finite Element Method, CAST3M) and experimental results will be presented for each sensor, and the thermal behaviours of the two sensors will be compared. To conclude, the advantages and drawbacks of each sensor will be discussed, especially with regard to the measurement methods. (authors)
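The transient calibration idea for the heater-less single-cell sensor can be illustrated with an idealized first-order model: a cooling step follows T(t) = T_amb + ΔT0·exp(-t/τ), so the sensor time constant τ can be recovered from two samples of the recorded response. Real sensor behaviour is only approximately first order, so this is a conceptual sketch, not the KAROLINA procedure.

```python
# Idealized sketch of transient (cooling-step) calibration: recover the
# time constant tau of a first-order thermal response from two samples.

import math

def tau_from_cooling(t1, T1, t2, T2, T_amb):
    """Time constant from two points on an exponential cooling curve."""
    return (t2 - t1) / math.log((T1 - T_amb) / (T2 - T_amb))

# synthetic cooling data: tau = 30 s, ambient 20 C, initial excess 50 C
T = lambda t: 20.0 + 50.0 * math.exp(-t / 30.0)
print(round(tau_from_cooling(10.0, T(10.0), 40.0, T(40.0), 20.0), 3))
```

The recovered time constant characterizes the sensor's heat-loss path, which is what a steady-state Joule-heating calibration measures directly in the two-cell differential design.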
NAPL: SIMULATOR DOCUMENTATION (EPA/600/SR-97/102)
A mathematical and numerical model is developed to simulate the transport and fate of NAPLs (Non-Aqueous Phase Liquids) in near-surface granular soils. The resulting three-dimensional, three phase simulator is called NAPL. The simulator accommodates three mobile phases: water, NA...
Development of a Robust and Efficient Parallel Solver for Unsteady Turbomachinery Flows
NASA Technical Reports Server (NTRS)
West, Jeff; Wright, Jeffrey; Thakur, Siddharth; Luke, Ed; Grinstead, Nathan
2012-01-01
The traditional design and analysis practice for advanced propulsion systems relies heavily on expensive full-scale prototype development and testing. Over the past decade, use of high-fidelity analysis and design tools such as CFD early in the product development cycle has been identified as one way to alleviate testing costs and to develop these devices better, faster and cheaper. In the design of advanced propulsion systems, CFD plays a major role in defining the required performance over the entire flight regime, as well as in testing the sensitivity of the design to the different modes of operation. Increased emphasis is being placed on developing and applying CFD models to simulate the flow field environments and performance of advanced propulsion systems. This necessitates the development of next generation computational tools which can be used effectively and reliably in a design environment. The turbomachinery simulation capability presented here is being developed in a computational tool called Loci-STREAM [1]. It integrates proven numerical methods for generalized grids and state-of-the-art physical models in a novel rule-based programming framework called Loci [2] which allows: (a) seamless integration of multidisciplinary physics in a unified manner, and (b) automatic handling of massively parallel computing. The objective is to be able to routinely simulate problems involving complex geometries requiring large unstructured grids and complex multidisciplinary physics. An immediate application of interest is simulation of unsteady flows in rocket turbopumps, particularly in cryogenic liquid rocket engines. 
The key components of the overall methodology presented in this paper are the following: (a) high fidelity unsteady simulation capability based on Detached Eddy Simulation (DES) in conjunction with second-order temporal discretization, (b) compliance with Geometric Conservation Law (GCL) in order to maintain conservative property on moving meshes for second-order time-stepping scheme, (c) a novel cloud-of-points interpolation method (based on a fast parallel kd-tree search algorithm) for interfaces between turbomachinery components in relative motion which is demonstrated to be highly scalable, and (d) demonstrated accuracy and parallel scalability on large grids (approx 250 million cells) in full turbomachinery geometries.
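The cloud-of-points interpolation in item (c) can be sketched with inverse-distance blending of the nearest donor points. A brute-force neighbor search stands in for the parallel kd-tree used in Loci-STREAM, and the small 2-D donor cloud is invented for illustration.

```python
# Sketch of cloud-of-points interface interpolation: for each receiving
# node, take the k nearest donor points and blend their values by
# inverse distance. Brute-force search replaces the kd-tree here.

def interpolate(point, donors, k=3, eps=1e-12):
    """donors: list of ((x, y), value); inverse-distance blend of k nearest."""
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    nearest = sorted(donors, key=lambda d: dist2(point, d[0]))[:k]
    weights = [1.0 / (dist2(point, p) + eps) for p, _ in nearest]
    return sum(w * v for w, (_, v) in zip(weights, nearest)) / sum(weights)

donors = [((0, 0), 1.0), ((1, 0), 2.0), ((0, 1), 3.0), ((5, 5), 99.0)]
print(round(interpolate((0.2, 0.2), donors), 3))
```

At a rotor-stator interface this blend is evaluated for every receiving node each time step, which is why a fast, scalable neighbor search matters.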
A flexible, interactive software tool for fitting the parameters of neuronal models.
Friedrich, Péter; Vella, Michael; Gulyás, Attila I; Freund, Tamás F; Káli, Szabolcs
2014-01-01
The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problems of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential integrate-and-fire) neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting tool.
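The fitting loop such a tool runs can be reduced to a sketch: a cost function scores a model trace against a target trace, and an optimizer searches the parameter space for the best match. The exponential "neuron" and the coarse grid search below are toy stand-ins for the NEURON models and nonlinear optimizers Optimizer actually uses.

```python
# Minimal sketch of model fitting: minimize a mean-squared-error cost
# between a parameterized model trace and a target trace.

import math

def model_trace(tau, n=50, dt=1.0):
    """Toy model: exponential decay from 1.0 with time constant tau."""
    return [math.exp(-i * dt / tau) for i in range(n)]

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

target = model_trace(12.0)                    # pretend this was recorded data
candidates = [t / 2.0 for t in range(2, 61)]  # tau from 1.0 to 30.0
best_tau = min(candidates, key=lambda t: mse(model_trace(t), target))
print(best_tau)
```

The design questions the abstract raises, which cost function, which search algorithm, live entirely in the two functions above, which is why a modular framework pays off.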
Python-Assisted MODFLOW Application and Code Development
NASA Astrophysics Data System (ADS)
Langevin, C.
2013-12-01
The U.S. Geological Survey (USGS) has a long history of developing and maintaining free, open-source software for hydrological investigations. The MODFLOW program is one of the most popular hydrologic simulation programs released by the USGS, and it is considered to be the most widely used groundwater flow simulation code. MODFLOW was written using a modular design and a procedural FORTRAN style, which resulted in code that could be understood, modified, and enhanced by many hydrologists. The code is fast, and because it uses standard FORTRAN it can be run on most operating systems. Most MODFLOW users rely on proprietary graphical user interfaces for constructing models and viewing model results. Some recent efforts, however, have focused on construction of MODFLOW models using open-source Python scripts. Customizable Python packages, such as FloPy (https://code.google.com/p/flopy), can be used to generate input files, read simulation results, and visualize results in two and three dimensions. Automating this sequence of steps leads to models that can be reproduced directly from original data and rediscretized in space and time. Python is also being used in the development and testing of new MODFLOW functionality. New packages and numerical formulations can be quickly prototyped and tested first with Python programs before implementation in MODFLOW. This is made possible by the flexible object-oriented design capabilities available in Python, the ability to call FORTRAN code from Python, and the ease with which linear systems of equations can be solved using SciPy, for example. Once new features are added to MODFLOW, Python can then be used to automate comprehensive regression testing and ensure reliability and accuracy of new versions prior to release.
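The reproducible scripted workflow described above can be sketched generically: generate a model input file from data, then parse results back, entirely in Python, so the model can be rebuilt from the original data at any time. The file format below is invented for illustration; FloPy reads and writes real MODFLOW package files.

```python
# Sketch of a scripted model-build round trip: write a grid-definition
# file from data, then read it back. The "DIS"-like format is invented.

import io

def write_grid_file(buf, nrow, ncol, delr):
    buf.write(f"DIS {nrow} {ncol}\n")
    buf.write(" ".join(str(delr) for _ in range(ncol)) + "\n")

def read_grid_file(buf):
    header = buf.readline().split()
    nrow, ncol = int(header[1]), int(header[2])
    widths = [float(v) for v in buf.readline().split()]
    return nrow, ncol, widths

buf = io.StringIO()
write_grid_file(buf, nrow=10, ncol=4, delr=250.0)
buf.seek(0)
print(read_grid_file(buf))
```

Because the whole build is a script, rediscretizing in space or time is a one-line parameter change rather than a manual rebuild in a graphical interface.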
Accomplishments of the Oak Ridge National Laboratory Seed Money program
DOE R&D Accomplishments Database
1986-09-01
In 1974, a modest program for funding new, innovative research was initiated at ORNL. It was called the "Seed Money" program and has become part of a larger program, called Exploratory R and D, which is being carried out at all DOE national laboratories. This report highlights 12 accomplishments of the Seed Money Program: nickel aluminide, ion implantation, laser annealing, burn meter, Legionnaires' disease, whole-body radiation counter, the ANFLOW system, genetics and molecular biology, high-voltage equipment, microcalorimeter, positron probe, and atom science. (DLC)
When They Talk about CALL: Discourse in a Required CALL Class
ERIC Educational Resources Information Center
Kessler, Greg
2010-01-01
This study investigates preservice teachers' discourse about CALL in a required CALL class which combines theory and practice. Thirty-three students in a Linguistics MA program CALL course were observed over a 10-week quarter. For all of these students, it was their first formal exposure to CALL as a discipline. Communication in the class…
Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows
NASA Astrophysics Data System (ADS)
Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.
2014-12-01
The U.S. Department of Energy (DOE) is investing in development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, which includes toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is in the automated job launching and monitoring capabilities, which allow a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to the users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.
NASA Technical Reports Server (NTRS)
Greenberg, Albert G.; Lubachevsky, Boris D.; Nicol, David M.; Wright, Paul E.
1994-01-01
Fast, efficient parallel algorithms are presented for discrete event simulations of dynamic channel assignment schemes for wireless cellular communication networks. The driving events are call arrivals and departures, in continuous time, to cells geographically distributed across the service area. A dynamic channel assignment scheme decides which call arrivals to accept, and which channels to allocate to the accepted calls, attempting to minimize call blocking while ensuring co-channel interference is tolerably low. Specifically, the scheme ensures that the same channel is used concurrently at different cells only if the pairwise distances between those cells are sufficiently large. Much of the complexity of the system comes from ensuring this separation. The network is modeled as a system of interacting continuous time automata, each corresponding to a cell. To simulate the model, conservative methods are used; i.e., methods in which no errors occur in the course of the simulation and so no rollback or relaxation is needed. Implemented on a 16K processor MasPar MP-1, an elegant and simple technique provides speedups of about 15 times over an optimized serial simulation running on a high speed workstation. A drawback of this technique, typical of conservative methods, is that processor utilization is rather low. To overcome this, new methods were developed that exploit slackness in event dependencies over short intervals of time, thereby raising the utilization to above 50 percent and the speedup over the optimized serial code to about 120 times.
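The reuse-distance constraint at the heart of the simulated channel assignment scheme can be sketched as follows; the 1-D cell layout and first-fit channel search are simplifying assumptions for illustration.

```python
# Sketch of dynamic channel assignment with a co-channel reuse-distance
# constraint: accept a call on a channel only if every cell already using
# that channel is at least reuse_dist away. Toy 1-D cell indices.

def try_assign(active, cell, channels, reuse_dist=2):
    """active: {channel: set of cells}. Return assigned channel, or None (blocked)."""
    for ch in channels:
        users = active.setdefault(ch, set())
        if all(abs(cell - c) >= reuse_dist for c in users):
            users.add(cell)
            return ch
    return None   # call blocked

active = {}
print(try_assign(active, cell=0, channels=[1, 2]))   # free: gets channel 1
print(try_assign(active, cell=1, channels=[1, 2]))   # too close for 1
print(try_assign(active, cell=3, channels=[1, 2]))   # far enough to reuse 1
```

Much of the simulation's complexity comes from evaluating this constraint concurrently across cells, which is exactly what the conservative parallel methods must synchronize.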
Hardware independence checkout software
NASA Technical Reports Server (NTRS)
Cameron, Barry W.; Helbig, H. R.
1990-01-01
ACSI has developed a program utilizing CLIPS to assess compliance with various programming standards. Essentially, the program parses C code to extract the names of all function calls. These are asserted as CLIPS facts, which also include information about line numbers, source file names, and called functions. Rules have been devised to identify called functions that are not defined anywhere in the parsed source. These are compared against lists of standards (represented as facts) using rules that check intersections and/or unions of these lists. By piping the output into other processes, the source is appropriately commented by generating and executing the resulting scripts.
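The call-extraction and standards check can be sketched in Python: a regex pass and set operations stand in for the real C parser and the CLIPS rule base, and the defined-function and approved-function lists are hypothetical.

```python
# Sketch of the checkout idea: pull function-call names out of C source,
# then flag calls that are neither defined locally nor on an approved
# standard list. Regex parsing is a rough stand-in for a real C parser.

import re

C_SOURCE = """
int helper(int x) { return x + 1; }
int main(void) {
    int y = helper(2);
    printf("%d\\n", y);
    legacy_io(y);
    return 0;
}
"""

calls = set(re.findall(r"\b([A-Za-z_]\w*)\s*\(", C_SOURCE))
defined = {"helper", "main"}            # definitions found (hardcoded here)
keywords = {"if", "while", "for", "return", "sizeof", "switch"}
standard = {"printf"}                   # approved standard-library list

undefined = calls - defined - keywords - standard
print(sorted(undefined))
```

The set differences above mirror the intersection/union checks the abstract attributes to the CLIPS rules.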
NASA Astrophysics Data System (ADS)
Chang, Chao-Hsi; Wang, Jian-Xiong; Wu, Xing-Gang
2006-11-01
An upgraded version of the package BCVEGPY2.0 [C.-H. Chang, J.-X. Wang, X.-G. Wu, Comput. Phys. Commun. 174 (2006) 241] is presented; it works under the LINUX system and is named BCVEGPY2.1. With this version and, in addition, a GNU C compiler, users may very conveniently simulate the Bc events in various experimental environments. It has been organized with better modularity and code reusability (less cross communication among the various modules) than BCVEGPY2.0. Furthermore, in the upgraded version the GNU command make compiles the requested code with the help of a master makefile in the main code directory, and then builds an executable file with the default name run. Finally, this paper may also be considered an erratum: typographical errors in BCVEGPY2.0 are listed together with their corrections. New version program (BCVEGPY2.1) summary Title of program: BCVEGPY2.1 Catalogue identifier: ADTJ_v2_1 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADTJ_v2_1 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Reference to original program: BCVEGPY2.0 Reference in CPC: Comput. Phys. Commun. 174 (2006) 241 Does the new version supersede the old program: No Computer: Any LINUX-based PC with FORTRAN 77 or FORTRAN 90 and a GNU C compiler Operating systems: LINUX Programming language used: FORTRAN 77/90 Memory required to execute with typical data: About 2.0 MB No. of lines in distributed program, including test data, etc.: 31 521 No. of bytes in distributed program, including test data, etc.: 1 310 179 Distribution format: tar.gz Nature of physical problem: Hadronic production of the Bc meson itself and its excited states Method of solution: The code, with options, can generate weighted and unweighted events. An interface to PYTHIA is provided to meet the needs of jet hadronization in the production.
Restrictions on the complexity of the problem: The hadronic production of (cb¯)-quarkonium in S-wave and P-wave states via the mechanism of gluon-gluon fusion is given by the so-called 'complete calculation' approach. Reasons for new version: Responding to feedback from users, we have rearranged the program in a convenient way so that it can easily be adapted by users to do simulations according to their own experimental environment (e.g. detector acceptances and experimental cuts). We have made a considerable effort to rearrange the program into several modules with less cross communication among the modules; the main program is slimmed down, and all further actions are decoupled from the main program and can be easily called for various purposes. Typical running time: The typical running time is machine and user-parameter dependent. Typically, for production of the S-wave (cb¯)-quarkonium, when IDWTUP = 1, it takes about 20 hours on a 1.8 GHz Intel P4-processor machine to generate 1000 events; however, when IDWTUP = 3, generating 10⁶ events takes only about 40 minutes. In the production, the P-wave (cb¯)-quarkonium takes almost twice as long as its S-wave counterpart. Summary of the changes (improvements): (1) The structure and organization of the program have been changed considerably. The new version package BCVEGPY2.1 has been divided into several modules with less cross communication among the modules (some old-version source files are divided into several parts for this purpose). The main program is slimmed down, and all further actions are decoupled from the main program so that they can be easily called for various applications. All of the Fortran code is organized in the main code directory, named bcvegpy2.1, which contains the main program, all of its prerequisite files, and subsidiary folders (subdirectories of the main code directory).
The method for setting the parameters is the same as that of the previous versions [C.-H. Chang, C. Driouich, P. Eerola, X.-G. Wu, Comput. Phys. Commun. 159 (2004) 192, hep-ph/0309120].
Quick-start guide for version 3.0 of EMINERS - Economic Mineral Resource Simulator
Bawiec, Walter J.; Spanski, Gregory T.
2012-01-01
Quantitative mineral resource assessment, as developed by the U.S. Geological Survey (USGS), consists of three parts: (1) development of grade and tonnage mineral deposit models; (2) delineation of tracts permissive for each deposit type; and (3) probabilistic estimation of the numbers of undiscovered deposits for each deposit type (Singer and Menzie, 2010). The estimate of the number of undiscovered deposits at different levels of probability is the input to the EMINERS (Economic Mineral Resource Simulator) program. EMINERS uses a Monte Carlo statistical process to combine probabilistic estimates of undiscovered mineral deposits with models of mineral deposit grade and tonnage to estimate mineral resources. It is based upon a simulation program developed by Root and others (1992), who discussed many of the methods and algorithms of the program. Various versions of the original program (called "MARK3" and developed by David H. Root, William A. Scott, and Lawrence J. Drew of the USGS) have been published (Root, Scott, and Selner, 1996; Duval, 2000, 2012). The current version (3.0) of the EMINERS program is available as USGS Open-File Report 2004-1344 (Duval, 2012). Changes from version 2.0 include updating 87 grade and tonnage models, designing new templates to produce graphs showing cumulative distribution and summary tables, and disabling economic filters. The economic filters were disabled because embedded data for costs of labor and materials, mining techniques, and beneficiation methods are out of date. However, the cost algorithms used in the disabled economic filters are still in the program and available for reference for mining methods and milling techniques included in Camm (1991). EMINERS is written in C++ and depends upon the Microsoft Visual C++ 6.0 programming environment. The code depends heavily on the use of Microsoft Foundation Classes (MFC) for implementation of the Windows interface. 
The program works only on Microsoft Windows XP or newer personal computers. It does not work on Macintosh computers. This report demonstrates how to execute EMINERS software using default settings and existing deposit models. Many options are available when setting up the simulation. Information and explanations addressing these optional parameters can be found in the EMINERS Help files. Help files are available during execution of EMINERS by selecting EMINERS Help from the pull-down menu under Help on the EMINERS menu bar. There are four sections in this report. Part I describes the installation, setup, and application of the EMINERS program, and Part II illustrates how to interpret the text file that is produced. Part III describes the creation of tables and graphs by use of the provided Excel templates. Part IV summarizes grade and tonnage models used in version 3.0 of EMINERS.
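The Monte Carlo process EMINERS implements, drawing a number of undiscovered deposits from a probabilistic estimate and combining each deposit with grade and tonnage models, can be sketched in a few lines. The sketch below is an illustration of the idea only: the distributions, parameter values, and function name are hypothetical assumptions, not the USGS grade and tonnage models or EMINERS's actual code.

```python
import random
import math

def simulate_contained_metal(n_trials, deposit_counts, count_probs,
                             tonnage_mu, tonnage_sigma,
                             grade_mu, grade_sigma, seed=42):
    """Monte Carlo estimate of undiscovered contained metal.

    deposit_counts/count_probs: an expert-elicited distribution of the
    number of undiscovered deposits.  Tonnage and grade are drawn from
    lognormal models (hypothetical stand-ins for grade/tonnage models).
    Returns the median and 90th-percentile contained-metal outcomes.
    """
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        # Draw how many undiscovered deposits exist in this trial.
        n = rng.choices(deposit_counts, weights=count_probs, k=1)[0]
        metal = 0.0
        for _ in range(n):
            tonnage = rng.lognormvariate(tonnage_mu, tonnage_sigma)  # tonnes of ore
            grade = rng.lognormvariate(grade_mu, grade_sigma)        # metal fraction
            metal += tonnage * grade
        totals.append(metal)
    totals.sort()
    return totals[n_trials // 2], totals[int(0.9 * n_trials)]

median, p90 = simulate_contained_metal(
    10000, deposit_counts=[0, 1, 2, 5], count_probs=[0.5, 0.3, 0.15, 0.05],
    tonnage_mu=math.log(1e6), tonnage_sigma=1.0,
    grade_mu=math.log(0.01), grade_sigma=0.5)
```

Reporting percentiles rather than a single mean mirrors the cumulative-distribution outputs the EMINERS templates produce.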
Direct Telephonic Communication in a Heart Failure Transitional Care Program: An observational study
Ota, Ken S.; Beutler, David S.; Sheikh, Hassam; Weiss, Jessica L.; Parkinson, Dallin; Nguyen, Peter; Gerkin, Richard D.; Loli, Akil I.
2013-01-01
Background This study investigated the trend of phone calls in the Banner Good Samaritan Medical Center (BGSMC) Heart Failure Transitional Care Program (HFTCP). The primary goal of the HFTCP is to reduce 30-day readmissions for heart failure patients by using a multi-pronged approach. Methods This study included 104 patients in the HFTCP discharged over a 51-week period who had around-the-clock telephone access to the Transitionalist. Cellular phone records were reviewed. This study evaluated the length and timing of calls. Results A total of 4398 telephone calls were recorded, of which 39% were inbound and 61% were outbound. This averaged 86 calls per week. Eighty-five percent of the total calls were made during the “Weekday Daytime” period. There were 229 calls during the “Weekday Nights” period, with 1.5 inbound calls per week. The “Total Weekend” calls were 10.2% of the total calls, which equated to a weekly average of 8.8. Conclusions Our experience is that direct, physician-patient telephone contact is feasible with a panel of around 100 HF patients for one provider. If proper financial reimbursements are provided, physicians may be apt to participate in similar transitional care programs. Likewise, third-party payers will benefit from the reduction in unnecessary emergency room visits and hospitalizations. PMID:28352437
NASA Technical Reports Server (NTRS)
1961-01-01
This photo shows the X-15 flight simulator located at the NASA Flight Research Center, Edwards, California, in the 1960s. One of the major advances in aircraft development, pilot training, mission planning, and research flight activities in the 1950s and 1960s was the use of simulators. For the X-15, a computer was programmed with the flight characteristics of the aircraft. Before actually flying a mission, a research pilot could discover many potential problems with the aircraft or the mission while still on the ground by 'flying' the simulator. The problem could then be analyzed by engineers and a solution found. This did much to improve safety. The X-15 simulator was very limited compared to those available in the 21st century. The video display was simple, while the computer was analog rather than digital (although it became hybrid in 1964 with the addition of a digital computer for the X-15A-2; this generated the nonlinear aerodynamic coefficients for the modified No. 2 aircraft). The nonlinear aerodynamic function generators used in the X-15 simulator had hundreds of fuses, amplifiers, and potentiometers without any surge protection. After the simulator was started on a Monday morning, it would be noon before it had warmed up and stabilized. The electronics for the X-15 simulator took up many large consoles. The X-15 was a rocket-powered aircraft. The original three aircraft were about 50 ft long with a wingspan of 22 ft. The modified #2 aircraft (X-15A-2) was longer. They were missile-shaped vehicles with unusual wedge-shaped vertical tails, thin stubby wings, and unique side fairings that extended along the side of the fuselage. The X-15 weighed about 14,000 lb empty and approximately 34,000 lb at launch. The XLR-99 rocket engine, manufactured by Thiokol Chemical Corp., was pilot controlled and was rated at 57,000 lb of thrust, although there are indications that it actually achieved up to 60,000 lb.
North American Aviation built three X-15 aircraft for the program. The X-15 research aircraft was developed to provide in-flight information and data on aerodynamics, structures, flight controls, and the physiological aspects of high-speed, high-altitude flight. A follow-on program used the aircraft as testbeds to carry various scientific experiments beyond the Earth's atmosphere on a repeated basis. For flight in the dense air of the usable atmosphere, the X-15 used conventional aerodynamic controls such as rudder surfaces on the vertical stabilizers to control yaw and movable horizontal stabilizers to control pitch when moving in synchronization or roll when moved differentially. For flight in the thin air outside of the appreciable Earth's atmosphere, the X-15 used a reaction control system. Hydrogen peroxide thrust rockets located on the nose of the aircraft provided pitch and yaw control. Those on the wings provided roll control. Because of the large fuel consumption, the X-15 was air launched from a B-52 aircraft at approximately 45,000 ft and a speed of about 500 mph. Depending on the mission, the rocket engine provided thrust for the first 80 to 120 sec of flight. The remainder of the normal 10- to 11-minute flight was unpowered and ended with a 200-mph glide landing. Generally, one of two types of X-15 flight profiles was used: a high-altitude flight plan that called for the pilot to maintain a steep rate of climb, or a speed profile that called for the pilot to push over and maintain a level altitude. The X-15 was flown over a period of nearly 10 years -- June 1959 to Oct. 1968 -- and set the world's unofficial speed and altitude records of 4,520 mph (Mach 6.7) and 354,200 ft in a program to investigate all aspects of manned hypersonic flight. Information gained from the highly successful X-15 program contributed to the development of the Mercury, Gemini, and Apollo manned spaceflight programs, and also the Space Shuttle program.
The X-15s made a total of 199 flights, and were manufactured by North American Aviation. X-15-1, serial number 56-6670, is now located at the National Air and Space Museum, Washington DC. North American X-15A-2, serial number 56-6671, is at the United States Air Force Museum, Wright-Patterson AFB, Ohio. X-15-3, serial number 56-6672, crashed on 15 November 1967, resulting in the death of Maj. Michael J. Adams.
Primer on Computer Graphics Programming. Revision
1982-04-01
      CALL USET ('TEXT')
      CALL UPRINT (-1.0, -1.05, 'SIDES:')
      CALL USET ('INTEGER')
      CALL UPRINT (0.9, -1.05, SIDES)
    1 CONTINUE
      CALL UEND
      STOP
A Set of Free Cross-Platform Authoring Programs for Flexible Web-Based CALL Exercises
ERIC Educational Resources Information Center
O'Brien, Myles
2012-01-01
The Mango Suite is a set of three freely downloadable cross-platform authoring programs for flexible network-based CALL exercises. They are Adobe Air applications, so they can be used on Windows, Macintosh, or Linux computers, provided the freely available Adobe Air has been installed on the computer. The exercises which the programs generate are…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, R.D.
The results of a research effort to develop a multiphase, naturally fractured, lenticular reservoir simulator are presented. The simulator possesses the capability of investigating the effects of non-Darcy flow, the Klinkenberg effect, and transient multiphase wellbore storage for wells with finite- and infinite-conductivity fractures. The simulator has been utilized to simulate actual pressure transient data for gas wells associated with the United States Department of Energy, Western Gas Sands Project, MWX Experiments. The results of these simulations are contained in the report, as well as simulation results for hypothetical wells which are producing under multiphase flow conditions. In addition to the reservoir simulator development and the theoretical and field case studies, the results of an experimental program to investigate multiphase non-Darcy flow coefficients (inertial resistance coefficients, or beta factors as they are sometimes called) are also presented. The experimental data were obtained for non-Darcy flow in porous and fractured media. The results clearly indicate the dependence of the non-Darcy flow coefficient upon liquid saturation. Where appropriate, comparisons are made against data available in the open literature. In addition, the theoretical development of a correlation to predict non-Darcy flow coefficients as a function of effective gas permeability, liquid saturation, and porosity is presented. The results presented in this report will provide scientists and engineers tools to investigate well performance data and production trends for wells completed in lenticular, naturally fractured formations producing under non-Darcy, multiphase conditions. 65 refs., 57 figs., 15 tabs.
Numerical Propulsion System Simulation (NPSS) 1999 Industry Review
NASA Technical Reports Server (NTRS)
Lytle, John; Follen, Greg; Naiman, Cynthia; Evans, Austin
2000-01-01
The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. In addition, the paper contains a summary of the feedback received from industry partners in the development effort and the actions taken over the past year to respond to that feedback. The NPSS development was supported in FY99 by the High Performance Computing and Communications Program.
Use of simulated pages to prepare medical students for internship and improve patient safety.
Schwind, Cathy J; Boehler, Margaret L; Markwell, Stephen J; Williams, Reed G; Brenner, Michael J
2011-01-01
During the transition from medical school to internship, trainees experience high levels of stress related to pages on the inpatient wards. The steep learning curve during this period may also affect patient safety. The authors piloted the use of simulated pages to improve medical student preparedness, decrease stress related to pages, and familiarize medical students with common patient problems. A multidisciplinary team at Southern Illinois University School of Medicine developed simulated pages that were tested among senior medical students. Sixteen medical students were presented with 11 common patient scenarios. Data on assessment, management, and global performance were collected. Mean confidence levels were evaluated pre- and postintervention. Students were also surveyed on how the simulated pages program influenced their perceived comfort in managing patient care needs and the usefulness of the exercise in preparing them to handle inpatient pages. Mean scores on the assessment and management portions of the scenarios varied widely depending on the scenario (range -15.6 ± 41.6 to 95.7 ± 9.5). Pass rates based on global performance ranged from 12% to 93%. Interrater agreement was high (mean kappa = 0.88). Students' confidence ratings on a six-point scale increased from 1.87 preintervention to 3.53 postintervention (P < .0001). Simulated pages engage medical students and may foster medical student preparedness for internship. Students valued the opportunity to simulate "on call" responsibilities, and exposure to simulated pages significantly increased their confidence levels. Further studies are needed to determine effects on patient safety outcomes.
Melnick, Glenn A; Green, Lois; Rich, Jeremy
2016-01-01
In 2009 HealthCare Partners Affiliates Medical Group, based in Southern California, launched House Calls, an in-home program that provides, coordinates, and manages care primarily for recently discharged high-risk, frail, and psychosocially compromised patients. Its purpose is to reduce preventable emergency department visits and hospital readmissions. We present data over time from this well-established program to provide an example for other new programs that are being established across the United States to serve this population with complex needs. The findings show that the initial House Calls structure, staffing patterns, and processes differed across the geographic areas that it served, and that they also evolved over time in different ways. In the same time period, all areas experienced a reduction in operating costs per patient and showed substantial reductions in monthly per patient health care spending and hospital utilization after enrollment in the House Calls program, compared to the period before enrollment. Despite more than five years of experience, the program structure continues to evolve and adjust staffing and other features to accommodate the dynamic nature of this complex patient population. Project HOPE—The People-to-People Health Foundation, Inc.
Biological relevance of CNV calling methods using familial relatedness including monozygotic twins.
Castellani, Christina A; Melka, Melkaye G; Wishart, Andrea E; Locke, M Elizabeth O; Awamleh, Zain; O'Reilly, Richard L; Singh, Shiva M
2014-04-21
Studies involving the analysis of structural variation including Copy Number Variation (CNV) have recently exploded in the literature. Furthermore, CNVs have been associated with a number of complex diseases and neurodevelopmental disorders. Common methods for CNV detection use SNP, CNV, or CGH arrays, where the signal intensities of consecutive probes are used to define the number of copies associated with a given genomic region. These practices pose a number of challenges that interfere with the ability of available methods to accurately call CNVs. It has, therefore, become necessary to develop experimental protocols to test the reliability of CNV calling methods from microarray data so that researchers can properly discriminate biologically relevant data from noise. We have developed a workflow for the integration of data from multiple CNV calling algorithms using the same array results. It uses four CNV calling programs: PennCNV (PC), Affymetrix® Genotyping Console™ (AGC), Partek® Genomics Suite™ (PGS) and Golden Helix SVS™ (GH) to analyze CEL files from the Affymetrix® Human SNP 6.0 Array™. To assess the relative suitability of each program, we used individuals of known genetic relationships. We found significant differences in CNV calls obtained by different CNV calling programs. Although the programs showed variable patterns of CNVs in the same individuals, their distribution in individuals of different degrees of genetic relatedness has allowed us to offer two suggestions. The first involves the use of multiple algorithms for the detection of the largest possible number of CNVs, and the second suggests the use of PennCNV over all other methods when the use of only one software program is desirable.
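The first suggestion above, using multiple algorithms and combining their calls, amounts to intersecting CNV intervals across programs. The Python sketch below implements a common reciprocal-overlap consensus rule; the 50% threshold, the sample data, and the function names are illustrative assumptions, not the authors' published workflow.

```python
def reciprocal_overlap(a, b, min_frac=0.5):
    """True if intervals a, b (chrom, start, end) overlap by at least
    min_frac of the length of each (reciprocal-overlap criterion)."""
    if a[0] != b[0]:
        return False
    ov = min(a[2], b[2]) - max(a[1], b[1])
    if ov <= 0:
        return False
    return ov >= min_frac * (a[2] - a[1]) and ov >= min_frac * (b[2] - b[1])

def consensus_calls(calls_by_program, min_programs=2, min_frac=0.5):
    """Keep CNV calls supported by at least min_programs algorithms
    (each call trivially supports itself, so min_programs=2 means
    at least one other program must have an overlapping call)."""
    consensus = []
    programs = list(calls_by_program)
    for prog in programs:
        for cnv in calls_by_program[prog]:
            support = sum(
                any(reciprocal_overlap(cnv, other, min_frac)
                    for other in calls_by_program[p])
                for p in programs)
            if support >= min_programs and cnv not in consensus:
                consensus.append(cnv)
    return consensus

calls = {
    "PennCNV": [("chr1", 1000, 5000), ("chr2", 100, 400)],
    "AGC":     [("chr1", 1200, 5200)],
    "PGS":     [("chr3", 50, 90)],
}
result = consensus_calls(calls)
# Both overlapping chr1 calls survive; the chr2 and chr3 singletons are filtered.
```

Raising min_programs trades sensitivity for specificity, which is exactly the tension between the study's two suggestions.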
Mitigating Handoff Call Dropping in Wireless Cellular Networks: A Call Admission Control Technique
NASA Astrophysics Data System (ADS)
Ekpenyong, Moses Effiong; Udoh, Victoria Idia; Bassey, Udoma James
2016-06-01
Handoff management has been an important but challenging issue in the field of wireless communication. It seeks to maintain seamless connectivity of mobile users changing their points of attachment from one base station to another. This paper derives a call admission control model and establishes an optimal step-size coefficient (k) that regulates the admission probability of handoff calls. An operational CDMA network carrier was investigated through the analysis of empirical data collected over a period of 1 month, to verify the performance of the network. Our findings revealed that approximately 23 % of calls in the existing system were lost, while 40 % of the calls (on average) were successfully admitted. A simulation of the proposed model was then carried out under ideal network conditions to study the relationship between the various network parameters and validate our claim. Simulation results showed that increasing the step-size coefficient degrades the network performance. Even at the optimum step-size (k), the network could still be compromised in the presence of severe network crises, but our model was able to recover from these problems and still function normally.
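The abstract does not reproduce the admission control model itself, but the role of a step-size coefficient can be illustrated with a toy cell simulation: new calls are admitted with a probability that falls with channel occupancy, scaled by k, so handoff calls are prioritized. The linear admission rule, the parameter names, and all values below are assumptions for illustration, not the authors' model.

```python
import random

def simulate_cell(n_events, channels=30, k=0.1, handoff_frac=0.3, seed=1):
    """Toy single-cell simulation.  Each event either ends an ongoing call
    or is an arrival (handoff or new).  Handoffs are admitted whenever a
    channel is free; new calls are admitted with probability
    max(0, 1 - k * busy), so a larger step-size coefficient k throttles
    new calls more aggressively as occupancy rises."""
    rng = random.Random(seed)
    busy = 0
    admitted = dropped_handoffs = blocked_new = 0
    for _ in range(n_events):
        if busy > 0 and rng.random() < 0.5:  # an ongoing call ends
            busy -= 1
            continue
        is_handoff = rng.random() < handoff_frac
        p_admit = 1.0 if is_handoff else max(0.0, 1.0 - k * busy)
        if busy < channels and rng.random() < p_admit:
            busy += 1
            admitted += 1
        elif is_handoff:
            dropped_handoffs += 1
        else:
            blocked_new += 1
    return admitted, dropped_handoffs, blocked_new

low_k = simulate_cell(20000, k=0.02)
high_k = simulate_cell(20000, k=0.5)
# Compare the admitted counts to see how k trades new-call admissions
# against handoff protection.
```

Sweeping k in such a loop is one way to visualize the degradation the authors report for large step-size coefficients.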
BeeSim: Leveraging Wearable Computers in Participatory Simulations with Young Children
ERIC Educational Resources Information Center
Peppler, Kylie; Danish, Joshua; Zaitlen, Benjamin; Glosson, Diane; Jacobs, Alexander; Phelps, David
2010-01-01
New technologies have enabled students to become active participants in computational simulations of dynamic and complex systems (called Participatory Simulations), providing a "first-person" perspective on complex systems. However, most existing Participatory Simulations have targeted older children, teens, and adults, assuming that such concepts…
Wireless just-in-time training of mobile skilled support personnel
NASA Astrophysics Data System (ADS)
Bandera, Cesar; Marsico, Michael; Rosen, Mitchel; Schlegel, Barry
2006-05-01
Skilled Support Personnel (SSP) serve emergency response organizations during an emergency incident, and include laborers, operating engineers, carpenters, ironworkers, sanitation workers and utility workers. SSP called to an emergency incident rarely have recent detailed training on the chemical, biological, radiological, nuclear and/or explosives (CBRNE) agents or the personal protection equipment (PPE) relevant to the incident. This increases personal risk to the SSP and mission risk at the incident site. Training for SSP has been identified as a critical need by the National Institute for Environmental Health Sciences, Worker Education and Training Program. We present a system being developed to address this SSP training shortfall by exploiting a new training paradigm called just-in-time training (JITT) made possible by advances in distance learning and cellular telephony. In addition to the current conventional training at regularly scheduled instructional events, SSP called to an emergency incident will have secure access to short (<5 minutes) training modules specific to the incident and derived from the Occupational Safety and Health Administration (OSHA) Disaster Site Worker Course. To increase retention, each learning module incorporates audio, video, interactive simulations, graphics, animation, and assessment designed for the user interface of most current cell phones. Engineering challenges include compatibility with current cell phone technologies and wireless service providers, integration with the incident management system, and SCORM compliance.
Zhou, L; Qu, Z G; Ding, T; Miao, J Y
2016-04-01
The gas-solid adsorption process in reconstructed random porous media is numerically studied with the lattice Boltzmann (LB) method at the pore scale with consideration of interparticle, interfacial, and intraparticle mass transfer performances. Adsorbent structures are reconstructed in two dimensions by employing the quartet structure generation set approach. To implement boundary conditions accurately, all the porous interfacial nodes are recognized and classified into 14 types using a proposed universal program called the boundary recognition and classification program. The multiple-relaxation-time LB model and single-relaxation-time LB model are adopted to simulate flow and mass transport, respectively. The interparticle, interfacial, and intraparticle mass transfer capacities are evaluated with the permeability factor and interparticle transfer coefficient, Langmuir adsorption kinetics, and the solid diffusion model, respectively. Adsorption processes are performed in two groups of adsorbent media with different porosities and particle sizes. External and internal mass transfer resistances govern the adsorption system. A large porosity leads to an early time for adsorption equilibrium because of the controlling factor of external resistance. External and internal resistances are dominant at small and large particle sizes, respectively. Particle size, under which the total resistance is minimum, ranges from 3 to 7 μm with the preset parameters. Pore-scale simulation clearly explains the effect of both external and internal mass transfer resistances. The present paper provides both theoretical and practical guidance for the design and optimization of adsorption systems.
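Of the three mass-transfer measures named above, the interfacial one, Langmuir adsorption kinetics, lends itself to a compact illustration. The sketch below integrates the Langmuir kinetic rate law with an explicit Euler step; the bulk concentration is held constant and all parameter values are hypothetical, unlike the coupled pore-scale LB simulations of the paper.

```python
def langmuir_kinetics(c, q_max, k_ads, k_des, dt, n_steps):
    """Integrate the Langmuir kinetic rate law
        dq/dt = k_ads * c * (q_max - q) - k_des * q
    with explicit Euler, holding the bulk concentration c constant.
    Returns the loading trajectory q(t); all parameters hypothetical."""
    q = 0.0
    history = [q]
    for _ in range(n_steps):
        dq = k_ads * c * (q_max - q) - k_des * q
        q += dt * dq
        history.append(q)
    return history

traj = langmuir_kinetics(c=1.0, q_max=1.0, k_ads=2.0, k_des=0.5,
                         dt=0.01, n_steps=1000)
# Analytic equilibrium loading: q_eq = q_max * ka*c / (ka*c + kd)
q_eq = 1.0 * (2.0 * 1.0) / (2.0 * 1.0 + 0.5)
```

In the paper's setting, whether the system reaches this equilibrium quickly or slowly is governed by the external (interparticle) and internal (intraparticle) resistances, which this isolated rate law deliberately ignores.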
Impact of a Temporary NRT Enhancement in a State Quitline and Web-Based Program.
Cole, Sam; Suter, Casey; Nash, Chelsea; Pollard, Joseph
2018-06-01
To examine the impact of a nicotine replacement therapy (NRT) enhancement on quit outcomes. Observational study using an intent-to-treat, as-treated analysis. Not available. A total of 4022 Idaho tobacco users aged ≥18 years who received services from the Idaho Tobacco Quitline or Idaho's web-based program. One-call phone or web-based participants were sent a single 4- or 8-week NRT shipment. Multiple-call participants were sent NRT in a single 4-week shipment or two 4-week shipments (second shipment sent only to those completing a second coaching call). North American Quitline Consortium recommended Minimal Data Set items collected at registration and follow-up. Thirty-day point prevalence quit rates were assessed at 7-month follow-up. Multiple logistic regression models were used to examine the effects of program type and amount of NRT sent to participants while controlling for demographic and tobacco use characteristics. Abstinence rates were significantly higher among 8-week versus 4-week NRT recipients (42.5% vs 33.3%). The effect was only significant among multiple-call program participants who received both 4-week NRT shipments versus only the first of 2 possible 4-week shipments (51.1% vs 31.1%). Costs per quit were lowest among web-based participants who received 4 weeks of NRT (US$183 per quit) and highest among multiple-call participants who received only 1 of 2 possible NRT shipments (US$557 per quit). To better balance cost with clinical effectiveness, funders of state-based tobacco cessation services may want to consider (1) allowing tobacco users to choose between phone- and web-based programs while (2) limiting longer NRT benefits only to multiple-call program participants.
Dinucleotide controlled null models for comparative RNA gene prediction.
Gesell, Tanja; Washietl, Stefan
2008-05-27
Comparative prediction of RNA structures can be used to identify functional noncoding RNAs in genomic screens. It was shown recently by Babak et al. [BMC Bioinformatics. 8:33] that RNA gene prediction programs can be biased by the genomic dinucleotide content, in particular those programs using a thermodynamic folding model including stacking energies. As a consequence, there is need for dinucleotide-preserving control strategies to assess the significance of such predictions. While there have been randomization algorithms for single sequences for many years, the problem has remained challenging for multiple alignments and there is currently no algorithm available. We present a program called SISSIz that simulates multiple alignments of a given average dinucleotide content. Meeting additional requirements of an accurate null model, the randomized alignments are on average of the same sequence diversity and preserve local conservation and gap patterns. We make use of a phylogenetic substitution model that includes overlapping dependencies and site-specific rates. Using fast heuristics and a distance based approach, a tree is estimated under this model which is used to guide the simulations. The new algorithm is tested on vertebrate genomic alignments and the effect on RNA structure predictions is studied. In addition, we directly combined the new null model with the RNAalifold consensus folding algorithm giving a new variant of a thermodynamic structure based RNA gene finding program that is not biased by the dinucleotide content. SISSIz implements an efficient algorithm to randomize multiple alignments preserving dinucleotide content. It can be used to get more accurate estimates of false positive rates of existing programs, to produce negative controls for the training of machine learning based programs, or as standalone RNA gene finding program. Other applications in comparative genomics that require randomization of multiple alignments can be considered. 
SISSIz is available as open source C code that can be compiled for every major platform and downloaded here: http://sourceforge.net/projects/sissiz.
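The dinucleotide bias that motivates SISSIz is easy to demonstrate. The plain-Python sketch below (an illustration only, not SISSIz's algorithm, which randomizes whole alignments under a phylogenetic substitution model) shows why mononucleotide shuffling is an inadequate null model: two sequences can share base composition yet differ in dinucleotide content, which is exactly the statistic a dinucleotide-preserving control must keep fixed.

```python
from collections import Counter

def dinucleotide_counts(seq):
    """Count overlapping dinucleotides, the statistic a
    dinucleotide-preserving null model must keep fixed."""
    return Counter(seq[i:i + 2] for i in range(len(seq) - 1))

def same_dinucleotide_content(a, b):
    return dinucleotide_counts(a) == dinucleotide_counts(b)

a = "AAACCC"
b = "ACACAC"
assert Counter(a) == Counter(b)             # identical base composition,
assert not same_dinucleotide_content(a, b)  # but different doublet counts
```

A mononucleotide shuffle could turn a into b, so significance estimates based on such controls would be confounded by stacking-energy effects, as the abstract notes.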
CASPER: A GENERALIZED PROGRAM FOR PLOTTING AND SCALING DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lietzke, M.P.; Smith, R.E.
A Fortran subroutine was written to scale floating-point data and generate a magnetic tape to plot it on the Calcomp 570 digital plotter. The routine permits a great deal of flexibility, and may be used with any type of FORTRAN or FAP calling program. A simple calling program was also written to permit the user to read in data from cards and plot it without any additional programming. Both the Fortran and binary decks are available. (auth)
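The scaling step such a routine performs can be shown compactly. The Python sketch below is a schematic analogue, not CASPER's Fortran interface: it linearly maps floating-point data into plotter coordinates, with the 0-10 output range an assumed stand-in for the plotter's units.

```python
def scale_points(values, out_min=0.0, out_max=10.0):
    """Linearly scale floating-point data into plotter coordinates
    (a schematic of what a scaling routine like CASPER's must do;
    the 0-10 range is an assumption, not CASPER's actual units)."""
    lo, hi = min(values), max(values)
    if hi == lo:  # degenerate data: place everything mid-range
        return [(out_min + out_max) / 2.0] * len(values)
    span = (out_max - out_min) / (hi - lo)
    return [out_min + (v - lo) * span for v in values]

coords = scale_points([2.0, 4.0, 6.0])  # -> [0.0, 5.0, 10.0]
```

A general-purpose routine like CASPER additionally has to emit the scaled coordinates as plotter commands on tape, which this sketch omits.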
High-Throughput Simulations of Dimer and Trimer Assembly of Membrane Proteins. The DAFT Approach.
Wassenaar, Tsjerk A; Pluhackova, Kristyna; Moussatova, Anastassiia; Sengupta, Durba; Marrink, Siewert J; Tieleman, D Peter; Böckmann, Rainer A
2015-05-12
Interactions between membrane proteins are of great biological significance and are consequently an important target for pharmacological intervention. Unfortunately, it is still difficult to obtain detailed views on such interactions, both experimentally, where the environment hampers atomic resolution investigation, and computationally, where the time and length scales are problematic. Coarse grain simulations have alleviated the latter issue, but the slow movement through the bilayer, coupled with the long lifetimes of nonoptimal dimers, still stands in the way of characterizing binding distributions. In this work, we present DAFT, a Docking Assay For Transmembrane components, developed to identify preferred binding orientations. The method builds on a program developed recently for generating custom membranes, called insane (INSert membrANE). The key feature of DAFT is the setup of starting structures, for which optimal periodic boundary conditions are devised. The purpose of DAFT is to perform a large number of simulations with different components, starting from unbiased noninteracting initial states, such that the simulations evolve collectively, in a manner reflecting the underlying energy landscape of interaction. The implementation and characteristic features of DAFT are explained, and the efficacy and relaxation properties of the method are explored for oligomerization of glycophorin A dimers, polyleucine dimers and trimers, MS1 trimers, and rhodopsin dimers. The results suggest that, for simple helices, such as GpA and polyleucine, in POPC/DOPC membranes, a series of 500 simulations of 500 ns each allows characterization of the helix dimer orientations and a comparison of associating and nonassociating components. However, the results also demonstrate that short simulations may suffer significantly from nonconvergence of the ensemble and that using too few simulations may obscure or distort features of the interaction distribution.
For trimers, simulation times exceeding several microseconds appear needed, due to the increased complexity. Similarly, characterization of larger proteins, such as rhodopsin, takes longer time scales due to the slower diffusion and the increased complexity of binding interfaces. DAFT and its auxiliary programs have been made available from http://cgmartini.nl/ , together with a working example.
Modeling Education on the Real World.
ERIC Educational Resources Information Center
Hunter, Beverly
1983-01-01
Offers and discusses three suggestions to capitalize on two developments related to system dynamics modeling and simulation. These developments are a junior/senior high textbook called "Introduction to Computer Simulation" and Micro-DYNAMO, a computer simulation language for microcomputers. (Author/JN)
Frattaroli, Shannon; Schulman, Eric; McDonald, Eileen M; Omaki, Elise C; Shields, Wendy C; Jones, Vanya; Brewer, William
2018-05-17
Innovative strategies are needed to improve the prevalence of working smoke alarms in homes. To our knowledge, this is the first study to report on the effectiveness of Facebook advertising and automated telephone calls as population-level strategies to encourage an injury prevention behavior. We examine the effectiveness of Facebook advertising and automated telephone calls as strategies to enroll individuals in the Baltimore City Fire Department's free smoke alarm installation program. We directed our advertising efforts toward Facebook users eligible for the Baltimore City Fire Department's free smoke alarm installation program and all homes with a residential phone line included in Baltimore City's automated call system. The Facebook campaign targeted Baltimore City residents 18 years of age and older. In total, an estimated 300 000 Facebook users met the eligibility criteria. Facebook advertisements were delivered to users' desktop and mobile device newsfeeds. A prerecorded message was sent to all residential landlines listed in the city's automated call system. By the end of the campaign, the 3 advertisements generated 456 666 impressions, reaching 130 264 Facebook users. Of the users reached, 4367 individuals (1.3%) clicked the advertisement. The automated call system included approximately 90 000 residential phone numbers. Participants attributed 25 smoke alarm installation requests to Facebook and 458 to the automated call. Facebook advertisements are a novel approach to promoting smoke alarms and appear to be effective in exposing individuals to injury prevention messages. However, converting Facebook message recipients to users of a smoke alarm installation program occurred infrequently in this study. Residents who participated in the smoke alarm installation program were more likely to cite the automated call as the impetus for their participation.
Additional research is needed to understand the circumstances and strategies to effectively use the social networking site as a tool to convert passive users into active participants.
Integrating Corpus-Based CALL Programs in Teaching English through Children's Literature
ERIC Educational Resources Information Center
Johns, Tim F.; Hsingchin, Lee; Lixun, Wang
2008-01-01
This paper presents particular pedagogical applications of a number of corpus-based CALL (computer assisted language learning) programs such as "CONTEXTS" and "CLOZE," "MATCHUP" and "BILINGUAL SENTENCE SHUFFLER," in the teaching of English through children's literature. An elective course in Taiwan for…
Fast single-pass alignment and variant calling using sequencing data
USDA-ARS?s Scientific Manuscript database
Sequencing research requires efficient computation. Few programs use already known information about DNA variants when aligning sequence data to the reference map. New program findmap.f90 reads the previous variant list before aligning sequence, calling variant alleles, and summing the allele counts...
Long-Term, Non-Computer, Communication Simulations as Course Integration Activities
ERIC Educational Resources Information Center
Hamilton, James P.
2008-01-01
This article offers a few guidelines for constructing effective simulations. It presents a sample class activity called simulated public hearing which aims to integrate the various elements of a public speaking course into a more comprehensive whole. Properly designed, simulated hearings have elements of persuasive, informative, and impromptu…
Kasahara, Kota; Kinoshita, Kengo
2016-01-01
Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., the complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
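The ion-binding state graphs above are exchanged in GML (graph modeling language), a plain-text format. As a rough, stdlib-only sketch of that format (the node and edge attributes below are invented for illustration and are not IBiSA_tools' actual schema), a tiny state graph can be written out and parsed back:

```python
import os
import re
import tempfile

# Hand-rolled GML round trip (stdlib only). Node/edge attributes are
# invented for illustration; IBiSA_tools' actual schema may differ.
gml = """graph [
  directed 1
  node [ id 0 label "empty" ]
  node [ id 1 label "one_ion" ]
  edge [ source 0 target 1 count 42 ]
]"""

path = os.path.join(tempfile.gettempdir(), "binding_states.gml")
with open(path, "w") as f:
    f.write(gml)

with open(path) as f:
    text = f.read()

nodes = re.findall(r'node \[ id (\d+) label "(\w+)" \]', text)
edges = re.findall(r'edge \[ source (\d+) target (\d+) count (\d+) \]', text)
print(len(nodes), len(edges))  # 2 1
```

In practice such files would be opened directly in a network analyzer like Cytoscape, as the abstract notes.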
ERIC Educational Resources Information Center
Shaw, Yun
2010-01-01
Many of the commercial Computer-Assisted Language Learning (CALL) programs available today typically take a generic approach. This approach standardizes the program so that it can be used to teach any language merely by translating the content from one language to another. These CALL programs rarely consider the cultural background or preferred…
Distributed Function Mining for Gene Expression Programming Based on Fast Reduction.
Deng, Song; Yue, Dong; Yang, Le-chan; Fu, Xiong; Feng, Ya-zhou
2016-01-01
For high-dimensional and massive data sets, traditional centralized gene expression programming (GEP) or improved algorithms lead to increased run-time and decreased prediction accuracy. To solve this problem, this paper proposes a new improved algorithm called distributed function mining for gene expression programming based on fast reduction (DFMGEP-FR). In DFMGEP-FR, fast attribution reduction in binary search algorithms (FAR-BSA) is proposed to quickly find the optimal attribution set, and the function consistency replacement algorithm is given to solve integration of the local function model. Thorough comparative experiments for DFMGEP-FR, centralized GEP and the parallel gene expression programming algorithm based on simulated annealing (parallel GEPSA) are included in this paper. For the waveform, mushroom, connect-4 and musk datasets, the comparative results show that the average time-consumption of DFMGEP-FR drops by 89.09%, 88.85%, 85.79% and 93.06%, respectively, in contrast to centralized GEP and by 12.5%, 8.42%, 9.62% and 13.75%, respectively, compared with parallel GEPSA. Six well-studied UCI test data sets demonstrate the efficiency and capability of our proposed DFMGEP-FR algorithm for distributed function mining.
NASA Technical Reports Server (NTRS)
Rule, W. K.; Hayashida, K. B.
1992-01-01
The development of a computer program to predict the degradation of the insulating capabilities of the multilayer insulation (MLI) blanket of Space Station Freedom due to a hypervelocity impact with a space debris particle is described. A finite difference scheme is used for the calculations. The computer program was written in Microsoft BASIC. Also described is a test program that was undertaken to validate the numerical model. Twelve MLI specimens were impacted at hypervelocities with simulated debris particles using a light gas gun at Marshall Space Flight Center. The impact-damaged MLI specimens were then tested for insulating capability in the space environment of the Sunspot thermal vacuum chamber at MSFC. Two undamaged MLI specimens were also tested for comparison with the test results of the damaged specimens. The numerical model was found to adequately predict behavior of the MLI specimens in the Sunspot chamber. A parameter, called diameter ratio, was developed to relate the nominal MLI impact damage to the apparent (for thermal analysis purposes) impact damage based on the hypervelocity impact conditions of a specimen.
Development and Testing of Control Laws for the Active Aeroelastic Wing Program
NASA Technical Reports Server (NTRS)
Dibley, Ryan P.; Allen, Michael J.; Clarke, Robert; Gera, Joseph; Hodgkinson, John
2005-01-01
The Active Aeroelastic Wing research program was a joint program between the U.S. Air Force Research Laboratory and NASA established to investigate the characteristics of an aeroelastic wing and the technique of using wing twist for roll control. The flight test program employed the use of an F/A-18 aircraft modified by reducing the wing torsional stiffness and adding a custom research flight control system. The research flight control system was optimized to maximize roll rate using only wing surfaces to twist the wing while simultaneously maintaining design load limits, stability margins, and handling qualities. NASA Dryden Flight Research Center developed control laws using the software design tool called CONDUIT, which employs a multi-objective function optimization to tune selected control system design parameters. Modifications were made to the Active Aeroelastic Wing implementation in this new software design tool to incorporate the NASA Dryden Flight Research Center nonlinear F/A-18 simulation for time history analysis. This paper describes the design process, including how the control law requirements were incorporated into constraints for the optimization of this specific software design tool. Predicted performance is also compared to results from flight.
Exploring EFL Teachers' CALL Knowledge and Competencies: In-Service Program Perspectives
ERIC Educational Resources Information Center
Liu, Mei-Hui; Kleinsasser, Robert C.
2015-01-01
This article describes quantitative and qualitative data providing perspectives on how six English as a Foreign Language (EFL) vocational high school teachers perceived CALL knowledge and competencies in a yearlong technology-enriched professional development program. The teachers' developing technological pedagogical content knowledge (TPACK) and…
A Simulation on Organizational Communication Patterns During a Terrorist Attack
2008-06-01
and the Air Support Headquarters. The call is created at the time of attack, and it automatically includes a request for help. Reliability of...communication conditions. 2. Air Support call: This call is produced for the Headquarters of the Air Component only, and only in case of armed attacks. The request can...estimated speed of armored vehicles in combat areas (West-Point Organization, 2002). When a call for air support is received, an information
Program Flow Analyzer. Volume 3
1984-08-01
metrics are defined using these basic terms. Of interest is another measure for the size of the program, called the volume: V = N x log2(n). The unit of...correlated to actual data and most useful for test. The formula describing difficulty may be expressed as: D = (n1/2) x (N2/n2) = 1/L. Difficulty, then, is the...linearly independent program paths through any program graph. A maximal set of these linearly independent paths, called a "basis set," can always be found
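The volume and difficulty measures quoted above are Halstead's software-science metrics; assuming their standard definitions (V = N log2 n and D = (n1/2)(N2/n2) = 1/L, with n1/n2 the distinct operators/operands and N1/N2 their total occurrences), a minimal sketch:

```python
import math

def halstead(n1, n2, N1, N2):
    """Halstead volume and difficulty from operator/operand counts.
    n1, n2: distinct operators/operands; N1, N2: total occurrences."""
    n = n1 + n2               # program vocabulary
    N = N1 + N2               # program length
    V = N * math.log2(n)      # volume, in bits
    D = (n1 / 2) * (N2 / n2)  # difficulty, the reciprocal of level L
    return V, D

# Invented counts for a small program, purely for illustration.
V, D = halstead(n1=10, n2=8, N1=40, N2=30)
print(round(V, 1), round(D, 2))  # 291.9 18.75
```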
DFT application for chlorin derivatives photosensitizer drugs modeling
NASA Astrophysics Data System (ADS)
Machado, Neila; Carvalho, B. G.; Téllez Soto, C. A.; Martin, A. A.; Favero, P. P.
2018-04-01
Photodynamic therapy is an alternative form of cancer treatment that meets the desire for a less aggressive approach to the body. It is based on the interaction between a photosensitizer, activating light, and molecular oxygen. This interaction results in a cascade of reactions that leads to localized cell death. Many studies have been conducted to discover an ideal photosensitizer, which aggregates all the desirable characteristics of a potent cell killer and generates minimal side effects. Using Density Functional Theory (DFT) implemented in the Vienna Ab-initio Simulation Package, new chlorin derivatives with different functional groups were simulated to evaluate the different absorption wavelengths to permit resonant absorption with the incident laser. The Gaussian 09 program was used to determine vibrational wavenumbers and Natural Bond Orbitals. The drug with the best characteristics for a photosensitizer was a modified model of the original chlorin, which was called Thiol chlorin. According to our calculations it is stable and is 19.6% more efficient at optical absorption at 708 nm in comparison to the conventional chlorin e6. Vibrational modes, optical and electronic properties were predicted. In conclusion, this study is an attempt to improve the development of new photosensitizer drugs through computational methods that save time and help reduce the number of animals needed for experimental models.
Toward computer simulation of high-LET in vitro survival curves.
Heuskin, A-C; Michiels, C; Lucas, S
2013-09-21
We developed a Monte Carlo based computer program called MCSC (Monte Carlo Survival Curve) able to predict the survival fraction of cells irradiated in vitro with a broad beam of high linear energy transfer particles. Three types of cell responses are studied: the usual high dose response, the bystander effect and the low-dose hypersensitivity (HRS). The program models the broad beam irradiation and double strand break distribution following Poisson statistics. The progression of cells through the cell cycle is taken into account while the repair takes place. Input parameters are experimentally determined for A549 lung carcinoma cells irradiated with 10 and 20 keV µm(-1) protons, 115 keV µm(-1) alpha particles and for EAhy926 endothelial cells exposed to 115 keV µm(-1) alpha particles. Results of simulations are presented and compared with experimental survival curves obtained for A549 and EAhy926 cells. Results are in good agreement with experimental data for both cell lines and all irradiation protocols. MCSC offers several benefits: it saves the time that would otherwise be spent performing clonogenic assays, it can estimate the survival fraction of cell lines that do not form colonies, and it may allow evaluation of the radiosensitivity parameters of given individuals.
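The Poisson premise of the model can be illustrated with a toy Monte Carlo (a hedged sketch, not the authors' MCSC code: repair, cell-cycle progression, bystander effect and hypersensitivity are all omitted). Cells draw a Poisson number of lethal lesions; survivors are those with none, so the simulated fraction should approach exp(-mean):

```python
import math
import random

random.seed(1)

def poisson(lam):
    """Knuth's algorithm; adequate for small lambda."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def survival_fraction(mean_lethal_lesions, n_cells=100_000):
    """Toy Monte Carlo: a cell survives only if it sustains zero
    Poisson-distributed lethal lesions."""
    survivors = sum(1 for _ in range(n_cells)
                    if poisson(mean_lethal_lesions) == 0)
    return survivors / n_cells

sf = survival_fraction(2.0)
print(round(sf, 3))  # close to exp(-2) ~ 0.135
```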
Parallel computing in enterprise modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.
2008-08-01
This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what distinguishes them from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that would greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.
Efficient Ada multitasking on a RISC register window architecture
NASA Technical Reports Server (NTRS)
Kearns, J. P.; Quammen, D.
1987-01-01
This work addresses the problem of reducing context switch overhead on a processor which supports a large register file - a register file much like that which is part of the Berkeley RISC processors and several other emerging architectures (which are not necessarily reduced instruction set machines in the purest sense). Such a reduction in overhead is particularly desirable in a real-time embedded application, in which task-to-task context switch overhead may result in failure to meet crucial deadlines. A storage management technique by which a context switch may be implemented as cheaply as a procedure call is presented. The essence of this technique is the avoidance of the save/restore of registers on the context switch. This is achieved through analysis of the static source text of an Ada tasking program. Information gained during that analysis directs the optimized storage management strategy for that program at run time. A formal verification of the technique in terms of an operational control model and an evaluation of the technique's performance via simulations driven by synthetic Ada program traces are presented.
Numerical Propulsion System Simulation
NASA Technical Reports Server (NTRS)
Naiman, Cynthia
2006-01-01
The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. The NPSS will provide improved tools to develop custom components and to use capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline system-level simulation tool for the full development life cycle.
Design and simulation of the surface shape control system for membrane mirror
NASA Astrophysics Data System (ADS)
Zhang, Gengsheng; Tang, Minxue
2009-11-01
The surface shape control is one of the key technologies for the manufacture of membrane mirrors. This paper presents a design of a membrane mirror's surface shape control system on the basis of fuzzy logic control. The system contains function modules for surface shape design, surface shape control, surface shape analysis, etc. The system functions are realized using hybrid programming with Visual C# and MATLAB. The finite element method is adopted to simulate the surface shape control of the membrane mirror. The finite element analysis model is established through the ANSYS Parametric Design Language (APDL). The ANSYS kernel is called by the system in background mode when running the simulation. The controller is designed by controlling the sag of the mirror's central cross-section. The surface shape of the membrane mirror and its optical aberration are obtained by applying Zernike polynomial fitting. The analysis of surface shape control and the simulation of disturbance response are performed for a membrane mirror with a 300 mm aperture and F/2.7. The result of the simulation shows that by using the designed control system, the RMS wavefront error of the mirror can reach 142λ (λ=632.8nm), which is consistent with the surface accuracy of the membrane mirror obtained from the large deformation theory of membranes under the same conditions.
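Zernike polynomial fitting of a sampled surface reduces to linear least squares. A minimal sketch (numpy assumed; only piston and defocus terms are kept, whereas a real mirror analysis fits many more):

```python
import numpy as np

# Sample a synthetic surface on the unit disk and recover its Zernike
# coefficients by least squares. Coefficients are invented test values.
y, x = np.mgrid[-1:1:64j, -1:1:64j]
r2 = x**2 + y**2
mask = r2 <= 1.0

piston = np.ones_like(r2)                  # Z1 (Noll): constant term
defocus = np.sqrt(3.0) * (2.0 * r2 - 1.0)  # Z4 (Noll): unit-RMS defocus

true_coeffs = np.array([0.05, 0.20])
surface = true_coeffs[0] * piston + true_coeffs[1] * defocus

A = np.column_stack([piston[mask], defocus[mask]])
coeffs, *_ = np.linalg.lstsq(A, surface[mask], rcond=None)
print(np.round(coeffs, 3))  # ~ [0.05 0.2]
```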
Ahlfeld, David P.; Baker, Kristine M.; Barlow, Paul M.
2009-01-01
This report describes the Groundwater-Management (GWM) Process for MODFLOW-2005, the 2005 version of the U.S. Geological Survey modular three-dimensional groundwater model. GWM can solve a broad range of groundwater-management problems by combined use of simulation- and optimization-modeling techniques. These problems include limiting groundwater-level declines or streamflow depletions, managing groundwater withdrawals, and conjunctively using groundwater and surface-water resources. GWM was initially released for the 2000 version of MODFLOW. Several modifications and enhancements have been made to GWM since its initial release to increase the scope of the program's capabilities and to improve its operation and reporting of results. The new code, which is called GWM-2005, also was designed to support the local grid refinement capability of MODFLOW-2005. Local grid refinement allows for the simulation of one or more higher resolution local grids (referred to as child models) within a coarser grid parent model. Local grid refinement is often needed to improve simulation accuracy in regions where hydraulic gradients change substantially over short distances or in areas requiring detailed representation of aquifer heterogeneity. GWM-2005 can be used to formulate and solve groundwater-management problems that include components in both parent and child models. Although local grid refinement increases simulation accuracy, it can also substantially increase simulation run times.
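A common way such simulation-optimization problems are formulated is as a linear program over a response matrix: maximize total withdrawal subject to drawdown limits at control points. A stdlib-only sketch with invented coefficients (GWM's actual formulations and solvers are far more general):

```python
from itertools import combinations

# Maximize q1 + q2 subject to drawdown limits A.q <= b and q >= 0.
# For a 2-variable LP the optimum lies at a vertex, so we simply
# enumerate constraint intersections (real solvers do this at scale).
A = [(0.8, 0.3),   # drawdown per unit pumping at control point 1
     (0.2, 0.9)]   # drawdown per unit pumping at control point 2
b = [10.0, 12.0]   # allowed drawdowns
cons = A + [(-1.0, 0.0), (0.0, -1.0)]  # add q1 >= 0, q2 >= 0
rhs = b + [0.0, 0.0]

def feasible(q):
    return all(c[0]*q[0] + c[1]*q[1] <= r + 1e-9 for c, r in zip(cons, rhs))

best = None
for (c1, r1), (c2, r2) in combinations(zip(cons, rhs), 2):
    det = c1[0]*c2[1] - c1[1]*c2[0]
    if abs(det) < 1e-12:
        continue
    # Cramer's rule for the intersection of the two constraint lines
    q = ((r1*c2[1] - c1[1]*r2) / det, (c1[0]*r2 - r1*c2[0]) / det)
    if feasible(q) and (best is None or sum(q) > sum(best)):
        best = q
print(round(best[0], 3), round(best[1], 3))  # 8.182 11.515
```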
Validating module network learning algorithms using simulated data.
Michoel, Tom; Maere, Steven; Bonnet, Eric; Joshi, Anagha; Saeys, Yvan; Van den Bulcke, Tim; Van Leemput, Koenraad; van Remortel, Piet; Kuiper, Martin; Marchal, Kathleen; Van de Peer, Yves
2007-05-03
In recent years, several authors have used probabilistic graphical models to learn expression modules and their regulatory programs from gene expression data. Despite the demonstrated success of such algorithms in uncovering biologically relevant regulatory relations, further developments in the area are hampered by a lack of tools to compare the performance of alternative module network learning strategies. Here, we demonstrate the use of the synthetic data generator SynTReN for the purpose of testing and comparing module network learning algorithms. We introduce a software package for learning module networks, called LeMoNe, which incorporates a novel strategy for learning regulatory programs. Novelties include the use of a bottom-up Bayesian hierarchical clustering to construct the regulatory programs, and the use of a conditional entropy measure to assign regulators to the regulation program nodes. Using SynTReN data, we test the performance of LeMoNe in a completely controlled situation and assess the effect of the methodological changes we made with respect to an existing software package, namely Genomica. Additionally, we assess the effect of various parameters, such as the size of the data set and the amount of noise, on the inference performance. Overall, application of Genomica and LeMoNe to simulated data sets gave comparable results. However, LeMoNe offers some advantages, one of them being that the learning process is considerably faster for larger data sets. Additionally, we show that the location of the regulators in the LeMoNe regulation programs and their conditional entropy may be used to prioritize regulators for functional validation, and that the combination of the bottom-up clustering strategy with the conditional entropy-based assignment of regulators improves the handling of missing or hidden regulators. 
We show that data simulators such as SynTReN are very well suited for the purpose of developing, testing and improving module network algorithms. We used SynTReN data to develop and test an alternative module network learning strategy, which is incorporated in the software package LeMoNe, and we provide evidence that this alternative strategy has several advantages with respect to existing methods.
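The conditional-entropy scoring idea can be sketched in a few lines; this toy version over discretized expression values illustrates the general principle and is not LeMoNe's exact estimator. A lower H(target | regulator) means the candidate regulator is more informative:

```python
import math
from collections import Counter

def conditional_entropy(xs, ys):
    """H(Y | X) in bits from paired discrete observations."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    margx = Counter(xs)
    # H(Y|X) = sum over (x, y) of p(x, y) * log2(p(x) / p(x, y))
    return sum(c / n * math.log2((margx[x] / n) / (c / n))
               for (x, y), c in joint.items())

reg    = [0, 0, 0, 0, 1, 1, 1, 1]  # discretized regulator expression
target = [0, 0, 0, 0, 1, 1, 1, 1]  # perfectly explained by the regulator
noise  = [0, 1, 0, 1, 0, 1, 0, 1]  # independent of the regulator

print(conditional_entropy(reg, target))  # 0.0: regulator fully informative
print(conditional_entropy(reg, noise))   # 1.0: regulator uninformative
```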
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhn, J K; von Fuchs, G F; Zob, A P
1980-05-01
Two water tank component simulation models have been selected and upgraded. These models are called the CSU Model and the Extended SOLSYS Model. The models have been standardized and links have been provided for operation in the TRNSYS simulation program. The models are described in analytical terms as well as in computer code. Specific water tank tests were performed for the purpose of model validation. Agreement between model data and test data is excellent. A description of the limitations has also been included. Streamlining results and criteria for the reduction of computer time have also been shown for both water tank computer models. Computer codes for the models and instructions for operating these models in TRNSYS have also been included, making the models readily available for DOE and industry use. Rock bed component simulation models have been reviewed and a model selected and upgraded. This model is a logical extension of the Mumma-Marvin model. Specific rock bed tests have been performed for the purpose of validation. Data have been reviewed for consistency. Details of the test results concerned with rock characteristics and pressure drop through the bed have been explored and are reported.
2015-04-22
This simulated image shows how a cloud of glitter in geostationary orbit would be illuminated and controlled by two laser beams. As the cloud orbits Earth, grains scatter the sun's light at different angles like many tiny prisms, similar to how rainbows are produced from light being dispersed by water droplets. That is why the project concept is called "Orbiting Rainbows." The cloud functions like a reflective surface, allowing the exoplanet (displayed in the bottom right) to be imaged. The orbit path is shown in the top right. On the bottom left, Earth's image is seen behind the cloud. To image an exoplanet, the cloud would need to have a diameter of nearly 98 feet (30 meters). This simulation confines the cloud to a 3.3 x 3.3 x 3.3 foot volume (1 x 1 x 1 meter volume) to simplify the computations. The elements of the orbiting telescope are not to scale. Orbiting Rainbows is currently in Phase II development through the NASA Innovative Advanced Concepts (NIAC) Program. It was one of five technology proposals chosen for continued study in 2014. In the current phase, Orbiting Rainbows researchers are conducting small-scale ground experiments to demonstrate how granular materials can be manipulated using lasers and simulations of how the imaging system would behave in orbit. http://photojournal.jpl.nasa.gov/catalog/PIA19318
47 CFR 64.1503 - Termination of pay-per-call and other information programs.
Code of Federal Regulations, 2010 CFR
2010-10-01
47 Telecommunication, Vol. 3 (2010-10-01): Termination of pay-per-call and other information programs. Section 64.1503, FEDERAL COMMUNICATIONS COMMISSION (CONTINUED)... interstate information service through any 800 telephone number, or other telephone number advertised or...
NDL-v2.0: A new version of the numerical differentiation library for parallel architectures
NASA Astrophysics Data System (ADS)
Hadjidoukas, P. E.; Angelikopoulos, P.; Voglis, C.; Papageorgiou, D. G.; Lagaris, I. E.
2014-07-01
We present a new version of the numerical differentiation library (NDL) used for the numerical estimation of first and second order partial derivatives of a function by finite differencing. In this version we have restructured the serial implementation of the code so as to achieve optimal task-based parallelization. The pure shared-memory parallelization of the library has been based on the lightweight OpenMP tasking model allowing for the full extraction of the available parallelism and efficient scheduling of multiple concurrent library calls. On multicore clusters, parallelism is exploited by means of TORC, an MPI-based multi-threaded tasking library. The new MPI implementation of NDL provides optimal performance in terms of function calls and, furthermore, supports asynchronous execution of multiple library calls within legacy MPI programs. In addition, a Python interface has been implemented for all cases, exporting the functionality of our library to sequential Python codes. Catalog identifier: AEDG_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEDG_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 63036 No. of bytes in distributed program, including test data, etc.: 801872 Distribution format: tar.gz Programming language: ANSI Fortran-77, ANSI C, Python. Computer: Distributed systems (clusters), shared memory systems. Operating system: Linux, Unix. Has the code been vectorized or parallelized?: Yes. RAM: The library uses O(N) internal storage, N being the dimension of the problem. It can use up to O(N^2) internal storage for Hessian calculations, if a task throttling factor has not been set by the user. Classification: 4.9, 4.14, 6.5. Catalog identifier of previous version: AEDG_v1_0 Journal reference of previous version: Comput. Phys. Comm.
180(2009)1404 Does the new version supersede the previous version?: Yes Nature of problem: The numerical estimation of derivatives at several accuracy levels is a common requirement in many computational tasks, such as optimization, solution of nonlinear systems, and sensitivity analysis. For a large number of scientific and engineering applications, the underlying functions correspond to simulation codes for which analytical estimation of derivatives is difficult or almost impossible. A parallel implementation that exploits systems with multiple CPUs is very important for large scale and computationally expensive problems. Solution method: Finite differencing is used with a carefully chosen step that minimizes the sum of the truncation and round-off errors. The parallel versions employ both OpenMP and MPI libraries. Reasons for new version: The updated version was motivated by our endeavors to extend a parallel Bayesian uncertainty quantification framework [1], by incorporating higher order derivative information as in most state-of-the-art stochastic simulation methods such as Stochastic Newton MCMC [2] and Riemannian Manifold Hamiltonian MC [3]. The function evaluations are simulations with significant time-to-solution, which also varies with the input parameters such as in [1, 4]. The runtime of the N-body-type of problem changes considerably with the introduction of a longer cut-off between the bodies. In the first version of the library, the OpenMP-parallel subroutines spawn a new team of threads and distribute the function evaluations with a PARALLEL DO directive. This limits the functionality of the library as multiple concurrent calls require nested parallelism support from the OpenMP environment. Therefore, either their function evaluations will be serialized or processor oversubscription is likely to occur due to the increased number of OpenMP threads. 
In addition, the Hessian calculations include two explicit parallel regions that compute first the diagonal and then the off-diagonal elements of the array. Due to the barrier between the two regions, the parallelism of the calculations is not fully exploited. These issues have been addressed in the new version by first restructuring the serial code and then running the function evaluations in parallel using OpenMP tasks. Although the MPI-parallel implementation of the first version is capable of fully exploiting the task parallelism of the PNDL routines, it does not utilize the caching mechanism of the serial code and therefore performs some redundant function evaluations in the Hessian and Jacobian calculations. This can lead to: (a) higher execution times if the number of available processors is lower than the total number of tasks, and (b) significant energy consumption due to wasted processor cycles. Overcoming these drawbacks, which become critical as the time of a single function evaluation increases, was the primary goal of this new version. Thanks to the code restructuring, the MPI-parallel implementation (and likewise the OpenMP-parallel one) avoids redundant calls, providing optimal performance in terms of the number of function evaluations. Another limitation of the library was that its subroutines were collective and synchronous calls. In the new version, each MPI process can issue any number of subroutines for asynchronous execution. We introduce two library calls that provide global and local task synchronization, similar to the BARRIER and TASKWAIT directives of OpenMP. The new MPI implementation is based on TORC, a new tasking library for multicore clusters [5-7]. TORC improves the portability of the software, as it relies exclusively on the POSIX-Threads and MPI programming interfaces.
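The asynchronous execution model described above, where several library calls are issued concurrently and later joined with TASKWAIT/BARRIER-style synchronization, can be mimicked with Python's standard concurrent.futures. This is a hypothetical sketch of the idea, not the library's actual Python interface; the function and names are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor, wait

def expensive_f(x):
    # stand-in for a costly simulation-based function evaluation
    return sum(v * v for v in x)

# Each "library call" submits several independent function evaluations
# as tasks to a shared worker pool, so evaluations from different calls
# can interleave instead of being serialized.
pool = ThreadPoolExecutor(max_workers=4)

call_a = [pool.submit(expensive_f, [i, i + 1]) for i in range(3)]
call_b = [pool.submit(expensive_f, [2 * i]) for i in range(3)]

wait(call_a)                      # local synchronization ("taskwait")
grads_a = [f.result() for f in call_a]

pool.shutdown(wait=True)          # global synchronization ("barrier")
grads_b = [f.result() for f in call_b]
print(grads_a, grads_b)
```

The point of the two synchronization levels is the same as in the library: a caller can wait only for its own tasks while unrelated evaluations keep running, or drain every outstanding task at once.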
It allows MPI processes to utilize multiple worker threads, offering a hybrid programming and execution environment similar to MPI+OpenMP in a completely transparent way. Finally, to further improve the usability of our software, a Python interface has been implemented on top of both the OpenMP and MPI versions of the library. This allows sequential Python codes to exploit shared- and distributed-memory systems. Summary of revisions: The revised code improves the performance of both parallel (OpenMP and MPI) implementations. The functionality and the user interface of the MPI-parallel version have been extended to support the asynchronous execution of multiple PNDL calls issued by one or multiple MPI processes. A new underlying tasking library increases portability and allows MPI processes to have multiple worker threads. For both implementations, an interface to the Python programming language has been added. Restrictions: The library uses only double-precision arithmetic. The MPI implementation assumes the homogeneity of the execution environment provided by the operating system. Specifically, the processes of a single MPI application must have identical address spaces, and a user function must reside at the same virtual address. In addition, address space layout randomization should not be used for the application. Unusual features: The software takes into account bound constraints, in the sense that only feasible points are used to evaluate the derivatives, and given the level of the desired accuracy, the proper formula is automatically employed. Running time: Depends on the function's complexity. The test run took 23 ms for the serial distribution, 25 ms for the OpenMP version with 2 threads, and 53 ms and 1.01 s for the MPI-parallel distribution using 2 threads and 2 processes, respectively, with a yield-time for idle workers of 10 ms. References: [1] P. Angelikopoulos, C. Papadimitriou, P.
Koumoutsakos, Bayesian uncertainty quantification and propagation in molecular dynamics simulations: a high performance computing framework, J. Chem. Phys. 137 (14). [2] H.P. Flath, L.C. Wilcox, V. Akcelik, J. Hill, B. van Bloemen Waanders, O. Ghattas, Fast algorithms for Bayesian uncertainty quantification in large-scale linear inverse problems based on low-rank partial Hessian approximations, SIAM J. Sci. Comput. 33 (1) (2011) 407-432. [3] M. Girolami, B. Calderhead, Riemann manifold Langevin and Hamiltonian Monte Carlo methods, J. R. Stat. Soc. Ser. B (Stat. Methodol.) 73 (2) (2011) 123-214. [4] P. Angelikopoulos, C. Papadimitriou, P. Koumoutsakos, Data driven, predictive molecular dynamics for nanoscale flow simulations under uncertainty, J. Phys. Chem. B 117 (47) (2013) 14808-14816. [5] P.E. Hadjidoukas, E. Lappas, V.V. Dimakopoulos, A runtime library for platform-independent task parallelism, in: PDP, IEEE, 2012, pp. 229-236. [6] C. Voglis, P.E. Hadjidoukas, D.G. Papageorgiou, I. Lagaris, A parallel hybrid optimization algorithm for fitting interatomic potentials, Appl. Soft Comput. 13 (12) (2013) 4481-4492. [7] P.E. Hadjidoukas, C. Voglis, V.V. Dimakopoulos, I. Lagaris, D.G. Papageorgiou, Supporting adaptive and irregular parallelism for non-linear numerical optimization, Appl. Math. Comput. 231 (2014) 544-559.
Computer program for optimal BWR control rod programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taner, M.S.; Levine, S.H.; Carmody, J.M.
1995-12-31
A fully automated computer program has been developed for designing optimal control rod (CR) patterns for boiling water reactors (BWRs). The new program, called OCTOPUS-3, is based on the OCTOPUS code and employs SIMULATE-3 (Ref. 2) for the analysis. There are three aspects of OCTOPUS-3 that make it successful for use at PECO Energy: it incorporates a new feasibility algorithm that makes the CR design meet all constraints, it has been coupled to a Bourne shell program to allow the user to run the code interactively without the need for a manual, and it develops a low axial peak to extend the cycle. For PECO Energy Co.'s Limerick units it increased the energy output by 1 to 2% over the traditional PECO Energy design. The objective of the optimization in OCTOPUS-3 is to approximate a target power distribution with a very low axial peak while maintaining criticality, keeping the nodal and assembly peaks below the allowed maximum, and meeting the other constraints. The user-specified input for each exposure point includes: CR groups allowed to move, target k_eff, and amount of core flow. The OCTOPUS-3 code uses the CR pattern from the previous step as the initial guess unless indicated otherwise.
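The shape of the optimization objective described above can be illustrated with a minimal sketch: score a candidate pattern's power distribution by its squared deviation from a low-peaked target, rejecting patterns that violate the peaking constraint. The function, target shape, and limits below are hypothetical illustrations, not taken from OCTOPUS-3.

```python
# Hypothetical illustration of the kind of objective OCTOPUS-3 minimizes.
# A candidate control-rod pattern is judged by the axial power shape it
# produces; infeasible patterns (nodal peak above the limit) are rejected.

def objective(axial_power, target_power, max_nodal_peak):
    if max(axial_power) > max_nodal_peak:   # constraint violated
        return float("inf")
    return sum((p - t) ** 2 for p, t in zip(axial_power, target_power))

target = [0.9, 1.05, 1.05, 0.9]   # flat, low-peaked target shape
flat   = [0.95, 1.0, 1.0, 0.95]   # candidate close to the target
peaked = [0.6, 1.6, 1.0, 0.8]     # candidate with an excessive peak

score_flat = objective(flat, target, 1.3)
score_peaked = objective(peaked, target, 1.3)
print(score_flat, score_peaked)
```

A search over rod patterns (seeded, as the abstract notes, with the previous exposure step's pattern) would then pick the feasible candidate with the lowest score.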
33 CFR 402.7 - Service Incentive Program.
Code of Federal Regulations, 2014 CFR
2014-07-01
... number of calls scheduled for the Navigation Season. Additional calls to the system may be added during the season. (f) The carrier will advise the Manager of port rotation, outlining core ports of calls... carrier must meet 75% schedule adherence with a minimum of four (4) Great Lakes calls during the...
A computer simulation experiment of supervisory control of remote manipulation. M.S. Thesis
NASA Technical Reports Server (NTRS)
Mccandlish, S. G.
1966-01-01
A computer simulation of a remote manipulation task and a rate-controlled manipulator is described. Some low-level automatic decision making ability which could be used at the operator's discretion to augment his direct continuous control was built into the manipulator. Experiments were made on the effect of transmission delay, dynamic lag, and intermittent vision on human manipulative ability. Delay does not make remote manipulation impossible. Intermittent visual feedback, and the absence of rate information in the display presented to the operator do not seem to impair the operator's performance. A small-capacity visual feedback channel may be sufficient for remote manipulation tasks, or one channel might be time-shared between several operators. In other experiments the operator called in sequence various on-site automatic control programs of the machine, and thereby acted as a supervisor. The supervisory mode of operation has some advantages when the task to be performed is difficult for a human controlling directly.
Measurement and Simulation of Low Frequency Impulse Noise and Ground Vibration from Airblasts
NASA Astrophysics Data System (ADS)
Hole, L. R.; Kaynia, A. M.; Madshus, C.
1998-07-01
This paper presents numerical simulations of low frequency ground vibration and dynamic overpressure in air using two different numerical models. The analysis is based on actual recordings during blast tests at the Haslemoen test site in Norway in June 1994. The collected airblast-induced overpressures and ground vibrations are used to assess the applicability of the two models. The first model is a computer code based on a global representation of ground and atmospheric layers, a so-called Fast Field Program (FFP); a viscoelastic and a poroelastic version of this model are used. The second model is a two-dimensional moving-load formulation for the propagation of airblast over ground. The poroelastic FFP gives the most complete and realistic reproduction of the processes involved, including the decay of peak overpressure amplitude and dominant frequency of signals with range. It turns out that the moving-load formulation does not provide a complete description of the physics involved when the speed of sound in air is different from the ground wave speeds.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickieson, J.L.; Thode, W.F.; Newbury, K.
1988-12-01
Over the last several years, the Navy Personnel Research and Development Center has produced a prototype simulation of a 1200-psi steam plant. This simulation, called Steamer, is installed on an expensive Symbolics minicomputer at the Surface Warfare Officers School, Pacific, Coronado, California. The fundamental research goal of the Steamer prototype system was to evaluate the potential of what was then new artificial intelligence (AI) hardware and software technology for supporting the construction of computer-based training systems using graphic representations of complex, dynamic systems. The area of propulsion engineering was chosen for a number of reasons. This document describes the Steamer prototype system components and user interface commands and establishes a starting point for designing, developing, and implementing Steamer II. Careful examination of the actual program code produced an inventory that describes the hardware, system software, application software, and documentation for the Steamer prototype system. Exercising all menu options systematically produced an inventory of all Steamer prototype user interface commands.
Logistics Process Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
2008-03-31
LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra-Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.
Orion Service Module Umbilical (OSMU) Testing Complete
2016-10-19
Testing of the Orion Service Module Umbilical (OSMU) was completed at the Launch Equipment Test Facility at NASA’s Kennedy Space Center in Florida. The OSMU was attached to Vehicle Motion Simulator 1 for a series of simulated launch tests to validate it for installation on the mobile launcher. The test team gathered for an event to mark the end of testing. The mobile launcher tower will be equipped with a number of lines, called umbilicals, that will connect to the Space Launch System rocket and Orion spacecraft for Exploration Mission-1 (EM-1). The OSMU will be located high on the mobile launcher tower and, prior to launch, will transfer liquid coolant for the electronics and air for the Environmental Control System to the Orion service module that houses these critical systems to support the spacecraft. Kennedy's Engineering Directorate is providing support to the Ground Systems Development and Operations Program for testing of the OSMU. EM-1 is scheduled to launch in 2018.
Orion Service Module Umbilical (OSMU) Testing Complete
2016-10-19
Testing of the Orion Service Module Umbilical (OSMU) was completed at the Launch Equipment Test Facility at NASA’s Kennedy Space Center in Florida. The OSMU was attached to Vehicle Motion Simulator 1 for a series of simulated launch tests to validate it for installation on the mobile launcher. Patrick Simpkins, director of Engineering, speaks to the test team during an event to mark the end of testing. The mobile launcher tower will be equipped with a number of lines, called umbilicals, that will connect to the Space Launch System rocket and Orion spacecraft for Exploration Mission-1 (EM-1). The OSMU will be located high on the mobile launcher tower and, prior to launch, will transfer liquid coolant for the electronics and air for the Environmental Control System to the Orion service module that houses these critical systems to support the spacecraft. Kennedy's Engineering Directorate is providing support to the Ground Systems Development and Operations Program for testing of the OSMU. EM-1 is scheduled to launch in 2018.
Orion Service Module Umbilical (OSMU) Testing Complete
2016-10-19
Testing of the Orion Service Module Umbilical (OSMU) was completed at the Launch Equipment Test Facility at NASA’s Kennedy Space Center in Florida. The OSMU was attached to Vehicle Motion Simulator 1 for a series of simulated launch tests to validate it for installation on the mobile launcher. The test team gathered with a special banner during an event to mark the end of testing. The mobile launcher tower will be equipped with a number of lines, called umbilicals, that will connect to the Space Launch System rocket and Orion spacecraft for Exploration Mission-1 (EM-1). The OSMU will be located high on the mobile launcher tower and, prior to launch, will transfer liquid coolant for the electronics and air for the Environmental Control System to the Orion service module that houses these critical systems to support the spacecraft. Kennedy's Engineering Directorate is providing support to the Ground Systems Development and Operations Program for testing of the OSMU. EM-1 is scheduled to launch in 2018.
Orion Service Module Umbilical (OSMU) Testing Complete
2016-10-19
Testing of the Orion Service Module Umbilical (OSMU) was completed at the Launch Equipment Test Facility at NASA’s Kennedy Space Center in Florida. The OSMU was attached to Vehicle Motion Simulator 1 for a series of simulated launch tests to validate it for installation on the mobile launcher. One of the test team members signs a banner during an event to mark the end of testing. The mobile launcher tower will be equipped with a number of lines, called umbilicals, that will connect to the Space Launch System rocket and Orion spacecraft for Exploration Mission-1 (EM-1). The OSMU will be located high on the mobile launcher tower and, prior to launch, will transfer liquid coolant for the electronics and air for the Environmental Control System to the Orion service module that houses these critical systems to support the spacecraft. Kennedy's Engineering Directorate is providing support to the Ground Systems Development and Operations Program for testing of the OSMU. EM-1 is scheduled to launch in 2018.
Orion Service Module Umbilical (OSMU) Testing Complete
2016-10-19
Testing of the Orion Service Module Umbilical (OSMU) was completed at the Launch Equipment Test Facility at NASA’s Kennedy Space Center in Florida. The OSMU was attached to Vehicle Motion Simulator 1 for a series of simulated launch tests to validate it for installation on the mobile launcher. The test team signed a special banner during an event to mark the end of testing. The mobile launcher tower will be equipped with a number of lines, called umbilicals, that will connect to the Space Launch System rocket and Orion spacecraft for Exploration Mission-1 (EM-1). The OSMU will be located high on the mobile launcher tower and, prior to launch, will transfer liquid coolant for the electronics and air for the Environmental Control System to the Orion service module that houses these critical systems to support the spacecraft. Kennedy's Engineering Directorate is providing support to the Ground Systems Development and Operations Program for testing of the OSMU. EM-1 is scheduled to launch in 2018.
ERIC Educational Resources Information Center
Woodfield, Brian F.; Andrus, Merritt B.; Waddoups, Gregory L.; Moore, Melissa S.; Swan, Richard; Allen, Rob; Bodily, Greg; Andersen, Tricia; Miller, Jordan; Simmons, Bryon; Stanger, Richard
2005-01-01
A set of sophisticated and realistic laboratory simulations, called 'Virtual ChemLab', was created for use in freshman- and sophomore-level chemistry classes and laboratories. A detailed assessment of student responses is provided, and the simulation's pedagogical utility is described using the organic simulation.
NASA Technical Reports Server (NTRS)
Follen, Gregory; auBuchon, M.
2000-01-01
Within NASA's High Performance Computing and Communication (HPCC) program, NASA Glenn Research Center is developing an environment for the analysis and design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structures, and heat transfer, along with the concept of numerical zooming between zero-dimensional and one-, two-, and three-dimensional component engine codes. In addition, the NPSS is refining the computing and communication technologies necessary to capture complex physical processes in a timely and cost-effective manner. The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Of the different technology areas that contribute to the development of the NPSS environment, this paper discusses numerical zooming between an NPSS engine simulation and higher-fidelity representations of the engine components (fan, compressor, burner, turbines, etc.). What follows is a description of successfully zooming one-dimensional (row-by-row) high-pressure compressor analysis results back to a zero-dimensional NPSS engine simulation, and a discussion of the results illustrated using an advanced data visualization tool. This type of high-fidelity system-level analysis, made possible by the zooming capability of the NPSS, will greatly improve the capability of the engine system simulation and increase the level of virtual testing conducted prior to committing the design to hardware.
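The zooming concept can be sketched abstractly: a component's lumped zero-dimensional model is swapped for a higher-fidelity one behind the same interface, so the system-level cycle simulation itself is unchanged. All names and numbers below are illustrative stand-ins, not NPSS code.

```python
# Illustrative sketch (not NPSS code) of zooming: two compressor models
# share one interface, so the engine-level simulation can use either.

def compressor_0d(mass_flow):
    # lumped map: a single overall pressure ratio for the component
    return 12.0

def compressor_1d(mass_flow, stages=10):
    # hypothetical row-by-row model: per-stage ratios multiplied through
    ratio = 1.0
    for stage in range(stages):
        ratio *= 1.0 + 0.3 * (1.0 - 0.01 * stage)  # made-up stage model
    return ratio

def engine_cycle(compressor_model, mass_flow=50.0):
    # system-level simulation calls the component through its interface
    pressure_ratio = compressor_model(mass_flow)
    return {"pressure_ratio": pressure_ratio}

low_fi = engine_cycle(compressor_0d)
high_fi = engine_cycle(compressor_1d)   # "zoomed" run, same interface
print(low_fi, high_fi)
```

In the paper's case the flow goes the other way as well: the 1-D row-by-row results are condensed back into the 0-D engine simulation, which is what keeps the overnight full-engine run tractable.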
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Kyung-Doo; Jeong, Jae-Jun; Lee, Seung-Wook
The Nuclear Steam Supply System (NSSS) thermal-hydraulic model adopted in the Korea Nuclear Plant Education Center (KNPEC)-2 simulator was provided in the early 1980s. The reference plant for KNPEC-2 is Yong Gwang Nuclear Unit 1, a Westinghouse-type 3-loop, 950 MW(electric) pressurized water reactor. Because of the limited computational capability at that time, it uses overly simplified physical models and assumptions for real-time simulation of NSSS thermal-hydraulic transients. This may entail inaccurate results and thus the possibility of so-called "negative training," especially for complicated two-phase flows in the reactor coolant system. To resolve the problem, we developed a realistic NSSS thermal-hydraulic program (named the ARTS code) based on the best-estimate code RETRAN-3D. The systematic assessment of ARTS was conducted by both a stand-alone test and an integrated test in the simulator environment. The non-integrated stand-alone test (NIST) results were reasonable in terms of accuracy, real-time simulation capability, and robustness. After successful completion of the NIST, ARTS was integrated with a 3-D reactor kinetics model and other system models. The site acceptance test (SAT) was completed successfully and confirmed to comply with the ANSI/ANS-3.5-1998 simulator software performance criteria. This paper presents our efforts in the ARTS development and some test results of the NIST and SAT.
Re-ranking via User Feedback: Georgetown University at TREC 2015 DD Track
2015-11-20
Re-ranking via User Feedback: Georgetown University at TREC 2015 DD Track. Jiyun Luo and Hui Yang, Department of Computer Science, Georgetown... involved in a search process, the user and the search engine. In TREC DD, the user is modeled by a simulator, called "jig". The jig and the search engine... simulating user is provided by the TREC 2015 DD Track organizer, and is called "jig". There are 118 search topics in total. For each search topic, a short...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ecale Zhou, Carol L.
2016-07-05
Compare Gene Calls (CGC) is a Python code used for combining and comparing gene calls from any number of gene callers. A gene caller is a computer program that predicts the extents of open reading frames within the genomes of biological organisms.
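A minimal sketch of the comparison task described above, assuming a gene call is represented as a (contig, start, end, strand) tuple; the actual CGC categories and data model may differ.

```python
# Hedged sketch of comparing gene calls from two callers: calls that
# agree exactly on contig, coordinates, and strand are "identical";
# the rest are unique to one caller. Real comparisons typically also
# handle partial overlaps, which this sketch omits.

def compare_calls(calls_a, calls_b):
    set_a, set_b = set(calls_a), set(calls_b)
    return {
        "identical": sorted(set_a & set_b),
        "only_a": sorted(set_a - set_b),
        "only_b": sorted(set_b - set_a),
    }

caller1 = [("contig1", 100, 400, "+"), ("contig1", 600, 900, "-")]
caller2 = [("contig1", 100, 400, "+"), ("contig1", 650, 900, "-")]

result = compare_calls(caller1, caller2)
print(result)
```

Extending this to any number of callers is a matter of intersecting and differencing more than two sets, which is essentially the "combining" role the abstract describes.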
Chang, Larry William; Kagaayi, Joseph; Nakigozi, Gertrude; Galiwango, Ronald; Mulamba, Jeremiah; Ludigo, James; Ruwangula, Andrew; Gray, Ronald H; Quinn, Thomas C; Bollinger, Robert C; Reynolds, Steven J
2008-01-01
Hotlines and warmlines have been successfully used in the developed world to provide clinical advice; however, reports on their replicability in resource-limited settings are limited. A warmline was established in Rakai, Uganda, to support an antiretroviral therapy program. Over a 17-month period, a database was kept of who called, why they called, and the result of the call. A program evaluation was also administered to clinical staff. A total of 1303 calls (3.5 calls per weekday) were logged. The warmline was used mostly by field staff and peripherally based peer health workers. Calls addressed important clinical issues, including the need for urgent care, medication side effects, and follow-up needs. Most clinical staff felt that the warmline made their jobs easier and improved the health of patients. An HIV/AIDS warmline leveraged the skills of a limited workforce to provide increased access to HIV/AIDS care, advice, and education.
Numerical Simulations For the F-16XL Aircraft Configuration
NASA Technical Reports Server (NTRS)
Elmiligui, Alaa A.; Abdol-Hamid, Khaled; Cavallo, Peter A.; Parlette, Edward B.
2014-01-01
Numerical simulations of flow around the F-16XL are presented as a contribution to the Cranked Arrow Wing Aerodynamic Project International II (CAWAPI-II). The NASA Tetrahedral Unstructured Software System (TetrUSS) is used to perform the numerical simulations. This CFD suite, developed and maintained by NASA Langley Research Center, includes an unstructured grid generation program called VGRID, a postprocessor named POSTGRID, and the flow solver USM3D. The CRISP CFD package is utilized to provide error estimates and grid adaptation for verification of the USM3D results. A subsonic, high angle-of-attack case, flight condition (FC) 25, is computed and analyzed. Three turbulence models are used in the calculations: the one-equation Spalart-Allmaras (SA) model, the two-equation shear stress transport (SST) model, and the k-epsilon model. Computational results and surface static pressure profiles are presented and compared with flight data. Solution verification is performed using formal grid refinement studies, the solution of Error Transport Equations, and adaptive mesh refinement. The current study shows that the USM3D solver coupled with CRISP CFD can be used in an engineering environment to predict vortex-flow physics on a complex configuration at flight Reynolds numbers.
A New Improved and Extended Version of the Multicell Bacterial Simulator gro.
Gutiérrez, Martín; Gregorio-Godoy, Paula; Pérez Del Pulgar, Guillermo; Muñoz, Luis E; Sáez, Sandra; Rodríguez-Patón, Alfonso
2017-08-18
gro is a cell programming language developed in Klavins Lab for simulating colony growth and cell-cell communication. It is used as a synthetic biology prototyping tool for simulating multicellular biocircuits and microbial consortia. In this work, we present several extensions made to gro that improve the performance of the simulator, make it easier to use, and provide new functionalities. The new version of gro is between 1 and 2 orders of magnitude faster than the original version. It is able to grow microbial colonies with up to 10^5 cells in less than 10 min. A new library, CellEngine, accelerates the resolution of spatial physical interactions between growing and dividing cells by implementing a new shoving algorithm. A genetic library, CellPro, based on Probabilistic Timed Automata, simulates gene expression dynamics using simplified and easy-to-compute digital proteins. We also propose a more convenient language specification layer, ProSpec, based on the idea that proteins drive cell behavior. CellNutrient, another library, implements Monod-based growth and nutrient uptake functionalities. The intercellular signaling management was improved and extended in a library called CellSignals. Finally, bacterial conjugation, another local cell-cell communication process, was added to the simulator. To show the versatility and potential outreach of this version of gro, we provide studies and novel examples ranging from synthetic biology to evolutionary microbiology. We believe that the upgrades implemented for gro have made it into a powerful and fast prototyping tool capable of simulating a large variety of systems and synthetic biology designs.
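Monod-based growth of the kind CellNutrient implements can be sketched as follows; the parameter values and the simple explicit update rule are illustrative only, not gro's implementation.

```python
# Minimal sketch of Monod growth: the specific growth rate depends
# hyperbolically on the local nutrient concentration S, and nutrient
# is consumed in proportion to the biomass produced (yield coefficient).

def monod_rate(s, mu_max=1.0, k_s=0.5):
    """Specific growth rate mu(S) = mu_max * S / (K_s + S)."""
    return mu_max * s / (k_s + s)

def step(biomass, nutrient, dt=0.1, yield_coeff=0.5):
    mu = monod_rate(nutrient)
    growth = mu * biomass * dt
    return biomass + growth, max(nutrient - growth / yield_coeff, 0.0)

x, s = 1.0, 5.0          # initial biomass and nutrient (arbitrary units)
for _ in range(20):
    x, s = step(x, s)
print(x, s)  # biomass rises while nutrient is drawn down
```

At high nutrient levels growth is nearly exponential at rate mu_max; as the nutrient is drawn down, mu(S) falls off and growth slows, which is the saturation behavior Monod kinetics captures.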
Answering the Call: How Group Mentoring Makes a Difference
ERIC Educational Resources Information Center
Altus, Jillian
2015-01-01
Mentoring programs answer the call for social justice for many students who are in success-inhibiting environments. This study employed a case study design to investigate the perceived benefits from a group mentoring program. Data were collected from pre- and post-assessments, focus groups, and artifacts. Four participant benefits were revealed:…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-31
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Call for Co-Sponsors for Office of Healthcare Quality's Programs to Strengthen Coordination and Impact National Efforts in the Prevention of Healthcare-Associated... Health and Science, Office of Healthcare Quality. ACTION: Notice. SUMMARY: The U.S. Department of Health...
76 FR 54240 - National Institute of Allergy and Infectious Diseases; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-31
...: Robert G. Keefe, PhD, Scientific Review Officer, Scientific Review Program, DEA/NIAID/NIH/DHHS, Room 3256... Conference Call). Contact Person: Robert G. Keefe, PhD, Scientific Review Officer, Scientific Review Program... Drive, Bethesda, MD 20817 (Telephone Conference Call). Contact Person: Robert G. Keefe, PhD, Scientific...
Integrating CALL into the Classroom: The Role of Podcasting in an ESL Listening Strategies Course
ERIC Educational Resources Information Center
O'Brien, Anne; Hegelheimer, Volker
2007-01-01
Despite the increase of teacher preparation programs that emphasize the importance of training teachers to select and develop appropriate computer-assisted language learning (CALL) materials, integration of CALL into classroom settings is still frequently relegated to the use of selected CALL activities to supplement instruction or to provide…
Intern evaluation strategies in family medicine residency education: what is-and is not-being done.
Yates, Jennifer E
2013-06-01
Family medicine interns often have deficiencies that are not initially appreciated. By recognizing those growth opportunities early, programs may be able to better meet their interns' training needs. This study provides a needs assessment to ascertain what evaluation tools are being utilized by residency programs to assess their incoming interns. A questionnaire was sent to all US family medicine residency program coordinators (439 programs) via Survey Monkey© inquiring about whether intern evaluation is performed and, if so, what strategies are used. A mixed-mode methodology was used: mailing with incentive, email prompts, and telephone calls. Of 439 programs, 220 (50%) responded to the survey. Most respondents (145, 66%) think intern evaluation is needed. However, only 79 (36%) programs are actually doing intern evaluations, and only 14 (6.4%) do so extensively. Most programs are performing simulations (81, 45%) and assessing knowledge/comfort levels (79, 36%); less than one third are considering personality/learning styles, and almost no programs are evaluating skills such as typing (three, 1.4%) and math (one, 0.5%). Many programs use evaluations to guide future planning, help with early identification of challenging learners, and match training to the residents' needs. Several programs expressed concern about how they would use the information once obtained. The majority of respondents agreed that a baseline intern evaluation is useful; few are actually doing it. This area is not well described in the literature; residency programs could benefit from information sharing. The next step is to encourage interest in and implementation of such strategies.
Simulation Training in Obstetrics and Gynaecology Residency Programs in Canada.
Sanders, Ari; Wilson, R Douglas
2015-11-01
The integration of simulation into residency programs has been slower in obstetrics and gynaecology than in other surgical specialties. The goal of this study was to evaluate the current use of simulation in obstetrics and gynaecology residency programs in Canada. A 19-question survey was developed and distributed to all 16 active and accredited obstetrics and gynaecology residency programs in Canada. The survey was sent to program directors initially, but on occasion was redirected to other faculty members involved in resident education or to senior residents. Survey responses were collected over an 18-month period. Twelve programs responded to the survey (11 complete responses). Eleven programs (92%) reported introducing an obstetrics and gynaecology simulation curriculum into their residency education. All respondents (100%) had access to a simulation centre. Simulation was used to teach various obstetrical and gynaecological skills using different simulation modalities. Barriers to simulation integration were primarily the costs of equipment and space and the need to ensure dedicated time for residents and educators. The majority of programs indicated that it was a priority for them to enhance their simulation curriculum and transition to competency-based resident assessment. Simulation training has increased in obstetrics and gynaecology residency programs. The development of formal simulation curricula for use in obstetrics and gynaecology resident education is in early development. A standardized national simulation curriculum would help facilitate the integration of simulation into obstetrics and gynaecology resident education and aid in the shift to competency-based resident assessment. Obstetrics and gynaecology residency programs need national collaboration (between centres and specialties) to develop a standardized simulation curriculum for use in obstetrics and gynaecology residency programs in Canada.
Tri-Laboratory Linux Capacity Cluster 2007 SOW
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seager, M
2007-03-22
The Advanced Simulation and Computing (ASC) Program (formerly known as the Accelerated Strategic Computing Initiative, ASCI) has led the world in capability computing for the last ten years. Capability computing is defined as a world-class platform (in the Top10 of the Top500.org list) with scientific simulations running at scale on the platform. Example systems are ASCI Red, Blue-Pacific, Blue-Mountain, White, Q, RedStorm, and Purple. ASC applications have scaled to multiple thousands of CPUs and accomplished a long list of mission milestones on these ASC capability platforms. However, the computing demands of the ASC and Stockpile Stewardship programs also include a vast number of smaller scale runs for day-to-day simulations. Indeed, every 'hero' capability run requires many hundreds to thousands of much smaller runs in preparation and post processing activities. In addition, there are many aspects of the Stockpile Stewardship Program (SSP) that can be directly accomplished with these so-called 'capacity' calculations. The need for capacity is now so great within the program that it is increasingly difficult to allocate the computer resources required by the larger capability runs. To rectify the current 'capacity' computing resource shortfall, the ASC program has allocated a large portion of the overall ASC platforms budget to 'capacity' systems. In addition, within the next five to ten years the Life Extension Programs (LEPs) for major nuclear weapons systems must be accomplished. These LEPs and other SSP programmatic elements will further drive the need for capacity calculations and hence 'capacity' systems as well as future ASC capability calculations on 'capability' systems. To respond to this new workload analysis, the ASC program will be making a large sustained strategic investment in these capacity systems over the next ten years, starting with the United States Government Fiscal Year 2007 (GFY07).
However, given the growing need for 'capability' systems as well, the budget demands are extreme and new, more cost effective ways of fielding these systems must be developed. This Tri-Laboratory Linux Capacity Cluster (TLCC) procurement represents the ASC first investment vehicle in these capacity systems. It also represents a new strategy for quickly building, fielding and integrating many Linux clusters of various sizes into classified and unclassified production service through a concept of Scalable Units (SU). The programmatic objective is to dramatically reduce the overall Total Cost of Ownership (TCO) of these 'capacity' systems relative to the best practices in Linux Cluster deployments today. This objective only makes sense in the context of these systems quickly becoming very robust and useful production clusters under the crushing load that will be inflicted on them by the ASC and SSP scientific simulation capacity workload.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebrahimi, Fatima
2014-07-31
Large-scale magnetic fields have been observed in widely different types of astrophysical objects. These magnetic fields are believed to be caused by the so-called dynamo effect. Could a large-scale magnetic field grow out of turbulence (i.e. the alpha dynamo effect)? How could the topological properties and the complexity of magnetic field as a global quantity, the so called magnetic helicity, be important in the dynamo effect? In addition to understanding the dynamo mechanism in astrophysical accretion disks, anomalous angular momentum transport has also been a longstanding problem in accretion disks and laboratory plasmas. To investigate both dynamo and momentum transport, we have performed both numerical modeling of laboratory experiments that are intended to simulate nature and modeling of configurations with direct relevance to astrophysical disks. Our simulations use fluid approximations (Magnetohydrodynamics - MHD model), where plasma is treated as a single fluid, or two fluids, in the presence of electromagnetic forces. Our major physics objective is to study the possibility of magnetic field generation (so called MRI small-scale and large-scale dynamos) and its role in Magneto-rotational Instability (MRI) saturation through nonlinear simulations in both MHD and Hall regimes.
SALT: The Simulator for the Analysis of LWP Timing
NASA Technical Reports Server (NTRS)
Springer, Paul L.; Rodrigues, Arun; Brockman, Jay
2006-01-01
With the emergence of new processor architectures that are highly multithreaded, and support features such as full/empty memory semantics and split-phase memory transactions, the need for a processor simulator to handle these features becomes apparent. This paper describes such a simulator, called SALT.
LSPC is the Loading Simulation Program in C++, a watershed modeling system that includes streamlined Hydrologic Simulation Program Fortran (HSPF) algorithms for simulating hydrology, sediment, and general water quality.
Simulation Use in Paramedic Education Research (SUPER): A Descriptive Study
McKenna, Kim D.; Carhart, Elliot; Bercher, Daniel; Spain, Andrew; Todaro, John; Freel, Joann
2015-01-01
Objectives. The purpose of this research was to characterize the use of simulation in initial paramedic education programs in order to assist stakeholders’ efforts to target educational initiatives and resources. This group sought to provide a snapshot of what simulation resources programs have or have access to and how they are used; faculty perceptions about simulation; whether program characteristics, resources, or faculty training influence simulation use; and if simulation resources are uniform for patients of all ages. Methods. This was a cross-sectional census survey of paramedic programs that were accredited or had a Letter of Review from the Committee on Accreditation of Educational Programs for the EMS Professions at the time of the study. The data were analyzed using descriptive statistics and chi-square analyses. Results. Of the 638 surveys sent, 389 valid responses (61%) were analyzed. Paramedic programs reported they have or have access to a wide range of simulation resources (task trainers [100%], simple manikins [100%], intermediate manikins [99%], advanced/fully programmable manikins [91%], live simulated patients [83%], computer-based [71%], and virtual reality [19%]); however, they do not consistently use them, particularly advanced (71%), live simulated patients (66%), computer-based (games, scenarios) (31%), and virtual reality (4%). Simulation equipment (of any type) reportedly sits idle and unused in 31% of programs. Lack of training was cited as the most common reason. Personnel support specific to simulation was available in 44% of programs. Programs reported using simulation to replace skills more frequently than to replace field or clinical hours. Simulation goals included assessment, critical thinking, and problem-solving most frequently, and patient and crew safety least often.
Programs using advanced manikins report manufacturers as their primary means of training (87%) and that 19% of faculty had no training specific to those manikins. Many (78%) respondents felt they should use more simulation. Conclusions. Paramedic programs have and have access to diverse simulation resources; however, faculty training and other program resources appear to influence their use. PMID:25664774
Taylor, Charles J.; Williamson, Tanja N.; Newson, Jeremy K.; Ulery, Randy L.; Nelson, Hugh L.; Cinotto, Peter J.
2012-01-01
This report describes Phase II modifications made to the Water Availability Tool for Environmental Resources (WATER), which applies the process-based TOPMODEL approach to simulate or predict stream discharge in surface basins in the Commonwealth of Kentucky. The previous (Phase I) version of WATER did not provide a means of identifying sinkhole catchments or accounting for the effects of karst (internal) drainage in a TOPMODEL-simulated basin. In the Phase II version of WATER, sinkhole catchments are automatically identified and delineated as internally drained subbasins, and a modified TOPMODEL approach (called the sinkhole drainage process, or SDP-TOPMODEL) is applied that calculates mean daily discharges for the basin based on summed area-weighted contributions from sinkhole drainage (SD) areas and non-karstic topographically drained (TD) areas. Results obtained using the SDP-TOPMODEL approach were evaluated for 12 karst test basins located in each of the major karst terrains in Kentucky. Visual comparison of simulated hydrographs and flow-duration curves, along with statistical measures applied to the simulated discharge data (bias, correlation, root mean square error, and Nash-Sutcliffe efficiency coefficients), indicate that the SDP-TOPMODEL approach provides acceptably accurate estimates of discharge for most flow conditions and typically provides more accurate simulation of stream discharge in karstic basins compared to the standard TOPMODEL approach. Additional programming modifications made to the Phase II version of WATER included implementation of a point-and-click graphical user interface (GUI), which fully automates the delineation of simulation-basin boundaries and improves the speed of input-data processing. The Phase II version of WATER enables the user to select a pour point anywhere on a stream reach of interest, and the program will automatically delineate all upstream areas that contribute drainage to that point.
This capability enables automatic delineation of a simulation basin of any size (area) and having any level of stream-network complexity. WATER then automatically identifies the presence of sinkhole catchments within the simulation basin boundaries; extracts and compiles the necessary climatic, topographic, and basin characteristics datasets; and runs the SDP-TOPMODEL approach to estimate daily mean discharges (streamflow).
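The area-weighted combination at the heart of the SDP-TOPMODEL approach can be sketched in a few lines. The function and variable names below are illustrative, not taken from the WATER code itself:

```python
# Hypothetical sketch of the area-weighted discharge combination described
# above; function and variable names are illustrative, not from WATER itself.

def basin_discharge(q_sd, a_sd, q_td, a_td):
    """Combine sinkhole-drained (SD) and topographically drained (TD)
    subarea contributions into a mean daily basin discharge.

    q_sd, q_td -- per-unit-area discharges (e.g. mm/day) for each subarea
    a_sd, a_td -- corresponding subarea areas (e.g. km^2)
    """
    total_area = sum(a_sd) + sum(a_td)
    weighted = (sum(q * a for q, a in zip(q_sd, a_sd))
                + sum(q * a for q, a in zip(q_td, a_td)))
    return weighted / total_area

# two sinkhole catchments plus one open (TD) subbasin
q = basin_discharge(q_sd=[2.0, 1.5], a_sd=[10.0, 5.0],
                    q_td=[3.0], a_td=[35.0])   # 2.65 mm/day over the basin
```

Each subarea's discharge is weighted by its share of the total basin area, so internally drained sinkhole catchments contribute in proportion to the area they actually capture.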
Computational tools for fitting the Hill equation to dose-response curves.
Gadagkar, Sudhindra R; Call, Gerald B
2015-01-01
Many biological response curves commonly assume a sigmoidal shape that can be approximated well by means of the 4-parameter nonlinear logistic equation, also called the Hill equation. However, estimation of the Hill equation parameters requires access to commercial software or the ability to write computer code. Here we present two user-friendly and freely available computer programs to fit the Hill equation - a Solver-based Microsoft Excel template and a stand-alone GUI-based "point and click" program, called HEPB. Both computer programs use an iterative method to estimate two of the Hill equation parameters (EC50 and the Hill slope), while constraining the values of the other two parameters (the minimum and maximum asymptotes of the response variable) to fit the Hill equation to the data. In addition, HEPB draws the prediction band at a user-defined confidence level, and determines the EC50 value for each of the limits of this band to give boundary values that help objectively delineate sensitive, normal and resistant responses to the drug being tested. Both programs were tested by analyzing twelve datasets that varied widely in data values, sample size and slope, and were found to yield estimates of the Hill equation parameters that were essentially identical to those provided by commercial software such as GraphPad Prism and nls, the nonlinear least-squares fitting function of the R statistical language. The Excel template provides a means to estimate the parameters of the Hill equation and plot the regression line in a familiar Microsoft Office environment. HEPB, in addition to providing the above results, also computes the prediction band for the data at a user-defined level of confidence, and determines objective cut-off values to distinguish among response types (sensitive, normal and resistant). Both programs are found to yield estimated values that are essentially the same as those from standard software such as GraphPad Prism and the R-based nls.
Furthermore, HEPB also has the option to simulate 500 response values based on the range of values of the dose variable in the original data and the fit of the Hill equation to that data. Copyright © 2014. Published by Elsevier Inc.
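The fitting procedure the abstract describes - estimating EC50 and the Hill slope while holding the two asymptotes fixed - can be sketched with standard tools. This is an illustrative recreation, not the code of HEPB or the Excel template; the fixed asymptote values (0 and 100) are assumptions for the example:

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative fit of the 4-parameter Hill equation with the minimum and
# maximum asymptotes constrained (here fixed at 0 and 100), as the abstract
# describes; this is a sketch, not the code of HEPB or the Excel template.

def hill(x, ec50, slope, bottom=0.0, top=100.0):
    return bottom + (top - bottom) / (1.0 + (ec50 / x) ** slope)

dose = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
resp = hill(dose, ec50=0.5, slope=1.2)        # synthetic, noise-free data

# fit only EC50 and the Hill slope; bottom/top stay at their fixed values
(ec50_hat, slope_hat), _ = curve_fit(lambda x, e, s: hill(x, e, s),
                                     dose, resp, p0=[1.0, 1.0])
```

With real (noisy) data the covariance matrix returned by `curve_fit` is the starting point for the kind of confidence and prediction bands HEPB reports.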
75 FR 31458 - Infrastructure Protection Data Call Survey
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-03
...-0022] Infrastructure Protection Data Call Survey AGENCY: National Protection and Programs Directorate... New Information Collection Request, Infrastructure Protection Data Call Survey. DHS previously... territories are able to achieve this mission, IP requests opinions and information in a survey from IP Data...
Bennett, Jeffrey I; Dzara, Kristina; Mazhar, Mir Nadeem; Behere, Aniruddh
2011-03-01
The Accreditation Council for Graduate Medical Education (ACGME) requirements stipulate that psychiatry residents need to be educated in the area of emergency psychiatry. Existing research investigating the current state of this training is limited, and no research to date has assessed whether the ACGME Residency Review Committee requirements for psychiatry residency training are followed by psychiatry residency training programs. We administered, to chief resident attendees of a national leadership conference, a 24-item paper survey on the types and amount of emergency psychiatry training provided by their psychiatric residency training programs. Descriptive statistics were used in the analysis. Of 154 surveys distributed, 111 were returned (72% response rate). Nearly one-third of chief resident respondents indicated that more than 50% of their program's emergency psychiatry training was provided during on-call periods. A minority indicated that they were aware of the ACGME program requirements for emergency psychiatry training. While training in emergency psychiatry occurred in many programs through rotations-different from the on-call period-direct supervision was available during on-call training only about one-third of the time. The findings suggest that about one-third of psychiatry residency training programs do not adhere to the ACGME standards for emergency psychiatry training. Enhanced knowledge of the ACGME requirements may improve psychiatry residents' understanding of how their programs are fulfilling the need for more emergency psychiatry training. Alternative settings to the on-call period for emergency psychiatry training are more likely to provide for direct supervision.
2007-11-01
Engineering Research Laboratory is currently developing a set of facility 'architectural' programming tools, called Facility Composer™ (FC). FC...requirements in the early phases of project development. As the facility program, criteria, and requirements are chosen, these tools populate the IFC...developing a set of facility "architectural" programming tools, called Facility Composer (FC), to support the capture and tracking of facility criteria
Wildfire and MAMS data from STORMFEST
NASA Technical Reports Server (NTRS)
Jedlovec, Gary J.; Carlson, G. S.
1993-01-01
Early in 1992, NASA participated in an inter-agency field program called STORMFEST. The STORM-Fronts Experiment Systems Test (STORMFEST) was designed to test various systems critical to the success of STORM 1 in a very focused experiment. The field effort focused on winter storms in order to investigate the structure and evolution of fronts and associated mesoscale phenomena in the central United States. This document describes the data collected from two instruments onboard a NASA ER2 aircraft which was deployed out of Ellington Field in Houston, Texas from February 13 through March 15, 1992, in support of this experiment. The two instruments were the Wildfire (a.k.a. the moderate resolution imaging spectrometer-nadir (MODIS-N) Airborne Simulation (MAS)) and the Multispectral Atmospheric Mapping Sensor (MAMS).
NASA Astrophysics Data System (ADS)
Trinh, N. D.; Fadil, M.; Lewitowicz, M.; Ledoux, X.; Laurent, B.; Thomas, J.-C.; Clerc, T.; Desmezières, V.; Dupuis, M.; Madeline, A.; Dessay, E.; Grinyer, G. F.; Grinyer, J.; Menard, N.; Porée, F.; Achouri, L.; Delaunay, F.; Parlog, M.
2018-07-01
Double differential neutron spectra (energy, angle) originating from a thick natCu target bombarded by a 12 MeV/nucleon 36S16+ beam were measured by the activation method and the time-of-flight technique at the Grand Accélérateur National d'Ions Lourds (GANIL). A neutron spectrum unfolding algorithm combining the SAND-II iterative method and Monte-Carlo techniques was developed for the analysis of the activation results that cover a wide range of neutron energies. It was implemented into a graphical user interface program, called GanUnfold. The experimental neutron spectra are compared to Monte-Carlo simulations performed using the PHITS and FLUKA codes.
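A SAND-II-style unfolding step applies a weighted multiplicative correction to a trial spectrum until the activities predicted from it match the measured ones. The sketch below is a much-simplified illustration of that idea; the response matrix, weighting, and fixed iteration count are illustrative and far simpler than the GanUnfold implementation described above:

```python
import numpy as np

# Much-simplified sketch of a SAND-II-style iterative unfolding; the
# response matrix, weights, and fixed iteration count are illustrative,
# not the GanUnfold implementation described above.

def sand_ii_unfold(response, measured, phi0, n_iter=50):
    """response : (n_reactions, n_bins) activation response matrix
    measured : (n_reactions,) measured activities
    phi0     : (n_bins,) trial spectrum (all entries positive)
    """
    phi = phi0.astype(float).copy()
    for _ in range(n_iter):
        calc = response @ phi                       # predicted activities
        w = response * phi / calc[:, None]          # per-bin weights
        # weighted log-correction applied multiplicatively to each bin
        corr = (w * np.log(measured / calc)[:, None]).sum(0) / w.sum(0)
        phi *= np.exp(corr)
    return phi

# toy case: one foil reaction, three energy groups
R = np.array([[1.0, 2.0, 0.5]])
phi = sand_ii_unfold(R, np.array([10.0]), np.ones(3))
```

The multiplicative update keeps the spectrum positive at every iteration, which is one reason this family of methods is popular for ill-posed unfolding problems.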
NASA Technical Reports Server (NTRS)
Tick, Evan
1987-01-01
This note describes an efficient software emulator for the Warren Abstract Machine (WAM) Prolog architecture. The version of the WAM implemented is called Lcode. The Lcode emulator, written in C, executes the 'naive reverse' benchmark at 3900 LIPS. The emulator is one of a set of tools used to measure the memory-referencing characteristics and performance of Prolog programs. These tools include a compiler, assembler, and memory simulators. An overview of the Lcode architecture is given here, followed by a description and listing of the emulator code implementing each Lcode instruction. This note will be of special interest to those studying the WAM and its performance characteristics. In general, this note will be of interest to those creating efficient software emulators for abstract machine architectures.
Heliostat cost optimization study
NASA Astrophysics Data System (ADS)
von Reeken, Finn; Weinrebe, Gerhard; Keck, Thomas; Balz, Markus
2016-05-01
This paper presents a methodology for a heliostat cost optimization study. First different variants of small, medium sized and large heliostats are designed. Then the respective costs, tracking and optical quality are determined. For the calculation of optical quality a structural model of the heliostat is programmed and analyzed using finite element software. The costs are determined based on inquiries and from experience with similar structures. Eventually the levelised electricity costs for a reference power tower plant are calculated. Before each annual simulation run the heliostat field is optimized. Calculated LCOEs are then used to identify the most suitable option(s). Finally, the conclusions and findings of this extensive cost study are used to define the concept of a new cost-efficient heliostat called `Stellio'.
NASA Astrophysics Data System (ADS)
Mousavi, Seyed Jamshid; Mahdizadeh, Kourosh; Afshar, Abbas
2004-08-01
Application of stochastic dynamic programming (SDP) models to reservoir optimization calls for discretization of the state variables. Discretization of reservoir storage volume, an important state variable, has a pronounced effect on the computational effort. The error caused by storage volume discretization is examined by considering it as a fuzzy state variable. In this approach, the point-to-point transitions between storage volumes at the beginning and end of each period are replaced by transitions between storage intervals. This is achieved by using fuzzy arithmetic operations with fuzzy numbers. In this approach, instead of aggregating single-valued crisp numbers, the membership functions of fuzzy numbers are combined. Running a simulation model with optimal release policies derived from fuzzy and non-fuzzy SDP models shows that a fuzzy SDP with a coarse discretization scheme performs as well as a classical SDP having a much finer discretized space. It is believed that this advantage of the fuzzy SDP model is due to the smooth transitions between storage intervals, which benefit from soft boundaries.
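The soft interval boundaries come from representing storage as fuzzy numbers rather than crisp values. A minimal sketch of triangular fuzzy-number arithmetic of this kind is below; the class name and example values are illustrative, not taken from the paper:

```python
# Minimal sketch of triangular fuzzy-number arithmetic used to soften
# storage-interval boundaries; class name and values are illustrative,
# not taken from the paper.

class TriFuzzy:
    """Triangular fuzzy number defined by (left, peak, right) vertices."""

    def __init__(self, left, peak, right):
        self.left, self.peak, self.right = left, peak, right

    def __add__(self, other):
        # standard fuzzy addition: combine the three vertices component-wise
        return TriFuzzy(self.left + other.left,
                        self.peak + other.peak,
                        self.right + other.right)

    def membership(self, x):
        """Degree (0..1) to which a crisp value x belongs to this number."""
        if self.left < x <= self.peak:
            return (x - self.left) / (self.peak - self.left)
        if self.peak < x < self.right:
            return (self.right - x) / (self.right - self.peak)
        return 1.0 if x == self.peak else 0.0

# end-of-period storage = beginning-of-period storage + uncertain inflow
storage = TriFuzzy(90, 100, 110) + TriFuzzy(15, 20, 30)
```

A crisp storage value near an interval edge then belongs partly to two adjacent intervals, which is exactly the smooth transition the abstract credits for the coarse scheme's good performance.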
Adaptive Dynamic Programming for Discrete-Time Zero-Sum Games.
Wei, Qinglai; Liu, Derong; Lin, Qiao; Song, Ruizhuo
2018-04-01
In this paper, a novel adaptive dynamic programming (ADP) algorithm, called "iterative zero-sum ADP algorithm," is developed to solve infinite-horizon discrete-time two-player zero-sum games of nonlinear systems. The present iterative zero-sum ADP algorithm permits arbitrary positive semidefinite functions to initialize the upper and lower iterations. A novel convergence analysis is developed to guarantee the upper and lower iterative value functions to converge to the upper and lower optimums, respectively. When the saddle-point equilibrium exists, it is emphasized that both the upper and lower iterative value functions are proved to converge to the optimal solution of the zero-sum game, where the existence criteria of the saddle-point equilibrium are not required. If the saddle-point equilibrium does not exist, the upper and lower optimal performance index functions are obtained, respectively, where the upper and lower performance index functions are proved to be not equivalent. Finally, simulation results and comparisons are shown to illustrate the performance of the present method.
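In the finite case, the upper/lower iteration on a zero-sum game reduces to a min-max value iteration. The toy sketch below illustrates only that structure; the state space, cost, and dynamics are invented for the example and far simpler than the nonlinear systems treated in the paper:

```python
# Toy value-iteration sketch for a finite discrete-time zero-sum game,
# illustrating the min-max iteration structure only; states, cost, and
# dynamics are invented, far simpler than the paper's nonlinear setting.

def value_iteration(states, actions_u, actions_w, step, cost,
                    gamma=0.9, n_iter=200):
    V = {s: 0.0 for s in states}        # positive-semidefinite (zero) start
    for _ in range(n_iter):
        V = {s: min(max(cost(s, u, w) + gamma * V[step(s, u, w)]
                        for w in actions_w)      # maximizing player
                    for u in actions_u)          # minimizing player
             for s in states}
    return V

# single-state matrix game: stage cost u*w with u, w in {-1, +1}
V = value_iteration(states=[0], actions_u=[-1, 1], actions_w=[-1, 1],
                    step=lambda s, u, w: 0, cost=lambda s, u, w: u * w)
```

For this toy game the stage value is 1 regardless of the minimizer's choice, so the iterates converge to 1/(1 - gamma) = 10.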
Penalty Dynamic Programming Algorithm for Dim Targets Detection in Sensor Systems
Huang, Dayu; Xue, Anke; Guo, Yunfei
2012-01-01
In order to detect and track multiple maneuvering dim targets in sensor systems, an improved dynamic programming track-before-detect algorithm (DP-TBD) called penalty DP-TBD (PDP-TBD) is proposed. The performances of tracking techniques are used as a feedback to the detection part. The feedback is constructed by a penalty term in the merit function, and the penalty term is a function of the possible target state estimation, which can be obtained by the tracking methods. With this feedback, the algorithm combines traditional tracking techniques with DP-TBD and it can be applied to simultaneously detect and track maneuvering dim targets. Meanwhile, a reasonable constraint that a sensor measurement can originate from one target or clutter is proposed to minimize track separation. Thus, the algorithm can be used in the multi-target situation with unknown target numbers. The efficiency and advantages of PDP-TBD compared with two existing methods are demonstrated by several simulations. PMID:22666074
2009.3 Revision of the Evaluated Nuclear Data Library (ENDL2009.3)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, I. J.; Beck, B.; Descalle, M. A.
LLNL's Computational Nuclear Data and Theory Group has created a 2009.3 revised release of the Evaluated Nuclear Data Library (ENDL2009.3). This library is designed to support LLNL's current and future nuclear data needs and will be employed in nuclear reactor, nuclear security and stockpile stewardship simulations with ASC codes. The ENDL2009 database was the most complete nuclear database for Monte Carlo and deterministic transport of neutrons and charged particles. It was assembled with strong support from the ASC PEM and Attribution programs, leveraged with support from Campaign 4 and the DOE/Office of Science's US Nuclear Data Program. This document lists the revisions and fixes made in a new release called ENDL2009.3, by comparing with the existing data in the previous release ENDL2009.2. These changes are made in conjunction with the revisions for ENDL2011.3, so that both the .3 releases are as free as possible of known defects.
Transient flow analysis linked to fast pressure disturbance monitored in pipe systems
NASA Astrophysics Data System (ADS)
Kueny, J. L.; Lourenco, M.; Ballester, J. L.
2012-11-01
EDF Hydro Division has launched the RENOUVEAU program in order to increase performance and improve plant availability through anticipation. Through this program, a large fleet of penstocks is equipped with pressure transducers linked to a special monitoring system. Any significant disturbance of the pressure is captured in a snapshot and the waveform of the signal is stored and analyzed. During these transient states, variations in flow are unknown. In order to determine the structural impact of such overpressures occurring during complex transient conditions over the entire circuit, EDF DTG has asked ENSE3 GRENOBLE to develop a code called ACHYL CF*. The input data of ACHYL CF are the circuit topology and pressure boundary conditions. This article provides a description of the computer code developed for modeling the transient flow in a pipe network using the signals from pressure transducers as boundary conditions. Different test cases are presented, simulating real hydro power plants for which measured pressure signals are available.
Communication library for run-time visualization of distributed, asynchronous data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rowlan, J.; Wightman, B.T.
1994-04-01
In this paper we present a method for collecting and visualizing data generated by a parallel computational simulation during run time. Data distributed across multiple processes is sent across parallel communication lines to a remote workstation, which sorts and queues the data for visualization. We have implemented our method in a set of tools called PORTAL (for Parallel aRchitecture data-TrAnsfer Library). The tools comprise generic routines for sending data from a parallel program (callable from either C or FORTRAN), a semi-parallel communication scheme currently built upon Unix sockets, and a real-time connection to the scientific visualization program AVS. Our method is most valuable when used to examine large datasets that can be efficiently generated and do not need to be stored on disk. The PORTAL source libraries, detailed documentation, and a working example can be obtained by anonymous ftp from info.mcs.anl.gov (file portal.tar.Z in directory pub/portal).
Simulation in Canadian postgraduate emergency medicine training - a national survey.
Russell, Evan; Hall, Andrew Koch; Hagel, Carly; Petrosoniak, Andrew; Dagnone, Jeffrey Damon; Howes, Daniel
2018-01-01
Simulation-based education (SBE) is an important training strategy in emergency medicine (EM) postgraduate programs. This study sought to characterize the use of simulation in FRCPC-EM residency programs across Canada. A national survey was administered to residents and knowledgeable program representatives (PRs) at all Canadian FRCPC-EM programs. Survey question themes included simulation program characteristics, the frequency of resident participation, the location and administration of SBE, institutional barriers, interprofessional involvement, content, assessment strategies, and attitudes about SBE. Resident and PR response rates were 63% (203/321) and 100% (16/16), respectively. Residents reported a median of 20 (range 0-150) hours of annual simulation training, with 52% of residents indicating that the time dedicated to simulation training met their needs. PRs reported the frequency of SBE sessions ranging from weekly to every 6 months, with 15 (94%) programs having an established simulation curriculum. Two (13%) of the programs used simulation for resident assessment, although 15 (94%) of PRs indicated that they would be comfortable with simulation-based assessment. The most common PR-identified barriers to administering simulation were a lack of protected faculty time (75%) and a lack of faculty experience with simulation (56%). Interprofessional involvement in simulation was strongly valued by both residents and PRs. SBE is frequently used by Canadian FRCPC-EM residency programs. However, there exists considerable variability in the structure, frequency, and timing of simulation-based activities. As programs transition to competency-based medical education, national organizations and collaborations should consider the variability in how SBE is administered.
Following the water, the new program for Mars exploration.
Hubbard, G Scott; Naderi, Firouz M; Garvin, James B
2002-01-01
In the wake of the loss of Mars Climate Orbiter and Mars Polar Lander in late 1999, NASA embarked on a major review of the failures and subsequently restructured all aspects of what was then called the Mars Surveyor Program--now renamed the Mars Exploration Program. This paper presents the process and results of this reexamination and defines a new approach which we have called "Program System Engineering". Emphasis is given to the scientific, technological, and programmatic strategies that were used to shape the new Program. A scientific approach known as "follow the water" is described, as is an exploration strategy we have called "seek--in situ--sample". An overview of the mission queue from continuing Mars Global Surveyor through a possible Mars Sample Return Mission launch in 2011 is provided. In addition, key proposed international collaborations, especially those between NASA, CNES and ASI are outlined, as is an approach for a robust telecommunications infrastructure. © 2002 Published by Elsevier Science Ltd.
The Probability of Hitting a Polygonal Target
1981-04-01
required for the use of this method for computing the probability of hitting a polygonal target. These functions are: 1. PHIT (called by the user's main program); 2. FIJ (called by PHIT); 3. FUN (called by FIJ). The user must include all three of these in his main program, but needs only to call PHIT. The
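The report's Fortran routines are not reproduced here; as a rough illustration of the computation a PHIT-style routine performs, the sketch below estimates the probability of hitting a polygonal target by Monte Carlo sampling of a bivariate normal impact distribution. The function names, the sampling approach, and all parameter values are illustrative assumptions, not the report's actual (likely analytic) implementation.

```python
import random

def point_in_polygon(x, y, poly):
    """Ray-casting test: is (x, y) inside the polygon given as [(x1, y1), ...]?"""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal line through y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def phit_monte_carlo(poly, sigma_x, sigma_y, n_samples=50_000, seed=1):
    """Estimate P(hit): impact points are independent normals about the aim point."""
    rng = random.Random(seed)
    hits = sum(
        point_in_polygon(rng.gauss(0.0, sigma_x), rng.gauss(0.0, sigma_y), poly)
        for _ in range(n_samples)
    )
    return hits / n_samples

# Unit square centered on the aim point with a tight aim error: most shots hit.
square = [(-0.5, -0.5), (0.5, -0.5), (0.5, 0.5), (-0.5, 0.5)]
p = phit_monte_carlo(square, sigma_x=0.2, sigma_y=0.2)
```

With these invented dispersions, the estimate lands near the analytic value for a centered square, which for independent normals factors into a product of one-dimensional probabilities.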
The Ghost in the Machine: Are "Teacherless" CALL Programs Really Possible?
ERIC Educational Resources Information Center
Davies, Ted; Williamson, Rodney
1998-01-01
Reflects critically on pedagogical issues in the production of computer-assisted language learning (CALL) courseware and ways CALL has affected the practice of language learning. Concludes that if CALL is to reach full potential, it must be more than a simple medium of information; it should provide a teaching/learning process, with the real…
A Simulation Program for Dynamic Infrared (IR) Spectra
ERIC Educational Resources Information Center
Zoerb, Matthew C.; Harris, Charles B.
2013-01-01
A free program for the simulation of dynamic infrared (IR) spectra is presented. The program simulates the spectrum of two exchanging IR peaks based on simple input parameters. Larger systems can be simulated with minor modifications. The program is available as an executable program for PCs or can be run in MATLAB on any operating system. Source…
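The authors' MATLAB program is not shown in the abstract; the sketch below illustrates the general idea of simulating two exchanging IR peaks using a standard two-site exchange (Bloch-McConnell-type) matrix lineshape. The function name, parameter values, and units are invented for illustration and are not taken from the paper.

```python
import numpy as np

def two_site_lineshape(freqs, nu_a, nu_b, k_exchange, t2=1.0):
    """Lineshape for two equal-population peaks at nu_a, nu_b exchanging at rate k.

    Standard two-site exchange matrix solution:
    I(nu) = Re[ sum of x ], where (i*2*pi*(nu - Omega) + 1/T2 + K) x = p.
    """
    pops = np.array([0.5, 0.5])
    spectrum = np.empty_like(freqs, dtype=float)
    for idx, nu in enumerate(freqs):
        a = np.array(
            [[1j * 2 * np.pi * (nu - nu_a) + 1 / t2 + k_exchange, -k_exchange],
             [-k_exchange, 1j * 2 * np.pi * (nu - nu_b) + 1 / t2 + k_exchange]],
            dtype=complex,
        )
        spectrum[idx] = np.linalg.solve(a, pops).sum().real
    return spectrum

freqs = np.linspace(-20.0, 20.0, 801)
slow = two_site_lineshape(freqs, -5.0, 5.0, k_exchange=0.5)    # two resolved peaks
fast = two_site_lineshape(freqs, -5.0, 5.0, k_exchange=500.0)  # coalesced peak at 0
```

Sweeping `k_exchange` between these extremes reproduces the familiar broadening-then-coalescence behavior of dynamic spectra.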
The Evidence on Universal Preschool: Are Benefits Worth the Cost? Policy Analysis. Number 760
ERIC Educational Resources Information Center
Armor, David J.
2014-01-01
Calls for universal preschool programs have become commonplace, reinforced by President Obama's call for "high-quality preschool for all" in 2013. Any program that could cost state and federal taxpayers $50 billion per year warrants a closer look at the evidence on its effectiveness. This report reviews the major evaluations of preschool…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-12
... by calling the Regulations Division at 202-708-3055 (this is not a toll-free number). Individuals with speech or hearing impairments may access this number through TTY by calling the toll-free Federal... a toll-free number). Persons with hearing or speech impairments may access this number through TTY...
78 FR 68367 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; Ohio NOX
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-14
... Clean Air Act, which allows for Ohio's Clean Air Interstate Rule (CAIR) NO X Ozone Season Trading Program rules to supersede Ohio's nitrogen oxides (NO X ) State Implementation Plan (SIP) Call Budget Trading Program rules, but leave other requirements of the NO X SIP Call in place for units not covered by...
Preventing Boys' Problems in Schools through Psychoeducational Programming: A Call to Action
ERIC Educational Resources Information Center
O'Neil, James M.; Lujan, Melissa L.
2009-01-01
Controversy currently exists on whether boys are in crises and, if so, what to do about it. Research is reviewed that indicates that boys have problems that affect their emotional and interpersonal functioning. Psychoeducational and preventive programs for boys are recommended as a call to action in schools. Thematic areas for boys' programming…
The effects of fatigue on robotic surgical skill training in Urology residents.
Mark, James R; Kelly, Douglas C; Trabulsi, Edouard J; Shenot, Patrick J; Lallas, Costas D
2014-09-01
This study reports on the effect of fatigue on Urology residents using the daVinci surgical skills simulator (dVSS). Seven Urology residents performed a series of selected exercises on the dVSS while pre-call and post-call. Prior to dVSS performance a survey of subjective fatigue was taken and residents were tested with the Epworth Sleepiness Scale (ESS). Using the metrics available in the dVSS software, the performance of each resident was evaluated. The Urology residents slept an average of 4.07 h (range 2.5-6 h) while on call compared to an average of 5.43 h while not on call (range 3-7 h, p = 0.08). Post-call residents were significantly more likely to be identified as fatigued by the Epworth Sleepiness Score than pre-call residents (p = 0.01). Significant differences were observed in fatigued residents performing the exercises, Tubes and Match Board 2 (p = 0.05, 0.02). Additionally, there were significant differences in the total number of critical errors during the training session (9.29 vs. 3.14, p = 0.04). Fatigue in post-call Urology residents leads to poorer performance on the dVSS simulator. The dVSS may become a useful instrument in the education of fatigued residents and a tool to identify fatigue in trainees.
Payload training methodology study
NASA Technical Reports Server (NTRS)
1990-01-01
The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PCT) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.
This report describes the theoretical development, parameterization, and application software of a generalized, community-based bioaccumulation model called BASS (Bioaccumulation and Aquatic System Simulator).
Implementation of a successful on-call system in clinical chemistry.
Hobbs, G A; Jortani, S A; Valdes, R
1997-11-01
Successful practice of clinical pathology depends on a wide variety of laboratory, clinical, and managerial decisions. The skills needed to make these decisions can most effectively be learned by residents and fellows in pathology using a service-oriented on-call approach. We report our experience implementing an on-call system in the clinical chemistry laboratory at the University of Louisville Hospital (Ky). We detail the guidelines used to establish this system and the elements required for its successful implementation. The system emphasizes a laboratory-initiated approach to linking laboratory results to patient care. From inception of the program during late 1990 through 1995, the number of beeper calls (including clinician contacts) steadily increased and is currently 8 to 20 per week. The on-call system is active 24 hours per day, 7 days per week, thus representing activity on all three laboratory shifts. Types of responses were separated into administrative (12%), analytical (42%), clinical (63%), quality control or quality assurance (12%), and consultation (13%) categories. We also present 6 case reports as examples demonstrating multiple elements in these categories. In 23% of the calls, clinician contact was required and achieved by the fellow or resident on call for the laboratory. The on-call reports are documented and presented informally at weekly on-call report sessions. Emphasis is placed on learning and refinement of investigative skills needed to function as an effective laboratory director. Educational emphasis for the medical staff is in establishing awareness of the presence of the laboratory as an important interactive component of patient care. In addition, we found this program to be beneficial to the hospital and to the department of pathology in fulfilling its clinical service and teaching missions. Our experience may be helpful to other institutions establishing such a program.
Programming of a flexible computer simulation to visualize pharmacokinetic-pharmacodynamic models.
Lötsch, J; Kobal, G; Geisslinger, G
2004-01-01
Teaching pharmacokinetic-pharmacodynamic (PK/PD) models can be made more effective using computer simulations. We propose the programming of educational PK or PK/PD computer simulations as an alternative to the use of pre-built simulation software. This approach has the advantage of adaptability to non-standard or complicated PK or PK/PD models. Simplicity of the programming procedure was achieved by selecting the LabVIEW programming environment. An intuitive user interface to visualize the time courses of drug concentrations or effects can be obtained with pre-built elements. The environment uses a wiring analogy that resembles electrical circuit diagrams rather than abstract programming code. The goal of high interactivity of the simulation was attained by allowing the program to run in continuously repeating loops. This makes the program behave flexibly to the user input. The programming is described with the aid of a 2-compartment PK simulation. Examples of more sophisticated simulation programs are also given where the PK/PD simulation shows drug input, concentrations in plasma, and at effect site and the effects themselves as a function of time. A multi-compartmental model of morphine, including metabolite kinetics and effects is also included. The programs are available for download from the World Wide Web at http://www.klinik.uni-frankfurt.de/zpharm/klin/PKPDsimulation/content.html. For pharmacokineticists who only program occasionally, there is the possibility of building the computer simulation, together with the flexible interactive simulation algorithm for clinical pharmacological teaching in the field of PK/PD models.
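As a rough illustration of the kind of model such a teaching simulation might implement (not the authors' LabVIEW program), the sketch below integrates a standard two-compartment IV-bolus PK model with a simple Euler loop; all rate constants and the dose are invented.

```python
import numpy as np

def two_compartment_iv_bolus(dose, v_c, k10, k12, k21, t_end=24.0, dt=0.01):
    """Euler integration of a standard two-compartment IV-bolus PK model.

    dA_c/dt = -(k10 + k12) * A_c + k21 * A_p   (central: elimination + exchange)
    dA_p/dt =   k12 * A_c - k21 * A_p          (peripheral: exchange only)
    Returns times and the central-compartment concentration A_c / V_c.
    """
    times = np.arange(0.0, t_end + dt, dt)
    a_c = np.empty_like(times)
    a_p = np.empty_like(times)
    a_c[0], a_p[0] = dose, 0.0
    for i in range(1, len(times)):
        d_ac = -(k10 + k12) * a_c[i - 1] + k21 * a_p[i - 1]
        d_ap = k12 * a_c[i - 1] - k21 * a_p[i - 1]
        a_c[i] = a_c[i - 1] + dt * d_ac
        a_p[i] = a_p[i - 1] + dt * d_ap
    return times, a_c / v_c

# Invented parameters: 100 mg bolus, 10 L central volume, first-order rates per hour.
times, conc = two_compartment_iv_bolus(dose=100.0, v_c=10.0, k10=0.3, k12=0.5, k21=0.2)
```

Plotting `conc` against `times` shows the characteristic biexponential decline (fast distribution phase, slower terminal phase) that such teaching tools visualize interactively.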
Simulation Activity in Otolaryngology Residencies.
Deutsch, Ellen S; Wiet, Gregory J; Seidman, Michael; Hussey, Heather M; Malekzadeh, Sonya; Fried, Marvin P
2015-08-01
Simulation has become a valuable tool in medical education, and several specialties accept or require simulation as a resource for resident training or assessment as well as for board certification or maintenance of certification. This study investigates current simulation resources and activities in US otolaryngology residency programs and examines interest in advancing simulation training and assessment within the specialty. Web-based survey. US otolaryngology residency training programs. An electronic web-based survey was disseminated to all US otolaryngology program directors to determine their respective institutional and departmental simulation resources, existing simulation activities, and interest in further simulation initiatives. Descriptive results are reported. Responses were received from 43 of 104 (43%) residency programs. Simulation capabilities and resources are available in most respondents' institutions (78.6% report onsite resources; 73.8% report availability of models, manikins, and devices). Most respondents (61%) report limited simulation activity within otolaryngology. Areas of simulation are broad, addressing technical and nontechnical skills related to clinical training (94%). Simulation is infrequently used for research, credentialing, or systems improvement. The majority of respondents (83.8%) expressed interest in participating in multicenter trials of simulation initiatives. Most respondents from otolaryngology residency programs have incorporated some simulation into their curriculum. Interest among program directors to participate in future multicenter trials appears high. Future research efforts in this area should aim to determine optimal simulators and simulation activities for training and assessment as well as how to best incorporate simulation into otolaryngology residency training programs. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2015.
Cockpit Resource Management (CRM) for FAR Parts 91 and 135 operators
NASA Technical Reports Server (NTRS)
Schwartz, Douglas
1987-01-01
The why, what, and how of CRM at Flight Safety International (FSI)--that is, the philosophy behind the program, the content of the program, and some insight regarding how it delivers that to the pilot is presented. A few of the concepts that are part of the program are discussed. This includes a view of statistics called the Safety Window, the concept of situational awareness, and an approach to training that we called the Cockpit Management Concept (CMC).
Rim, Matthew H; Thomas, Karen C; Chandramouli, Jane; Barrus, Stephanie A; Nickman, Nancy A
2018-05-15
The implementation and quality assessment of a pharmacy services call center (PSCC) for outpatient pharmacies and specialty pharmacy services within an academic health system are described. Prolonged wait times in outpatient pharmacies or hold times on the phone affect the ability of pharmacies to capture and retain prescriptions. To support outpatient pharmacy operations and improve quality, a PSCC was developed to centralize handling of all outpatient and specialty pharmacy calls. The purpose of the PSCC was to improve the quality of pharmacy telephone services by (1) decreasing the call abandonment rate, (2) improving the speed of answer, (3) increasing first-call resolution, (4) centralizing all specialty pharmacy and prior authorization calls, (5) increasing labor efficiency and pharmacy capacities, (6) implementing a quality evaluation program, and (7) improving workplace satisfaction and retention of outpatient pharmacy staff. The PSCC centralized pharmacy calls from 9 pharmacy locations, 2 outpatient clinics, and a specialty pharmacy. Since implementation, the PSCC has achieved and maintained program goals, including improved abandonment rate, speed of answer, and first-call resolution. A centralized 24-7 support line for specialty pharmacy patients was also successfully established. A quality calibration program was implemented to ensure service quality and excellent patient experience. Additional ongoing evaluations measure the impact of the PSCC on improving workplace satisfaction and retention of outpatient pharmacy staff. The design and implementation of the PSCC have significantly improved the health system's patient experiences, efficiency, and quality. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Impact of an after-hours on-call emergency physician on ambulance transports from a county jail.
Chan, Theodore C; Vilke, Gary M; Smith, Sue; Sparrow, William; Dunford, James V
2003-01-01
The authors sought to determine if the availability of an after-hours on-call emergency physician by telephone for consultation to the staff at a county jail would safely reduce ambulance emergency department (ED) transport of inmates in the community. The authors conducted a prospective comparison study during the first ten months of an emergency physician on-call program for the county jail in which prospective data were collected on all consultations, including reason for call and disposition (ambulance, deputy, or no ED transport of inmate). They compared this time with a similar period a year before the program in terms of total ambulance transports from the jail. They also reviewed all hospital and jail medical records to assess for any adverse consequences within one month, or subsequent ambulance transport within 24 hours as a result of inmate care after the consultation call. Total after-hours ambulance transports from the jail decreased significantly from 30.3 transports/month (95% confidence interval [CI], 21.0-39.6) to 9.1 transports/month (95% CI, 4.1-14.0) (p < 0.05). The most common reasons for consultation calls were chest pain (16%), trauma (15%), and abnormal laboratory or radiology results (14%). Of all calls, only 30% resulted in ambulance transport to the ED. On review of records, no adverse outcome or subsequent ambulance transport was identified. The initiation of an on-call emergency physician program for after-hours consultation to jail nursing and law enforcement staff safely reduced ambulance transports from a county jail with no adverse outcomes identified.
Alloy Design Workbench-Surface Modeling Package Developed
NASA Technical Reports Server (NTRS)
Abel, Phillip B.; Noebe, Ronald D.; Bozzolo, Guillermo H.; Good, Brian S.; Daugherty, Elaine S.
2003-01-01
NASA Glenn Research Center's Computational Materials Group has integrated a graphical user interface with in-house-developed surface modeling capabilities, with the goal of using computationally efficient atomistic simulations to aid the development of advanced aerospace materials, through the modeling of alloy surfaces, surface alloys, and segregation. The software is also ideal for modeling nanomaterials, since surface and interfacial effects can dominate material behavior and properties at this level. Through the combination of an accurate atomistic surface modeling methodology and an efficient computational engine, it is now possible to directly model these types of surface phenomena and metallic nanostructures without a supercomputer. Fulfilling a High Operating Temperature Propulsion Components (HOTPC) project level-I milestone, a graphical user interface was created for a suite of quantum approximate atomistic materials modeling Fortran programs developed at Glenn. The resulting "Alloy Design Workbench-Surface Modeling Package" (ADW-SMP) is the combination of proven quantum approximate Bozzolo-Ferrante-Smith (BFS) algorithms (refs. 1 and 2) with a productivity-enhancing graphical front end. Written in the portable, platform-independent Java programming language, the graphical user interface calls on extensively tested Fortran programs running in the background for the detailed computational tasks. Designed to run on desktop computers, the package has been deployed on PC, Mac, and SGI computer systems. The graphical user interface integrates two modes of computational materials exploration. One mode uses Monte Carlo simulations to determine lowest energy equilibrium configurations. The second approach is an interactive "what if" comparison of atomic configuration energies, designed to provide real-time insight into the underlying drivers of alloying processes.
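As a toy illustration of the Monte Carlo mode described above (not the quantum-approximate BFS energetics), the sketch below runs Metropolis swap moves on a ring of A/B atoms with an invented pair energy that rewards unlike neighbors, driving the chain toward an ordered lowest-energy configuration.

```python
import math
import random

# Invented pair energies: unlike neighbors are favorable, so the ground state
# of an equal A/B mixture on a ring is the fully alternating arrangement.
PAIR_E = {("A", "A"): 0.0, ("B", "B"): 0.0, ("A", "B"): -1.0, ("B", "A"): -1.0}

def energy(chain):
    """Sum nearest-neighbor pair energies around the ring."""
    n = len(chain)
    return sum(PAIR_E[(chain[i], chain[(i + 1) % n])] for i in range(n))

def metropolis(chain, steps=30_000, temperature=0.2, seed=7):
    """Swap random pairs of atoms, accepting moves by the Metropolis criterion."""
    rng = random.Random(seed)
    chain = list(chain)
    e = energy(chain)
    for _ in range(steps):
        i, j = rng.randrange(len(chain)), rng.randrange(len(chain))
        chain[i], chain[j] = chain[j], chain[i]          # trial swap
        e_new = energy(chain)
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / temperature):
            e = e_new                                    # accept
        else:
            chain[i], chain[j] = chain[j], chain[i]      # reject: undo the swap
    return chain, e

start = ["A"] * 6 + ["B"] * 6   # segregated start: only 2 unlike bonds (E = -2)
final, e_final = metropolis(start)
```

Composition is conserved (only swaps are made), so the search explores configurations of a fixed alloy, as an equilibrium-configuration search must.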
Rohrmann, Sonja; Bechtoldt, Myriam N; Hopp, Henrik; Hodapp, Volker; Zapf, Dieter
2011-07-01
In customer interactions, emotional display rules typically prescribe service providers to suppress negative emotions and display positive ones. This study investigated the causal impact of these emotional display rules on physiological indicators of workers' stress and performance. Additionally, the moderating influence of personality was examined by analyzing the impact of trait anger. In a simulated call center, 82 females were confronted with a complaining customer and instructed to react either authentically and show their true emotions or to "serve with a smile" and hide negative emotions. Increases in diastolic blood pressure and heart rates were higher in the smile condition, while verbal fluency was lower. Trait anger moderated the effects on diastolic blood pressure and observer ratings of participants' professional competence, suggesting more negative effects for high trait anger individuals. Findings imply that emotional display rules may increase call center employees' strain and that considering employees' personality may be crucial for precluding health and performance impairments among call center workers.
78 FR 30956 - Cruise Vessel Security and Safety Training Provider Certification
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-23
..., practical demonstration, or simulation program. A detailed instructor manual must be submitted. Submissions... simulation programs to be used. If a simulator or simulation program is to be used, include technical... lessons and, if appropriate, for practical demonstrations or simulation exercises and assessments...
NASA Astrophysics Data System (ADS)
Faucher-Giguere, Claude-Andre
2016-10-01
HST has invested thousands of orbits to complete multi-wavelength surveys of high-redshift galaxies including the Deep Fields, COSMOS, 3D-HST and CANDELS. Over the next few years, JWST will undertake complementary, spatially-resolved infrared observations. Cosmological simulations are the most powerful tool to make detailed predictions for the properties of galaxy populations and to interpret these surveys. We will leverage recent major advances in the predictive power of cosmological hydrodynamic simulations to produce the first statistical sample of hundreds of galaxies simulated with 10 pc resolution and with explicit interstellar medium and stellar feedback physics proven to simultaneously reproduce the galaxy stellar mass function, the chemical enrichment of galaxies, and the neutral hydrogen content of galaxy halos. We will process our new set of full-volume cosmological simulations, called FIREBOX, with a mock imaging and spectral synthesis pipeline to produce realistic mock HST and JWST observations, including spatially-resolved photometry and spectroscopy. By comparing FIREBOX with recent high-redshift HST surveys, we will study the stellar build-up of galaxies, the evolution of massive star-forming clumps, their contribution to bulge growth, the connection of bulges to star formation quenching, and the triggering mechanisms of AGN activity. Our mock data products will also enable us to plan future JWST observing programs. We will publicly release all our mock data products to enable HST and JWST science beyond our own analysis, including with the Frontier Fields.
Confidential close call reporting system (C3RS) lessons learned team baseline phased report
DOT National Transportation Integrated Search
2015-05-08
The Federal Railroad Administration (FRA) has established a program called the Confidential Close Call Reporting System (C3RS), which allows events to be reported anonymously and dealt with non-punitively and without fear of reprisal through stru...
Confidential close call reporting system (C3RS) lessons learned team baseline phase report.
DOT National Transportation Integrated Search
2015-05-01
The Federal Railroad Administration (FRA) has established a program called the Confidential Close Call Reporting System (C3RS), which allows events to be reported anonymously and dealt with non-punitively and without fear of reprisal through s...
HDF-EOS 2 and HDF-EOS 5 Compatibility Library
NASA Technical Reports Server (NTRS)
Ullman, Richard; Bane, Bob; Yang, Jingli
2008-01-01
The HDF-EOS 2 and HDF-EOS 5 Compatibility Library contains C-language functions that provide uniform access to HDF-EOS 2 and HDF-EOS 5 files through one set of application programming interface (API) calls. ("HDF-EOS 2" and "HDF-EOS 5" are defined in the immediately preceding article.) Without this library, differences between the APIs of HDF-EOS 2 and HDF-EOS 5 would necessitate writing of different programs to cover HDF-EOS 2 and HDF-EOS 5. The API associated with this library is denoted "he25." For nearly every HDF-EOS 5 API call, there is a corresponding he25 API call. If a file in question is in the HDF-EOS 5 format, the code reverts to the corresponding HDF-EOS 5 call; if the file is in the HDF-EOS 2 format, the code translates the arguments to HDF-EOS 2 equivalents (if necessary), calls the HDF-EOS 2 call, and retranslates the results back to HDF-EOS 5 (if necessary).
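A minimal sketch of the dispatch pattern described above, in Python rather than C, with stand-in functions instead of the real HDF-EOS APIs: one wrapper detects the file's format, passes HDF-EOS 5 files straight through, and translates arguments for the older interface. The format probe, backend names, and mode codes are all hypothetical.

```python
def detect_format(path):
    """Pretend probe: real code would inspect the file's signature bytes."""
    return "HDF-EOS5" if path.endswith(".he5") else "HDF-EOS2"

def he5_swath_open(path, mode):          # stand-in for an HDF-EOS 5 call
    return {"backend": 5, "path": path, "mode": mode}

def he2_swath_open(path, access_code):   # stand-in for an HDF-EOS 2 call
    return {"backend": 2, "path": path, "mode": access_code}

MODE_TO_HE2 = {"r": 1, "w": 2}           # hypothetical argument translation table

def he25_swath_open(path, mode):
    """Uniform call: same signature regardless of the underlying file format."""
    if detect_format(path) == "HDF-EOS5":
        return he5_swath_open(path, mode)            # pass through unchanged
    return he2_swath_open(path, MODE_TO_HE2[mode])   # translate, call older API

h5 = he25_swath_open("granule.he5", "r")
h2 = he25_swath_open("granule.hdf", "r")
```

The caller writes one code path (`he25_swath_open`) and the wrapper absorbs the per-format differences, which is the design goal the abstract attributes to the library.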
Chang, Larry William; Kagaayi, Joseph; Nakigozi, Gertrude; Galiwango, Ronald; Mulamba, Jeremiah; Ludigo, James; Ruwangula, Andrew; Gray, Ronald H.; Quinn, Thomas C.; Bollinger, Robert C.; Reynolds, Steven J.
2009-01-01
Hotlines and warmlines have been successfully used in the developed world to provide clinical advice; however, reports on their replicability in resource-limited settings are limited. A warmline was established in Rakai, Uganda, to support an antiretroviral therapy program. Over a 17-month period, a database was kept of who called, why they called, and the result of the call. A program evaluation was also administered to clinical staff. A total of 1303 calls (3.5 calls per weekday) were logged. The warmline was used mostly by field staff and peripherally based peer health workers. Calls addressed important clinical issues, including the need for urgent care, medication side effects, and follow-up needs. Most clinical staff felt that the warmline made their jobs easier and improved the health of patients. An HIV/AIDS warmline leveraged the skills of a limited workforce to provide increased access to HIV/AIDS care, advice, and education. PMID:18441254
Near-Optimal Tracking Control of Mobile Robots Via Receding-Horizon Dual Heuristic Programming.
Lian, Chuanqiang; Xu, Xin; Chen, Hong; He, Haibo
2016-11-01
Trajectory tracking control of wheeled mobile robots (WMRs) has been an important research topic in control theory and robotics. Although various tracking control methods with stability have been developed for WMRs, it is still difficult to design optimal or near-optimal tracking controller under uncertainties and disturbances. In this paper, a near-optimal tracking control method is presented for WMRs based on receding-horizon dual heuristic programming (RHDHP). In the proposed method, a backstepping kinematic controller is designed to generate desired velocity profiles and the receding horizon strategy is used to decompose the infinite-horizon optimal control problem into a series of finite-horizon optimal control problems. In each horizon, a closed-loop tracking control policy is successively updated using a class of approximate dynamic programming algorithms called finite-horizon dual heuristic programming (DHP). The convergence property of the proposed method is analyzed and it is shown that the tracking control system based on RHDHP is asymptotically stable by using the Lyapunov approach. Simulation results on three tracking control problems demonstrate that the proposed method has improved control performance when compared with conventional model predictive control (MPC) and DHP. It is also illustrated that the proposed method has lower computational burden than conventional MPC, which is very beneficial for real-time tracking control.
ERIC Educational Resources Information Center
Hardin, Julia P., Ed.; Moulden, Richard G., Ed.
This compilation of over 40 lesson plans on various topics in law related education was written by classroom teachers from around the United States who had participated in the fifth of an annual series called Special Programs in Citizenship Education (SPICE)--weeklong institutes devoted to learning about different cultures and laws. Called SPICE V…
ERIC Educational Resources Information Center
Kathi, Pradeep Chandra
2012-01-01
The School of Planning Policy and Development at the University of Southern California brought representatives of neighborhood councils and city agencies of the city of Los Angeles together in an action research program. This action research program called the Collaborative Learning Project developed a collaboration process called the…
Justice Education as a Schoolwide Effort: Effective Religious Education in the Catholic School
ERIC Educational Resources Information Center
Horan, Michael P.
2005-01-01
This essay describes and analyzes one successful justice education program flowing from community service, and demonstrates how such a program in a Catholic school responds to several important "calls" to Catholic educators. These "calls" are issued by (a) the needs of the learners and the signs of the times, (b) official documents of the Church…
Evaluating the Generality and Limits of Blind Return-Oriented Programming Attacks
2015-12-01
We consider a recently proposed information disclosure vulnerability called blind return-oriented programming (BROP). Under certain conditions, this...
Evaluating the effectiveness of the Safety Investment Program (SIP) policies for Oregon.
DOT National Transportation Integrated Search
2009-10-01
The Safety Investment Program (SIP) was originally called the Statewide Transportation Improvement Program - : Safety Investment Program (STIP-SIP). The concept of the program was first discussed in October 1997 and the : program was adopted by the O...
Arbex, D F; Jappur, R; Selig, P; Varvakis, G
2012-01-01
This article addresses the ergonomic criteria that guide the construction of an educational game called Environmental Simulator. The focus is on environment navigation considering aspects of content architecture and its esthetics and functionality.
Duty hours and home call: the experience of plastic surgery residents and fellows.
Drolet, Brian C; Prsic, Adnan; Schmidt, Scott T
2014-05-01
Although resident duty hours are strictly regulated by the Accreditation Council for Graduate Medical Education, there are fewer restrictions on at-home call for residents. To date, no studies have examined the experience of home call for plastic surgery trainees or the impact of home call on patient care and education in plastic surgery. The authors distributed an anonymous electronic survey to plastic surgery trainees at 41 accredited programs, seeking to produce a descriptive assessment of home call and to evaluate its perceived impact on training and patient care. A total of 214 responses were obtained (58.3 percent completion rate). Nearly all trainees reported taking home call (98.6 percent), with 66.7 percent reporting call frequency every third or fourth night. Most respondents (63.3 percent) felt that home call regulations are vague but that Council regulation (44.9 percent) and programmatic oversight (56.5 percent) are adequate. Most (91.2 percent) believe their program could not function without home call and that home call helps to avoid strict duty hour restrictions (71.5 percent). Nearly all respondents (92.3 percent) preferred home call to in-house call. This is the first study to examine how plastic surgery residents experience and perceive home call within the framework of Accreditation Council for Graduate Medical Education duty hour regulations. Most trainees feel the impact of home call is positive for education (50.2 percent) and quality of life (56.5 percent), with a neutral impact on patient care (66.7 percent).
Teaching Harmonic Motion in Trigonometry: Inductive Inquiry Supported by Physics Simulations
ERIC Educational Resources Information Center
Sokolowski, Andrzej; Rackley, Robin
2011-01-01
In this article, the authors present a lesson whose goal is to utilise a scientific environment to immerse a trigonometry student in the process of mathematical modelling. The scientific environment utilised during this activity is a physics simulation called "Wave on a String" created by the PhET Interactive Simulations Project at…
Building a Community in Our Classroom: The Story of Bat Town, U.S.A.
ERIC Educational Resources Information Center
Keech, Andrea McGann
2001-01-01
Describes a simulation called, "Classroom City," used by elementary students to learn about communities. Focuses on the students' own simulated city named Bat Town, U.S.A. Discusses the project in detail. Describes the activities children participated in and the roles they assumed during the simulation. (CMK)
Rapid E-Learning Simulation Training and User Response
ERIC Educational Resources Information Center
Rackler, Angeline
2011-01-01
A new trend in e-learning development is to have subject matter experts use rapid development tools to create training simulations. This type of training is called rapid e-learning simulation training. Though companies are using rapid development tools to create training quickly and cost effectively, there is little empirical research to indicate…
Sato, Tatsuhiko
2015-01-01
By extending our previously established model, here we present a new model called "PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0," which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth's atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R2 values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research.
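The core fitting step described above (analytical functions whose parameters are tuned to reproduce EAS simulation output) can be sketched generically. The flux form, parameter values, and energy grid below are hypothetical stand-ins, not the actual PARMA functions:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical analytical form for a particle flux spectrum: a power law
# with an exponential cutoff (NOT one of the actual PARMA functions).
def flux_model(E, a, gamma, E0):
    return a * E**(-gamma) * np.exp(-E / E0)

# Stand-in for EAS simulation output: synthetic "simulated" fluxes with
# a small amount of relative noise.
rng = np.random.default_rng(0)
E = np.logspace(0, 3, 40)                       # energy grid (arbitrary units)
true_flux = flux_model(E, 2.0, 1.7, 400.0)
sim_flux = true_flux * (1 + 0.02 * rng.standard_normal(E.size))

# Fit the analytical parameters so the model reproduces the simulation,
# then quantify agreement with an R^2 value, as the abstract describes.
popt, _ = curve_fit(flux_model, E, sim_flux, p0=[1.0, 1.5, 300.0])
resid = sim_flux - flux_model(E, *popt)
r2 = 1 - np.sum(resid**2) / np.sum((sim_flux - sim_flux.mean())**2)
```

With the model family matching the data-generating form, the recovered parameters should sit close to the true values and the fit R^2 should be high.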
NASA Astrophysics Data System (ADS)
Silverstone, S.; Nelson, M.; Alling, A.; Allen, J.
For humans to survive long-term missions on the Martian surface, bioregenerative life support systems, including food production, will decrease requirements for the launch of Earth supplies and increase mission safety. It is proposed that the development of "modular biospheres" (closed system units that can be air-locked together and which contain soil-based bioregenerative agriculture and horticulture, with a wetland wastewater treatment system) is an approach for Mars habitation scenarios. Based on previous work done in long-term life support at Biosphere 2 and other closed ecological systems, this consortium proposes a research and development program called Mars On Earth™ which will simulate a life support system designed for a four-person crew. The structure will consist of 6 × 110 square meter modular agricultural units designed to produce a nutritionally adequate diet for 4 people, recycling all air, water and waste, while utilizing a soil created by the organic enrichment and modification of Mars simulant soils. Further research needs are discussed, such as determining optimal light levels for growth of the necessary range of crops, energy trade-offs for agriculture (e.g., light intensity vs. required area), capabilities of Martian soils and their need for enrichment and elimination of oxides, strategies for use of human waste products, and maintaining atmospheric balance between people, plants and soils.
Radom, Marcin; Rybarczyk, Agnieszka; Szawulak, Bartlomiej; Andrzejewski, Hubert; Chabelski, Piotr; Kozak, Adam; Formanowicz, Piotr
2017-12-01
Model development and its analysis is a fundamental step in systems biology. The theory of Petri nets offers a tool for such a task. Since the rapid development of computer science, a variety of tools for Petri nets emerged, offering various analytical algorithms. From this follows a problem of using different programs to analyse a single model. Many file formats and different representations of results make the analysis much harder. Especially for larger nets the ability to visualize the results in a proper form provides a huge help in the understanding of their significance. We present a new tool for Petri nets development and analysis called Holmes. Our program contains algorithms for model analysis based on different types of Petri nets, e.g. invariant generator, Maximum Common Transitions (MCT) sets and cluster modules, simulation algorithms or knockout analysis tools. A very important feature is the ability to visualize the results of almost all analytical modules. The integration of such modules into one graphical environment allows a researcher to fully devote his or her time to the model building and analysis. Available at http://www.cs.put.poznan.pl/mradom/Holmes/holmes.html. piotr@cs.put.poznan.pl. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
NASA Astrophysics Data System (ADS)
Dryzek, Jerzy; Siemek, Krzysztof
2013-08-01
The spatial distribution of positrons emitted from radioactive isotopes into stacks or layered samples is the subject of the presented report. It was found that Monte Carlo (MC) simulations using the GEANT4 code are not able to correctly describe the experimental data on positron fractions in stacks. A mathematical model was proposed for calculating the implantation profile or the positron fractions in the separate layers or foils that make up a stack. The model takes into account only two processes, i.e., positron absorption and backscattering at interfaces. The mathematical formulas were implemented in a computer program called LYS-1 (layers profile analysis). The theoretical predictions of the model were in good agreement with the results of the MC simulations for a semi-infinite sample. Experimental verifications of the model were performed on symmetrical and non-symmetrical stacks of different foils. Good agreement between the experimental and calculated fractions of positrons in the components of a stack was achieved. The experimental implantation profile obtained by depth scanning of positron implantation is also very well described by the theoretical profile obtained within the proposed model. The LYS-1 program also allows us to calculate the fraction of positrons that annihilate in the source, which can be useful in positron spectroscopy.
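A minimal sketch of the layered-stack bookkeeping, assuming a purely exponential implantation profile and ignoring the interface backscattering that the actual model includes; the absorption coefficients and thicknesses are hypothetical:

```python
import numpy as np

# Simplified positron fractions in a stack of foils: each layer absorbs an
# exponential share of what reaches it. Interface backscattering, which the
# real LYS-1 model accounts for, is deliberately omitted here.
def stack_fractions(alphas, thicknesses):
    """alphas: absorption coefficient per layer (1/um); thicknesses in um."""
    fractions = []
    transmitted = 1.0                       # fraction entering current layer
    for a, d in zip(alphas, thicknesses):
        absorbed = transmitted * (1.0 - np.exp(-a * d))
        fractions.append(absorbed)
        transmitted *= np.exp(-a * d)       # what passes into the next layer
    return np.array(fractions), transmitted

# Hypothetical 3-foil stack (coefficients and thicknesses are made up).
fracs, escaped = stack_fractions([0.02, 0.08, 0.02], [50.0, 20.0, 50.0])
```

By construction the per-layer fractions plus the escaping remainder sum to one, which is the consistency check such a layered model must satisfy.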
Quantum simulation from the bottom up: the case of rebits
NASA Astrophysics Data System (ADS)
Enshan Koh, Dax; Yuezhen Niu, Murphy; Yoder, Theodore J.
2018-05-01
Typically, quantum mechanics is thought of as a linear theory with unitary evolution governed by the Schrödinger equation. While this is technically true and useful for a physicist, with regards to computation it is an unfortunately narrow point of view. Just as a classical computer can simulate highly nonlinear functions of classical states, so too can the more general quantum computer simulate nonlinear evolutions of quantum states. We detail one particular simulation of nonlinearity on a quantum computer, showing how the entire class of -unitary evolutions (on n qubits) can be simulated using a unitary, real-amplitude quantum computer (consisting of n + 1 qubits in total). These operators can be represented as the sum of a linear and antilinear operator, and add an intriguing new set of nonlinear quantum gates to the toolbox of the quantum algorithm designer. Furthermore, a subgroup of these nonlinear evolutions, called the -Cliffords, can be efficiently classically simulated, by making use of the fact that Clifford operators can simulate non-Clifford (in fact, non-linear) operators. This perspective of using the physical operators that we have to simulate non-physical ones that we do not is what we call bottom-up simulation, and we give some examples of its broader implications.
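The real-amplitude simulation of complex unitaries can be illustrated with the standard rebit encoding, in which one extra qubit indexes the real and imaginary parts of the state; this is a generic sketch of that encoding, not the paper's exact construction:

```python
import numpy as np

# Encode an n-qubit complex state into a real vector of twice the length:
# the extra qubit's value selects the real or imaginary part.
def encode(psi):
    return np.concatenate([psi.real, psi.imag])

# A unitary U = A + iB acts on the encoding as the real orthogonal matrix
# [[A, -B], [B, A]] on n+1 qubits.
def encode_unitary(U):
    A, B = U.real, U.imag
    return np.block([[A, -B], [B, A]])

# Random 1-qubit unitary via QR decomposition of a random complex matrix.
rng = np.random.default_rng(1)
M = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
U, _ = np.linalg.qr(M)

psi = np.array([1.0, 1.0j]) / np.sqrt(2)
left = encode(U @ psi)                     # evolve, then encode
right = encode_unitary(U) @ encode(psi)    # encode, then evolve with real gate
```

The two paths agree, and the encoded gate is real orthogonal whenever U is unitary, which is what lets a real-amplitude machine track complex evolutions.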
Campbell's monkeys use affixation to alter call meaning.
Ouattara, Karim; Lemasson, Alban; Zuberbühler, Klaus
2009-11-12
Human language has evolved on a biological substrate with phylogenetic roots deep in the primate lineage. Here, we describe a functional analogy to a common morphological process in human speech, affixation, in the alarm calls of free-ranging adult Campbell's monkeys (Cercopithecus campbelli campbelli). We found that male alarm calls are composed of an acoustically variable stem, which can be followed by an acoustically invariable suffix. Using long-term observations and predator simulation experiments, we show that suffixation in this species functions to broaden the calls' meaning by transforming a highly specific eagle alarm to a general arboreal disturbance call or by transforming a highly specific leopard alarm call to a general alert call. We concluded that, when referring to specific external events, non-human primates can generate meaningful acoustic variation during call production that is functionally equivalent to suffixation in human language.
Stockert, Brad; Ohtake, Patricia J
2017-10-01
There is growing recognition that collaborative practice among healthcare professionals is associated with improved patient outcomes and enhanced team functioning, but development of collaborative practitioners requires interprofessional education (IPE). Immersive simulation, a clinically relevant experience that deeply engages the learner in realistic clinical environments, is used increasingly for IPE. The purpose of this study was to assess the use of immersive simulation as a strategy for IPE in physical therapist (PT) education programs. During fall 2014 and spring 2015, we contacted all 214 Commission on Accreditation in Physical Therapy Education accredited PT education programs in the United States and invited a faculty member to participate in our online survey. One hundred fourteen PT programs responded (53% response rate). Eighty responding programs (70%) identified themselves as users of immersive simulation, and 45 programs (39%) used simulation for IPE. Of these 45 programs, more than 90% included Interprofessional Education Collaborative competency learning objectives of roles/responsibilities, interprofessional communication, and teams/teamwork and 51% reported learning objectives for values/ethics for interprofessional practice. Interprofessional simulations with PT students commonly included nursing (91%). In programs using immersive simulation for IPE, 91% included debriefing and 51% included debriefing by interprofessional teams. Eighty accredited PT programs (70%) that responded to the survey use immersive simulation, and 45 programs (39%) use simulation for IPE. Most programs conduct simulations consistent with recognized best practice, including debriefing and Interprofessional Education Collaborative competency learning objectives for promoting interprofessional collaborative practice. 
We anticipate an increase in the use of immersive simulation for IPE as an educational strategy to comply with the revised Commission on Accreditation in Physical Therapy Education accreditation standards related to interprofessional collaborative practice that will become effective on January 1, 2018.
Weller, J M; Torrie, J; Boyd, M; Frengley, R; Garden, A; Ng, W L; Frampton, C
2014-06-01
Sharing information with the team is critical in developing a shared mental model in an emergency, and fundamental to effective teamwork. We developed a structured call-out tool, encapsulated in the acronym 'SNAPPI': Stop; Notify; Assessment; Plan; Priorities; Invite ideas. We explored whether a video-based intervention could improve structured call-outs during simulated crises and if this would improve information sharing and medical management. In a simulation-based randomized, blinded study, we evaluated the effect of the video-intervention teaching SNAPPI on scores for SNAPPI, information sharing, and medical management using baseline and follow-up crisis simulations. We assessed information sharing using a probe technique where nurses and technicians received unique, clinically relevant information probes before the simulation. Shared knowledge of probes was measured in a written, post-simulation test. We also scored sharing of diagnostic options with the team and medical management. Anaesthetists' scores for SNAPPI were significantly improved, as was the number of diagnostic options they shared. We found a non-significant trend toward improved information-probe sharing and medical management in the intervention group, and across all simulations, a significant correlation between SNAPPI and information-probe sharing. Of note, only 27% of the clinically relevant information about the patient provided to the nurse and technician in the pre-simulation information probes was subsequently learnt by the anaesthetist. We developed a structured communication tool, SNAPPI, to improve information sharing between anaesthetists and their team, taught it using a video-based intervention, and provide initial evidence to support its value for improving communication in a crisis. © The Author [2014]. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Heget, Jeffrey R; Bagian, James P; Lee, Caryl Z; Gosbee, John W
2002-12-01
In 1998 the Veterans Health Administration (VHA) created the National Center for Patient Safety (NCPS) to lead the effort to reduce adverse events and close calls systemwide. NCPS's aim is to foster a culture of safety in the Department of Veterans Affairs (VA) by developing and providing patient safety programs and delivering standardized tools, methods, and initiatives to the 163 VA facilities. To create a system-oriented approach to patient safety, NCPS looked for models in fields such as aviation, nuclear power, human factors, and safety engineering. Core concepts included a non-punitive approach to patient safety activities that emphasizes systems-based learning, the active seeking out of close calls, which are viewed as opportunities for learning and investigation, and the use of interdisciplinary teams to investigate close calls and adverse events through a root cause analysis (RCA) process. Participation by VA facilities and networks was voluntary. NCPS has always aimed to develop a program that would be applicable both within the VA and beyond. NCPS's full patient safety program was tested and implemented throughout the VA system from November 1999 to August 2000. Program components included an RCA system for use by caregivers at the front line, a system for the aggregate review of RCA results, information systems software, alerts and advisories, and cognitive aids. Following program implementation, NCPS saw a 900-fold increase in reporting of close calls of high-priority events, reflecting the level of commitment to the program by VHA leaders and staff.
NASA Technical Reports Server (NTRS)
Johnson, F. T.; Samant, S. S.; Bieterman, M. B.; Melvin, R. G.; Young, D. P.; Bussoletti, J. E.; Hilmes, C. L.
1992-01-01
A new computer program, called TranAir, for analyzing complex configurations in transonic flow (with subsonic or supersonic freestream) was developed. This program provides accurate and efficient simulations of nonlinear aerodynamic flows about arbitrary geometries with the ease and flexibility of a typical panel method program. The numerical method implemented in TranAir is described. The method solves the full potential equation subject to a set of general boundary conditions and can handle regions with differing total pressure and temperature. The boundary value problem is discretized using the finite element method on a locally refined rectangular grid. The grid is automatically constructed by the code and is superimposed on the boundary described by networks of panels; thus no surface fitted grid generation is required. The nonlinear discrete system arising from the finite element method is solved using a preconditioned Krylov subspace method embedded in an inexact Newton method. The solution is obtained on a sequence of successively refined grids which are either constructed adaptively based on estimated solution errors or are predetermined based on user inputs. Many results obtained by using TranAir to analyze aerodynamic configurations are presented.
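The solver pattern named here (an inexact Newton method whose linearized systems are solved only approximately by a Krylov method) can be illustrated with SciPy's generic newton_krylov on a toy 1-D nonlinear problem; this is not TranAir's full potential discretization:

```python
import numpy as np
from scipy.optimize import newton_krylov

# Toy stand-in for a nonlinear discretized flow equation: a 1-D nonlinear
# Poisson problem u'' = u**2 - f on a uniform grid with zero Dirichlet
# boundaries. NOT the full potential equation TranAir actually solves.
n = 50
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
f = np.sin(np.pi * x)

def residual(u):
    # Second difference with zero boundary values appended at both ends.
    upp = (np.r_[u[1:], 0.0] - 2 * u + np.r_[0.0, u[:-1]]) / h**2
    return upp - u**2 + f

# Inexact Newton: each Newton step solves the linearized system only
# approximately with a Krylov method (LGMRES in SciPy's default).
u = newton_krylov(residual, np.zeros(n), f_tol=1e-8)
```

The appeal of this pairing, as in TranAir, is that the Jacobian never has to be formed explicitly: the Krylov inner solver only needs residual evaluations.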
Risk-Constrained Dynamic Programming for Optimal Mars Entry, Descent, and Landing
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Kuwata, Yoshiaki
2013-01-01
A chance-constrained dynamic programming algorithm was developed that is capable of making optimal sequential decisions within a user-specified risk bound. This work handles stochastic uncertainties over multiple stages in the CEMAT (Combined EDL-Mobility Analyses Tool) framework. It was demonstrated by a simulation of Mars entry, descent, and landing (EDL) using real landscape data obtained from the Mars Reconnaissance Orbiter. Although standard dynamic programming (DP) provides a general framework for optimal sequential decision-making under uncertainty, it typically achieves risk aversion by imposing an arbitrary penalty on failure states. Such a penalty-based approach cannot explicitly bound the probability of mission failure. A key idea behind the new approach is called risk allocation, which decomposes a joint chance constraint into a set of individual chance constraints and distributes risk over them. The joint chance constraint was reformulated into a constraint on an expectation over a sum of an indicator function, which can be incorporated into the cost function by dualizing the optimization problem. As a result, the chance-constrained optimization problem can be turned into an unconstrained optimization over a Lagrangian, which can be solved efficiently using a standard DP approach.
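The risk-allocation idea (splitting the joint chance constraint into per-stage risk terms whose sum must stay within the user's bound) can be sketched with a toy sequential decision problem; the costs, risks, and exhaustive search below are illustrative, not the CEMAT formulation:

```python
import itertools

# Toy sketch of risk allocation: each stage offers actions with a cost and a
# failure probability. By the union bound, requiring the SUM of per-stage
# risks to stay within the total bound Delta guarantees the joint chance
# constraint. Brute-force enumeration stands in for the DP recursion here.
# (Illustrative only -- not the Mars EDL problem.)
stages = [
    [(5.0, 0.001), (1.0, 0.04)],   # (cost, risk) options at stage 0
    [(4.0, 0.002), (1.0, 0.03)],   # stage 1
    [(3.0, 0.001), (1.0, 0.05)],   # stage 2
]
Delta = 0.05                        # joint chance constraint: P(fail) <= 5%

best = None                         # (cost, risk, action sequence)
for choice in itertools.product(*(range(len(s)) for s in stages)):
    cost = sum(stages[t][a][0] for t, a in enumerate(choice))
    risk = sum(stages[t][a][1] for t, a in enumerate(choice))
    if risk <= Delta and (best is None or cost < best[0]):
        best = (cost, risk, choice)
```

Note how the cheapest action sequence overall violates the risk bound, so the optimizer spends extra cost at some stages to "buy" risk margin for others, which is exactly what allocating a risk budget across stages accomplishes.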
Abrahamsen, Håkon B; Sollid, Stephen J M; Öhlund, Lennart S; Røislien, Jo; Bondevik, Gunnar Tschudi
2015-01-01
Background Human error and deficient non-technical skills (NTSs) among providers of ALS in helicopter emergency medical services (HEMS) are a threat to patient and operational safety. Skills can be improved through simulation-based training and assessment. Objective To document the current level of simulation-based training and assessment of seven generic NTSs in crew members in the Norwegian HEMS. Methods A cross-sectional survey, either electronic or paper-based, of all 207 physicians, HEMS crew members (HCMs) and pilots working in the civilian Norwegian HEMS (11 bases), between 8 May and 25 July 2012. Results The response rate was 82% (n=193). A large proportion of each of the professional groups lacked simulation-based training and assessment of their NTSs. Compared with pilots and HCMs, physicians undergo statistically significantly less frequent simulation-based training and assessment of their NTSs. Fifty out of 82 (61%) physicians were on call for more than 72 consecutive hours on a regular basis. Of these, 79% did not have any training in coping with fatigue. In contrast, 72 out of 73 (99%) pilots and HCMs were on call for more than 3 days in a row. Of these, 54% did not have any training in coping with fatigue. Conclusions Our study indicates a lack of simulation-based training and assessment. Pilots and HCMs train and are assessed more frequently than physicians. All professional groups are on call for extended hours, but receive limited training in how to cope with fatigue. PMID:25344577
Acoustic Blind Deconvolution and Frequency-Difference Beamforming in Shallow Ocean Environments
2012-01-01
acoustic field experiment (FAF06) conducted in July 2006 off the west coast of Italy. Dr. Heechun Song of the Scripps Institution of Oceanography...from seismic surveying and whale calls recorded on a vertical array with 12 elements. The whale call frequencies range from 100 to 500 Hz and the water...underway. Together Ms. Abadi and Dr. Thode had considerable success simulating the experimental environment, deconvolving whale calls, ranging the
76 FR 17933 - Infrastructure Protection Data Call Survey
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-31
... Survey AGENCY: National Protection and Programs Directorate, DHS. ACTION: 60-Day Notice and request for... mission, IP requests opinions and information in a survey from IP Data Call participants regarding the IP Data Call process and the web-based application used to collect the CIKR data. The survey data...
77 FR 74828 - Call for Applications for the International Buyer Program Calendar Years 2014 and 2015
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-18
... DEPARTMENT OF COMMERCE International Trade Administration [Docket No. 120913451-2681-02] Call for... Administration, Department of Commerce. ACTION: Notice extending application deadline. SUMMARY: The U.S. Department of Commerce (DOC) is amending the Notice and Call for Applications for the International Buyer...
Pedagogy and Related Criteria: The Selection of Software for Computer Assisted Language Learning
ERIC Educational Resources Information Center
Samuels, Jeffrey D.
2013-01-01
Computer-Assisted Language Learning (CALL) is an established field of academic inquiry with distinct applications for second language teaching and learning. Many CALL professionals direct language labs or language resource centers (LRCs) in which CALL software applications and generic software applications support language learning programs and…
ERIC Educational Resources Information Center
Mangan, Marianne
2013-01-01
Call it physical activity, call it games, or call it play. Whatever its name, it's a place we all need to return to. In the physical education, recreation, and dance professions, we need to redesign programs to address the need for and want of play that is inherent in all of us.
Simulation of CIFF (Centralized IFF) remote control displays
NASA Astrophysics Data System (ADS)
Tucker, D. L.; Leibowitz, L. M.
1986-06-01
This report presents the software simulation of the Remote-Control-Display (RCS) proposed to be used in the Centralized IFF (CIFF) system. A description of the simulation programs along with simulated menu formats are presented. A sample listing of the simulation programs and a brief description of the program operation are also included.
On the Edge: Intelligent CALL in the 1990s.
ERIC Educational Resources Information Center
Underwood, John
1989-01-01
Examines the possibilities of developing computer-assisted language learning (CALL) based on the best of modern technology, arguing that artificial intelligence (AI) strategies will radically improve the kinds of exercises that can be performed. Recommends combining AI technology with other tools for delivering instruction, such as simulation and…
A Commercial Device Involving the Breathalyzer Test Reaction.
ERIC Educational Resources Information Center
Dombrink, Kathleen J.
1996-01-01
Describes the working of Final Call, a commercially available breath analyzing device, which uses the chemical reaction involving the reduction of chromium (VI) in the orange dichromate ion to the green chromium (III) ion to detect ethyl alcohol. Presents a demonstration that simulates the use of a Final Call device. (JRH)
Trick Simulation Environment 07
NASA Technical Reports Server (NTRS)
Lin, Alexander S.; Penn, John M.
2012-01-01
The Trick Simulation Environment is a generic simulation toolkit used for constructing and running simulations. This release includes a Monte Carlo analysis simulation framework and a data analysis package. It produces all auto documentation in XML. Also, the software is capable of inserting a malfunction at any point during the simulation. Trick 07 adds variable server output options and error messaging and is capable of using and manipulating wide characters for international support. Wide character strings are available as a fundamental type for variables processed by Trick. A Trick Monte Carlo simulation uses a statistically generated, or predetermined, set of inputs to iteratively drive the simulation. Also, there is a framework in place for optimization and solution finding where developers may iteratively modify the inputs per run based on some analysis of the outputs. The data analysis package is capable of reading data from external simulation packages such as MATLAB and Octave, as well as the common comma-separated values (CSV) format used by Excel, without the use of external converters. The file formats for MATLAB and Octave were obtained from their documentation sets, and Trick maintains generic file readers for each format. XML tags store the fields in the Trick header comments. For header files, XML tags for structures and enumerations, and the members within are stored in the auto documentation. For source code files, XML tags for each function and the calling arguments are stored in the auto documentation. When a simulation is built, a top level XML file, which includes all of the header and source code XML auto documentation files, is created in the simulation directory. Trick 07 provides an XML to TeX converter. The converter reads in header and source code XML documentation files and converts the data to TeX labels and tables suitable for inclusion in TeX documents. 
A malfunction insertion capability allows users to override the value of any simulation variable, or call a malfunction job, at any time during the simulation. Users may specify conditions, use the return value of a malfunction trigger job, or manually activate a malfunction. The malfunction action may consist of executing a block of input file statements in an action block, setting simulation variable values, calling a malfunction job, or turning simulation jobs on or off.
NASA Astrophysics Data System (ADS)
Iwamura, T.; Fragoso, J.; Lambin, E.
2012-12-01
The interactions with animals are vital to the Amerindians, the indigenous people of the Rupunini savannah-forest in Guyana. Their connections extend from basic energy and protein resources to spiritual bonding through "pairing" with a certain animal in the forest. We collected an extensive dataset from 23 indigenous communities over 3.5 years, comprising 9,900 individuals from 1,307 households, as well as animal observation data from 8 transects per community (47,000 data entries). In this presentation, our research interest is to model the drivers of land use change in the indigenous communities and their impacts on the ecosystem in the Rupunini area under global change. The overarching question we would like to answer with this program is how and why a "tipping point" from a hunting-gathering society to an agricultural society occurs in the future. A secondary question is what the change to an agricultural society implies for biodiversity and carbon stock in the area, and eventually for the well-being of the Rupunini people. To answer the questions regarding the society's shift toward agricultural activities, we built a simulation with agent-based modeling (multi-agent simulation). We developed this simulation using NetLogo, a programming environment specialized for spatially explicit agent-based modeling (ABM). The simulation consists of four different processes in the Rupunini landscape: forest succession, animal population growth, hunting of animals, and land clearing for agriculture. All of these processes are carried out by a set of computational units called "agents". In this program, there are four types of agents: patches, villages, households, and animals. Here, we describe the impacts of hunting on biodiversity based on actual demographic data from one village named Crush Water. The animal population within the hunting territory of the village stabilized, but Agouti/Paca dominate the landscape, with only small populations of armadillos and peccaries.
White-tailed deer, tapirs, and capybara exist, but at very low numbers. This finding aligns well with the hunting dataset: Agouti/Paca account for 27% of total hunting. Based on our simulation, it seems the dominance of Agouti/Paca among hunted animals shown in the field data can be explained solely by their high carrying capacity relative to human extraction (population density of Paca/Agouti = 60 per square km, whereas other animals range from 0.63 to 7). When we incorporate agriculture, the "rodentation" of the animal population toward Agouti/Paca becomes more obvious. This simulation shows the interactions of people and animals through land change and hunting, as observed in our field sites.
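The mechanism proposed here (Agouti/Paca persist because their carrying capacity far exceeds the hunting off-take) can be sketched with logistic growth minus a constant harvest. Only the carrying capacities echo the abstract; the growth rate and harvest level are hypothetical:

```python
# Logistic growth with a constant hunting off-take per square km.
# Carrying capacities follow the abstract (Agouti/Paca ~60 per sq km,
# other game 0.63-7); r and harvest are made-up illustrative values.
def harvest_dynamics(K, r=0.5, harvest=1.0, steps=200):
    n = K                       # start at carrying capacity
    for _ in range(steps):
        n = max(0.0, n + r * n * (1 - n / K) - harvest)
    return n

agouti_paca = harvest_dynamics(K=60.0)   # high-K species persists
peccary = harvest_dynamics(K=4.0)        # low-K species is hunted out
```

With identical extraction pressure, the high-K population settles near a positive equilibrium while the low-K one collapses to zero, reproducing the "rodentation" pattern qualitatively.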
2011-05-24
University students prepare their team's remote controlled or autonomous excavator, called a lunabot, to maneuver in about 60 tons of ultra-fine simulated lunar soil, called BP-1. Thirty-six teams of undergraduate and graduate students from the United States, Bangladesh, Canada, Colombia and India will participate in NASA's Lunabotics Mining Competition May 26 - 28 at the agency's Kennedy Space Center in Florida. The competition is designed to engage and retain students in science, technology, engineering and mathematics (STEM). Teams will maneuver their remote controlled or autonomous excavators, called lunabots, in about 60 tons of ultra-fine simulated lunar soil. The competition is an Exploration Systems Mission Directorate project managed by Kennedy's Education Division. The event also provides a competitive environment that could result in innovative ideas and solutions for NASA's future excavation of the moon. Photo credit: NASA/Jack Pfaller
Code of Federal Regulations, 2012 CFR
2012-01-01
... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false When an agency's mission... REGULATION REAL PROPERTY 83-LOCATION OF SPACE Location of Space Urban Areas § 102-83.110 When an agency's mission and program requirements call for the location in an urban area, are Executive agencies required...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false When an agency's mission... REGULATION REAL PROPERTY 83-LOCATION OF SPACE Location of Space Urban Areas § 102-83.110 When an agency's mission and program requirements call for the location in an urban area, are Executive agencies required...
Code of Federal Regulations, 2011 CFR
2011-01-01
... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false When an agency's mission... REGULATION REAL PROPERTY 83-LOCATION OF SPACE Location of Space Urban Areas § 102-83.110 When an agency's mission and program requirements call for the location in an urban area, are Executive agencies required...
Code of Federal Regulations, 2014 CFR
2014-01-01
... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false When an agency's mission... REGULATION REAL PROPERTY 83-LOCATION OF SPACE Location of Space Urban Areas § 102-83.110 When an agency's mission and program requirements call for the location in an urban area, are Executive agencies required...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false When an agency's mission... REGULATION REAL PROPERTY 83-LOCATION OF SPACE Location of Space Urban Areas § 102-83.110 When an agency's mission and program requirements call for the location in an urban area, are Executive agencies required...
Demonstrating the value of a social science research program to a natural resource management agency
Pamela J. Jakes; John F. Dwyer; Deborah S. Carr
1998-01-01
With ever-tightening resources to address an increasing number of diverse and complex issues, it has become common for scientists and managers to be called upon to demonstrate the value of their programs. In the spring of 1995, social scientists at the USDA Forest Service North Central Forest Experiment Station were so called upon. This paper discusses an effort to...
Nagaie, Satoshi; Ogishima, Soichi; Nakaya, Jun; Tanaka, Hiroshi
2015-01-01
Genome-wide association studies (GWAS) and linkage analyses have identified many single nucleotide polymorphisms (SNPs) related to disease. Many SNPs remain unknown whose minor allele frequencies (MAFs) are as low as 0.005 and whose intermediate effects have odds ratios between 1.5 and 3.0. Low-frequency variants with intermediate effects on disease pathogenesis are believed to have complex interactions with environmental factors, called gene-environment interactions (GxE). Hence, we describe a model using a 3D Manhattan plot, called the GxE landscape plot, to visualize the association p-values for gene-environment interactions (GxE). We used the Gene-Environment iNteraction Simulator 2 (GENS2) program to simulate interactions between two genetic loci and one environmental factor in this exercise. The training dataset contains disease status, gender, 20 environmental exposures, and 100 genotypes for 170 subjects; p-values were calculated with the Cochran-Mantel-Haenszel chi-squared test on the known data. Subsequently, we created a 3D GxE landscape plot of the negative logarithm of the association p-values for all possible combinations of genetic and environmental factors, together with their hierarchical clustering. Thus, the GxE landscape plot is a valuable model for predicting association p-values for GxE and similarity among genotypes and environments in the context of disease pathogenesis. GxE - gene-environment interactions; GWAS - genome-wide association study; MAFs - minor allele frequencies; SNPs - single nucleotide polymorphisms; EWAS - environment-wide association study; FDR - false discovery rate; JPT+CHB - HapMap population of Japanese in Tokyo, Japan, and Han Chinese in Beijing.
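The abstract's core computation, a stratified chi-squared test per (genotype, environment) pair whose -log10 p-value becomes a height in the landscape plot, can be sketched as follows. This is a minimal sketch: the CMH statistic is the standard 1-df form without continuity correction, and the example tables are illustrative, not drawn from the GENS2 dataset.

```python
import numpy as np
from scipy.stats import chi2

def cmh_chi2(tables):
    """Cochran-Mantel-Haenszel chi-squared statistic (1 df, no
    continuity correction) for a list of 2x2 tables, one per stratum
    (e.g. one per level of an environmental exposure)."""
    num, var = 0.0, 0.0
    for t in np.asarray(tables, dtype=float):
        a, n = t[0, 0], t.sum()
        r1, r2 = t[0].sum(), t[1].sum()
        c1, c2 = t[:, 0].sum(), t[:, 1].sum()
        num += a - r1 * c1 / n                        # observed - expected
        var += r1 * r2 * c1 * c2 / (n ** 2 * (n - 1))
    stat = num ** 2 / var
    return stat, chi2.sf(stat, df=1)

# one (genotype, environment) pair, stratified into two illustrative
# 2x2 tables of [case, control] x [exposed, unexposed]
stat, p = cmh_chi2([[[30, 10], [10, 30]],
                    [[30, 10], [10, 30]]])
neg_log_p = -np.log10(p)          # height in the landscape plot
```

Repeating this over all genotype-environment combinations yields the grid of -log10 p-values that the 3D plot displays.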
Prins, Pjotr; Goto, Naohisa; Yates, Andrew; Gautier, Laurent; Willis, Scooter; Fields, Christopher; Katayama, Toshiaki
2012-01-01
Open-source software (OSS) encourages computer programmers to reuse software components written by others. In evolutionary bioinformatics, OSS comes in a broad range of programming languages, including C/C++, Perl, Python, Ruby, Java, and R. To avoid writing the same functionality multiple times for different languages, it is possible to share components by bridging computer languages and Bio* projects, such as BioPerl, Biopython, BioRuby, BioJava, and R/Bioconductor. In this chapter, we compare the two principal approaches for sharing software between different programming languages: either by remote procedure call (RPC) or by sharing a local call stack. RPC provides a language-independent protocol over a network interface; examples are RSOAP and Rserve. The local call stack provides a between-language mapping not over the network interface but directly in computer memory; examples are R bindings, RPy, and languages sharing the Java Virtual Machine stack. These mechanisms provide strategies for sharing software between Bio* projects that could be exploited more often. Here, we present cross-language examples for sequence translation and measure the throughput of the different options. We compare calling into R through native R, RSOAP, Rserve, and RPy interfaces with the performance of native BioPerl, Biopython, BioJava, and BioRuby implementations, and with call-stack bindings to BioJava and the European Molecular Biology Open Software Suite. In general, call-stack approaches outperform native Bio* implementations, and these, in turn, outperform RPC-based approaches. To test and compare strategies, we provide a downloadable BioNode image with all examples, tools, and libraries included. The BioNode image can be run on VirtualBox-supported operating systems, including Windows, OSX, and Linux.
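The throughput gap the chapter measures between in-process calls and RPC can be illustrated with a toy sequence-translation function exposed both ways. This is a hedged sketch: the codon table is a deliberately tiny stand-in for a full translation table, and Python's built-in XML-RPC stands in for RSOAP/Rserve.

```python
import threading
import timeit
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# toy codon table (subset) -- a stand-in for a full translation table
CODONS = {"ATG": "M", "TGG": "W", "TTT": "F", "AAA": "K", "TAA": "*"}

def translate(seq):
    """Translate a DNA string codon by codon ('X' for unknown codons)."""
    return "".join(CODONS.get(seq[i:i + 3], "X")
                   for i in range(0, len(seq) - 2, 3))

# expose the same function over a local XML-RPC endpoint
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(translate)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()
proxy = ServerProxy(f"http://127.0.0.1:{port}")

seq, n = "ATGTGGTTTAAATAA", 200
t_local = timeit.timeit(lambda: translate(seq), number=n)      # in-process
t_rpc = timeit.timeit(lambda: proxy.translate(seq), number=n)  # over RPC
server.shutdown()
```

On a local machine the RPC path is typically orders of magnitude slower per call, consistent with the chapter's ranking (call stack > native > RPC).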
Rimmer, James H; Vanderbom, Kerri A
2016-01-01
The growing evidence base of childhood obesity prevention and treatment programs does not adequately consider how to adapt these programs for children with disabilities. We propose a Call to Action for health researchers who conduct studies focused on the general population (i.e., without a disability) to work closely with disability researchers to adapt their programs (e.g., obesity management, increased physical activity, and caregiver training in diet and nutrition) to be relevant to both groups. We refer to this approach as inclusion team science. The hope for this Call to Action is that there will be greater synergy between researchers who have high levels of expertise in a specialty area of health (but little or no knowledge of how to adapt their program for children with disabilities) and researchers who have a high level of expertise in adapting evidence-based health promotion recommendations and strategies for children with disabilities. Together, these two areas of expertise will lead to inclusive physical activity and nutrition programs for all children.
47 CFR 74.791 - Digital call signs.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., AUXILIARY, SPECIAL BROADCAST AND OTHER PROGRAM DISTRIBUTIONAL SERVICES Low Power TV, TV Translator, and TV... −D. (b) Digital television translator stations. Call signs for digital television translator stations...
Al-Bustani, Saif; Halvorson, Eric G
2016-06-01
Various simulation models for microsurgery have been developed to overcome the limitations of Halstedian training on real patients. We wanted to assess the status of microsurgery simulation in plastic surgery residency programs in the United States. Data were analyzed from responses to a survey sent to all plastic surgery program directors in the United States, asking for the type of simulation, quality of facilities, utilization by trainees, evaluation of trainee sessions, and perception of the relevance of simulation. The survey response rate was 50%. Of all programs, 69% provide microsurgical simulation; 75% of these have a laboratory with a microscope, and 52% provide live animal models. Half share facilities with other departments. The quality of facilities is rated as good or great in 89%. Trainee utilization is once every 3 to 6 months in 82% of programs; in only 11% is utilization monthly. Formal evaluation of simulation sessions is provided by 41% of programs. All program directors agree simulation is relevant to competence in microsurgery, 60% agree simulation should be mandatory, and 43% require trainees to complete a formal microsurgery course prior to live surgery. There seems to be consensus that microsurgical simulation improves competence, and the majority of program directors agree it should be mandatory. Developing and implementing standardized simulation modules and assessment tools for trainees across the nation, as part of a comprehensive competency-based training program for microsurgery, is an important patient safety initiative that should be considered. Organizing with other departments to share facilities may improve their quality and hence utilization.
NASA Astrophysics Data System (ADS)
Shukla, Hemant; Bonissent, Alain
2017-04-01
We present the parameterized simulation of an integral-field unit (IFU) slicer spectrograph and its applications in spectroscopic studies, namely, for probing dark energy with Type Ia supernovae. The simulation suite is called the fast-slicer IFU simulator (FISim). The data flow of FISim realistically models the optics of the IFU along with the propagation effects, including cosmological, zodiacal, instrumentation, and detector effects. FISim simulates the spectrum extraction by computing the error matrix on the extracted spectrum. The applications to Type Ia supernova spectroscopy are used to establish the efficacy of the simulator in exploring the wider parametric space, in order to optimize the science and mission requirements. The input spectral models use observables such as the optical depth and velocity of the Si II absorption feature in the supernova spectrum as the measured parameters for various studies. Using FISim, we introduce a mechanism for preserving the complete state of a system, called the ∂p/∂f matrix, which allows for compression, reconstruction, and spectrum extraction; we introduce a novel and efficient method for spectrum extraction, called super-optimal spectrum extraction; and we conduct various studies such as optimal point spread function, optimal resolution, and parameter estimation. We demonstrate that for space-based telescopes the optimal resolution lies in the region near R ≈ 117 for read noise of 1 e- and 7 e-, using a 400 km s^-1 error threshold on the Si II velocity.
ERIC Educational Resources Information Center
Tsai, Fu-Hsing
2018-01-01
This study developed a computer-simulated science inquiry environment, called the Science Detective Squad, to engage students in investigating an electricity problem that may happen in daily life. The environment combined the simulation of scientific instruments and a virtual environment, including gamified elements, such as points and a story for…
Learning the Norm of Internality: NetNorm, a Connectionist Model
ERIC Educational Resources Information Center
Thierry, Bollon; Adeline, Paignon; Pascal, Pansu
2011-01-01
The objective of the present article is to show that connectionist simulations can be used to model some of the socio-cognitive processes underlying the learning of the norm of internality. For our simulations, we developed a connectionist model which we called NetNorm (based on Dual-Network formalism). This model is capable of simulating the…
Lu, Yehu; Wang, Faming; Peng, Hui
2016-07-01
The effect of sweating simulation methods on clothing evaporative resistance was investigated in a so-called isothermal condition (T_manikin = T_a = T_r). Two sweating simulation methods, namely the pre-wetted fabric "skin" (PW) and the water-supplied sweating (WS), were applied to determine clothing evaporative resistance on a "Newton" thermal manikin. Results indicated that the clothing evaporative resistance determined by the WS method was significantly lower than that measured by the PW method. In addition, the evaporative resistances measured by the two methods were correlated and exhibited a linear relationship. Validation experiments demonstrated that the empirical regression equation showed highly acceptable estimations. The study contributes to improving the accuracy of measurements of clothing evaporative resistance by means of a sweating manikin.
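Under the isothermal condition, total evaporative resistance is computed from the skin-to-air vapor-pressure gradient and the measured evaporative heat loss, Re,t = (P_sk - P_a) · A / H_e. The sketch below uses illustrative numbers, not the study's measurements.

```python
def evaporative_resistance(p_skin_kpa, p_air_kpa, area_m2, heat_loss_w):
    """Total evaporative resistance Re,t = (P_sk - P_a) * A / H_e,
    in m^2.kPa/W, as evaluated with a sweating manikin under the
    isothermal condition (no dry heat exchange)."""
    return (p_skin_kpa - p_air_kpa) * area_m2 / heat_loss_w

# illustrative numbers: saturated "skin" at 34 C (~5.3 kPa),
# ambient vapor pressure ~2.1 kPa, 1.8 m^2 manikin, 320 W measured
re_t = evaporative_resistance(5.3, 2.1, 1.8, 320.0)
```

The resulting value (~0.018 m^2.kPa/W) is in the range typical of light clothing ensembles, though any real figure depends on the measured heat loss.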
Drawert, Brian; Lawson, Michael J; Petzold, Linda; Khammash, Mustafa
2010-02-21
We have developed a computational framework for accurate and efficient simulation of stochastic spatially inhomogeneous biochemical systems. The new computational method employs a fractional step hybrid strategy. A novel formulation of the finite state projection (FSP) method, called the diffusive FSP method, is introduced for the efficient and accurate simulation of diffusive transport. Reactions are handled by the stochastic simulation algorithm.
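The reaction step that the hybrid scheme hands to the stochastic simulation algorithm can be sketched with Gillespie's direct method for a single decay reaction. This is a generic illustration of the SSA, not the paper's diffusive-FSP code.

```python
import random

def ssa(x0, rate, t_end, seed=1):
    """Gillespie direct-method SSA for a single first-order decay
    reaction A -> 0 with propensity rate * x.  Returns the sampled
    trajectory as (time, copy_number) pairs."""
    rng = random.Random(seed)
    x, t = x0, 0.0
    traj = [(0.0, x0)]
    while x > 0:
        a = rate * x                    # total propensity
        t += rng.expovariate(a)         # exponential waiting time
        if t > t_end:
            break
        x -= 1                          # fire the (only) reaction
        traj.append((t, x))
    return traj

traj = ssa(x0=100, rate=0.5, t_end=20.0)
```

In the fractional-step framework described above, a step like this would alternate with a diffusion step handled by the diffusive FSP method.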
Genetic Algorithms and Their Application to the Protein Folding Problem
1993-12-01
and symbolic methods, random methods such as Monte Carlo simulation and simulated annealing, distance geometry, and molecular dynamics. Many of these...calculated energies with those obtained using the molecular simulation software package called CHARMm. 9) Test both the simple and parallel simple genetic...homology-based, and simplification techniques. 3.21 Molecular Dynamics. Perhaps the most natural approach is to actually simulate the folding process. This
System Design Considerations for Microcomputer Based Instructional Laboratories.
1986-04-01
when wrong procedures are tried as well as correct procedures. This is sometimes called "free play" simulation. While this form of simulation...steps are performed correctly. Unlike "free play" system simulations, the student must perform the operation in an approved manner. V. Technical...Supports free play exercises o Typically does not tutor a student o Used for skill development and performance measurement Task Simulation o Computer
; (we call this type of surface a vicinal surface). Modern scanned-probe microscopes, such as the STM
Zachar, István; Fedor, Anna; Szathmáry, Eörs
2011-01-01
The simulation of complex biochemical systems, consisting of intertwined subsystems, is a challenging task in computational biology. The complex biochemical organization of the cell is effectively modeled by the minimal cell model called chemoton, proposed by Gánti. Since the chemoton is a system consisting of a large but fixed number of interacting molecular species, it can effectively be implemented in a process algebra-based language such as the BlenX programming language. The stochastic model behaves comparably to previous continuous deterministic models of the chemoton. Additionally to the well-known chemoton, we also implemented an extended version with two competing template cycles. The new insight from our study is that the coupling of reactions in the chemoton ensures that these templates coexist providing an alternative solution to Eigen's paradox. Our technical innovation involves the introduction of a two-state switch to control cell growth and division, thus providing an example for hybrid methods in BlenX. Further developments to the BlenX language are suggested in the Appendix. PMID:21818258
NASA Astrophysics Data System (ADS)
Prokopovich, Dmitriy; Larini, Luca
This study focuses on the effect of pseudo-phosphorylation on the aggregation of the protein tau, which is very often found interacting with microtubules in the neuron. Within the axon of the neuron, tau governs the assembly of the microtubules that make up the cytoskeleton. This is important for stabilization of, and transport across, the microtubules. One of the hallmarks of Alzheimer's disease is the hyper-phosphorylation and aggregation of tau into neurofibrillary tangles that destroy neurons. However, even experts in the field do not know whether hyper-phosphorylation directly causes the aggregation of tau. In some experiments, pseudo-phosphorylation mimics the effects of phosphorylation by mutating certain residues of the protein chain into charged residues. In this computational study, we employ a fragment of tau called PHF43. This fragment belongs to the microtubule-binding region, and published work has indicated that it readily aggregates. Replica-exchange molecular dynamics simulations were performed on the pseudo-phosphorylated, phosphorylated, and dimerized PHF43. The program used to simulate and analyze PHF43 was AMBER14.
treeman: an R package for efficient and intuitive manipulation of phylogenetic trees.
Bennett, Dominic J; Sutton, Mark D; Turvey, Samuel T
2017-01-07
Phylogenetic trees are hierarchical structures used for representing the inter-relationships between biological entities. They are the most common tool for representing evolution and are essential to a range of fields across the life sciences. The manipulation of phylogenetic trees (adding or removing tips) is often performed by researchers not just for reasons of management but also for performing simulations in order to understand the processes of evolution. Despite this, the most common programming language among biologists, R, has few class structures well suited to these tasks. We present an R package that contains a new class, called TreeMan, for representing the phylogenetic tree. This class has a list structure allowing phylogenetic trees to be manipulated more efficiently. Computational running times are reduced because of the ready ability to vectorise and parallelise methods. Development is also improved due to fewer lines of code being required for performing manipulation processes. We present three use cases (pinning missing taxa to a supertree, simulating evolution with a tree-growth model, and detecting significant phylogenetic turnover) that demonstrate the new package's speed and simplicity.
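The design idea, a flat list/map of nodes rather than a recursive tree, can be sketched outside R as well. The class below is a hypothetical illustration of the principle, not the TreeMan API.

```python
class ListTree:
    """Flat dict-backed phylogeny: node id -> (parent id, branch length).
    Hypothetical illustration of the list-structure idea; not the
    TreeMan API.  A flat mapping makes adding or removing a tip O(1)
    rather than requiring a recursive traversal."""

    def __init__(self):
        self.nodes = {"root": (None, 0.0)}

    def add_tip(self, tip, parent, length):
        assert parent in self.nodes
        self.nodes[tip] = (parent, length)

    def remove_tip(self, tip):
        # only leaves may be removed this way
        assert not any(p == tip for p, _ in self.nodes.values())
        del self.nodes[tip]

    def tips(self):
        parents = {p for p, _ in self.nodes.values()}
        return sorted(n for n in self.nodes if n not in parents)

tree = ListTree()
tree.add_tip("n1", "root", 1.0)   # internal node
tree.add_tip("t1", "n1", 0.5)
tree.add_tip("t2", "n1", 0.7)
tree.remove_tip("t2")             # e.g. dropping a taxon before a simulation
```

Because every node is an independent map entry, tip operations never touch the rest of the structure, which is also what makes them easy to vectorise and parallelise.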
Mastin, M.C.; Le, Thanh
2001-01-01
The U.S. Geological Survey, in cooperation with the Pierce County Department of Public Works, Washington, has developed an operational tool called the Puyallup Flood-Alert System to alert users of impending floods in the Puyallup River Basin. The system acquires and incorporates meteorological and hydrological data into the Streamflow Synthesis and Reservoir Regulation (SSARR) hydrologic flow-routing model to simulate floods in the Puyallup River Basin. SSARRMENU is the user-interactive graphical interface between the user, the input and output data, and the SSARR model. In a companion cooperative project with Pierce County, the SSARR model for the Puyallup River Basin was calibrated and validated. The calibrated model is accessed through SSARRMENU, which has been specifically programmed for the Puyallup River and the needs of Pierce County. SSARRMENU automates the retrieval of data from ADAPS (Automated DAta Processing System, the U.S. Geological Survey's real-time hydrologic database), formats the data for use with SSARR, initiates SSARR model runs, displays alerts for impending floods, and provides utilities to display the simulated and observed data. An on-screen map of the basin and a series of menu items provide the user with…
Kovatch, Kevin J; Harvey, Rebecca S; Prince, Mark E P; Thorne, Marc C
2017-10-09
In 2016, Accreditation Council for Graduate Medical Education (ACGME) requirements for curriculum and resident experiences were modified to require entering postgraduate year (PGY)-1 residents to spend 6 months of structured education on otolaryngology-head and neck surgery (ORL-HNS) rotations. We aimed to determine how ORL-HNS training programs have adapted curricula in response to 2016 ACGME curriculum requirement changes. Survey study. A national survey of ACGME-accredited ORL-HNS programs was distributed via the Otolaryngology Program Directors Organization. Thirty-seven program directors responded (34.9%). Most common ORL-HNS rotations included general otolaryngology (80.6% of programs, up to 6 months) and head and neck oncology (67.7%, up to 4 months), though more months are also spent on other subspecialty rotations (laryngology, otology, rhinology, and pediatrics) than previously. All programs continue at least 1 month of anesthesiology, intensive care unit, and general surgery. Programs have preferentially eliminated rotations in emergency medicine (77% decrease) and additional months on general surgery (48% decrease). Curricula have incorporated supplemental teaching modalities including didactic lectures (96.3% of programs), simulation (66.7%), dissection courses (63.0%), and observed patient encounters (55.5%), to a greater degree following ACGME changes. More interns are involved in shared call responsibilities than in previous years (70.4% vs. 51.8%). A stable minority of interns take the Otolaryngology Training Examination (approximately 20%). New ACGME requirements have challenged ORL-HNS training programs to develop effective 6-month rotation schedules for PGY-1 residents. Significant variation exists between programs, and evaluation of first-year curricula and readiness for PGY-2 year is warranted. NA Laryngoscope, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.
Evaluation of a Postdischarge Call System Using the Logic Model.
Frye, Timothy C; Poe, Terri L; Wilson, Marisa L; Milligan, Gary
2018-02-01
This mixed-method study was conducted to evaluate a postdischarge call program for congestive heart failure patients at a major teaching hospital in the southeastern United States. The program was implemented based on the premise that it would improve patient outcomes and overall quality of life, but it had never been evaluated for effectiveness. The Logic Model was used to evaluate the input of key staff members to determine whether the outputs and results of the program matched the expectations of the organization. Interviews, online surveys, reviews of existing patient outcome data, and reviews of publicly available program marketing materials were used to ascertain current program output. After analyzing both qualitative and quantitative data from the evaluation, recommendations were made to the organization to improve the effectiveness of the program.
Program For Simulation Of Trajectories And Events
NASA Technical Reports Server (NTRS)
Gottlieb, Robert G.
1992-01-01
Universal Simulation Executive (USE) program accelerates and eases generation of application programs for numerical simulation of continuous trajectories interrupted by or containing discrete events. Developed for simulation of multiple spacecraft trajectories with such events as one spacecraft crossing the equator, two spacecraft meeting or parting, or the firing of a rocket engine. USE also simulates operation of a chemical batch-processing factory. Written in Ada.
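The core pattern, continuous propagation interrupted by a discrete event, can be sketched with a fixed-step integrator that stops at a sign change (an "equator crossing"). The oscillator dynamics below are illustrative only, not USE's Ada implementation.

```python
def simulate_until_crossing(z0, vz0, dt=1e-3, t_max=10.0):
    """Propagate a continuous trajectory (here z'' = -z, integrated
    with fixed-step Euler-Cromer) until the discrete event z = 0 --
    the 'equator crossing' pattern an executive like USE interleaves
    with continuous integration.  Returns the event time, or None."""
    z, vz, t = z0, vz0, 0.0
    while t < t_max:
        vz -= z * dt                             # Euler-Cromer velocity update
        z_next = z + vz * dt
        if z > 0 >= z_next or z < 0 <= z_next:   # sign change => event
            return t + dt
        z, t = z_next, t + dt
    return None

# unit-amplitude oscillator starting at z = 1 crosses zero at t = pi/2
t_cross = simulate_until_crossing(z0=1.0, vz0=0.0)
```

A simulation executive generalizes this loop: many trajectories advance together, and each detected event hands control to user code before integration resumes.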
What Juno will see at Jupiter South Pole Simulation
2011-08-03
This simulated view of the south pole of Jupiter illustrates the unique perspective of NASA's Juno mission. Juno's polar orbit will allow its camera, called JunoCam, to image Jupiter's clouds from a vantage point never accessed by other spacecraft.
Hybrid Metaheuristics for Solving a Fuzzy Single Batch-Processing Machine Scheduling Problem
Molla-Alizadeh-Zavardehi, S.; Tavakkoli-Moghaddam, R.; Lotfi, F. Hosseinzadeh
2014-01-01
This paper deals with the problem of minimizing the total weighted tardiness of jobs in a real-world single batch-processing machine (SBPM) scheduling problem in the presence of fuzzy due dates. First, a fuzzy mixed-integer linear programming model is developed. Then, owing to the complexity of the problem, which is NP-hard, we design two hybrid metaheuristics, called GA-VNS and VNS-SA, that combine the advantages of genetic algorithm (GA), variable neighborhood search (VNS), and simulated annealing (SA) frameworks. In addition, we propose three fuzzy earliest-due-date heuristics to solve the given problem. Through computational experiments with several random test problems, a robust calibration is applied to the parameters. Finally, computational results on different-scale test problems are presented to compare the proposed algorithms. PMID:24883359
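The SA component of such hybrids can be sketched for the crisp (non-fuzzy, non-batching) core of the objective, total weighted tardiness on one machine. All parameters and the instance data below are illustrative.

```python
import math
import random

def weighted_tardiness(seq, jobs):
    """Total weighted tardiness of a job sequence on one machine.
    jobs[j] = (processing_time, due_date, weight)."""
    t, cost = 0.0, 0.0
    for j in seq:
        p, d, w = jobs[j]
        t += p
        cost += w * max(0.0, t - d)
    return cost

def anneal(jobs, steps=2000, t0=10.0, alpha=0.995, seed=0):
    """Simulated annealing over job permutations (swap moves)."""
    rng = random.Random(seed)
    seq = list(range(len(jobs)))
    cur = best = weighted_tardiness(seq, jobs)
    best_seq, temp = seq[:], t0
    for _ in range(steps):
        i, k = rng.sample(range(len(jobs)), 2)    # random swap move
        seq[i], seq[k] = seq[k], seq[i]
        cand = weighted_tardiness(seq, jobs)
        if cand <= cur or rng.random() < math.exp(-(cand - cur) / temp):
            cur = cand                             # accept move
            if cur < best:
                best, best_seq = cur, seq[:]
        else:
            seq[i], seq[k] = seq[k], seq[i]        # undo rejected move
        temp *= alpha                              # geometric cooling
    return best, best_seq

# illustrative 4-job instance: (processing_time, due_date, weight)
jobs = [(3, 4, 2), (2, 9, 1), (4, 5, 3), (1, 3, 5)]
best, order = anneal(jobs)
```

In the paper's hybrids this acceptance loop is combined with GA or VNS operators, and the tardiness evaluation is replaced by the fuzzy objective.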
Cavity radiation model for solar central receivers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lipps, F.W.
1981-01-01
The Energy Laboratory of the University of Houston has developed a computer simulation program called CREAM (i.e., Cavity Radiation Exchange Analysis Model) for application to the solar central receiver system. The zone-generating capability of CREAM has been used in several solar repowering studies. CREAM contains a geometric configuration factor generator based on Nusselt's method. A formulation of Nusselt's method provides support for the FORTRAN subroutine NUSSELT. Numerical results from NUSSELT are compared to analytic values and values from Sparrow's method. Sparrow's method is based on a double contour integral and its reduction to a single integral, which is approximated by Gaussian methods. Nusselt's method is adequate for the intended engineering applications, but Sparrow's method is found to be an order of magnitude more efficient in many situations.
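A configuration factor of the kind the NUSSELT routine computes can be cross-checked by Monte Carlo ray sampling, since cosine-weighted directions reproduce Nusselt's unit-sphere projection. The element-to-coaxial-disk case below has the closed form R²/(R² + h²); this is a generic sketch, not CREAM's implementation.

```python
import math
import random

def vf_element_to_disk(R, h, n=20000, seed=1):
    """Monte Carlo configuration factor from a differential element
    (at the origin, facing +z) to a coaxial disk of radius R at
    height h.  Cosine-weighted ray directions reproduce Nusselt's
    unit-sphere projection; the exact answer is R^2 / (R^2 + h^2)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        u, phi = rng.random(), 2.0 * math.pi * rng.random()
        dx = math.sqrt(u) * math.cos(phi)   # cosine-weighted direction
        dy = math.sqrt(u) * math.sin(phi)
        dz = math.sqrt(1.0 - u)
        t = h / dz                          # intersect the plane z = h
        if (t * dx) ** 2 + (t * dy) ** 2 <= R * R:
            hits += 1
    return hits / n

est = vf_element_to_disk(R=1.0, h=1.0)      # exact value: 0.5
```

The hit fraction converges to the analytic factor, which makes this a convenient independent check on deterministic routines like NUSSELT.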
Effects of sound source directivity on auralizations
NASA Astrophysics Data System (ADS)
Sheets, Nathan W.; Wang, Lily M.
2002-05-01
Auralization, the process of rendering audible the sound field in a simulated space, is a useful tool in the design of acoustically sensitive spaces. The auralization depends on the calculation of an impulse response between a source and a receiver, each of which has certain directional behavior. Many auralizations created to date have used omnidirectional sources; the effects of source directivity on auralizations are a relatively unexplored area. To examine whether and how the directivity of a sound source affects the acoustical results obtained from a room, we used directivity data for three sources in a room acoustic modeling program called Odeon. The three sources are: violin, piano, and human voice. The results from using directional data are compared to those obtained using omnidirectional source behavior, both through objective measure calculations and subjective listening tests.
NASA Technical Reports Server (NTRS)
Ponomarev, Artem L.; George, K.; Cucinotta, F. A.
2011-01-01
New experimental data show how chromosomal aberrations for low- and high-LET radiation are dependent on DSB repair deficiencies in wild-type, AT and NBS cells. We simulated the development of chromosomal aberrations in these cell lines in a stochastic track-structure-dependent model, in which different cells have different kinetics of DSB repair. We updated a previously formulated model of chromosomal aberrations, which was based on a stochastic Monte Carlo approach, to consider the time dependence of DSB rejoining. The previous version of the model assumed that all DSBs would rejoin, and therefore we called it a time-independent model. The chromosomal-aberrations model takes into account the DNA and track structure for low- and high-LET radiations, and provides an explanation and prediction of the statistics of rare and more complex aberrations. We compared the program-simulated kinetics of DSB rejoining to the experimentally derived bimodal exponential curves of the DSB kinetics. We scored the formation of translocations, dicentrics, acentric and centric rings, deletions, and inversions. The fraction of DSBs participating in aberrations was studied in relation to the rejoining time. Comparisons of the simulated dose dependence for simple aberrations to the experimental dose dependence for HF19, AT and NBS cells will be made.
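The bimodal exponential rejoining kinetics used for comparison have the form f(t) = a·e^(-t/τ_fast) + (1-a)·e^(-t/τ_slow), a fast and a slow repair component. The parameter values below are illustrative, not the fitted values for HF19, AT, or NBS cells.

```python
import math

def unrejoined_fraction(t, fast_frac=0.8, tau_fast=0.5, tau_slow=8.0):
    """Bimodal-exponential DSB rejoining: a fast and a slow repair
    component.  Time in hours; parameter values are illustrative,
    not the fitted experimental values."""
    return (fast_frac * math.exp(-t / tau_fast)
            + (1.0 - fast_frac) * math.exp(-t / tau_slow))

# fraction of DSBs still unrejoined at sampled times (hours)
curve = [unrejoined_fraction(t) for t in (0, 1, 2, 8, 24)]
```

Repair-deficient lines such as AT or NBS would be represented by a larger slow fraction or longer time constants, which in the model leaves more DSBs available to form aberrations.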
Designing a Wien Filter Model with General Particle Tracer
NASA Astrophysics Data System (ADS)
Mitchell, John; Hofler, Alicia
2017-09-01
The Continuous Electron Beam Accelerator Facility injector employs a beamline component called a Wien filter which is typically used to select charged particles of a certain velocity. The Wien filter is also used to rotate the polarization of a beam for parity violation experiments. The Wien filter consists of perpendicular electric and magnetic fields. The electric field changes the spin orientation, but also imposes a transverse kick which is compensated for by the magnetic field. The focus of this project was to create a simulation of the Wien filter using General Particle Tracer. The results from these simulations were vetted against machine data to analyze the accuracy of the Wien model. Due to the close agreement between simulation and experiment, the data suggest that the Wien filter model is accurate. The model allows a user to input either the desired electric or magnetic field of the Wien filter along with the beam energy as parameters, and is able to calculate the perpendicular field strength required to keep the beam on axis. The updated model will aid in future diagnostic tests of any beamline component downstream of the Wien filter, and allow users to easily calculate the electric and magnetic fields needed for the filter to function properly. Funding support provided by DOE Office of Science's Student Undergraduate Laboratory Internship program.
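The compensation condition described, an electric kick cancelled by the magnetic force, is qE = qvB, so B = E/v with v the relativistic beam velocity. The field and energy values below are illustrative inputs, not CEBAF settings.

```python
import math

C = 299_792_458.0        # speed of light, m/s
ME_MEV = 0.51099895      # electron rest energy, MeV

def wien_b_field(e_field, ke_mev):
    """Magnetic field (T) that cancels the electric kick of a Wien
    filter (force balance qE = qvB) for an electron with the given
    kinetic energy, using the relativistic velocity."""
    gamma = 1.0 + ke_mev / ME_MEV
    beta = math.sqrt(1.0 - 1.0 / gamma ** 2)
    return e_field / (beta * C)

# illustrative values (not CEBAF settings): 130 keV beam, 1 MV/m field
b = wien_b_field(e_field=1.0e6, ke_mev=0.130)
```

This is the same calculation the model performs when a user supplies one field and the beam energy: the perpendicular partner field follows directly from the force balance.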
Cross Support Transfer Service (CSTS) Framework Library
NASA Technical Reports Server (NTRS)
Ray, Timothy
2014-01-01
Within the Consultative Committee for Space Data Systems (CCSDS), there is an effort to standardize data transfer between ground stations and control centers. CCSDS plans to publish a collection of transfer services that will each address the transfer of a particular type of data (e.g., tracking data). These services will be called Cross Support Transfer Services (CSTSs). All of these services will make use of a common foundation that is called the CSTS Framework. This library implements the User side of the CSTS Framework. "User side" means that the library performs the role that is typically expected of the control center. This library was developed in support of the Goddard Data Standards program. This technology could be applicable for control centers, and possibly for use in control center simulators needed to test ground station capabilities. The main advantages of this implementation are its flexibility and simplicity. It provides the framework capabilities, while allowing the library user to provide a wrapper that adapts the library to any particular environment. The main purpose of this implementation was to support the inter-operability testing required by CCSDS. In addition, it is likely that the implementation will be useful within the Goddard mission community (for use in control centers).
Data mining for multiagent rules, strategies, and fuzzy decision tree structure
NASA Astrophysics Data System (ADS)
Smith, James F., III; Rhyne, Robert D., II; Fisher, Kristin
2002-03-01
A fuzzy logic based resource manager (RM) has been developed that automatically allocates electronic attack resources in real time over many dissimilar platforms. Two different data mining algorithms have been developed to determine rules, strategies, and fuzzy decision tree structure. The first data mining algorithm uses a genetic algorithm as a data mining function and is called from an electronic game. The game allows a human expert to play against the resource manager in a simulated battlespace, with each of the defending platforms being exclusively directed by the fuzzy resource manager and the attacking platforms being controlled by the human expert or operating autonomously under their own logic. This approach automates the data mining problem: the game automatically creates a database reflecting the domain expert's knowledge, calls a data mining function, a genetic algorithm, to mine the database as required, and allows easy evaluation of the information mined in the second step. The criterion for re-optimization is discussed, as well as experimental results. Then a second data mining algorithm that uses a genetic program as a data mining function is introduced to automatically discover fuzzy decision tree structures. Finally, a fuzzy decision tree generated through this process is discussed.
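The genetic-algorithm data-mining function can be sketched in miniature. Here onemax (counting 1-bits) stands in for the fitness score derived from the expert-play database, and all parameters are illustrative; this is not the paper's implementation.

```python
import random

def evolve(fitness, n_bits=20, pop_size=30, generations=60, seed=0):
    """Minimal generational GA: tournament selection, one-point
    crossover, bit-flip mutation.  A toy stand-in for the data-mining
    function the resource-manager game calls on its database."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            a, b = rng.sample(pop, 2)             # tournament of two
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.1:                # bit-flip mutation
                i = rng.randrange(n_bits)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# onemax (count of 1-bits) stands in for a mined-rule fitness score
best = evolve(fitness=sum)
```

In the paper's setting, each bitstring would encode candidate rule or membership-function parameters, and fitness would be evaluated against the game-generated database.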